85” TV

I’m tempted to swap my projector for a big TV, as it's a 1080p 3D model and I’m thinking some of these cheaper 4K models (under £3k) might be more practical. Current screen is about 120” so I think it will be similar.

I need one with optical out to work with my old AV amp. I’ve heard the picture on very large TVs isn’t so great. Any experience or model recommendations?

thanks
 
The screen size won't be anywhere near the same. 120" is roughly 6000 square inches. 85" is roughly 3000 square inches. Imagine moving from your current living room to one that's half the floor area. Would you really think it's "nearly the same"?

If you want to check the maths yourself then here are the numbers:

120" diag 16:9 = 105" x 59" (w x h)

85" diag 16:9 = 74" x 42" (w x h)

Two points to note. First, projector screen sizes are often quoted as a nominal figure, so you could well find that the diagonal is 115" / 116" / 117" when measured for real. Second, all the figures above are rounded to the nearest inch.
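
If you'd rather script the check than punch it into a calculator, here's a minimal sketch of the diagonal-to-dimensions maths (a toy Python illustration; the function name is mine, not from any library):

```python
import math

# For a 16:9 rectangle the diagonal d satisfies d^2 = w^2 + h^2 with w:h = 16:9,
# so both dimensions are the ratio legs (16 and 9) scaled by d / sqrt(16^2 + 9^2).
def screen_dims(diagonal_in, ratio_w=16, ratio_h=9):
    scale = diagonal_in / math.hypot(ratio_w, ratio_h)
    return ratio_w * scale, ratio_h * scale

for d in (120, 85):
    w, h = screen_dims(d)
    print(f'{d}" diag: {w:.0f}" x {h:.0f}" = {w * h:.0f} sq in')
```

Run it and you get roughly 6150 sq in for the 120" screen versus roughly 3090 sq in for the 85", which is where the "half the area" point comes from.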

Making comparisons between a projector and a TV will obviously be affected by the quality of each component. The following comments therefore presume a decent projector from Optoma / Epson / JVC / Sony / Sim2 versus a good LED-LCD TV from Sony, Panasonic, or Samsung.

In general, projectors have better scaling engines than TVs with SD and HD material. The reason is to do with the different applications of each tech.

A TV is for general viewing where the relative screen size versus viewing distance means that the screen fills only a small portion of our field of view.

Where a projector is installed in the same room, we tend not to change the seating position when viewing a projected image. Also, it's usual to go for a much bigger image than a TV can produce. This means the projected image fills more of our eyeline, so any shortcomings in scaling will be more evident than with a TV. A projector therefore needs a better scaling engine if the massively larger image is to stand up against the smaller TV image, where defects are more easily hidden.
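
To put rough numbers on "fills more of our eyeline": the horizontal viewing angle of a screen is 2·atan(w/2d). A quick sketch (the 10 ft seating distance is an assumed example, not from the thread):

```python
import math

# Horizontal angle subtended by a screen of width w viewed from distance d
def h_view_angle_deg(width_in, distance_in):
    return math.degrees(2 * math.atan((width_in / 2) / distance_in))

# From a 10 ft (120") seat: a 105"-wide projected image vs the 74"-wide 85" TV
for w in (105, 74):
    print(f'{w}" wide at 120" away: {h_view_angle_deg(w, 120):.0f} degrees')
```

That's roughly 47 degrees for the projected image against 34 degrees for the TV, so any scaling flaws occupy a much bigger share of your field of view.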

Clearly though, TVs have some advantages: 4K is standard display res on TV sets now. Higher-end LED TVs will make a brighter image. The colour saturation doesn't depend so much on the ambient light, so the pictures can look more colourful (but not necessarily natural).


Set against that, the cost of TVs rises steeply above the 'core' size of 65" because larger panels are manufactured in much smaller volumes.

Once you get into looking at TV models, you're going to find that there's a lot more to buying a screen than brand, size, price and the apps it features.

You'll need to get familiar with the crucial difference between the motion processing rate (a manufactured number and largely meaningless) and the far more important native panel refresh rate, which will be either 50/60Hz (poor) or 100/120Hz (better).

You'll also need to pay attention to whether the TV can dim the image depending on what's happening in the picture. Edge-lit panels can only dim the entire screen as one. This is fine if the entire image is light or dark, but not so useful for images with a combination of both, such as, say, people sat round a campfire at night.

Another type of backlight has the lighting directly behind the panel but lacks any dimming ability at all. The advantage it offers is more even illumination (it's less patchy in dark scenes compared to edge-lit), but it can't react to picture content.

The best is something called FALD, which stands for Full Array Local Dimming. Here the lights are behind the panel and split into zones, and each zone can be dimmed independently of its neighbours. It won't come as a surprise that the more you spend on a FALD set, the greater the number of zones and the better the set handles mixed-lighting scenes.
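
To make the zone idea concrete, here's a toy model of a FALD controller (a sketch of the principle only; no manufacturer's actual algorithm works this simply):

```python
import numpy as np

def fald_levels(luma, rows=6, cols=8):
    """Split the frame into rows x cols zones and drive each zone's
    backlight from the brightest pixel it contains. `luma` is a 2-D
    array of pixel brightness in 0..1."""
    h, w = luma.shape
    levels = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            zone = luma[r * h // rows:(r + 1) * h // rows,
                        c * w // cols:(c + 1) * w // cols]
            levels[r, c] = zone.max()
    return levels

# An edge-lit set is effectively the single-zone case: luma.max() everywhere.
```

For the campfire-at-night frame, only the zones containing the fire stay lit and the rest go dark, which is exactly the mixed-lighting case the single-zone sets can't handle.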

Next, there's the panel's bit depth. This determines how many colours the panel can actually display. What's really important here is how the panel handles subtle colour gradations at the darker end of the brightness range.

Think about the graduated brightness of an evening sky. Where the TV has just an 8bit display, you will see colour bands because there's not enough colour resolution in the display to render a smooth gradient. The best TVs have a true 10bit panel, but that comes at a price: they aren't cheap. Everything in between uses a half-way-house technique called dithering.

Dithering is where a TV can't reproduce a specific colour shade, so the panel quickly alternates the pixels between two colours it can reproduce directly. Done fast enough, the two colours appear to merge into the colour in between. Panels that use this technique will be referred to as "10bit (8+2 FRC)". You'll still get some colour banding, but it won't be as pronounced as on a native 8bit panel displaying full-colour-range 4K images.
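
As a hedged illustration of the principle (a toy model, nothing like a real panel driver), here's how alternating between the two nearest 8bit levels over successive frames averages out to a 10bit shade:

```python
def frc_frames(target_10bit, n_frames=4):
    # 10bit values span 0..1023; each 8bit step covers four 10bit steps.
    low = target_10bit // 4        # nearest 8bit level at or below the target
    frac = (target_10bit % 4) / 4  # how far the target sits above it
    # Show the higher level on a matching fraction of frames so the
    # time-average equals the target shade.
    return [low + (1 if (i + 1) / n_frames <= frac else 0)
            for i in range(n_frames)]

# 10bit level 514 sits halfway between 8bit levels 128 and 129:
print(frc_frames(514))  # [129, 129, 128, 128] -> averages to 128.5
```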

If that all wasn't enough, there's the question of which HDR formats the TV will support.

First, there's the broadcast TV HDR format known as Hybrid Log Gamma, or HLG for short. This is used by Sky on its premium UHD sports channels and UHD downloads. Virgin only has it on its BT Sport Premium channel. Freeview doesn't yet have any live UHD broadcasts, so any support tends to be focussed on UHD downloads.

When watching streaming services (Amazon Prime, Netflix Premium, Disney+, and Apple's premium streaming service) you will encounter three systems: HDR10, HDR10+, and DolbyVision.

HDR10 is the core (fall-back/basic) HDR system. It's decent, and a noticeable improvement over material without HDR, but its brightness metadata is static, so it can't adapt to make the most of the dynamic range as picture content changes.

Both HDR10+ and DolbyVision take the basic HDR10 format and add the ability to change the metadata dynamically. The good news is that any programme mastered in either HDR10+ or DolbyVision, shown on a TV that doesn't support that format, will fall back to core HDR10.
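
The fall-back behaviour boils down to very simple negotiation logic. A minimal sketch (the function and format labels are illustrative, not any real player's API):

```python
def pick_hdr_format(stream_has, tv_supports):
    # Prefer a dynamic-metadata format when both ends support it,
    # otherwise drop to the core static-metadata HDR10 stream.
    for fmt in ("DolbyVision", "HDR10+"):
        if fmt in stream_has and fmt in tv_supports:
            return fmt
    return "HDR10" if "HDR10" in tv_supports else "SDR"

# A DolbyVision title on an HDR10+-only set falls back to HDR10:
print(pick_hdr_format({"DolbyVision", "HDR10"}, {"HDR10+", "HDR10"}))
```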

Currently, IIRC, only Panasonic supports both HDR10+ and DolbyVision in a single TV. It's politics and cost. HDR10+ is licence-free but is a Samsung-originated format, so some brands reject it on principle. DolbyVision requires a licence fee from the TV manufacturer, which makes the tellies a bit more expensive and hence a little less competitive in the cut-throat TV world.

Whilst on the subject of HDR, there's also the question of how bright the TV can get when displaying HDR images. This is a big problem for cheaper/smaller UHD 4K TVs: there aren't enough backlights fitted to reach the sort of brightness HDR needs to impress. What these lower-grade sets give you is either HDR that looks much the same as SDR, or LEDs that are overdriven, which results in premature failure. Cheaper LG sets and the odd Samsung fitted with an LG screen (yes, that happens) have this problem. It should be less of an issue with TVs at £2000-£3000, but it's still worth checking the peak brightness figure (nits). If you're seeing 400-600 nits then that's not really good enough; 800-1000 is about where you need to be.

Pulling most of this together, have a look at the Sony KD85XH9505 at £2999. It has a 10bit panel, 100Hz native refresh rate, support for HLG, HDR10 and DolbyVision and HDR brightness of around 900-1000 nits.
 
I have the 65" version of the Sony that Lucid mentions. They aren't cheap and I toyed with going for a cheaper alternative but they truly are incredible screens and I am glad I stuck with it. When the extension is finished this will be moved into "my room" and we will be getting the larger model for the living room.

The picture quality is so crisp. Everyone comments on it. I don't change TVs regularly, so the cost vs enjoyment is worthwhile. I can't recommend them highly enough.
 
It took me ages to get Mrs Mottie to agree to change up from a 36” screen with a big fat border to a 40” one with a thin border, as there wasn't much difference in overall size. I've been trying for a couple of years to creep it up to a 43” but she's not having any of it. :cautious:
 
It took me ages to get Mrs Mottie to agree to change up from a 36” screen with a big fat border to a 40” one with a thin border, as there wasn't much difference in overall size. I've been trying for a couple of years to creep it up to a 43” but she's not having any of it. :cautious:

Lol, the amount of people that have this exact scenario... don't know what it is with women and their insistence on small TVs; elsewhere in life they usually want bigger... Ooh er :oops:
 
Lol, the amount of people that have this exact scenario... don't know what it is with women and their insistence on small TVs; elsewhere in life they usually want bigger... Ooh er :oops:
Yep, I just can’t tempt her with an offer of an extra 3 inches. :whistle:
 
How would the Sony fare against, say, an Epson EH-TW7000? I currently have the 5900 model. One of the reasons for going TV was the Toslink optical audio out, as it does away with the need for HDMI on the amp and a separate media streamer.
 
I wouldn't buy any TV - and I do mean any TV - on the strength of its apps. Yes, I understand the appeal of the convenience. But if history continues to show us anything, it's that TV manufacturers don't have the best track record when it comes to the usability of their apps, and they're pretty hopeless about longer-term support beyond the TV's 12-24 month warranty period.

Streaming boxes and HDMI sticks do it slicker, faster and with better long-term support than the TV manufacturers for the simple reasons that (a) they have deeper pockets, (b) they aren't stuck with the hoarding/castle mentality of TV manufacturers, and (c), they want you to subscribe to their additional services. You won't do that if the user experience is pants; so they make sure it's good and that it stays good.

My Panasonic GX800 TV has support for a broad range of apps. My go-to device though for streaming is the Firestick. On this I have access to all the catch-up TV services, and Prime, and Disney+ and Netflix. There's also the app market which isn't restricted to what the TV manufacturer's limited hardware can support. I can also side load apps from other sources so long as they're supported by Android.


Regarding the projector-vs-TV comparison: none of the projectors below about £5,000 are true native UHD displays. They all use some form of pixel-shifting to make imaging chips that aren't UHD res (3840x2160 pixels) display a UHD picture. All achieve this, but some are more successful than others.

The entry-level Epson 7000 is £1200-ish... and it's entry-level. That bit is important because it tells you that Epson's priority here wasn't so much picture quality as making a product to compete with every other brand's entry-level projector. This means you're pitching a "the-cheapest-way-we-could-do-it" projector against a mid-range, made-for-picture-quality Sony TV. The prices and the aims of the products are a country mile apart.

Compared to your current entry-level Epson 5900, the newer Epson is a diagonal move. A lot has changed in 12 years: projectors are brighter and you've got features such as lens shift, which is really useful. TBH though, if your TV budget is up to £3000, why pitch a £1200 projector against it when it has to make an image with double the surface area? The Epson 9400 is closer to where you need to be.
 
Part of the problem is a lack of HDMI support in my amp. So if I went down the Firestick route I'd need a new amp with HDMI.
 
My go-to device though for streaming is the Firestick. On this I have access to all the catch-up TV services, and Prime, and Disney+ and Netflix. There's also the app market which isn't restricted to what the TV manufacturer's limited hardware can support. I can also side load apps from other sources so long as they're supported by Android.

A full year after swapping a Firestick for a Firecube on the living room LG TV, I found last week that I could operate the Cube with the LG's Magic Remote. Previously, each time I decided to use the Firecube, I had gone seeking its own remote.
 
Part of the problem is a lack of HDMI support in my amp. So if I went down the Firestick route I'd need a new amp with HDMI.

There are ways around this. Before I get into that though, let me just say that I feel your pain. There comes a point where we have to move on to newer gear.

In my own situation I had to replace a lovely TAG McLaren AV pre-amp (£2200) and stack of power amps (£2k) with an AV receiver, because the cost and complexity of the workarounds for the lack of HDMI support were just getting out of hand for a living room surround system.

Coming back to your plan to use optical out on the TV: currently there's a lip sync problem with Dolby Digital when sound is routed this way via the TV.

It came to light because of the increasing use of sound bars, lots of which are connected to TVs via optical. Do some reading around the topic and you'll find that it affects internal apps as well as external inputs. The timing errors can't be fully trimmed out with an amp's or sound bar's lip sync adjustment.

Changing to PCM Stereo works around the issue, but at the cost of losing DD5.1.

Currently the TV manufacturers are aware of the issue but don't have a fix for it. This has been going on for a couple of years. It might change your plan to use optical out on the TV.

The other alternative to changing to an HDMI amp is to use an audio de-embedder / audio extractor. In simple terms, this has an HDMI pass-through and either an optical or coax out for DD/DTS 5.1.

There are various versions of these depending on which HDMI features need to be supported. Cheap ones for under £30 won't do the full range of HDMI 2.0 4K UHD features such as HDR; they pass only the resolution, which means losing the benefit of HDR10 and DolbyVision.

There are better devices out there though; this one from Blustream, for example. Not only will it do full-fat 18Gbps HDMI 2.0, but it also has dual HDMI outs, which is useful where a system has both a TV and a projector. Here's the link.

https://www.futureshop.co.uk/blustr...1xDOTMZ07yrrO81SJB8HcJQvdlDNeP3MaAjPvEALw_wcB
 
Now that looks handy. Firestick in the back and coax digital to the amp, assuming I can still get such a cable. My amp does have centre delay capability, or is it the other way around?

I'm getting the idea that you think the new TV option is a bad idea.

I quite fancy upgrading to the Yamaha RX-A2080.
 
Optical and coax cables are going to be around for some time to come. They have wider applications than just TV audio.

Centre channel delay: if we're talking about the distance/time compensation then you have that on all the speaker channels, but it won't fix lip sync. It's there to align the speaker sound for each channel so that the system images and steers audio accurately. Once set, this doesn't alter unless you move the speakers or change the seating distances.
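
For what it's worth, that compensation is simple arithmetic: sound travels at roughly 343 m/s, so the amp delays the nearer speakers by the difference in travel time. A rough sketch (the distances are made-up examples):

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate, at room temperature

def channel_delay_ms(speaker_m, farthest_m):
    # Delay a nearer speaker so its sound arrives in step with the farthest one
    return (farthest_m - speaker_m) / SPEED_OF_SOUND_M_S * 1000

# Centre at 2.5 m with fronts at 3.0 m -> delay the centre by about 1.5 ms
print(f"{channel_delay_ms(2.5, 3.0):.2f} ms")
```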

Lip sync is the sound to picture synchronisation, and it can vary by source and signal type. For this reason your new amp will allow multiple settings relating to the various inputs.

Re: the TV idea. All I am doing is laying out the pros and cons. A TV might still be your best choice depending on your priorities, particularly if measured against an entry-level pseudo-UHD projector.
 