4K consumer televisions are becoming more common and, again, much less expensive. In late 2012, there were only two 4K televisions for sale in the U.S. market, and they were priced at $20,000 and $25,000 respectively. Today, the average selling price for an Ultra HD TV (the new name for 4K video) in North America is just under $2,000, and 50" Seiki and TCL models can be had from Amazon for under $450. Vizio has just started shipping its P-series Ultra HD TVs, which are claimed to be comparable to more expensive models from the top manufacturers; its 50" model sells for $1,000.
The better models from the first-tier TV manufacturers (including Vizio) should have excellent picture quality, refresh rates of 120Hz or more, and a good upscaler that resizes conventional HD video to Ultra HD without distortion. However, independent researchers have found that, at the normal distances viewers sit from their televisions, there's almost no meaningful difference between the perceived quality of an HDTV picture and an Ultra HD picture. Here's how the viewing-distance numbers work out:
There was a huge jump in quality between analog TVs and even 720p HDTV. If you had a 50" set, you could see the full difference at 10 feet; with 1080p, you saw the full benefit over 720p at about six feet. However, with Ultra HD, you won't even begin to see any improvement over HD until you're about five feet from the TV, and you won't get the full benefit until you're only about 3 1/2 feet away (a little more than a meter). At that distance, the television picture is filling most of your field of vision. So, I'm not planning to buy any of this generation of Ultra HDTVs. The reason is that there's a new technology not too far down the road that will provide a much more dramatic improvement over conventional HD picture quality than Ultra HD provides by itself.
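Those distances follow from simple visual-acuity arithmetic. Here's a rough back-of-the-envelope sketch in Python (my own calculation, not taken from any particular study) that assumes a viewer with 20/20 vision can resolve about one arcminute per pixel:

```python
import math

# Approximate resolving power of 20/20 vision: about 1 arcminute per pixel.
# The exact threshold varies by viewer, so treat these results as estimates.
ARCMINUTE = math.radians(1 / 60)  # ~0.00029 radians

def max_useful_distance_ft(diagonal_in, horiz_pixels, aspect=(16, 9)):
    """Farthest distance (feet) at which individual pixels are still
    resolvable -- beyond this, the extra resolution stops mattering."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width in inches
    pixel_pitch = width_in / horiz_pixels           # inches per pixel
    distance_in = pixel_pitch / ARCMINUTE           # small-angle approximation
    return distance_in / 12

for name, pixels in [("720p", 1280), ("1080p", 1920), ("Ultra HD", 3840)]:
    print(f'{name}: full benefit of a 50" set inside ~'
          f'{max_useful_distance_ft(50, pixels):.1f} ft')

# Prints roughly 9.8 ft for 720p, 6.5 ft for 1080p, and 3.3 ft for Ultra HD,
# which lines up with the distances quoted above.
```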
This new technology is called High Dynamic Range, or HDR. HDR expands the contrast range of television pictures. Imagine that you're outside on a bright sunlit day. You can see well-illuminated objects quite clearly, and you can also see what's in shadow. That's because your eye has a contrast range of about 1,000,000:1 (20 f-stops). LCD televisions have a much lower contrast ratio--Rtings.com tested a variety of 2014 HDTVs and found that the highest contrast ratio, 5,268:1, was measured on a Toshiba L3400U. Manufacturers like to claim much higher ratios--for example, in its current E-series, Vizio claims a contrast ratio of 500,000:1, but Rtings.com measured it at 4,581:1. That's still very good for a current-generation HDTV, but less than 1% of the advertised contrast ratio.
Even video cameras don't have the same contrast range as the human eye. The Arri Alexa XT, one of the most popular cameras for episodic television and high-end movie production, has a 16,384:1 contrast range. However, HDR technology can extend the contrast range significantly, to as much as 262,144:1 (18 f-stops). That's still not as wide as what the eye can see, but it's dramatically better than anything ever seen on consumer television sets. Even plasma TVs, which have a much wider contrast range than LCDs (up to 13,000:1), are nowhere near what HDR can represent.
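The f-stop figures and contrast ratios above are just two ways of stating the same number: each stop doubles the contrast ratio, so stops = log2(ratio). A quick sketch using the figures quoted in this post:

```python
import math

# Each f-stop of dynamic range doubles the contrast ratio, so the two
# figures convert back and forth with a base-2 logarithm.
def stops_from_ratio(ratio):
    return math.log2(ratio)

examples = {
    "human eye": 1_000_000,              # ~20 stops
    "Arri Alexa XT": 16_384,             # exactly 14 stops
    "HDR target": 262_144,               # exactly 18 stops
    "best 2014 LCD (measured)": 5_268,   # ~12.4 stops
    "plasma (upper end)": 13_000,        # ~13.7 stops
}
for name, ratio in examples.items():
    print(f"{name}: {ratio:,}:1 ≈ {stops_from_ratio(ratio):.1f} stops")
```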
One of the several companies developing HDR technology for consumer television, Dolby, claims that its Dolby Vision technology will provide a dynamic range of as much as 200,000:1. Other companies developing HDR technology for video include Technicolor, Philips and the BBC. In addition to more dynamic range, Dolby and its competitors are implementing bigger color spaces (simply put, displays built using their systems will be able to display more colors than current televisions).
One of the big reasons why HDR isn't in the consumer market yet is that existing formats for transmitting video don't support the increased dynamic range and bigger color spaces from the HDR system developers. These formats, if they're used for over-the-air broadcasting, usually have to be approved and standardized by each country's governmental broadcasting authority (the FCC in the U.S., Ofcom in the U.K., etc.). These standardization processes take time, and they take more time when there are multiple vendors competing to set the standard. In the U.S., it took almost five years for the competing companies to agree to work together on one digital television standard, and another five years for manufacturers to begin shipping digital televisions that were compatible with the standard.
Implementation of HDR is likely to be much less painful and take significantly less time than the move from analog standard definition television to digital HD. However, it will take several years, and it's likely that some TV manufacturers will release HDR sets using different, incompatible formats. HDR system vendors also have to design their formats to be fully compatible with today's signals, so that HDTVs already in use will simply ignore the HDR portion of the signal and display a normal picture. Backward compatibility is never easy to pull off; it's one reason digital HDTV had to be a clean break from the earlier analog SD formats.
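None of the vendors has published a final bitstream specification, but the general idea behind that kind of backward compatibility looks something like the hypothetical sketch below: the HDR information rides along as optional metadata that an older decoder can safely skip. The structure and field names here are invented for illustration, not taken from Dolby Vision or any other real format.

```python
# Hypothetical illustration of a backward-compatible signal: the base video
# is encoded exactly as today's HDTVs expect, and the HDR data travels in an
# optional side block. A legacy decoder that doesn't recognize the block just
# skips it; an HDR-aware decoder uses it to expand contrast and color.
# (Field names are invented for illustration, not from any real spec.)

def decode_frame(frame, hdr_capable=False):
    base_picture = frame["base"]               # standard-dynamic-range video
    hdr_metadata = frame.get("hdr_extension")  # absent or ignored on old sets

    if hdr_capable and hdr_metadata is not None:
        return apply_hdr(base_picture, hdr_metadata)
    return base_picture                        # legacy path: HDR data ignored

def apply_hdr(picture, metadata):
    # Placeholder for the vendor-specific tone-mapping / color-volume step.
    return {"pixels": picture, "peak_nits": metadata["peak_nits"]}

# The same stream serves both an older HDTV and a future HDR set:
frame = {"base": "SDR pixels", "hdr_extension": {"peak_nits": 4000}}
print(decode_frame(frame))                     # plain HD picture
print(decode_frame(frame, hdr_capable=True))   # expanded-range picture
```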
So, unless my HDTV dies prematurely, I'm not going to buy an Ultra HD set until the television industry settles on a single HDR format, either through government agency decisions or the rise of a de facto standard. There's a huge quality difference between HDTV and Ultra HD with HDR--a difference that you'll clearly see in retail stores and in your living room.