
Sunday, October 05, 2014

Film is dead--and a few big-budget movies won't save it

Last July, after lobbying by directors including J.J. Abrams, Christopher Nolan and Quentin Tarantino, the major U.S. movie studios signed a deal with Kodak to continue to purchase motion picture film for several years, whether or not they actually use the film. Kodak, which emerged from bankruptcy in September 2013, had planned to shut down its (and the industry's) last remaining motion picture film manufacturing facility. Between the virtually total replacement of film cameras with digital and the almost complete transition of U.S. theatrical projectors to digital from film, Kodak's shipments of motion picture film fell 96% between 2006 and 2014.

Why does this relative handful of directors continue to insist on shooting film? There are two primary reasons:
  1. Film can capture a bigger range of colors than digital cameras--the equivalent of 16 bits of color resolution. By comparison, Arri's Alexa, the most popular camera for high-end cinematography, captures color information with 15 bits of resolution, but may then lose bits of resolution when it converts the video to a color space for editing and viewing.
  2. Under ideal conditions, 35mm film has an image resolution of around 5300 x 4000 pixels, while the Digital Cinema standard for 4K acquisition and projection is 4096 x 2160 pixels. However, many movie theaters are still using 2K projectors, which limits the resolution to 2048 x 1080. Compared to either digital standard, film (hypothetically, at least) provides far more image detail.
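To put those pixel counts in perspective, here's a quick back-of-the-envelope comparison (a sketch in Python; the 35mm figure is the hypothetical best-case scan resolution cited above):

```python
# Rough pixel-count comparison of the formats discussed above.
formats = {
    "35mm film scan (ideal)": (5300, 4000),   # ~21.2 megapixels
    "Digital Cinema 4K":      (4096, 2160),   # ~8.8 megapixels
    "Digital Cinema 2K":      (2048, 1080),   # ~2.2 megapixels
}

film_total = 5300 * 4000
for name, (w, h) in formats.items():
    total = w * h
    print(f"{name}: {w} x {h} = {total / 1e6:.1f} MP "
          f"(film scan has {film_total / total:.1f}x as many pixels)")
```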
So, those directors who still want to use film have a solid rationale for doing so, even if most of the increased resolution and color space is lost once their films are projected digitally or watched on HD or even Ultra HD TVs. However, the vast majority of directors and cinematographers have switched to digital for several reasons:
  1. Digital cameras have dramatically more dynamic range (the ability to capture bright and dark subjects at the same time) than film does. Typically, the dynamic range of movie film is 1,000:1 (approximately 10 bits, or 10 f-stops). Even video cameras costing a few thousand dollars can capture 10 f-stops or more, and the Arri Alexa has a range of more than 14 f-stops, or better than 16,384:1.
  2. Digital media is much less expensive than film over time, because it can be reused. Movie productions typically offload recorded flash media to hard and flash drives during the day, then erase and reuse the flash media. Digital's much lower costs also enable directors to get coverage from a variety of angles and framings.
  3. 35mm film magazines usually hold a maximum of 1,000 feet of film, due to size and weight. That means magazines have to be changed after every 11 minutes of filming at 24 frames per second (see the arithmetic sketched after this list). Depending on the image resolution, dynamic range and color depth, a single piece of digital flash media can hold hours of video. That allows for long continuous shots and far fewer interruptions to change media.
  4. Digital cameras have an enormous range of sizes and weights, many of which are smaller and lighter than any practical motion picture film camera can be. This gives filmmakers enormous flexibility for shooting in tight quarters and in sports and action situations. It also makes lightweight drones feasible for shooting, where previously only helicopters and airplanes were viable platforms for aerial photography.
  5. Specialized digital cameras can provide much higher frame rates than are either economically or technically feasible for film cameras. For example, Vision Research's Phantom Flex4K digital camera can shoot Digital Cinema 4K at 1,000 frames per second, or 2K at 1,950 frames per second. By comparison, film cameras usually shoot 24 frames per second.
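The 11-minute figure in item 3 falls straight out of the film geometry. Here's the arithmetic as a quick sketch, assuming standard 4-perf 35mm, which runs 16 frames per foot:

```python
# How long a 1,000-foot 35mm magazine lasts at 24 frames per second,
# assuming standard 4-perf 35mm film (16 frames per foot).
FRAMES_PER_FOOT = 16
MAGAZINE_FEET = 1_000
FPS = 24

total_frames = MAGAZINE_FEET * FRAMES_PER_FOOT        # 16,000 frames
run_time_minutes = total_frames / FPS / 60            # ~11.1 minutes
print(f"One magazine: {total_frames:,} frames, "
      f"about {run_time_minutes:.1f} minutes of shooting")
```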
There's a technology coming down the pike that's likely to make film obsolete, even for the directors who still insist on using it. I wrote about High Dynamic Range (HDR) video two weeks ago, and I won't repeat all the arguments I made in that post. Here's a summary of HDR's advantages:
  • Much greater dynamic range; Dolby says that its Dolby Vision HDR system will have as much as a 200,000:1 dynamic range. Many current digital cinema cameras can be adapted to shoot HDR video when there are commercially-available ways to view it.
  • Color spaces that are as big as or bigger than that of motion picture film.
However, there are several problems with HDR, beyond what I stated in my post:
  • There are no current theater projectors, film or digital, that can project HDR images. Film is only capable of 1,000:1 dynamic range, and film can't project a true black because even the blackest part of an image can't block all the light from the powerful xenon bulb in film projectors. Digital projectors use a similar xenon light and have the same problem. It's possible that laser-based digital projectors could project both the dynamic range and color space of HDR, but to my knowledge it hasn't been demonstrated yet.
  • With the exception of the handful of expensive professional Organic LED (OLED) displays in use, today's HDTVs can't provide either the dynamic range or the color space of HDR. However, LCD HD and Ultra HD TVs can be engineered to have a separate LED backlight for every pixel, and most LCD displays are capable of displaying a bigger color space than is currently used. Dolby, Technicolor, Philips and the BBC are all either in talks with or have already licensed technology to consumer electronics manufacturers to implement their HDR formats in future HD and Ultra HD TVs.
I believe that when the aforementioned film-holdout directors see HDR, they're going to want to use it--and that's when film dies, once and for all, for movie production. (Film is already dead in movie theaters, despite some last-gasp attempts to keep it viable.) The problem, of course, is that there's no easy way to implement HDR in movie theaters. However, as I wrote in my previous post, if past experience is a guide, it may take as long as ten years for HDR to become standardized and available to consumers at an affordable price. That will give theater operators and digital projector companies time to figure out a way to make HDR work in theaters.

Sunday, September 28, 2014

Why I'm not racing to buy an Ultra HD TV...yet

Over the last 18 months, there's been an explosion of products for creating and editing 4K video, from cameras and switchers to editing and compositing software. Costs have declined dramatically: A few years ago, there was only a handful of cameras that could do 4K, and they were priced in the mid- to high-five figures. Today there are 4K cinematography-quality cameras priced as low as $2,000, and GoPro is said to be planning to release its 4K HERO 4 sport camera the week of October 5th, probably at a price below $400. (Update, Sept. 29) GoPro announced three new sports cameras today, with prices. The new HERO 4 Black is the 4K/30fps model, and it will sell for $500, not the $400 I estimated. However, it will ship on October 5th.

4K consumer televisions are becoming more common, and again, much less expensive. In late 2012, there were only two 4K televisions for sale in the U.S. market, and they were priced at $20,000 and $25,000 respectively. Today, the average selling price for an Ultra HD TV (the new name for 4K television) in North America is just under $2,000, and 50" Seiki and TCL models can be had from Amazon for under $450. Vizio has just started shipping its P-series Ultra HD TVs, which are claimed to be comparable to more expensive models from the top manufacturers; its 50" model sells for $1,000.

The better models from the first-tier TV manufacturers (including Vizio) should have excellent picture quality, refresh rates of 120Hz or more, and a good upscaler that resizes conventional HD video to Ultra HD without distortion. However, independent researchers have found that, at the normal distances that viewers sit when watching their televisions, there's almost no meaningful difference in perceived quality between an HDTV picture and an Ultra HD picture. This chart explains how it works:



There was a huge jump in quality between analog TVs and even 720p HDTV. If you had a 50" set, you could see the full difference at 10 feet; with 1080p, you saw the full benefit over 720p at about six feet. However, with Ultra HD, you won't even begin to see any improvement over HD until you're about five feet from the TV, and you won't get the full benefit until you're only about 3 1/2 feet away (a little more than a meter). At that distance, the television picture is filling most of your field of vision. So, I'm not planning to buy any of this generation of Ultra HD TVs. The reason is that there's a new technology not too far down the road that will provide a much more dramatic improvement over conventional HD picture quality than Ultra HD provides by itself.
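The distances in that chart follow from simple viewing geometry. Here's a rough sketch of the calculation, assuming about one arcminute of visual acuity (typical 20/20 vision) and a 50-inch 16:9 screen; the exact thresholds depend on those assumptions, but the results land close to the figures above:

```python
import math

# Distance (in feet) at which a screen's individual pixels shrink below
# ~1 arcminute of visual angle. Closer than this, more resolution still
# helps; farther away, the extra pixels are no longer resolvable.
def full_benefit_distance_feet(diagonal_inches, horizontal_pixels, aspect=(16, 9)):
    w, h = aspect
    screen_width = diagonal_inches * w / math.hypot(w, h)   # inches
    pixel_pitch = screen_width / horizontal_pixels          # inches per pixel
    one_arcminute = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcminute) / 12       # feet

for label, pixels in [("720p", 1280), ("1080p", 1920), ("Ultra HD", 3840)]:
    d = full_benefit_distance_feet(50, pixels)
    print(f'50" {label}: full benefit only within about {d:.1f} feet')
```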

That new technology is called High Dynamic Range, or HDR. HDR expands the contrast range of television pictures. Imagine that you're outside on a bright, sunlit day. You can see well-illuminated objects quite clearly, and you can also see what's in shadow. That's because your eye has a contrast range of about 1,000,000:1 (20 f-stops). LCD televisions have a much lower contrast ratio--Rtings.com tested a variety of 2014 HDTVs and found that the highest contrast ratio, 5,268:1, was measured on a Toshiba L3400U. Manufacturers like to claim much higher ratios--for example, in its current E-series, Vizio claims a contrast ratio of 500,000:1, but Rtings.com measured it at 4,581:1. That's still very good for a current-generation HDTV, but less than 1% of the advertised contrast ratio.

Even video cameras don't have the same contrast range as the human eye. The Arri Alexa XT, one of the most popular cameras for episodic television and high-end movie production, has a 16,384:1 contrast range. However, HDR technology can extend the contrast range significantly, to as much as 262,144:1 (18 f-stops). That's still not as wide as what the eye can see, but it's dramatically better than anything ever seen on consumer television sets. Even plasma TVs, which have a much wider contrast range than LCDs (up to 13,000:1), are nowhere near what HDR can represent.
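Since contrast ratios and f-stops keep coming up, it's worth noting that they're two expressions of the same thing: each additional stop doubles the amount of light, so the ratio is simply 2 raised to the number of stops. A quick sketch:

```python
import math

# Each f-stop doubles the light, so contrast ratio = 2 ** stops.
def stops_to_ratio(stops):
    return 2 ** stops

def ratio_to_stops(ratio):
    return math.log2(ratio)

print(f"{stops_to_ratio(14):,.0f}:1")            # 16,384:1  -- Arri Alexa XT
print(f"{stops_to_ratio(18):,.0f}:1")            # 262,144:1 -- HDR as described above
print(f"{ratio_to_stops(1_000_000):.1f} stops")  # ~19.9 stops -- roughly the human eye
print(f"{ratio_to_stops(5_268):.1f} stops")      # ~12.4 stops -- best measured 2014 LCD HDTV
```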

One of the several companies developing HDR technology for consumer television, Dolby, claims that its Dolby Vision technology will provide a dynamic range of as much as 200,000:1. Other companies developing HDR technology for video include Technicolor, Philips and the BBC. In addition to more dynamic range, Dolby and its competitors are implementing bigger color spaces (simply put, displays built using their systems will be able to display more colors than current televisions).

One of the big reasons why HDR isn't in the consumer market yet is that existing formats for transmitting video don't support the increased dynamic range and bigger color spaces from the HDR system developers. If these formats are used for over-the-air broadcasting, they usually have to be approved and standardized by each country's broadcasting authority (the FCC in the U.S., Ofcom in the U.K., etc.). These standardization processes take time, and they take more time when there are multiple vendors competing to set the standard. In the U.S., it took almost five years for the competing companies to agree to work together on one digital television standard, and another five years for manufacturers to begin shipping digital televisions that were compatible with the standard.

Implementation of HDR is likely to be much less painful and take significantly less time than the move from analog standard definition television to digital HD. However, it will take several years, and it's likely that some TV manufacturers will release HDR TV sets using different, incompatible formats. HDR system vendors also have to design their formats to be fully backward-compatible with today's formats, so that HDTVs already in use will simply ignore the HDR portion of the signal. Backward compatibility is never easy to do, and that's why digital HDTV had to be a clean break from the earlier analog SD formats.

So, unless my HDTV dies prematurely, I'm not going to buy an Ultra HD TV until the television industry settles on a single HDR format, either through government agency decisions or the rise of a de facto standard. There's a huge quality difference between HDTV and Ultra HD with HDR--a difference that you'll clearly see in retail stores and in your living room.