Sunday, September 28, 2014

Why I'm not racing to buy an Ultra HD TV...yet

Over the last 18 months, there's been an explosion of products for creating and editing 4K video, from cameras and switchers to editing and compositing software. Costs have declined dramatically: A few years ago, only a handful of cameras could shoot 4K, and they were priced in the mid- to high five figures. Today there are cinematography-quality 4K cameras priced as low as $2,000, and GoPro is said to be planning to release its 4K HERO 4 sport camera the week of October 5th, probably at a price below $400. (Update, Sept. 29: GoPro announced three new sport cameras today, with prices. The new HERO 4 Black is the 4K/30fps model, and it will sell for $500, not the $400 I estimated. It will, however, ship on October 5th.)

4K consumer televisions are becoming more common and, again, much less expensive. In late 2012, there were only two 4K televisions for sale in the U.S. market, priced at $20,000 and $25,000 respectively. Today, the average selling price for an Ultra HD TV (the new name for consumer 4K) in North America is just under $2,000, and 50" Seiki and TCL models can be had from Amazon for under $450. Vizio has just started shipping its P-series Ultra HD TVs, which are claimed to be comparable to more expensive models from the top manufacturers; its 50" model sells for $1,000.

The better models from the first-tier TV manufacturers (including Vizio) should have excellent picture quality, refresh rates of 120Hz or more, and a good upscaler that resizes conventional HD video to Ultra HD without distortion. However, independent researchers have found that, at the distances viewers normally sit from their televisions, there's almost no perceptible difference between an HDTV picture and an Ultra HD picture. This chart explains how it works:

[Chart: the viewing distance, by screen size, at which each resolution's full benefit becomes visible]

There was a huge jump in quality between analog TVs and even 720p HDTV. If you had a 50" set, you could see the full difference at 10 feet; with 1080p, you saw the full benefit over 720p at about six feet. With Ultra HD, however, you won't even begin to see any improvement over HD until you're about five feet from the TV, and you won't get the full benefit until you're only about 3 1/2 feet away (a little more than a meter). At that distance, the television picture fills most of your field of vision. So, I'm not planning to buy any of this generation of Ultra HD TVs, because there's a new technology not too far down the road that will provide a much more dramatic improvement over conventional HD picture quality than Ultra HD does by itself.
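
For the curious, here's a rough sketch of the geometry behind charts like that one, assuming the common rule of thumb that the eye resolves about one arcminute of detail (actual acuity varies from viewer to viewer):

```python
import math

def full_benefit_distance_ft(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Distance (in feet) at which one pixel subtends one arcminute,
    a common approximation of the eye's resolving limit. Any closer,
    and the extra resolution starts to become visible."""
    w, h = aspect
    screen_width_in = diagonal_in * w / math.hypot(w, h)
    pixel_pitch_in = screen_width_in / horizontal_px
    # Small-angle approximation: distance = pixel pitch / angle (radians).
    return pixel_pitch_in / math.radians(1 / 60) / 12

for name, px in [("720p", 1280), ("1080p", 1920), ("Ultra HD", 3840)]:
    d = full_benefit_distance_ft(50, px)
    print(f'{name} on a 50" set: full benefit at ~{d:.1f} ft')
```

By this rule of thumb, the 50" figures come out to roughly 9.8, 6.5 and 3.3 feet--close to the numbers cited above.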

This new technology is called High Dynamic Range, or HDR. HDR expands the contrast range of television pictures. Imagine that you're outside on a bright, sunlit day. You can see well-illuminated objects quite clearly, and you can also see what's in shadow. That's because your eye has a contrast range of about 1,000,000:1 (20 f-stops). LCD televisions have a much lower contrast ratio--Rtings.com tested a variety of 2014 HDTVs and found that the highest contrast ratio, 5,268:1, was measured on a Toshiba L3400U. Manufacturers like to claim much higher ratios--for example, Vizio claims a contrast ratio of 500,000:1 for its current E-series, but Rtings.com measured it at 4,581:1. That's still very good for a current-generation HDTV, but less than 1% of the advertised figure.

Even video cameras don't have the same contrast range as the human eye. The Arri Alexa XT, one of the most popular cameras for episodic television and high-end movie production, has a 16,384:1 contrast range (14 f-stops). However, HDR technology can extend the contrast range significantly, to as much as 262,144:1 (18 f-stops). That's still not as wide as what the eye can see, but it's dramatically better than anything ever seen on consumer television sets. Even plasma TVs, which have a much wider contrast range than LCDs (up to 13,000:1), are nowhere near what HDR can represent.
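
The f-stop counts and contrast ratios quoted here are two views of the same number: each stop is a doubling of light. A quick sketch of the conversion:

```python
import math

def stops_to_ratio(stops):
    """Each f-stop doubles the light, so N stops = 2**N : 1 contrast."""
    return 2 ** stops

def ratio_to_stops(ratio):
    return math.log2(ratio)

print(f"Human eye (~20 stops): {stops_to_ratio(20):,}:1")           # 1,048,576:1
print(f"HDR target (18 stops): {stops_to_ratio(18):,}:1")           # 262,144:1
print(f"Alexa XT at 16,384:1:  {ratio_to_stops(16384):.0f} stops")  # 14 stops
```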

One of the several companies developing HDR technology for consumer television, Dolby, claims that its Dolby Vision technology will provide a dynamic range of as much as 200,000:1. Other companies developing HDR technology for video include Technicolor, Philips and the BBC. In addition to more dynamic range, Dolby and its competitors are implementing bigger color spaces (simply put, displays built on their systems will be able to reproduce more colors than current televisions).

One of the big reasons why HDR isn't in the consumer market yet is that existing formats for transmitting video don't support the increased dynamic range and bigger color spaces of the HDR systems under development. If these formats are to be used for over-the-air broadcasting, they usually have to be approved and standardized by each country's broadcasting authority (the FCC in the U.S., Ofcom in the U.K., etc.). These standardization processes take time, and they take more time when multiple vendors are competing to set the standard. In the U.S., it took almost five years for the competing companies to agree to work together on one digital television standard, and another five years for manufacturers to begin shipping digital televisions compatible with that standard.

Implementation of HDR is likely to be much less painful and take significantly less time than the move from analog standard definition to digital HD. However, it will take several years, and it's likely that some TV manufacturers will release HDR sets using different, incompatible formats. HDR system vendors also have to design their formats to be 100% backward-compatible with today's formats, so that HDTVs already in use will simply ignore the HDR portion of the signal. Backward compatibility is never easy; it's the reason digital HDTV had to be a clean break from the earlier analog SD formats.

So, unless my HDTV dies prematurely, I'm not going to buy an Ultra HD set until the television industry settles on a single HDR format, whether through government agency decisions or the rise of a de facto standard. There's a huge quality difference between HDTV and Ultra HD with HDR--a difference that you'll clearly see in retail stores and in your living room.

Thursday, April 17, 2014

Don't overbuy your next cinema camera

Last week, I published a post that recommended four steps to take before you buy or rent a 4K cinema camera. There's an important point that I left out: The rate of change in the camera business (and, for that matter, in production and post-production hardware) is greater than at any time in memory. Consider that it wasn't too long ago that a properly maintained 35mm camera could be expected to last 20 years, and a film editing table (Kem/Steenbeck) could last 30 or 40 years. Today, we're well along in the transition from 2K to 4K (at least on the acquisition side), and Japan's NHK is already building prototype hardware for the 8K generation.

The rate of change is at least equal to that of the heyday of personal computers, when faster processors and better displays were released every year. Today, it's likely that a camera will become technically obsolete well before it's no longer repairable. Here are a few reasons why:
  • The sensitivity and dynamic range of imagers continues to improve, and rolling shutters are being replaced with global shutters.
  • Codecs are also improving, with support of higher bit-depths and bigger color spaces.
  • Storage speeds and capacities are increasing, while the cost of flash-based storage is falling.
With things changing so fast, you don't want to get locked into a capital investment in a camera that you can't pay back before it's obsolete. My recommendation is to plan on a three-year usable life for most of today's cameras. That doesn't mean that they'll break in three years, but rather that the state of the art will progress so much that you'll want a new camera in three years, especially if your competitors already have one. So, you need to know how often you're likely to use the camera over those three years.

Let's say that the camera you've decided on costs $20,000, including some accessories that you won't be able to use on future cameras. If you'll use the camera ten times a year over the three years, that means that you'll be spreading the $20,000 cost (plus routine maintenance) over 30 shoots, and the camera will cost you $667 per shoot. (Lenses are extra.) If you're only going to use the camera once a year over three years, it will cost $6,667 per production. A cheaper camera doesn't have to be used as much to justify its purchase, so long as it does everything you expect to need over those three years.
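
Here's a minimal sketch of that amortization arithmetic; the maintenance parameter is my own illustrative addition:

```python
def cost_per_shoot(price, shoots_per_year, usable_years=3, annual_maintenance=0):
    """Spread the purchase price (plus upkeep) over every expected shoot."""
    total = price + annual_maintenance * usable_years
    return total / (shoots_per_year * usable_years)

print(f"${cost_per_shoot(20_000, 10):,.0f} per shoot")  # ten shoots/year: $667
print(f"${cost_per_shoot(20_000, 1):,.0f} per shoot")   # one shoot/year: $6,667
```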

One other important consideration is lens mounts. Even if you're planning to rent most of your lenses, you'll probably want to own some lenses that you use often. You don't want to have to sell your lenses on eBay when you buy a new camera, so you should get a camera with a lens mount that's likely to satisfy your needs in the future. EF and PL mounts are the most widely used today, and are likely to be the most widely used down the road. There are fewer MFT- and E-mount lenses available, but there are adapters and Metabones Speed Boosters for both EF- and PL-mount lenses to fit MFT and E mounts.

If you buy (or rent) cameras with a three-year useful life in mind, avoid overbuying based on the number of shoots you expect to do over those three years, and choose a lens mount based on your long-term needs, you're far more likely to be happy with your purchase across its entire usable life and beyond.


Friday, April 11, 2014

Blackmagic adds studio cameras to its live production suite, makes its switchers 4K

Blackmagic Design has long been known as a post-production hardware vendor, starting with its DeckLink cards in 2002. In 2010, the company moved into live video production when it acquired switcher manufacturer Echolab's assets out of bankruptcy. Together with its Videohub routers and video & audio monitoring hardware, Blackmagic built a fairly complete line of live production products. Then, in 2012, Blackmagic introduced its first camera, the Blackmagic Cinema Camera (BMCC). Many people wondered if the Cinema Camera could be used for live production since it has an HD-SDI output, but Blackmagic cautioned against using it that way. The BMCC's color output is so flat that it can't really be used without color correction, and Blackmagic's subsequent camera models launched prior to this year aren't much better suited for live use.

However, at NAB earlier this week, Blackmagic introduced a line of cameras designed specifically for live production: the Studio Camera HD and the Studio Camera 4K (which outputs video in both Ultra HD and HD). The Studio Cameras are designed around 10" LCDs that do double duty as viewfinders and menu displays; the company claims they're the largest viewfinders offered by any manufacturer. Unlike the Cinema Camera and Production Camera, the Studio Camera's display isn't touch-sensitive; a row of buttons below the display handles user input. The company claims that eliminating the touch-sensitive layer makes the display brighter.

On the back of the display, there's a wedge that contains all of the camera's connectors, the lens mount (active Micro Four Thirds), imager and most of the camera's electronics. The result is a very strange looking camera, but one with significantly better features than previous Blackmagic models. For example, the company's previous cameras have become known for their poor battery life, but Blackmagic says that the battery in the Studio Camera will last for four hours, and a standard four-pin power connector allows users to connect external batteries for more runtime, or AC power for continuous operation. The single minijack or dual 1/4" jacks used for audio input in the previous cameras have been replaced with dual XLR connectors with phantom power.

The Studio Cameras also have several new features:
  • A LANC interface for connecting a remote iris, focus and zoom control (if your lens is compatible)
  • Dual jacks for connecting an aviation headset for intercom use; Blackmagic claims that aviation headsets are much less expensive than video production headsets with comparable features
  • A bidirectional optical fiber connector that's compatible with the ATEM Studio Converter and provides the same functionality as the $595 ATEM Camera Converter. This enables the Studio Camera to send and receive HD or 4K video, stereo audio, talkback/intercom and tally lights over cable runs as long as 28 miles
  • A software-based Remote Camera Control that works with any ATEM Production Studio. All of the settings on the camera can be monitored and controlled with this software. In addition, a full copy of DaVinci Resolve's primary color corrector is included for live color balancing
You may be thinking, "These Studio Cameras are better than Blackmagic's first-generation models in almost every way, and they're the same price, so why would anyone buy the earlier models?" One big reason is that the Studio Cameras have no storage. No SSD, no CFast, no SDXC, nothing. You can, of course, add an external recorder such as Blackmagic's HyperDeck Shuttle, and you've got other options using the Studio Cameras' SDI connections. However, an external recorder adds to the size, weight and cost of the cameras.

The Studio Camera HD is shipping now and is priced at $1,995 (U.S.), while the Studio Camera 4K is expected to ship in June and is priced at $2,995. Given Blackmagic's track record with cameras, don't bet your life on that June ship date, and expect some problems with the cameras that are shipped for the first several months.

Blackmagic has also made a number of changes to its ATEM line of switchers (all of which are shipping):
  • The original HD-only models of the ATEM 1 M/E and 2 M/E have been discontinued; the sole HD-only switcher that remains in the product line is the $995 ATEM Television Studio, which is primarily intended as a "personal" switcher for webcasts and small productions
  • The new ATEM 1 M/E Production Studio and 2 M/E Production Studio support 4K and HD on all inputs and outputs (except the monitor outputs, which are HD only)
  • Last year's ATEM Production Studio 4K, which has similar functionality to the ATEM Television Studio except it supports 4K, remains in the product line at $1,695
  • The ATEM 1 M/E Production Studio 4K is priced at $2,495, and the ATEM 2 M/E Production Studio 4K is priced at $3,995, $1,000 less than last year's model
With the Studio Cameras and its 4K switcher line, Blackmagic now has just about everything needed to build a live production facility.

Wednesday, April 09, 2014

For 4K cameras, price is A thing, but not THE thing

I'm back from NAB, where the overriding theme this year was 4K everything--cameras, monitors, editors, special effects, routers, switchers, etc. Perhaps the biggest battle was in cameras, where AJA Video entered the market for the first time and Blackmagic Design and Sony announced new 4K cameras. (JVC also announced its first 4K digital cinematography cameras, but gave no prices or availability dates.) 4K cameras have been a "thing" ever since the RED One, but $10,000 was the least that you could spend to buy one (Canon's 1D C) until Blackmagic shipped its 4K Production Camera late last year, priced at $2,995.

The floodgates have now opened:
  • Panasonic's GH4: $1,699, or $3,299 bundled with its SDI/XLR interface dock
  • Sony's A7S: $2,499.99, will ship in July
  • Blackmagic's 4K Production Camera: $2,995
  • Blackmagic's URSA EF: $5,995
  • AJA's Cion: $8,995
None of these cameras costs more than a fraction of the price of an ARRI Alexa ($80,000+) or Amira ($40,000-$52,000 depending on enabled features), a RED EPIC-M Dragon ($50,000+), or a Sony F55 ($29,000+) or F65 ($65,000+). You'd think that ARRI, RED and Sony would be shaking in their boots, but they're not. There are two reasons why the companies that make high-end cameras aren't necessarily threatened by the new inexpensive models:
  1. There are many elements that determine whether or not a specific camera is appropriate for an application, and
  2. You get what you pay for.
Here are some (but far from all) of the elements of camera design that influence how the camera performs and what it's good (or not good) for:
  • Imager size
  • Imager resolution
  • Color space (e.g., YUV or xvYCC)
  • Color sampling (e.g., 4:2:0, 4:2:2 or 4:4:4)
  • Bit depth (8-bit vs. 10-bit)
  • Video output resolution (DCI 4K (4096 x 2160), UHD (3840 x 2160), 1080, 720)
  • Video compression formats (e.g., AVCHD, H.264, ProRes, DNxHD, XAVC, XAVC S, AVC-Intra, AVC-Ultra)
  • RAW storage and/or output
  • Frame rates supported at specified resolution (e.g., 24, 25, 29.97, 30, 59.94, 60 or 120)
  • Dynamic range in stops
  • Native ISO speed
  • Sensitivity (maximum usable ISO speed, NOT the camera's rated top ISO speed)
  • Lens mount(s)
  • Lens control (manual, automatic or limited automatic)
  • Image stabilization (digital or optical, in the camera body or in the lens, or no stabilization)
  • Viewfinder, display screen, both or none
  • User interface design (e.g., touchscreen(s), menus, dedicated buttons and knobs)
  • Storage capacity
  • Storage media
  • Internal or external storage
  • Video interface(s) (HDMI or HD-SDI, with different HDMI versions and, for HD-SDI, maximum speeds)
  • Audio connector(s)
  • Balanced or unbalanced audio in
  • Phantom power availability
  • Run time on battery
  • Removable or permanent battery
  • External power voltage and connector
  • Camera shape
  • Camera weight
  • Ruggedness
  • Manufacturer and design maturity (how much experience does the manufacturer have in designing cameras, and how long has the manufacturer been making this particular camera)
There's an enormous number of elements to consider, and some elements work much better for certain applications than others. In some cases, buyers have a wealth of cameras to choose from, while in other cases, there may only be a handful that can do what they need.
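
If you want to be systematic about the shortlist step, it can be as simple as filtering a spec table against your must-haves. A sketch with entirely hypothetical camera entries--the names and values are placeholders, not real data sheets:

```python
cameras = [
    {"name": "Camera A", "price": 2995, "bit_depth": 10, "mount": "EF",
     "max_usable_iso": 800},
    {"name": "Camera B", "price": 1699, "bit_depth": 8, "mount": "MFT",
     "max_usable_iso": 3200},
]

def shortlist(cameras, **requirements):
    """Keep cameras that satisfy every requirement: exact match for
    strings (e.g., mount), at-least match for numbers (e.g., bit depth)."""
    def meets(cam, key, want):
        have = cam.get(key)
        return have == want if isinstance(want, str) else (have or 0) >= want
    return [c["name"] for c in cameras
            if all(meets(c, k, v) for k, v in requirements.items())]

print(shortlist(cameras, bit_depth=10, mount="EF"))  # ['Camera A']
```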

Rather than salivating when you hear about a new low-priced camera with attractive features, ask yourself these questions:
  1. What am I going to use it for?
  2. What trade-offs am I willing to accept (for example, are you willing to live with less sensitivity in order to get a higher-quality compression format?)
  3. How often will I use the camera (do you know that you'll be using it over and over on new projects, or do you have one project in mind and you don't know when you'll have the next one?)
  4. How much can I afford to pay?
Answering the first two questions will give you a list of cameras that meet your needs. Answering the last two will tell you whether you should buy or rent the camera you can afford. In some cases, you may decide to buy a less expensive camera and use the remaining budget for lenses or mounting equipment. In others, you could rent a camera and use the savings elsewhere in your production, or rent a camera that's superior for your application but that you can't afford to buy. In short, answer the four questions first, rather than starting with the price of the camera and hoping that it will meet your needs.

Sunday, February 09, 2014

Panasonic's GH4 lowers the bar for 4K pricing, but are the compromises worth it?

Panasonic’s flagship GH series of Micro Four Thirds ILCs has developed a strong reputation among cinematographers, starting with the GH2, which had its firmware hacked to enable much higher bit rates than the stock model. The GH3, introduced last year, built many of the capabilities added by third parties into the base camera. Last month, at the Consumer Electronics Show, Panasonic showed a prototype of a new GH that looked almost identical to the GH3 but supported 4K video. As of this weekend, we’ve learned the specifications for the new camera, called the GH4, but not its price or release date.

The GH4’s 16.05-megapixel CMOS imager doesn’t break any records for still imaging, but it supports 4K video at both standard resolutions: Ultra HD (3840x2160 @ 30p), the broadcast/video 4K standard and the resolution of consumer Ultra HD displays, and Cinema 4K (4096x2160 @ 24p), the baseline standard for theatrical production, post-production and exhibition. In 4K mode, it uses interframe (IPB) compression at 100 Mbps, while in 2K mode, it supports All-I compression at 200 Mbps.
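
Those bit rates translate directly into storage appetite. A quick sketch of the conversion, using decimal gigabytes as storage vendors do:

```python
def gb_per_hour(bitrate_mbps):
    """Megabits per second -> decimal gigabytes per hour of footage."""
    return bitrate_mbps / 8 * 3600 / 1000

print(f"4K IPB at 100 Mbps:   {gb_per_hour(100):.0f} GB/hour")  # 45 GB
print(f"2K All-I at 200 Mbps: {gb_per_hour(200):.0f} GB/hour")  # 90 GB
```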

The GH4 has an interesting (and confusing) approach to storing 4K video: When recording to an SD card, video is stored as 8-bit 4:2:0 and output to the HDMI terminal as 8-bit 4:2:2. If you remove the SD card and use an external recorder, the HDMI output becomes 10-bit 4:2:2. You can also opt for a dock (Panasonic refers to it as an “interface unit”) that provides two XLR audio inputs with LED meters, four SDI outputs (two of which are 3G) and a 12VDC power socket. The SDI outputs can presumably drive an external recorder and a monitor simultaneously at 10-bit 4:2:2. The dock won’t win any design awards--it just about doubles the size of the GH4--but it adds the interfaces that professional users need (or are otherwise forced to add with third-party hardware).
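
To see what’s at stake between those two output formats, compare the raw (pre-compression) data rates they imply; a back-of-the-envelope sketch assuming UHD at 30 fps, with 4:2:0 carrying 1.5 samples per pixel and 4:2:2 carrying 2:

```python
def raw_gbps(width, height, fps, bit_depth, chroma):
    """Uncompressed video data rate for a given chroma subsampling scheme."""
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

card = raw_gbps(3840, 2160, 30, 8, "4:2:0")   # ~3.0 Gbps
hdmi = raw_gbps(3840, 2160, 30, 10, "4:2:2")  # ~5.0 Gbps
print(f"8-bit 4:2:0: {card:.1f} Gbps; 10-bit 4:2:2: {hdmi:.1f} Gbps "
      f"({hdmi / card - 1:.0%} more data)")
```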

Panasonic’s approach to storing 4K video means that an external recorder will be a necessity. In this regard, Blackmagic Design’s Production Camera 4K (the camera most likely to be compared with the GH4) has an advantage, because users can insert an SSD directly into the camera, eliminating the need for an external recorder.

I have to admit that I’m disappointed with how Panasonic chose to implement storage on the GH4. It would have been nice to be able to use the camera handheld without a lot of additional hardware, but the SD card is only good for proxy recording in 4K mode. (To be clear, you can store 4K video on the SD card, but if you’re serious enough about 4K to put up with all the other issues you’ll have to deal with in post-production--massive storage, faster PCs, bigger monitors, etc.--8-bit 4:2:0 won’t cut it.)

At CES, Panasonic representatives said that they expect the GH4 to be priced less than $2,000 in the U.S. That price won’t include the dock, which is likely to cost at least $1,000. So, Panasonic could get the GH4 plus dock to market at around $3,000—but I wouldn’t be surprised if the pair launches at closer to $4,000.

Update (March 10, 2014): Panasonic has revealed prices and availability dates for the GH4. The GH4 body's suggested retail price is $1,699.99 (U.S.). The dock (officially called the DMW-YAGH XLR/SDI Interface Unit) is priced at $1,999.99. If you buy both the GH4 and dock as a bundle, it's priced at $3,299.99. I’d love to see a third party develop a more elegant (and cheaper) dock, but there may not be a big enough market for it to make financial sense.

The GH4 is one of the least expensive ways of getting into 4K, if not the least expensive way. However, as we’ve learned from Blackmagic’s cameras, inexpensive means compromises, and the GH4 is no exception. It remains to be seen if the GH4’s compromises are ones that you can live with.

Wednesday, April 10, 2013

4K for video production and post-production: Buy or wait?

You can summarize this year's theme at NAB very easily: 4K. There are vendors selling 4K cameras, monitors, storage devices, production switchers, routers, video capture cards, video editors and post-production software all over the show floor. Salespeople will tell you that 4K is "the next big thing," and you'd better start buying equipment and software to support it. The problem is that the last "next big thing," 3D, wasn't, so does it really make sense for television stations, video production companies and post-production houses to buy 4K equipment?

There are clearly applications for which 4K makes sense, of which motion picture production and post-production are the most obvious. In movies, more resolution is almost always better, especially where special effects are used--you can lose resolution in the process and still have enough for acceptable quality when projected onto a big screen. However, when it comes to television, 4K may be too much of a good thing. 4K video requires four times as much storage and much faster connections than 2K, both of which increase costs. 4K monitors are still scarce and much more expensive than 2K monitors.

The question for video producers is, will 4K television sets and media players take off with consumers, and if so, when? Based on 3D's track record, producers may want to wait a while to spend their money. 3D was originally launched by motion picture distributors and adopted by consumer electronics companies in order to increase their revenues. However, they didn't consider whether consumers were really interested in 3D, how much they were willing to pay for it or how much grief they were willing to go through in order to get it.

As with Blu-ray before it, 3D launched without a single standard, which led to consumer confusion and frustration. Some 3DTVs required heavy, expensive, battery-powered "active" 3D glasses, while others used lighter, less expensive "passive" glasses that many users felt didn't deliver good enough picture quality. Glasses for one manufacturer's 3DTVs usually didn't work with other manufacturers' sets, and most 3DTVs came with only one or two pairs of glasses, so families with children had to shell out more money for additional pairs. Movie distributors struck exclusive deals with consumer electronics companies for some of their films; for example, 20th Century Fox gave Panasonic an exclusive on "Avatar." Consumers who purchased Panasonic 3DTVs got a free copy of the movie, but buyers of other brands couldn't get it at all.

A few television producers jumped into 3D early; for example, Discovery, Sony and IMAX launched a 3D cable channel called 3net, and ESPN launched a 3D channel. While ESPN got fairly wide carriage, 3net's availability has been spotty. That's about it when it comes to regular 3D service, and even for that limited selection, consumers usually have to pay their video service providers extra for 3D.

So, is 4K likely to be different? It won't cause the headaches and nausea that some viewers get with 3D, so in that sense, there's likely to be less resistance to 4K. On the other hand, consumers will need new 4K televisions. They'll have to buy a new video player, because Blu-ray is limited to 2K. They'll have to replace their home theater systems and A/V receivers, because a single HDMI connection can only handle one-fourth of the bandwidth required for 4K. Cable, satellite and IPTV providers will have to supply new set-top boxes and dedicate multiple channels to a single 4K signal, as well as upgrade their signal distribution systems, potentially at enormous capital cost. Finally, ATSC 2.0, the upcoming standard for digital television broadcasting in the U.S., won't support 4K. Broadcast 4K will have to wait for ATSC 3.0, which is still in an early stage of development and won't be implemented for over-the-air use for years.
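
That HDMI claim is easy to sanity-check with back-of-the-envelope arithmetic; a sketch that assumes 24-bit color at 60 frames per second and ignores blanking intervals and audio:

```python
def payload_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video payload, ignoring blanking intervals and audio."""
    return width * height * fps * bits_per_pixel / 1e9

hd = payload_gbps(1920, 1080, 60)   # ~3.0 Gbps
uhd = payload_gbps(3840, 2160, 60)  # ~11.9 Gbps
print(f"1080p60: {hd:.1f} Gbps; 2160p60: {uhd:.1f} Gbps ({uhd / hd:.0f}x)")
```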

There will eventually be a big consumer market for 4K; it's the obvious next step in resolution. However, it may be five to ten years before we get there. If you can get the 4K-capable equipment you need at the same price as 2K, by all means buy it; but unless you're working on movies, let consumer adoption, rather than vendors, tell you when it's time to buy into 4K.