You'd have to be living under the proverbial rock not to know that Peter Jackson's "The Hobbit" reaches theaters this week. Jackson shot the movie at 48 frames per second, twice the usual 24 fps rate used for theatrical movies, using RED digital video cameras. Jackson says that the 48 fps rate provides "hugely enhanced clarity and smoothness," but many reviewers are saying that the effect "looks much like video." As well it should, because it is video.
For decades, the standard television frame rate in Europe has been 50 fps, matching the continent's 50 Hz AC power frequency. In the U.S., the standard rate is 60 fps (actually 59.94 fps), corresponding to the country's 60 Hz AC power frequency. (Yes, in analog television, those were field rates, and the frame rates were one-half the field rates, but with HD, we commonly use 50 and 59.94 progressive frame rates.) 48 fps is only 4% slower than 50 fps; from a perceptual standpoint, there's no difference. The kinds of visual artifacts that reviewers are complaining about are much the same as what a viewer from the U.S. commonly sees when watching television in Europe--the image seems to "stutter," especially in scenes with lots of motion or quick pans. After watching for a while, however, the eye gets used to the slower frame rate. That's exactly what's happening with "The Hobbit"--most people, even motion picture professionals, say that it takes about an hour for their eyes to adjust to 48 fps. Of course, the adjustment is exacerbated by 3D, which causes headaches and even nausea in some viewers.
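The arithmetic behind these rates is worth spelling out. A quick back-of-the-envelope sketch (in Python, purely illustrative) shows why 59.94 fps is the "actual" U.S. rate and why 48 fps sits so close to the European 50 fps:

```python
# The U.S. HD rate is nominally 60 fps, but the NTSC-derived
# standard divides by 1.001, giving the familiar 59.94 fps.
us_rate = 60000 / 1001
print(f"U.S. HD rate: {us_rate:.2f} fps")

# Relative gap between 48 fps and the European 50 fps standard:
# small enough to be perceptually indistinguishable.
gap_48_vs_50 = (50 - 48) / 50
print(f"48 vs 50 fps gap: {gap_48_vs_50:.0%}")

# Compare that with the jump from the traditional cinema rate,
# which is what viewers actually notice.
jump_24_vs_48 = (48 - 24) / 24
print(f"24 vs 48 fps jump: {jump_24_vs_48:.0%}")
```

The 2-fps difference between 48 and 50 is a 4% change; the difference between 24 and 48 is a 100% change, which is why the footage reads as "video" rather than as film.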
48 fps is no miracle and no great advancement in cinema technology. For all intents and purposes, it's a video frame rate, shot with a video camera. No one should be surprised that "The Hobbit" looks like video. Jackson himself acknowledges that high frame rates were explored decades ago, when Douglas Trumbull and his partners developed Showscan, a 60 fps, 65mm film format that never took off because of its high costs. Back in the late 1970s and early 1980s, there was an enormous difference between the resolution of film and that of video, so there was no way that 65mm film shot at 60 fps could be mistaken for video. Today, however, RED and other cameras have 4K resolution--close enough to film that the differences are subjective rather than obvious.
If Jackson had shot "The Hobbit" at 50 fps, many people would have complained that he shot it on video. But he did shoot it on video--at 48 fps instead of 50 fps. What's the difference?