Sunday, May 29, 2011

3D: Not dead, but headed for a niche

The New York Times has an article about the back-to-back poor box-office performance of the 3D versions of "Pirates of the Caribbean: On Stranger Tides" and "Kung Fu Panda 2". "Pirates of the Caribbean" sold only 47% of its tickets in 3D, and "Kung Fu Panda 2" only 45%, even though both movies were heavily promoted as 3D titles. In fact, 3D's share of box-office revenues has been declining almost since "Avatar" left the theaters. Audiences are getting tired of movies that were originally shot in 2D and then poorly converted to 3D (like "Clash of the Titans" and "Thor"), of the low brightness, uncomfortable glasses and headache-inducing presentation of 3D in most theaters, and, perhaps most importantly, of the steep ticket prices.

As the New York Times points out, the movie studios have a bumper crop of 3D titles coming out this year, and the industry has become dependent on 3D's higher ticket prices to try to compensate for declines in DVD sales and the failure of Blu-ray to pick up the slack. It now appears that 3D isn't the "Hail Mary" pass that the movie industry was looking for.

That doesn't mean that 3D is dead, but it does mean that the studios will have to become a lot more selective about the movies they release in the format. 3D needs to be reserved for "event" movies that can truly take advantage of the effect--the novelty of 3D for its own sake has worn off. Studios also have to drop 2D-to-3D conversions; the audience has caught on, and we're rapidly approaching the point where the conversions will cost more than the incremental revenue that movies earn from 3D.

The studios also have to, once and for all, get over their spending addictions. DVD sales underwrote a wild period of production and marketing cost escalation, but since the Great Recession, the studios have been looking for replacements for DVD revenues like crack addicts looking for a fix. It's time for the studios to go into rehab. Average production costs of $150 million or more have got to drop to more reasonable amounts. The studios can easily afford 3D becoming a niche format for special events, but only if they bring their spending in line with market realities.

Wednesday, May 25, 2011

Content incubators

Many readers are familiar with the current generation of technology incubators, best represented by Y Combinator and TechStars. They provide seed funding, experienced mentors, office space and equipment, primarily for startups working on web-based services, as well as app and software developers. In return, they take an equity stake in participating startups. Content startups don't often get into these programs, and the mentors that participate usually have little or no experience in content-related businesses.

What's needed is a new type of incubator specifically for content and content-related technology startups. Here's what would be included in a typical content incubator:
  • Like technology incubators, content incubators would provide seed funding, although the amount might be less than that usually given by technology incubators.
  • The mentors would come from incumbent media (publishers, editors and writers from print, station managers and program directors from radio, and station, programming and news directors from television), new media, and Internet/software technology, as well as financial and legal counselors.
The facility would have the typical desks and offices that you'd see in a tech incubator, but it would also have editing, production and post-production facilities:
  • A divisible newsroom with workstations running print and broadcast newsroom software
  • Print layout and design workstations
  • Audio and video webcasting studios
  • A news/talk studio for live or recorded video webcasts
  • A performance space studio seating 100-200 for concerts, comedy and talk shows with audiences
  • Video and audio production equipment similar to what I've previously written about for webcasting
  • Editing suites for audio and video post-production, and for video graphic design
  • Software workstations for app and software development
The idea, in short, is to provide content startups with the mentors and facilities that they need to develop their concepts and produce examples. The startups would use the incubator's facilities, equipment and in-house team for production and editing. They get a very low cost, low risk way of trying out new ideas, and the incubator gains equity in the businesses.



Sunday, May 22, 2011

A look at "The X Factor" Chicago auditions (Updated)

(Updated 2 June 2011 to include comments on the dismissal of show judge Cheryl Cole). I attended a taping of auditions for "The X Factor" last Thursday at the Sears Centre Arena, located in Hoffman Estates, IL, just outside Chicago. The auditions were for the U.S. version of "The X Factor", which will premiere on Fox in September. I won't give any spoilers; instead, I'll focus on some of the technical and production aspects of the show.

The Sears Centre Arena is one of several sports and concert arenas in the Chicago area, and it was well-suited for the taping. The producers effectively cut the arena in half with a floor-to-ceiling cloth partition that formed the backdrop for the performance stage. The arena was made to look and sound like a concert, with the usual lighting, speakers and amps, albeit with lower sound levels. According to Simon Cowell, there were more than 3,000 people watching the taping.

The producers used seven cameras in the arena. Two were positioned in the audience and equipped with long zoom lenses for covering the performers. Three were positioned stage left for shooting the four judges (Simon Cowell, Paula Abdul, Cheryl Cole and L.A. Reid). A crane-mounted camera was used for overhead shots of performers, the judges and audience, and a seventh camera was used for audience reaction shots. Additional cameras (not visible from the arena) were used backstage for interviews with show hosts Nicole Scherzinger and Steve Jones, the performers and their families. There were two multiviewers positioned stage left to show the outputs of the various cameras on a single screen, but it appeared that they were being used by producers, not the show's director.

Prior to the start of taping, production teams shot interviews and raw footage of some of the performers and their families outside the arena. The editors on the show are going to have a lot of work, simply due to the volume of footage produced by all those cameras in the course of a 3 1/2 hour taping. (Keep in mind that this was only one of four tapings scheduled for Chicago. Multiple tapings were held earlier in the month in Los Angeles, and four more tapings will be done in June in each of four locations: New Jersey, Miami, Dallas and Seattle.)

From the taping that I saw, it's not clear how much "The X Factor" differs from the format of "American Idol" (I haven't seen the U.K. version of the show). As mentioned above, there are four judges, and three of them have to say "yes" in order for an individual singer or group to pass through to the next round in Los Angeles. Given the number of individuals and groups that went through from the taping I attended, there are going to be many performers going to L.A. by the time the auditions end next month.

Since I originally wrote this post, judge Cheryl Cole has been dropped from "The X Factor". There's been no official statement from either the show's producers or the Fox network as to why, but there are a number of rumors: that her Geordie accent is too difficult for U.S. audiences to understand, that she has a weight problem, and that she lacks chemistry with the other judges. I had no trouble whatsoever understanding her, and although I was in the "cheap seats", I couldn't see anything that would indicate a weight problem. However, I and other people sitting around me did comment on a lack of chemistry.

Simon and Paula came out and immediately picked up where they had left off on "American Idol". Their playful (and sometimes not-so-playful) friction was exactly what the audience expected. Early in the taping, it seemed as if there were two panels--Simon and Paula, and Cheryl and L.A. Reid. As the night went on, L.A. Reid synced up with Simon and Paula, and his personality started to come alive, but Cheryl seemed to be left out.

Fox executives probably got a chance to see rough edits of the footage from the four Chicago tapings, and saw the same thing that we saw in the audience. The vast majority of the U.S. audience neither knows nor cares who Cheryl Cole is--I'm not saying that as an insult, it's simply reality--and host Steve Jones pointed out that no one in the audience in Chicago probably knew who he was. "The X Factor" is too important to Fox and its producers for them to take any chance with the show's popularity. In addition, they had a short break before taping 16 more auditions in four more cities throughout the month of June. If they were to make any on-screen personnel changes, they needed to make them quickly, before the June tapings began.


Saturday, May 21, 2011

Separating the medium from the message

Wikipedia defines the word medium as follows: "In communications, a medium is the storage and transmission channel or tool used to store and deliver information or data." It's the means of getting information from point A to point B, not the content or structure of that information. However, it's virtually impossible to consider a medium without including its usual content and structure. For example, we have expectations of what we're going to hear when we turn on the radio, and what we're going to see when we turn on the television. We don't expect to find the nameplate and headline of a newspaper on an inner page; we expect to find them on the front page.

Content and structure have been an integral part of our understanding of what a medium is, until we got to the Internet era. The Internet is a medium that can reproduce the content and structure of radio, television, movies, compact discs, books and newspapers. The Internet decouples the physical medium from its content and structure.

The Internet's flexibility is both a blessing and a curse. It makes it relatively easy to copy the content and structure of other media, but history shows that copying one medium into another isn't a successful long-term strategy. Radio copied theater, concerts and vaudeville, but it didn't come into its own until unique styles of entertainment were developed specifically for radio. Television initially copied radio and the media that radio itself originally copied, but like radio, television didn't take off until artists started taking advantage of television's unique capabilities.

The Internet can be a replacement for today's radio and television, with the added benefit of time-shifting, but with its sometimes tinny sound and small displays, it's a poor substitute. You can make a website or app look just like a newspaper page, but experience shows that people read differently online than they do in print. Even with dedicated apps, most magazines available online look just like their print editions, with a few additional features. These "digital magazines" are often very hard to read, requiring lots of zooming (some readers allow only one level of zoom), panning and scrolling.

One can argue that webpages are themselves a unique form of structure, if not content; with the exception of interactivity, they're an amalgam of text, audio and video forms that existed well before the web itself. What are some of the other unique capabilities of Internet media that can differentiate them from existing forms?
  • Interactivity: Video games and interactive CD-ROMs predate the web, but the web brings interactivity to a new level, and mobile apps are accelerating the trend.
  • Multi-way creation: Instead of the "one creator to many consumers" model inherent in incumbent (old) media, the Internet enables many creators to reach a few or many consumers. It also enables creators to interact with each other, and makes the roles of creators and consumers fluid--one can become the other, and one can play both roles simultaneously.
  • Support for most kinds of existing media: The temptation to simply copy one medium's content and structure onto the Internet is great, but the ability to integrate the capabilities of multiple media into a single composition is very powerful.
  • No gatekeepers: Creators can reach consumers and other creators inexpensively, without having to go through distributors, retailers and networks.
  • Low production and distribution cost: The Internet has helped to drive the cost of the software and services needed for content creation down to a tiny fraction of the prices paid by old media companies, and Moore's Law has driven down the prices and increased the capabilities of the devices needed to create and access the content.
Simply taking a radio show and moving it to the Internet, either as a webcast or simulcast, won't cause the Internet to displace radio; it only creates a poor substitute. The same goes for television. Magazines that simply reproduce the content of their print editions in electronic form aren't reversing their downward circulation and advertising revenue trends--the best they're doing is slowing down the decline. Newspapers aren't adding enough extra value on the Internet to make their paywalls work.

One-to-many media don't work on the Internet, or more accurately, they don't work well enough to maintain the business models of incumbent media companies. Native Internet media have to be highly interactive and incorporate an audience of content creators, not just content consumers. If any content consumer can instantly and painlessly become a creator of virtually any kind of content, and if consumers of that content can in turn create their own content, that's when Internet media becomes very different than any incumbent media.

Twitter, YouTube and The Huffington Post are all early examples of true Internet media, although they have limitations:
  • Twitter's is the narrow range of content that can be delivered in-line with its 140-character messages (the character limit itself being a limitation).
  • YouTube's is the amount of effort necessary to create a video that doesn't look amateurish.
  • The Huffington Post's is that it's very text-based; the HuffPo is adding more video, but it's primarily produced in-house and represents a regression to the "one-to-many" model. Also, anyone can submit posts to the HuffPo, but that doesn't mean that they'll be accepted.
If you think about how those three companies' models can be mixed and improved, you can come up with some very interesting new visions of Internet media.

Sunday, May 15, 2011

A new model for video production, take 2

Last August, I wrote an entry about how new technologies and the vastly improved quality of consumer- and prosumer-grade video and audio equipment would allow producers to build an HD webcasting studio with four automated cameras, a camera control unit and a production switcher for around $47,000. In that iteration of the design, I specified Panasonic's IP cameras and switcher. I also included a very inexpensive Zoom audio mixer/recorder, four Line 6 digital wireless microphones, four Litepanels LED lights and some other equipment.

After I returned from NAB last month, I reworked the budget and replaced many of the components with new, lower-priced models. For example, I replaced the Panasonic cameras with Canon XA10 camcorders--not automated, but for my money, they give a better picture and are much more flexible--and then went with third-party pan/tilt heads and controllers. The Panasonic switcher, which was bare-bones in the extreme, was replaced with Blackmagic Design's ATEM 1 M/E, which is better in just about every possible dimension.

I also dramatically upgraded the audio mixer, moving from the Zoom R24 to a PreSonus 16.4.2. By judiciously swapping out components (without replacing anything with junk and without losing any functionality), I added four teleprompters, an intercom, a broadcast graphics server and a complete talk-show call-in phone system, while actually saving $1,000 compared to the original system. All of that in a bit more than eight months.

My goal is to get a studio built with this new design by the end of the year for a startup I'm working on. IBC, the other big broadcasting conference of the year, will be held in September. Who knows what else we'll see there, and what more you'll be able to do with $50,000?

Why the Burson-Marsteller/Facebook blow-up won't change anything

You've probably read about Facebook's botched attempt to smear Google by planting stories about alleged privacy threats in a little-used Google service called Social Circle. Facebook hired public relations firm Burson-Marsteller to plant the stories, and the PR firm assigned two recently-hired former journalists, Jim Goldman and John Mercurio, to execute the plan.

The two reporters-turned-flacks had no success in getting the story picked up by conventional media outlets--in fact, Goldman's pitch to USA Today turned into a story not about Google but about Burson-Marsteller's fevered attempt to convince the newspaper to write negatively about Google. When USA Today independently investigated Burson-Marsteller's claims, it found them to be "largely untrue", at which point Goldman stopped talking to the paper. As an alternative, Burson-Marsteller turned to bloggers. Mercurio pitched the story to blogger and security expert Chris Soghoian, going so far as to offer to help write the blog post, which Mercurio would then pitch to a variety of sites, including the Huffington Post. Soghoian asked Mercurio who his client was, Mercurio refused to answer, and Soghoian published his email correspondence with Mercurio on the web.

Dan Lyons at The Daily Beast picked up on the story and got Facebook to confirm that it was the client behind Burson-Marsteller's efforts. Once Lyons' story hit the web, interest in what had happened exploded. Burson-Marsteller announced that, in essence, the problem was all Facebook's fault, and that it would no longer work for Facebook. Facebook announced that it hadn't instructed Burson-Marsteller to plant the Google story in the way the PR firm attempted, but that it (Facebook) was justified in its actions, in large part because Google was mining Facebook's own data in order to offer Social Circle.

Industry observers were expecting both Burson-Marsteller and Facebook to fire some of their employees as a result of the fiasco, but it hasn't happened so far. In fact, Burson-Marsteller has announced that it won't fire anyone, but will give the involved employees "additional ethics training." The question is, why not fire them? The probable reason is that any employee who Burson-Marsteller fired would file suit against the company for wrongful termination. The discovery process for the suit would in turn reveal that Burson-Marsteller management approved the plan, and possibly even conceived of it in the first place. It would further reveal that Burson-Marsteller runs these kinds of campaigns for its clients on a regular basis, and that the problem with this particular campaign wasn't that it violated the company's ethical standards as actually practiced, but that the company and its client got caught.

But what about the ex-journalists who executed the scheme, Jim Goldman and John Mercurio? They were almost certainly hired specifically for their journalism experience and connections. Goldman was most recently CNBC's Silicon Valley reporter, and worked for a number of publications and networks over the years, including the San Jose Mercury News, Silicon Valley's hometown newspaper. Mercurio was most recently the Executive Editor of the National Journal's Hotline, and prior to that was Political Editor for CNN; Burson-Marsteller itself claims that Mercurio has more than 20 years of journalism experience.

Shouldn't Goldman's and Mercurio's decades of journalistic experience have led them to conclude that the campaign they were executing on behalf of Facebook was unethical? And if it didn't, will a few hours of ethics training make either man any more ethical? (Update, May 18, 2011: According to Burson-Marsteller's own website, Jim Goldman got a B.A. degree from Brown University in Ethics in Political Journalism and Political Philosophy. I guess the saying that "most people don't retain much of what they learned in college" must be true, at least in Goldman's case.)

My belief is that neither man objected because Burson-Marsteller's tactics are, in fact, common practice. They no doubt were on the receiving end of such campaigns, and quite possibly actively participated in them as journalists. They saw no ethical conflict because that's how things are actually done.

The reputation of the journalism profession in the U.S. has fallen to the level of used car salespeople. If you want to know why, you need to go no further than to consider how many other reporters and editors with Goldman's and Mercurio's ethical standards, or lack thereof, are writing the news that you're reading, listening to and watching. If those two men are in any way representative of the journalists employed by this country's media outlets, the journalistic profession deserves its reputation.

As for Facebook, no one should be surprised that it would be behind a smear campaign, and that it would react with righteous indignation when asked to apologize for its conduct. Anyone who has followed Facebook knows that the company has a habit of regularly changing its privacy controls and rules to make it harder for users to control how much of their information is made public, and of positioning users' losses of privacy as "improvements" until third parties reveal the truth. To say that Facebook is "ethically challenged" would be an understatement.

Given all that, I have no expectation that this fiasco, compared by Dan Lyons to a Keystone Kops episode, will result in any meaningful changes at either Burson-Marsteller or Facebook, other than that both companies will work harder not to get caught in the future.


Wednesday, May 11, 2011

Want to know why Xoom and PlayBook are struggling? Watch their ads

It's well-known that sales of Motorola's Xoom Android tablet have been disappointing. RIM hasn't yet announced any sales figures for its PlayBook tablet, but the early word is that its sales are also very slow. If you want to know the reason for both products' struggles, all you have to do is look at their television ads and compare them to those from Apple for the iPad 2. I'm not talking about the "artistic" value of the ads--I'm talking about the content, and what that content says about the products.

Motorola and Verizon have two Xoom ads on U.S. television; the content of both is similar, but I'll focus on the more current version. It starts with a man with an angry/aroused look on his face breaking a notebook computer into four parts, which turns into a Xoom. Then he holds the Xoom in front of himself, still with that angry/aroused look. The commercial switches to a close-up of the screen, and a hand moving quickly between a movie, mail, a game, a video call, another movie...you get the picture. Then it finishes with Angry/Aroused Man holding the Xoom in front of himself again.

The PlayBook ad dispenses with Angry/Aroused Man--all it shows is a close-up of the screen and a hand moving between various windows: A movie. Some images. A game. Another movie. Yet another movie. And then, the PlayBook tagline. It doesn't show a single business-oriented application, not even email, even though that's RIM's strength.

Compare that with Apple's long-running campaign for the iPad, and now the iPad 2. Apple's commercials show apps. Every commercial shows a different set of apps, for education, entertainment, medicine, business and so on. And that's the key to why the iPad 2 continues to sell extremely well, and the Xoom and PlayBook are struggling.

Both Motorola and RIM released their products well before they were ready. In the Xoom's case, the Honeycomb version of Android itself was rushed out, and developers didn't have sufficient time to build tablet-aware apps. The PlayBook shipped without native email, calendar and directory apps. That functionality is supplied by a user's BlackBerry phone, but for whatever reason, RIM decided not to show it. (If you're not a BlackBerry user, the only way to get that functionality today on a PlayBook is with web applications.)

Apple focuses on all the different ways in which an iPad 2 can be used. Motorola and RIM, without the library of tablet apps that Apple has, are focusing on eye candy. Both companies appear to have approached the tablet market in much the same way as PC manufacturers approach the PC market: Their job is to supply good hardware. The operating system is taken care of by Microsoft, and the applications are taken care of by everyone else, so they focus on the hardware. Motorola and RIM, by and large, got the hardware right: Big, viewable displays, fast dual-core processors, front and back cameras, etc. RIM, with its QNX acquisition, got the operating system right, while Motorola relied on Google, which didn't quite get there with Honeycomb.

What neither Motorola nor RIM got right was the apps, and that's the point of differentiation for tablets. The lesson for tablet manufacturers: If all you can show in your television ads are movies and games, you're not ready to ship.



Tuesday, May 10, 2011

Google and partners commit to guaranteed Android updates

Google's I/O 2011 conference opened this morning in San Francisco, and the company made a number of announcements, including a new version of Honeycomb (3.1) for both tablets and Google TV, a preview of Ice Cream Sandwich (4.0), which is scheduled for release late this year, a new music service, specifications for interfacing to a variety of hardware devices, and a home automation initiative. However, perhaps the most important announcement was made concerning a policy for device updates.

One of the biggest drawbacks of the Android platform, for both users and developers, has been the lack of an official policy on the version of Android that devices ship with, and on when (or if) existing devices get updated to new versions. As of today, Google has finally addressed the problem in partnership with some (but not all) of the members of its Open Handset Alliance. Vodafone, all four national U.S. mobile carriers (Verizon, AT&T, Sprint and T-Mobile), and handset manufacturers Samsung, HTC, LG, Motorola and Sony Ericsson have committed to provide timely upgrades for 18 months from the date that a new Android device is released, so long as the device is capable of supporting the upgrades.

It's not yet clear what "timely" means, but it's likely to mean anywhere from a few weeks to a few months after a new version of Android is released. Also, the 18-month period starts when a new device is released, not when a consumer purchases it, so this policy will encourage consumers to purchase Android devices shortly after they're released (and penalize those who buy a device fairly late in its life). In addition, it still doesn't compare to Apple's track record for iOS upgrades.
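The key subtlety, that the 18-month clock runs from the device's release date rather than from the date of purchase, can be sketched in a few lines of Python. This is purely an illustrative model of the policy as described above; the function name, the ~30-day month approximation and the example dates are my own assumptions, not anything Google has published:

```python
from datetime import date, timedelta

# Hypothetical model: the upgrade window is measured from the device's
# *release* date, not from when a particular consumer bought it.
UPGRADE_WINDOW = timedelta(days=18 * 30)  # roughly 18 months

def still_supported(release_date: date, today: date) -> bool:
    """True if the device is still inside its 18-month upgrade window."""
    return today <= release_date + UPGRADE_WINDOW

# A device released in May 2011 is still covered in early 2012...
print(still_supported(date(2011, 5, 10), date(2012, 1, 1)))   # True
# ...but someone who buys that same device in late 2012 has no window left.
print(still_supported(date(2011, 5, 10), date(2012, 12, 1)))  # False
```

Both calls use the same release date; only "today" changes, which is exactly why late buyers are penalized under this policy.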

Nevertheless, this new policy is a big win for both consumers and developers. Consumers can purchase with reasonable assurance that their devices will remain current for at least 18 months, and developers can have a reasonable assurance that apps written for the latest version of Android will be adopted by customers much more quickly than in the past.

Friday, May 06, 2011

New media has to break its addiction to old media

People have been trying to turn the Internet into a new medium that can compete on an equal footing with television, radio, newspapers, etc. since the Netscape days of the mid-1990s. So, fifteen years on, what have we accomplished?
  • Netflix has more subscribers than Comcast, but it lives or dies based on which television networks, cable networks and movie studios are willing to do business with it, what shows they're willing to supply, when they're willing to supply them and at what cost.
  • Hulu has much the same problem, even though it's owned by three of the four major U.S. television networks.
  • YouTube is trying to cut distribution deals with many of the same television networks, cable networks and movie studios as Netflix and Hulu.
  • Pundits spend an inordinate amount of time discussing how much The New York Times and The Wall Street Journal are charging for access to their newspapers online, whether paywalls work, how to circumvent paywalls, etc.
  • Hearst, Condé Nast and Time Warner will offer their eMagazines on the iPad if only they can get a business deal worked out with Apple. Meanwhile, News Corporation's "The Daily" is on the iPad and is losing money.
  • Clear Channel is building its own clone of the Pandora streaming music service and plans to launch it this summer.
The "new media" has largely become a repackaging of old media for Internet delivery: Old wine in new bottles. Almost all of the content on the Internet that's economically viable comes from old media companies.

In order for content to be economically viable, it has to have two key attributes:
  1. It has to attract a large audience, and
  2. It has to be repeatable--audiences have to be willing to come back day after day, week after week
Content that repeatably attracts large audiences can be sold to national advertisers, which generates the revenues necessary to create more content and make the business attractive to investors. Viral videos, like those found on YouTube, meet the first criterion: A popular viral video can get millions of views. The problem is that they're not repeatable. The vast majority of viral videos are "one-hit wonders". Google has found that it's possible, but very difficult, to sell advertising against viral videos. Many advertisers don't want their ads to run alongside "objectionable" content, yet it's that same objectionable content that makes many videos go viral.

On the other hand, webcast networks like TWiT and Revision3 get audiences that come back week after week for original shows, but the audiences aren't big enough to generate a lot of advertising revenue. They make enough money to make a nice living for a few people, but not enough to attract investors.

That's why new media companies keep turning to old media companies to get their content. The problem is that old media companies don't want to risk their existing revenue streams, even if those revenue streams are already being eroded. If you're an Internet company and your business plan depends on convincing old media companies to license their content to you, you're starting with two strikes against you. Even worse, your biggest suppliers are in a position to become your biggest competitors, if they aren't already competing against you.

New media companies have to break their dependence on old media, and the only way to do that is to produce original content in new forms that old media companies can't, or won't, duplicate.

Tuesday, May 03, 2011

Thunderbolt strikes Apple's iMacs

As I forecast last year (albeit a month or so later than predicted), Apple has launched its line of Sandy Bridge-based iMacs. The four models range in base price from $1,199 to $1,999. Two displays are available, 21.5" and 27", and the models are further differentiated by their base amount of RAM, hard disk space (with or without SSDs) and graphics cards. As with the current MacBook Pros, Apple has gone "all in" with AMD and only offers Radeon 67XXM (Mobile) GPUs. The base processor for all the iMacs is Intel's quad-core Core i5 (no more dual-core models), and the top-of-the-line model in either size can be equipped with a quad-core Core i7 for an extra $200.

The 21.5" models come with a single Thunderbolt/Mini DisplayPort connection, while the 27" models come with dual Thunderbolt/Mini DisplayPort connections. The 21.5" models support 8GB of RAM, and the 27" models support 16GB. Studio Daily "tricked out" the top-of-the-line 27" model with an i7 processor, 16GB of RAM (keep in mind that third-party RAM is much less expensive than Apple's), a 2TB hard drive and a 256GB SSD, and came up with a price of $3,818.00. That's not the bargain that the $1,999 base model is, but it's still a nice savings over a comparably equipped Mac Pro.

The full power of the new iMacs will be unlocked over the next few months, when peripheral manufacturers start delivering Thunderbolt-compatible devices and Apple ships Final Cut Pro X. AJA Video and Blackmagic Design displayed Thunderbolt-compatible video capture devices at NAB, while Promise Technology displayed Thunderbolt-compatible RAID arrays and LaCie showed individual Thunderbolt hard drives and SSDs. You might want to wait until some real-world benchmarks and reviews are published before buying a new iMac, to see if Thunderbolt's throughput and flexibility are really sufficient for high-end video editing.