Investor Place has published a summary of a ChangeWave Research survey showing that iPad owners, as a percentage of all eBook reader owners surveyed, doubled from 16% in August 2010 to 32% in November, while the percentage who say they own an Amazon Kindle declined from 62% to 47% over the same period. The simple reading is that the iPad is growing rapidly and stealing overall eBook reader market share from the Kindle, but that's not the whole story.
It's not surprising that the Kindle's market share is declining overall, and will decline further in 2011 as Android tablets that actually work well as tablets, rather than overgrown smartphones, reach the market. (Given that Amazon is widely rumored to be working on its own Android tablet, that might be a shift it encourages, especially if it makes more money per unit on its tablets than its Kindles.) There's no doubt that the iPad is outselling the Kindle, and every iPad can be used as an eBook reader.
Elsewhere in the survey, ChangeWave asked people who don't already own eBook readers but plan to buy one in the next 90 days which one they prefer: 42% of respondents chose the iPad, while 33% chose the Kindle. That's not great for Amazon, but it's not disastrous, either. One question the survey leaves out, however, is what software iPad owners are using to read eBooks. Chances are that a majority of them use Amazon's Kindle reader on the iPad as one of their eBook reader applications, if not the primary one.
That's where Amazon wins either way: If a customer purchases a Kindle, they're locked into the Amazon eBook ecosystem, and Amazon makes money on both the hardware and the eBooks. Even if a customer purchases an iPad, Amazon still makes money so long as the customer installs the free Kindle reader and purchases eBooks from Amazon. So the Kindle's declining overall market share isn't necessarily bad news for Amazon.
Tuesday, November 30, 2010
Sunday, November 28, 2010
The Feldman File's videoblog is on vacation this week
Because of last week's U.S. Thanksgiving holiday and a "slow news week", the Feldman File's videoblog is taking the week off. It will return next weekend.
Saturday, November 27, 2010
Opportunities for angel investors and VCs in "Flyover Country"
Earlier this week, Technori published a two-part article that I wrote for it on "Angels in Chicago". The first part explained the concept of angel investing and introduced readers to four active angel investors in Chicago, and the second part discussed the problems that angels see in funding proposals, compared angel investing to incubators, and offered some suggestions for entrepreneurs.
There are a surprising number of angel investors in Chicago. AngelList names only 16 Chicago angels, but Hyde Park Angels, the biggest angel investing group in the city, has 76 members, only a handful of whom are on AngelList. Other angel groups, like Wildcat Angels, are just getting started. And Groupon notwithstanding, the angels I interviewed said that the valuation inflation and competition for deals that investors are seeing in Silicon Valley and New York isn't happening in Chicago.
Terry Howerton, the head of the Illinois Technology Association, believes that the lack of VCs in Chicago is a bigger problem for startups than the number of angel investors. He said that there are only three or four VCs in the city doing Series A or later rounds, so VCs from other cities in the Midwest come to Chicago to look for deals. There's an enormous opportunity for angels and VCs outside Chicago to do deals here.
According to Bankrate.com, the cost of living in Silicon Valley is more than 30% higher than Chicago, while New York/Brooklyn is more than 50% higher, and New York/Manhattan is almost 80% higher. That means that a startup's labor costs (as well as taxes and some other operating costs) are going to be dramatically less in Chicago and other Midwestern locations. In addition, there isn't the competition for deals and valuation inflation that's occurring in Silicon Valley or New York.
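To make the cost gap concrete, here's a rough sketch. The multipliers are the lower bounds of the Bankrate.com differentials quoted above, and the $70,000 Chicago salary is purely a hypothetical figure for illustration:

```python
# Rough illustration of how the Bankrate.com cost-of-living gaps quoted
# above translate into labor costs. The $70,000 Chicago base salary is a
# hypothetical figure, not from the article.
chicago_salary = 70_000
multipliers = {
    "Chicago": 1.00,
    "Silicon Valley": 1.30,      # "more than 30% higher"
    "New York/Brooklyn": 1.50,   # "more than 50% higher"
    "New York/Manhattan": 1.80,  # "almost 80% higher"
}

for city, m in multipliers.items():
    print(f"{city}: ${chicago_salary * m:,.0f}")
# Chicago: $70,000 ... New York/Manhattan: $126,000
```

At those rates, even a five-person engineering team costs roughly $280,000 more per year in Manhattan than in Chicago, which is the kind of difference that extends a seed round by quarters, not weeks.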
Chicago has become a center of startup activity, especially for online services, as well as vertical B-to-B applications (financial, real estate, travel, etc.). There's a real opportunity here for investors, especially if you keep the companies in Chicago (rather than moving them to Silicon Valley or New York) and take advantage of the area's lower costs.
Labels:
Angel investor,
AngelList,
Chicago,
Hyde Park Angels,
Illinois,
Silicon Valley
Thursday, November 25, 2010
Alas, poor Stringer: Regime change underway at Sony?
Bloomberg News reported earlier today that Sony is seeking a new President. Sir Howard Stringer, who currently is Chairman, CEO and President of the company, is relinquishing the President position, and the primary candidates are senior executives at Sony who came out of the Engineering organization. According to Bloomberg, the new President will most likely succeed Stringer in running the company.
Few people would call Howard Stringer's reign at Sony successful. In the almost 5 1/2 years since Stringer took over, Sony's stock has declined 25%, while the overall Nikkei 225 average fell half as much. Sony has lost market leadership in almost every major product segment that it once dominated: Apple took over the portable audio player business, Samsung took over in televisions, and Nintendo regained its market leadership in videogame consoles. Sony, which was once the company that others copied, found itself copying its competitors in order to survive: the PlayStation Move is a slightly improved copy of Nintendo's Wiimote, Sony's digital camera business has had to copy Canon, and Sony's prosumer and professional camcorder businesses are copying Panasonic's designs. For all of Stringer's boasting about Sony's leadership in 3D, both Samsung and Panasonic delivered 3D HDTVs before Sony, and Panasonic is shipping 3D digital cameras and camcorders before Sony.
To understand why Stringer was brought to Sony, you need to understand events starting in the late 1970s. In the so-called "Betamax" case, members of the Motion Picture Association of America went to court to block sales of Sony's Betamax home videocassette recorders, on the grounds that they would facilitate content theft. Losing the case would not only have forced Sony to withdraw its Betamax recorders from the U.S. market--it would have crippled future product opportunities. Sony won the case in the U.S. Supreme Court in 1984, but the company realized that it had to become a major player in the media business in order to ensure a reliable supply of content for its devices.
To get into the movie business, Sony acquired Columbia Pictures from Coca-Cola in 1989. In addition, Sony had long been partners with CBS Records in the Japanese market, but in 1987, Sony acquired CBS' worldwide music business, and then Bertelsmann's music business in 2008. These moves made Sony one of the world's largest motion picture and music companies, as well as a major player in U.S. television syndication.
Howard Stringer joined Sony after a 30-year career at CBS (one of the four major commercial broadcast television networks in the U.S.), where he spent most of his career in the News division before running the entire company for seven years, followed by a two-year stretch at an ill-fated joint venture called Tele-TV. Stringer was appointed President of Sony Corporation of America in 1997, made a corporate board member a year later, and appointed Chairman and CEO of Sony Corporation in Japan in 2005. Four years later, he took over the President position as well. Sony's goal in bringing Stringer to its corporate offices was to strategically unite the company's electronics and media businesses.
Stringer came to Sony with two strikes against him: First, he was a gaikokujin (foreigner), which is a problem with any Japanese company, and second, he had no engineering background or experience, which was a huge problem at Sony, a very engineering-driven company. Sony's board was perhaps hoping that Stringer would be to the company what Carlos Ghosn was to Nissan Motors. Ghosn, the Brazilian-born engineer who was at the time in charge of engineering at Renault, was sent to run Nissan by Renault (Nissan's largest shareholder) to try to rescue the company. Not only did Ghosn turn Nissan around, he became a Japanese folk hero in the process.
Ghosn moved to Japan, learned to speak fluent Japanese, and communicated with the Nissan team in the universal language of engineering. He adapted to the organization and culture, and gained the respect of both his team at Nissan and all of Japan. Stringer, on the other hand, has no engineering background, and it's unclear whether he understands consumer electronics any better now than he did the day he joined Sony. He never learned Japanese, kept his primary office in New York City, and kept his family in London.
It may have been unrealistic for Sony to place such expectations on Stringer, but the company also should have seen the problems much sooner. Without a respected manager running the company, individual divisions within Sony went in their own directions. Stringer took over the President position in 2009 to try to pull the company back together, both strategically and tactically. He was largely successful, but he has been unable to provide a long-term strategic blueprint for the company's engineers. As a result, Sony is largely "rowing in circles" rather than pursuing a strategic plan to regain market leadership.
Americans who know Howard Stringer speak very positively about him, and he's well-liked in the media industry. Stringer wasn't a bad manager, but he was the wrong person for the job. Had he understood Sony's culture or Japanese culture better, he might have demurred at taking the Chairman and CEO positions. The good news is that, while Sony may never recover leadership in all the markets that it once dominated, it still has a very strong engineering organization that's capable of developing great products. It just needs someone running the company who can tell it what those great products should be.
Sunday, November 21, 2010
Episode 4 of the Feldman File videoblog is live!
It's Sunday night, and that means that I've posted a new episode of the Feldman File videoblog on YouTube! Here's the rundown for this week's edition:
- Apple's less-than-earthshaking announcement about adding the Beatles' music catalog to iTunes
- Sony follows up its Super 35mm camcorder, the PMW-F3, with yet another Super 35mm camcorder, the 35mm NXCAM
- U.S. cable operators lose 741,000 subscribers in Q3--are consumers really cutting the cord?
- The Obama Administration is looking for 500MHz of additional broadband bandwidth, and the National Telecommunications and Information Administration found 2.2GHz of bandwidth that could be made available within ten years. So, do we really have a bandwidth shortage?
Labels:
35mm NXCAM,
apple,
Beatles,
Broadband Internet access,
Cable,
Direct broadcast satellite,
IPTV,
ITunes,
Sony
Saturday, November 20, 2010
Microsoft (finally) does the right thing with Kinect
Earlier this month, shortly after Microsoft's Kinect shipped, a group of open-source hardware developers called Adafruit Industries offered a $1,000 bounty to the first person to write and release open-source drivers for the device. In response, Microsoft told CNET: "Microsoft does not condone the modification of its products. With Kinect, Microsoft built in numerous hardware and software safeguards designed to reduce the chances of product tampering. Microsoft will continue to make advances in these types of safeguards and work closely with law enforcement and product safety groups to keep Kinect tamper-resistant." Adafruit responded by raising its bounty to $2,000, and then to $3,000.
To an extent, Microsoft's position was understandable--the Kinect was intended to drive sales of Xbox 360s and Xbox games, and a Kinect sold for some other purpose wouldn't generate additional revenue for Microsoft. Also, if the number of Kinects available for the holiday season is limited, Microsoft might not have enough to meet demand. However, the ham-handed way that Microsoft went about threatening anyone who dared write a driver for the Kinect, which at the end of the day is simply a USB 2.0 device, backfired.
Less than a week later, Spanish developer Hector Martin wrote and released his open-source driver and won the Adafruit bounty. Since then, other developers have begun adapting the Kinect for a variety of applications, such as 3D video, contactless measurement, and the vision system for a robot. It has great potential in education, machine vision, communications and a variety of other applications.
Yesterday, after the EFF weighed in, Microsoft apparently "saw the light." CNET reports that a Microsoft representative on NPR's Science Friday said that the Kinect was left open by "design", and a tweet from the Science Friday account stated that "(Xbox director of incubation) Alex Kipman says Kinect interface was left unprotected 'by design.' [And Microsoft's] Shannon Loftis says she's 'inspired' by community finding new uses." Adafruit Industries replied on its blog with "Congrats to everyone in the open source community, in about one week we turned 'work closely with law enforcement' to 'inspired' by community finding new uses for Kinect."
Labels:
Adafruit Industries,
EFF,
Kinect,
Microsoft,
Science Friday
Thursday, November 18, 2010
Post on Customer Development and Product Management
I've written an article for Solvate on how Product Management can benefit startups that use Customer Development. Solvate is a service that enables businesses of all sizes to find professional talent "on-demand". The article is titled "Product Management in an Agile World", and you can read it on the Solvate site.
Labels:
Agile,
Customer Development,
Product management,
Solvate
Wednesday, November 17, 2010
U.S. pay TV subscribers fall for a second straight quarter
SNL Kagan is reporting that the number of pay TV subscribers in the U.S. has fallen for the second straight quarter, the first time that's happened since Kagan first started tracking the industry in 1980. Cable operators lost 741,000 cable-only subscribers, the largest quarterly loss for cable ever measured by SNL Kagan, while IPTV operators (primarily Verizon and AT&T) gained 476,000 subscribers. Together, satellite operators Dish and DirecTV gained 145,000 subscribers for the quarter. The net loss for all subscription television services was 119,000 subscribers.
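As a quick back-of-the-envelope check (my own arithmetic, not part of the SNL Kagan report), the per-segment figures reconcile to the reported net loss:

```python
# Reconciling the Q3 2010 SNL Kagan figures quoted above
# (all values in thousands of subscribers).
cable = -741     # cable-only subscribers lost
iptv = 476       # gained by IPTV operators (primarily Verizon and AT&T)
satellite = 145  # gained by Dish and DirecTV combined

net = cable + iptv + satellite
print(net)  # -120
```

The components sum to a 120,000-subscriber loss versus the reported 119,000; the 1,000-subscriber gap is simply rounding in the per-segment figures.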
Multichannel News points out one of the big reasons for cable's decline: When analog over-the-air broadcasts were phased out in the U.S., cable operators in particular offered very enticing offers to over-the-air households to get them to adopt cable. Those deals are expiring or have already expired, and former over-the-air television viewers are facing big increases in their cable rates. In addition, the price difference between the "basic cable" tier, which is closest to conventional over-the-air TV, and even the cheapest premium tier can be substantial. Multichannel News gives the example of a Comcast cable system that goes from $13.65/month for "lifeline" service to $62.60/month for its "Digital Starter" service.
IPTV and satellite operators have targeted price-sensitive cable subscribers with low-cost service and, in the case of IPTV, triple-play (video, high-speed Internet and telephone) packages priced below the "magic" $99/month number. That explains the gains by IPTV and satellite providers, but it doesn't explain the whole picture. Television Broadcast quotes SNL Kagan senior analyst Ian Olgeirson: "... it is becoming increasingly difficult to dismiss the impact of over-the-top [Internet] substitution on video subscriber performance, particularly after seeing declines during the period of the year that tends to produce the largest subscriber gains due to seasonal shifts back to television viewing and subscription packages."
Economic conditions may be driving the changes, but the changes are real, and they may not be temporary. Unless cable operators get a lot more competitive in how they price and package their services, they're going to be increasingly vulnerable to IPTV and satellite providers, as well as over-the-top Internet video.
The 35mm NXCAM: Another Sony Super 35mm camcorder?
The Japanese InterBEE broadcast electronics conference is underway, and Sony is showing what's being called the "35mm NXCAM" camcorder. Unlike the PMW-F3 that was announced last week, this new model looks more like a prosumer version of the NEX-VG10 consumer camcorder. According to Alister Chapman, it has a Super 35mm sensor, a flip-out monitor, and modular attachment points for a handgrip and microphone, so the camcorder can be configured to meet the user's needs. (Shades of Red, anyone?) It also has a cavity at the rear of the camera that's too big for a battery alone, and may be another modular attachment point for a solid-state memory module and/or an HD-SDI output.
Notes on Video is reporting that a Sony Electronic Marketing Manager has tweeted the following specifications for the new camcorder:
- Large-format sensor: Super 35mm is over 20% larger than Micro 4/3
- Existing and future E-mount lenses attach directly. Alpha A-mount lenses via LA-EA1 adapter
- Designed from the ground up for "motion picture capability"
- From the design team that brought you the HXR-NX5u
Broadcasters and Cable Operators: Hypocrites on the Hill
Broadcasters and cable operators are facing off today in hearings at the U.S. Congress over compensation for broadcast retransmission rights and the ability of broadcasters to withhold their programming from cable, satellite and IPTV service providers. The broadcasters are being represented by Fox/News Corp. and Univision, and the cable operators by Cablevision and Time Warner Cable.
Today's hearings were triggered by the standoff between Fox and Cablevision that led to Fox's television stations and most of its cable channels being unavailable to Cablevision subscribers for almost two weeks. The broadcasters, led by Chase Carey of News Corporation, want the government to keep out of the negotiations and impose no requirements for binding arbitration. The cable operators want broadcasters to be required to make their programming available so long as negotiations are continuing, and want binding arbitration at a minimum, if not outright controls on the prices that broadcasters can charge for retransmission rights.
Let's take the broadcasters' side first. They don't want any government interference in or controls on their negotiations. However, their right to set prices for and control retransmission of their programming was established by the U.S. Government in the Cable Television Consumer Protection and Competition Act of 1992. Prior to that, they had no choice but to provide their programming to any cable operator who wanted it and was willing to pay the U.S. Copyright Office for the right to use it. If the government hadn't "interfered", broadcasters wouldn't have the rights that they now don't want the government to interfere with.
In addition, other than a modest fee for a license issued by the U.S. Government, broadcasters don't pay a penny for the bandwidth that they use. If they had to pay the true market value for the bandwidth they use, broadcasters might have a stronger argument, but they're getting the bandwidth that makes their businesses possible for free.
Now, consider the cable operators. The rates that consumers pay for cable service have been going up steadily for years, faster than the rate of inflation, even before the current round of retransmission negotiations. Cable operators have managed to rid themselves of most local controls over their pricing, and they steadfastly refuse to implement a la carte pricing, which would allow consumers to pay for only the channels that they want to watch. The result is that cable (as well as satellite and IPTV) subscribers are forced to pay for dozens of channels that they never watch and wouldn't miss if they weren't available.
Broadcasters want the U.S. Government to subsidize their bandwidth and give them the right to charge for their programming, but they don't want government interference in their negotiations with cable operators. Cable operators plead poverty but have been raising rates for years, and refuse to give their customers the right to pay for only the channels that they want to watch. Both sides are hypocrites.
Labels:
cable television,
Cablevision,
Fox,
News Corporation
Tuesday, November 16, 2010
Insights from "Unstoppable": Based on the novel "Derailed" by Sapphire
I watched Tony Scott's movie "Unstoppable" last night, and learned a number of things:
- Never let fat, dumb guys drive a train.
- The train that the fat dumb guy is driving will inevitably be loaded with explosive materials.
- Television stations from Pittsburgh have an infinite number of ENG trucks.
- Even with infinite numbers of ENG trucks, you can never have enough helicopters.
- You know those fat, dumb guys? Don't promote them into senior management.
- Corollary to the previous: Executives always forget that they have insurance when adding up the costs of a disaster.
- Avoid putting a bending overhead railroad trestle right next to a dozen tanks full of fuel oil in the middle of a city.
- Fat, dumb guys notwithstanding, Pennsylvania railroads have the best-looking employees.
- While people from Pennsylvania divide the state into "Western", "Central" and "Eastern", movies about runaway trains in Pennsylvania divide the state into "Northern", "Central" and "Southern".
- The best way to get your wife to lift a restraining order against you is to keep a runaway train from careening off an overhead railroad trestle into a dozen tanks of fuel oil.
Monday, November 15, 2010
Bandwidth shortage? What bandwidth shortage?
Television Broadcast magazine reports that the U.S. National Telecommunications and Information Administration (NTIA) released a report today that identifies 2,200MHz of potentially available over-the-air bandwidth for the National Broadband Plan. The Obama Administration and Federal Communications Commission are trying to free 500MHz of bandwidth for broadband services, and they're doing so by moving existing services such as wireless microphones, authorizing new unlicensed services to use the "white space" between television channel allocations, and convincing television broadcasters to voluntarily relinquish portions of their licensed channels for broadband use. Every action so far has resulted in a hail of criticism from the affected parties.
The NTIA report identified 155MHz that can be opened up for commercial broadband use within five years, and another 2.1GHz that could be available within ten years. The NTIA recommends taking 120MHz from television broadcasters in the VHF and UHF bands, another 160MHz from other existing commercial services, and 500MHz from bandwidth currently used for C-band and ship-based satellite receivers. 640MHz would come from bandwidth currently reserved strictly for Federal use, and another 814MHz would come from bandwidth currently shared by the U.S. Government and private users.
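As a sanity check, the individual allocations in the NTIA report add up to roughly the 2,200MHz headline figure (a quick sketch; the category names are paraphrased from the summary above, not taken from the report's own tables):

```python
# Sanity-check the NTIA bandwidth figures reported above (values in MHz).
allocations = {
    "TV broadcasters (VHF/UHF)": 120,
    "Other commercial services": 160,
    "C-band / ship-based satellite": 500,
    "Federal-only bandwidth": 640,
    "Shared Federal/private bandwidth": 814,
}

total = sum(allocations.values())
print(f"Total identified: {total} MHz")              # 2234 MHz, i.e. the ~2,200MHz cited
print(f"Goal coverage: {total / 500:.1f}x the 500MHz goal")
```

In other words, the report identifies more than four times the bandwidth the FCC actually needs.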
Even if the NTIA is being very optimistic about the amount of bandwidth that can be made available within ten years, there's clearly more than enough to meet the 500MHz goal without involving television broadcasters at all. Given that it increasingly looks as though the Obama Administration has only two more years, time is running out for the FCC to implement its vision of a broadband future. By pursuing sources other than television broadcasters, the FCC could dramatically lessen industry resistance to its broadband plans.
AT&T will sell the Galaxy Tab for $650?
According to Engadget, AT&T has finally announced its pricing and availability for Samsung's Galaxy Tab Android tablet. AT&T carefully evaluated the prices announced by Verizon, Sprint and T-Mobile, and in an effort to be competitive, priced its version of the Galaxy Tab at $650, $50 more than anyone else. AT&T stores will make it available on November 21st. To be fair, AT&T customers can use the same no-contract 250MB/$15/month and 2GB/$25/month data plans that the company offers for the iPad. However, given that the reviews for the Galaxy Tab have been far from positive (even David Pogue, who perhaps gave it the most glowing major publication review, said that it's too expensive at $599), AT&T's price might not matter all that much.
U.S. Cellular, a regional wireless carrier based in the Chicago area, also announced pricing and availability: $399 for a two-year contract and $599 without contract. Data rates are 200MB/$14.99/month and 5GB/$54.99/month. U.S. Cellular stores will have the Galaxy Tab available for sale on November 19th.
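To put the two carriers' plans on a common footing, here's the effective price per gigabyte at each plan's cap (a rough sketch using the advertised prices above; overage charges and billing rounding are ignored):

```python
# Effective price per GB for the Galaxy Tab data plans mentioned above.
# (plan name, cap in GB, monthly price in dollars)
plans = [
    ("AT&T 250MB",          0.25, 15.00),
    ("AT&T 2GB",            2.00, 25.00),
    ("U.S. Cellular 200MB", 0.20, 14.99),
    ("U.S. Cellular 5GB",   5.00, 54.99),
]

for name, cap_gb, price in plans:
    print(f"{name}: ${price / cap_gb:.2f}/GB")
```

By this measure, AT&T's 2GB plan is the better mid-tier deal, while U.S. Cellular's 5GB plan is the cheapest per gigabyte for heavy users.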
My recommendation is to wait until Google formally releases the Gingerbread version of Android (which will officially support tablets), and Samsung (and others) start shipping tablets with Gingerbread installed early next year.
Saturday, November 13, 2010
Episode 3 of The Feldman File Videoblog
Episode 3 of The Feldman File videoblog has been posted below--if you can't view it here, you can watch it on YouTube. This week's episode covers:
- Sony U.K.'s announcement of the PMW-F3 "budget" professional camcorder with Super 35mm sensor
- Boxee's introduction of the Boxee Box, the forthcoming version of Boxee's software and new content partnerships
- Kno's announcement of prices and an availability date for its single- and dual-screen eBook readers for the higher education market
Thursday, November 11, 2010
You may not be able to see Clear(ly) for much longer
Clearwire, the U.S. wireless broadband provider, is fighting for its life. The company offers WiMAX service under the Clear brand, both directly and through Sprint. Even as the company was announcing record subscriber growth last week, it also announced a 15% layoff, delays in opening up the Denver and Miami markets, dramatic slowdowns in the number of new retail stores to be opened, and a virtual shutdown of its advertising and promotion efforts. In addition, executives from Sprint, Clearwire's largest investor, resigned from the company's board of directors.
Now comes news that Sprint has initiated arbitration proceedings with Clearwire over the amount of money that Sprint has to pay Clearwire for use of that company's 4G mobile phones. According to FierceTelecom, Clearwire claims that several hundred thousand 4G phones are being used in areas with no 4G coverage, and that Sprint is supposed to make monthly payments to Clearwire for every 4G phone it sells, whether or not it's used in an area that supports 4G. Sprint disagrees and has initiated arbitration. Sprint charges its subscribers $10/month extra for the 4G phones it sells, whether or not the 4G service is used, and Clearwire is apparently claiming some or all of that $10 fee.
This comes on the heels of yet another story suggesting that Sprint might invest additional money in Clearwire. However, it's difficult to see how motivated Sprint is to invest more money in Clearwire if it can't be bothered to make monthly payments of no more than a few million dollars for the right to use all its 4G phones on Clearwire's network.
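The scale of the disputed payments is easy to estimate from the numbers in the story (the 300,000-phone count is illustrative, standing in for "several hundred thousand", and the calculation assumes Clearwire is claiming the full $10 surcharge):

```python
# Rough scale of the disputed Sprint-to-Clearwire payments:
# "several hundred thousand" phones at up to $10/month each.
phones = 300_000          # illustrative stand-in for "several hundred thousand"
fee_per_phone = 10.00     # Sprint's monthly 4G surcharge, assumed claimed in full
monthly = phones * fee_per_phone
print(f"${monthly / 1e6:.1f}M per month")
```

A few million dollars a month is real money, but it's small next to the scale of investment Clearwire actually needs.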
In more normal economic times, the most likely outcome would be for Sprint to purchase 100% of Clearwire and fold it into Sprint's operations. However, Sprint doesn't appear to want to do that. What may actually be happening is that Sprint is looking for a strategy for transitioning to LTE, which is being adopted by all major U.S. carriers and is widely assumed to be the eventual replacement for WiMAX, even for Clearwire. Sprint needs a 4G offering as a differentiating advantage until it gets LTE up and running, and for that it needs Clearwire. However, that advantage isn't worth acquiring all of Clearwire.
Therefore, even with the arbitration, the most likely outcome is that Sprint will invest enough in Clearwire to keep it afloat with no new markets or major capital investments until Sprint gets LTE running nationwide. After that, Clearwire will be on its own.
If you're thinking about buying a Sprint 4G phone or Clear's service and equipment, you may want to wait. Verizon will launch its LTE service later this year, AT&T will follow soon after in 2011, and by this time next year, WiMAX may be a footnote in wireless history.
Labels:
3GPP Long Term Evolution,
4G,
ATT,
Clearwire,
LTE,
Sprint,
Sprint Nextel,
Verizon,
WiMAX
Tuesday, November 09, 2010
Kno announces prices for its eBook readers
Kno, the Chegg spinoff that plans to rent college eTextbooks, has announced prices for its two eBook readers. The single-screen model will be priced at $599, and the dual-screen model will be $899. A limited number of readers, manufactured for Kno by Foxconn, will be available by the end of the year. Kno plans to test its readers and rental program at ten unnamed U.S. colleges and universities.
Kno's single-tablet pricing isn't too far from the price of the equivalent iPad, which has become the de facto industry benchmark. However, the Kno readers have a very different set of use cases than the iPad. Kno's devices are designed to provide as close an electronic substitute as possible for the experience of using printed textbooks, so they have big screens that can display most textbooks in current use at full scale. Students can take notes and highlight with a stylus, or use their fingers. They're not general-purpose devices like the iPad or notebook computers.
Kno claims that its rental program will cover the cost of the single-screen tablet in three semesters. However, the question is whether students want an expensive dedicated eBook reader at all, or whether they can live with the smaller screen of the iPad and similar devices. I think that most students will go with the iPad. They'll get a lighter, easier-to-carry device, an enormous selection of apps and a choice of eBook vendors, rather than being locked into Kno.
Kno started developing its eBook reader well before the iPad was announced. If it had waited until the iPad was available, I suspect it would have chosen to support the iPad and other tablets rather than building its own hardware. If Kno's readers don't get market traction quickly, the company may be forced to alter its strategy and support other devices.
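Kno's cost-recovery claim implies a specific break-even number (a sketch; only the $599 price comes from the announcement, and the per-semester figure is the implied break-even, not something Kno has published):

```python
# Implied break-even for Kno's claim that textbook-rental savings recoup
# the $599 single-screen tablet in three semesters.
tablet_price = 599
semesters = 3
print(f"Required savings: ${tablet_price / semesters:.0f}/semester")
```

Students would need to save roughly $200 per semester versus printed textbooks for the claim to hold.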
Monday, November 08, 2010
Section 111 of the Copyright Act is alive and well
In this week's videoblog, I did a piece about FilmOn.com, a U.K. company that's streaming broadcast signals from television stations in Los Angeles, as well as a variety of cable networks, across the Internet. Like ivi, FilmOn.com is depending on Section 111 of the U.S. Copyright Act, which requires television broadcasters to grant statutory (mandatory) licenses to retransmit their signals to cable systems, in return for payments made to the Copyright Office and distributed to broadcasters.
Some people (including myself) have argued that Section 111 was superseded by the 1996 Telecommunications Act, which gives broadcasters control over whether or not their signals can be retransmitted by cable systems and similar services. The Act also allows broadcasters to specify the price for retransmission rights, which is paid directly to broadcasters, not to the Copyright Office.
Last week, however, the Copyright Office issued two notices in the Federal Register: Distribution of the 2008 Cable Royalty Funds and Distribution of the 2008 Satellite Royalty Funds. Note that the Copyright Office is still collecting royalties, and so far as they're concerned, Section 111 is still in full effect. Here's a direct quote from the Federal Register posting:
"Each year cable systems must submit royalty payments to the Register of Copyrights as required by the statutory license set forth in section 111 of the Copyright Act for the retransmission to cable subscribers of over-the-air television and radio broadcast signals. See 17 U.S.C. 111(d). These royalties are then distributed to copyright owners whose works were included in a qualifying transmission and who timely filed a claim for royalties. Allocation of the royalties collected occurs in one of two ways. In the first instance, these funds will be distributed through a negotiated settlement among the parties. 17 U.S.C. 111(d)(4)(A). If the claimants do not reach an agreement with respect to the royalties, the Copyright Royalty Judges (‘‘Judges’’) must conduct a proceeding to determine the distribution of any royalties that remain in controversy. 17 U.S.C. 111(d)(4)(B)."
"The 'Phase I Parties' are the Program Suppliers, Joint Sports Claimants, Public Television Claimants, Commercial Television Claimants (represented by National Association of Broadcasters), Music Claimants (represented by American Society of Composers, Authors and Publishers, Broadcast Music, Inc., and SESAC, Inc.), Canadian Claimants, National Public Radio, and the Devotional Claimants. In Phase I of a cable royalty distribution proceeding, royalties are allocated among certain categories of broadcast programming that have been retransmitted by cable systems. The categories have traditionally been movies and syndicated television series, sports programming, commercial and noncommercial broadcaster-owned programming, religious programming, music, public radio programming, and Canadian programming."
Now, that pretty much covers every kind of programming and every kind of broadcaster. If broadcasters are relying on the Telecommunications Act, why are they still collecting royalties from the Copyright Office? It could be that broadcasters who don't bother to negotiate for retransmission compensation still want to claim the statutory royalties. Nevertheless, it adds an interesting dimension to the ivi and FilmOn.com story.
Sony posts details about its new PMW-F3 Camcorder
Sony's U.K. division has posted details about its new PMW-F3 camcorder, which will use a Super 35mm Exmor CMOS sensor with sensitivity equal to ISO 800 and a signal-to-noise ratio of 63dB. 1080 frame rates will be 59.94i, 50i, 29.97P, 25P, and native 23.98P. The camera also supports 720P in a variety of frame rates.
The F3 is positioned significantly higher than Panasonic's forthcoming AG-AF100/101: Its standard HD-SDI interface will output 10-bit 4:2:2, vs. the 8-bit output of the Panasonic AG-AF100, and in April 2011, a Dual Link HD-SDI option will be available that will output 10-bit uncompressed RGB and support 1080/59.94P and 50P. It will record using the MPEG-2 Long GOP codec at 35Mbps or 25Mbps, not AVCHD. The F3 will support variable frame rates from 1 to 60 fps in 720P and from 1 to 30 fps in 1080P, in 1 fps increments. SxS cards will be used for storage, and an optional adapter will support Memory Sticks and SD cards.
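Those bitrates translate into rough recording times on SxS media (a sketch assuming a 32GB card, decimal gigabytes, and video essence only; audio tracks and filesystem overhead make real-world times shorter):

```python
# Approximate recording time on an SxS card at the F3's MPEG-2 bitrates.
# Counts video bits only; audio and filesystem overhead are ignored.
def minutes_of_recording(card_gb: int, video_mbps: float) -> float:
    card_bits = card_gb * 1e9 * 8          # card capacity in bits (decimal GB)
    return card_bits / (video_mbps * 1e6) / 60

for mbps in (35, 25):
    print(f"32GB card at {mbps}Mbps: ~{minutes_of_recording(32, mbps):.0f} minutes")
```

Call it roughly two hours per 32GB card at the top bitrate, somewhat less in practice once audio and overhead are counted.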
The F3 will have its own proprietary lens mount and will come with a PL mount adapter. Optional hot shoe interfaces will be available for Cooke /i and ARRI LDS lenses. Sony is offering some unique bundling options: The PMW-F3L will come without lenses, and the PMW-F3K will come with not one, but three lenses: 35mm, 50mm and 85mm, all at T2.0.
Both PMW-F3 models will ship in the U.K. in January. UrbanFox.TV is reporting that the "tentative list price" will be £14,500 for the F3L and £20,700 for the F3K. Sony has also announced Japanese pricing, which converts to around $17,000 for the F3L. There's been a fair amount of push-back on the Internet to Sony's pricing. The U.S. list price for Panasonic's AG-AF100 is $4,995, and while the F3 has a bigger imager, 10-bit 4:2:2 HD-SDI output and an optional Dual Link HD-SDI interface, it's hard to argue that those features make the F3 worth more than three times as much as the AF100.
The F3's price puts it about where the RED was at its introduction, and while the current RED costs more, it isn't that much more expensive. It appears that Sony didn't want to cannibalize sales of its EX3 camcorder by pricing the F3 too low, but if it keeps its tentative pricing, it could end up helping Panasonic more than itself. Panasonic positioned the price of the AF100 against DSLRs, while Sony appears to be positioning the price of the F3 against the rest of the CineAlta product line.
Labels:
Arri PL,
AVCHD,
Camcorder,
Memory Stick,
MPEG-2,
PMW-F3,
Serial digital interface,
Sony,
Super 35mm
Sunday, November 07, 2010
Episode 2 of The Feldman File Videoblog
The second episode of The Feldman File's videoblog is ready for viewing! In this week's episode:
- Panasonic announces its new GF2 Micro Four Thirds camera with Full HD video
- A reader of EOSHD.com discovers a mode in Panasonic's new GH2 DSLR that could dramatically decrease rolling shutter effects
- A team at MIT, the University of California, Berkeley and the University of Michigan is working on a way to put a team of human editors into Microsoft Word
- An editor at Cooks Source Magazine is reprinting content without permission and wants to be thanked for it
- FilmOn.com, an over-the-top Internet video service, is having a hard time figuring out the difference between broadcast television stations and cable networks
Thursday, November 04, 2010
Panasonic's new GF2: Full HD in a compact camera
Panasonic's long-rumored GF2 has finally been announced, and Digital Photography Review has had a few days to test a pre-production model. Panasonic will ship the GF2 in the U.S. in January, and will release U.S. pricing in December.
The GF2 is slightly smaller and lighter than its predecessor, the GF1. (It's still slightly bigger overall than Sony's NEX-5, but when using its pancake lens, it's a bit thinner.) It uses the same 12.1 Megapixel imager as the previous model. In order to reduce its size, Panasonic had to remove some of the physical controls that were on the GF1 and replace them with menu options.
Perhaps the biggest change in the GF2 is that it now has Full HD video, rather than the AVCHD Lite in the GF1. That means that the GF2 has 1080/60i and 720/60p video (the 60i mode reads the image data off the sensor in 30p mode). However, the GF2's AVCHD bitrate maxes out at 17Mbps, not the 24Mbps maximum that Panasonic's GH2 and some other DSLRs can achieve. ISO sensitivity has been increased to 6400 from the 3200 maximum of the GF1. The GF2 also has a built-in stereo microphone, but as with most DSLRs, you'll be better off capturing audio with an external recorder. It fully supports Panasonic's new 3D lens, which will turn it into a tiny, albeit limited, 3D camera.
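The practical impact of the 17Mbps ceiling is easy to quantify (a sketch counting video bits only; audio and AVCHD container overhead are ignored):

```python
# Video data recorded per minute at the GF2's 17Mbps AVCHD ceiling
# versus the GH2's 24Mbps maximum. Audio and container overhead ignored.
def mb_per_minute(mbps: float) -> float:
    return mbps * 60 / 8   # megabits/sec -> megabytes/min

gf2, gh2 = mb_per_minute(17), mb_per_minute(24)
print(f"GF2: {gf2:.1f} MB/min vs GH2: {gh2:.1f} MB/min "
      f"({gh2 / gf2 - 1:.0%} more data for the GH2's encoder)")
```

Roughly 40% more bits per minute gives the GH2's encoder considerably more headroom in detailed or fast-moving scenes, which is why the GF2's video is unlikely to match it.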
Digital Photography Review didn't do image and video quality testing, so that will have to wait for another review. The fact that the GF2 only goes to 17Mbps AVCHD means that its video output almost certainly won't stand up to that of the GH2. There's no word about whether the GF2 provides a live output on its HDMI interface while recording, what the resolution of the live output is, and what (if anything) is superimposed on it by the camera.
The GF2 is clearly a consumer-oriented camera, a step up from point-and-shoots rather than a smaller replacement for a DSLR. It's unlikely to have many cinematography applications, especially if it's priced only a few hundred dollars less than the GH2.