Thursday, March 31, 2011
Ways to turn down the noise
Like a lot of people who keep track of new business and technology developments, I find myself getting overwhelmed with information. It's sometimes difficult to separate the signal from the noise. To keep from being completely buried under the noise, I've come up with some rules for determining what (and what not) to pay attention to:
- Press releases: They're advertisements, highly biased and often very inaccurate, but they're useful for identifying new products and services that might be of value to you, as well as competitors that you may not have known about. They can also alert you to management appointments and changes at your customers and competitors.
- Analyst reports: I almost completely ignore analyst forecasts, although reports of actual sales can be very useful. No forecast that covers more than 12 to 24 months is in any way accurate (and I say that as a former analyst who made my living doing five-year forecasts). In addition, analysts often bias their forecasts upward to make their reports more appealing to vendors, to placate an existing client, or to encourage a potential client to sign on. Analyst reports that make recommendations about which products or services to buy, or which vendors to consider, are subject to similar biases and should always be read skeptically.
Also, pay attention to the backgrounds and experience of the analysts themselves. How much working experience do they have in the industry they're covering? Did they work for a customer or for a vendor? Was the report written by the analysts whose names are on it, or was it actually written by junior, less-experienced researchers?
- White papers: Most white papers, whether they're written directly by a vendor or by a research firm on a vendor's behalf, are advertising. What's worse, in order to get them, you usually have to identify yourself, and you can expect a sales contact shortly thereafter.
- Webcasts: Again, these are mostly advertising for the companies participating in or sponsoring the webcasts. Unless you want to spend an hour listening to an ad, they're a poor use of your time. Even when customers are included as presenters, they may be there only for show, and the vendor representatives will dominate the conversation.
- Tech business blogs, such as TechCrunch and Silicon Alley Insider: These blogs generate traffic with controversy, often mix facts, opinion and conjecture together, and in general are more entertaining than reliable.
- Hardware blogs, such as AnandTech and Tom's Hardware for computers, or Digital Photography Review for digital cameras: These blogs often do an excellent job of covering and reviewing new products, but be sure that the reviewer received no compensation from the vendor for the review.
- Newspapers and magazines (print or online): The quality of the information that newspapers and magazines report is directly related to the experience and quality of their reporters and editors. It takes time for a reporter to understand a subject area, especially in technology, business, medicine and science. Budgetary pressures have forced many publications to fire their experienced specialty journalists and rely on freelancers, news services and press releases. Check reporters' bylines to find out where they came from and how much experience they have in the subject area. Also, anything labeled "Advertorial" is advertising, even if it looks like normal editorial content.
Labels: analysts, blogs, Magazines, Newspaper, Press release, webcasts, white papers
Wednesday, March 30, 2011
Roku signs Best Buy to sell its video players
Engadget is reporting that Roku has signed Best Buy to carry the Roku XD, its mid-priced model ($79.99 U.S.), in its stores. RadioShack will also carry the same model, Fry's Electronics will carry the XD and the more expensive XD|S, and BJ's Wholesale Club will carry the XD|S. At the same time, Netgear has announced that it's discontinuing its rebranded version of Roku's player.
Roku's expansion into brick & mortar retail channels will place the brand on stronger competitive footing vs. Apple TV and Google TV, although getting its product into retail is only the first step in building consumer demand for the Roku brand. Apple TV holds a big branding advantage over Roku, and Google TV is considerably more functional. However, the XD is $20 less than Apple TV and around $200 cheaper than Logitech's Revue Google TV set-top box, so price-sensitive customers may go for the Roku, especially if store salespeople can demonstrate it effectively.
Labels: Apple TV, BestBuy, Frys Electronics, Google, Logitech, Netgear, RadioShack, Roku
Saturday, March 26, 2011
Figuring out when it's 11:30
There's a famous quote attributed to Lorne Michaels, the creator and producer of NBC's "Saturday Night Live": "The show doesn't go on because it's ready. It goes on because it's 11:30." His point was that you can strive for perfection, but at some point, you have to go with what you've got.
Old-timers like me remember when the philosophy behind product development was that you only got one shot, so you needed to get it exactly right. That was the era of "waterfall" development: First, you completely specified the product with all the features that you thought customers were looking for; then you built it, tested it and delivered it. Waterfall development came of age when product development cost a lot and took a long time. Starting a project meant spending a lot of money and committing to a year or two of development, so you'd better get it right the first time. The problem, of course, is that very few teams got it right the first time, or even bothered to talk with potential customers before the product was 90% completed, so the vast majority of new products failed.
Waterfall development has now been replaced in most cases by agile development. Products, especially software and online services, can be developed in a tiny fraction of the time and at a tiny fraction of the cost required when I first entered the business. The new goal is to develop a Minimum Viable Product--a product or service with the fewest features and capabilities necessary to be usable--and then modify the product and add functionality based on customer feedback.
However, there's a very important part that's often left out of consideration, and that's what's going on in the heads of customers. Both waterfall and agile development processes are driven by the amount of time needed for development: Waterfall takes a long time, so you have to offer a "complete" product when you go to market; agile can deliver something that works in weeks, so you get "something" out quickly and iterate. Customers, however, neither know nor care how long it takes to develop a product. They don't care about whether the product was developed using waterfall or agile techniques. All they care about is what the product does for them. Does it solve a real problem? Does it make their lives easier? And, most importantly, is what the product does valuable enough for them to pay for it?
There's an old saying: "You never get a second chance to make a first impression." Most of the time, the Minimum Viable Product that a development team shows to potential customers has what the team thinks is the minimum necessary set of features and benefits, but what if it's not? Agile product development and customer development practice says that you iterate--make some changes to the product--and try again. In the worst case, you pivot and make major changes to the product, target market or distribution. However, the potential customers you pitched your original MVP to have already formed an impression of you and your product, and some of them are unlikely to give you another chance.
That's not a big problem if your market is large and you have lots of customers, but if you've got a market with a relatively small number of large customers, losing even a few of them could easily add a year to your sales cycle, if it doesn't kill your product altogether. So, the real question is: when is it 11:30?
The one hard-and-fast rule is that the only people who can tell you whether your Minimum Viable Product is truly viable are your customers, and if you wait to start talking to them until you've got what you believe to be an MVP, you may already be too late. The time to get "out the door" and start talking to customers is as early in the product development process as possible. This is especially true if no one on your development team has deep knowledge and experience in the market you're addressing.
In the world of agile development, your customers are the timekeepers. They're the ones who'll tell you when it's 11:30.
Thursday, March 24, 2011
The death of "mass"
No, I'm not talking about some physics discovery that portends the end of the universe. I'm talking about "mass" in the context of mass production, mass media and mass education. We're in a transition period that, perhaps 50 to 100 years from now, will clearly mark the end of one era and the beginning of another. The industrial age led to the era of mass production. Mass production, electrification and the invention of wireless technologies led to mass media. Demand for skilled workers and the migration from farms to cities led to mass education. All of these trends were well-established by the early 1920s, and now, almost 100 years later, the post-mass era is beginning.
Mass production, as implemented by Henry Ford and others, brought the price of manufactured goods down to a level that almost anyone could afford...at the cost of the humanity of the people who work on production lines. However, we're now replacing everything physical that we can with bits, and bits have no need for mass production. A single copy of a song, eBook, video, television show or movie can be endlessly reproduced at effectively no cost, and sent anywhere in the world that can be reached by a broadband or wireless connection.
Today, our demand for new products and new technologies is stretching the ability of mass production to respond. We've taken all the slack that we can out of production systems--implemented just-in-time manufacturing, minimized inventories, and created processes to eliminate defects rather than fix them after they've occurred. However, the mass production environment we've built doesn't work well when supply chains break (as with the Japanese disaster), or when wars break out that increase the cost of raw materials or interfere with transportation.
Our mass production systems also don't work well when markets can change at a moment's notice. Rapid hikes in the price of gasoline killed demand for gas-guzzling trucks, undermined the business strategies of U.S. car manufacturers, and ultimately led two of the Big Three to file for bankruptcy. A one-hour presentation by Apple announcing its iPad 2 made most of its competitors' tablets obsolete overnight. Samsung was able to respond and reengineer its Galaxy Tab 10.1 to compete, but Motorola faces the prospect of taking huge losses on its Xoom and replacing it with a new model much sooner than expected.
By definition, mass production requires committing to manufacture large numbers of products in order to keep costs down. However, those large commitments increase the risk of having to discount unpopular or obsolete products, or of having to scrap some inventory entirely. The solution is to manufacture products only as customers specify them, when they're ready to buy them. That means a return to localized, bespoke manufacturing that uses the same technology as mass production, but in smaller, less capital- and space-intensive forms. It also means that manufacturing will once again be done close to customers, rather than thousands of miles away in plants chosen for the lowest possible manufacturing cost. This doesn't mean that mass production as we know it will go away completely, but it does mean that an ever-increasing percentage of what we buy will come from local manufacturers, reversing a forty-year trend.
The next "mass" that's evolving out of existence is mass media. The very concept of mass media is fairly young; even in the golden era of newspapers, each one served a single city, and Hearst, the largest newspaper chain owner, covered only a small percentage of the U.S. It wasn't until the advent of network radio broadcasting that media in the U.S. truly became mass, and that didn't occur until the 1920s. Starting in the 1950s, we entered a period where everyone watched the same television shows and got essentially the same news and the same viewpoints, whether they were reading newspapers, watching television or listening to the radio.
The mass media era first began to break down when talk became the dominant radio format, which eroded the influence of the original radio networks. The next crack in the wall was when the viewing audience for cable television networks exceeded that of broadcast networks, and the third break was the erosion of newspaper circulation, which actually predates the Internet era.
The Internet, however, represents the death blow to the mass media era. There are more sources of entertainment, information, news and opinion available to more people today than at any time in history. Anyone can write articles, record audio and shoot video, and they can make their work available to a worldwide audience without going through a publisher, distributor or broadcaster. An increasing number of people find out what to pay attention to not from editors, but from people they follow on Twitter. Writers still write, producers still produce and editors still edit, but the process of curation--deciding what's worth paying attention to and what isn't--is now in our own hands, and in the hands of our friends and people whose opinions we respect.
The average age of print newspaper readers, television viewers and radio listeners has been increasing for years. The most likely advertisers on U.S. national television news programs and late night talk shows are makers of denture cleaners and adhesives, arthritis and pain medications, and drugs to treat gout, erectile dysfunction and depression. Younger consumers get their information and entertainment from the Internet, and when they watch television, they increasingly watch it online or use DVRs that allow them to skip commercials.
The number of cable television subscribers has been declining for almost a year, and the growth of multichannel video services (cable, satellite and IPTV) in the U.S. has just about come to an end. The audience for broadcast radio is being eroded by Internet services like Pandora and Spotify. Newspaper publishers are "circling the drain". In short, mass media as we've known it, especially advertising-supported mass media, is on its last legs.
That brings us to the last "mass": mass education. The modern public education system in the U.S. dates back to the period after the Civil War, when huge demand for educated factory workers coincided with a mass migration of people from rural areas to cities. The one-room schoolhouses found in most communities, which provided all instruction from first through eighth grade, were replaced with factory-like buildings that applied mass production techniques to education. Curricula were standardized, and the subjects taught were based on the knowledge that employers believed their employees needed to master in order to succeed. Even sports programs were introduced to improve the fitness of workers for factory jobs; this was especially true in areas with heavy industry such as automobile manufacturing, steel and mining.
Today, however, the mass education techniques used in most primary and secondary school districts in the U.S. are failing to engage students and meet their educational needs. For example, a high school student in Chicago Public Schools, one of the largest districts in the U.S., has only a 50% chance of graduating. Schools and school districts have hierarchies of teachers, unions and administrators that are every bit as bureaucratic and sclerotic as the worker/union/management hierarchies found in the most backward, old-guard factories. The U.S. government has responded by implementing national standardized tests, which try to fix the problems of mass education by applying yet another mass production technique.
The solutions to the problems of mass education have to come from individualized instruction, education based on mastery of subjects rather than completion of a certain amount of time sitting in a chair, and tearing down the bureaucracies found in schools. We need to go back to the one-room schoolhouse, but with 21st Century technology and the latest thinking in educational theory. Otherwise, the mass education system will collapse under its own inefficiency and ineffectiveness.
We've entered the post-mass era--the decline and fall of mass production, mass media and mass education--but we'll truly understand what's going on only when we can look at it in hindsight.
Wednesday, March 23, 2011
Samsung gets it right--announces prices and release dates for new Galaxy Tabs
Regular readers of this blog (both of you) know that I've given Samsung (and Motorola, RIM, HP, etc.) a hard time for introducing tablets without announcing firm prices or release dates. I can understand the problem when mobile operators are involved, because they control their own pricing and release schedules. However, for Wi-Fi-only products, that shouldn't be an issue.
Yesterday, at the CTIA conference, Samsung announced firm prices and release dates for its new Galaxy Tab 10.1 Android tablet, and firm prices for its Galaxy Tab 8.9 Android tablet, with a release date in "early summer". The Wi-Fi-only version of the Galaxy Tab 10.1 will go on sale on June 8th in the U.S., priced at $499 for the 16GB model and $599 for the 32GB model. The Wi-Fi Galaxy Tab 8.9, priced at $469 for the 16GB model and $569 for the 32GB model, will follow in early summer. Samsung has redesigned the Galaxy Tab 10.1 since it was last shown at Mobile World Congress; both it and the Galaxy Tab 8.9 are now slightly slimmer than the iPad 2.
Samsung deserves a lot of credit for announcing firm prices and availability dates, as well as for redesigning the Galaxy Tab 10.1 "on the fly" to better compete with the iPad 2. I hope that announcing prices and ship dates when new products are introduced becomes common, not only for Samsung but for its competitors as well.
Sony gives its Super 35mm NXCAM a name: NEX-FS100
Sony has given its modular Super 35mm NXCAM camcorder a name: NEX-FS100. It's designed to compete with Panasonic's AG-AF101, and it has an interesting mix of professional and prosumer features: It uses the same sensor as Sony's new F3 camcorder, which is approximately 10% larger than the 4/3" sensor that Panasonic uses. The FS100 takes Sony's E-mount lenses, but can support other mounts with adapters. It has dual XLR connectors like the AF101, but it lacks the AF101's HD-SDI outputs; its outputs are HDMI (4:2:2 uncompressed), component and composite. And it uses the same AVCHD codec as the AF101. So, with the exception of HD-SDI outputs, the FS100 and AF101 are very similar.
Digital Photography Review is quoting a list price of $5,850 (U.S.) for the FS100 with an 18-200mm F3.5-6.3 lens. That would make the body-only price of the FS100 roughly comparable to Panasonic's $4,995 list price for the AF101. However, Slashgear is reporting a price of $6,550 and availability in July 2011, so there's clearly some confusion about the final price. Also, given the Japanese disaster, it's likely that final availability will be delayed, or that the quantity of units initially available will be very small.
Update: Nofilmschool.com is reporting prices for the U.S. models of the FS100 to be $5,850 for the FS100U model without lens, and $6,550 for the FS100UK model with lens. That would explain the multiple prices reported by different sites, but it also means that the FS100 will be about $1,000 more than the AF101 for comparably-equipped models.
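For what it's worth, the price gap works out as follows; this is just my arithmetic on the list prices reported above:

```python
# U.S. list prices as reported above
fs100u = 5_850   # Sony NEX-FS100U, body only (no lens)
fs100uk = 6_550  # Sony NEX-FS100UK, with 18-200mm lens
af101 = 4_995    # Panasonic AG-AF101, body only

print(f"FS100U premium over AF101 (body only): ${fs100u - af101:,}")
# -> $855, which is where the "about $1,000 more" for
#    comparably-equipped models comes from
```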
The physical design of the FS100 is where it diverges widely from the AF101. Its viewfinder is hinged on the top of the camcorder, and can be augmented with an eyepiece or folded down if the cinematographer wants to use an external monitor. The FS100 is considerably smaller and looks more like a consumer camcorder than the AF101, which may make it less noticeable for news and documentary-style shooting.
The first videos shot with the FS100 are available for viewing, and the quality is very good. A video on the same site explains the camcorder's features and capabilities, and leads into a music video that was shot with the FS100. More footage and reviews are likely to appear between now and NAB.
Sunday, March 20, 2011
Why the AT&T/T-Mobile deal may mean less than it seems
Earlier today, AT&T announced an agreement with Deutsche Telekom to acquire its U.S. T-Mobile operation for $39 billion in cash and stock. The timing of the announcement was very interesting: AT&T and T-Mobile chose to announce the deal on a Sunday, while Western forces are attacking Libya, the Japanese disaster continues and the U.S. college basketball championships are underway--in other words, when very few people are likely to pay attention to it.
Part of the companies' caution is due to the fact that the deal will undergo intensive investigation by the Federal Communications Commission, Federal Trade Commission and U.S. Department of Justice. There's an excellent chance that the deal will be challenged in court; it will add T-Mobile's 33.7 million subscribers to AT&T's 95.5 million, making the merged company the largest mobile operator in the U.S.
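The scale claim is simple arithmetic; here's a quick sketch using only the subscriber figures cited above:

```python
# Subscriber counts cited above, in millions
att_subs = 95.5
tmobile_subs = 33.7

combined = att_subs + tmobile_subs
print(f"Combined AT&T/T-Mobile base: {combined:.1f} million subscribers")
# -> 129.2 million, enough to make the merged company
#    the largest mobile operator in the U.S.
```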
On the other hand, there's also a good chance that the deal will go through, at least in some form. T-Mobile is the Number 4 mobile operator in the U.S., the smallest of the four nationwide operators. It's struggling to come up with the capital to upgrade its network to the worldwide LTE standard, and despite its ads portraying its existing network as 4G, most consumers realize that it's not true. Both AT&T and T-Mobile use the same GSM transmission system (albeit at different frequencies), so integration of the two companies' networks will be much easier than if the rumored Sprint/T-Mobile merger had occurred.
AT&T is likely to argue that it's the most natural partner for T-Mobile, and that T-Mobile is unlikely to survive as a national operator in the long term if it stays independent, is acquired by a company with an incompatible infrastructure, or is acquired by a private equity investor that doesn't have extensive telecom experience.
The obvious concern is that an AT&T/T-Mobile merger will result in higher prices and poorer service for consumers, and given AT&T's prior track record with acquisitions, that's likely to be the case. Regulators may require the two brands to maintain separate identities, even if the infrastructure of the two companies is merged. My belief is that AT&T will continue to use T-Mobile as a "value" brand to compete with prepaid and lower-priced postpaid services from operators such as MetroPCS and Leap Wireless, but will migrate T-Mobile's most profitable customers to AT&T.
T-Mobile may not have much of a future in the U.S., whether or not the AT&T acquisition goes through. It's up to AT&T to convince regulators that the most likely outcome for an independent T-Mobile is, at best, to become a regional carrier without the scale to compete with AT&T, Verizon and Sprint.
Tuesday, March 15, 2011
U.S. cable subscribers continue to decline
SNL Kagan has released its Q4 2010 report on U.S. video service subscribers. The number of cable subscribers declined by 526,000 for the quarter, the third quarter in a row that cable's subscriber count has declined. Satellite operators added 133,000 subscribers, and IPTV service providers (primarily Verizon and AT&T) added 458,000 subscribers. Overall, the net number of multichannel video subscribers (cable, satellite and IPTV together) increased by 65,000.
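The three segment numbers reconcile with the net figure; here's a quick check of the arithmetic (mine, not SNL Kagan's):

```python
# Q4 2010 U.S. multichannel video subscriber changes, per SNL Kagan (above)
cable = -526_000      # cable's quarterly loss
satellite = 133_000   # satellite additions
iptv = 458_000        # IPTV additions (primarily Verizon and AT&T)

net = cable + satellite + iptv
print(f"Net multichannel change: {net:+,}")  # -> +65,000
```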
The numbers don't add much evidence one way or the other for "cord-cutting", but they clearly suggest that the cable industry's subscriber losses are becoming a long-term trend. However, cable's competitors aren't doing themselves any favors. In particular, AT&T raised its rates considerably this year, and two days ago, the company announced that it will begin capping its DSL customers' monthly data usage starting May 2nd. Conventional DSL customers will be capped at 150GB per month, and U-Verse customers will be limited to 250GB. (The caps won't be enforced until a customer has exceeded them three times over the life of their AT&T account.)
The primary reason for customers to shift from cable operators to satellite and IPTV (as well as the primary motivator for cord-cutting) is saving money. If cable competitors raise rates and put limits on their services, they'll end up giving cable subscribers a reason not to switch. SNL Kagan reports that cable operators have almost 60% of the market, so it's their market to lose.
Thursday, March 10, 2011
Amazon and YouTube are trying to manufacture hits
Earlier today, YouTube announced its Creator Institute, a training program intended to teach "camerawork, storytelling, promotion, and new media skills" to a small, selected group of artists. USC in Los Angeles and Columbia College in Chicago have partnered with YouTube to provide facilities, educators and equipment. Twenty writers and directors, selected in a competition by YouTube, the partner colleges and the YouTube community, will participate in all-expenses-paid programs this summer at one of the two schools.
Late last year, Amazon introduced Amazon Studios, a competition that awards cash prizes to screenwriters and directors who submit scripts and "test movies" for review by Amazon and its customers. Amazon's program encourages participants to "improve" submitted scripts by adding to or rewriting them, effectively making anyone who contributes to or changes a script a credited co-writer.
Both Amazon's and YouTube's programs are intended to produce original content that the companies can distribute and sell to viewers or advertisers. The problem is that neither program is likely to accomplish what its sponsors intend. Pick up any copy of MovieMaker Magazine, for example, and you'll see that it's loaded cover to cover with ads for schools and seminars that teach movie and television production techniques. There's no shortage of places where students can learn how to make movies, and there's scant evidence that participating in a two-month program will turn a novice into a hit-making director or writer.
In Amazon's case, companies have run movie and script contests for years, and they very rarely find scripts or directors of much note. You may remember Project Greenlight from a few years ago, which was sponsored by LivePlanet (Matt Damon, Ben Affleck and two other partners) and Miramax. The most interesting things that came out of the project were the television episodes chronicling the production of the three movies, which ran on HBO for two seasons and on Bravo in the U.S. for the final season. The films that came out of Project Greenlight, on the other hand, were considerably less interesting: The first film grossed less than $140,000 at the box office, the second less than $280,000, and Miramax refused to distribute the third one, so after one night in a single theater, it went directly to video.
If anything, Project Greenlight should have been much more successful than Amazon's project: It was backed by the most successful independent film distributor in the U.S. at the time, had active participation from Academy Award-winning filmmakers, and was publicized weekly on heavily-watched cable networks. Yet, all three films were financial busts.
The problem is that making a popular film or television show requires a combination of talent, timing and luck that can't be taught or identified in a contest. It's the basis of writer William Goldman's famous quote, "Nobody knows anything." It's why movie studios are much more likely to make sequels of a successful movie than they are to make a movie about an original topic, with an unknown screenwriter or director. It's also why U.S. television viewers get to watch "C.S.I.", "C.S.I. Miami" and "C.S.I. New York".
I applaud both Amazon and YouTube for encouraging and training talent, but their programs aren't likely to create the popular content that the companies want. The problem is that there is no systematic way to create hits, or even identify them before they're produced.
Saturday, March 05, 2011
Apple's control advantage, and what Google needs to do with Android
Earlier this week, Apple announced the iPad 2, a solid, if incremental, next step for the iPad design. When the iPad 2 goes on sale on March 11th, nine days after it was announced, it will be launched with two important media creation apps, iMovie and GarageBand, and a new version of iOS, 4.3. For its part, iOS 4.3 will be deployed at no cost onto all iPads and most iPhones and iPod touches starting the same day. (Update, March 10, 2011: Apple made iOS 4.3 available for download starting yesterday.)
Apple controls its own hardware, software and distribution infrastructure. That level of control causes a lot of consternation on the part of consumers and developers who'd like the freedom to run what they want, when they want, on Apple's devices. However, Apple's control gives it a significant ongoing advantage over Android.
The existing and forthcoming collection of Android tablets demonstrates the disadvantages of Google's approach. Last fall, Samsung introduced its Galaxy Tab tablet, which launched with Android 2.2. It was officially endorsed by Google, meaning that it got access to the Android Market and could run Google's own Android apps, although Google's own executives cautioned that Android 2.2 was designed for smartphones, not tablets. Galaxy Tab purchasers reported that the product functioned more like a big smartphone than a tablet, and as reported by a number of sources, sales have been disappointing.
Google's hardware and mobile carrier partners are under no obligation to upgrade smartphones and tablets in the field to the most recent version of Android. They're not even obligated to release new devices with the latest version of Android. Google itself released Google TV with an earlier version of Android. Google keeps track of the versions of Android in use by monitoring Android Market app downloads, and according to its own statistics, more than 10% of Android devices actively in use are still using Android 1.5 or 1.6. (Google's statistics don't track the "unauthorized" Android devices that don't have access to the Android Market.)
Update, March 6, 2011: Here's an excellent example of Google's dilemma: According to Engadget, Olivetti in Italy just announced its OliPad Android tablet. It has all the hardware features it needs to support Android 3.0--a Tegra 2 processor, a 10" 1024 x 600 display, Wi-Fi and 3G--yet it's being released with Android 2.2 as its operating system rather than Android 3.0.
Many developers have complained about the variety of Android versions in use, but Google's management says that it's a "non-issue". Nevertheless, buyers of Android devices have no guarantee that their smartphone or tablet will ever be upgraded to a newer version of Android. Apple developers can largely target the newest version of iOS, with assurance that it will quickly spread to most iOS devices, but Android developers have no such assurances.
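To see why fragmentation is anything but a "non-issue" for developers, consider how they have to size their reachable audience before requiring a newer Android version. Here's a rough sketch; the version shares below are illustrative placeholders (the only figure grounded above is that 1.5 and 1.6 together exceed 10%), and the helper names are mine:

```python
# Hypothetical sketch: how much of the Android installed base a developer
# gives up by requiring a minimum OS version. Shares are placeholders,
# except that 1.5 + 1.6 together exceed 10%, matching the Google
# statistics cited above.
version_share = {
    "1.5": 0.04,
    "1.6": 0.07,
    "2.1": 0.29,
    "2.2": 0.58,
    "2.3": 0.02,
}

def version_key(version: str) -> tuple:
    """Turn '2.2' into (2, 2) so versions compare numerically."""
    return tuple(int(part) for part in version.split("."))

def reachable_share(min_version: str) -> float:
    """Fraction of active devices running min_version or newer."""
    floor = version_key(min_version)
    return sum(share for v, share in version_share.items()
               if version_key(v) >= floor)

print(f"Require 2.2+: reach {reachable_share('2.2'):.0%} of devices")  # 60%
print(f"Require 2.1+: reach {reachable_share('2.1'):.0%} of devices")  # 89%
```

An iOS developer rarely has to run this calculation, because Apple pushes new OS versions to nearly the whole installed base within months.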
That brings us to the current wave of tablets running on Android 3.0, also known as Honeycomb. First out was Motorola's Xoom, which launched with fewer than 20 tablet-aware apps, a price very near that of Apple's top-of-the-line iPad, and future LTE compatibility that will require the tablets to be sent back to Motorola for free upgrades. Early reports indicate that the Xoom is selling much more slowly than Motorola or Verizon, its sole U.S. carrier partner, expected. Next up will most likely be LG's G-Slate, with similar specifications and a carrier partnership with T-Mobile, followed by Samsung's Galaxy Tab 10.1.
Google could have taken control of the process and required hardware vendors and mobile operators to agree to upgrade their devices to the latest version of Android within a reasonable period of time after they're released, but it chose not to do so. It could also have compelled vendors to hold back their tablets until a critical mass of tablet-aware apps was available, but again, it chose not to do so. If you buy a Motorola Xoom today or a G-Slate or Galaxy Tab 10.1 tomorrow, do you have any assurance that it will run Android 3.1, or 3.5, or 4.0, or that the manufacturer or carrier will allow you to upgrade? The answer is no.
Google could have orchestrated a huge day-long event in April or May to introduce the Xoom, G-Slate and Galaxy Tab 10.1 together, each with its carriers, along with perhaps 1,000 tablet-aware apps. Each vendor could have separately announced its products--along with ship dates, prices and carrier partnerships--at the event, and a Developer Showcase could have shown off the best of the new apps. But none of that is going to happen. All three tablets, along with their apps, will trickle out over the next few months.
Google needs to start exercising more control over its hardware partners, carrier partners, and the Android Market (see this week's malware outbreak). Android is now important enough to its partners that Google has the power to coordinate product launches and updates, if it chooses to do so. Google can still have open source and an open development process, but it needs to act a bit more like Microsoft used to when it coordinated hardware partner launches with new versions of Windows.
Apple controls its own hardware, software and distribution infrastructure. That level of control causes a lot of consternation on the part of consumers and developers who'd like the freedom to run what they want, when they want, on Apple's devices. However, Apple's control gives it a significant ongoing advantage over Android.
The existing and forthcoming collection of Android tablets demonstrates the disadvantages of Google's approach. Last fall, Samsung introduced its Galaxy Tab tablet, which launched with Android 2.2. It was officially endorsed by Google, meaning that it got access to the Android Market and could run Google's own Android apps, although Google's own executives cautioned that Android 2.2 was designed for smartphones, not tablets. Galaxy Tab purchasers reported that the product functioned more like a big smartphone than a tablet, and as reported by a number of sources, sales have been disappointing.
Google's hardware and mobile carrier partners are under no obligation to upgrade smartphones and tablets in the field to the most recent version of Android. They're not even obligated to release new devices with the latest version of Android. Google itself released Google TV with an earlier version of Android. Google keeps track of the versions of Android in use by monitoring Android Market app downloads, and according to its own statistics, more than 10% of Android devices actively in use are still using Android 1.5 or 1.6. (Google's statistics don't track the "unauthorized" Android devices that don't have access to the Android Market.)
Update, March 6, 2011: Here's an excellent example of Google's dilemma: According to Engadget, Olivetti in Italy just announced its OliPad Android tablet. It has all the hardware features it needs to support Android 3.0: Tegra 2 processor, 10" 1024 x 600 display, WiFi and 3G, yet it's being released with Android 2.2 rather than Android 3 as its operating system.
Many developers have complained about the variety of Android versions in use, but Google's management says that it's a "non-issue". Nevertheless, buyers of Android devices have no guarantee that their smartphone or tablet will ever be upgraded to a newer version of Android. Apple developers can largely target the newest version of iOS, with assurance that it will quickly spread to most iOS devices, but Android developers have no such assurances.
That brings us to the current wave of tablets running Android 3.0, also known as Honeycomb. First out was Motorola's Xoom, which launched with fewer than 20 tablet-aware apps, a price very near that of Apple's top-of-the-line iPad, and LTE compatibility that will require owners to send their tablets back to Motorola for a free upgrade. Early reports indicate that the Xoom is selling much more slowly than Motorola or Verizon, its sole carrier partner in the U.S., expected. Next up will most likely be LG's G-Slate, with similar specifications and a carrier partnership with T-Mobile, followed by Samsung's Galaxy Tab 10.1.
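As an aside, "tablet-aware" mostly means an app that notices it's running on a tablet-class screen and adapts its layout accordingly. Here's a minimal sketch of one common runtime check (the helper class is hypothetical; the Configuration constants are real Android APIs):

    import android.content.Context;
    import android.content.res.Configuration;

    public class ScreenCheck {
        // Returns true on extra-large (tablet-class) screens.
        // SCREENLAYOUT_SIZE_XLARGE was added in Android 2.3 (API level 9).
        public static boolean isTabletScreen(Context context) {
            int size = context.getResources().getConfiguration().screenLayout
                    & Configuration.SCREENLAYOUT_SIZE_MASK;
            return size >= Configuration.SCREENLAYOUT_SIZE_XLARGE;
        }
    }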
Google could have taken control of the process and required hardware vendors and mobile operators to agree to upgrade their devices to the latest version of Android within a reasonable period after each new version is released, but it chose not to. It could also have compelled vendors to hold back their tablets until a critical mass of tablet-aware apps was available, but again, it chose not to. If you buy a Motorola Xoom today, or a G-Slate or Galaxy Tab 10.1 tomorrow, do you have any assurance that it will run Android 3.1, 3.5 or 4.0, or that the manufacturer or carrier will let you upgrade? The answer is no.
Google could have orchestrated a huge day-long event in April or May to introduce the Xoom, G-Slate and Galaxy Tab 10.1 together, each with its carriers, along with perhaps 1,000 tablet-aware apps. Each vendor could have separately announced its products, ship dates, prices and carrier partnerships at the event, and a Developer Showcase could have shown off the best of the new apps. But none of that is going to happen. All three tablets, along with their apps, will trickle out over the next few months.
Google needs to start exercising more control over its hardware partners, carrier partners and the Android Market (see this week's malware outbreak). Android is now important enough to its partners that Google has the power to coordinate product launches and updates, if it chooses to use it. Google can still have open source and an open development process, but it needs to act a bit more like Microsoft used to when it coordinated hardware partner launches with new versions of Windows.
Friday, March 04, 2011
NAB 2011: Free exhibit pass and discount on sessions
It's getting close to NAB 2011, the premier U.S. technology and business conference for broadcasters, videographers and Internet media producers, which will be held in Las Vegas in April. Conference sessions will run from April 9th to 14th, and exhibits will be open from April 11th to 14th. NAB is offering some special deals for "Feldman File" readers:
- You can attend the exhibits for free by clicking this link, or go to the main registration page and use code SM01.
- You can also save $100 on an all-access SMART Pass, a Conference Flex Pass (which gives you access to all conferences except Post|Production World), or a Post|Production World pass by clicking this link, or go to the main registration page and use the code SM08.
If you're visiting the exhibits only, give yourself at least one full day to get through all three exhibit halls (if you're well-organized and are willing to do some running). If you want to take your time, two or three days are best.
Thursday, March 03, 2011
Why you shouldn't announce products without delivery dates and prices
Yesterday, Apple announced the iPad 2 in San Francisco. I won't go over the details, since it seems like every news outlet in the U.S. has already covered them. However, I want to point out one thing that Apple does consistently and its competitors still do all too rarely: Apple announced hard retail availability dates and prices for the iPad 2, iMovie and GarageBand for iOS, as well as the availability date for iOS 4.3. When the event was over, everyone knew when the products would be available and how much they would cost. Apple does this regularly; in fact, it's rare for Apple to announce a product without giving hard prices and availability dates.
Compare that to what its competitors have done. When Samsung announced the original Galaxy Tab Android tablet, it didn't release any prices or availability dates; the information leaked out over the next several weeks. Motorola and Verizon didn't announce prices or availability dates when the Xoom was shown at Mobile World Congress; that information leaked out of Best Buy weeks later. LG's G-Slate came out with German pricing but no U.S. pricing or availability date. Samsung's new Galaxy Tab 10.1 has neither availability dates nor prices.
It's not just the Android tablet vendors who can't get their numbers straight. RIM has been showing the BlackBerry PlayBook for months, still without hard prices or a release date, although a date of April 11th has been leaked. HP held a big event in San Francisco to launch its new WebOS-based smartphones and TouchPad tablet, but gave no prices. As for the ship dates, HP was unwilling to get more specific than "Spring" or "Summer".
It's difficult to take a product announcement seriously if the manufacturer isn't willing to say how much the product will cost or when it will be available. I understand the problem when products are sold through mobile carriers, which have their own release schedules and pricing plans. However, Apple works with carriers around the world and has still managed to announce consistent release dates and prices.
In hindsight, given the difficulties that Samsung had with the original Galaxy Tab and that Motorola is having with the Xoom, it wouldn't have hurt to delay the announcements until prices and availability dates were set. In the Xoom's case, it probably wouldn't have hurt to wait until it ships with LTE built in and until there's a decent population of tablet-optimized Android apps. Rushing out product announcements in order to "freeze" the market and keep consumers from buying competitive products may have worked once, but today, when new products are released continuously, consumers won't wait. They're going to compare the iPad 2, with its 65,000 tablet-optimized apps, against a version 0.9 Xoom that's not quite ready to ship and has fewer than 100 tablet-optimized apps, and in the vast majority of cases, the iPad 2 will win.
Google and its partners went through this with Google TV, which should have been announced as a product concept to encourage app developers, but instead was rushed to market at too high a price, with inadequate content partnerships and insufficient user experience testing. RED preannounced its Scarlet camcorder years before it was ready, and ended up educating its competitors, frustrating its customers and frittering away its market credibility.
If you can't announce a hard price and release date, you shouldn't announce a product. It's as simple as that.
Labels:
Android,
apple,
BlackBerry PlayBook,
Google,
HP,
IOS (Apple),
iPad,
Motorola Xoom,
Research In Motion,
Verizon
Tuesday, March 01, 2011
Blockbuster: Pass reorganization, head directly to liquidation?
In court filings yesterday and today, several parties to Blockbuster's bankruptcy case objected to the company's proposed sale plan. According to the Los Angeles Times, the objectors include "Walt Disney Studios, Universal Studios, landlords, unsecured creditors, other parties and the office of the U.S. Trustee, a Justice Department unit that oversees bankruptcy proceedings." The problem is that the plan filed on February 21st, which would sell the company to four major secured creditors for $290 million, would give the buyers the right to determine who receives the proceeds of the sale. Unsecured creditors, such as the movie studios and the landlords that own Blockbuster's stores, would get little or nothing.
It's not a surprise that the unsecured creditors objected to the sale, but when the U.S. Trustee also objected, it made it much more likely that the court will take the matter very seriously. According to the Wall Street Journal, the unsecured creditors would prefer that the court convert the proceedings to a Chapter 7 liquidation, rather than allow the sale to go ahead as proposed. As a practical matter, it's very difficult to see how Blockbuster could survive without the support of the movie studios and distributors. It's their movies and television shows that Blockbuster rents, and if they refuse to supply product to the reorganized Blockbuster, it would cease to be a viable business.
Blockbuster could still modify the sale proposal to satisfy its unsecured creditors and the U.S. Trustee, but it's looking increasingly likely that Blockbuster will end up in liquidation.