According to the Wall Street Journal, Borders announced yesterday that it is delaying payments to some publishers in order to conserve cash, and is trying to restructure payments to those publishers. The company is also trying to refinance its operations, but it says that "there can be no assurance" that it will do so. If it doesn't refinance its existing debt, it will default on some of its lending agreements in Q1 2011, which could lead to "a liquidity shortfall"--in other words, not enough money to fund ongoing operations.
What's even more concerning to some analysts is that Borders is doing this after the end of the Christmas shopping season, when the company's cash reserves should be at their highest level of the year. It suggests that Borders may have significantly missed its sales targets for Christmas. Borders has shown a profit in only two of the last 11 quarters--Q4 2008 and Q4 2009, the quarters that included those years' Christmas selling seasons.
The only publisher that has so far announced that Borders has delayed its payments is Hachette, whose imprints include Little, Brown and Grand Central Publishing. Hachette is one of the "Big 6" U.S. trade publishers (the others being HarperCollins, Macmillan, Penguin Group, Random House/Bertelsmann and Simon & Schuster), so if Borders is delaying payments to one of them, it's probably delaying payments to all of them.
Even if only one of the "Big 6" decides to stop shipments to Borders, it will dramatically impact the company's ability to keep operating. If customers learn that they can't purchase the books they're looking for at Borders, they'll switch to Barnes & Noble or Amazon. I doubt that any publisher wants to see Borders fail, but they also don't want to advance more inventory to Borders on extended credit, only to see those receivables frozen should the company declare bankruptcy.
The clock is ticking, and the next 90 days may be the most important in Borders' history.
Friday, December 31, 2010
Thursday, December 30, 2010
A peaceful alternative to the "war" for top talent?
The Wall Street Journal ran an article yesterday that merely reinforces what anyone in Silicon Valley has known for some time--there's a war going on between the startups looking to hire top developers and the established companies looking to keep them. (Example #1 of the established companies is Google, which has raised salaries by 10%, given every employee a surprise $1,000 cash bonus and offered insane stock options to top developers to keep them from defecting.)
According to the WSJ, Okta, the San Francisco-based startup that's the focus of the article, plans to spend 80% of its $10 million Series A round on salaries, most of which will go to developers. Part of the problem is that salaries for developers in the San Francisco Bay Area are dramatically higher than in most other parts of the country; Okta is paying $75,000 for developers just out of school, and up to $150,000 for top developers, while the national median salary for entry-level developers is $51,000, and $101,000 for experienced, senior-level developers (based on figures from Salary.com).
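To put those figures in perspective, here's a rough back-of-the-envelope sketch comparing how many developer-years an $8 million salary budget buys at Bay Area rates versus the national medians. The 50/50 junior/senior team mix is my assumption for illustration, not anything Okta has disclosed, and payroll overhead is ignored:

```python
# Rough sketch: developer-years an $8M salary budget buys at Bay Area rates
# vs. national medians (the salary figures cited above). The 50/50
# junior/senior mix is an assumption for illustration only.

budget = 0.80 * 10_000_000  # 80% of a $10M Series A earmarked for salaries

rates = {
    "Bay Area":        {"junior": 75_000, "senior": 150_000},
    "National median": {"junior": 51_000, "senior": 101_000},
}

for market, salary in rates.items():
    # Assume a team split evenly between junior and senior developers.
    blended = 0.5 * salary["junior"] + 0.5 * salary["senior"]
    dev_years = budget / blended
    print(f"{market}: blended salary ${blended:,.0f} -> "
          f"about {dev_years:.0f} developer-years")
```

Under those assumptions, the same budget buys roughly 71 developer-years in the Bay Area versus about 105 at national median rates--close to 50% more engineering for the same money.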
These costs are driven in part by the cost of living in Silicon Valley, which is higher than anywhere in the U.S. except for portions of New York City. In addition, even though there are far more developers in Silicon Valley than in any other comparable area in the U.S., everyone wants the best developers, and there are only so many of them to go around. That competition inflates the salaries that companies have to pay for talent.
In addition, the decline in IPOs has made candidates skeptical about the value of equity. In the "Dot-Com" years, startups could offer sub-par salaries, even to top talent, so long as they gave them substantial stock options. Today, when the exit strategy for most startups is to be acquired, most of the proceeds go to the angel investors, venture capitalists and founders; very little is left over for employees. So, while startups are very picky as to who they hire, the candidates demand top salaries (and still demand equity as well).
One solution to this logjam is to stay away from Silicon Valley. I've written about this many times before, but moving from almost anywhere else in the U.S. to Silicon Valley will instantly impose an operating cost penalty of 30-50% on your startup. Cities like Austin, Boston, Boulder/Denver, Chicago, Pittsburgh, Portland and Raleigh/Durham/Chapel Hill have excellent quality of life, strong technology bases, top universities and much lower costs of living than Silicon Valley.
Here's the key: These areas don't need to be "the next Silicon Valley" in order to be successful. They don't have to replicate the entire Silicon Valley infrastructure: Money and resources are now global. The biggest investor in Chicago's Groupon, for example, is Digital Sky Technologies, based in Moscow. Moscow is a long drive from Sand Hill Road.
Having lived in Chicago for two years now after 25 years in Silicon Valley, I find there's not much that I miss. Yes, the weather there was much more to my liking, I enjoyed being able to drive to the coast in an hour, and the seafood was far better. On the other hand, I paid 50% more for a semi-squalid apartment, virtually everything cost much more, and California's state income taxes were three times higher than Illinois'. For me at least, it's a reasonable trade-off.
So, one solution to the "war" for top talent is to stay out of the battlefield.
Labels:
Austin,
Boston,
Boulder,
Chapel Hill,
Chicago,
Denver,
Durham,
Pittsburgh,
Portland,
Raleigh,
Silicon Valley,
startups
Monday, December 27, 2010
Logitech halts manufacturing of its Google TV-based Revue: The question is, will it ever resume?
This one slipped past me on Christmas Eve, but DigiTimes reported that Logitech has told the manufacturer of the Google TV-based Revue, Gigabyte, to halt manufacturing until the beginning of February, 2011 at the earliest. Google has asked its partners not to show new Google TV devices at January's Consumer Electronics Show in Las Vegas, because it's working on improved software.
Update, 27 December: In response to the DigiTimes article, Logitech issued a very nuanced statement to Barron's, the weekly financial newspaper published by Dow Jones, the Wall Street Journal's parent. Logitech said that Google did not ask the company to suspend its shipments of the Revue. It says that it is continuing to ship "products" to its customers (although the statement doesn't name the Revue as one of them), and that it doesn't comment on specific production plans for any of its products. In other words, Logitech doesn't deny the DigiTimes report, nor does it confirm that it's currently fulfilling new orders of the Revue for anyone.
I anticipated all of this when Google TV was first demonstrated. It was clearly rushed to market, with little to no coordination with other product teams within Google, no third-party apps and little third-party content. The question now is, even if Google improves the Google TV software platform, will Logitech restart production, and will the other companies (including Toshiba, LG Electronics, Sharp and Vizio) that were working on Google TV products follow through with their plans?
Friday, December 24, 2010
A year-end head-scratcher from Google
Last Wednesday, Google announced that it's giving $1,000 credits for purchases at B&H Photo to 500 of its YouTube Partners. I hate to fall into the "no good deed goes unpunished" category, but I'm scratching my head over Google's logic. Google's goal is to help its most popular partners improve the quality of their videos, but $1,000 doesn't go a very long way.
With $1,000, you can purchase a decent HD consumer camcorder, or a few lights, or a copy of Final Cut Studio, but not a computer to run it on. None of this is going to move the quality needle very much. Further, these grants are taxable, so the net value is considerably less than $1,000. The real value of the program seems to be to B&H--to get any real improvements, people will have to buy more than $1,000 worth of products, and they have to buy them from B&H. In addition, YouTube has 15,000 Partners, yet only 500 got the grants. That means that more than 96% of YouTube's partners are angry that they didn't get any money.
Google would have gotten a lot more value for its money if, instead of giving $1,000 to 500 partners, it gave $10,000 to 50 partners. With $10,000, you can buy much better camcorders (two Panasonic AG-AF100s, for example), or a complete editing and color-correction system. You can buy much better audio equipment. In fact, if you're careful, you can buy enough hardware and software to dramatically improve the quality of your videos, which is the point of the program.
If I received $1,000 from Google, I wouldn't complain, but this program seems like a waste of money.
Thursday, December 23, 2010
Improve Firefox's performance by turning off plugins
This is a bit off-topic, but if you're a Firefox user, it may be helpful. Like many Firefox users, I've installed many extensions, but since those extensions are essentially separate programs, they bog the browser down. Eventually, I go through and turn off or uninstall all but the essential extensions. However, that doesn't always fix the performance problems.
There's another category of Firefox add-on that most users pay much less attention to--plugins. Media players, such as those for Adobe Flash and Apple QuickTime, are installed into Firefox as plugins, but many other applications silently add plugins to Firefox as well. You can uninstall an application, but its plugin may be left behind.
To review and clean up your plugins, go to the Tools menu, select Add-ons, and then click on the Plugins icon. Turn off all the plugins for media formats and applications that you no longer use. If you see a plugin that you don't recognize, you can turn it off, and if it causes problems with viewing certain content or running applications, simply turn it back on. Depending on the number of plugins you've got, you may see a significant improvement in Firefox's performance.
Labels:
Add-Ons,
Firefox,
Mozilla Firefox,
Plug-in (computing)
Wednesday, December 22, 2010
Lessons learned from the Feldman File videoblog
I produced five episodes of the Feldman File videoblog from late October to early December, in part as a challenge to see how much I could do with inexpensive, consumer-grade hardware and software. I put the videoblog on hold after five episodes because viewership dropped below the level where it made sense to continue producing it in its current format. However, I learned a few things that may be helpful to you:
- It's perfectly reasonable to use an inexpensive, Flip-style camcorder to shoot a videoblog, and the results are much better than using a webcam. The camcorder I chose was Sanyo's VPC-CG102. It's priced about the same as a Flip, but it has a monitor that folds out and turns 180 degrees, so I could check framing and focus without jury-rigging mirrors.
However, I ran into problems with the Sanyo when I shot some green-screen footage. The footage looked fine by itself, but when I composited it with a background image in iMovie '11, it looked terrible. Noise levels were very high.
- iMovie '11 is a great video editor for its price, but there's very little control over "trick features" such as green screen mattes. You get what you get. Rather than move to a more expensive editing package, I did some investigation and purchased FXhome's CompositeLab Pro. It offers much more control over the quality of mattes and only costs $149. (If you're also interested in video effects, take a look at their VisionLab Studio, which combines the green screen compositing of CompositeLab with visual effects and color grading, for $349.)
That improved, but didn't eliminate, the noise problems. The real solution will be to replace the Sanyo with a better camcorder.
- Singular Software's DualEyes is a real life-saver. It allows me to record audio on a separate recorder and sync the audio tracks from the recorder and camcorder automatically. Manually syncing the tracks would have taken hours per episode.
- If I were starting all over, knowing what I know now, I'd spend the money on a better camcorder. I probably could make do with iMovie '11's green screen capabilities, and could record usable audio directly from the camcorder, eliminating the need for an external audio recorder and DualEyes.
Here's the software I used to produce the videoblog:
- Adobe Photoshop, Illustrator and Fireworks (creating graphics, title slides, lower thirds, etc.)
- Adobe Audition (editing audio tracks, noise reduction, EQ, etc.)
- Apple iMovie '11 (video editing)
- Apple Keynote (displaying title slides)
- Singular Software DualEyes (synchronizing camcorder footage with separately-recorded audio)
- FXhome CompositeLab Pro (compositing green screen footage)
- There has to be more visual interest than just pictures of products and text slides. The comment that I got most often was "Why can't I see you?". I went to a narration-only format after a few episodes, but viewers like to see people.
- I've been doing everything myself, but it's much easier if you have at least one other person helping you.
Labels:
Adobe Audition,
Adobe Photoshop,
apple,
CompositeLab Pro,
DualEyes,
IMovie,
IWork,
Videoblog
Sony's new subscription music service swings and misses
The Wall Street Journal's All Things D reports that Sony is launching a new cloud-based music service in the U.K. and Ireland today, and will expand it worldwide in 2011. The service works with Sony's Playstation 3, Vaio PCs and Bravia TVs. It requires a continuous Internet connection, so it doesn't work with any of Sony's digital music players--exactly the types of devices that you'd expect a digital music service to work with. It also doesn't work with any mobile phones, not even those from Sony Ericsson.
A cloud-based music service is a good idea, but one that doesn't have a download option or support "out of the chute" for mobile devices? What were they thinking? I keep asking if there's anyone in charge at Sony who knows what they're doing, and the answer always seems to be no. If you want a portable music solution from Sony, you might want to consider purchasing an old Walkman from eBay.
Monday, December 20, 2010
Someone at Cisco is having a very bad day
Engadget is running a story about Oliver Kreylos, a lecturer and researcher at UC Davis. He was one of the earliest experimenters with open-source drivers for Microsoft's Kinect, and he's posted a video demonstration of his primary project: 3D telepresence. By placing two Kinects almost 180 degrees apart and facing each other, he's able to create a live 3D image of a remote participant in a teleconference, matted into a virtual set that looks like an office. Here's a video of the demonstration:
To navigate around and through the remote scene, he's using a Wiimote to move a virtual camera (the actual Kinects stay in fixed locations). The person in the remote location can see the location of the "camera" by donning 3D glasses.
The quality of the video is still somewhat lacking--the resolution of the Kinects is 640 x 480--and there's only so much live streaming data that you can push over a USB 2.0 interface. But Kreylos did this with two $150 Kinects--everything else either already existed in his lab, or he could get it easily.
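To get a sense of how accessible the Kinect already is to experimenters, here's a minimal sketch using the open-source libfreenect project's Python wrapper (the same family of community-built drivers Kreylos's work grew out of). It assumes the libfreenect bindings and NumPy are installed and a Kinect is plugged in:

```python
# Minimal sketch: grab one 640x480 depth frame and the matching RGB frame
# from a Kinect via the open-source libfreenect Python wrapper.
# Assumes libfreenect, its Python bindings, and NumPy are installed.
import freenect
import numpy as np

# sync_get_depth() returns (depth_array, timestamp); the depth map is a
# 640x480 array of raw 11-bit disparity values.
depth, _ = freenect.sync_get_depth()
video, _ = freenect.sync_get_video()   # the matching 640x480 RGB frame

print("Depth frame shape:", depth.shape)           # (480, 640)
print("Closest raw reading:", int(np.min(depth)))  # smaller = nearer (roughly)
```

That's essentially the raw material for a 3D reconstruction: combine the depth and color frames from two calibrated Kinects and you have the point clouds Kreylos is merging and rendering.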
Cisco sells telepresence systems that are designed to make you feel that you're in the same room as the other person, at prices of $300,000 a system and up. They have less-expensive systems that are essentially minor variations on teleconferencing systems sold by many companies. If a developer at UC Davis could cobble together a 3D telepresence system using $300 worth of off-the-shelf hardware, how can a business justify $300,000 for a system that does something similar?
We've been waiting for Microsoft's "next great innovation" for years. Surface wasn't it, nor was Zune, and neither will it be Windows Phone 7. It looks like Kinect is the first truly game-changing (no pun intended) innovation from Microsoft in years. So, start thinking about what can be done with HD Kinects using USB 3.0 interfaces. This is the 3D future, folks--not Jeff Katzenberg rolling out endless inane 3D movies, but rather, individuals, businesses, schools and institutions integrating 3D into our everyday lives.
Sezmi drops basic cable channels
Sezmi, the hybrid broadcast/over-the-top Internet service, announced last week that it will discontinue its $19.99/month bundle of 23 basic cable channels. The cable channel service was only launched in a portion of the Los Angeles market, and Sezmi claims that customers didn't want it, although it was the only thing differentiating the Sezmi service from a good over-the-air antenna.
Now, Sezmi is falling back to a package combining broadcast TV, video-on-demand and Web content for $4.99/month. However, in order to use the Sezmi service, subscribers need a high-speed Internet connection and Sezmi's $150 bundle of a broadcast antenna and a 1-terabyte DVR. By comparison, consumers could subscribe to the ivi TV service for $4.99/month, which only requires a high-speed Internet connection and runs on most personal computers.
I don't think that lack of customer interest was the only, or even the primary, reason why Sezmi dropped its cable package. However, Sezmi now has an additional problem--many customers in its 35 other markets bought the Sezmi system with the expectation that they would eventually get access to the cable channel package. Now that the cable channel option is dead, I expect many users to discontinue the service or demand refunds.
Sezmi has changed its focus to providing IPTV services for telecom providers in countries with minimal infrastructure, such as the company's recent deal with Malaysian service provider YTL Communications, using YTL's LTE wireless network. As a result, the eventual discontinuation of its U.S. service may not be a big problem. However, Sezmi's problems once again illustrate the difficulties for new players trying to break into the U.S. multichannel video business.
Wednesday, December 15, 2010
Attention Jeff Bewkes: It's coming, whether you like it or not
Jeff Bewkes, the CEO of Time Warner, has been getting some cheap shots off recently at Netflix's expense. Last week, at an entertainment conference run by UBS, Bewkes was asked about Netflix's offer to acquire rights to in-season television shows for $70,000 to $100,000 an episode, and he said that it "... is a measly little offer...that is not attractive or incremental". So, what's Mr. Bewkes' attractive and incremental opportunity? According to him, it's syndication revenues and DVD sales.
Let's examine that argument. In the U.S., television broadcasters are struggling to recover from the recession. That's a big reason why there's been so much emphasis on payment for retransmission rights by cable, satellite and IPTV service providers; broadcasters are trying to tap into whatever revenue sources they can find. There's not a lot of money in broadcasters' pockets to pay more for syndication rights. But what about DVD sales? Research firm In-Stat forecasts that physical video media sales will decline by $4.6 billion from 2009 to 2014. That's not just a decline in DVD sales--that's the total decline in both DVD and Blu-Ray sales. In other words, Blu-Ray, the technology that was going to save the movie studios, won't. On the other hand, In-Stat forecasts that video downloads and streaming will increase by $4 billion, from $2.3 billion to $6.3 billion, over the same period. Essentially, Bewkes' company has no choice but to sell to Netflix and its competitors, simply to compensate for the decline in DVD and Blu-Ray sales.
In a separate interview with the New York Times last week, Bewkes referred to Netflix this way: "It's a little bit like, is the Albanian army going to take over the world? I don't think so." The problem is that in many ways, Netflix's army is bigger and has more weapons than Bewkes' own. As The Wrap pointed out, Netflix now has the rights to stream movies comprising 49% of the 2010 U.S. theatrical box office. By comparison, Bewkes' HBO only has access to 43% of the 2010 box office, and HBO is bleeding subscribers, while Netflix continues to grow.
Perhaps by bad-mouthing Netflix, Bewkes believes that he's improving his negotiating position or demonstrating what a tough manager he is. The overall impression, however, is that he's out of touch with reality. Given Time Warner's history of mismanagement over the years, that's not reassuring, either to Time Warner's shareholders or employees. It might be a good time for him to decline additional interview requests and get a better handle on what's really going on in his businesses.
U.S. holiday buyers are avoiding high-end consumer electronics
Yesterday, the price of Best Buy's stock plunged on the company's report of lower year-to-year sales for its fiscal Q3, and expected lower sales for the entire year. Analysts said that Best Buy poorly merchandised its "Black Friday" deals, while the company said that it deliberately decided not to offer the "third tier" (their term) consumer electronic brands and deals that Walmart and Target offered.
Even though the U.S. Commerce Department reported that overall retail sales rose 0.8% in November, consumers were much smarter about their Black Friday purchases this year. Historically, retailers have offered a limited number of low-priced products on Black Friday, expecting customers who find the product they want to be sold out to buy more expensive (and profitable) products. This year, consumers either walked out without buying anything if the bargain they were looking for was sold out, or put so much pressure on store management that they were allowed to purchase more expensive products at the price of the sold-out models.
At the same time, ECN Magazine is reporting that consumers are avoiding 3D and Internet-enabled HDTVs. The article quotes a fund investor who questions the logic of CE companies pushing expensive new 3D HDTVs just a year or two after most people purchased their first big-screen HDTVs. Consumers don't see the value proposition, especially in a still-fragile economy. In addition, consumers are put off by the requirement to purchase expensive 3D glasses for every person who wants to watch. There's also a dearth of 3D content, and consumers are afraid of becoming nauseated or getting headaches if they watch 3D.
In addition, consumers are avoiding Internet-enabled HDTVs if they're priced significantly higher than comparable non-networked devices. They realize that they can add Internet connectivity by purchasing an Apple TV or an Internet-enabled Blu-Ray player (some of which are available for under $100). They're also comparing the prices of those products with Google TV-enabled devices, and are going with the less-expensive products. As with 3D, they don't get Google TV's value proposition and can't justify the higher price.
What does it all mean? 3D HDTV won't really take off until an economical, reliable system that doesn't require glasses is available--which is what the CE industry should have waited for in the first place. As for Google TV, it may already be dead, but it certainly won't take off until the price is $99 or less. Google and its partners had no good reason to rush the products, which were clearly half-baked, into the market so soon. By the time they fix their many problems, consumers will have moved on.
Tuesday, December 14, 2010
Comcast's Xcalibur: Another race back to the past?
Light Reading has published an article about Comcast's new Xcalibur service, which is in an early test in Augusta, GA. The details of the service are still sketchy, but it appears to be a hybrid web-to-TV set-top box codenamed "Parker", possibly the one under development in Comcast's Seattle labs that I wrote about last year. According to Light Reading's sources, Xcalibur is a "managed over-the-top" video service that's designed to compete with offerings such as Apple TV, Boxee, Roku and Google TV while keeping subscribers inside the Comcast tent.
Xcalibur is said to offer a limited, hand-picked selection of websites and video services. It's yet another "walled garden" approach, and it's very likely that you won't find any content on Xcalibur that's competitive with any of Comcast's existing services.
It's true that the other over-the-top video services are also walled gardens to some extent, but with the exception of Apple TV, that's not because they're deliberately closed off. It requires some development work, but anyone can make their video content available on Boxee, Roku or Google TV. In the case of Boxee and Google TV, they want to offer even more web video content but are actively being blocked by the broadcast networks and content owners.
If you want to envision what Comcast could be doing, consider France's Free. Earlier today, Engadget wrote about the Freebox 6, a new over-the-top IPTV set-top box just launched by Free (Iliad), the French Internet service provider. Here's what the box has:
- 250GB hard drive
- Blu-Ray drive
- 802.11n WiFi adapter
- HDMI output
- 3D support
- Built-in DECT adapter for wireless telephones
- Web browser
- Motion-sensing remote
- Joystick
- Two powerline Ethernet adapters
Comcast, and virtually every other U.S. multichannel video service provider, is driving to the future while staring in the rear-view mirror. A walled garden approach won't fly, but you wouldn't expect anything more from a company whose most exciting recent development is an app that turns an iPad into a remote control.
Saturday, December 11, 2010
Amazon Web Services: An incredible set of building blocks
Werner Vogels, Amazon's CTO, announced on Thursday that Amazon Web Services' S3 file storage and management service has increased its maximum object size from 5GB to 5TB--a 1,000x increase. Vogels uses a two-hour uncompressed 1080p HD movie requiring 1.5TB of storage as an example of the kind of content that Amazon's S3 expansion can handle.
Amazon's cloud computing capabilities are almost taken for granted now, but it's useful to step back and think about what can be done. You can store and distribute vast quantities of data through AWS. Through EC2, the Elastic Compute Cloud, you can configure as much or as little compute capacity as you need, including clusters. If you're doing numerically intensive processing or even graphic rendering for motion pictures and games, AWS offers GPU (Graphics Processing Unit) clusters based on Nvidia's Tesla GPUs. If you've got a large library of video, audio or HTML content, you can use Amazon's CloudFront CDN (Content Distribution Network) instead of contracting with companies such as Akamai, Limelight and Level 3.
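As a concrete illustration of how little code it takes to start using this infrastructure, here's a minimal sketch using the boto library to create an S3 bucket and upload a video file to it. The bucket name and file path are placeholders of my own, and it assumes your AWS credentials are set in the environment:

```python
# Minimal sketch: create an S3 bucket and upload a file with the boto library.
# The bucket name and file path are placeholders; AWS credentials are assumed
# to be set in the environment (or passed explicitly to S3Connection()).
from boto.s3.connection import S3Connection
from boto.s3.key import Key

conn = S3Connection()                                    # picks up AWS keys from the environment
bucket = conn.create_bucket("my-example-video-bucket")   # bucket names are globally unique

key = Key(bucket)
key.key = "videos/episode-01.mp4"                 # the object's name within the bucket
key.set_contents_from_filename("episode-01.mp4")  # upload the local file

# Optionally make the object publicly readable so it can be served directly
# (or fronted by CloudFront).
key.set_acl("public-read")
print("Uploaded:", key.generate_url(expires_in=3600))  # signed URL, valid for one hour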
There are many other things you can do--build iOS and Android applications that use AWS services, deploy relational databases, and even run your own DNS servers. Through third-party services built on top of AWS, you can encode and transcode videos. It's all impressive, but what's even more impressive is that you can create, configure and manage all of these services from a notebook computer in your living room. Further, you can get started with many of these services, and even launch small-scale alpha and beta tests, for free.
Even the smallest startup now has access to world-class infrastructure, and the barriers to entry based on infrastructure are almost completely gone. With AWS, Apple's iOS, Google's Android, Javascript, HTML5 and the server programming language of your choice, you've got an incredible set of building blocks, most of which are free or nominally priced.
The death of the elevator pitch
You've probably heard the term "elevator pitch": It's a very brief description of a product, service or business idea--no more than a few sentences, and no longer than 60 to 90 seconds. The term "elevator pitch" comes from the length of time that you have to pitch someone if you're riding the elevator together. A clear, coherent elevator pitch distills your idea down to its essence. If you can't distill your idea down to an elevator pitch, the conventional wisdom is that it's either too complex or you haven't thought it through sufficiently.
The elevator pitch idea originated in the entertainment industry, where it was called "High Concept". Producers and screenwriters had to reduce their story ideas and scripts down to a few words: "Cars that turn into giant robots!". "Spider-Man as a movie!". "Aliens invade (fill in the blank)!". High Concept has led to an endless stream of remakes, movies based on successful concepts from other media, and an almost complete absence of films with complex stories that don't rely on things that blow up.
The elevator pitch has become an overworked shortcut to critical thinking. Entrepreneurs frame their business ideas on how well they can turn them into elevator pitches, not whether they really represent a unique opportunity. By definition, a 60- to 90-second pitch can't convey a complex idea or a truly disruptive innovation. That's one reason why we get endless clones of ideas such as location-based check-in services, or personal financial services, or any number of other businesses.
Lon Chow, a partner with Apex Venture Partners in Chicago, pointed out in an article I recently wrote on Chicago's angel investors that he sees far too many entrepreneurs who say that they're going to be the "X of Y", where X is Google, Facebook, Twitter, Salesforce.com, eBay, Groupon, or another successful business. Analogies are simple to create, present and understand, but they can be dangerous. This kind of analogy lends itself to an elevator pitch but indicates that the person making the pitch hasn't thought through their idea well enough.
Here's a practical example: Every year (and for a few years, twice a year), the DEMO Conference has featured a hand-picked list of startups. The key requirement for consideration is that this must be the first public demonstration of any product or service to be displayed at DEMO. Each company only has six minutes to present its product or service on stage. After all the pitches are made, the conference organizers and audience vote on the best presentations--these are what they call the DEMOgods. Some of the companies that have won the DEMOgod award include:
- DoDots
- uTOK
- Zaplet
- e-tractions
- Groxis
- Speechi
- FourSticks
- YackPack
- UniPrivacy
Unless you follow the industry closely, you've probably never heard of most of these companies. The process of developing an elevator pitch can be very helpful for honing marketing messages and cutting away unnecessary clutter, but big ideas can't, and shouldn't, be reduced to elevator pitches. In fact, if an idea can easily be reduced to an elevator pitch, if you can easily explain it in a couple of sentences, it's likely either trivial or easily duplicated by others.
Smart investors spend much more time examining the people in a startup than they do the elevator pitch. There's always value in a concise message, but if you haven't thoroughly thought out all of the issues behind those first 60 seconds, sooner or later your investors will figure that out. At best, the elevator pitch is a way to open the door; at worst, a bad elevator pitch can close the door before you ever get a chance to open it. If your idea doesn't lend itself to an elevator pitch, pursue the idea, not the elevator pitch.
Friday, December 10, 2010
Putting the videoblog on hiatus
I've decided to put the Feldman File videoblog on hiatus. Viewership increased nicely for the first three episodes, but it dropped for the most recent two; this week's episode has only had 14 views so far. Clearly, the content and production style aren't finding an audience, so I'm going to consider some other approaches, including a conventional audio podcast. For those of you who watched my videoblog, thank you! I hope to come up with something that's still informative but is more entertaining.
Wednesday, December 08, 2010
Google and Facebook: More similar than you might think
Google and Facebook are physically located fairly close to each other, in Mountain View and Palo Alto, California respectively, but they do very different things: Google is primarily a search engine, and Facebook is a social networking site. Dig beneath the surface, however, and you'll find that the two companies are actually very similar: Both of them sell your personal information in order to make money.
Google uses your search queries to feed you targeted advertising. If you use Gmail, Google displays ads based on the subject and contents of your emails. It targets ads to you on YouTube based on what you watch. If you use a Google location service, such as Google Maps, it points you to advertisers in your area. Google claims that it doesn't warehouse or mine the information that you give it, but it sucks up enormous amounts of data, and it's nonsensical to believe that Google isn't correlating that information.
Facebook, on the other hand, gets you to give it as much personal information as it can so that it can send you targeted advertising. The company also sells your information to its partners so that they can send you advertising and target their sales messages to you. Facebook correlates the information that users provide with that of their friends to build a comprehensive demographic and psychographic profile of each user. Most of Facebook's "initiatives" over the years have been attempts to convince its users to give it more of their personal information, changes in policies to make more of that information public, or programs for monetizing that information.
Whenever either company introduces a new product or service, it's important to ask how it will generate more salable information or offer more opportunities to monetize that information. At their core, that's what Google and Facebook are all about.
Labels:
Facebook,
Gmail,
Google,
Personally identifiable information,
YouTube
Tuesday, December 07, 2010
Eric Schmidt says that Chrome OS is a thin client. Really?
Google just ended its launch event for Chrome OS and the Chrome Web Store. The Chrome Web Store will offer HTML5/AJAX applications, and it's pretty straightforward. Far more perplexing was Google's launch of Chrome OS. The premise of Chrome OS is that it's the Chrome browser with only the system-level hooks necessary to control the underlying hardware of a netbook or similar device. The entire user interface is inside the browser window--there's no separate "desktop".
Google repeatedly said that Chrome OS is perfect for netbooks, and the company has been running it on its own prototype netbooks in-house. Acer and Samsung have committed to ship netbooks with Chrome OS some time in mid-2011. However, netbook sales have slowed as the iPad has gained popularity, and a flood of new tablets next year will drive netbook sales even lower.
So, will Chrome OS work on tablets? The Chrome OS team was asked about it in the Q&A session and sidestepped the question. Google didn't show any tablets running Chrome OS, and it didn't demonstrate any multitouch features. Android appears to be Google's tablet solution. If the market is moving away from netbooks and toward tablets, and Android is Google's tablet platform, why is Google putting resources into Chrome OS?
The answer was surprising. Google's Eric Schmidt said that Chrome OS is, in fact, the first commercially-acceptable implementation of the thin client architecture that Sun tried to sell years ago. Really? The reason for thin clients was that personal computers cost a lot of money and were difficult to maintain in a corporate IT architecture. Today, PCs are dirt cheap (cheaper than thin clients were "back in the day"), and it's far easier for IT departments to maintain their networks of computers.
So, does this mean that Chrome OS is being built for a platform (netbooks) that's rapidly becoming obsolete, and for an application (thin clients) that's already obsolete? Is anyone from the Android team talking with the Chrome OS team? Are they even in the same company? In short, is there anyone in charge at Google?
Labels:
Android,
Chrome Web Store,
Eric E. Schmidt,
Google,
Google Chrome,
Google Chrome OS,
iPad
U.S. newspaper advertising declines for 15th straight quarter
The Newspaper Association of America reports that print newspaper advertising declined year-to-year in Q3 2010 for the 15th straight quarter, and in 18 of the last 19 quarters. The last year where there was more than one positive quarter was 2005. The story for national advertising (national advertisers running ads in local newspapers) is even worse: Q3 2010 was the 23rd straight quarter of declines, and the last "up" year for national advertising was 2004.
The declines in advertising revenue were to be expected, given that print circulation has been declining for years, but the newspapers are caught in a vicious circle: as circulation declines, circulation revenue falls, and the smaller audience drags advertising revenue down with it. After two years of declines, newspapers' online advertising revenue has increased in all three quarters reported so far this year, but total online revenues in Q3 were still less than 14% of print revenues--far too little to offset the print declines.
Most newspapers are going to have to jump to an online-only strategy sooner or later, but how far can they ride their print businesses before they have no choice but to make the transition? Further, can they afford to offer a meaningful newsgathering and editorial service on their online revenues alone?
Monday, December 06, 2010
No "true" Android tablets until mid-2011?
Andy Rubin, the "father" of Android, was interviewed by Walt Mossberg and Kara Swisher at the D: Dive into Mobile conference this evening. After displaying and talking about the Samsung Nexus S, which will be the first mobile phone to ship running Android 2.3, or Gingerbread, Rubin showed a prototype of a Motorola tablet running Android and a new version of Google Maps. Mossberg asked Rubin what version of Android the tablet was running, and he said "This is Honeycomb. And it'll be out sometime next year."
That reply, along with other things that Rubin said, strongly suggests that Honeycomb, not Gingerbread, will be the first "officially sanctioned" version of Android for tablets. Given how long it takes Google's carrier and hardware partners to roll out new versions of Android, that means we're unlikely to see tablets with Google's full endorsement until mid-2011 at the earliest. By "full endorsement," I mean support for and permission to distribute all of Google's apps, access to the Android Market, and a solid library of third-party apps designed to take advantage of the tablet's screen size. There may be tablets with pre-Gingerbread versions of Android that get "special dispensations" from Google, as Samsung's Galaxy Tab did, but there won't be a wide selection of fully supported Android tablets before Honeycomb.
There will undoubtedly be plenty of tablet prototypes running Gingerbread at the Consumer Electronics Show next month, but it's almost certain that Apple will ship its second generation of tablets before the first Honeycomb tablets ship.
Labels:
Android,
Andy Rubin,
Gingerbread,
Google,
Honeycomb,
Kara Swisher,
Motorola,
Walt Mossberg
Is Groupon subject to network effects?
As virtually everyone with a computer knows, Groupon turned down Google's acquisition offer, said to total $6 billion, last Friday evening. Parties who claim to have been close to the negotiations say that Groupon didn't want to risk its culture as part of Google, and that the company believes that it's following in the path of Facebook, which turned down several acquisition attempts. (For that matter, it's following in the path of Google, which tried to sell itself to Excite in 1999 for $1 million, without success.)
Critics of the Google/Groupon deal say that it's easy to create a "daily deals" service, and in fact, there are dozens of them in the U.S. alone. The technical barriers to entry are very low. The same was true in the 90s, when eBay became the dominant Internet auction site. There were dozens of other auction sites, some of which focused on vertical niches, and others that took eBay head-on. However, eBay became dominant because it took advantage of network effects, where the value of a product or service increases as the number of people using it increases. (Bob Metcalfe argued that the value is roughly the square of the number of users.)
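To put that parenthetical in symbols (my gloss, not something from the original coverage): Metcalfe's estimate comes from counting the possible pairwise connections among n users,

    V(n) \propto \binom{n}{2} = \frac{n(n-1)}{2} \approx \frac{n^{2}}{2},

so doubling the number of active users roughly quadruples a network's value, while a service with no network effects merely doubles. That compounding is what lets an early leader like eBay pull away from otherwise similar rivals.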
In eBay's case, as more people used the service and more sellers posted products, a "virtuous circle" made the site more popular and gave users ever fewer reasons to use other auction sites. Groupon, however, is a different animal. Most Groupon users don't go to the site to find deals; instead, they get daily deals via email. Groupon's network effect would come from cornering the market for daily deals, and there are simply too many merchants out there for that to happen. Even Groupon realizes that it can't hire salespeople fast enough to keep growing, so it's implementing a service that lets merchants post their own deals.
Groupon may get to the point where consumers no longer pay attention to other daily deals and merchants no longer bother to post deals with other services, but it's not there yet and may never get there. That doesn't mean that Groupon won't continue to grow and won't be successful enough to eventually IPO, but it does mean that the "low barriers to entry" argument is credible. It's far too early to say whether Google dodged a bullet or Groupon made a brilliant decision, but either way, Groupon is no eBay.
Sunday, December 05, 2010
Episode 5 of the Feldman File videoblog is live!
This week's episode of the Feldman File videoblog is live on YouTube! If you can't see it here, click here to view it in your browser. Here are the stories in this week's show:
- Google's $6 billion offer for Groupon is rejected, but it acquires DRM specialist Widevine
- Verizon Wireless fires up its LTE broadband network and launches its first devices
- Comcast extends its agreement to acquire NBC Universal and contends with complaints from Level 3 and Zoom Telephonics
- Adobe releases the first beta of Flash Player 10.2
- The U.S. Justice Department is forced to drop charges against an accused Xbox 360 modder
- Flipboard adds HTML5 support and advertising
Labels:
Adobe,
Comcast,
FCC,
Flash Player,
Flipboard,
Google,
Groupon,
HTML5,
Level 3,
LTE,
U.S. Department of Justice,
Verizon Wireless,
Widevine,
Xbox 360,
Zoom Telephonics
Thursday, December 02, 2010
Flash Player 10.2: A big improvement?
Like many users, I haven't had a happy experience with Adobe's Flash Player, and it's gotten worse as Adobe has "improved" the player over time. Video playback using the Flash Player in both Firefox and Internet Explorer has gotten so bad for me on Windows that I've switched to Google Chrome for viewing Flash video. Adobe has apparently been paying attention to the criticism: the Flash Player 10.2 beta that the company released on its Adobe Labs website on November 30th is the first version in a long time with significantly better video performance for virtually all users, not just those with dedicated H.264 acceleration.
Flash Player 10.2 is the first public implementation of Adobe's Stage Video architecture, which makes better use of whatever video acceleration (usually in the GPU) is available in the user's computer. Adobe claims that Stage Video decreases CPU usage by up to 85%. As a practical matter, Flash videos are playing cleanly for me on Windows, without stuttering or dropouts, using Flash Player 10.2 in both Firefox and IE. Windows 7/IE 9 users will also benefit from graphics acceleration using any available hardware rendering capabilities. If you're using Flash Player 10.1 or earlier and don't need to keep it for development purposes, I strongly recommend uninstalling it and replacing it with 10.2.
RED, Apple and Plato's Cave
A couple of weeks ago, Jim Jannard of RED announced that the Scarlet would henceforth be called the EPIC Light, with new specifications and pricing to be announced. That set off a flurry of speculation about the new camera. The same thing is happening with Apple's Final Cut Suite, for the opposite reason--a lack of news beyond a couple of cryptic emails from Steve Jobs. There's also a lot of speculation about unannounced EVIL (electronic viewfinder, interchangeable lenses) cameras from Canon and Nikon, complete with "leaks" from unnamed sources.
You can easily find (unintentionally) funny exchanges between people who speculate about the unannounced features and delivery dates of these and many other products. It's like people looking at the shadows on the wall of Plato's Cave and debating what and who they are.
It's human nature to speculate, and I've certainly done lots of it, but at the end of the day, it's wasted time and effort. RED, Apple, Canon, Nikon, etc. will release the details of their products when they're ready, and if there's one thing I've learned after years in the technology business, it's that nothing is fixed in stone until you can actually order the product. So relax. It's fun to guess, but it's better to know.
Wednesday, December 01, 2010
Adobe Audition for the Mac: A change of heart
Last June, Adobe announced that it would begin beta testing a Mac version of its Audition digital audio workstation software late this year. I've used Audition on Windows since the late 1990s, when it was called Cool Edit Pro and was published by a company called Syntrillium. Audition became a favored audio editing tool for radio broadcasters and podcasters, but Adobe hasn't updated the software since 2007. I wondered why Adobe was bothering to release a Mac version of an application that had been bypassed by Avid's Pro Tools, Apple's Logic, and many other DAWs.
A few weeks ago, Adobe released the first beta version of Audition for the Mac, and even in beta, it's an excellent piece of software. It follows the user interface design of Audition 3.0 for Windows fairly closely, but takes advantage of OS X's multitouch capabilities. Many of the effects have been improved--for example, the noise reduction processor is both easier to use and much more effective than its Windows predecessor.
So, I take back the snarky remarks I made about Adobe porting obsolete software to the Mac. However, it doesn't change the fact that the Windows version is still three years old, hasn't had even a minor point update since early 2008, and desperately needs an upgrade.
Tuesday, November 30, 2010
Amazon: Heads I win, tails I win
Investor Place has published a summary of a ChangeWave Research survey which found that iPad owners, as a percentage of all the eBook reader owners surveyed, doubled from 16% in August 2010 to 32% in November, while the percentage who say they own an Amazon Kindle declined from 62% to 47% over the same period. The simple reading is that the iPad is growing rapidly and taking overall eBook reader market share from the Kindle, but that's not the whole story.
It's not surprising that the Kindle's market share is declining overall, and will decline further in 2011 as Android tablets that actually work well as tablets, rather than overgrown smartphones, reach the market. (Given that Amazon is widely rumored to be working on its own Android tablet, that might be a shift it encourages, especially if it makes more money per unit on its tablets than its Kindles.) There's no doubt that the iPad is outselling the Kindle, and every iPad can be used as an eBook reader.
Elsewhere in the survey, ChangeWave asked people who don't already own eBook readers but plan to buy one in the next 90 days which one they prefer. 42% of respondents chose the iPad, while 33% chose the Kindle. It's not great for Amazon, but not disastrous, either. In addition, one question that the survey leaves out is what software iPad owners are using to read eBooks. The chances are that a majority of them are using Amazon's Kindle reader on the iPad as one of their eBook reader applications, if not the primary one.
That's where Amazon wins either way: If a customer purchases a Kindle, they're locked into the Amazon eBook ecosystem, and Amazon makes money on both the hardware and eBooks. Even if a customer purchases an iPad, Amazon still makes money so long as the customer installs the free Kindle reader and purchases eBooks from Amazon. So, just because the Kindle's overall market share is declining, it doesn't necessarily mean that it's bad news for Amazon.
Labels:
Amazon Kindle,
Android (operating system),
Apple,
ChangeWave,
E-book,
iPad
Sunday, November 28, 2010
The Feldman File's videoblog is on vacation this week
Because of last week's U.S. Thanksgiving holiday and a "slow news week", the Feldman File's videoblog is taking the week off. It will return next weekend.
Saturday, November 27, 2010
Opportunities for angel investors and VCs in "Flyover Country"
Earlier this week, Technori published a two-part article that I wrote on "Angels in Chicago". The first part explained the concept of angel investing and introduced readers to four active angel investors in Chicago, and the second part discussed the problems that angels see in funding proposals, compared angel investing to incubators, and offered some suggestions for entrepreneurs.
There's a surprising number of angel investors in Chicago. AngelList names only 16 Chicago angels, but Hyde Park Angels, the biggest angel investing group in the city, has 76 members, only a handful of whom are on AngelList. There are other angel groups just getting started, like Wildcat Angels. And Groupon notwithstanding, the angels I interviewed said that the valuation inflation and competition for deals that investors are seeing in Silicon Valley and New York isn't happening in Chicago.
Terry Howerton, the head of the Illinois Technology Association, believes that the lack of VCs in Chicago is a bigger problem for startups than the number of angel investors. He said that there are only three or four VCs in the city doing Series A or later rounds, so VCs from other cities in the Midwest come to Chicago to look for deals. There's an enormous opportunity for angels and VCs outside Chicago to do deals here.
According to Bankrate.com, the cost of living in Silicon Valley is more than 30% higher than Chicago, while New York/Brooklyn is more than 50% higher, and New York/Manhattan is almost 80% higher. That means that a startup's labor costs (as well as taxes and some other operating costs) are going to be dramatically less in Chicago and other Midwestern locations. In addition, there isn't the competition for deals and valuation inflation that's occurring in Silicon Valley or New York.
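To make that arithmetic concrete, here's a minimal Python sketch using the Bankrate gaps cited above (roughly 1.3x, 1.5x and 1.8x the Chicago cost of living); the $90,000 base salary is a hypothetical example, not a figure from this post or the Technori article:

    # Rough comparison of what a given Chicago salary "costs" elsewhere,
    # based on the Bankrate.com cost-of-living gaps cited above.
    # The multipliers and the base salary are illustrative assumptions.
    COST_OF_LIVING_VS_CHICAGO = {
        "Chicago": 1.00,
        "Silicon Valley": 1.30,      # "more than 30% higher"
        "New York/Brooklyn": 1.50,   # "more than 50% higher"
        "New York/Manhattan": 1.80,  # "almost 80% higher"
    }

    def equivalent_salary(chicago_salary, city):
        """Salary needed in `city` to match the purchasing power of a Chicago salary."""
        return chicago_salary * COST_OF_LIVING_VS_CHICAGO[city]

    if __name__ == "__main__":
        base = 90000  # hypothetical developer salary in Chicago
        for city in sorted(COST_OF_LIVING_VS_CHICAGO, key=COST_OF_LIVING_VS_CHICAGO.get):
            print("{:<20} ${:,.0f}".format(city, equivalent_salary(base, city)))

By that rough math, a developer who costs $90,000 in Chicago costs on the order of $117,000 in Silicon Valley and $162,000 in Manhattan before competitive bidding for talent even enters the picture.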
Chicago has become a center of startup activity, especially for online services, as well as vertical B-to-B applications (financial, real estate, travel, etc.). There's a real opportunity here for investors, especially if you keep the companies in Chicago (rather than moving them to Silicon Valley or New York) and take advantage of the area's lower costs.
Labels:
Angel investor,
AngelList,
Chicago,
Hyde Park Angels,
Illinois,
Silicon Valley
Thursday, November 25, 2010
Alas, poor Stringer: Regime change underway at Sony?
Bloomberg News reported earlier today that Sony is seeking a new President. Sir Howard Stringer, who currently is Chairman, CEO and President of the company, is relinquishing the President position, and the primary candidates are senior executives at Sony who came out of the Engineering organization. According to Bloomberg, the new President will most likely succeed Stringer in running the company.
Few people would call Howard Stringer's reign at Sony a success. In the almost 5 1/2 years since Stringer took over, Sony's stock has declined 25%, while the overall Nikkei 225 average fell half as much. Sony has lost market leadership in almost every major product segment that it once dominated: Apple took over the portable audio player business, Samsung took over in televisions, and Nintendo regained its leadership in videogame consoles. Sony, which was once the company that others copied, now finds itself copying its competitors in order to survive: the PlayStation Move is a slightly improved copy of Nintendo's Wiimote, Sony's digital camera business has had to copy Canon, and its prosumer and professional camcorder businesses are copying Panasonic's designs. For all of Stringer's boasting about Sony's leadership in 3D, both Samsung and Panasonic delivered 3D HDTVs before Sony, and Panasonic is shipping 3D digital cameras and camcorders before Sony.
To understand why Stringer was brought to Sony, you need to understand events starting in the late 1970s. In the so-called "Betamax" case, members of the Motion Picture Association of America went to court to block sales of Sony's Betamax home videocassette recorders, on the grounds that they would facilitate content theft. Losing the case would not only have forced Sony to withdraw its Betamax recorders from the U.S. market--it would have crippled future product opportunities. Sony won the case in the U.S. Supreme Court in 1984, but the company realized that it had to become a major player in the media business in order to ensure a reliable supply of content for its devices.
To get into the movie business, Sony acquired Columbia Pictures from Coca-Cola in 1989. In addition, Sony had long been partners with CBS Records in the Japanese market, but in 1987, Sony acquired CBS' worldwide music business, and then Bertelsmann's music business in 2008. These moves made Sony one of the world's largest motion picture and music companies, as well as a major player in U.S. television syndication.
Howard Stringer joined Sony after a 30-year career at CBS (one of the four major commercial broadcast television networks in the U.S.), where he spent most of his time in the News division before running the entire company for seven years, followed by a two-year stretch at an ill-fated joint venture called Tele-TV. Stringer was appointed President of Sony Corporation of America in 1997, made a corporate board member a year later, and appointed Chairman and CEO of Sony Corporation in Japan in 2005. Four years later, he took over the President position as well. Sony's goal in bringing Stringer to its corporate offices was to strategically unite the company's electronics and media businesses.
Stringer came to Sony with two strikes against him: First, he was a gaikokujin (foreigner), which is a problem with any Japanese company, and second, he had no engineering background or experience, which was a huge problem at Sony, a very engineering-driven company. Sony's board was perhaps hoping that Stringer would be to the company what Carlos Ghosn was to Nissan Motors. Ghosn, the Brazilian-born engineer who was at the time in charge of engineering at Renault, was sent to run Nissan by Renault (Nissan's largest shareholder) to try to rescue the company. Not only did Ghosn turn Nissan around, he became a Japanese folk hero in the process.
Ghosn moved to Japan, learned to speak fluent Japanese, and communicated with the Nissan team in the universal language of engineering. He adapted to the organization and culture, and he earned the respect of both his team at Nissan and the country as a whole. Stringer, on the other hand, has no engineering background, and it's unclear whether he understands consumer electronics any better now than he did the day he joined Sony. He never learned Japanese, kept his primary office in New York City, and kept his family in London.
It may have been unrealistic for Sony to place such expectations on Stringer, but the company also should have recognized the problems far sooner than it did. Without a respected manager running the company, individual divisions within Sony went their own ways. Stringer took over the President position in 2009 to try to pull the company back together, both strategically and tactically. He has been largely successful at that, but he has been unable to give the company's engineers a long-term strategic blueprint. As a result, Sony is largely "rowing in circles" rather than pursuing a plan to regain market leadership.
Americans who know Howard Stringer speak very positively about him, and he's well-liked in the media industry. Stringer wasn't a bad manager, but he was the wrong person for the job. Had he understood Sony's culture or Japanese culture better, he might have demurred at taking the Chairman and CEO positions. The good news is that, while Sony may never recover leadership in all the markets that it once dominated, it still has a very strong engineering organization that's capable of developing great products. It just needs someone running the company who can tell it what those great products should be.
Sunday, November 21, 2010
Episode 4 of the Feldman File videoblog is live!
It's Sunday night, and that means that I've posted a new episode of the Feldman File videoblog on YouTube! Here's the rundown for this week's edition:
- Apple's less-than-earthshaking announcement about adding the Beatles' music catalog to iTunes
- Sony follows up on its Super 35mm camcorder, the PMW-F3, with yet another Super 35mm camcorder, the 35mm NXCAM
- U.S. cable operators lose 741,000 subscribers in Q3--are consumers really cutting the cord?
- The Obama Administration is looking for 500MHz of additional spectrum for wireless broadband, and the National Telecommunications and Information Administration found 2.2GHz of spectrum that could be made available within ten years. So, do we really have a spectrum shortage?
Labels:
35mm NXCAM,
Apple,
Beatles,
Broadband Internet access,
Cable,
Direct broadcast satellite,
IPTV,
iTunes,
Sony