Friday, December 30, 2005

Showdown at the IPTV Corral

According to this story, SBC (soon to be AT&T) has filed suit against the city of Walnut Creek, CA. To summarize, SBC is upgrading its telephone network in Walnut Creek to support broadband data and video. As part of the installation, SBC has to run fiber optic cables under a bikeway owned by the city, and requested a permit to close access to the bikeway while work was in progress. Walnut Creek’s city administration refused to issue the permit unless SBC agreed to get a local cable franchise. In its lawsuit, SBC claims that both federal and state laws say that SBC’s service will not be a cable service, and so shouldn’t be required to get a cable franchise.

For decades, the cable industry has been subject to local franchise laws. Cable service was effectively a monopoly because it was both uneconomic and infeasible for more than one cable company to serve an area of a city. Thus, municipalities retained the right to approve the cable company that wanted to build in the area. That approval, called a franchise, specifies both the fees that the cable operator has to pay to the city and the services that the company has to provide. These services vary from city to city, but typically include free service for municipal offices and public schools, community channels and studios where local programs can be produced, and at least one tier of low-cost analog service.

Thanks to new technology, the rules have changed. Telcos are now able to offer IPTV television services along with high-speed data and telephone service. Satellite providers aren’t subject to local franchise rules. Even power companies are experimenting with high-speed data services across their power lines that could be used for IPTV. Every one of these services threatens to take away subscribers from the local cable franchisee, and fewer customers means less revenue for both the cable operator and the city.

Cable operators argue that they’re at a cost disadvantage vs. telcos (thanks to those franchise fees), and therefore, telcos should either be subject to the same franchise rules, or cable operators should be relieved of their franchise obligations. In response to a recent Supreme Court ruling, the FCC recently issued a proposed rule finding that “wireline broadband Internet access service” offered by telcos is an information service and thus not subject to local regulation. Cable operators’ broadband Internet services are equally free of local regulation, but here’s the rub: Conventional analog and digital cable services are not information services and remain subject to local regulation, while telcos will deliver their video as IPTV over their broadband Internet services, which are not subject to local regulation.

This is one of those situations where both sides have valid points. The FCC proposes to create a level playing field with respect to broadband data services, but unless telco IPTV services are regulated in the same way as cable video services, the cable operators will be at a significant disadvantage. There are literally thousands of local franchises, and if the telcos have to negotiate with each one, rollout of widespread IPTV services could be stalled for years.

The issue will undoubtedly end up back at the FCC. No matter what the FCC rules, however, the losers are going to appeal in federal court, and the case will probably end up back at the Supreme Court. Congress could also jump into the fray with legislation that waives local franchising of cable operators, telcos, or both. The technology for telcos to offer IPTV video already works, but consumers may not get to use it for quite some time.

Wednesday, December 21, 2005

Podcast Perfecto, Part 2

In Part 1, I gave some tips on what to include in your podcast. In this part, I’ll give you some technical tips and tricks for creating and improving the sound of your podcasts. Again, in no particular order:

  • Consider buying a podcast package. A variety of online merchants sell packaged systems (usually with software included). You don’t need to spend a lot of money: Sweetwater has a complete podcasting package from M-Audio for $150. Musician’s Friend (owned by Guitar Center) has a variety of packages from $100 to $280. BSW’s podcast packages run from $169 to $1,699. Make sure that the software in the package is compatible with your PC; some packages are only available for Windows or Mac.

  • Get the right microphone. Buying a mic is very confusing, even for experienced users. There are many brands at prices from under $100 to over $5,000. You don’t have to spend a lot for a mic, but make sure that you get one that’s optimized for voice and narration. Mics have pickup patterns that determine the directions from which they pick up sound. Get a mic with a cardioid or hypercardioid pickup pattern: Cardioid means “heart-shaped,” and this kind of pattern rejects sound from the rear of the microphone. Hypercardioid mics are even more focused on the front and reject sound from the sides and rear.

    There are several technologies that microphones use to capture sound; I’ll discuss the two most common ones. Dynamic microphones don’t require their own power. They’re most commonly used for vocals and musical instruments. They’re rugged and can take a fair amount of abuse. Condenser microphones are more sensitive to quieter sounds than dynamic mics, and they tend to sound better (although some dynamic mics are equal or superior to condensers in the same price range). Condensers require a power supply, which on most models is an external +48 volt supply (called “phantom power”). Some condensers can be powered by a battery for those times when external power is not available. Condensers are much more fragile than dynamic mics.

    For podcasts, a good dynamic microphone is fine. Dynamic mics don’t pick up soft sounds as well as condensers, but that can be a benefit for podcasts, where a condenser might pick up noise from your computer more easily than a dynamic mic. If you get a condenser, get a large diaphragm model; they’re more sensitive and tend to reproduce sounds more accurately than small diaphragm mics.
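To make the cardioid/hypercardioid distinction concrete, here’s a small sketch using the standard textbook formulas for first-order microphone polar patterns (gain = a + (1 − a)·cos θ); the formulas and the `gain_db` helper are illustrative additions, not anything from a mic spec sheet:

```python
import math

# Textbook first-order polar patterns: gain = a + (1 - a) * cos(angle).
# a = 0.5 gives a cardioid, a = 0.25 a hypercardioid.
PATTERNS = {"cardioid": 0.5, "hypercardioid": 0.25}

def gain_db(pattern, degrees):
    """Relative pickup level at a given angle off the front of the mic."""
    a = PATTERNS[pattern]
    g = abs(a + (1 - a) * math.cos(math.radians(degrees)))
    return float("-inf") if g == 0 else 20 * math.log10(g)

for name in PATTERNS:
    for angle in (0, 90, 180):
        print(f"{name:14s} {angle:3d} deg: {gain_db(name, angle):6.1f} dB")
```

Running this shows the trade-off described above: the cardioid nulls sound directly behind it but is only 6 dB down at the sides, while the hypercardioid is 12 dB down at the sides at the cost of a small rear lobe.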

  • Buy a pop filter for your microphone. A pop can occur when you say a word like “pizza” or “pack.”  The “p” in these words is called a plosive, and it can momentarily overload a microphone. A pop filter is a small nylon screen that’s placed in front of the microphone to dissipate the energy of plosives. The result is that the microphone hears a “p” but doesn’t overload.

  • Get a good computer audio interface. There are lots of these, and they can be very inexpensive. Some install inside your computer (from E-Mu and other companies), some are dedicated external interfaces (from Digidesign, Focusrite, Lexicon, Mackie, M-Audio, etc.), and some are audio mixers that include a computer interface (Alesis, Digidesign, Mackie, M-Audio, etc.). PC audio systems built into the motherboard tend to pick up electrical noise from other components, so you’re better off using a separate external interface. Some of the less expensive interfaces don’t have phantom power for condenser microphones, so be sure to check before you buy.

  • Consider getting a headset. Headsets combine headphones with a microphone. Broadcast-grade headsets have high-quality microphones, and you can move around without going out of range of your mic. They’re also less sensitive to noise. Most broadcast-quality headsets are shipped without connectors, so unless you’re good with a soldering iron, buy a headset that already comes with connectors for the microphone and headphones. Whatever you do, don’t use a conventional computer headset. The microphone quality is generally poor.

  • Good audio editing software is essential. There are lots of choices, from Audacity, a free editor that runs on Mac, Windows and Linux, to for-fee software such as Adobe Audition, Digidesign Pro Tools, Mackie Tracktion, Steinberg Cubase and Sony Sound Forge. They all work well and, if you know what you’re doing, can be used for creating superb podcasts. All of these products come with filters and plug-ins for adjusting the sound of your recording, reducing noise, etc. Take the time to play with whatever package you select before you record your first podcast. A little training time at the start will pay big dividends later on.

  • Save your podcast as an MP3. All podcasts have to be sent to listeners in a compressed audio format in order to shorten download time and save hard disk space. Every podcast player that I know of supports MP3 files, and every audio editing package can easily create them. AAC, the format Apple uses in iTunes, is also often used, but not all players can handle it, so unless you specifically want to tailor your podcast to the Apple audience, use MP3. The compression bitrate that you use is critical in determining the sound quality of your podcast. A podcast that’s been compressed too much can sound like a bad telephone call. The good news is that voice can be compressed much more than music. Typical bitrates run from 48 kbps to 128 kbps. The lower the bitrate, the smaller the file. My suggestion is to experiment until you find the lowest bitrate that doesn’t sacrifice your audio quality. If you’re not sure, err on the side of using a higher bitrate.
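The bitrate-to-file-size relationship is simple arithmetic, and it’s worth seeing the numbers. This back-of-the-envelope sketch (the `mp3_size_mb` helper is hypothetical, not from any podcasting tool) ignores the small overhead of ID3 tags and frame headers:

```python
# File size = bitrate (kilobits/second) x duration, converted to megabytes.
def mp3_size_mb(bitrate_kbps, minutes):
    bits = bitrate_kbps * 1000 * minutes * 60
    return bits / 8 / 1_000_000

for rate in (48, 64, 96, 128):
    print(f"{rate:3d} kbps, 30-minute show: {mp3_size_mb(rate, 30):5.1f} MB")
# A 30-minute show is roughly 10.8 MB at 48 kbps vs. 28.8 MB at 128 kbps.
```

In other words, dropping from 128 kbps to 48 kbps shrinks the download to well under half the size, which mattered a great deal to dial-up listeners.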

  • Learn microphone technique. Using a microphone is a skill that’s easy to learn and is essential for a good-sounding podcast. Use your normal speaking voice, as if you’re having a conversation. Stay close to the mic (two to four inches is usually best), and speak directly into it, not from the top or sides. If you’re recording directly into an audio editing program, watch the sound meters and keep the volume of your voice below 0 dB (the red zone on the meters) and well above the bottom of the meter. If you’re too loud, you’ll cause distortion, or clipping. If you speak too softly, you’ll have to amplify your voice in the audio editor, and any noise in the original recording will be amplified along with your voice.
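Here’s what those meter readings mean in code: a minimal sketch, assuming samples are normalized floats between −1.0 and +1.0 (the `peak_dbfs` helper is illustrative, not part of any editor):

```python
import math

def peak_dbfs(samples):
    """Peak of normalized samples (-1.0..1.0) in dB relative to full scale.
    0 dBFS is the loudest level digital audio can store; past it, you clip."""
    peak = max(abs(s) for s in samples)
    return float("-inf") if peak == 0 else 20 * math.log10(peak)

# A half-scale tone peaks about 6 dB below the red zone.
tone = [0.5 * math.sin(2 * math.pi * t / 100) for t in range(100)]
print(f"peak: {peak_dbfs(tone):.1f} dBFS")

# Boosting it 2.5x pushes samples past +/-1.0 -- that's clipping.
hot = [2.5 * s for s in tone]
print("clipping!" if max(abs(s) for s in hot) > 1.0 else "ok")
```

The asymmetry in the advice above follows from this: a too-quiet take can be amplified later (along with its noise), but samples pushed past full scale are flattened and that distortion can’t be undone.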
Those tips should get you started. Good luck!

Podcast Perfecto, Part 1

I’m planning to launch a podcast for the Feldman File early in 2006. In the process of getting ready, I’m doing all the “due diligence” research needed to make a podcast work, and I’d like to share some of what I’ve found with you. Part 1 of this article will deal with content, while part 2 will cover technical tips.

In no particular order:

  • Pick a subject for your podcast, and stay on it. Just as in radio, where every station has a format (news, talk, oldies, progressive rock, etc.) that stays the same from day to day, you should establish a subject for your podcast that listeners can count on week after week. If you do a podcast on photography, don’t talk about politics (unless it relates directly to photography). Wandering off topic will confuse and frustrate your listeners.

  • Bring in more than one speaker. I recently listened to a podcast by an industry analyst, and all he did was read an article that he wrote (including the headings). There’s nothing as boring as listening to a single person drone on about a topic. Again, consider radio: Howard Stern has a studio full of on-air staff and guests, and Rush Limbaugh takes phone calls. Bring in a co-host, have on-air guests, do interviews or take phone calls.

  • Entertain your audience: Make it fun for people to listen to your podcast. Even if you’re covering a very dry topic, find a way to lighten the mood. Have a sense of humor.

  • Inject some controversy. I resisted this concept for a long time, but controversy is entertaining. People like to listen to arguments. You can also take the Howard Stern approach of being outrageous. Don’t artificially create controversy; choose some topics where there’s a genuine difference of opinion.

  • Keep it short. Make the length of the show fit the material; don’t pad it out to 30 or 60 minutes. With podcasts, there’s no “right” length.

  • Include music and/or sound effects. It’s essential to draw listeners in from the very start of your podcast, and opening music is a great way to do it. You can have music in other places in your podcast (to separate stories or topics, for example). However, be very careful about the music that you use. Don’t use music from a commercially-recorded CD, unless you have performance rights to the song. You can acquire royalty-free music from many suppliers, or you can purchase software such as SmartSound’s Sonicfire Pro to quickly compose your own without requiring any musical knowledge. If you have any question about whether or not you can use a musical selection in your podcast, consult an attorney.

  • Find advertisers that fit your subject. When I produced a comedy talk show several years ago, one of the “promotional consideration” sponsors was a vodka. It didn’t fit the show. The broader your subject, the greater the variety of advertisers you can choose from. For example, a sports show could have beer and car ads, but a show on knitting should have advertisers that make or sell products related to knitting.
That’s it for content. Stay tuned for Part 2, technical tips and tricks for improving your podcasts.

Tuesday, December 20, 2005

A Return to “Walled Gardens”?

This week's Business Week draws attention to an issue that's been hot in the Internet community for the last month. Currently, companies like eBay, Google and Yahoo pay a lot of money to telecom companies for local loop connections to their networks, and from there to the Internet backbone. However, some of the major telephone and cable companies (led by AT&T's Edward Whitacre) want the content and service providers to pay for access to their residential and business Internet subscribers. Specifically, this is what’s being considered:

  • Using Quality of Service (QoS) to give paying content providers more bandwidth and faster response times than providers that don’t pay, and blocking some websites and services altogether.

  • Limiting or barring competitive VoIP vendors from accessing their networks.

The result could be a return to the “walled garden” model pursued by AOL, CompuServe, Delphi, etc. in the years before the Internet. SBC, for example, could give priority to its partner Yahoo, and make competitive services like Google and MSN much slower and harder to reach. It could also decide to block all usage of file-sharing services because they use too much bandwidth.

None of this is new, at least to corporations that have limited employees’ access to certain websites and services for years. Companies regularly block access to porn and shopping sites, and they forbid the use of file-sharing software and services such as Skype. However, these controls are only imposed within these companies’ firewalls. Now, the major telecom companies want the ability to impose these same kinds of limitations on their subscribers.

The risks of this scenario might explain some moves by Google over the last year. At the beginning of 2005, rumors surfaced about a plan by Google to purchase large amounts of “dark fiber”. There was some speculation that Google was planning to become a “super-ISP”, but that talk quickly died down as the press moved on to other stories. Then, in late summer, Google pursued providing citywide WiFi service to San Francisco and Mountain View, CA, where Google is headquartered. The San Francisco franchise is still being debated, but Google won the contract in Mountain View.

Now, why would Google need nationwide dark fiber? It makes sense for the company to run its own private network in order to interconnect its server sites. But why does the company want to get into WiFi? I believe that Google’s dark fiber and WiFi initiatives are closely linked. In the event that the major telecom companies try to force Google to pay for access to their networks, Google can “light up” its own competitive ISP service in major metropolitan areas. After all, there’s no reason why Google can’t overbuild wireless data services in the same areas as the major telecom companies. With WiMAX service and mesh networking, Google could cover an entire city quickly, and at a comparatively low cost.

Of course, Google could use QoS to give priority to its own content and services, but there’s no reason why other content providers couldn’t launch their own ISP services. It’s even reasonable to assume that two or three major content providers could get together and finance a competitive ISP service, guaranteeing all of them equal access to subscribers.

The fight for open access to the Internet is just beginning. Stay tuned—you may see a lot more competitors soon.

Thursday, December 15, 2005

Telephone in the 21st Century

There’s an interesting article in this week’s Television Week magazine about SureWest, a local telephone company based in Roseville, CA, just outside Sacramento. The company is offering its subscribers IPTV with 260 channels, and is about to launch 17 HD channels. SureWest has 85,000 telephone subscribers, of which 20% (approximately 17,000) are taking the video service. SureWest is offering a package of telephone service, DSL and video for $110/month, about 15% less than AT&T/SBC and Comcast.

What’s interesting about SureWest is not only that it’s offering IPTV video in the first place, but that it’s got 17,000 subscribers. AT&T has only a few dozen users of its U-Verse service in a test market, and Verizon has a few thousand users of its FIOS service right now, although it predicts that it will have 400,000 subscribers by the end of 2006. (Verizon is also far ahead of AT&T in development and deployment.)

SureWest’s service is highly competitive with that of Comcast, although Comcast boasts that it has 2,200 hours of video-on-demand (VOD) content available, whereas SureWest has only 400 hours. However, the size of SureWest’s library is only a temporary disadvantage as more content providers offer VOD programming. SureWest also beats DirecTV and Echostar; those companies offer less expensive video packages, but their services are inherently one-way, so they can’t offer much in the way of data services or real-time VOD.

SureWest’s engineering approach is dramatically different than that of the traditional cable operators, which typically purchase both set-top boxes and headend (the cable operator’s central distribution site) equipment from the same company. If they’ve got a Motorola headend, they use Motorola STBs, and the same goes for Scientific-Atlanta. By comparison, SureWest has gone for a “best of breed” strategy, where it’s installed what it thinks are the best products in each category. For example, SureWest uses IPTV STBs from a UK-based company called Amino, which I’d never heard of before I read the TV Week article. Amino has a range of STBs with a variety of features and price points, as well as a PVR, all of which are about the size of a paperback book. SureWest also uses equipment from BigBand Networks, Irdeto, Kasenna, Minerva and Cisco in its network and headend. It’s not locked into any single vendor or technical approach, because all of its vendors’ products conform to industry standards.

AT&T and Verizon have to eventually support millions of subscribers, so they’re going to get most of their technology from a single supplier. Installing and supporting a nationwide network of “best of breed” equipment would be a nightmare for them. However, the best-of-breed approach enables smaller telcos like SureWest, as well as cable overbuilders like Grande and RCN, to add services and respond to market demand faster than the behemoths.

In short, if you want to see the future of video distribution today, take a look at SureWest. It’s already played host to over 100 companies that want to learn about deploying IPTV. I suspect that quite a few companies will be going there for an education.

Alliance hopes to simplify HDTV networks

Yesterday, the High-Definition Audio-Video Network Alliance (that name just rolls off the tongue, doesn't it?), also known as HANA, announced its formation and its plans to connect TVs, DVRs, set-top boxes, media centers, etc. via IEEE 1394, otherwise known as Firewire. Members of the group include JVC, Mitsubishi, Samsung, Sun Microsystems, Charter Communications and NBC Universal. (Notice that Sony, Panasonic, Philips, Toshiba and Apple aren't members.)

The establishment of yet another industry consortium isn't news--there seems to be at least one or two announced every week. What's interesting about this consortium is that Samsung appears to be leading it. The president of HANA is Heemin Kwon, who's the executive VP and general manager of Samsung's Digital Solution Center. To the best of my knowledge, this is the first time that Samsung has been the lead partner in an international standards consortium. The leadership role usually falls to a company such as Sony, Panasonic or Toshiba, but Samsung is now clearly in position to be taken just as seriously as the "incumbent" industry leaders.

It'll be very interesting to see if Samsung can convince any of the big Japanese "alliance builders" to join HANA, and what compromises (including sharing of power) they'll demand in order to gain their cooperation.


Wednesday, December 14, 2005

January Action

There are lots of new product announcements expected next month. Here’s a quick look:

  • Intel has confirmed that it will release its Yonah dual-core processor for notebooks in early January. Yonah will combine significantly better performance than today’s Pentium M processor with lower power consumption. The plan is for manufacturers to begin shipping Yonah-based products at the same time that Intel makes its announcement. You can expect to see both Yonah-based notebooks and small form-factor desktop PCs.

  • While not confirming the date, Intel will announce the first products based on its VIIV home entertainment platform in January. The date bandied about is just before both the Consumer Electronics Show and Macworld. There should be several VIIV-based home media center prototypes and early production models at CES.

  • As usual, Apple remains mum on what (if anything) the company plans to introduce at Macworld, but Intel’s timetable lends credence to rumors of the announcement of new iBook models. Yonah will enable Apple to make faster iBooks that use less power. However, Apple can’t risk making the Powerbooks obsolete by selling iBooks with higher performance, so we’re likely to either see iBooks with slower processors or Powerbooks announced at the same time with higher-speed Yonah chips.

  • The VIIV and Yonah launches also make Apple’s introduction of a Mac Mini-based media center more plausible. However, even if this kind of product is announced, Apple won’t necessarily use the VIIV platform. Intel has been very circumspect as to whether or not Apple has committed to adopt VIIV, and Intel claims that VIIV will only support Microsoft’s Windows Media Center software.

None of this includes the dozens, if not hundreds, of new consumer electronics products and services that will be shown for the first time at CES. Personally, I expect to see a bunch of new satellite radio receivers, as that business really begins to take off. (Sirius showed a mobile video service at last year’s CES, and XM came back shortly afterward and said “We can do it too,” so I expect to see more developed versions of video over satellite radio at CES.)

By the end of January, we may have a very different playing field and set of competitors than we have today. It may be cold, but it’s going to be a very interesting month.

Sunday, December 11, 2005

The Role of Marketing

I’ve spent most of my career in technology marketing: Product management, product marketing and market communications. I’ve launched dozens of products, run departments and worked with engineering teams over the last 25 years. And yet, I’ve always felt that I had a “Kick Me” sign on my back, because in high tech, marketers are second-class citizens.

Years ago, when I worked at HP, I joined the company as a product manager. I spent a year in marketing before I got so frustrated about my lack of influence that I transferred into the lab. As an engineer, I had far more say over how products turned out than when I was in marketing. This pattern has been repeated again and again.

I was prompted to write this entry by the cover article in Business Week a few weeks ago, one of the many articles that have been published lately about Google. Toward the end of the article, the authors discuss the “pecking order” at the company: Engineers come first, followed by product management (which is closely aligned with engineering at Google), while sales and marketing are at the bottom.

I don’t mean to pick on Google, because this pattern is repeated at most high tech companies. Don’t believe me? Try this simple test: Go to the websites of several high tech companies. Click on “Company” or “About Us,” and then select “Management.” Look at the management roster, and see who’s at the bottom of the list. Most times, it’ll be marketing and sales.

High tech companies are famous for confusing sales and marketing. Sales generates revenues for companies, while marketing supports sales, figures out what products are needed for future growth and profitability, and does whatever it can to support customers so that they buy again and again. By its nature, sales has a short-term focus; the emphasis in sales organizations is to “make the numbers” for the current month, quarter or year. Marketing has to have a longer-term focus; it must figure out what products and services are needed in the next months and years, and how to communicate the benefits of those products and services to the salesforce, customers, investors and the press.

One of the most common situations in high tech is the company that has no “second act.” This is a company that launched a successful product or service, but couldn’t follow up with an equally successful second product. Silicon Valley is littered with the corpses of companies that fell into this trap. It’s the job of marketing to figure out what that second act should be, by talking with current and potential customers, talking with engineering for their concepts and considerations, and analyzing the competition. That doesn’t mean that engineering shouldn’t be developing new products and technology independent from marketing, because that’s where most of the true discontinuous innovations come from. However, engineering, marketing, sales and finance have to tightly coordinate their actions, or you run the risk of “getting on your horse and riding off in all directions.”

I would love to see high tech companies take the discipline of marketing as seriously as they take engineering, but I’m not likely to see it in my lifetime. However, there is one branch of high tech that does take marketing more seriously (because it has to), and that’s consumer products and services. Engineering-driven products that are difficult to use or don’t fill the needs or wants of a reasonably large consumer audience will die a horrible death.

A good example is Sony’s past digital audio players. For a number of years, Sony went its own way, using its proprietary ATRAC format to encode and protect songs, and a program called SonicStage to manage the PC-to-player connection. Even though virtually every portable player could handle MP3s, Sony required that MP3s be converted to ATRAC by SonicStage before they could be copied to its players. Users of other formats with DRM (such as Windows Media and AAC with Apple’s FairPlay) were completely out of luck.

The result was that sales of Sony’s players were minuscule, while Apple’s grew until the company reached its current point of market domination with iPods and iTunes. It’s only recently that Sony finally acquiesced to the market and began supporting other formats. If Sony had bothered to listen to consumers when it was planning its original players instead of insisting on ATRAC because it was a “better” format, it might dominate the market now and Apple would be an also-ran.

It’s essential that companies know who their customers are, and talk to them. It doesn’t matter whether a company sells to businesses or consumers—the basic principles are the same. Products and services have to meet customer needs in order to sell. If it’s a truly innovative product, customers may not know that they need it, but it’s marketing’s job to identify the needs and wants that the product fulfills and communicate them to the salesforce and customers.

Marketing, especially for companies that want to survive past their first product, is as essential as engineering. I firmly believe that the survival rate of new ventures will dramatically increase as more companies take marketing seriously. Will that ever happen? We’ll see.

Animation: It’s the Script That Counts

There’s an interesting article in this month’s Fast Company (subscription required) that discusses how small companies are getting into the computer animation business, using off-the-shelf software and workstations to create films at a fraction of the cost of those from Pixar and Dreamworks Animation. However, hardly mentioned, if at all, is the importance of scripts in the success of animated films.

Over the past few years, along with the mega-hits The Incredibles and Shrek II, we also got films such as Valiant and Robots, both of which disappeared from theaters within a few weeks of their releases. All four movies were well-animated; in fact, Robots came from the same studio that produced the hit Ice Age. However, almost every review of the films came down to the scripts: Brad Bird’s script for The Incredibles was nominated for Best Original Screenplay at last year’s Oscars. Valiant and Robots, on the other hand, were panned by critics for their scripts, even though Robots was praised for its animation.

Perhaps a better example is The Polar Express. The characters had almost creepy features, but the story was strong enough to make the film one of the biggest hits of last Christmas.

The impact of scripts doesn’t just apply to fully-animated movies. Star Wars Episodes I and II, Hulk and Van Helsing were all criticized, not for the quality of their animation but for their scripts. While the Star Wars movies still did very well at the box office, their performance (especially Episode II) was below industry expectations. Hulk and Van Helsing both performed much worse than their pre-release estimates. (In fact, many audience members were text messaging their friends while they were watching Hulk, telling them not to bother seeing the movie.)

My point is that while the animation is very important, the story and script are far more important. By focusing on the technology rather than the writing, it’s easy to draw the wrong conclusions. Yes, animation technology has gotten much cheaper, and excellent animators can be found worldwide, but they’re only as good as the scripts they’re given.

Friday, December 09, 2005

Where’s Google Going?

It seems that Google has been on the cover of just about every business magazine in the last two months. With a share price over $400, a lot of people are interested in the company. However, Google’s strategic direction is about as clear as mud. The company says that its goal is to index and make available all the knowledge in the world, which is laudable but has already gotten them into lots of trouble with book publishers, who really don’t want their “knowledge” to be given away.

Google now has eighteen different search-related services plus Blogger, Google Analytics, Picasa, Google Talk, Google Earth, Gmail, Google Desktop and Google Toolbar. Google Labs is testing twelve more services. The company also sells a search appliance for large organizations. And let’s not forget Google Adwords, the financial engine that drives the company. (Google is even experimenting with selling print advertising space to its Adwords advertisers.)

Most of Google’s services fit well with its stated goal, but services like Blogger, Google Analytics, Google Talk and even Gmail are a stretch. Further, it appears that search and Blogger generate the vast majority of Google’s revenue. The other services and software are either being given away or (in the case of the search appliance and the pro version of Google Earth) contribute a minuscule share of revenues.

The situation at Google reminds me of Netscape in many ways (although Netscape had a far more coherent product strategy). Netscape had a full line of servers and a suite of development tools along with its ubiquitous browser, but the vast majority of the company’s revenues and profits came from sales of the browser to commercial customers. That’s why Microsoft gave away Internet Explorer and rushed to bring it to parity with Netscape Navigator. Microsoft saw that Netscape was totally dependent on its browser for revenues; cut off those revenues, and Netscape would be crippled, which is exactly what happened.

Similarly, Google is almost totally dependent on Adwords for its revenues and profits. Microsoft is rumored to be negotiating with AOL to create a joint keyword advertising network to compete with Adwords. Would Microsoft offer its affiliates a much better percentage of advertising revenues than Google? More than likely; in fact, Microsoft might even give its affiliates 100% of its advertising revenues from their sites. Would that cut off Google’s oxygen supply? Not necessarily, but it would make Google much more dependent on advertising revenues from its own site.

Google has to start building its non-advertising revenue base, and the sooner the better. It needs to look at more effective ways to generate revenue from its services and software, and it also needs to put more emphasis on getting its products right than on constantly pushing new products into the market.  It’s easy to drive adoption when you give everything away for free. It’s much more difficult to create services and software that customers will pay for. Google’s challenge is to raise its game to the next level.

HD is in the Eye of the Beholder

Scientific-Atlanta recently released the results of a telephone survey that found that 49% of owners of HD-capable televisions or monitors aren’t using them to watch HD programming because they don’t have a cable or satellite set-top box, CableCARD or antenna capable of receiving HD. (A study by Forrester Research confirms this number; in fact, they say that by the end of the year, less than 44% of HD-capable sets will be used to watch HD programming.) What’s even more interesting is that 41% of owners of HD-capable sets think that they’re already watching HD when they’re not.

There are undoubtedly a variety of reasons why these viewers think that they’re watching HD when they’re not. Some thought that they could watch HD without any additional equipment, while others assumed that they were automatically watching HD when they saw the “Broadcast in HDTV” bugs at the beginning of some TV shows and sporting events. My personal belief is that many of these viewers have never actually seen an HD show in HD, so they have no standard of comparison. It may also indicate that some of them simply can’t tell the difference between HD and conventional programming.

In 2004, I attended a DVD industry conference where Blu-Ray and HD DVD advocates pitched their formats to senior industry execs. (The heads of all the major studios’ home video divisions were in attendance.) The Blu-Ray team showed a scene from Lawrence of Arabia in which one side of the screen was in HD, and the other was in standard definition. The difference was so subtle as to be almost invisible. (And I’m hardly an HD newbie; I saw my first HD pictures in Japan in 1989.) The same thing happened at the Consumer Electronics Show last year; I saw Blu-Ray movies that were virtually indistinguishable from the DVD versions.

To give the Blu-Ray forces the benefit of the doubt, the most recent demo was almost a year ago, so improvements have probably been made by both the Blu-Ray and HD DVD teams. However, if a significant percentage of the audience either can’t see or doesn’t know the difference between HD and SD, are they going to bother to buy an HD blue-laser player? DVDs look a lot better than conventional SD broadcasts, so the difference between DVD and blue-laser discs is likely to be much more subtle than the difference between HD and SD broadcasts. Even in an in-store comparison (which is where most consumers will see HD discs), will there be enough of a difference to get them to trade up?

I haven’t seen any results of independent side-by-side consumer comparisons between blue-laser discs and DVDs, but my suspicion is that they’re not going to be very favorable to the HD disc crowd. Could this turn into a repeat of the DVD Audio vs. Super Audio CD (SACD) battle, where consumers didn’t buy either one? Stay tuned.

Wednesday, December 07, 2005

The DRM Pandemic

Unless you’ve spent the last few weeks housesitting Saddam Hussein’s “spider hole,” you’ve heard about Sony’s software that silently installed itself on listeners’ PCs in an attempt to keep its CDs from being ripped. Instead, the listeners got ripped, thanks to a just plain ugly job of programming that opened a massive security hole in their computers.

I bring up this issue because last night, I tried to play some music stored on my PC that’s protected with Microsoft’s Windows Media Digital Rights Management (DRM) system. I purchased the songs months ago from Napster’s (legal) music store. I’ve got one copy on my PC and one on my music player, so I’m not even close to the limit on the number of copies of each song that I’m allowed to make by the DRM.

When I tried to play one of the Napster-sourced songs on my PC, instead of music I got a dialog box that said “License Acquisition Project page to upgrade to Premium.” After being unable to translate that message with my Captain Midnight Decoder Ring, I thought that the problem might be with Napster, so I launched it only to be told that I had to upgrade to a new version. After downloading and installing the upgrade, I tried to play a couple of the songs through Napster, only to learn that I no longer had a valid license for the music, Napster no longer had the right to sell the song, or both.

This is the second time in the last two months that I’ve had a similar problem. The first time, I got an error message from Microsoft’s Windows Media Player that said that my licenses were corrupt and had to be replaced with a backup (which of course I didn’t have). Napster walked me through getting around that problem, and I’m now waiting for them to reply to my customer service email and help me fix the latest problem.

I’m a fairly sophisticated user, yet I’ve lost control of content that I purchased months ago and I need help from tech support. If this is a problem for me, it’s going to be at least an order of magnitude bigger problem for most consumers. The only reason why the DRM piñata hasn’t spewed over the entire media and consumer electronics landscape is that there still aren’t a lot of actual DRM consumer users…but that’s about to change.

Today’s DVD players use a security system called CSS (Content Scrambling System), which encrypts the content on manufactured DVDs. The problem is that once an enterprising programmer named Jon Lech Johansen figured out how to derive the encryption keys for each movie, anyone with a personal computer and DVD drive could remove the encryption and make their own perfect copies. Because CSS is hard-wired into all DVD players, and because every new disc has to play on the entire installed base of hundreds of millions of DVD players, the movie studios are stuck.

Enter Blu-Ray. This almost-here, super-duper high definition replacement for DVDs uses three security systems: Advanced Access Content System (AACS), the next-generation version of CSS; BD+, which enables content providers to update the DRM on already-installed Blu-Ray players; and ROM-Mark, which is a defense against bootlegging. AACS allows content providers to limit access to their content through an Internet connection to the viewer’s player. For example, a vendor could specify that a disc can only be watched for a maximum of 72 hours following the first time that it’s played. BD+ enables content providers to completely disable a Blu-Ray player’s ability to play discs in the event that the current DRM system is compromised; consumers would be required to install a DRM upgrade before they could play any more discs. Sounds great! What could go wrong?
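To make the 72-hour example concrete, here’s a minimal sketch of how a player might enforce a time-limited playback window. This is purely illustrative: the names and structure are my own inventions, not the actual AACS license format.

```python
from datetime import datetime, timedelta

# Hypothetical illustration of a time-limited playback window like the
# 72-hour example above. Invented names; not the real AACS format.
PLAYBACK_WINDOW_HOURS = 72

def playback_allowed(first_played, now):
    """Allow playback if the disc has never been played, or if we're
    still inside the window that opened at first playback."""
    if first_played is None:
        return True  # first play starts the clock
    return now - first_played <= timedelta(hours=PLAYBACK_WINDOW_HOURS)
```

Note that even this toy version depends on the player knowing the correct current time and remembering the first playback, which is exactly where the real-world failure modes discussed below creep in.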

Well, let’s see…you get a disc from Netflix in the mail, and when you put it into the player, the disc tries to “call home” over the Internet. What if your Internet connection is down or the content provider’s server has crapped out—will the disc play or not? What if a BD+ “upgrade” that’s hidden on the Blu-Ray disc you just bought manages to trash your player’s ability to play anything? How will that get fixed? Or what if someone who’s less than scrupulous manages to distribute a BD+ upgrade that will only allow certain movies to play?

According to the Digital Entertainment Group (DEG), more than 147 million DVD players and more than 5 billion DVDs have been sold to customers in the U.S. since 1997. A hot DVD can sell several million copies in its first week. If something goes wrong with the AACS or BD+ code for an equivalent Blu-Ray disc, the carnage will be massive. If the disc won’t play, are video retailers and rentailers going to be willing (or able) to support customers? If the players themselves are damaged, who’s responsible for fixing the problem—the content provider or the player manufacturer?

DRM, by its very nature, is an accident waiting to happen. DRM systems create a game of cat & mouse between DRM developers and hackers: A new DRM system is released, hackers figure out how to break it, the DRM vendor works around the hack, the hackers work around the fix, and so on. Sooner or later, something is going to go wrong with either a hack or a fix, and when it does, lots of people are going to hear about it. One guy who got ripped off by an Internet camera store posted the story on digg and turned the name PriceRitePhoto into a synonym for sleaze. Imagine what will happen when a few million people find out they can’t play that shiny new Blu-Ray disc they just bought.

Content providers, consumer electronics companies and computer companies have to start looking beyond DRM to come up with better market-based ways to limit piracy. The current pattern is simply to escalate: if version 1 of a DRM system doesn’t stop piracy, release a more powerful (and more onerous) version 2, and repeat indefinitely. However, there’s always a point past which it doesn’t make sense to protect content. We could have bank vault doors on our homes and apartments to make them burglar-proof, but they’d be massive overkill. Instead, we live with wood or metal doors with cylinder locks and deadbolts, even though we know that they won’t stop a determined burglar.

There is no practical DRM system that will stop a determined pirate. Sooner or later, someone will find a way around the DRM. Rather than engineering DRM systems for “worst-case scenarios,” we need legal, pricing, packaging and distribution strategies, along with content protection, that discourage pirating and encourage purchase of legitimate copies. The sooner we focus on the carrot rather than the stick, the better.

Tuesday, December 06, 2005

The Annual Apple Rumor Mill

The rumor mill is running full blast with speculation about what (if anything) Apple will announce next month at Macworld. There are three core questions being asked:

  1. Will Apple announce any Intel-based products?

  2. If so, what will they be?

  3. Will Apple announce any further PowerPC-based hardware?

Here’s my uninformed speculation to go along with everyone else’s:

  1. Apple will announce Intel-based products at Macworld. With Intel apparently making better progress than expected in getting new processor designs out, Apple’s in an excellent position to announce Intel-based products in January, for release in the subsequent 60 to 90 days.

  2. I really have no idea what they’ll be, except that I don’t expect to see a G5-replacement tower-style computer at the show. Apple’s biggest win would be to take advantage of the low-power Pentium M/Core processors for its notebooks now. I’d also like to see the rumored Intel Mac Mini home media center running Front Row 2.0, but that’s primarily wishful thinking on my part.

  3. No. Once Apple introduces its first Intel-based systems, no further PowerPC-based products will be released. From what I’ve seen publicly from Apple, the company is pulling out all the stops to get its third-party developers to port their software over to Intel ASAP. Apple has to be concerned about being Osborned (killing sales of your existing products by preannouncing future products). Once it starts shipping Intel-based products, existing PowerPC-based products will appear obsolete. Apple controls the timing for the start of the transition to Intel, but once it starts, the company will have to move everything over quickly. I wouldn’t be surprised to see the first Intel-based products at Macworld in January 2006 and the entire computer line transitioned over to Intel by Macworld in January 2007.

Saturday, December 03, 2005

Simpler Set-Tops, Later Date?

Last week, the National Cable & Telecommunications Association (NCTA), the United States’ official cable television trade organization, submitted a report to the FCC concerning conditional access (the ability to give or deny cable users access to various services and programming tiers). Today, cable operators purchase set-top boxes that are compatible with the conditional access system that they’ve implemented (usually from Motorola, Scientific-Atlanta or NDS). Boxes built for one system won’t work on another.

To enable consumers to have a choice of suppliers for set-top boxes, the FCC ruled that cable operators had to separate their conditional access systems from the set-top boxes themselves. The result was CableCARD, a device that provides conditional access for specially-equipped set-top boxes, television sets, DVRs and other products. The current version of CableCARD (1.0) is one-way only, so it doesn’t support video-on-demand, pay-per-view, interactive program guide or DVR functionality. CableCARD 2.0, which will be two-way and will support all of the above features, is due some time in 2006.

Last week’s report from the NCTA discusses another way to enable conditional access, called the Downloadable Conditional Access System, or DCAS. With DCAS, when a compatible device (set-top box, receiver, DVR, etc.) is attached to a cable system, it downloads the necessary software to support that cable operator’s conditional access system. No CableCARD or other hardware is needed.
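In rough terms, the DCAS idea can be sketched like this: a newly attached device identifies the cable system it’s plugged into, downloads that operator’s conditional-access client in software, and installs it—no card required. The sketch below is mine, with invented names; the NCTA report doesn’t specify the protocol at this level of detail.

```python
# Hypothetical sketch of the DCAS flow described above. All operator
# and CA-system names are invented for illustration.
CA_CLIENTS = {
    "operator-a": "motorola-mediacipher",
    "operator-b": "sa-powerkey",
}

class Device:
    """A DCAS-capable set-top box, receiver or DVR."""

    def __init__(self):
        self.ca_client = None  # no conditional-access software yet

    def attach(self, headend_id):
        # 1. Identify the cable system we're connected to.
        # 2. Download the matching conditional-access software.
        # 3. Install it; the device can now descramble services.
        client = CA_CLIENTS.get(headend_id)
        if client is None:
            raise ValueError("unknown headend; cannot provision device")
        self.ca_client = client
        return self.ca_client
```

The point of the sketch is the economics, not the code: because step 2 is a software download rather than a piece of leased hardware, the same retail device works on any participating cable system.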

DCAS is the access control system that pretty much all parties have wanted from the beginning: It doesn’t require cable operators to either toss out their existing set-top boxes or lease CableCARDs to their subscribers, it fully supports two-way operation, and consumers should be able to add compatible equipment to their cable connections without requiring the cable company to send out an installer.

According to the NCTA’s report, DCAS should be deployed nationally by July 2008 (a date that I think is wildly optimistic). The cable industry is currently operating under FCC rules that require it to soon replace all existing integrated set-top boxes with boxes that use CableCARDs for conditional access. With its report, the cable industry is essentially saying that CableCARD will be obsolete by July 1, 2008. The obvious argument that follows is that it makes no sense for cable operators to replace all the existing set-top boxes with models that use CableCARD, only to replace them again two years later.

The not-so-subtle intent behind the NCTA’s report is to get the FCC to back off on its demand that cable operators adopt CableCARD for their own set-top boxes. For consumers, the big question is whether or not the FCC will still require the cable industry to deliver a two-way version of CableCARD. Assuming that CableCARD 2.0 could be delivered to consumers by the end of 2006, that’s at least two years sooner than DCAS. If the cable industry is allowed to skip CableCARD 2.0, companies such as TiVo, which has been planning a CableCARD 2.0-based DVR for some time, will be locked out of the market until late 2008 at the earliest.

The DCAS vs. CableCARD situation is one that bears watching, especially to see how the consumer electronics industry responds to the NCTA’s report and proposed timetable. We may be further away from fully cable-compatible set-top boxes at your local retailer than it looked just a few weeks ago.

Friday, December 02, 2005

Blu-Ray: Everything Old is New Again

According to this article, Sony has decided to release its initial Blu-Ray (high definition) discs using the same MPEG-2 compression that today’s DVDs use, even though the Blu-Ray specification requires all compatible players to support both MPEG-4 AVC and SMPTE VC-1 (Microsoft’s WMV9 HD) formats. Sony’s argument is that MPEG-2 encoders are used by every DVD authoring system on the planet, while AVC and VC-1 encoders are still both rare and expensive. The capacity of Blu-Ray discs (25-50GB) is more than sufficient to store a full-length HD movie in MPEG-2, so Sony will stick with MPEG-2 until they need to use one of the more space-efficient alternatives.
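The capacity claim is easy to check with back-of-the-envelope arithmetic. The bitrates below are my own ballpark assumptions (roughly 24 Mbps for HD MPEG-2, 10 Mbps for AVC or VC-1), not figures from the article:

```python
# Rough capacity check for the claim above, using assumed bitrates.
def movie_size_gb(bitrate_mbps, hours):
    """Size of a constant-bitrate video stream in gigabytes (10^9 bytes)."""
    bits = bitrate_mbps * 1_000_000 * hours * 3600
    return bits / 8 / 1e9

mpeg2 = movie_size_gb(24, 2)  # ~21.6 GB: fits a 25 GB single-layer Blu-Ray disc
avc   = movie_size_gb(10, 2)  # ~9.0 GB: less than half the MPEG-2 size
```

At these assumed rates, a two-hour MPEG-2 feature does indeed fit comfortably on a single-layer disc, which is why Sony can defer the newer codecs until it needs the space for extras or longer films.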

Sony’s move has come as a surprise to most of its partners in the Blu-Ray consortium, not to mention the DVD authoring houses that are gearing up to support Blu-Ray. If MPEG-2 is more than sufficient for the task, why did Sony bother to include AVC and VC-1 in the Blu-Ray standard in the first place? AVC and VC-1 will add cost and complexity to every Blu-Ray player made; without those codecs, players will be considerably less expensive.

One possible outcome of Sony’s decision is that it might actually reopen the door for HD discs based on red-laser DVD technology. If Sony is saying that it doesn’t think that consumers really want or need all the special features that would be possible using a next-generation codec, why can’t AVC- or VC-1-compressed HD films be put onto conventional DVDs? After all, that’s what Microsoft has been doing for two years now with its WMV HD DVDs. Using MPEG-2 also neutralizes the capacity advantage that Blu-Ray has over HD DVD, which will likewise support both AVC and VC-1. Encode in a next-generation format for HD DVD, and get all the special features; encode in MPEG-2 for Blu-Ray, and be stuck with the same level of special features as today’s DVDs.

I suspect that Sony’s move was designed to eliminate one source of technical risk from Blu-Ray’s launch, but it sends a mixed and confusing message to Sony’s partners and the market in general. If MPEG-2 is “good enough,” then maybe the DVDs that we all know and love, with the right codecs, are good enough as well.

The Evolution of Mass

Back in the dot-com era, the gospel was that the age of mass media and mass marketing was coming to an end. The focus was shifting to one-to-one experiences. Fast forward to late 2005, and the real story is very different. Mass is no longer a dirty word—it’s essential for success. We’ve gone through two stages of the evolution of mass, and we’re now in the third. Let’s look at where we are and how we got there:

Stage One: Mass Aggregation

This is where things started post-World War II: The goal was to get as many people as possible in one place, so that you could sell them as much mass-produced stuff as possible. This is the model that Yahoo!, Excite, Ask Jeeves, etc. originally followed: Aggregate as many eyeballs as possible, keep them for as long as possible, and sell them to advertisers. This is also the foundation for Internet commerce: and eBay work only because so many consumers come to their sites to buy.

The big difference between mass aggregation up to the late 1990s and mass aggregation today is the composition of the mass. The old mass model discounted differences in the audience, and sold everyone who showed up the same things. If you watched “All in the Family,” advertisers assumed that you’d probably buy the same type of car and drink the same kind of beer as everyone else who watched the show.

Today’s mass audience is comprised of a host of micro-audiences, each with its own interests and reasons for being there. Someone may go to Yahoo! to get driving directions to a meeting, investigate job opportunities, read the latest entertainment news or listen to music. It’s the aggregate of all these micro-audiences that makes Yahoo! so valuable to advertisers. Advertisers can pitch to everyone who comes to the site, everyone looking for directions in a certain city, or everyone who happens to be interested in model railroading.

Stage Two: Mass Participation

Yahoo! Groups.’s customer reviews. eBay’s feedback. All of these are examples of mass participation. Information and ideas come not just from one source, but from many. Participants share their experiences and insights with one another, which dramatically increases the value and credibility of the information. On a shopping site like Pricewatch, in addition to the vendor-supplied product specifications and prices, you’ll also find ratings and reviews contributed by other customers. Just the number of reviews that a given vendor has provides valuable information; the more reviews, the more reliable the rating is likely to be. It brings an entirely new dimension to a purchase decision that would have previously been based primarily on price.
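The intuition that more reviews make a rating more reliable can be captured with a simple damped average, which pulls small-sample ratings toward a neutral prior. This is a generic sketch of the idea, not how any particular shopping site actually computes its scores; the prior values are arbitrary assumptions.

```python
# Sketch of why review count matters: shrink each vendor's average
# rating toward a neutral prior, weighted by the number of reviews.
# The prior mean (3.0 stars) and prior weight (10 reviews) are arbitrary.
def damped_rating(avg, n_reviews, prior_mean=3.0, prior_weight=10):
    return (prior_weight * prior_mean + n_reviews * avg) / (prior_weight + n_reviews)

# A perfect 5.0 from only 2 reviews scores lower than a 4.5 from 200:
few  = damped_rating(5.0, 2)    # ~3.33
many = damped_rating(4.5, 200)  # ~4.43
```

With many reviews the prior washes out and the displayed score approaches the raw average; with few reviews the score stays near neutral, which is exactly the "more reviews, more reliable" effect.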

The participants generally aren’t creating “freestanding” content. Their contributions are meaningful only in the context of the host site’s own content (or goods and services.) For example, eBay’s buyer and seller feedback only makes sense when considering whether or not to do business with someone on eBay itself. Comments in a Yahoo! group make sense only within the context of the other messages and comments in the group. The next stage, however, expands the impact of individual contributions from a single site to a much bigger arena.

Stage Three: Mass Empowerment

Create a blog, and you can discredit evidence from a broadcast news story or report first-hand about government ineptitude in the wake of Katrina. Put your PC up for sale on eBay, and you become a sophisticated electronic merchant serving potential customers anywhere in the world. This is mass empowerment.

The objective of mass empowerment is to enable individuals to compete with the giants of media and commerce on a level playing field. It’s democratic, egalitarian and highly chaotic. Individuals change their roles on the fly: Read a variety of blogs to figure out the zeitgeist of a particular topic, then write your own blog entry and link back to your sources. Buy some shop equipment on eBay, use it to build custom furniture and then sell the furniture on eBay. Both network effects and virtuous circles arise from mass empowerment.

eBay is the clear leader in commerce mass empowerment, although’s Marketplace is also popular with sellers of books, music, videos and similar goods. Google and Yahoo! are fighting for leadership in media mass empowerment, but AOL and MSN are also major players. Apple’s iTunes has become the leading distributor of podcasts, but Yahoo! is climbing fast. Google and Apple are leading in the early rounds of videocasts (or vodcasts), but there’s plenty of opportunity for other aggregators to jump into the fray.

Where To Now?

The exciting thing about the Internet has always been the fact that even if future directions are obvious, how we get there usually isn’t. Internet search was considered a commodity before Google turned it into an incredible growth engine. Internet radio died a protracted death at the hands of the recording industry, only to be resurrected as podcasts, most of which are beyond the control of entrenched business and government interests. I believe that vodcasts are the next great frontier of mass empowerment: there’s enormous room for new players to jump in with production, editing and indexing software that will make creating vodcasts much easier and more productive.

The stage beyond mass empowerment may well be mass collaboration. Once there’s a critical mass of individual producers, the next step is to start getting individuals working together across the Internet. Wikis are an obvious first step in this process, but there are plenty of opportunities to enable more sophisticated peer-to-peer collaboration. For example, a group of individual eBay merchants could pool their expertise and inventories to create a virtual mass merchant with a single identity, even though the individual members are scattered all over the world. A virtual newswire can be created by a group of individual bloggers, either for a single event (e.g., an election or a conference) or on an ongoing basis. Vodcasters can parcel out the many tasks that go into creating video content to collaborators across the country and around the world.

No matter what the next stage turns out to be, the combination of mass and micro will drive the evolution of media and commerce.

Sunday, November 27, 2005

Your Next Set-Top Box, Part 2

In Part 1 of this topic, I introduced the concept of the "One Box," a set-top box (STB) that can serve as a digital cable box, DVR, DVD/blue-laser player, game console and media center. I proposed that One Boxes will most likely be based on Microsoft's XBOX 360 and Sony's Playstation 3. In this second and final part, I'll examine the challenges and opportunities that the One Box brings to major industry players.

Cable MSOs

In my opinion, the cable operators never really wanted to get into the set-top box business, but they were forced to do so, first for technical reasons and then for security and value-added services. In the early days, when TVs only had screw-type antenna terminals, cable operators could get by with cheap baluns that converted coaxial cable signals to television antenna inputs. Eventually, virtually all televisions included coaxial cable inputs, which eliminated the need for a balun. However, in these simple systems, it was very easy to steal cable signals, and almost impossible for cable operators to track thieves down. To combat theft, cable operators switched to addressable set-top boxes, which enabled them to turn service on or off at a central location, as well as identify people who were getting cable service but not paying for it.

When cable systems went digital, there were a vast number of digital channels that a television set couldn't tune to. The FCC requires cable operators to provide a low-cost analog service for those customers who don't want or can't afford digital cable, but any viewer who wants more than the limited analog channel lineup has to upgrade to a digital STB. With digital boxes, cable operators can provide interactive program guides (IPGs), enhanced pay-per-view and video-on-demand (VOD) services. The latest digital STBs have integrated DVRs, which permit local storage and replay of programs.

Each improvement makes STBs more complex and expensive. At present, MSOs purchase set-top boxes for $100 to $300 each. They have to stock, maintain and refurbish boxes. As boxes are lost, stolen or destroyed, they have to replace them. And, the IRS requires cable operators to write off their investment in STBs over seven years. The result is that MSOs take a lot of time deciding on whether, when and with what to replace their existing STBs. The upfront cost can range into the billions of dollars for the largest MSOs.

Now comes the One Box. Unlike cable STBs, the One Box will be updated on consumer electronics timelines. For example, the original Microsoft XBOX started shipping in the U.S. before Christmas 2001. Four years later, it's been superseded by the XBOX 360. (The Playstation 2 will be about six years old when it's superseded by the Playstation 3.) While neither the XBOX 360 nor the PS3 is "open," they're both expandable. In about the time that an MSO can upgrade its entire installed base to a new generation of STBs, consumer electronics companies can ship two generations of their products.

If third-party STBs take off, cable operators will get stuck with thousands or even millions of boxes that they no longer need. The MSOs can't require their customers to rent STBs from them because of an FCC ruling that requires cable operators to support third-party "navigation devices" (set-top boxes, DVRs, television receivers, etc.) by July 1, 2007, starting with the six largest MSOs. Cable operators must provide customers using such devices with a CableCARD (TM), which performs the same security and service-support functions as their digital STBs. Today's CableCARD 1.0 standard doesn't support electronic program guides, video-on-demand or interactive applications, and only includes a single tuner (which makes recording one program while watching another impossible). CableCARDs conforming to the 2.0 specification will support VOD and interactive applications, and will include two tuners for more DVR functionality; these cards are expected to ship by the end of 2006. However, existing CableCARD slots won't support the 2.0 standard; they'll have to be redesigned to be compatible.

Cable operators rent out CableCARDs, of course, but the cards generate much less income (CableCARDs typically rent for one-third or less of the monthly cost of a digital STB). CableCARDs are also much easier to steal or lose, and they can't be repaired or refurbished; if they're damaged, they have to be replaced. So cable operators are stuck with two inventories: one of STBs, the other of CableCARDs. They can't drop the STBs because most of their customers still can't use CableCARDs, yet they have to supply CableCARDs and thus make much less money from rentals. Over time, there will be more and more CableCARDs and fewer and fewer standalone STBs.

Consumer Electronics Manufacturers

Consumer electronics companies have wanted to get into the set-top box market for years, but for all practical purposes, they're locked out of the U.S. cable business due to the MSOs' massive investments in Motorola and Scientific-Atlanta (soon to be Cisco) equipment. The limitations of CableCARD 1.0 (one-way communications only) make it an unacceptable substitute for integrated STBs. The formal launch of CableCARD 2.0 and related software will for the first time put the CE manufacturers (including Microsoft and Sony) on an even footing with the legacy STB vendors.

While Microsoft and Sony are uniquely positioned to deliver One Box hardware, there are other vendors who are very close. For example, TiVo announced last year its intention to offer a combination HD DVR/STB as soon as 2.0 CableCARDs ship. Humax, a TiVo licensee, already combines DVR functionality with a built-in DVD recorder, thus providing three of the four elements of the One Box. LG, Panasonic, Philips, Samsung, Sharp and Toshiba (among others) all sell DVR/DVD recorder combos that can incorporate full digital STB capabilities with CableCARD 2.0.

There's no particular magic involved in building an STB/DVR/DVD recorder; all the hardware is readily available. The challenge is integration: Getting all the pieces to work together seamlessly and transparently to the user. The key is software--both the user interface and the "behind-the-scenes" services and system management software. In this area, Microsoft and TiVo are well ahead of Sony and the other CE manufacturers. TiVo's true "value-adds" are its intuitive user interface, powerful features and online program guide, all implemented in software on a generic Linux platform. Microsoft licenses the Microsoft Program Guide, a software and services package that gives Windows XP Media Center Edition-equipped PCs TiVo-like functionality, to CE manufacturers (the first being LG Electronics). No matter who manufactures the hardware, the "arms merchants" that make everything work will be software and services experts like Microsoft and TiVo.

Content Aggregators

The One Box could well become a godsend for content aggregators, placing them for the first time on a level playing field with the cable and satellite operators. The cost, time and effort required to launch a new cable network are enormous. The vast majority of proposed cable networks never get off the ground because of a lack of funding, a lack of available channels (even on digital tiers), or both. The fastest way to get a new network on a large number of cable systems is to buy an existing one. For example, Al Gore and Joel Hyatt led an investment team that acquired Newsworld International from NBC Universal for $70 million, and transformed Newsworld into Current, a current events network targeting the 18-to-34 audience. By buying NWI, Current got into 19 million households, most of which came from DIRECTV, but the network has no chance of being successful until it gets into most U.S. households through cable. To do so, Current will have to pay cable operators to carry the network, at least until its advertising revenues and audiences are large enough that MSOs will start paying carriage fees to Current. Several hundred million dollars will be sunk into the network before it becomes available to the majority of television households. The One Box could eventually make it practical for future networks like Current to launch directly into consumers' homes via the Internet, dramatically decreasing the time and money needed to reach "critical mass."

Startups Akimbo and DAVE Networks are examples of IPTV distributors that license video content from a variety of sources and then make it available to consumers through proprietary set-top boxes and software. To date, the "take rate" for their STBs and services has been low, in part because most consumers don't want to add yet another set-top box to their living rooms. The One Box will enable these companies to offer content and services without requiring consumers to purchase their STBs. They can get out of the STB business and focus on their content and services.

The One Box is also attractive to content aggregators such as Google, MSN and Yahoo, which can provide their own friendly user interfaces and "electronic program guides" for the content that they host on their own systems and promote for others. For example, one can readily envision a Yahoo! program guide running on the One Box that enables viewers to stream and download news, sports and entertainment. Content on the Internet will be almost as convenient to find and watch as content on cable, so long as the Internet powerhouses are willing and able to develop a look and feel for their services that's comparable or superior to what consumers already get with their cable set-top boxes.

Integrated Set-Top Box Manufacturers

In the One Box era, the incumbent STB manufacturers will have their work cut out for them. On the one hand, FCC rules require the MSOs to replace their existing integrated STBs with CableCARD-equipped models, which will open up an enormous replacement market years ahead of time. On the other hand, there's absolutely no reason why MSOs have to buy their open STBs from Motorola, Scientific-Atlanta, Pioneer, Pace, etc. The incumbents will manufacture and ship millions of CableCARDs to the MSOs, but the cards sell for a fraction of the price of integrated STBs. As a result, their revenues will drop dramatically unless they can convince MSOs and consumers that their STBs are preferable to those from consumer electronics companies. STB manufacturers have never had more than a handful of serious competitors for sales to cable operators, but that era is coming to an end.

Friday, November 18, 2005

Set-Top News: Cisco Buys Scientific-Atlanta

Earlier this morning, Cisco Systems acquired Scientific-Atlanta for $6.9 billion. Scientific-Atlanta is one of the two world leaders in the set-top box market, along with Motorola. The acquisition instantly turns Cisco into a pivotal player in the cable industry.

In 2003, Cisco acquired Linksys, which was (and continues to be) the leader in networking equipment for SOHOs (small offices/home offices.) While Linksys’s products appeal to technically oriented users, the company has had trouble penetrating the home entertainment market. Scientific-Atlanta gives Cisco entrée into home entertainment with a huge installed base (S-A has already shipped 2.6 million high-definition set-top boxes.)

I expect that Cisco’s networking expertise will start getting built into S-A’s products as soon as the acquisition is finalized. A good deal of technology is likely to be shared between S-A and Linksys, with S-A building “ruggedized” versions of products for cable operators and Linksys building consumer-grade versions for sale at retail. Cisco is also acquiring S-A’s cable infrastructure products, which will enable them to supply end-to-end solutions to the cable industry for the first time.

Look for some very interesting announcements as early as the Consumer Electronics Show in January, and major product & strategic announcements at the National Cable Television Association’s (NCTA) national conference in April. In short, this is very big news. The 900-pound gorilla of the networking business is about to become at least a 400-pound gorilla in cable.

Sunday, November 13, 2005

Your Next Set-Top Box

You’ve probably got a set-top box in your living room. Most likely, it’s a cable model made by Motorola or Scientific-Atlanta. It might have a DVR with a hard disk built in, but most likely, it gives you access to analog and digital channels, perhaps on-demand programming, and maybe HD (if you pay extra). That’s it.

If Microsoft and Sony get their way, however, your next set-top box will look suspiciously like an XBOX 360 or a Playstation 3. A game console as a set-top box? In reality, the next-generation systems are called game consoles only because that’s the easiest way to categorize them. The XBOX 360 is the obvious follow-on to the XBOX, just as the Playstation 3 was born from the original Playstation and Playstation 2, but game playing is only the tip (albeit a very big tip) of the iceberg.

For years, consumer electronics, computer, cable and satellite market analysts have searched for a “Holy Grail” that will bring about the great age of Convergence. Today, if you want a digital cable receiver, DVR, DVD player and game console, you’ll need as many as four different boxes, all of which have to be connected to each other and to other home theater equipment. It’s a nightmare--when something goes wrong, where does the consumer go for help? One version of the Grail is what I’ll call the One Box: One set-top box that does it all…one box to replace all four boxes. Nothing to interconnect because everything is in one box, and a handful of connections to the home theater system.

Digital cable (or satellite) receivers and DVRs have been integrated by the major cable set-top box manufacturers, as well as TiVo and Pace (both for DIRECTV.) These integrated receivers are being adopted much faster, and in much greater numbers, than standalone DVRs. That merges two boxes into one. The current generation of game consoles can play CDs and DVDs, so they can replace a standalone DVD player (merging another two boxes into one,) although DVD players are so cheap and so much more flexible than the current game players that there’s little reason to substitute one for the other.

That still leaves at least two boxes for the consumer to deal with, each with its own set of connections, remote controls and “looks & feels.” One Box Nirvana is reached with a single box that does all four key functions, plus new functions that are enabled by combining everything together. That’s where the next-generation game consoles come in.

The XBOX 360 and Playstation 3 can of course play videogames superbly, on conventional definition and HD screens, in stereo or 5.1 surround sound. Their built-in optical drives (DVD in the XBOX 360, Blu-Ray in the Playstation 3) will be a match for any standalone player on the market. There won’t be any reason to keep a standalone player if you have a next-generation console. That removes one box (the standalone disc player.)

The next two pieces are a little tougher. The most challenging job is to integrate a digital cable or satellite receiver into the game consoles. Both the XBOX 360 and Playstation 3 have the horsepower to handle all the functions that set-top boxes now perform—not just tuning to a channel, but also interactive program guides, on-demand and pay-per-view programs. However, in their current incarnations, neither console will incorporate a slot into which a module (a CableCARD) can be inserted in order to add set-top box functionality. There are ways to work around this, and both Microsoft and Sony have undoubtedly already chosen their approaches (either a module that can be connected to their game consoles, or a different version of their consoles with CableCARD slots.)

Note that this isn’t a problem for consoles connected to IPTV systems (the kinds of video networks proposed by Verizon and SBC, among others.) Those systems use Ethernet connections, like the ones already built into the XBOX 360 and Playstation 3, to send content to and from the set-top box. If you’re using IPTV, skip this step; no receiver required.
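For readers curious about what “tuning” means on an IPTV network, here’s a minimal sketch in Python of the transport side. The multicast group address and port below are invented for illustration (real deployments assign their own per channel), and everything above the transport layer--decoding the video stream, the program guide--is omitted. The point is that changing channels becomes joining a different multicast group over the Ethernet connection:

```python
import socket
import struct

# Hypothetical values for illustration only -- real IPTV systems
# assign their own multicast groups and ports for each channel.
GROUP = "239.0.0.1"
PORT = 5004

def multicast_membership(group):
    """Build the membership request that asks the kernel (and, via
    IGMP, the upstream routers) to deliver this group's traffic --
    the IPTV equivalent of tuning to a channel."""
    return struct.pack("4sl", socket.inet_aton(group), socket.INADDR_ANY)

def open_iptv_socket(group=GROUP, port=PORT):
    """Return a UDP socket joined to a multicast group, ready to
    receive a video stream carried over the home's Ethernet."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))  # listen on all interfaces
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    multicast_membership(group))
    return sock
```

No tuner hardware is involved: a “channel change” is one multicast join away, which is why a console with an Ethernet port needs no CableCARD on an IPTV system.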

The last piece, the DVR, falls into place once the receiver is added. Both the XBOX 360 and Playstation 3 need at least 80GB of hard disk space to be usable DVRs. Microsoft’s hard disk will only be 20GB--enough to store game data and lots of MP3s, but nowhere near enough for video recording. Sony hasn’t announced the hard disks available for the Playstation 3, so it remains to be seen if they’ll be limited. In either case, however, adding larger-capacity disk drives is only a matter of how much more consumers are willing to pay, not technical limitations.
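A back-of-envelope calculation shows why 20GB falls short; the recording rates below are my own rough assumptions for 2005-era digital video, not official figures:

```python
def recording_hours(disk_gb, gb_per_hour):
    """Hours of video a disk can hold at a given recording rate."""
    return disk_gb / gb_per_hour

# Assumed rates: ~2.5 GB/hour for standard definition,
# ~8 GB/hour for high definition (rough mid-range estimates).
for disk_gb in (20, 80):
    sd = recording_hours(disk_gb, 2.5)
    hd = recording_hours(disk_gb, 8.0)
    print(f"{disk_gb}GB disk: about {sd:.0f} hours SD, {hd:.1f} hours HD")
```

At those rates, a 20GB drive holds only a couple of hours of HD, while 80GB comfortably stores a few dozen hours of standard-definition programming.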

In short: Add a digital cable or satellite receiver, along with at least 80GB of disk space, and either the XBOX 360 or Playstation 3 becomes the One Box. This analysis doesn’t even begin to take into account all of the additional things that become possible after you’ve combined all the functions. For example, you can connect the One Box to your network and share audio, video, images and games with your PCs and Macs. You can take programs recorded by the One Box’s DVR and make them available to your notebook computer, anywhere on the Internet, or to your cell phone or media player, anywhere at all. You can create podcasts and videocasts, and then stream them anywhere via the One Box.

In Part 2, I’ll discuss why the One Box represents both a threat and opportunity to the biggest players in the business.

Wednesday, November 09, 2005

A Two-Track Mind

My first job after college was working for Hewlett-Packard in Corvallis, Oregon. I spent a year as a product manager, and then transferred to the Lab to work as a software engineer. While in the Lab, I took note of a very unusual promotion policy that was unique to engineering. HP had a “two-track” promotion system: Engineers could take the conventional path and advance through the ranks by taking on management responsibilities. But for engineers who were much more comfortable as individual contributors, who didn’t want to deal with supervisory duties, or who would otherwise be promoted into management positions they weren’t suited for, HP established a parallel promotion path. Those engineers could choose to remain individual contributors, but get more senior titles and more money based on their performance.

At the time, I couldn’t imagine why anyone would want to take this path. I’d just graduated from an MBA program and thought that I should already be running the company. Why wouldn’t someone want more power and influence?

It took me a long time to figure it out, but after many years observing what really goes on in all kinds and sizes of businesses, I finally got it. There are a lot of excellent people who don’t have the temperament, desire or skill to manage others. They’re superb at what they were originally hired to do (e.g., engineering, sales or accounting), but for them to get promoted, they have to take on supervisory tasks that they don’t want or can’t handle.

How many of us know salespeople who were superb in the field, and then got promoted to sales manager and mismanaged their sales teams? Their employers lost two ways: First, they took one of their best salespeople away from selling, and second, they increased tension and lowered close rates for their other salespeople.

Why can’t a superb engineer stay an engineer? Why can’t a great salesperson keep on selling? Two of my uncles are examples of how this can work. My uncle Ben was considered to be the world’s greatest insurance salesman during his lifetime (Chairman Emeritus of the Million Dollar Roundtable, etc.) He worked for New York Life in a little town in Ohio, and for years his managers at headquarters wanted him to become a manager and supervise much, if not all, of the sales force. He always turned them down. He loved nothing more than to call on customers and sell them life insurance. He was an incredibly successful individual contributor. At the end of the day, New York Life would probably have hurt itself by promoting Ben to headquarters, and Ben knew it. He stayed a field agent until the day he died.

My uncle Abe joined the Navy during WWII. He decided to make the Navy his career, and stayed for almost 30 years. He got promoted to Chief Petty Officer, one of the highest non-commissioned positions in the Navy. For years, time after time, Abe was offered promotions to commissioned rank, and time after time, he turned them down. He was perfectly happy as a CPO, and by his retirement, he was one of the most experienced CPOs in the Navy.

Looking back at my time with HP, I probably would have been happier on a non-management track, at least until I learned the people skills, patience and even-handedness that are hallmarks of good supervisors. A lot of people would appreciate and benefit from the option of moving up without moving into management, and a lot of companies would retain more people doing a better job.

There are a lot of companies (and organizations) that would benefit from a two-track promotion system. You may be working for one. Declining a promotion to supervisory management shouldn’t be career suicide. Think about it the next time you get offered a promotion or look for a new job.

Tuesday, November 08, 2005

The Right Problem

In the late 1970s, when I was studying for my MBA at Northwestern’s Kellogg School of Management, I took a course in Channels of Distribution, taught by Lou Stern. At the time, Lou (now retired) was considered the world’s leading expert on channels: the marketing process necessary to get goods and services from manufacturers to final consumers, and the entities (“members”) through which the goods and services change hands along the way.

Case studies were all the rage in the 1970s, and Kellogg was no exception. The Channels class included a case study on Volkswagen of America that took place in the early 70s. Volkswagen’s sales were down and their relations with dealers were very poor. Their dealers were usually located in undesirable locations, and their facilities were poorly maintained. What, asked Lou, had to be changed in Volkswagen’s relationships with dealers for the company to again be successful?

I’d shopped for a Volkswagen during the period covered by the case study, so I had personal experience with the problems with VW dealers. However, it was very clear to me that the problems with the channel had nothing to do with Volkswagen’s real problems; in fact, the dealer problems were merely a symptom.

In the early 70s, VW’s product line consisted of the Beetle, Sedan, Wagon, Karmann Ghia and Bus. VW defined the US economy car market in the 1960s, but by the time of the case study, both the look and the technology of their cars were dated, and their quality, which had once been superb, was in decline. At the same time, Toyota and Datsun had entered the market with cars that were better-looking, more powerful, better engineered and higher quality than VW’s, at lower prices.

Given a choice between VWs and cars that cost less and were superior in just about every way, consumers did the only sensible thing: They stopped buying VWs. Volkswagen’s market share plummeted. VW dealers’ revenues dropped through the floor. They didn’t have enough money to maintain their dealerships, move to better locations, or advertise.

What would have happened if VW had fixed all its channel problems? Loaned their dealers money to clean up their buildings or move to better locations? Pumped more money into local advertising? Improved their relationships with the dealers? None of that would have fixed much for very long, because the real problem was Volkswagen’s products, not their channel of distribution. The cleanest, most modern dealerships still couldn’t sell crappy cars.

In my report on the case study, I said all of that (in more words, of course, since length counted.) Lou wasn’t impressed. He gave me a C+. Lou was wrong, which leads us to:

Rule #1: You have to know what the real problem is in order to fix it.

Far too many organizations solve the wrong problem. McDonald’s was struggling while other fast food chains were growing. To turn the company around, they tried a bunch of things: They added playgrounds to their restaurants to attract more families, dumbed down their cash registers to compensate for their huge employment attrition rate (at one point McDonald’s was turning over almost its entire workforce every year,) and cleaned up the restaurants. They launched McRib sandwiches and hamburgers with artificial beef made with seaweed. They completely redesigned their sandwich production process to speed up delivery and make sandwiches fresh for every order. They experimented with cafes attached to their restaurants (McCafes) in cities with Starbucks on every corner.

What happened? Their sales continued to drop. McDonald’s solved many problems, except the core one: Their food. The core menu was old, and the beef patties had no more flavor than the buns. The seaweed beef substitute tasted like cardboard. Even switching to a healthier frying oil for French fries backfired: The new fries didn’t taste as good as the ones fried in the fattier oil, so McDonald’s had to switch back to the original formula.

While McDonald’s was adding playgrounds, cleaning the restaurants, speeding up delivery, etc., their focus, energy and resources were diverted from fixing the core problem. It’s only in the last 18 months that McDonald’s has focused on its food. They’ve launched a family of chicken sandwiches that are both healthier than hamburgers and taste good. They’ve worked hard to improve the flavor and quality of their hamburgers. They’re starting to replace their coffee (known for being very hot but not very good) with a much better coffee made from Fair Trade beans.

McDonald’s sales are trending up for the first time in several years, because they’re finally getting the food right.  If McDonald’s had been fully focused on improving their food instead of playgrounds and production processes, they would have turned the company around years earlier. Even if they tried and failed repeatedly to get the food right, they would have committed the resources and effort necessary to eventually solve the problem. Which leads us to:

Corollary #1: Doing a great job of fixing the wrong problem can often be worse than doing a bad job of fixing the right problem.

Focusing on the wrong problem diverts the resources necessary to fix the real problem. During his regime at Ford, Jacques Nasser fired most of the senior managers who knew anything about making and selling cars, made a huge investment in Internet marketing, bought Volvo, Land Rover, Jaguar, Aston Martin and operating control of Mazda, then bought junk yards and went on the record saying that his goal was to get Ford’s revenues from cars and trucks down to less than 50% of the total. In other words, he fixed everything but the real problem: Ford’s own vehicles were mediocre at best. Quality was poor and designs were unappealing.

You can’t fix the problem of not selling enough cars by selling fewer cars. Hence, Bill Ford allowed Nasser to spend more time with his family, and took over running the company. Ford hasn’t done a great job of running the company either, but at least he has the company focused on the real problem: Improving the design and quality of its cars and trucks.

In the car business, there’s virtually no problem that can’t be solved with better products. However, I don’t mean to imply that products are always the problem. For example, Jones Soda has great products, but their sales potential is limited by a dearth of local distributors that are both strong and unencumbered by exclusive distribution deals with companies like Coca-Cola and Pepsi. That’s a channel problem. Making their cherry soda taste better is great, but it doesn’t get it into more stores.

My point is simple: In order to fix the real problem, you have to know and focus on what the real problem is.

Thursday, November 03, 2005

Big TV, Little TV

Not long ago, the New York Times published an article suggesting that “television” is splitting into two related, but significantly different, media. “Big TV” is watched on ever more giant monitors, takes advantage of all the latest presentation bells and whistles (HD, 5.1 sound, etc.), carries content that comes in multiples of 30 minutes, and is (usually) professionally produced. “Little TV” is watched on ever smaller screens, such as mobile phones, the new iPod and media players. Programs are designed to work well with a tiny screen and mono sound, so there are no bells and whistles. Little TV programs are typically short (6 to 10 minutes,) but they can be any length. Finally, with a few exceptions, Little TV content is produced by individuals, not by a studio or large production company; while the quality can be very good, it’s rarely up to “professional” standards.

In days gone by, using Marshall McLuhan’s taxonomy, Big TV would be a “hot” medium and Little TV would be “cool.” Big TV invites passive observation, while Little TV invites active participation. Not only is that true for the audience, it’s true for producers as well. Big TV is usually produced months before it’s seen by an audience, and Big TV producers rarely care whether or not individual audience members actually like their programs. They only care about ratings, a handful of aggregate numbers around which the entire Big TV universe revolves.

Little TV, on the other hand, is usually delivered piping hot to the audience, no more than days (and sometimes, seconds) after it’s been produced. If audience members like or dislike a program, Little TV producers will hear about it instantly, on their own and other people’s blogs and by email. Little TV programs have too small and dispersed an audience to be measured by ratings services, so Little TV producers rely on audience feedback and server hits to determine their programs’ popularity.

The tools used to create Big TV and Little TV are very different as well. Big TV uses everything from HD cameras to professional microphones, and programs are edited on Avid or Apple systems that get more “professional” as they get more expensive. While the cost of production and post-production equipment for Big TV has dropped precipitously, it’s still expensive. Even more important, Big TV production requires significant training and experience in order to get the best results. Big TV tools come from companies like Avid, Autodesk, Digidesign, Euphonix, Panasonic, Quantel and Sony—cameras that cost upwards of $100,000, post-production systems that can easily reach $500,000, and audio systems from a few thousand to hundreds of thousands of dollars.

By contrast, Little TV programs cost next to nothing to produce. All that’s needed is an inexpensive DV camcorder, a decent microphone or two, a PC or Mac with basic media editing software and an Internet connection. Even if a producer uses a HD camcorder, great audio equipment and a full-blown post-production system, the benefits of all the bells and whistles will be lost on the Little TV screen. The big players in Little TV are companies like M-Audio and Pinnacle Systems (both owned by Avid,) Adobe, Apple, Panasonic and Sony—products that cost from a few hundred to a few thousand dollars.

For example, a producer can trick out a dual-processor, dual-core Apple G5 with lots of memory, a high-speed RAID array, a big HD monitor, a HD audio/video card from Aja or Blackmagic Design, Final Cut Studio and some supporting software for around $25,000. That system will do everything up to and including HD post-production for broadcast or film. Or, for Little TV, that same producer could put together a system with a 20” 2.1GHz G5 iMac, 2.5GB of RAM, 500GB of hard disk space and Final Cut Studio, for around $3,600. That system will do everything a producer needs for Little TV, but if even $3,600 is too much, there’s always the same Mac with 1.5GB of RAM, 250GB of hard disk space and iMovie for under two grand.
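Using the figures above (and pricing the iMovie setup at a flat $2,000 for “under two grand,” my assumption), the cost gap works out as follows:

```python
big_tv_rig = 25_000    # tricked-out G5 with HD I/O and Final Cut Studio
little_tv_rig = 3_600  # 20" G5 iMac with Final Cut Studio
entry_rig = 2_000      # same iMac with iMovie (assumed flat "two grand")

for name, price in [("Little TV rig", little_tv_rig),
                    ("Entry rig", entry_rig)]:
    ratio = big_tv_rig / price
    print(f"{name}: ${price:,} -- {ratio:.1f}x cheaper than the Big TV rig")
```

Even the fully equipped Little TV system costs roughly a seventh of a broadcast-grade rig, and the entry setup less than a tenth.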

Let’s review: Versus Big TV, Little TV is cheaper and easier to produce, cheaper to distribute, and is available anywhere, any time. With all these advantages, it’s no wonder that Little TV is growing much faster than Big TV. Feedburner counts almost 23,000 unique podcast channels. The number of videocasts is currently a small fraction of the podcast count, but the number will grow quickly as production and distribution tools become easier to use.

The comparison between Big TV and Little TV reminds me of what happened when MP3 came on the scene. The record labels were pushing the next generation of audio reproduction: SACD and DVD-Audio (“Big Audio”). Both systems required a big investment in new speakers and amplifiers, not to mention new versions of many of the CDs that listeners already owned, in order to take full advantage of their quality. On the Little Audio side was MP3. It didn’t sound as good as Big Audio, and it didn’t have 5.1 channels, but it was far more convenient (thanks to portable players,) and thanks to Napster and its ilk, it offered a huge library of music for free. The Big Audio folks ignored Little Audio until it just about killed them.

The same thing is happening with Big TV vs. Little TV. Big TV advocates are pushing for high-definition disc systems (Blu-Ray and HD DVD,) which require a big investment in better HD monitors, new blue-laser players and new versions of many of the DVDs that viewers already own, in order to take full advantage of their quality. On the other hand, Little TV will play on many of the cellphones that people already own, Sony’s PSP game system, the new iPod and other portable video players. Enterprising computer programmers have already figured out how to get DVDs, television shows and videocast material to play on Little TV players. Little TV is far more convenient, and there’s a huge library of programs for free. Déjà vu, anyone?

Big TV advocates think that a video iPod will never compete with a HD system with surround sound and Blu-Ray. They’re wrong. Picture and audio quality are not the purchasing drivers. If they were, it wouldn’t have taken an act of Congress to force consumers to buy digital television receivers. No, what’s driving consumers is convenience and cost. When you can buy everything you need to watch Little TV for $299, and it’s far more convenient and much cheaper than Big TV, the decision is a no-brainer.

If you’re a Big TV producer or distributor--a television or cable network, television station or cable operator--you should be scared by Little TV, but you have a chance of avoiding the music industry’s quagmire by adapting to Little TV rather than ignoring it or trying to kill it. You can’t kill it, any more than newspapers killed radio or radio killed television. In Darwinian terms, adapt or die.