Friday, December 30, 2005

Showdown at the IPTV Corral

According to this story, SBC (soon to be AT&T) has filed suit against the city of Walnut Creek, CA. To summarize, SBC is upgrading its telephone network in Walnut Creek to support broadband data and video. As part of the installation, SBC has to run fiber optic cables under a bikeway owned by the city, and requested a permit to close access to the bikeway while work was in progress. Walnut Creek’s city administration refused to issue the permit unless SBC agreed to get a local cable franchise. In its lawsuit, SBC claims that both federal and state laws say that SBC’s service will not be a cable service, and so shouldn’t be required to get a cable franchise.

For decades, the cable industry has been subject to local franchise laws. Cable service was effectively a monopoly because it was both uneconomic and infeasible for more than one cable company to serve an area of a city. Thus, municipalities retained the right to approve the cable company that wanted to build in the area. That approval, called a franchise, specifies both the fees that the cable operator has to pay to the city and the services that the company has to provide. These services vary from city to city, but typically include free service for municipal offices and public schools, community channels and studios where local programs can be produced, and at least one tier of low-cost analog service.

Thanks to new technology, the rules have changed. Telcos are now able to offer IPTV television services along with high-speed data and telephone service. Satellite providers aren’t subject to local franchise rules. Even power companies are experimenting with high-speed data services across their power lines that could be used for IPTV. Every one of these services threatens to take away subscribers from the local cable franchisee, and fewer customers means less revenue for both the cable operator and the city.

Cable operators argue that they’re at a cost disadvantage vs. telcos (thanks to those franchise fees), and that therefore telcos should either be subject to the same franchise rules, or cable operators should be relieved of their franchise obligations. In response to a recent Supreme Court ruling, the FCC recently issued a proposed rule finding that “wireline broadband Internet access service” is an information service and thus not subject to local regulation. Cable operators’ broadband Internet services are equally free of local regulation, but here’s the rub: Conventional analog and digital cable services are not information services, so they remain subject to local regulation, while telcos will offer their video services as IPTV over their broadband Internet connections, which are not.

This is one of those situations where both sides have valid points. The FCC proposes to create a level playing field with respect to broadband data services, but unless telco IPTV services are regulated in the same way as cable video services, the cable operators will be at a significant disadvantage. There are literally thousands of local franchises, and if the telcos have to negotiate with each one, rollout of widespread IPTV services could be stalled for years.

The issue will undoubtedly end up back at the FCC. No matter what the FCC rules, however, the losers are going to appeal in federal court, and the case will probably end up back at the Supreme Court. Congress could also jump into the fray with legislation that waives local franchising of cable operators, telcos, or both. The technology for telcos to offer IPTV video already works, but consumers may not get to use it for quite some time.

Wednesday, December 21, 2005

Podcast Perfecto, Part 2

In Part 1, I gave some tips on what to include in your podcast. In this part, I’ll give you some technical tips and tricks for creating and improving the sound of your podcasts. Again, in no particular order:

  • Consider buying a podcast package. A variety of online merchants sell packaged systems (usually with software included). You don’t need to spend a lot of money: Sweetwater has a complete podcasting package from M-Audio for $150. Musician’s Friend (owned by Guitar Center) has a variety of packages from $100 to $280. BSW’s podcast packages run from $169 to $1,699. Make sure that the software in the package is compatible with your PC; some packages are only available for Windows or Mac.

  • Get the right microphone. Buying a mic is very confusing, even for experienced users. There are many brands at prices from under $100 to over $5,000. You don’t have to spend a lot for a mic, but make sure that you get one that’s optimized for voice and narration. Mics have pickup patterns that determine the directions from which they pick up sound. Get a mic with a cardioid or hypercardioid pickup pattern: Cardioid means “heart-shaped,” and this kind of pattern rejects sound to the rear of the microphone. Hypercardioid mics are even more focused on the front and reject sound from the sides as well as the rear.

    There are several technologies that microphones use to capture sound; I’ll discuss the two most common ones. Dynamic microphones don’t require their own power. They’re most commonly used for vocals and musical instruments. They’re rugged and can take a fair amount of abuse. Condenser microphones are more sensitive to quieter sounds than are dynamic mics, and they tend to sound better (although some dynamic mics are equal or superior to condensers in the same price range). Condensers require a power supply, which on most models is an external +48 volt supply (called “phantom power”). Some condensers can be powered by a battery for those times when external power is not available. Condensers are much more fragile than dynamic mics.

    For podcasts, a good dynamic microphone is fine. Dynamic mics don’t pick up soft sounds as well as condensers, but that can be a benefit for podcasts, where a condenser might pick up noise from your computer more easily than a dynamic mic. If you get a condenser, get a large diaphragm model; they’re more sensitive and tend to reproduce sounds more accurately than small diaphragm mics.
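The difference between those pickup patterns is easy to see with the textbook polar-pattern formulas: an idealized cardioid’s sensitivity at angle θ off-axis is (1 + cos θ)/2, and an idealized hypercardioid’s is 0.25 + 0.75·cos θ. (These are the standard idealizations, not any particular mic’s measured response.) A quick sketch:

```python
import math

def cardioid(theta_deg):
    """Idealized cardioid sensitivity (1.0 = on-axis, 0.0 = full rejection)."""
    return abs((1 + math.cos(math.radians(theta_deg))) / 2)

def hypercardioid(theta_deg):
    """Idealized hypercardioid: tighter front pickup, with a small rear lobe."""
    return abs(0.25 + 0.75 * math.cos(math.radians(theta_deg)))

for angle in (0, 90, 180):
    print(f"{angle:>3} deg: cardioid {cardioid(angle):.2f}, "
          f"hypercardioid {hypercardioid(angle):.2f}")
```

At 90° (the sides), the cardioid is still at half sensitivity while the hypercardioid is down to a quarter; at 180° (the rear), the cardioid rejects sound completely, while the hypercardioid has a small rear lobe. That’s the trade-off in numbers: hypercardioids reject the sides better but hear a little from behind.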

  • Buy a pop filter for your microphone. A pop can occur when you say a word like “pizza” or “pack.”  The “p” in these words is called a plosive, and it can momentarily overload a microphone. A pop filter is a small nylon screen that’s placed in front of the microphone to dissipate the energy of plosives. The result is that the microphone hears a “p” but doesn’t overload.

  • Get a good computer audio interface. There are lots of these, and they can be very inexpensive. Some install in your computer (from E-Mu and other companies), some are dedicated interfaces (from Digidesign, Focusrite, Lexicon, Mackie, M-Audio, etc.), and some are audio mixers that include a computer interface (Alesis, Digidesign, Mackie, M-Audio, etc.). PC audio systems built into the motherboard tend to pick up electrical noise from other components, so you’re better off using a separate external interface. Some of the less expensive interfaces don’t have phantom power for condenser microphones, so be sure to check before you buy.

  • Consider getting a headset. Headsets combine headphones with a microphone. Broadcast-grade headsets have high-quality microphones, and you can move around without going out of range of your mic. They’re also less sensitive to noise. Most broadcast-quality headsets are shipped without connectors, so unless you’re good with a soldering iron, buy a headset that already comes with connectors for the microphone and headphones. Whatever you do, don’t use a conventional computer headset. The microphone quality is generally poor.

  • Good audio editing software is essential. There are lots of choices, from Audacity, a free editor that runs on Mac, Windows and Linux, to for-fee software such as Adobe Audition, Digidesign Pro Tools, Mackie Tracktion, Steinberg Cubase and Sony Sound Forge. They all work well and, if you know what you’re doing, can be used for creating superb podcasts. All of these products come with filters and plug-ins for adjusting the sound of your recording, reducing noise, etc. Take the time to play with whatever package you select before you record your first podcast. A little training time at the start will pay big dividends later on.

  • Save your podcast as an MP3. All podcasts have to be sent to listeners in a compressed audio format in order to shorten download time and save hard disk space. Every podcast player that I know of supports MP3 files, and every audio editing package can easily create them. AAC, the audio format favored by Apple, is also often used, but not all players can handle it, so unless you specifically want to tailor your podcast to the Apple audience, use MP3. The compression bitrate that you use is critical in determining the sound quality of your podcast. A podcast that’s been compressed too much can sound like a bad telephone call. The good news is that voice can be compressed much more than music. Typical bitrates run from 48 kbps to 128 kbps; the lower the bitrate, the smaller the file. My suggestion is to experiment until you find the lowest bitrate that doesn’t sacrifice your audio quality. If you’re not sure, err on the side of using a higher bitrate.
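The bitrate-to-file-size math is simple: size in bytes is roughly bitrate (bits per second) times duration (seconds) divided by 8. Here’s a small sketch of that arithmetic (the 30-minute show length is just an example):

```python
# Estimate MP3 file sizes for a podcast at common constant bitrates.
# File size (bytes) ~= bitrate (bits/sec) * duration (sec) / 8.

def mp3_size_mb(bitrate_kbps, minutes):
    """Approximate file size in megabytes for a constant-bitrate MP3."""
    bits = bitrate_kbps * 1000 * minutes * 60
    return bits / 8 / 1_000_000

for kbps in (48, 64, 96, 128):
    print(f"{kbps:>3} kbps, 30-minute show: {mp3_size_mb(kbps, 30):.1f} MB")
```

A half-hour show comes out to about 10.8 MB at 48 kbps versus 28.8 MB at 128 kbps, which is why finding the lowest acceptable bitrate matters so much for your listeners’ download times.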

  • Learn microphone technique. Using a microphone is a skill that’s easy to learn and is essential for a good-sounding podcast. Use your normal speaking voice, as if you’re having a conversation. Stay close to the mic (two to four inches is usually best), and speak directly into it, not from the top or sides. If you’re recording directly into an audio editing program, watch the sound meters and keep the volume of your voice below 0 dB (the red zone on the meters) and well above the bottom of the meter. If you’re too loud, you’ll cause distortion, or clipping. If you speak too softly, you’ll have to amplify your voice in the audio editor. Any noise in the original recording will be amplified along with your voice.
Those tips should get you started. Good luck!

Podcast Perfecto, Part 1

I’m planning to launch a podcast for the Feldman File early in 2006. In the process of getting ready, I’m doing all the “due diligence” research needed to make a podcast work, and I’d like to share some of what I’ve found with you. Part 1 of this article will deal with content, while part 2 will cover technical tips.

In no particular order:

  • Pick a subject for your podcast, and stay on it. Just as in radio, where every station has a format (news, talk, oldies, progressive rock, etc.) that stays the same from day to day, you should establish a subject for your podcast that listeners can rely on week after week. If you do a podcast on photography, don’t talk about politics (unless it relates directly to photography). Wandering off topic will confuse and frustrate your listeners.

  • Bring in more than one speaker. I recently listened to a podcast by an industry analyst, and all he did was read an article that he wrote (including headings). There’s nothing as boring as listening to a single person drone on about a topic. Again, consider radio: Howard Stern has a studio full of on-air staff and guests, and Rush Limbaugh takes phone calls. Bring in a co-host, have on-air guests, do interviews or take phone calls.

  • Entertain your audience: Make it fun for people to listen to your podcast. Even if you’re covering a very dry topic, find a way to lighten the mood. Have a sense of humor.

  • Inject some controversy. I resisted this concept for a long time, but controversy is entertaining. People like to listen to arguments. You can also take the Howard Stern approach of being outrageous. Don’t artificially create controversy; choose some topics where there’s a genuine difference of opinion.

  • Keep it short. Make the length of the show fit the material; don’t pad it out to make it 30 or 60 minutes long. With podcasts, there’s no “right” length.

  • Include music and/or sound effects. It’s essential to draw listeners in from the very start of your podcast, and opening music is a great way to do it. You can have music in other places in your podcast (to separate stories or topics, for example). However, be very careful about the music that you use. Don’t use music from a commercially-recorded CD unless you have performance rights to the song. You can acquire royalty-free music from many suppliers, or you can purchase software such as SmartSound’s Sonicfire Pro to quickly compose your own without requiring any musical knowledge. If you have any question about whether or not you can use a musical selection in your podcast, consult an attorney.

  • Find advertisers that fit your subject. When I produced a comedy talk show several years ago, one of the “promotional consideration” sponsors was a vodka. It didn’t fit the show. The broader your subject, the greater the variety of advertisers you can choose from. For example, a sports show could have beer and car ads, but a show on knitting should have advertisers that make or sell products related to knitting.
That’s it for content. Stay tuned for Part 2, technical tips and tricks for improving your podcasts.

Tuesday, December 20, 2005

A Return to “Walled Gardens”?

This week's Business Week draws attention to an issue that's been hot in the Internet community for the last month. Currently, companies like eBay, Google and Yahoo pay a lot of money to telecom companies for local loop connections to their networks, and from there to the Internet backbone. However, some of the major telephone and cable companies (led by AT&T's Edward Whitacre) want the content and service providers to pay for access to their residential and business Internet subscribers. Specifically, this is what’s being considered:

  • Using Quality of Service (QoS) to give paying content providers more bandwidth and better response times than providers that don’t pay, and blocking some websites and services altogether.

  • Limiting or barring competitive VoIP vendors from accessing their networks.

The result could be a return to the “walled garden” model pursued by AOL, CompuServe, Delphi, etc. in the years before the Internet. SBC, for example, could give priority to its partner Yahoo, and make competitive services like Google and MSN much slower and harder to reach. It could also decide to block all usage of file-sharing services because they use too much bandwidth.

None of this is new, at least to corporations that have limited employees’ access to certain websites and services for years. Companies regularly block access to porn and shopping sites, and they forbid the use of file-sharing software and services such as Skype. However, these controls are only imposed within these companies’ firewalls. Now, the major telecom companies want the ability to impose these same kinds of limitations on their subscribers.

The risks of this scenario might explain some moves by Google over the last year. At the beginning of 2005, rumors surfaced about a plan by Google to purchase large amounts of “dark fiber”. There was some speculation that Google was planning to become a “super-ISP”, but that talk quickly died down as the press moved on to other stories. Then, in late summer, Google pursued providing citywide WiFi service to San Francisco and Mountain View, CA, where Google is headquartered. The San Francisco franchise is still being debated, but Google won the contract in Mountain View.

Now, why would Google need nationwide dark fiber? It makes sense for the company to run its own private network in order to interconnect its server sites. But why does the company want to get into WiFi? I believe that Google’s dark fiber and WiFi initiatives are closely linked. In the event that the major telecom companies try to force Google to pay for access to their networks, Google can “light up” its own competitive ISP service in major metropolitan areas. After all, there’s no reason why Google can’t overbuild wireless data services in the same areas as the major telecom companies. With WiMAX service and mesh networking, Google could cover an entire city quickly, and at a comparatively low cost.

Of course, Google could use QoS to give priority to its own content and services, but there’s no reason why other content providers couldn’t launch their own ISP services. It’s even reasonable to assume that two or three major content providers could get together and finance a competitive ISP service, guaranteeing all of them equal access to subscribers.

The fight for open access to the Internet is just beginning. Stay tuned—you may see a lot more competitors soon.

Thursday, December 15, 2005

Telephone in the 21st Century

There’s an interesting article in this week’s Television Week magazine about SureWest, a local telephone company based in Roseville, CA, just outside Sacramento. The company is offering its subscribers IPTV with 260 channels, and is about to launch 17 HD channels. SureWest has 85,000 telephone subscribers, of which 20% (approximately 17,000) are taking the video service. SureWest is offering a package of telephone service, DSL and video for $110/month, about 15% less than AT&T/SBC and Comcast.

What’s interesting about SureWest is not only that it’s offering IPTV video in the first place, but that it’s got 17,000 subscribers. AT&T has only a few dozen users of its U-Verse service in a test market, and Verizon has a few thousand users of its FiOS service right now, although it predicts that it will have 400,000 subscribers by the end of 2006. (Verizon is also far ahead of AT&T in development and deployment.)

SureWest’s service is highly competitive with that of Comcast, although Comcast boasts that it has 2,200 hours of video-on-demand (VOD) content available, whereas SureWest has only 400 hours. However, the size of SureWest’s library is only a temporary disadvantage as more content providers offer VOD programming. SureWest also beats DirecTV and Echostar; those companies offer less expensive video packages, but their services are inherently one-way, so they can’t offer much in the way of data services or any real-time VOD.

SureWest’s engineering approach is dramatically different than that of the traditional cable operators, which typically purchase both set-top boxes and headend (the cable operator’s central distribution site) equipment from the same company. If they’ve got a Motorola headend, they use Motorola STBs, and the same goes for Scientific-Atlanta. By comparison, SureWest has gone for a “best of breed” strategy, where it’s installed what it thinks are the best products in each category. For example, SureWest uses IPTV STBs from a UK-based company called Amino, which I’d never heard of before I read the TV Week article. Amino has a range of STBs with a variety of features and price points, as well as a PVR, all of which are about the size of a paperback book. SureWest also uses equipment from BigBand Networks, Irdeto, Kasenna, Minerva and Cisco in its network and headend. It’s not locked into any single vendor or technical approach, because all of its vendors’ products conform to industry standards.

AT&T and Verizon have to eventually support millions of subscribers, so they’re going to get most of their technology from a single supplier. Installing and supporting a nationwide network of “best of breed” equipment would be a nightmare for them. However, that constraint gives smaller telcos like SureWest, as well as cable overbuilders like Grande and RCN, the ability to add services and respond to market demand faster than the behemoths.

In short, if you want to see the future of video distribution today, take a look at SureWest. It’s already played host to over 100 companies that want to learn about deploying IPTV. I suspect that quite a few companies will be going there for an education.

EETimes.com - Alliance hopes to simplify HDTV networks

Yesterday, the High-Definition Audio-Video Network Alliance (that name just rolls off the tongue, doesn't it?), also known as HANA, announced its formation and its plans to connect TVs, DVRs, set-top boxes, media centers, etc. via IEEE 1394, otherwise known as Firewire. Members of the group include JVC, Mitsubishi, Samsung, Sun Microsystems, Charter Communications and NBC Universal. (Notice that Sony, Panasonic, Philips, Toshiba and Apple aren't members.)

The establishment of yet another industry consortium isn't news--there seems to be at least one or two announced every week. What's interesting about this consortium is that Samsung appears to be leading it. The president of HANA is Heemin Kwon, who's the executive VP and general manager of Samsung's Digital Solution Center. To the best of my knowledge, this is the first time that Samsung has been the lead partner in an international standards consortium. The leadership role usually falls to a company such as Sony, Panasonic or Toshiba, but Samsung is now clearly in position to be taken just as seriously as the "incumbent" industry leaders.

It'll be very interesting to see if Samsung can convince any of the big Japanese "alliance builders" to join HANA, and what compromises (including sharing of power) they'll demand in order to gain their cooperation.

Read more at www.eet.com/news/semi/s...

Wednesday, December 14, 2005

January Action

There are lots of new product announcements expected next month. Here’s a quick look:

  • Intel has confirmed that it will release its Yonah dual-core processor for notebooks in early January. Yonah will combine significantly better performance than today’s Pentium M processor with lower power consumption. The plan is for manufacturers to begin shipping Yonah-based products at the same time that Intel makes its announcement. You can expect to see both Yonah-based notebooks and small form-factor desktop PCs.

  • While Intel hasn’t confirmed the date, it will announce the first products based on its VIIV home entertainment platform in January. The date bandied about is just before both the Consumer Electronics Show and Macworld. There should be several VIIV-based home media center prototypes and early production models at CES.

  • As usual, Apple remains mum on what (if anything) the company plans to introduce at Macworld, but Intel’s timetable lends credence to rumors of the announcement of new iBook models. Yonah will enable Apple to make faster iBooks that use less power. However, Apple can’t risk making the PowerBooks obsolete by selling iBooks with higher performance, so we’re likely to see either iBooks with slower processors, or PowerBooks announced at the same time with higher-speed Yonah chips.

  • The VIIV and Yonah launches also make Apple’s introduction of a Mac Mini-based media center more possible. However, even if this kind of product is announced, Apple won’t necessarily use the VIIV platform. Intel has been very circumspect as to whether or not Apple has committed to adopt VIIV, and Intel claims that VIIV will only support Microsoft’s Windows Media Center software.

None of this includes the dozens, if not hundreds, of new consumer electronics products and services that will be shown for the first time at CES. Personally, I expect to see a bunch of new satellite radio receivers, as that business really begins to take off. (Sirius showed a mobile video service at last year’s CES conference, and XM came back shortly afterward and said “We can do it too,” so I expect to see more developed versions of video over satellite radio at CES.)

By the end of January, we may have a very different playing field and set of competitors than we have today. It may be cold, but it’s going to be a very interesting month.

Sunday, December 11, 2005

The Role of Marketing

I’ve spent most of my career in technology marketing: Product management, product marketing and market communications. I’ve launched dozens of products, run departments and worked with engineering teams over the last 25 years. And yet, I’ve always felt that I had a “Kick Me” sign on my back, because in high tech, marketers are second-class citizens.

Years ago, when I worked at HP, I joined the company as a product manager. I spent a year in marketing before I got so frustrated about my lack of influence that I transferred into the lab. As an engineer, I had far more say over how products turned out than when I was in marketing. This pattern has been repeated again and again.

I was prompted to write this entry by the cover article in Business Week a few weeks ago, one of the many articles that have been published lately about Google. Toward the end of the article, the authors discuss the “pecking order” at the company: Engineers come first, followed by product management (which is closely aligned with engineering at Google), while sales and marketing are at the bottom.

I don’t mean to pick on Google, because this pattern is repeated at most high tech companies. Don’t believe me? Try this simple test: Go to the websites of several high tech companies. Click on “Company” or “About Us,” and then select “Management.” Look at the management roster, and see who’s at the bottom of the list. Most times, it’ll be marketing and sales.

High tech companies are famous for confusing sales and marketing. Sales generates revenues for companies, while marketing supports sales, figures out what products are needed for future growth and profitability, and does whatever it can to support customers so that they buy again and again. By its nature, sales has a short-term focus; the emphasis in sales organizations is to “make the numbers” for the current month, quarter or year. Marketing has to have a longer-term focus; it must figure out what products and services are needed in the next months and years, and how to communicate the benefits of those products and services to the salesforce, customers, investors and the press.

One of the most common situations in high tech is the company that has no “second act.” This is a company that launched a successful product or service, but couldn’t follow up with an equally successful second product. Silicon Valley is littered with the corpses of companies that fell into this trap. It’s the job of marketing to figure out what that second act should be, by talking with current and potential customers, talking with engineering for their concepts and considerations, and analyzing the competition. That doesn’t mean that engineering shouldn’t be developing new products and technology independent from marketing, because that’s where most of the true discontinuous innovations come from. However, engineering, marketing, sales and finance have to tightly coordinate their actions, or you run the risk of “getting on your horse and riding off in all directions.”

I would love to see high tech companies take the discipline of marketing as seriously as they take engineering, but I’m not likely to see it in my lifetime. However, there is one branch of high tech that does take marketing more seriously (because it has to), and that’s consumer products and services. Engineering-driven products that are difficult to use or don’t fill the needs or wants of a reasonably large consumer audience will die a horrible death.

A good example is Sony’s past digital audio players. For a number of years, Sony went its own way, using its proprietary ATRAC format to encode and protect songs, and a program called SonicStage to manage the PC-to-player connection. Even though virtually every portable player could handle MP3s, Sony required that MP3s be converted to ATRAC by SonicStage before they could be copied to its players. Users of other formats with DRM (such as Windows Media and AAC with Apple’s FairPlay) were completely out of luck.

The result was that sales of Sony’s players were minuscule, while Apple’s grew until the company reached its current point of market domination with iPods and iTunes. It’s only recently that Sony finally acquiesced to the market and began supporting other formats. If Sony had bothered to listen to consumers when it was planning its original players instead of insisting on ATRAC because it was a “better” format, it might dominate the market now and Apple would be an also-ran.

It’s essential that companies know who their customers are, and talk to them. It doesn’t matter whether a company sells to businesses or consumers—the basic principles are the same. Products and services have to meet customer needs in order to sell. If it’s a truly innovative product, customers may not know that they need it, but it’s marketing’s job to identify the needs and wants that the product fulfills and communicate them to the salesforce and customers.

Marketing, especially for companies that want to survive past their first product, is as essential as engineering. I firmly believe that the survival rate of new ventures will dramatically increase as more companies take marketing seriously. Will that ever happen? We’ll see.

Animation: It’s the Script That Counts

There’s an interesting article in this month’s Fast Company (subscription required) that discusses how small companies are getting into the computer animation business, using off-the-shelf software and workstations to create films at a fraction of the cost of those from Pixar and Dreamworks Animation. However, hardly mentioned, if at all, is the importance of scripts in the success of animated films.

Over the past few years, along with the mega-hits The Incredibles and Shrek 2, we also got films such as Valiant and Robots, both of which disappeared from theaters within a few weeks of their releases. All four movies were well-animated; in fact, Robots came from the same studio that produced the hit Ice Age. However, almost every review of the films came down to the scripts: Brad Bird’s script for The Incredibles was nominated for Best Original Screenplay at last year’s Oscars, while Valiant and Robots were panned by critics for their scripts, even though Robots was praised for its animation.

Perhaps a better example is The Polar Express. The characters had almost creepy features, but the story was strong enough to make the film one of the biggest hits of last Christmas.

The impact of scripts doesn’t just apply to fully-animated movies. Star Wars Episodes I and II, Hulk and Van Helsing were all criticized, not for the quality of their animation but rather for their scripts. While the Star Wars movies still did very well at the box office, their performance (especially Episode II) was below industry expectations. Hulk and Van Helsing both performed much worse than their pre-release estimates. (In fact, many audience members were text messaging their friends while they were watching Hulk, telling them not to bother seeing the movie.)

My point is that while the animation is very important, the story and script are far more important. By focusing on the technology rather than the writing, it’s easy to draw the wrong conclusions. Yes, animation technology has gotten much cheaper, and excellent animators can be found worldwide, but they’re only as good as the scripts they’re given.

Friday, December 09, 2005

Where’s Google Going?

It seems that Google has been on the cover of just about every business magazine in the last two months. With a share price over $400, a lot of people are interested in the company. However, Google’s strategic direction is about as clear as mud. The company says that its goal is to index and make available all the knowledge in the world, which is laudable but has already gotten it into lots of trouble with book publishers, who really don’t want their “knowledge” to be given away.

Google now has eighteen different search-related services plus Blogger, Google Analytics, Picasa, Google Talk, Google Earth, Gmail, Google Desktop and Google Toolbar. Google Labs is testing twelve more services. The company also sells a search appliance for large organizations. And let’s not forget Google Adwords, the financial engine that drives the company. (Google is even experimenting with selling print advertising space to its Adwords advertisers.)

Most of Google’s services fit well with its stated goal, but services like Blogger, Google Analytics, Google Talk and even Gmail are a stretch. Further, it appears that search and Blogger generate the vast majority of Google’s revenue. The other services and software are either being given away or (in the case of the search appliance and the pro version of Google Earth) contribute a minuscule share of revenues.

The situation at Google reminds me of Netscape in many ways (although Netscape had a far more coherent product strategy.) Netscape had a full line of servers and a suite of development tools along with its ubiquitous browser, but the vast majority of the company’s revenues and profits came from sales of the browser to commercial customers. That’s why Microsoft gave away Internet Explorer and rushed to bring it to parity with Netscape Navigator. Microsoft saw that Netscape was totally dependent on its browser for revenues; cut off those revenues, and Netscape would be crippled, which is exactly what happened.

Similarly, Google is almost totally dependent on Adwords for its revenues and profits. Microsoft is rumored to be negotiating with AOL to create a joint keyword advertising network to compete with Adwords. Would Microsoft offer its affiliates a much better percentage of advertising revenues than Google? More than likely; in fact, Microsoft might even give its affiliates 100% of its advertising revenues from their sites. Would that cut off Google’s oxygen supply? Not necessarily, but it would make Google much more dependent on advertising revenues from its own site.

Google has to start building its non-advertising revenue base, and the sooner the better. It needs to look at more effective ways to generate revenue from its services and software, and it also needs to put more emphasis on getting its products right than on constantly pushing new products into the market.  It’s easy to drive adoption when you give everything away for free. It’s much more difficult to create services and software that customers will pay for. Google’s challenge is to raise its game to the next level.

HD is in the Eye of the Beholder

Scientific-Atlanta recently released the results of a telephone survey that found that 49% of owners of HD-capable televisions or monitors aren’t using them to watch HD programming because they don’t have a cable or satellite set-top box, CableCARD or antenna capable of receiving HD. (A study by Forrester Research confirms this number; in fact, they say that by the end of the year, less than 44% of HD-capable sets will be used to watch HD programming.) What’s even more interesting is that 41% of owners of HD-capable sets think that they’re already watching HD when they’re not.

There’s undoubtedly a variety of reasons why these viewers think that they’re watching HD when they’re not. Some viewers thought that they could watch HD without any additional equipment, while others thought that they were automatically watching HD when they saw the “Broadcast in HDTV” bugs at the beginning of some TV shows and sporting events. My personal belief is that these viewers have never actually seen an HD show in HD, so they have no standard of comparison. It may also indicate that some of them simply can’t tell the difference between HD and conventional programming.

In 2004, I attended a DVD industry conference where Blu-Ray and HD DVD advocates pitched their formats to senior industry execs. (The heads of all the major studios’ home video divisions were in attendance.) The Blu-Ray team showed a scene from Lawrence of Arabia in which one side of the screen was in HD, and the other was in standard definition. The difference was so subtle as to be almost invisible. (And I’m hardly an HD newbie; I saw my first HD pictures in Japan in 1989.) The same thing happened at the Consumer Electronics Show last year; I saw Blu-Ray movies that were virtually indistinguishable from the DVD versions.

To give the Blu-Ray forces the benefit of the doubt, the most recent demo was almost a year ago, so both the Blu-Ray and HD DVD teams have probably made improvements since then. However, if a significant percentage of the audience either can’t see or doesn’t know the difference between HD and SD, will they bother to buy an HD blue-laser player? DVDs look a lot better than conventional SD broadcasts, so the difference between DVD and blue-laser discs is likely to be much more subtle than the difference between HD and SD broadcasts. Even in an in-store comparison (which is where most consumers will see HD discs,) will there be enough of a difference to get them to trade up?

I haven’t seen any results of independent side-by-side consumer comparisons between blue-laser discs and DVDs, but my suspicion is that they’re not going to be very favorable to the HD disc crowd. Could this turn into a repeat of the DVD Audio vs. Super Audio CD (SACD) battle, where consumers didn’t buy either one? Stay tuned.

Wednesday, December 07, 2005

The DRM Pandemic

Unless you’ve spent the last few weeks housesitting Saddam Hussein’s “spider hole,” you’ve heard about Sony’s software that silently installed itself on listeners’ PCs in an attempt to keep its CDs from being ripped. Instead, the listeners got ripped, thanks to a just plain ugly job of programming that opened a massive security hole in their computers.

I bring up this issue because last night, I tried to play some music stored on my PC that’s protected with Microsoft’s Windows Media Digital Rights Management (DRM) system. I purchased the songs months ago from Napster’s (legal) music store. I’ve got one copy on my PC and one on my music player, so I’m not even close to the limit on the number of copies of each song that I’m allowed to make by the DRM.

When I tried to play one of the Napster-sourced songs on my PC, instead of music I got a dialog box that said “License Acquisition Project page to upgrade to Premium.” After being unable to translate that message with my Captain Midnight Decoder Ring, I thought that the problem might be with Napster, so I launched it only to be told that I had to upgrade to a new version. After downloading and installing the upgrade, I tried to play a couple of the songs through Napster, only to learn that I no longer had a valid license for the music, Napster no longer had the right to sell the song, or both.

This is the second time in the last two months that I’ve had a similar problem. The first time, I got an error message from Microsoft’s Windows Media Player that said that my licenses were corrupt and had to be replaced with a backup (which of course I didn’t have.) Napster walked me through getting around that problem, and I’m now waiting for them to reply to my customer service email and help me fix the latest problem.

I’m a fairly sophisticated user, yet I’ve lost control of content that I purchased months ago and I need help from tech support. If this is a problem for me, it’s going to be at least an order of magnitude bigger problem for most consumers. The only reason why the DRM piñata hasn’t spewed over the entire media and consumer electronics landscape is that there still aren’t a lot of actual DRM consumer users…but that’s about to change.

Today’s DVD players use a security system called CSS (Content Scrambling System,) which encrypts the content on manufactured DVDs. The problem is that once an enterprising programmer named Jon Lech Johansen figured out how to derive the encryption keys for each movie, anyone with a personal computer and DVD drive could remove the encryption and make their own perfect copies. Because CSS is hard-wired into all DVD players, and because every new disc has to play on the entire installed base of hundreds of millions of DVD players, the movie studios are stuck.

Enter Blu-Ray. This almost-here, super-duper high definition replacement for DVDs uses three security systems: Advanced Access Content System (AACS,) the next-generation version of CSS; BD+, which enables content providers to update the DRM on already-installed Blu-Ray players; and ROM-Mark, a defense against bootlegging. AACS allows content providers to limit access to their content through an Internet connection to the viewer’s player. For example, a vendor could specify that a disc can only be watched for a maximum of 72 hours following the first time that it’s played. BD+ enables content providers to completely disable a Blu-Ray player’s ability to play discs in the event that the current DRM system is compromised; consumers would then have to install a DRM upgrade before they could play any more discs. Sounds great! What could go wrong?

Well, let’s see…you get a disc from Netflix in the mail, and when you put it into the player, the disc tries to “call home” over the Internet. What if your Internet connection is down or the content provider’s server has crapped out—will the disc play or not? What if a BD+ “upgrade” that’s hidden on the Blu-Ray disc you just bought manages to trash your player’s ability to play anything? How will that get fixed? Or what if someone who’s less than scrupulous manages to distribute a BD+ upgrade that will only allow certain movies to play?
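To make the 72-hour example concrete, here’s a toy sketch of that kind of policy check. This is purely illustrative—the window length comes from the example above, and nothing here reflects the actual AACS protocol—but it shows why the check is fragile: in a real player, playback is gated on this test, so anything that confuses the player about time (or blocks the “call home”) leaves the viewer stuck.

```python
from datetime import datetime, timedelta

# Hypothetical policy from the example above, NOT the real AACS protocol.
PLAYBACK_WINDOW = timedelta(hours=72)

def may_play(first_played_at, now):
    """Return True if the disc is still inside its viewing window.
    first_played_at is None until the disc has been played once."""
    if first_played_at is None:
        return True  # first play is always allowed; the player records the time
    return now - first_played_at <= PLAYBACK_WINDOW

start = datetime(2005, 12, 1, 20, 0)
print(may_play(None, start))                         # True: first play
print(may_play(start, start + timedelta(hours=71)))  # True: inside the window
print(may_play(start, start + timedelta(hours=73)))  # False: window expired
```

Note that everything hinges on the player knowing `first_played_at` and `now` reliably—which is exactly what a dead Internet connection or a crashed license server takes away.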

According to the Digital Entertainment Group (DEG,) more than 147 million DVD players and more than 5 billion DVDs have been sold to customers in the U.S. since 1997. A hot DVD can sell several million copies in its first week. If something goes wrong with the AACS or BD+ code for an equivalent Blu-Ray disc, the carnage will be massive. If the disc won’t play, are video retailers and rentailers going to be willing (or able) to support customers? If the players themselves are damaged, who’s responsible for fixing the problem—the content provider or the player manufacturer?

DRM, by its very nature, is an accident waiting to happen. DRM systems create a game of cat & mouse between DRM developers and hackers: A new DRM system is released, hackers figure out how to break it, the DRM vendor works around the hack, the hackers work around the fix, and so on. Sooner or later, something is going to go wrong with either a hack or a fix, and when it does, lots of people are going to hear about it. One guy who got ripped off by an Internet camera store posted the story on digg and turned the name PriceRitePhoto into a synonym for sleaze. Imagine what will happen when a few million people find out they can’t play that shiny new Blu-Ray disc they just bought.

Content providers, consumer electronics and computer companies have to start looking beyond DRM to come up with better market-based ways to limit piracy. The current approach is an arms race: if version 1 of a DRM system doesn’t stop piracy, release a more powerful (and more onerous) version 2; repeat indefinitely. However, there’s always a point past which it doesn’t make sense to protect content. We could put bank vault doors on our homes and apartments to make them burglar-proof, but they’d be massive overkill. Instead, we live with wood or metal doors with cylinder locks and deadbolts, even though we know that they won’t stop a determined burglar.

There is no practical DRM system that will stop a determined pirate. Sooner or later, someone will find a way around the DRM. Rather than engineering DRM systems for “worst-case scenarios,” we need legal, pricing, packaging and distribution strategies, along with content protection, that discourage pirating and encourage purchase of legitimate copies. The sooner we focus on the carrot rather than the stick, the better.

Tuesday, December 06, 2005

The Annual Apple Rumor Mill

The rumor mill is running full blast with speculation about what (if anything) Apple will announce next month at Macworld. There are three core questions being asked:

  1. Will Apple announce any Intel-based products?

  2. If so, what will they be?

  3. Will Apple announce any further PowerPC-based hardware?

Here’s my uninformed speculation to go along with everyone else’s:

  1. Apple will announce Intel-based products at Macworld. With Intel apparently making better progress than expected in getting new processor designs out, Apple’s in an excellent position to announce Intel-based products in January, for release in the subsequent 60 to 90 days.

  2. I really have no idea what they’ll be, except that I don’t expect to see a G5-replacement tower-style computer at the show. Apple’s biggest win would be to take advantage of the low-power Pentium M/Core processors for its notebooks now. I’d also like to see the rumored Intel Mac Mini home media center running Front Row 2.0, but that’s primarily wishful thinking on my part.

  3. No. Once Apple introduces its first Intel-based systems, no further PowerPC-based products will be released. From what I’ve seen publicly from Apple, the company is pulling out all the stops to get its third-party developers to port their software over to Intel ASAP. Apple has to be concerned about being Osborned (the act of killing sales of your existing products by preannouncing future products.) Once it starts shipping Intel-based products, existing PowerPC-based products will appear obsolete. Apple controls the timing of the start of the transition to Intel, but once it starts, it will have to move everything over quickly. I wouldn’t be surprised to see the first Intel-based products at Macworld in January 2006 and the entire computer line transitioned over to Intel by Macworld in January 2007.

Saturday, December 03, 2005

Simpler Set-Tops, Later Date?

Last week, the National Cable & Telecommunications Association (NCTA,) the United States’ official cable television trade organization, submitted a report to the FCC concerning conditional access (the ability to give or deny cable users access to various services and programming tiers.) Today, cable operators purchase set-top boxes that are compatible with the conditional access system that they’ve implemented (usually from Motorola, Scientific-Atlanta or NDS.) Boxes built for one system won’t work on another.

To enable consumers to have a choice of suppliers for set-top boxes, the FCC ruled that cable operators had to separate their conditional access systems from the set-top boxes themselves. The result was CableCARD, a device that provides conditional access for specially-equipped set-top boxes, television sets, DVRs and other products. The current version of CableCARD (1.0) is one-way only, so it doesn’t support video-on-demand, pay-per-view, interactive program guide or DVR functionality. CableCARD 2.0, which will be two-way and will support all of the above features, is due some time in 2006.

Last week’s report from the NCTA discusses another way to enable conditional access, called the Downloadable Conditional Access System, or DCAS. With DCAS, when a compatible device (set-top box, receiver, DVR, etc.) is attached to a cable system, it downloads the necessary software to support that cable operator’s conditional access system. No CableCARD or other hardware is needed.

DCAS is the access control system that pretty much all parties have wanted from the beginning: It doesn’t require cable operators to either toss out their existing set-top boxes or lease CableCARDs to their subscribers, it fully supports two-way operation, and consumers should be able to add compatible equipment to their cable connections without requiring the cable company to send out an installer.

According to the NCTA’s report, DCAS should be deployed nationally by July 2008 (a date that I think is wildly optimistic.) The cable industry is currently operating under FCC rules that require it to soon replace all of its existing integrated set-top boxes with boxes that use CableCARDs for conditional access. With the NCTA’s report, the cable industry is essentially saying that CableCARD will be obsolete by July 1, 2008. The obvious argument that follows is that it makes no sense for cable operators to replace all the existing set-top boxes with models that use CableCARD, only to replace them again two years later.

The not-so-subtle intent behind the NCTA’s report is to get the FCC to back off on its demand that cable operators adopt CableCARD for their own set-top boxes. For consumers, the big question is whether or not the FCC will still require the cable industry to deliver a two-way version of CableCARD. Assuming that CableCARD 2.0 could be delivered to consumers by the end of 2006, that’s at least two years sooner than DCAS. If the cable industry is allowed to skip CableCARD 2.0, companies such as TiVo, which has been planning a CableCARD 2.0-based DVR for some time, will be locked out of the market until late 2008 at the earliest.

The DCAS vs. CableCARD situation is one that bears watching, especially to see how the consumer electronics industry responds to the NCTA’s report and proposed timetable. We may be further away from fully cable-compatible set-top boxes at your local retailer than it looked just a few weeks ago.

Friday, December 02, 2005

Blu-Ray: Everything Old is New Again

According to this article, Sony has decided to release its initial Blu-Ray (high definition) discs using the same MPEG-2 compression that today’s DVDs use, even though the Blu-Ray specification requires all compatible players to support both MPEG-4 AVC and SMPTE VC-1 (Microsoft’s WMV9 HD) formats. Sony’s argument is that MPEG-2 encoders are used by every DVD authoring system on the planet, while AVC and VC-1 encoders are still both rare and expensive. The capacity of Blu-Ray discs (25-50GB) is more than sufficient to store a full-length HD movie in MPEG-2, so Sony will stick with MPEG-2 until they need to use one of the more space-efficient alternatives.
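Sony’s capacity argument checks out on the back of an envelope. The bitrate and running time below are my own assumed round numbers, not figures from Sony or the article, but they show why MPEG-2 fits:

```python
# Back-of-the-envelope check, using my own assumed numbers: does a
# full-length HD movie encoded in MPEG-2 fit on a Blu-Ray disc?
MPEG2_HD_MBPS = 25    # assumed average MPEG-2 HD bitrate (megabits/second)
MOVIE_MINUTES = 135   # assumed typical feature-film running time

# megabits -> megabytes (/8) -> gigabytes (/1000)
size_gb = MPEG2_HD_MBPS * MOVIE_MINUTES * 60 / 8 / 1000
print(round(size_gb, 1))  # ~25.3 GB: tight on a 25GB single layer, easy on 50GB
```

In other words, a dual-layer 50GB Blu-Ray disc holds an MPEG-2 feature with plenty of room to spare, which is exactly the headroom Sony is counting on.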

Sony’s move has come as a surprise to most of its partners in the Blu-Ray consortium, not to mention the DVD authoring houses that are gearing up to support Blu-Ray. If MPEG-2 is more than sufficient for the task, why did Sony bother to include AVC and VC-1 in the Blu-Ray standard in the first place? AVC and VC-1 will add cost and complexity to every Blu-Ray player made; without those codecs, players will be considerably less expensive.

One possible outcome of Sony’s decision is that it might actually reopen the door for HD discs based on red-laser DVD technology. If Sony is saying that it doesn’t think that consumers really want or need all the special features that would be possible using a next-generation codec, why can’t AVC- or VC-1-compressed HD films be put onto conventional DVDs? After all, that’s what Microsoft has been doing for two years now with its WMV HD DVDs. Sony’s choice also neutralizes the capacity advantage that Blu-Ray has over HD DVD, which will likewise support both AVC and VC-1: encode in a next-generation format for HD DVD and get all the new special features, or encode in MPEG-2 for Blu-Ray and be stuck with the same special features as today’s DVDs.

I suspect that Sony’s move was designed to eliminate one source of technical risk from Blu-Ray’s launch, but it sends a mixed and confusing message to Sony’s partners and the market in general. If MPEG-2 is “good enough,” then maybe the DVDs that we all know and love, with the right codecs, are good enough as well.

The Evolution of Mass

Back in the dot-com era, the gospel was that the age of mass media and mass marketing was coming to an end. The focus was shifting to one-to-one experiences. Fast forward to late 2005, and the real story is very different. Mass is no longer a dirty word—it’s essential for success. We’ve gone through two stages of the evolution of mass, and we’re now in the third. Let’s look at where we are and how we got there:

Stage One: Mass Aggregation

This is where things started post-World War II: The goal was to get as many people as possible in one place, so that you could sell them as much mass-produced stuff as possible. This is the model that Yahoo!, Excite, Ask Jeeves, etc. originally followed: Aggregate as many eyeballs as possible, keep them for as long as possible, and sell them to advertisers. This is also the foundation for Internet commerce: Amazon.com and eBay work only because so many consumers come to their sites to buy.

The big difference between mass aggregation up to the late 1990s and mass aggregation today is the composition of the mass. The old mass model discounted differences in the audience, and sold everyone who showed up the same things. If you watched “All in the Family,” advertisers assumed that you’d probably buy the same type of car and drink the same kind of beer as everyone else who watched the show.

Today’s mass audience is made up of a host of micro-audiences, each with its own interests and reasons for being there. Someone may go to Yahoo! to get driving directions to a meeting, investigate job opportunities, read the latest entertainment news or listen to music. It’s the aggregate of all these micro-audiences that makes Yahoo! so valuable to advertisers. Advertisers can pitch to everyone who comes to the site, everyone looking for directions in a certain city, or everyone who happens to be interested in model railroading.

Stage Two: Mass Participation

Yahoo! Groups. Amazon.com’s customer reviews. eBay’s feedback. All of these are examples of mass participation. Information and ideas come not just from one source, but from many. Participants share their experiences and insights with one another, which dramatically increases the value and credibility of the information. On a shopping site like Pricewatch, in addition to the vendor-supplied product specifications and prices, you’ll also find ratings and reviews contributed by other customers.  Just the number of reviews that a given vendor has provides valuable information; the more reviews, the more reliable the rating is likely to be. It brings an entirely new dimension to a purchase decision that would have previously been based primarily on price.
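The “more reviews, more reliable” intuition has a simple statistical basis: the error in an averaged rating shrinks with the square root of the number of reviews. This sketch is illustrative only—the spread value is an assumption, not data from any shopping site:

```python
import math

# Illustrative only: why more reviews make an average rating more trustworthy.
# The standard error of an average shrinks with the square root of the count.
RATING_STDEV = 1.0  # assumed spread of individual ratings on a 5-point scale

for n_reviews in (10, 100, 1000):
    error = RATING_STDEV / math.sqrt(n_reviews)
    print(n_reviews, round(error, 3))
```

Going from 10 reviews to 1,000 cuts the expected error in the average rating by a factor of ten, which is why a vendor with a handful of reviews deserves more skepticism than one with hundreds.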

The participants generally aren’t creating “freestanding” content. Their contributions are meaningful only in the context of the host site’s own content (or goods and services.) For example, eBay’s buyer and seller feedback only makes sense when considering whether or not to do business with someone on eBay itself. Comments in a Yahoo! group make sense only within the context of the other messages and comments in the group. The next stage, however, expands the impact of individual contributions from a single site to a much bigger arena.

Stage Three: Mass Empowerment

Create a blog, and you can discredit evidence from a broadcast news story or report first-hand about government ineptitude in the wake of Katrina. Put your PC up for sale on eBay, and you become a sophisticated electronic merchant serving potential customers anywhere in the world. This is mass empowerment.

The objective of mass empowerment is to enable individuals to compete with the giants of media and commerce on a level playing field. It’s democratic, egalitarian and highly chaotic. Individuals change their roles on the fly: Read a variety of blogs to figure out the zeitgeist of a particular topic, then write your own blog entry and link back to your sources. Buy some shop equipment on eBay, use it to build custom furniture and then sell the furniture on eBay. Both network effects and virtuous circles arise from mass empowerment.

eBay is the clear leader in commerce mass empowerment, although Amazon.com’s Marketplace is also popular with sellers of books, music, videos and similar goods. Google and Yahoo! are fighting for leadership in media mass empowerment, but AOL and MSN are also major players. Apple’s iTunes has become the leading distributor of podcasts, but Yahoo! is climbing fast. Google and Apple are leading in the early rounds of videocasts (or vodcasts,) but there’s plenty of opportunity for other aggregators to jump into the fray.

Where To Now?

The exciting thing about the Internet has always been the fact that even if future directions are obvious, how we get there usually isn’t. Internet search was considered a commodity before Google turned it into an incredible growth engine. Internet radio died a protracted death at the hands of the recording industry, only to be resurrected as podcasts, most of which are beyond the control of entrenched business and government interests. I believe that vodcasts are the next great frontier of mass empowerment: there’s enormous room for new players to jump in with production, editing and indexing software that will make creating vodcasts much easier and more productive.

The stage beyond mass empowerment may well be mass collaboration. Once there’s a critical mass of individual producers, the next step is to get individuals working together across the Internet. Wikis are an obvious first step in this process, but there are plenty of opportunities to enable more sophisticated peer-to-peer collaboration. For example, a group of individual eBay merchants could pool their expertise and inventories to create a virtual mass merchant with a single identity, even though the individual members are scattered all over the world. A group of individual bloggers could create a virtual newswire, either for a single event (e.g., an election or a conference) or on an ongoing basis. Vodcasters can parcel out the many tasks that go into creating video content to collaborators across the country and around the world.

No matter what the next stage turns out to be, the combination of mass and micro will drive the evolution of media and commerce.