I'm in the early days of working on an Internet services start-up, and one of the questions that I'm wrestling with is where to base my business. In the early days, it's very easy to move from one city to another; once my business has multiple employees, many with families, it'll be much more difficult (and more disruptive) to move. I haven't yet come to a decision, but my thinking may be of value to some of my readers who are also going through the process, or who may do so in the future.
Let's start with Chicago, since that's where I'm currently based. Specifically, I'm located in a far northwestern suburb of Chicago, not the city itself. As a practical matter, I'll have to move to Chicago or to one of the near suburbs, such as Evanston or Skokie, to find the talent I need. Chicago has just about everything a technical startup needs--excellent colleges (University of Chicago, Northwestern, Illinois Institute of Technology and the University of Illinois at Urbana-Champaign) for talent, a culture with plenty of night life and social activities to attract young people, and reasonably priced office space. It's also got some angel investors and venture capitalists, but by and large, they're significantly more risk-averse than investors in Silicon Valley. Investors in Chicago like to see a start-up making money before they invest, which defeats the whole purpose of seed funding: helping new businesses get to the point where they can generate revenue. If you have to be generating revenue before anyone will give you the money you need to generate revenue, you're not being offered seed money.
Another problem with the Chicago financing community is that there's a big hole between angel investors and when most local venture capitalists will participate. Even if you manage to get angel investment, it's extremely difficult for growing businesses to get the next round of funding. The amount needed is too high for the local angels, and the risk is too high for local institutional investors.
There are two other issues that make Chicago less than ideal for start-ups (I'll deal with weather in the section titled "Potential Natural Disasters"). The first one is political corruption. Chicago is the most corrupt major city in the U.S., and Illinois is one of the most corrupt states. Last night's "Daily Show" gave an amusing, but accurate, look at the situation; for now, consider that you're more likely to go to prison if you get elected Governor of Illinois than if you commit murder in Chicago. This corruption adds billions of dollars each year to the cost of government (which means higher taxes) and to the cost of running a business: those same higher taxes, plus payoffs, campaign contributions, requirements to use union labor when non-union labor is just as good and much less expensive (even with benefits), and other expenses.
The second issue is the risk-averse nature of the Chicago community. Just as investors look to minimize their risk by focusing on revenue-producing businesses, Chicagoans would much rather take a job with a start-up, with a guaranteed income and benefits, than start a business themselves. Taking the personal and financial risks involved with a start-up simply isn't ingrained in the local culture, as it is in Silicon Valley.
Potential Natural Disasters: This is where I'll discuss weather and similar problems. For Chicago, the issues are winter weather in general, with blizzards in particular, along with floods and tornadoes in the spring and summer.
Another option is Silicon Valley. The universities (Stanford, U.C. Berkeley and U.C. San Francisco, along with many smaller colleges), the existing base of trained developers, engineers, marketers, lawyers and financial experts, and deep financial resources in the form of angels, super-angels and venture capitalists, along with a local culture that supports and encourages start-ups, make Silicon Valley the best place in the U.S. to start a business. Some Chicagoans brag that there's more money in the Chicago Mercantile Exchange than in all of Silicon Valley, but if the people who are investing in the CME won't invest in start-ups because they don't fit their "risk profile", their money might as well not exist.
Silicon Valley's Achilles' heel is the cost of doing business. Everything in California costs more than it does just about anywhere else in the U.S. Taxes are higher, and housing costs more, as does medical care (and the insurance to cover that medical care). All of those costs mean that salaries in Silicon Valley are higher than just about anywhere else in the U.S., yet the standard of living isn't any better. So long as I can find qualified people in another location, I can cut my operating costs by 30% to 50% by basing the company somewhere other than Silicon Valley.
Potential Natural Disasters: Earthquakes, floods, mudslides, droughts and firestorms. Silicon Valley (and much of California) has earthquakes all the time; most are too small to be felt. However, everyone is waiting for "The Big One", an overdue magnitude 7.5+ earthquake on one of the major fault systems. Any big earthquake has the potential to disrupt transportation, electricity and natural gas supplies for days, weeks or even months. It also has the potential to take many lives. The one big positive is that California's building codes have been upgraded many times, and few places in the world are better prepared for earthquakes. In the coastal areas of California, including Silicon Valley, winter is the rainy season, and that's when the state gets floods and mudslides. Summer is when California gets firestorms, usually in the state or national forests.
Austin, Texas is a good alternative. Austin is the capital of Texas and the home of the University of Texas at Austin, and it houses Dell's and Freescale Semiconductor's headquarters as well as major facilities for AMD and Intel. It's also a little circle of liberalism inside a huge red conservative state. Austin has a young, creative population, as perhaps best illustrated by SXSW, the city's annual spring conference that attracts techies, filmmakers and fans from around the world. Austin is a moderately sized city, but it has technical and business resources that belie its population. It's also small enough that a start-up isn't likely to get lost or ignored. Finally, Austin's cost of living is a bit lower than Chicago's and much lower than Silicon Valley's.
Austin has a small angel and venture capital community, but it's not as developed as Chicago and nowhere near the size of Silicon Valley. It's also largely dependent on the University of Texas to provide an ongoing supply of young talent.
Potential Natural Disasters: Summer weather in general, droughts and tornadoes. An oppressively hot day in Chicago would be considered cool and comfortable in Austin, where daily maximum temperatures near or even over 100 degrees Fahrenheit are common. Texas is in the midst of a severe drought, and the state suffers a drought every few years; combined with the heat, this causes pretty miserable conditions.
I've also considered Portland, Oregon. I haven't been there for many years, but it's a beautiful city with a strong technical community (Intel, Tektronix, Agilent, Nike and others). Portland State University is based in the city, and both Oregon State and the University of Oregon are two hours away or less. It's also less than 150 miles to Seattle, where Amazon, Microsoft, Nintendo U.S. and many other technology companies are based. Portland's a young person's city, with lots of night life, easy access to the Pacific Ocean, and a deep commitment to the environment. Surprisingly, Portland's cost of living is almost identical to Chicago's.
Portland's angel and venture capital community is similar to Austin's, with one big difference: Portland's location--driving distance from Seattle and two hours from Silicon Valley by air--makes it possible for Portland start-ups to attract funding from both areas as well as from local investors.
Potential Natural Disasters: Volcanoes, earthquakes, tsunamis, floods. When I lived near Portland, Mt. St. Helens erupted (not the devastating eruption that blew off the side of the volcano). Both Mt. St. Helens and Mt. Hood are active volcanoes. Earthquakes occur along local faults and the offshore plate boundary. Portland is much less seismically active than Silicon Valley, but like California, Portland and the rest of the Pacific Northwest are awaiting their own Big One, which is likely to trigger tsunamis because the faults lie offshore, under the ocean. Perhaps the biggest weather problem with Portland is its almost continuous rain for eight months out of the year. Unlike the downpours other cities get, Portland's rain is more like a mist, and the sun comes out so rarely during the rainy season that weather forecasters predict "sun breaks". When the sun finally comes out and stays out, however, Portland is spectacular.
So, those are my options: Chicago, which has a lot going for it but also has some serious drawbacks; Silicon Valley, which has much more going for it but is an incredibly expensive place to live and work; Austin, which is a great city with an excellent cost of living but is brutally hot in the summer; and Portland, which is also a great city with an excellent cost of living but is rainy and overcast eight months of the year.
What are your thoughts? What other cities should I consider, and why?
Sunday, June 26, 2011
FCPX Part 3: The backfire
I didn't want to write another post about Apple's Final Cut Pro X debacle (I really, truly didn't), but I've reached my irritation limit. Let's be frank: Apple blew it, but not necessarily because of how FCPX turned out. Rather, it blew it by:
- Not fully communicating just how much of a departure FCPX is from FCP 7, and not educating users to see it as version 1.0 of an entirely new platform,
- Not anticipating how vehement user reaction would be to key missing features, such as the inability to import FCP 7 projects, the lack of multicam capabilities, and the absence of any facilities for moving audio and EDLs between FCPX and other applications, and
- Not keeping Final Cut Studio 3 available for sale while Apple and third parties worked to bring FCPX up to functional parity with FCP 7.
The enormous reaction, for a product that represents a minute fraction of Apple's revenues, suggests to me that there's something more at work here than simple customer dissatisfaction. For example, Adobe started looking for "Premiere Pro ambassadors" just prior to the launch of FCPX. Call me paranoid, but I have to suspect that Apple's competitors are encouraging the firestorm, even to the point of offering talking points to bloggers and tweeters. I have no evidence that this is happening, but the volume of posts and tweets, and their similarity, sound very much like the work of a competitive response team. (I've run those teams, and I know how they work.) Throw in free "evaluation" copies of software with valid serial numbers, and you end up with a corps of people who have motivation to keep the pressure on.
The resellers who have been tweeting constantly since last Tuesday, trying to get FCP users to switch to Avid or Adobe, have a transparent reason for doing so: they can no longer make any money selling FCP. FCPX is sold only through the Mac App Store, so resellers and integrators can't make any money on it; they can still sell peripherals that work with FCPX, but not the software itself.
The FCPX release has stirred more negative reaction than Microsoft's decision not to support direct Windows XP upgrades to Windows 7. Remember that one? It affected, and continues to affect, millions of PC users--many times more than the entire FCP user base--but it never drew this level of vitriol.
So, I've stopped following the resellers that continue to tweet negative coverage of FCPX and exhort me to buy Avid or Adobe. When Apple gets this resolved, as I'm convinced they will, there's going to be a lot of people with egg on their faces. And, for the record, I've been compensated by no one for this (or any other) post, and I'm not writing from anybody's talking points other than my own. I just wish that a whole bunch of people would grow up.
Tuesday, June 21, 2011
Two more words about Final Cut Pro X: Don't Panic
Since Apple released Final Cut Pro X today, there's been a flood of tweets and blog posts offering instant analysis, most of it focused on missing features and capabilities, such as the inability to import previous FCP projects into FCPX. Some of the complaints are just plain wrong; for example, one person tweeted that FCPX can't use shared media, then tweeted a bit later that it can.
One thing that some commenters are getting right is that this is Final Cut Pro X Version 1, not Final Cut Pro Version 8. It's (apparently) an entirely new code base, with a dramatically different user interface, and different terminology for some of the same functions in previous versions. Version 1.0 of a new product always has bugs and missing features. It's a given with any software product, and FCPX is no different.
If you're already using Final Cut Pro 7 or Final Cut Studio, you should continue to use it for deadline-driven and revenue-producing jobs while you get up the learning curve on FCPX. Even if FCPX were a purely incremental release, I'd still be wary of switching to it immediately; new releases always have bugs. However, if you're an existing Final Cut Pro user and you can afford the $300, I'd suggest buying it so you can learn the software on your own schedule. This is the future of Final Cut Pro, and unless you want to eventually abandon the platform entirely, you'll need to understand and use it.
On the other hand, if you're a new user or you have questions or concerns, there's no risk in waiting. Those "instant" reviews are likely to be amended, corrected and superseded over the next few weeks. Let others dig into the software and give you their feedback.
When OS X Lion ships next month, the App Store will be integrated into the operating system, and updates will be much faster and easier than they are today. Only the parts of the code that need to change will be updated, not the entire application, so downloads and installation will be much faster. Over the next few weeks and months, Apple will release updates, and third parties will release utilities and drivers to correct or compensate for the more serious shortcomings in FCPX.
In short, don't panic. This isn't a race.
Final Cut Pro X: The war is on
Earlier today, Apple made Final Cut Pro X available in the Mac App Store for $299.99 (U.S.). Since NAB, there have been many questions about how Apple would make the other applications in the previous Final Cut Studio suite available, and the company answered them today: Motion and Compressor are each available as separate applications for $49.99. Final Cut Pro X includes much of the functionality from Soundtrack Pro, Color and DVD Studio Pro, so I assume that those applications have been discontinued.
Yesterday, I noticed a number of posts on Twitter saying that Adobe is looking for bloggers to become Premiere Pro "ambassadors". The reason is clear: Adobe has got to be very scared by Apple's new pricing. A full copy of Creative Suite 5.5 Production Premium costs $1,699. If you just want Premiere Pro, that's $799; After Effects is $999, and Audition is $349. In other words, if you want more than one of those applications, it's cheaper to buy the whole Production Premium suite. I would say that Adobe has to lower its prices dramatically to compete with Apple, but that goes against Adobe's corporate philosophy, which is to continually raise, not lower, prices.
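For perspective, here's a quick back-of-the-envelope comparison of the list prices cited above. This is just a rough sketch in Python using the 2011 U.S. prices mentioned in this post; the totals are approximate and ignore taxes and upgrade pricing.

```python
# Rough comparison: Apple's new a-la-carte pricing vs. Adobe's 2011 list prices.
# Figures are the U.S. list prices cited in the post above.

apple = {
    "Final Cut Pro X": 299.99,
    "Motion": 49.99,
    "Compressor": 49.99,
}

adobe_a_la_carte = {
    "Premiere Pro": 799.00,
    "After Effects": 999.00,
    "Audition": 349.00,
}
adobe_suite = 1699.00  # Creative Suite 5.5 Production Premium

apple_total = sum(apple.values())             # 399.97 -- roughly $400
adobe_total = sum(adobe_a_la_carte.values())  # 2147.00 -- more than the suite itself

print(f"Apple (FCPX + Motion + Compressor):  ${apple_total:,.2f}")
print(f"Adobe, three apps bought separately: ${adobe_total:,.2f}")
print(f"Adobe Production Premium suite:      ${adobe_suite:,.2f}")
```

Buying Apple's three applications costs about $400; buying the three comparable Adobe applications separately costs more than the $1,699 suite, which is why the suite is the realistic point of comparison.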
Avid is also feeling pain; the company launched a preemptive strike against Apple by making Media Composer available to Final Cut Pro users for $995 right before NAB, only to learn that Apple planned to sell Final Cut Pro X for $300 a few days later. Again, I don't expect Avid to drop its prices to compete with Apple; the company remains in somewhat fragile condition, and a price war is the last thing it needs.
One question is, why did Apple drop its prices so precipitously? It didn't really need to in order to satisfy the needs of the professional audience, which has been conditioned to expect pro video editing packages to be priced at $1,000 or more. I think that there are a few possible reasons:
- Final Cut Pro X is dramatically different from previous versions of the software, and Apple may have wanted to lessen resistance to buying the new version. At $300 (or $400 for the equivalent of the previous Final Cut Studio), it's almost an impulse buy.
- By making Final Cut Pro X available at a low price and selling it only in the Mac App Store, Apple may be conditioning its customers to buy all their software, not just inexpensive software, online. (It may also be part of the process of getting customers ready for Macs without any optical drives.)
- Apple and Adobe aren't friends, or even "frenemies". They're enemies. Apple is already driving a stake into Flash, and any sales that Apple takes away from Premiere Pro and After Effects will come straight out of Adobe's bottom line. In this battle, Avid may turn out to be an unintended casualty.
Apple's pricing for Final Cut Pro X opens up a new front in its war against Adobe (and to a lesser extent, Avid). Apple's competitors absolutely don't want to respond, but they may have no choice.
Sunday, June 19, 2011
Incoherence in the movie industry
Something happened this weekend that caused me to reflect on the state of the U.S. movie industry, and that something was the release of "Green Lantern". No one movie better exemplifies the state of corporate moviemaking in the U.S., which can best be explained by the following analogy: What would happen if a bunch of drunks stumbled on a satchel full of money and decided to make a movie?
Movie executives readily affirm their desire to minimize risk by making movies based on "pre-sold" properties--books, comic books, television shows, cartoons, other movies and anything else that consumers recognize instantly. (I'm amazed that we haven't already seen "McDonald's: The Movie.") So, this summer, we got "Thor", "X-Men: First Class", "Kung Fu Panda 2", "Pirates of the Caribbean: On Stranger Tides", "Fast Five", "Judy Moody and the NOT Watchable Movie" (not the real title, but accurate), and, of course, "Green Lantern." We're waiting for "Cars 2", a sequel to perhaps the worst movie Pixar has ever made, and "Transformers: Dark of the Moon", in which Michael Bay blows up Chicago without getting rid of a single corrupt political official.
Some pre-sold movies turn out very well; I don't think that Christopher Nolan can make a bad movie, the original "Iron Man" was brilliant, and the first two X-Men movies were fun. But not one of the series movies released so far this summer, with the possible exception of "Fast Five", is as good as the best movie in its series.
"Green Lantern" is reported to have cost Warner Bros. $150 million to produce. For $150 million, you would have thought that studio executives would have read the script. Martin Campbell, who directed "Casino Royale", directed "Green Lantern" with the alacrity of a blind man driving an Indy Car. For that matter, Kenneth Branagh, who directed some of the best versions of Shakespeare's plays ever put on film, directed "Thor", another pre-sold superhero movie that wasn't worth the price of admission. This conclusively demonstrates that the "auteur theory", which states that the director, not the writer, is the "author" of the movie, has been replaced by the "hooker theory", which states that directors will knowingly film any complete piece of crap if they're paid enough.
What the U.S. movie industry has done is not minimize risk by fixating on pre-sold properties--it has attempted to absolve itself of any responsibility for making good movies. You can make excellent movies from pre-sold properties, as Bryan Singer, Gore Verbinski, Jon Favreau, Christopher Nolan, and even Martin Campbell (with James Bond) have demonstrated. You can also make really lousy ones. If you're spending $150 million, it doesn't cost any more to make the movie good. J.J. Abrams' "Super 8" proves that you can make a good movie for much less than $150 million that's still loaded with special effects and suspense.
My philosophy is that any movie from a U.S. studio other than Pixar has to be regarded as crap unless proven otherwise.
Thursday, June 16, 2011
The (second) rise and fall of walled gardens
Those of us who have been around long enough remember the era of proprietary online services. In the U.S., the leaders were America Online, CompuServe, Prodigy and GEnie. Canada and several European countries had videotex services such as France's Minitel. All of these services were what came to be known as "walled gardens": each service had its own collection of content, its own email service, and its own client software. Subscribers could use one vendor's content and services, but couldn't get to another vendor's without a separate subscription. You could easily send email and messages to subscribers of the same service, but it was very difficult to send email from one service to another. Content providers had to use the publishing tools provided by each online service, and needed contracts with each service to reach its subscribers.
The Internet, and in particular the web, changed all that. Anyone with a web server on the public Internet could reach anyone with a web browser. Thanks to HTML and HTTP, browsers, servers and authoring tools were standardized, so that proprietary software and tools weren't needed. It only took a few years for the open Internet to displace the proprietary online services. Of the four U.S. leaders, only America Online survives, with web-based services and content. CompuServe is now a brand name of America Online, GEnie closed down at the end of 1999, and Prodigy closed down in 2001.
We're now living in the second era of walled gardens, thanks to smartphones. Apple's iOS, Google's Android, RIM's BlackBerry, Microsoft's Windows Phone 7 and HP's WebOS all support web content, but they have their own proprietary standards for apps, their own app stores, and their own rules for which apps will or won't be allowed to run on their devices. Apps written for one platform won't work on a different platform without recompiling and significant recoding.
As with the proprietary online services, the web (especially the combination of HTML5, CSS and JavaScript) may lead us out of the walled garden era of mobile operating systems. PBS's MediaShift recently published an excellent video interview with Tom Peeters, the multimedia manager for Mediafin, the Belgian publisher of the newspapers De Tijd and L'Echo.
Mediafin has been working on an HTML5 version of its newspapers for the iPad for some time, even though it already has native iOS apps in the App Store. The Financial Times' decision to release an excellent HTML5-based web app for iOS, and to commit to eventually replace its existing iOS app with a web app, is bringing a lot of other publishers with similar plans (especially European publishers) out of the woodwork. The FT's actions are also serving as an existence proof--publishers can deliver usable web apps with a high degree of interactivity without going through Apple. FT's decision also gives momentum to HTML5 publishing toolkits from companies such as OnSwipe and pugpig.
One point that you'll hear in the interview is that there's a definite marketing advantage to being in the App Store, but if you already have a way of reaching customers directly, as Mediafin does with its newspapers, you can gain much more control over the development process and save the 30% commission (closer to 40% for Mediafin, due to VAT) that would go to Apple.
It may be wishful thinking, but ten years from now, I expect that we'll look back at today's mobile walled gardens and wonder how they ever existed.
Tuesday, June 07, 2011
Technology previews vs. premature release
Earlier today, Nintendo previewed its new Wii U at the E3 conference. The Wii U is a new console with a tablet-like device that serves as both a controller and a second display. As it has done at a number of other events, Nintendo deliberately positioned today's Wii U announcement as a technology preview rather than a formal product announcement. The Wii U won't ship until 2012, and it may be significantly different by the time it ships. However, Nintendo is encouraging developers to start writing games and applications for the Wii U now, so that when it ships, there will be a significant third-party library to support it.
I'm not a fan of early announcements, but when it comes to new platforms, technology previews make a lot of sense. They introduce developers, the press and potential customers to new product concepts, and they build interest and support for the formal product release. They buy time for their vendors--Nintendo said "2012", which gives them almost 18 months. They're clearly prototypes, and they give vendors the time they need to gather feedback and make changes before they go to market.
Compare this approach with what Google did with Google TV and its Android 3.0 tablets. Last year's Google TV announcement was clearly premature; the resulting products from Sony and Logitech were too expensive and too hard to use for most consumers. There was no reason for Google and its partners to rush Google TV out for last year's holiday season. Had they positioned the announcement at last year's I/O Conference as a technology preview, with a product release scheduled for some time in 2011, they would have had the opportunity to get developers involved, get much more usability feedback and resolve objections from television and cable networks before they went to market.
Much the same thing happened earlier this year with Android 3.0 and Motorola's Xoom. Google and Motorola were determined to beat Apple's iPad 2 to market, so they rushed out both Honeycomb (Android 3.0) and the Xoom. Third-party developers had almost no time to develop tablet-aware Android apps before the Xoom shipped, and the first version of the Xoom was much too expensive: $799 (U.S.) without a data plan, or $599 with a two-year contract. In addition, Motorola promoted the Xoom's LTE broadband compatibility, but the initial model shipped with 3G CDMA, and Motorola still hasn't released the LTE capability.
Honeycomb was rough around the edges, with almost no tablet-specific apps, and the Xoom was too expensive. It was a replay of the Google TV launch. Even though there are many more Android tablets coming this year, it looks like there won't be a big market for them until 2012 at the earliest.
If Google had given a technology preview of Google TV last year for release in 2011, and if they had previewed Honeycomb with "reference platform" tablets early this year for release in time for the holiday season, it would have given developers time to build a base of compatible apps, and hardware vendors time to build devices that took full advantage of the operating system while meeting customers' price expectations. In hindsight, it wouldn't have hurt Google and Motorola at all to ship after the iPad 2; in fact, they would have shipped better products at lower prices.
The lesson is that if you're working on platforms, not just products and services that are compatible with existing platforms, technology previews are a much better option than prematurely releasing final products.
Friday, June 03, 2011
Can "old media" survive a double-dip recession?
Last Friday, the U.S. Labor Department announced that national unemployment unexpectedly rose in May to 9.1%. Employers added 54,000 jobs last month, more than 100,000 fewer than most economists expected. That follows news that the Conference Board's Consumer Confidence Index dropped to 60.8 in May from 66 in April, down from more than 100 before the start of the Great Recession. The Standard & Poor's/Case-Shiller Index of housing prices in the 20 largest U.S. markets fell in March to its lowest level since 2003.
I don't usually dabble in economic forecasts, but it's increasingly looking like we're entering a double-dip recession. In the current issue of Businessweek, writer Peter Coy refers to the situation as the economy's "You First" problem: unemployed people don't have money to spend, and people who are employed don't want to spend money, because they're still afraid they might lose their jobs. Businesses are unwilling to start hiring again because consumer spending is so low. Both sides are waiting for the other to go first, and the federal government is in gridlock, so the economy has no option but to fall back into recession.
The Great Recession did no favors for incumbent media companies. The decline of newspapers, which had begun years before the start of the recession, took on a fatal momentum. Magazines' advertising dropped precipitously, and many magazines either shrank their page counts or folded completely. Radio and television advertising revenues also dropped dramatically; in the case of television, the drop in advertising made local stations much more dependent on retransmission fees from cable, satellite and IPTV operators in order to stay in business.
While there seems to be no hope for the recovery of the newspaper industry, advertising revenues for other media have improved, although they're still far below their pre-recession levels. However, a double-dip recession would likely drive revenues back down to where they were two years ago. It's hard to overstate the impact that would have on incumbent media companies. Internet companies will likely hold their own, but a lot of media companies that barely got through the Great Recession won't survive a second one.