Monday, January 30, 2012
Moonbot Studios: Prototype for the "New Hollywood"?
Unless you're an iPad user, you probably haven't heard of Moonbot Studios, but you're likely to hear much more about them over the next few weeks. Moonbot develops interactive children's books for the iPad; its first project, The Fantastic Flying Books of Mr. Morris Lessmore, started as a short film that's been nominated for an Academy Award for Best Animated Short. Update, February 26, 2012: The Fantastic Flying Books of Mr. Morris Lessmore won the Academy Award for Best Animated Short. Moonbot isn't just an animation studio, a publisher or an app developer--it's all of those things.
Moonbot was founded in 2009 by William Joyce, Brandon Oldenburg and Lampton Enochs. Joyce is an illustrator, graphic designer, writer and animator who's written and illustrated more than 50 children's books, created Rolie Polie Olie, one of the first computer-animated children's television shows (for which he received three Emmys), created character concepts for Pixar's Toy Story and A Bug's Life, and co-created and produced Blue Sky's Robots.
Moonbot's creations, Morris Lessmore and The Numberlys (released for the iPad earlier this month), are what the company calls "story apps". They combine elements of animated films, children's books, and videogames, but they're unique enough not to be classifiable as any of those things. It's that uniqueness that makes Moonbot's story apps early examples of a new medium, not just an extension of eBooks or films.
According to Fast Company, Moonbot stumbled onto its approach when working on Morris Lessmore, which was originally planned to be a short film and printed children's book. The iPad was released while the film was in production, and Joyce realized that it would enable Moonbot to do things that simply weren't possible previously...what he calls "a third way of expression".
If Moonbot is pioneering a new medium, what's its potential? It's too early to say how big the market is, but if it develops like the video game industry, it'll be both big and largely independent of legacy media. Digi-Capital estimated that global video game industry revenues for 2011, including online and mobile games, were $87 billion, about twice the size of the global theatrical motion picture business.
The video game industry only tangentially depends on legacy media companies like movie studios and broadcasters--in fact, few video games based on movie characters have been successful. Moonbot isn't dependent on legacy media companies at all: It creates its own characters, writes its own stories, produces its own animation, builds its own apps and distributes its own works. It does all of that in Shreveport, Louisiana, a city not known as either a technology or media center, with only 35 employees.
The story app concept is in no way limited to children--Moonbot's creations are as compelling to adults as they are to kids. The medium is still in its infancy; imagine how we'll be able to interact with these apps when we have Siri-style voice interaction and Kinect-style 3D motion detection to go along with touch gestures.
Could The Fantastic Flying Books of Mr. Morris Lessmore be this new medium's The Great Train Robbery? I wouldn't bet against it.
Sunday, January 29, 2012
Somewhere between wishful thinking and delusion: Can Barnes & Noble save print books?
There's an article in today's New York Times about Barnes & Noble titled "The Bookstore's Last Stand". The central premise of the article is that publishers are depending on Barnes & Noble to keep the print book alive. The tone of the article, or at least the quotes from publishers, is somewhere between wishful thinking and delusion. Here's an example:
Carolyn Reidy, president and chief executive of Simon & Schuster, says the biggest challenge is to give people a reason to step into Barnes & Noble stores in the first place. “They have figured out how to use the store to sell e-books," she said of the company. "Now, hopefully, we can figure out how to make that go full circle and see how the e-books can sell the print books.”
"...we can figure out...how the e-books can sell the print books"? That's like saying that if we wish hard enough, we can use calculators to sell abacuses. There's an incredible denial of reality going on here: Print still represents a majority of book sales, but it's been declining for years, while eBooks' share of book sales has been growing. If you're fixated on maintaining your print book sales, you're guaranteeing that your business will be marginalized over time.
If trends continue, eBooks will represent more than 50% of trade book sales within the next two years. Textbooks and specialty titles will take longer, but they'll likely reach or exceed the 50% point before the end of this decade. Denial of reality is only going to make the transition more painful. Publishers can survive in a majority-eBook market, so long as they manage their businesses to do so.
So, what can publishers do? They can consolidate their warehouses and get rid of excess capacity. They can move to a "digital first" model where eBooks, not print, drive the editorial and production process. They can anticipate smaller print runs and start implementing print-on-demand production. If they don't do these things, they'll have no one to blame but themselves when their companies fail.
As for Barnes & Noble, it too has to manage for a future when eBooks comprise most book sales. It has to be prepared to shrink the size of its stores, dramatically decrease the amount of display space dedicated to print books, and use electronic displays to replace physical shelves. It should already be privately prototyping and testing these new-generation stores, so that it's ready to start rolling them out in the next few years. It can't go forward by staring in a rear-view mirror.
The handwriting is so clearly on the wall that any publishing or bookselling executive who ignores it is guilty of willful ignorance.
Labels:
Barnes and Noble,
E-book,
eBook,
New York Times,
Publishing
Friday, January 27, 2012
What's more important to authors: Royalties or advances?
One of the strongest arguments for writers to self-publish their works is the potential to earn much higher royalties: Major publishers typically pay 10% to 15% royalties on the suggested list price of hardcover books, and 20% to 25% of their net revenue (wholesale price, or agency price minus 30%) for other formats. Self-publishers, on the other hand, can get as much as 70% of the sale price from Amazon and Barnes & Noble if they comply with those companies' restrictions. However, these numbers don't take into consideration the advances paid by publishers.
At the Digital Book World Conference that ended this week, Publishers Lunch Deluxe reported on a session on "Changing Author-Publisher Relationships" that shed some light on the question of advances vs. royalties. Madeline McIntosh, Random House's President of Sales, Operations and Digital, said that over the last five years, for fiction titles, the company has paid 45% to 65% of its sales revenue to authors. Little, Brown Publisher Michael Pietsch said that, across all of Hachette Book Group's titles over the past 15 years, the share of the company's revenues that has gone to authors has risen from 30% to 40%.
Both companies' payouts are substantially higher than any standard royalty rate, suggesting that many, if not most, books fail to earn back their advances. The effect is the same as paying a higher royalty on the copies that are actually sold. On the other hand, self-published books don't get advances, and the authors have to pay editorial, design and conversion costs themselves. As a result, self-published books start out much further in the hole financially, at least so far as the author is concerned.
The question for authors then becomes: Is it better to work with a publisher or to self-publish? If you know with absolute certainty that your book will sell more than it needs to in order to earn back any potential advance, you might make more money by self-publishing. However, if a publisher could sell at least two to three times as many copies as you could sell yourself, you're better off working with a publisher, since the increased volume will compensate for the lower royalty.
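To make that tradeoff concrete, here's a minimal back-of-the-envelope sketch in Python. It assumes an eBook sold under agency pricing, and every number in it--the advance, the prices, the royalty and store percentages, the self-publishing costs--is a hypothetical illustration, not a figure from the session or from any real contract.
```python
# A rough back-of-the-envelope comparison of author earnings under a
# traditional publishing contract vs. self-publishing an eBook.
# Every number below (advance, prices, percentages, upfront costs) is a
# hypothetical illustration, not a figure from the post or a real contract.

def traditional_earnings(copies_sold, agency_price=12.99, royalty_rate=0.25,
                         retailer_share=0.30, advance=10_000.0):
    """The author keeps the advance no matter what; royalties accrue against
    it and only add money once the book 'earns out' the advance."""
    royalty_per_copy = agency_price * (1 - retailer_share) * royalty_rate
    earned_royalties = copies_sold * royalty_per_copy
    return max(advance, earned_royalties)

def self_pub_earnings(copies_sold, sale_price=4.99, store_share=0.70,
                      upfront_costs=3_000.0):
    """The author gets ~70% of the sale price, but pays editing, design and
    conversion costs out of pocket."""
    return copies_sold * sale_price * store_share - upfront_costs

if __name__ == "__main__":
    for copies in (1_000, 5_000, 10_000, 50_000):
        print(f"{copies:>7,} copies: "
              f"traditional ${traditional_earnings(copies):>10,.0f}   "
              f"self-published ${self_pub_earnings(copies):>10,.0f}")
```
With these made-up numbers, the guaranteed advance dominates at low volumes, while the 70% self-publishing split pulls ahead once sales climb into the thousands.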
But what if you have no idea how many copies your book will sell? In that case, you probably should work with a publisher, because you'll get your advance no matter how many copies of the book are sold. However, there are two risks:
- If the book earns out its advance but doesn't sell many copies beyond that point, you might have made more money if you'd self-published it.
- If your book doesn't sell well at all, the publisher will be much less likely to offer to publish your next book, and if it does, the advance will be substantially lower.
As a practical matter, the "publisher vs. self-publishing" question is often a moot point: If a book is rejected by multiple publishers, self-publishing may be the only option available. But, for those authors who can get a publishing contract, the decision may well come down to your confidence in the publisher vs. yourself.
Monday, January 23, 2012
Kill Hollywood? Here's a better approach
Last week, partially in reaction to the SOPA/PIPA debacle, Paul Graham of Y Combinator issued a "Request for Startups" under the title "Kill Hollywood". Graham argued that the entertainment companies' reliance on legislation rather than competition and innovation is a sign that they can be displaced. Graham's right about the industry's ham-handed reliance on legislation, but segments of the entertainment industry have gone through extinction-level crises many times in the past. Here are a few examples:
- The music industry was threatened by radio starting in the 1920s, but radio stations eventually chose to play recorded music because it was cheaper than producing original shows. Radio became the primary mechanism for promoting records for decades, until MTV.
- An entire generation of movie distributors and studios, most of which were affiliated with Edison's patent pool, went out of business in the first two decades of the 20th Century. Companies such as Essanay, Kalem, Selig Polyscope, Mutual and Biograph disappeared and were replaced with the progenitors of the movie studios we know today.
- In the 1950s, movie studios were forced to divest themselves of ownership of theaters in the U.S., but independent theaters quickly picked up the slack.
- Television also threatened the movie industry in the 1950s, but the major studios started producing television shows and licensed their movies to television stations and networks.
- Some movie studios went bankrupt or experienced painful reorganizations in the 1960s and 1970s, thanks to massive cost overruns on unpopular films. The industry was saved by the "independent movement", which was where many of today's most successful directors began their careers.
- File sharing and digital distribution have led to consolidation of the biggest record companies. For example, who would have believed that RCA Victor and Columbia, once the world's two largest record companies and fierce competitors, would eventually both come to be owned by Sony?
It's really hard to "kill Hollywood", even when Hollywood has seemed dead set on killing itself. Industries die when they become obsolete or are replaced by something better. Movie theater attendance is declining, as are DVD sales, but the movie business itself isn't obsolete. The record industry has been struggling, but it's still surviving. The commercial television networks' audiences are declining, but cable networks have been growing for years.
What's needed is a two-part approach:
- First, take on the entertainment companies head-to-head with lobbying. As distasteful as lobbying is to most technology companies, they can't let their opponents have the battlefield to themselves. A strong, coordinated approach to lobbying would counter the efforts of the entertainment companies, which are certainly vulnerable, especially in the "Red States".
- Second, invest in technologies and content that appeal to consumers during their leisure time. Don't worry about what the effect will be on entertainment companies; if consumers like it and are willing to pay for it, that's all that matters.
Friday, January 20, 2012
SOPA and PIPA: Dead? Well, maybe...
Wednesday's Internet blackout by Wikipedia, Reddit and many other sites was the last straw in a battle of wills between the entertainment and technology industries. As of now, the technology companies have won: Mashable reports that Texas Representative Lamar Smith, the chief sponsor of the Stop Online Piracy Act (SOPA) in the U.S. House of Representatives, has at least temporarily tabled the bill, as has Senate Majority Leader Harry Reid with the PROTECT-IP Act (PIPA), the U.S. Senate's version of the bill. Of course, in Hollywood terms, we can't be sure that SOPA and PIPA are dead unless someone has hammered a stake into their hearts, and they've turned to ash. (That doesn't stop the studios from reviving them as SOPA II, Son of PIPA, etc.)
In Thursday's New York Times, MPAA Chairman Christopher Dodd is quoted as saying that he wants to meet with Silicon Valley executives to draft an anti-piracy bill that both industries can agree to. Of course, we don't know if the Senator Dodd who would fly to Northern California would be the conciliatory one interviewed on Thursday, or the one quoted by the MPAA on Tuesday saying "...some technology business interests are resorting to stunts that punish their users or turn them into their corporate pawns..." and "It is an irresponsible response and a disservice to people who rely on them for information and use their services. It is also an abuse of power given the freedoms these companies enjoy in the marketplace today. It’s a dangerous and troubling development when the platforms that serve as gateways to information intentionally skew the facts to incite their users in order to further their corporate interests." I suspect that it will be the latter, and I'm not sure that Senator Dodd and his employers have learned anything from this debacle.
Update, January 21st, 2012: The Hill reports that last Thursday, Senator Dodd said the following on Fox News: "Those who count on quote 'Hollywood' for support need to understand that this industry is watching very carefully who's going to stand up for them when their job is at stake. Don't ask me to write a check for you when you think your job is at risk and then don't pay any attention to me when my job is at stake." It's even more clear to me that nothing has changed and no lessons have been learned.
In the New York Times article, Dodd admitted that he was responsible for the legislative strategy that introduced both bills into the House and Senate, with the objective of ramming them through before technology companies had any chance to respond. He didn't admit to it, but he was almost assuredly behind Representative Smith's decision not to allow any opponents of the bills to testify in front of the House. The MPAA and its allies did everything they could to prevent any opponents of the bill from making their positions known. Even on the last day before the blackout, Senator Dodd was trying to intimidate Internet sites into dropping their blackouts.
Had Senator Dodd and his allies simply opened the process to full discussion from the beginning, there would have been no need for the technology companies and individuals to take the action that they did. It also would have saved the entertainment industry a lot of time and money, as well as its reputation. There's no real opposition to stopping piracy by international sites, so long as due process is accorded to all parties and domestic sites aren't under a continuous threat of being shut down due to actions that they didn't instigate or encourage.
I would suggest that Senator Dodd sold the MPAA's board a bill of goods: He most likely told his bosses that he could get legislation through that would allow them to take down any website they wanted, at little or no cost, and with virtually no recourse. "Trust me," he likely said, "I got the Dodd-Frank Bill through." Yes, he did, in a greatly watered-down form, when the House, Senate and White House were all controlled by the Democrats, and when the American public was strongly in favor of increased regulation of the financial industry. That's a little like being appointed a General for the Allies after the Germans had already been pushed back behind the Rhine. Your tactical skill was probably not what won the War.
It's time for the entertainment and technology industries to negotiate as equal partners in order to craft an anti-piracy bill that helps to stop piracy without killing off legitimate businesses. Personally, I wouldn't trust Senator Dodd to be part of those negotiations, but I'm not going to be making that decision.
Labels:
Chris Dodd,
Harry Reid,
Motion Picture Association of America,
MPAA,
PIPA,
Senate,
SOPA,
White House
Thursday, January 19, 2012
Apple's eTextbook announcements: Far from a slam dunk
This morning, Apple announced:
- A new version of iBooks that supports eTextbooks.
- Distribution agreements with five textbook publishers (Pearson, McGraw-Hill, Houghton Mifflin Harcourt, DK and the E.O. Wilson Foundation) covering 90% of the high school market.
- eTextbooks priced at $14.99 (U.S.) or less.
- A free eTextbook editing application for OS X called iBooks Author.
- A revamped version of iTunes U for higher education, with a dedicated iOS app.
Apple's announcements could be very important, but the company has a long way to go, for several reasons:
- Apple's focusing on the high school market, not colleges and universities, and it's trying to convince parents and students to purchase eTextbooks directly from Apple. In most U.S. schools and districts, students get their textbooks from the school, either at no charge or as part of an activities fee. Why would parents whose children already get textbooks at no charge, or as part of a fee they're already paying, start paying Apple for them? Thus, Apple's plan only impacts those parents and students who have to pay for their textbooks now.
- Parents also have to be willing to buy an iPad for their child. That cuts out low-income and many middle-income families.
- There are currently only eight titles in Apple's eTextbook collection--not even enough to be called a good start.
- Although iBooks Author creates eBooks that are based on EPUB 2.0, it uses Apple's proprietary extensions to support multimedia, animation and JavaScript-based features. Thus, titles created with iBooks Author can only be used in iBooks. (This might change in the future, when Apple fully implements EPUB 3 in both iBooks and iBooks Author.)
- If you use iBooks Author and create eBooks that you intend to sell, according to the EULA for the software, you are prohibited from selling the eBooks anywhere except through Apple.
Apple's decision to focus on high school textbooks before going after the college market is questionable: College students pay far more for textbooks than do high schoolers, and parents are far more likely to purchase a tablet for a new college student than for a high school student. However, that's not the most important reason why I believe that Apple will have an uphill battle. When Apple launched the iBookstore initially, with the support of five of the "Big 6" trade publishers and the agency pricing model to eliminate Amazon's price advantage, it looked as though Apple would eventually become as dominant in eBooks as it already was in music. The results, however, have been far from what Apple and its boosters expected.
Even today, Apple has a minuscule share of the U.S. eBook market, far below those of Amazon and Barnes & Noble. Apple's eBooks can only be used on Apple's devices, while Amazon's and Barnes & Noble's eBooks can be used on those companies' popular eReaders and tablets, as well as with software eReaders on PCs, tablets and smartphones. Amazon in particular has set up an effective self-publishing program for authors, while Apple is just taking its first tentative steps today.
In short, Apple's new eTextbook initiative could make a big difference eventually, but it's far from a slam dunk.
Labels:
Amazon,
apple,
DK,
E-book,
EPUB,
Houghton Mifflin Harcourt,
IBook,
iBooks Author,
iPad,
iTunes U,
McGraw-Hill,
Pearson
Tuesday, January 17, 2012
Stop SOPA
Many websites, including Wikipedia, BoingBoing, Mozilla, WordPress.org and Reddit, are going to go black tomorrow to protest the Stop Online Piracy Act (SOPA) under consideration in the U.S. House of Representatives, and the Protect Intellectual Property Act (PIPA) that's being considered by the U.S. Senate. Mashable has posted an excellent summary of SOPA, and rather than rehash those arguments, I've linked to it for your reference.
The stated purpose of SOPA is to cripple non-U.S. websites that distribute unlicensed copyrighted content, and to prevent U.S.-based sites from hosting, or even linking to, unlicensed content. The problem with SOPA is that it imposes a "death sentence" on websites that haven't been proven to have committed any infringement whatsoever. SOPA front-loads the prosecution and punishment of copyright infringement cases. In the case of foreign websites, the U.S. Justice Department can request a court order to seize their domain name(s), order advertising networks and financial processing services to stop doing business with them, order search engines such as Google and Bing to drop them from their indices, and order Internet Service Providers to stop connecting to them. All of this is supposed to take place within five days after the court gives the order, and most importantly, without any notice given to the website. In short, the website can be put out of business before it has any opportunity to defend itself.
SOPA gives content owners the power to do the same things to domestic websites that encourage or facilitate copyright infringement. The Justice Department doesn't need to be involved at all. This part of the bill imposes the same "death penalty" on domestic websites, and doesn't require them to be informed until the penalty has been imposed. Even worse, the owner or operator of the site isn't required to have been the one who posted the infringing content. Infringing content could be in the form of a comment or an uploaded video posted to a user-generated content site like YouTube. It could even be a link to another website that posts infringing content.
SOPA means that every website that allows any kind of third-party content or comments would have to review everything before it's posted. It would make a service such as YouTube, which receives 24 hours of uploaded content every minute, impossible to operate. (Correction, January 23, 2012: According to its blog, YouTube is actually receiving 60 hours of video every minute.) Content providers would no longer need to give notice of infringement as required under the Digital Millennium Copyright Act, and websites would no longer be protected by the law's "safe harbor" provision, even if they don't knowingly encourage or participate in copyright infringement.
Let me be clear: I defend content companies' right to protect their property. However, SOPA effectively eliminates due process for website operators and creates a poisonous climate of prior restraint, where every post has to be considered infringing unless proven otherwise. An analogy would be if I, believing that a movie used some of my intellectual property, could get a court order seizing every copy of the movie from every theater playing it, or from every store and service distributing it, without giving notice to the film's distributor. By the time the studio answered the charges and got the movie back into theaters and stores, the financial damage would be incalculable.
SOPA would be fair if it required the Justice Department and content owners to give notice to the website operator before any action was taken. It would be fair if it allowed website operators to remedy the infringement, if it exists, without court action. It would be fair if it allowed website operators to defend themselves in open court before they lost their income, domain name and audience. As written, SOPA tilts the playing field decisively in favor of the content providers, most of which already have a massive advantage in legal and financial resources over website operators.
Saturday, January 14, 2012
eBooks: After the transition
In my last post, I examined the impact that eBooks have had on the publishing industry, and I noted that things are just getting started. Most industry observers agree that eBooks now comprise around 20% of book sales by unit volume. The Forrester Research/Digital Book World survey of U.S. publishers I wrote about found that the single largest group of respondents believes that eBooks will comprise 50% or more of total book sales by sometime in 2014.
What will the book industry look like when today's ratio of eBook to print book sales is reversed--when 80% of book sales by unit volume comes from eBooks? I'm not willing to guess when the industry will get to that point, but I have no doubts that it will get to that point eventually. Here are some likely results of the transition:
- Print books will be much more expensive: As anyone who's purchased large print jobs, from business cards to books, will tell you, the unit cost for printing decreases dramatically as the size of a print job increases. It's sometimes no more expensive to purchase a larger print run than a small one, even if you end up recycling some of the printed materials rather than using them. The reason is that set-up costs are the same whether you're printing a small number of items or a large number, and that set-up cost is spread over the total number of items that you print. Books encounter additional set-up costs for binding, especially for hardcover books.
Companies such as Kodak, HP and Xerox are major players in the print-on-demand market, using digital presses rather than offset or letterpress in order to make small runs economical. However, some of the costs, such as binding, remain, no matter what method is used to print. Even with digital presses, it's not going to be possible to make short-run books at the same unit cost as large-run books. We're already seeing this effect, as the "Big 6" publishers are using agency pricing to boost the prices of their eBooks in order to offset the lower profit margins they're getting on print titles. That's with 20% of sales going to eBooks. When eBooks comprise 80% of sales, publishers aren't going to be able to hide the true cost of printing and binding books. (A rough cost sketch follows this list.)
- Nevertheless, print books will still be around: Even though print will be more expensive, I have no doubt that print books will survive, just as vinyl records have achieved a renaissance thanks to audiophiles and nostalgia buffs.
- Publishers will only commit to print runs for their "sure-fire hits": Just as the movie industry is fixated on producing sequels and movies based on existing successful books, television shows and comic book characters, major publishers will only print books that are from well-known, previously-successful authors, as well as new authors who are well-known from other arenas, such as television, movies, sports and politics. All of their other titles will be published as eBooks first, and will get print runs only if they're justified by customer demand.
- Bookstores will be very different: In the U.S., there will be far fewer Barnes & Noble bookstores, and the ones that remain will be much smaller. As I wrote some time ago, they're likely to be cafes with bookstores inside them, rather than bookstores with cafes inside them. They'll still carry some print books, albeit a much smaller selection. Big touchscreen displays will give customers a similar experience when shopping for eBooks that they have today when shopping for print books: They'll see bookshelves with book covers, and with a flick of a finger, they'll be able to see the back cover, inside covers, and leaf through the book, just as they can today with print books. With another few touches, they'll be able to buy the title as an eBook and download it instantly, or for some titles, purchase it in a print version that will be shipped directly to their home if it's not in stock at the store.
As for independent bookstores, there will also be fewer of them, but the ones that focus on used books will do quite well. Used titles will be much less expensive than new ones, so for price-sensitive customers and those who have to have print books, used bookstores will be their best choices.
- Today's black & white eReaders will be a thing of the past: All eReaders and tablets will use color displays. In the case of eReaders, they'll use low-power displays such as Qualcomm's Mirasol and E Ink electrophoretic color displays; tablets will use LCD and OLED displays. The functionality of the devices will be more similar to each other than they are today; both eReaders and tablets will be able to handle audio, video and interactivity. The primary differences will be in battery life and cost.
- The "Big 6" will become the Big 3 or 4: Just as in the music industry, where financial problems have resulted in a wave of consolidation, we're almost certainly going to see consolidation among the "Big 6" publishers, as well as publishers in every market segment: Business and professional books, children's books, religious books, K-12 and college textbooks.
I would argue that most of these developments are already underway, and the ones that aren't will start once eBooks pass 50% market share. As I wrote in my last post, anyone who doesn't believe that eBooks will result in revolutionary changes in the book industry is fooling themselves.
Labels:
Barnes and Noble,
Big 6 publishers,
E Ink,
E-book,
HP,
Kodak,
LCD,
Mirasol,
OLED,
Print on demand,
Publishing and Printing,
Qualcomm,
Xerox
Friday, January 13, 2012
eBooks: Reality sets in for publishers
Forrester Research and Digital Book World released some details yesterday of a survey of book publishers representing 74% of U.S. revenues. According to the survey conducted late last year, publishers are actually getting more pessimistic about the future as eBooks become more important. In a similar survey in 2010, 66% of respondents said that they expected that more people would read eBooks than before; in 2011, only 47% gave the same answer. When asked whether eBooks would cause people to read more books than before, 66% of respondents answered "yes" in 2010, while 60% answered "yes" in 2011.
29% of 2011's respondents believe that eBooks will comprise 50% of all book sales in 2014; 22% of the respondents believe that eBooks won't reach the 50% mark until 2015 or later. 82% of respondents are "optimistic" about the digital transition, but while 51% of respondents in 2010 believed that their companies would be stronger as a result of eBooks, only 28% believed so in 2011.
These are only a handful of responses from what is undoubtedly a much more detailed survey, but they suggest that publishers' mindsets are changing. In 2010, many publishers believed that eBooks were only another "binding"--another way to consume books--and that they didn't represent a fundamental change. Since then, however, you'd have to be living under a rock not to recognize that eBooks are changing just about everything about the book industry:
- eBooks are continuing to cannibalize print sales. Last year, eBook sales more than doubled over 2010, but sales in every category of print books tracked by Nielsen Bookscan were lower in 2011, from a drop of 3% for hardcover adult nonfiction to a 24% decline for mass market paperbacks.
- According to USA Today, for the week including Christmas 2011, 42 of the top 50 titles sold more eBook than print copies (this compares with 19 of the top 50 titles for the same week in 2010).
- When customers walk into the biggest bookstore chain in the U.S., Barnes & Noble, the first thing they see is no longer a table stacked with new print arrivals. Instead, it's a display of Nook eBook readers and tablets, staffed full-time by a salesperson.
- Self-publishing, which was once the domain of vanity presses and the last refuge for writers who couldn't get a contract with a publisher, is now a viable option for writers--even those who could get a conventional publishing deal. It's now possible for authors to sell a million copies of their self-published eBooks.
- The tasks performed by publishers, including acquisition editing, copy editing, book cover design, book layout, typography, format conversion, distribution of eBook masters to resellers and printing, are now being done by contractors, service suppliers or the authors themselves.
- Amazon, which represents both the biggest customer and the biggest frustration for many publishers, got into publishing in a big way in 2011 with five imprints. Amazon is willing to pay top dollar to sign authors such as Tim Ferriss and Penny Marshall, and to acquire backlist titles.
Publishers are beginning to understand that things aren't going to go back to the way they were before the Great Recession, and that eBooks are much more than simply another way to consume books. They may not represent as shocking a transition as the effect of television on the movie industry during the 1950s and 60s, but eBooks' impact on the book industry will be dramatic, especially given that we're still early in the transition from print to digital. What will things look like on the other side of the transition? That's the subject of a future post.
Tuesday, January 10, 2012
Vizio launches $99 Google TV set-top box
Dan Rayburn of StreamingMedia.com reports that Vizio's new Google TV-based Stream Player will ship in the first half of 2012, and will be priced at $99 (U.S.). According to Rayburn, the set-top box will only be sold directly by Vizio from its website, but I don't expect that to last--Vizio sells too much product through resellers such as Costco for the company to ignore that channel.
The VAP430 Stream Player uses the new ARM-based Google TV architecture, and Vizio has reskinned Google TV's user interface. It will have HDMI in and out (so it can be connected to a receiver or A/V amplifier in-line with another set-top box or other device without taking up an additional HDMI port), Ethernet and Wi-Fi interfaces, and a USB port that can be used to connect an external hard disk (only for playing, not recording, audio and video). It will also come with a universal remote control with both IR and Bluetooth outputs. The device will support 1080P video in and out, and Vizio claims that the device will have sufficient bandwidth to support 3D streaming.
Vizio has confirmed that the Stream Player will support Netflix, Amazon Instant Video, Hulu Plus, HBO Go (for existing HBO subscribers), YouTube, Pandora, Technicolor's new M-GO streaming video service, and others. Additional services will be announced by the time the device ships.
On paper, Vizio has hit all the right notes: The Stream Player will be priced competitively with Apple and Roku, it will run a more polished version of Google TV, and it can be connected in-line with the user's existing cable, satellite or IPTV set-top box, instead of requiring a separate HDMI connection. It remains to be seen how well the device works when it gets into the hands of consumers, and whether Google and Vizio have smoothed out the many rough spots in Google TV's user interface. If it works well, it'll help put Google TV back into the thick of the over-the-top set-top box competition.
Labels:
Amazon Instant Video,
ARM,
Dan Rayburn,
Google,
googleTV,
HBO Go,
HDMI,
Hulu Plus,
Netflix,
Stream Player,
Vizio,
YouTube
JVC's 4K $5K Camcorder
After showing it at the 2010 NAB Conference as a non-working mock-up and at the 2011 NAB show as a working prototype, JVC has formally announced the first prosumer 4K camcorder, the GY-HMQ10, at CES 2012. The GY-HMQ10 records at 3840 x 2160 resolution, or four times the resolution of a conventional 2K (1920 x 1080) camcorder, using a single 1/2" CMOS imager and a fixed 10X zoom lens. It supports frame rates of 24P, 50P and 60P. The camcorder can also record in 2K mode for compatibility with existing infrastructure at 50/60P and 50/60i.
The camcorder uses AVCHD compression...but the AVCHD standard doesn't support 4K. To do it, JVC splits the 4K image into four 2K images that it records and compresses simultaneously using the company's new Falconbrid processor. Each 2K image is compressed at 36Mbps, for a total bit rate of 144Mbps. Each 36Mbps stream is then recorded on its own dedicated SDHC card. So, yes, the HMQ10 has four memory card slots. For live broadcasts and external recorders, the HMQ10 has four (yes, four) HDMI outputs. The camcorder can also down-convert the 4K images to 2K on the fly, so if 2K is all you need, you can use a single SDHC card and HDMI interface. It's also got two XLR microphone inputs.
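To see how the numbers line up, here's a minimal sketch of the four-way split described above. The quadrant layout is my assumption for illustration--JVC only says the image is divided into four 2K pieces, not how Falconbrid partitions it--and the bit rates are the ones quoted above.
```python
# Illustration of recording a 4K frame as four 2K streams, as described
# above. The quadrant tiling is an assumption for the example; the post
# only says the image is split into four 2K pieces.
import numpy as np

FRAME_W, FRAME_H = 3840, 2160        # 4K (quad full HD) frame
TILE_W, TILE_H = 1920, 1080          # each 2K tile
BITRATE_PER_TILE_MBPS = 36           # one AVCHD stream per SDHC card

def split_into_tiles(frame):
    """Split a (2160, 3840, 3) frame into four (1080, 1920, 3) quadrants."""
    tiles = []
    for y in (0, TILE_H):
        for x in (0, TILE_W):
            tiles.append(frame[y:y + TILE_H, x:x + TILE_W])
    return tiles

if __name__ == "__main__":
    frame = np.zeros((FRAME_H, FRAME_W, 3), dtype=np.uint8)
    tiles = split_into_tiles(frame)
    print(f"{len(tiles)} tiles of {tiles[0].shape[1]}x{tiles[0].shape[0]}")
    print(f"total bit rate: {len(tiles) * BITRATE_PER_TILE_MBPS} Mbps")
```
Running it confirms the arithmetic: four 1920 x 1080 tiles at 36Mbps apiece add up to the camcorder's 144Mbps total.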
Does it work? I saw footage that was shot on JVC's prototype at NAB last year, using a true 4K monitor. The picture quality was stunning, but there was no way to test the camera under real-world operating conditions--panning, zooming, low light, etc. So, how much would you pay for all this? When the camcorder ships in March, JVC plans to charge $4,995 (US). That's right--$5K for a 4K camcorder. The 4K mode isn't terribly practical today, and with a 1/2" imager and without a removable lens, the HMQ10 isn't going to be as flexible as a camcorder like Sony's FS100 or Panasonic's AF100/101. Nevertheless, 4K for $5K? That's pretty amazing pricing.
Labels:
4K resolution,
AVCHD,
Camcorder,
GY-HMQ10,
HDMI,
JVC,
Secure Digital,
Sony,
XLR
Thursday, January 05, 2012
Desperation time: Warner Bros. doubles the waiting time for DVDs
All Things Digital reports that Warner Brothers is set to double the delay between the time that DVDs and Blu-Ray discs go on sale and when they're available for rental through Netflix, Redbox and Blockbuster, from 28 to 56 days--almost two full months. (In a separate decision, Warner Brothers' sister division HBO has decided to stop selling DVDs to Netflix altogether, requiring the company to purchase the movies at retail price.)
Warner Brothers' plan is very likely to anger consumers but have no substantial effect on DVD sales. The reason is that consumers who are already unwilling to pay for a DVD in order to see it a month sooner aren't likely to be willing to pay for it in order to avoid a two-month delay. Under Warner Brothers' new plan, movies will hit the rental and pay-TV/video-on-demand markets at about the same time. The plan could actually backfire and lead to lower wholesale sales of DVDs and Blu-Ray discs, since Netflix, Redbox and Blockbuster may purchase fewer copies due to the increased competition from video-on-demand and streaming services.
Warner Brothers and other studios can't turn back the clock and can't change the economy. They might be able to make their plan work, if they make UltraViolet versions of their movies available without having to first purchase the movies on DVDs or Blu-Ray, at a reasonable price and with a much simpler process than they have today. That would make services like Warner Brothers' Flixster a real alternative to Netflix, rather than an ill-conceived tool for decreasing piracy.
Labels:
Blockbuster,
Blu-ray Disc,
DVD,
Flixster,
HBO,
Netflix,
Ultraviolet,
Video-on-Demand,
Warner Brothers
Reality bites: Google replaces Intel with Marvell for Google TV
Earlier today, The Wall Street Journal reported that Marvell has replaced Intel as the lead chipset supplier for Google TV. The deal is non-exclusive, but the bigger news is that Google has replaced Intel's X86 architecture with ARM, which is supported by Broadcom, nVidia, Samsung and Texas Instruments, along with Marvell and others. When the first Google TV devices were released by Logitech and Sony, it was clear that they were far too expensive for the market; for example, while Apple was selling Apple TV for $99 (U.S.) and Roku's set-top boxes were priced at $99 or less, the Logitech Revue was launched at $399. In order for Sony and Logitech to be competitive, they had to drastically cut prices and, in Logitech's case, take huge losses. (Logitech subsequently abandoned Google TV.)
By switching from the Intel architecture to ARM, Google TV's licensees will gain a less-expensive, lower-power platform that can compete with set-top boxes from Apple, Roku and others on both price and performance. They'll also get a choice of multiple processor vendors; for example, even though Marvell is the lead partner, there's nothing keeping Samsung from using its own ARM-based processors in its HDTVs, Blu-Ray players and set-top boxes.
In short, this is the move that Google should have made from the beginning. With lower-priced set-top boxes, the ability to run apps and an operating system based on a more modern version of Android, Google TV 2.0 should be significantly more successful than the original version. At the very least, it has a chance for survival, instead of being "dead on arrival".
Tuesday, January 03, 2012
Best Buy: Another example of the "self-inflicted wounds" rule
Forbes.com has an op-ed post about Best Buy. It centers on a case of horrible customer service that the writer and his friend experienced a few days ago at a store in the San Francisco Bay Area. They wanted to buy the Blu-Ray version of "How to Train Your Dragon", which is a Best Buy exclusive (when will the movie studios stop giving retailers and consumer electronics companies exclusives on hot titles?). Best Buy's website said that the movie was back-ordered online but in stock at a local store. They went to the store, only to find that the movie was out of stock. They weren't offered a rain check or even an apology. When the men shopped for another Blu-Ray disc at the store, an unkempt salesperson came over and tried to sell them a television service (they never found out exactly what he was selling). They finally found a movie to buy, only to realize when they got home that, amid all the irritation, they'd picked up the wrong one. When they went back to return the unopened disc, they were told that all software sales are final and were refused a refund.
Just about everyone I know who shops at Best Buy has a customer service horror story. Last year, I went there to buy a point-and-shoot camera. They had the model I wanted on display, but when I told a salesperson that I wanted to buy one, he told me that it was out of stock. He said that another store had it, and that they would ship it to his store, and I could come back in a day or two and pick it up. I told him that if I had to drive back to his store in a day or two, I could just as easily drive to the store that had it and buy it that day, or go online and have Best Buy ship it directly to my home. The salesperson insisted that I have the camera shipped to his store for pickup, most likely so that he could get credit for the sale, and I left without buying anything.
Shortly before Christmas, Best Buy told an unknown number of customers that it couldn't fulfill their online orders, some of which had been placed as early as Thanksgiving. Best Buy has refused to reveal how many customers were affected or which products were involved. After Christmas, the company sent its regular customers an email with a video from CEO Brian Dunn. In the video, he thanked customers for their business and said that things would be "even better" in 2012. He didn't apologize for the holiday order snafu. A single sentence, such as "We're committed to improving customer service online and in our stores," would have indicated that top management is willing to admit that the company has customer service problems. Instead, the impression left was that Dunn and his team are in denial.
Most businesses that fail do so because of self-inflicted wounds, not because of competitors or the economy. Circuit City brought on its own failure through a combination of poor locations, confusing store layouts, skimpy product selection and lousy customer service. Circuit City gave Best Buy a perfect example of what not to do, yet Best Buy is going down the same path.
The best thing the company could do is quickly close the stores it plans to shut down in 2012, and then focus 100% on improving customer service:
- Improve employee training.
- Use pay and benefits to encourage good employees to stay; don't repeat the mistakes the company made in 2007, when it fired 3,400 of its most experienced (and highest-paid) salespeople and replaced them with cheaper new hires, and in 2009, when it demoted as many as 8,000 senior sales associates to regular sales positions with 25% to 50% pay cuts.
- Get rid of the third-party salespeople, or at least force them to stay at their stations. (I was shopping for a laptop at Best Buy a couple of months ago, and a third-party Comcast salesperson came over and tried to sell me a Comcast subscription. I suspect that the person who interrupted the writer of the Forbes.com article was a third-party salesperson.)
- Implement whatever systems are necessary to avoid a repetition of the pre-Christmas order cancellations.
- Improve the accuracy of inventory counts on the Best Buy website, so that customers aren't told that products are in stock in stores when they're actually sold out.
The good news is that Best Buy has time to turn itself around, but it first has to acknowledge that it has a real customer service problem.
Labels: Best Buy, Brian Dunn, Circuit City Stores, Customer service
Monday, January 02, 2012
Is there a market for "enhanced" eBooks?
Shortly before Christmas, The Huffington Post published an interview with David Prichard, the President and CEO of Ingram Content Group. Ingram is the largest book distributor in the U.S. and operates Lightning Source, one of the largest print-on-demand companies and a major vendor to self-publishers. In the interview, Prichard talked about the future of publishing; one of the topics he covered was "enhanced" eBooks:
"Enhanced e-books are only in their infancy, allowing authors to add alternative endings or interviews. Down the road, who knows what's possible? Maybe we will have biometric devices that can sense your pulse and body temperature and change the plot based on your feelings -- and you think Stephen King is scary now."
"...for example, a biography can to come to life in many ways. Jacqueline Kennedy: Historic Conversations on Life with John F. Kennedy has all of the interview audios, videos, photographs, text, and transcripts available. Even classics -- Penguin has updated Pride & Prejudice with clips from the movie and even instructions on dancing. For the 75th anniversary of The Hobbit, HarperCollins released an e-version with exclusives including J.R.R Tolkien's book illustrations and recently discovered Tolkien recordings. Publishers are still learning what added value readers will or won't pay for. I expect we'll continue to see lots of experimentation in this arena."Seth Godin, well-known author and marketer, responded on paidContent.org to Prichard's remarks:
"It (the interview) is filled with breathtaking visions of the future, and they are economically ridiculous. The Long Tail creates acres of choice, so much as to make the number of options almost countless. But at the same time, it embraces (in every format) much lower production values. For what Michael Jackson and Sony (NYSE: SNE) paid to produce the Thriller album, today’s artists can make and market more than 5,000 songs. You just can’t justify spending millions of dollars to produce a record in the long tail world."
"The same thing that happened to music is going to be true of books. The typical e-book costs about $10 in out of pocket expenses to write (more if you count coffee and not just pencils). But if we add in $50,000 for app coding, $10,000 for a director and another $500,000 for the sort of bespoke work that was featured in Al Gore’s recent “book”, you can see the problem. The publisher will never have a chance to make this money back."
"Sure, there will be experiments at the cutting edge, but no, they’re not going to pay off regularly enough for it to become an industry. The quality is going to remain in the writing and in the bravery of ideas, not in teams of people making expensive digital books."Others have picked up on the discussion; for example, the Teleread blog summarized Godin's post, and as of now, every comment on the Teleread post opposes Godin's position. I agree with what Godin wrote, with some reservations. First, let's leave the "long tail" arguments aside--the long tail theory has largely been debunked. The long tail only makes money for distributors, who can aggregate small numbers of sales from a large number of publishers/writers/producers. However, the market is being flooded by titles from self-publishers, and it's harder than ever for consumers to separate the wheat from the chaff. Price is no longer an indicator of quality. The most likely outcome is that there will be a small number of titles that do well (as usual), and an ever-larger collection of titles that barely, if ever, earn back their investment in time and money.
Many of the Teleread commenters objected to Godin's statement that it costs "$10 in out of pocket expenses to write" an eBook. He doesn't include editorial, design and eBook conversion services, which can cost hundreds to thousands of dollars if an author farms them out, but that's not what the commenters objected to. Their concern was that Godin didn't account for the value of the time that authors spend writing. I understand their argument, but I'm not sure that it's realistic, especially in today's climate. In the 1990s, I wrote two computer books, one for Prentice Hall and the second for Ventana. The first one earned back its advance and sold around 12,000 copies domestically, along with local-language reprints in a variety of markets. The second one was never released in the U.S., but was released by Ventana's partner in Japan as a local-language title; it didn't earn back its advance. Considering the time I spent writing the two books and the amount I earned, I would have made about the same amount on an hourly basis if I'd worked at Burger King. That's why I stopped writing books.
Now, to Godin's central point: Most publishers and self-publishers are very unlikely to recoup the additional cost of adding rich media and interactivity to their eBooks. His cost estimates may be off, but his logic is correct. The fact is that most "enhanced" eBooks to date have sold poorly. If you're creating a native app for iOS or Android and you have to hire developers to do it, that costs money. Even if you stick with, say, Apple's EPUB extensions for rich media, Barnes & Noble's extensions for Nook Kids, Kindle Format 8 (when it becomes available to all publishers) or, in the not-too-distant future, EPUB3, there's still a cost in time and money to add interactivity and rich media. For now, at least, you're unlikely to earn back that cost. Thus, the most reasonable approach is to create conventional eBooks.
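To make Godin's arithmetic concrete, here's a minimal back-of-the-envelope sketch in Python. The production costs are the ones Godin cites above; the $7 net revenue per copy and the $10,000 conventional production budget are my own assumptions, used purely for illustration.

```python
# Break-even sketch for an "enhanced" eBook versus a conventional one.
# The enhanced-production costs come from Godin's quote ($50,000 for app
# coding, $10,000 for a director, $500,000 for bespoke interactive work);
# the conventional budget and the net revenue per copy are assumed figures.

def breakeven_copies(production_cost, net_revenue_per_copy):
    """Copies that must be sold to recover the up-front production cost."""
    return production_cost / net_revenue_per_copy

NET_PER_COPY = 7.00  # assumed publisher net per copy sold

conventional = breakeven_copies(10_000, NET_PER_COPY)                 # assumed editing/design budget
enhanced = breakeven_copies(50_000 + 10_000 + 500_000, NET_PER_COPY)  # Godin's cited costs

print(f"Conventional eBook: ~{conventional:,.0f} copies to break even")
print(f"Enhanced eBook:     ~{enhanced:,.0f} copies to break even")
```

At roughly 80,000 copies sold just to break even under those assumptions, very few enhanced titles would ever recover their costs--which is exactly Godin's point.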
Your market value is what your customers or clients will pay for your time. In the case of a self-publisher, it's the income that you get from your title divided by the number of hours you spent working on it. If that number doesn't satisfy your financial requirements, you have to increase the number of copies you sell, change your pricing, or do something else that pays more money.
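Put as a rough formula (the dollar and hour figures below are hypothetical, used only to illustrate the calculation):

```python
# Effective hourly rate for a self-published title: total income from the
# title divided by the hours spent writing, editing and producing it.

def effective_hourly_rate(title_income, hours_spent):
    return title_income / hours_spent

# Hypothetical example: $6,000 earned on a title that took 400 hours
# works out to $15 per hour, before any expenses.
print(effective_hourly_rate(6_000, 400))  # 15.0
```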
At some point, enhanced eBooks will "crack the code" and become widely popular, and the additional front-end expense to produce them will be justified. Today, however, that's not the case.
Labels: Android, Barnes and Noble, eBook, EPUB, Ingram, iOS, Kindle, Lightning Source, Long Tail, nook, Prentice Hall, Seth Godin, Teleread, Ventana