Thursday, December 26, 2013

2014: The year of 3D printing?

I've had a chance to read Make's 2014 Ultimate Guide to 3D Printing, and if you have even a casual interest in the subject, you should buy a copy. Compared with last year's version, the 2014 edition has many more printers, a wider range of technologies (fused filament and optical resin printers, delta printers, and printers combined with CNC mills,) digitizers (to create 3D models for printing,) extruders (for making your own filament from plastic pellets,) and a wider range of plastics. Looking at the cover of Make's guide, you'd think that consumer 3D printers are "ready for prime time"--that is, until you look at the actual printers and read the reviews. Then, you see products that look like they were built with Tinkertoys and parts from an Erector set. Most printer beds need to be manually leveled to ensure that parts print correctly. Some printers only use one kind of plastic, or need to be carefully cleaned when switching from one plastic to another. Some come with little or no documentation. Make doesn't discuss whether it had any experience getting malfunctioning printers serviced, other than being assisted in fixing the problems itself.

Make does point out some printers that it describes as having "just hit print" simplicity, but even those printers had problems: Poor print quality compared to competitors' models at a similar price, filament jams and overheating, and software problems. There's nothing on the market that compares to conventional laser printers for ease of installation and use. Even the simplest 3D printer requires significant skill and manual intervention to use. Nevertheless, at the end of 2013, it's easy to see that 3D printer manufacturers are improving their products by leaps and bounds. Prices are dropping--for example, the Printrbot Simple, a fully-assembled, fully-functional 3D printer that's a great entry into the world of 3D printing, costs only $399. Some 3D printers can turn out significantly bigger parts than they could last year. The "simple to use" models are indeed far simpler than the models from a year ago. Resin printers offer the potential of significantly higher-quality printing. Software is improving--in some cases, faster than the printers themselves.

Unless you're prepared to do a lot of the work yourself, I'd hesitate to buy any of the models available for sale today. However, there's enormous potential for the 3D printers in Make's 2015 guide to be faster, smarter, able to handle more materials and, most importantly, much easier to use than this year's models.

Sunday, December 22, 2013

NSA: The world's biggest hoarder?

It dawned on me this weekend that the NSA is exhibiting classic hoarder behavior. TV shows such as "Hoarders" and "Hoarding: Buried Alive" visit the homes and apartments of compulsive hoarders, which are inevitably stacked to the ceiling with everything you can imagine--magazines, books, cats and dogs, fingernail clippings, used pizza boxes, etc. The rationalization often given by compulsive hoarders is that they're keeping these things in case they need them someday.

That's exactly the same rationalization that the NSA has used for many of its data collection programs. The agency is running hundreds of programs under nondescript codenames, vacuuming up telephone call metadata, emails, texts, tweets, browser histories, etc., in the hope that they may be useful for stopping a terrorist plot. There are so many programs, operating under so many different sets of rules, that analysts at the NSA can't keep track of them all. Having run out of space to keep all the data in its existing data centers, the NSA is spending billions of dollars to build new data centers in Utah and Maryland. That's like a hoarder renting storage units when they have no more room in their house.

Last week, Judge Richard Leon ruled that the NSA's program of storing all the phone call metadata for every person in the country for five years is likely to be a violation of the Fourth Amendment. In his ruling, Judge Leon wrote that neither the NSA nor the Justice Department had presented any evidence that the NSA's massive, multi-year phone metadata collection program had contributed to thwarting or solving a single case of terrorism.

The NSA has explicitly argued that it needs to keep billions of phone records (and, by extension, everything else) for years because it wants to be able to go back through them if necessary. Judge Leon wrote that, to date, the NSA hasn't found anything useful in the phone records, and there's no justification for the agency's massive violations of the Fourth Amendment. That sounds an awful lot like compulsive hoarding behavior.

Would the NSA have been a lot more selective in its data collection if General Alexander weren't running it? I suspect so...and given what he's done at the NSA, I wouldn't want to visit the General's house.

Saturday, December 21, 2013

Can we PLEASE do something about credit cards?

If you live in the U.S., you've probably heard about the theft of as many as 40 million credit and debit card numbers from Target customers between November 27th and December 15th. As with so many of these thefts, the first public disclosure came not from the merchant or card processor that lost the data, but from a third-party source. In Target's case, it was security researcher Brian Krebs who pieced together the story. Krebs buys credit and debit card numbers and other personal information from "darknet" sources on behalf of banks and other clients, and he noticed that a flood of numbers that apparently came from Target were available for sale. Theft of credit and debit card information has become a common occurrence in the U.S., and some researchers claim that as few as 5% of thefts ever get detected and disclosed publicly.

When I heard about the Target theft, I checked my banking records, and sure enough, I used my debit card there a couple of times during the period in question. So, yesterday, I drove over to my local bank branch, cancelled my debit card and got a new one. That was the third time in a little more than a year, and the second time in two months, that I had to cancel my debit card and get a new one. The first time was a scam at Barnes & Noble stores that involved the replacement of point-of-sale credit card terminals in dozens of stores with hacked versions that sent complete transaction information, including PINs, to hackers. The second time was due to the hack of Adobe's transaction processing system earlier this year, and now, it's Target for the trifecta.

Barnes & Noble, Adobe and Target are responsible for their security failures, but banks share some responsibility as well. These kinds of data losses are almost unheard of in Europe, where banks issue smart cards to their customers. Smart cards use two-factor authentication to ensure that only the proper owner is using them, and encryption to keep anyone except the bank authorizing payment from either intercepting or saving the account information. Smart cards aren't in wide use in the U.S. because they're significantly more expensive than magnetic stripe cards, but, using me as an example, I have to believe that a single smart card is less expensive than six magnetic stripe cards (three temporary and three permanent replacements) plus the time of bank tellers, managers and phone customer service personnel spent processing and issuing those replacements. (Update, 12/22/13: According to Brian Krebs, reissuing a magnetic stripe credit or debit card costs from $3 to $5; Gemalto, one of the biggest smart card vendors, says that the average cost for a smart card with a microprocessor is $3.72. Even if that number is on the low side, it means that banks would be ahead of the game, or would at least break even, with smart cards vs. replacing mag stripe cards.)
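
To put the update's numbers against my own experience, here's the arithmetic as a quick sketch (the per-card figures are from Krebs and Gemalto as cited above; the six-card count is just my own case):

```python
# Back-of-the-envelope comparison of repeatedly reissuing magnetic
# stripe cards vs. issuing one chip ("smart") card up front, using the
# figures cited above: $3-$5 per mag stripe reissue (Krebs) and $3.72
# per smart card (Gemalto). The six-card count is my own case: three
# temporary and three permanent replacement cards.
MAG_STRIPE_REISSUE = 4.00   # midpoint of the $3-$5 range
SMART_CARD = 3.72
CARDS_REISSUED = 6

print(f"Mag stripe replacements: ${MAG_STRIPE_REISSUE * CARDS_REISSUED:.2f}")  # $24.00
print(f"One smart card:          ${SMART_CARD:.2f}")                           # $3.72
# ...and that ignores the teller, manager and call-center time entirely.
```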

Whether it's an encryption-based system or a "one-time pad" approach where the customer gives the merchant an account number issued by their financial institution that's good for only one transaction and is useless if anyone tries to use it again, the U.S. needs to move to a more secure and reliable method for credit and debit card transactions. The system we have now is no more secure than the weakest transaction system used by any merchant--which means that we have almost no security at all.
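
To make the "one-time pad" approach concrete, here's a minimal sketch. Every name in it is hypothetical, and a real issuer would also bind each number to an amount, merchant and expiration, but it shows why a stolen number would be worthless:

```python
# Minimal sketch of the "one-time number" idea described above: the
# issuing bank hands the customer a random account number that is good
# for exactly one transaction. All names here are hypothetical.
import secrets

class Issuer:
    def __init__(self) -> None:
        self._active: dict[str, str] = {}  # one-time number -> real account

    def issue_number(self, account_id: str) -> str:
        number = secrets.token_hex(8)  # stand-in for a 16-digit card number
        self._active[number] = account_id
        return number

    def authorize(self, number: str) -> bool:
        # A number authorizes exactly once; replaying it fails.
        return self._active.pop(number, None) is not None

issuer = Issuer()
number = issuer.issue_number("alice-checking")
print(issuer.authorize(number))  # True  -- the legitimate purchase
print(issuer.authorize(number))  # False -- a thief replaying the stolen number
```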

Saturday, November 09, 2013

Intel acquires eTextbook distributor Kno, co-founder departs

TechCrunch reports that Kno, a higher-education eTextbook distributor, has been acquired by Intel. It will become part of Intel's global education program run by John Galvin, a Vice President of Sales and Marketing at Intel. The acquisition terms weren't released. (Update, November 11: On GigaOm, Om Malik wrote that according to his sources, Intel acquired Kno for $15 million plus some retention bonuses for employees. Kno's equity investors put $73.4 million into the company, and Kno also took on $20.3 million in debt, so Malik believes it's likely that the investors will see little or no return.)

Kno's initial plan in 2009 was to develop and launch its own higher-education oriented tablet, for which the company would sell eTextbooks. The tablet had dual 14.1" touchscreens and used both touch and stylus input. However, less than a year after the company was founded, Apple introduced its original iPad, which dramatically changed customer expectations for how big tablets should be and how they should work. Kno reportedly only shipped a handful of its tablets to customers in late 2010 before giving up on its proprietary design and selling it to Intel in April 2011 for $20 million (which was structured as an equity investment.) Kno subsequently focused on software, and developed eReaders for the iPad and Android tablets, Windows 7 and 8, and in-browser use. According to Wikipedia, Kno began selling content for the K-12 market in April 2012, but the company isn't a significant player in that market.

Once Kno dropped its proprietary hardware, it became just another eTextbook vendor, and had to compete with far more established players including Follett, Barnes & Noble, Amazon, CourseSmart and Chegg, which was co-founded by Kno co-founder Osman Rashid. To date, eTextbooks have been poorly accepted by college students, and unlike its competitors, Kno doesn't also offer print textbooks, which puts it in an even worse competitive position.

TechCrunch reports that Kno's investors, led by Andreessen Horowitz, have been pushing the company for several months to find a buyer or some other way to exit. One interesting fact is that a difference of opinion between Kno CEO Rashid and Intel VP Galvin led to Rashid's departure when the acquisition was announced; Rashid wanted to continue focusing on the North American market, while Galvin wants to focus on international markets. Given how competitive the U.S. and Canadian markets are, an international focus for Kno makes a lot of sense. However, it's not clear to me what Intel brings to the party, other than money. The company has never sold content successfully and pulled the plug a week ago on its biggest investment in content to date, its OnCue over-the-top video content service. It also appears that Intel did nothing with the tablet designs that it acquired from Kno in 2011. Intel wrote that "The acquisition of Kno boosts (our) global digital content library to more than 225,000 higher education and K-12 titles through existing partnerships with 75 educational publishers." Kno claims on its own website that it has more than 200,000 titles, so its collection is responsible for all or nearly all of Intel's "global digital content library."

My bet is that within a couple of years, Intel will shut down or sell off Kno, just as it's rumored to be selling off its OnCue business to Verizon for a small fraction of what it invested. Intel isn't a content company, its management doesn't understand content, and in my opinion, it should make investments in content companies but leave content production and distribution to others.

Tuesday, October 29, 2013

The future of public radio isn't its past

A couple of weekends ago, I attended "Radiolab Apocalyptical," a live show presented by Radiolab hosts Jad Abumrad and Robert Krulwich at the landmark Chicago Theatre. For those of you who aren't familiar with Radiolab, it's a weekly hour-long public radio show that covers science topics in an incredibly engaging way. The Chicago Theatre was packed, but what was really interesting was who it was packed with. The audience was young--most of the people I saw were in their 20's and 30's--not at all what you'd expect from a public radio audience in the U.S. Cast and crew members from NBC's "Chicago Fire" and "Chicago PD" tweeted pictures of themselves backstage at "Apocalyptical."

For much of the last two decades, public radio has been wrestling with a big problem: Its audience, while growing, is getting older. Twenty years ago, I was a member of one station's "big donor" club, and I attended several of its events. Other than the radio station's staff, I was by far the youngest person attending; most of the attendees appeared to be in their 60's or 70's. The public radio audience is, quite literally, dying off. Listeners have historically been older, well-educated and high-income. That's great when you're seeking donations, but not so good when you're trying to attract younger listeners and get them to become members for decades.

Public radio stations have been torn between keeping their older audiences and attracting younger ones. That becomes more difficult when even the hosts of shows that attract older listeners start retiring. Tom and Ray Magliozzi, the hosts of "Car Talk," one of the most popular shows on public radio, retired last year. Instead of cancelling the show or taking a chance on new hosts, WBUR Boston and National Public Radio (NPR) decided to rerun old episodes. Some of the episodes being rerun are 20 years old, and since they cover car repairs, Tom and Ray are talking about cars that many listeners may not have ever heard of. (Geo Metros, anybody?) Just about every public radio station runs Car Talk at least once a week; some run it several times a week.

Another show that's very popular with older audiences is "A Prairie Home Companion." Creator and host Garrison Keillor caused panic at public radio stations when he told the AARP in March 2011 that he planned to retire from the show this year. In December of that year, he changed his mind, and now says that he'll keep doing the show as long as he "love(s) doing it."  "Car Talk" and "A Prairie Home Companion" are two examples of public radio shows that appeal almost entirely to an older audience, but they're far from the only ones.

There are shows on public radio that draw younger audiences; "Radiolab" is one, along with "This American Life," "The Moth Radio Hour," "Wait, Wait, Don't Tell Me" and Roman Mars' "99% Invisible." But, shows for younger listeners are far outweighed, both in number and in scheduling, by those that target older listeners. If public radio stations are serious about attracting younger listeners, the first thing they can do is cancel "Car Talk" and put a show for younger listeners in its place. There's no excuse for giving prime radio time to 20-year-old reruns of a radio show that many listeners found banal when the shows were new.

And, while we're at it, let's get rid of pledge weeks (a misnomer, because most of them now run two weeks.) Pledge weeks give regular listeners to public radio a strong disincentive to contribute, because everyone has to put up with having their favorite shows interrupted whether they pledge or not. With the wide availability of smartphones that can receive streaming audio, public radio stations can offer subscribers apps that would enable them to listen to uninterrupted programs, so long as they maintain their subscriptions. Pledge weeks would continue on the broadcast station, but paid-up members could avoid them by listening through their smartphones. That's what younger listeners are doing already with Pandora, Spotify, iHeartRadio, Slacker and other services. These services typically offer free and paid versions; with the paid version, subscribers have to sign in with a username and password, but they get their music or talk commercial-free.
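
Here's a minimal sketch of the membership gate such an app would need; the station URLs, username and dates are all invented for illustration, and a real app would check the station's membership database after the listener signs in:

```python
# Hypothetical sketch: a paid-up member gets the pledge-free stream,
# everyone else gets the regular broadcast stream (pledge breaks and
# all). Names, URLs and data are illustrative only.
from datetime import date

# username -> membership paid-through date (stand-in for a real database)
MEMBERS = {"listener42": date(2014, 6, 30)}

BROADCAST_STREAM = "https://station.example.org/stream/broadcast"
PLEDGE_FREE_STREAM = "https://station.example.org/stream/pledge-free"

def stream_url_for(username: str, today: date) -> str:
    paid_through = MEMBERS.get(username)
    if paid_through is not None and today <= paid_through:
        return PLEDGE_FREE_STREAM
    return BROADCAST_STREAM

print(stream_url_for("listener42", date(2013, 10, 29)))  # pledge-free
print(stream_url_for("stranger", date(2013, 10, 29)))    # broadcast
```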

Public radio already has the shows to attract younger audiences (although it could have a lot more,) and it has the technology to do away with pledge weeks for listeners on smartphones. If public radio stations are serious about getting younger listeners to become members, the ball is in their court.

Wednesday, October 23, 2013

The unbearable misery of upgrading to Windows 8.1

There are some events that we humans know from experience will be painful: Childbirth, passing a kidney stone, or getting a colonoscopy without anesthesia. After last weekend, I can now add, without hesitation, upgrading Microsoft Windows. I've been doing Windows upgrades since Windows 3, and it's always been relatively straightforward: You either do a straight upgrade that includes everything, or you do a clean install of the new version, restore your files and reinstall your applications.

Things got muddled with Windows 7. I, like many people and companies, decided to skip the abomination against nature that was Windows Vista and stay with Windows XP. That turned out to be the right decision, but when Microsoft introduced Windows 7, even though it really wanted XP users to upgrade, it prohibited direct upgrades from XP. That meant that XP users could move files, folders and some settings over to Windows 7, but they had no choice other than to reinstall all of their applications. In my case, I had a huge number of applications that had accumulated over the years that I was running XP. So, I nursed my old system along until last winter, when it finally became too slow to be useful, and then transferred what I could to a newer PC running Windows 8 and reinstalled my apps.

Then came Windows 8.1, specifically the Windows 8.1 Preview, which I installed when it was released. I hoped that Windows 8.1 would solve some of the worst problems with Windows 8, starting (no pun intended) with the Windows Start menu. Although the Preview never fixed the Start menu issue, it did resolve some of Windows 8's other aggravating design flaws. However, when it came time to upgrade to the final release of Windows 8.1, the Preview turned out to be much more trouble than it was worth.

Here's a step-by-step summary of the update:
  1. Bought a new Toshiba Satellite laptop with a current-generation Core i7 processor and 16GB of RAM to replace my old Samsung first-generation i3 laptop.
  2. Installed the Windows 8.1 Preview on the new machine to make it simple to transfer everything from my old machine to the new one.
  3. Learned that Windows 8.1 Preview cannot be updated to the final release version of 8.1 without having to reinstall all the applications, so I put everything on hold for almost a month until the final version of Windows 8.1 was released.
  4. Last weekend, tried to install the final Windows 8.1 on my new machine from the Windows Store.
  5. Found that I couldn't install the final version on that machine without a serial number, and it would cost more than $100 to get one, so I figured out how to roll the new machine back to Windows 8. That also reinstalled all the crapware I had removed from the machine, but at least it allowed me to install Windows 8.1 for free.
  6. Purchased a to-remain-nameless PC transfer program and installed the program on both machines. The software required that I uninstall all Norton software from both PCs, so I got all of the Norton software off my old machine, and all the preinstalled Norton crapware off the new one except for one anti-theft program that failed to uninstall.
  7. Ran the PC transfer program, which failed instantly with a cryptic error indicating that the registry on my old machine was corrupt.
  8. Sent diagnostic file from old machine to vendor of PC transfer software, and received reply saying that I had to send diagnostic files from BOTH machines, even though there was no way to do so from the old machine using the vendor's program, and oh by the way, I still had some Norton software on the new machine, and I had to remove ALL OF IT.
  9. Manually removed every remaining Norton-related file I could find on the new machine and tried the transfer again. It again failed instantly with the same cryptic error.
  10. Checked the registry on the old machine and downloaded software to send a diagnostic log from that PC to the transfer software vendor. Sent diagnostic logs from both machines. Received a reply the next day from the same tech support engineer that was word-for-word identical to the previous one, except that he highlighted the note that I HAD TO UNINSTALL ALL THE NORTON SOFTWARE.
  11. Reinstalled Norton on the old machine so that I had some protection on the Internet, then purchased and downloaded a copy of Laplink PCMover Professional. Installed PCMover Professional on both machines. Threw the unnamed transfer software into the garbage.
  12. PCMover's documentation was much less cryptic than that from the unnamed transfer software, but it required me to disable a fair amount of software (Internet security, backup, defragmenter, scheduled tasks, etc.). Disabled everything relevant I found on both systems and started the transfer.
  13. The transfer took about six hours, but it worked. All but a handful of programs from my old machine transferred over, and PCMover was very clear about what it could and couldn't transfer.
  14. Once I started working on the new machine, I found out that not everything that transferred worked as expected. For example, Google Chrome worked fine on the new machine, but it couldn't update. The solution was to remove all my Google applications, wait an hour, and then reinstall them, which fixed the problem.
Step 5 started last Friday evening; it took until Monday night to get through Step 14. So, what are the lessons to be learned from this experience?
  1. Never install a Windows Preview or Beta version on your primary PC. Even if you're a developer, load it on a PC that you can afford to wipe when the Gold Master or Final Release is available.
  2. Use Laplink to do the system transfer when you're not doing an in-place upgrade, and avoid transfer software from companies you've never heard of, even if they have good reviews. You don't know if those reviews were paid for.

Sunday, September 08, 2013

Can I have the version without backdoors and built-in defects?

Last week, The Washington Post reported that the NSA has gotten hardware and software vendors to implement backdoors and exploitable design defects in their products. In one case, the NSA learned that an unnamed country had placed an order with an unnamed U.S. vendor for networking hardware, and the vendor agreed to install NSA-designed backdoors in that hardware.

Despite the fact that the NSA has no business doing mass interceptions of Internet communications and phone calls that both originate and terminate in the U.S., I'm not terribly worried about the NSA using backdoors into my network hardware and software. However, the problem with backdoors and intentional design defects is that anyone that can find them can use them. By making its job of penetrating networks easier, the NSA has also made hackers' and foreign countries' jobs of penetrating those same networks easier. They've compromised everyone's data security. This is what's called an unintended consequence, but it's very real.

The NSA may very well have also compromised the U.S. Government's own security. Consider that compromised hardware and software may be in use at U.S. defense contractors, and that those "engineered defects" could be exploited by China, Russia and who knows who else. That's why it's so important that companies not give in to the NSA's demands to build backdoors and design defects into their hardware and software, and that the NSA not make the requests in the first place.

Wednesday, September 04, 2013

Samsung's Galaxy Gear Smartwatch: A placeholder for the real thing?

Yesterday, Samsung announced its thoroughly leaked Galaxy Gear smartwatch at simultaneous presentations in New York and Berlin. It immediately became the primary topic of conversation on cable business news channels. On Bloomberg TV, as an example, the split of analysts' and reporters' opinions about the Galaxy Gear was, by my count, about 25% positive and 75% negative. I've come down on the negative side as well, because at the end of the day, the market for a device with all of its limitations seems very small:
  • At its release on September 25th, the Galaxy Gear will work with only two devices, both of which are also being released the same day: The Galaxy Note 3 "phablet" and the updated Galaxy Tab 10.1 tablet, both of which run Android 4.3, which is apparently required for the Galaxy Gear to work. The Galaxy S4 smartphone will be updated with Android 4.3 some time in October, but if you own an earlier Samsung smartphone or tablet, or an Android device from any other vendor, you're out of luck. (Compare that to the Pebble smartwatch, which works with a variety of Android and iOS devices.)
  • The Galaxy Gear has to be recharged every day. That means taking the custom charger base with you whenever you go on a business trip. (Again, compare it to the Pebble, which runs for seven days on a full charge.) Your average quartz watch needs a new battery once a year or so.
  • The Galaxy Gear has a built-in 720p 640 x 640 camera, but it's built into the band. The camera's position is fine for taking pictures of other people, but you can't take a selfie without twisting your wrist into an unnatural position, which doesn't allow you to view the watch's screen and take the picture simultaneously.
  • As a Mashable editor pointed out on Bloomberg TV, the Galaxy Gear is too big for most women's wrists--which immediately eliminates more than 50% of the potential market.
  • Reviewers are criticizing the performance of the Galaxy Gear's user interface, which is most likely caused by the use of an 800 MHz processor in order to keep power consumption down. 
  • The Galaxy Gear is priced at $299 (U.S.). That's twice the price of the Pebble, and the same price as a high-end smartphone on a two-year contract from most carriers. How many people are likely to buy a smartwatch that adds very little functionality and is the same price as (or even more expensive than) their smartphone?
It's very likely that Samsung already knows that the Galaxy Gear has all these problems, but it's more important to get customer feedback on the first-generation model--and to get it out before Apple--than it is to get everything right. For now, the target market, consisting of obsessive-compulsive male contortionists who'll charge their Galaxy Gear every day, are prepared to buy a new smartphone or tablet in order to use it and don't care about performance or how much money they spend, seems awfully small.

Wednesday, August 21, 2013

You can help save drive-in movies

Bloomberg reports that drive-in movie theaters, which once were a central part of suburban life in the U.S., have passed the point of being an endangered species and are headed toward extinction. At their peak, there were more than 4,000 drive-in theaters operating in the U.S.; today, there are only about 360. The rest have been converted into swap meets, bulldozed to make room for strip malls and subdivisions, or simply leveled and left to turn back into grassland.

210 of the remaining 360 drive-ins are threatened with closure if they can't replace their film projectors with digital models. Motion picture distributors have just about stopped sending their movies to theaters in the form of film reels; instead they typically send reusable hard drives loaded with encrypted versions of movies. Digital projectors cost $50,000 to $100,000, depending on their throw (how far they can project a viewable image) and resolution.

Honda has stepped up to help with a promotion called "Project Drive-In," which will give five digital projectors to drive-in theaters chosen by voters on the projectdrivein.com website. The site also has links to a separate Indiegogo campaign to raise money to help drive-ins with digital conversion. Honda has also produced a video explaining the problem and its program.


It's not just drive-ins that are in danger; art and nonprofit theaters are also threatened. If there's a drive-in, art or nonprofit theater in your city or town, it's worth it to call or stop by and see if they need help with funding their digital conversion. Even ten or twenty dollars could be a big help.

Tuesday, August 06, 2013

Chromecast, Leap Motion and Ouya: Experiences and updates

So far this summer, I've gotten several new gadgets; here's the rundown and my experiences so far:

Ouya

I was an Ouya Kickstarter backer and got my system a few weeks before it arrived on store shelves. At the time, I wrote that the software was very rough, with frequent freeze-ups, a confusing user interface and a limited selection of games. Now, after untold firmware updates, a factory reset and many restarts, my Ouya is finally reasonably stable. It still has a tendency to crash in the middle of games, but instead of freezing up, it returns to a point where it's easy to recover. I still wouldn't recommend that gamers rush out and buy one, but it looks likely that Ouya's firmware will be solid and a good selection of games will be available for the Holiday season.

Leap Motion

I put a pre-order in for a Leap Motion sensor last year, and mine arrived a couple of weeks ago. The Leap Motion device plugs into a USB port on a Windows or Macintosh PC, and it senses the position and motion of hands and fingers in a space above and around the sensor. A collection of apps that take advantage of the sensor is available on the Airspace website.

In general, I'm very disappointed with the Leap Motion device. I recognize that it's still effectively a beta product, but based on the demonstrations given to the press, it certainly appeared to be stable and usable. My experience so far is that it's neither. I've tested the sensor on a Core i3-based laptop running Windows 8.1 Pro, and a Core i7-based iMac running OS X Mountain Lion. There's a significant amount of noise in object detection on both machines--when I hold my hands still, the image of the hands jitters in space as if I've got palsy--but the problem is worse on my Windows machine. Both computers also have difficulty detecting all ten fingers unless I hold my hands almost perfectly parallel to the Leap Motion sensor. Tilting either hand (especially my left) more than slightly out of that plane causes fingers to disappear from tracking.

The Airspace apps are nothing to write home about, either: A hand-controlled version of Google Earth included with the package is impossible for me to use. No matter where I start on the globe, I usually end up underwater in an unmarked body of water. A version of Cut the Rope is also included, but it's far easier for me to use a mouse or trackpad to control it than it is to control it with my hands and the Leap Motion sensor. Flocking, another free demo app, uses your hands to control a group of fish. It does absolutely nothing on my Windows system, leading me to believe that Leap Motion's performance is very sensitive to your processor's power. Other Airspace apps, which range from free to $29.99 with most priced at $9.99 or less, tend to be overpriced and underpowered--there are far better free to $0.99 apps in both the iTunes Store and Google Play than anything you'll find in Airspace.

At this point, I can't recommend the Leap Motion sensor. It's still too early in its development cycle, and it has no compelling applications. Anything that you can do with Leap Motion, you can do faster and more accurately with a mouse or with your finger on a touchscreen. If Leap Motion improves the stability and performance of its software, integrates it better with Windows and OS X, and improves its usefulness in 3D applications, I'll be able to recommend it.

Chromecast

I got a Google Chromecast Internet video player last weekend, and of these three devices, it's clearly the best thought-out and most mature. Chromecast is a $39 dongle that attaches to an open HDMI connector on your HDTV or A/V receiver. It gets its power from an included USB wall wart. There's virtually no hardware configuration needed on the Chromecast:
  1. Plug the Chromecast into an open HDMI port
  2. Plug in the power supply and attach the USB cable between the power supply and the Chromecast
  3. Switch your HDTV to display the signal from the port to which the Chromecast is attached
  4. Go to http://cast.google.com/chromecast/setup and download the setup app for your operating system
  5. Follow the instructions in the setup app to save your Wi-Fi router's SSID and password in your Chromecast
It was when I got to step 5 that I hit my first (and only) glitch: The Chromecast has to be on the same Wi-Fi network as all the devices that want to use it. I was running on a 5GHz 802.11ac network (which isn't supported by Chromecast,) and my Windows PC couldn't find the device. However, a couple of clicks later, the setup app switched my PC to a 2.4GHz connection, found the Chromecast and configured it for my network.

On the Chromecast setup website, you can also download and install a Chromecast extension for Google's Chrome browser. Once your device is talking to the Chromecast, you can send any webpage (even a webpage with video, such as Hulu or CBS.com) to your HDTV. If you're not using a Chrome browser, you can still use your Chromecast to watch videos from any site that supports it. YouTube currently supports Chromecast with a small icon in the bottom right-hand corner of each video; click the icon, select your Chromecast device, and in a few seconds, the video will start playing on your TV. In addition to YouTube, Netflix and Google Play currently support the Chromecast; other sites are planning to add support in the near future.

Monday, August 05, 2013

Mr. Bezos buys a newspaper...and you know what happens next

Earlier today, The Washington Post Company announced that Jeff Bezos, founder of Amazon.com, has agreed to purchase The Washington Post and affiliated newspapers, along with the printing plant, for $250 million. The sale came as a shock, especially to many Post employees, who hadn't been told that the paper was for sale. For his part, Bezos said that the current publisher and executive editor will stay in their jobs, and that no layoffs are contemplated as a result of the transaction.

What, I wondered, would be the reaction of The New York Times, the "newspaper of record," the "Grey Lady"? The Times has run an increasingly shrill series of articles about Amazon, so I was interested in seeing how the newspaper handled the Washington Post story. As expected, the Times turned it into evidence of some sort of a nefarious scheme.

The Times gave the story to Nick Wingfield and David Streitfeld. I've written about Streitfeld previously, and called him a hack whose sole purpose at the Times seems to be doing hatchet jobs on Amazon. This evening's story did nothing to change my opinion of either his objectivity or his writing skill. The article--about one of the biggest media stories of the year--contains a total of two quotes: a generally laudatory one from Rob Glaser of RealNetworks, and another from Dennis Johnson, who runs publisher Melville House and its blog Moby Lives. Mr. Johnson, whose negative position on Amazon is well-known in the book industry, has been Streitfeld's go-to source for negative quotes about the company. Here's a quote from the article:
Critics of Amazon were aghast at the news of The Washington Post purchase, saying it would further increase the power of a company and a tycoon they think already has too much of it. Although Mr. Bezos and not Amazon bought The Post, rivals and critics were already concerned that the newspaper’s work would be used to help Amazon. 
“It’s an old boring story — rich man buys a newspaper — but in this instance it’s one of the richest men ever buying one of the most important newspapers ever, which is the one our government leaders read first thing every morning,” said Dennis Johnson, the co-founder of Melville House, a well-regarded small publisher. “This is the capper in the development of one of the most powerful vertical monopolies in our history, which is also one of the most controlling in matters of cultural concern.”
It doesn't take a trained journalist to see how "over the top" Johnson's quote is. (For the record, Jeff Bezos ranks 16th on Bloomberg's Billionaires list, behind Warren Buffett (who's actively buying newspapers,) the Koch Brothers (who are rumored to be interested in buying some or all of Tribune's newspapers,) and the Walton family heirs (whose Walmart has been accused of its share of sharp practices,) among others.) Johnson sees a conspiracy under Amazon's rock--but I don't recall him reacting the same way when Barnes & Noble and Borders divided up the retail bookstore business and caused literally thousands of independent bookstores to go out of business. Incidentally, his books continue to be on sale at Amazon.

That leads me to wonder: Why does the Times have it in so badly for Amazon? There are other companies that sell online without collecting sales tax. There are others that have pushed their suppliers around to get preferential pricing, terms and conditions. There are others that are more dominant in their industries than Amazon is. There are others that have been accused of poor working conditions in their factories and warehouses. Is the Times trying to curry favor with one of Amazon's competitors? Is it getting payback for a business deal gone wrong? From where I sit, it's impossible to know.

Aereo: If you can't beat 'em, buy 'em?

Let's set the War of Retransmission between Time Warner Cable (TWC) and CBS aside for a minute, and consider another battle that CBS is fighting: The one with Aereo, the company (backed by IAC/InterActiveCorp) that's deploying hundreds of thousands of tiny antennas in cities across the country to allow consumers to watch broadcast television over the Internet. So far, Aereo has prevailed in court against CBS, Fox, ABC, Univision, NBC and local broadcasters.

Aereo is paying nothing for retransmission rights, and if it prevails in lower court, the television networks will have a potentially years-long court battle to shut the service down, followed by fighting to enact legislation against Aereo if the court battle fails. In the worst case, Aereo could provide a model for anyone to get into the over-the-top Internet television business without paying a dime to broadcasters.

Let's return to the War of Retransmission. Earlier today, Time Warner Cable made CBS an offer: Instead of including CBS as a standard part of all of its cable packages, TWC would make CBS and its related channels "a la carte," just like HBO or CBS's Showtime. CBS could charge whatever it wants; TWC would collect the fee and pass 100% of it back to CBS.

That brings us to an important point: Broadcast retransmission and cable network licensing fees can't continue to go up indefinitely. We are very close to the point where multichannel video services will be too expensive for the majority of consumers. When that happens, we'll see millions--yes, millions--of cable, satellite and IPTV customers downgrading their service, and quite a few of them cancelling service altogether.

There are two trends that could result in very undesirable outcomes for broadcasters:
  1. Aereo may have a legal way to distribute broadcasters' video without paying them anything for retransmission rights, and
  2. Broadcasters don't have much more room to increase retransmission fees before cable operators and their customers walk away.
Here's a heretical approach that could well solve both problems: A group of broadcasters should buy Aereo, and use it as the basis for their own multichannel video service. By most accounts, Aereo's customer software is well-designed, and the company is scaling up to serve the 60 largest U.S. markets. With Aereo, broadcasters could quickly put a national video distribution system into place. As for the problem with rising retransmission fees, broadcasters could figure out just how much consumers are willing to pay...by charging them directly.

Instead of Aereo's current pricing model ($8 to $12/month,) a broadcaster-owned Aereo could offer channels on an a la carte basis--and consumers could buy the channels that they want. Those broadcasters (and cable networks) that want carriage on Aereo without getting direct compensation could make their channels available as part of the basic Aereo package (which would be priced at perhaps $5 per month) at no additional cost to subscribers. Broadcasters could make all of their digital subchannels (the extra channels made possible by digital television in the U.S.) available through Aereo, either at no charge or as part of their main channel subscription. They could also offer their own bundles and packages in order to launch new channels or increase viewership of existing ones.
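
To make the proposed pricing concrete, here's a small sketch; the $5 base package is the figure suggested above, while the channel names and prices are invented:

```python
# Hypothetical a la carte bill under the broadcaster-owned Aereo model
# sketched above. The ~$5 base package carries the channels that opt
# for reach instead of direct compensation; everything else is priced
# per channel. All channel names and prices are made up.
BASE_PACKAGE = 5.00

a_la_carte = {"network-a": 1.50, "network-b": 2.00, "premium-movies": 9.00}
subscription = ["network-a", "premium-movies"]

monthly_bill = BASE_PACKAGE + sum(a_la_carte[ch] for ch in subscription)
print(f"Monthly bill: ${monthly_bill:.2f}")  # $15.50
```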

The next step would be to add cable channels to Aereo. Some of the most popular free and paid cable channels are owned by the major U.S. broadcast television networks, including ESPN, The Disney Channel, Fox Sports 1, FX and Showtime. The only broadcast network that probably wouldn't participate is NBC (which owns USA, Bravo, Syfy, E! and others) because it's owned by Comcast, but given NBC's ratings in recent years, that's no loss. With Aereo in broadcasters' hands, it would no longer represent a threat, but instead would be both an offensive and defensive weapon: Offensive, in its ability to attract revenue away from existing video distributors, and defensive, by being offered to consumers for free in markets where a cable operator or other video service is threatening to drop a broadcast network over retransmission fees.

Do I think that a group of broadcasters is likely to buy Aereo? No. Do I think that they should seriously consider it? Yes.

Time Warner Cable offers a la carte option to CBS

The Verge is reporting that Time Warner Cable CEO Glenn Britt has made a new proposal to CBS in the form of an open letter (PDF). Time Warner Cable stopped carrying CBS's television stations and cable networks last Friday (except for CBS Sports Network, which is covered by a separate agreement.) Trade reports indicate that Time Warner Cable is currently paying $1.00 per subscriber for CBS, close to CBS's industry average of $0.85 per sub. CBS is reportedly asking for $2.00 per subscriber.

Time Warner Cable's new proposal gives CBS a choice: it can accept TWC's previous final offer, which would keep CBS available to all subscribers, or it can have CBS and its affiliated networks offered to individual subscribers on an "a la carte" basis. Under the a la carte option, CBS would be able to charge Time Warner Cable's subscribers whatever it likes, and would get 100% of the revenue.

The choice between Time Warner Cable's offers isn't clear, because we don't know for certain how much TWC has offered to pay, how much CBS is looking for, or how many of TWC's subscribers are likely to pay separately for CBS if it goes a la carte. That last number also affects how much CBS can charge advertisers, since it would only be able to count the subscribers who actually pay extra for CBS as part of its potential audience, not the entire Time Warner Cable subscriber base. Here's a quick example:

Time Warner Cable had 11.7 million video subscribers as of June 30, 2013 (PDF). Assuming that CBS gets $1 per subscriber per month, CBS would get about $11.7 million per month from TWC, not including Showtime, CBS Sports Network or its other premium networks. At $2 per subscriber, CBS would have to sign up at least 50% of TWC households in order to bring in the same revenue--but it would actually have to sign up more, because instead of selling to a single customer (TWC,) it would have to sell to 11.7 million households, and there would be significant marketing and promotion costs. CBS would pay TWC for some of that promotion; other parts would have to be done by CBS through its own stations and networks, and through other media.
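
Here's that break-even arithmetic as a quick sketch (the trade press quotes these fees per subscriber per month, but the 50% result holds for any period):

```python
# Break-even for CBS going a la carte on TWC, using the figures above:
# 11.7 million TWC video subscribers, ~$1.00 per subscriber paid by TWC
# today, vs. a hypothetical $2.00 a la carte fee paid only by opt-ins.
TWC_SUBSCRIBERS = 11_700_000
CURRENT_FEE = 1.00       # $/subscriber, paid for the whole base
A_LA_CARTE_FEE = 2.00    # $/subscriber, paid only by households that opt in

guaranteed = TWC_SUBSCRIBERS * CURRENT_FEE
break_even = guaranteed / A_LA_CARTE_FEE
print(f"Households needed to break even: {break_even:,.0f}")         # 5,850,000
print(f"Share of the TWC base: {break_even / TWC_SUBSCRIBERS:.0%}")  # 50%
# ...and that's before marketing costs and the ad-revenue hit from
# households that don't opt in.
```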

The impact on ad revenues would be based on how many TWC households CBS signs up. Let's say that CBS only signs up 50% of TWC households, or 5.85 million households. That means that there would be almost 6 million fewer households that have the potential to watch CBS's hit shows. Some of those households would switch to other video providers, and some would watch CBS's stations over the air, but there would be an inevitable drop in CBS's ratings in some of the biggest markets in the U.S., including New York and Los Angeles.

CBS has said that it will consider the proposals and respond soon.

Monday, July 15, 2013

After "Sharknado," what's next? Here's a few ideas...

The Asylum's "Sharknado" took over Twitter last Thursday night and continues to be a topic of conversation. We know that they have more shark-themed movies in development, but there's a whole planet full of species with B-movie potential. So, with apologies in advance, here are a few suggestions:
  • Thanksgiving of Terror: This Thanksgiving, it'll be the humans asking for a pardon from the turkeys! A midwest turkey processing plant treats turkeys with an experimental steroid to make them plumper. Instead, the steroid makes them intelligent and improves their flying, so they escape and rain havoc on Chicago.
  • Carpageddon: Carp escape from a gefilte fish plant and wreak havoc on a Hasidic Jewish neighborhood in Brooklyn.
  • Horses from Hell: In the Old West, horses begin trampling and eating their riders. Only one man can stop the horses before they eat the entire Territory of Arkansaw and thwart the country's Manifest Destiny.
  • Pigeons Poisoning People in the Park: Pigeons in New York are mutated by poisons used to kill them. The pigeons survive, and their droppings become highly poisonous to people. First Central Park, and then all of Manhattan, become kill zones for the pigeons.
And here's one that's ripped out of today's headlines:
  • The Twinkie Invasion: After seven months, Twinkies are returned to a grateful nation. However, the new, "healthier" Twinkies contain parasites that take over the minds of people who eat them. Can anyone stop them before every overweight person in America becomes a mindless slave?
Here you are, folks. Go for it!

Tuesday, July 09, 2013

What's really behind the decline in brick & mortar bookstores?

This morning, Bloomberg Television covered yesterday's resignation of Barnes & Noble CEO William Lynch and subsequent management reorganization. Bloomberg showed a bar graph of the decline in the number of bookstores in the U.S., and said that Amazon was responsible for the decline. Yes, Amazon played a part, but there are other reasons that are at least as important:
  • The decline in the number of bookstores began in the 1980s, when Barnes & Noble's and Borders' superstores decimated independent booksellers.
  • U.S. book sales started declining years before the 2007 introduction of Amazon's Kindle and the 2008 Great Recession. People are simply spending less time reading books.
  • eBooks from Amazon and other retailers have cannibalized sales of print books. In other words, eBook sales haven't increased total U.S. book sales revenues--they've only slowed the rate of decline.
So, you've got three factors responsible for the decline in the number of bookstores:
  1. Price competition, which was used by Barnes & Noble and Borders to kill off a large part of the U.S. bookstore industry even before Amazon was founded in 1995 (but which Amazon has certainly used to its advantage.)
  2. Declining book sales, which pressures all booksellers but puts the most pressure on retailers that don't have other product lines to fall back on for revenue.
  3. eBooks, which generally aren't sold in brick & mortar bookstores (although they could be.)

Monday, July 08, 2013

The New York Times takes yet another dump on Amazon

The New York Times' David Streitfeld is at it again. Last Thursday, the Times published an article written by Streitfeld charging that Amazon is using its "monopoly" on book sales to increase prices. The bottom line was that Streitfeld claimed, using completely anecdotal evidence, that Amazon is increasing prices on some academic and small-press print titles; eBook prices are unaffected. How many titles are affected, and the average price increase, either in dollars or percentage, Streitfeld didn't say. He lists price changes for fewer than ten out of several million titles.

Even if Streitfeld is correct, and Amazon is generally increasing prices on academic and small-press print books, there are at least two reasons why they'd do so that have absolutely nothing to do with abuse of Amazon's market share:
  1. The company is compensating for the costs of stocking slower-moving titles. Unlike eBooks, print books take up warehouse space, and have to be physically picked from shelves, packaged and shipped. A slow-selling title takes up space that could be used for a faster-selling title or a completely different type of merchandise. That space has a cost, and the cost is allocated on a per-copy-sold basis, whether or not Amazon passes it on to consumers. Amazon may simply be passing that cost on to consumers in the form of lower discounts (see the sketch after this list).
  2. For the reason stated above, Amazon may be considering not stocking some of these low-selling titles and instead buying them from distributors as orders are received, which would increase its costs. (This was the way that Amazon first started business.) By raising prices now, Amazon can gauge the impact on demand and change course if needed.
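
Here's the cost-allocation argument from point 1 as a sketch; the per-slot cost is made up, and the shape of the math is the point:

```python
# Illustration of the cost-allocation argument in point 1. The annual
# cost of a warehouse shelf slot is fixed (the $12 figure is invented),
# so the slower a title sells, the more of that cost each sold copy
# has to absorb.
SHELF_SLOT_COST_PER_YEAR = 12.00  # hypothetical $/year for one slot

def warehouse_cost_per_copy(copies_sold_per_year: int) -> float:
    return SHELF_SLOT_COST_PER_YEAR / copies_sold_per_year

for copies in (1200, 120, 12):
    print(f"{copies:5d} copies/year -> ${warehouse_cost_per_copy(copies):.2f} per copy")
# 1200 copies/year -> $0.01 per copy
#  120 copies/year -> $0.10 per copy
#   12 copies/year -> $1.00 per copy
```
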
One other point: If you prefer not to buy an academic or small-press title from Amazon, you can always get it from Barnes & Noble or an independent bookseller. If they don't have it in stock, they can order it for you from Ingram, Baker & Taylor or an academic distributor. However, I can almost guarantee you that you'll pay the publisher's list price.

The following day, Streitfeld and the Times came back with another article that makes an even more ridiculous allegation: Amazon's pricing model makes it impossible for customers to determine the "true" price of its books. He says that book prices will be "determined by demand and perhaps by whim." I thought that we lived in a free economy where supply and demand determine prices, and that the law of demand is a good thing. Apparently, according to Mr. Streitfeld, I'm wrong. Consumers should accept whatever prices are set by vendors and buy their goods and services, no matter what the price. Silly me.

Streitfeld also claims that Amazon's pricing policies are now "a radioactive topic with some vendors." For that claim, he uses two publishers as examples: the University of Chicago Press, which refused to answer because its pricing policies are proprietary, and Melville House, which has had a long and very public adversarial relationship with Amazon. Streitfeld suggests that the University of Chicago Press's refusal to discuss the subject is due to fear of retaliation, but it could equally be due to the fact that most companies that sell through distribution don't discuss their pricing policies. For example, Apple is happy to discuss the retail prices for its devices, but ask it how much it charges Best Buy for those products, Best Buy's retail markup over wholesale cost, or its reaction to Best Buy's pricing policies, and all you'll get is stony silence. In addition, discussing pricing can be seen as signaling, which can benefit competitors and can also be used to fix prices. But, apparently, Mr. Streitfeld didn't think those alternate explanations were plausible.

Let's go back to Streitfeld's original argument, that consumers are unable to determine the "true" price of books sold by Amazon. In my experience, Amazon always shows the suggested list price set by publishers for both print books and eBooks, as well as its price and the discount (if any.) Shipping costs are clearly visible when you place the order, and can be adjusted based on speed of service. (Amazon Prime members generally pay nothing for shipping if they're willing to accept two business day service.) Even sales taxes are fairly simple from the consumer's point of view: If you live in a state where Amazon collects sales tax, Amazon will calculate and add the tax to the order. If Amazon doesn't collect taxes in your state, they don't add them to your bill, and it's up to you to determine whether or not your state requires you to pay the sales tax separately. How does any of this obscure the "true" price of books? I know what the publisher has set as its suggested list price, what Amazon charges, the discount off the list price, the shipping cost and (in some states) the sales tax.

I'm going to use two terms that reporters and editors don't like to see, especially in relationship to themselves: In my opinion, David Streitfeld is a hack. There is absolutely no way to read his articles as anything other than what they are: Hatchet jobs on Amazon. Even when he does manage to speak to someone at Amazon, he turns their response into a reinforcement of his argument. I'm not even sure what Mr. Streitfeld's job is at the New York Times. Is he a reporter? If he is, his articles fail any reasonable test of objectivity, and should never make it to print. Is he the Times' columnist in charge of dumping on Amazon? If so, he's doing a great job--but his column should be on the editorial page or in the Op-Ed section, not in an area of the paper where readers expect hard news.

Wednesday, July 03, 2013

(Another) One bites the dust: Boxee sold to Samsung

Yesterday, over-the-top Internet set-top box maker Boxee confirmed that it had been acquired by Samsung. The rumored acquisition price was around $30 million--about the same amount as the company raised in venture financing, meaning that at best, investors got their money back. According to VentureBeat, Boxee will continue to support owners of Boxee Box and Boxee TV devices "for the immediate future," but Boxee's Cloud DVR service will be discontinued on July 10th and recorded television shows will be deleted.

Samsung has been one of the biggest potential customers for set-top box technology companies such as Google and Roku. Now, it's very likely that Samsung will integrate Boxee's technology into future HDTVs, Blu-ray players and other devices. For Roku in particular, the number of companies that are both likely to integrate its technology into their devices and big enough to represent a major business opportunity is dwindling. Sony and Vizio are already Google licensees, and Samsung is now on board with Boxee. Panasonic, Sharp and LG are still in play, but beyond them, the remaining players are second- and third-tier brands.

I'm still not convinced that there's a broad market for standalone Internet set-top boxes. Boxee couldn't find one, Google's licensees are struggling, Apple TV is supposedly beyond the "hobby" stage (but not far beyond) and Roku is putting more emphasis on its "streaming stick" and software licensing deals than its set-top boxes. As much as I like the idea of over-the-top video, unless Apple or Intel can come up with something both revolutionary and highly desirable, Internet video will remain a "second screen" application for PCs, tablets and smartphones.



Monday, June 24, 2013

Does 3D stand for "Dead, Dead, Dead?"

Industry blog "The Wrap" reports that first-weekend 3D ticket sales for "Monsters University" and "World War Z" represented the smallest percentage of total sales since the introduction of digital 3D to theaters. "Monsters University" took in 31% of its revenues from 3D ticket sales; prior to that, the least successful 3D animated movie release was "Brave," with 34% of its tickets sold for 3D showings. "World War Z" took in 34% of its revenues from 3D ticket sales; the previous low was 40% for "Captain America." 3D hardware vendor and licensor RealD tried to put a good face on a bad situation by pointing out that "Man of Steel" opened with 41% of its revenues from 3D ticket sales. The problem, of course, is that 41% number was only 1% higher than the previous all-time low.

Last week, ESPN announced that it's shutting down its ESPN 3D channel because of low viewership. According to the Associated Press, FIFA, the organization behind World Cup soccer, is considering whether to drop 3D coverage of the 2014 World Cup because of cost, and instead is considering 4K Ultra HD coverage. Sales of 3D HDTVs in general are difficult to break out because high-end devices tend to be compatible with 3D whether or not consumers actually use the 3D features. Nevertheless, there's a fairly clear trend toward sales of lower-priced HDTVs that can't support 3D.

Consumer electronics companies have largely come to the conclusion that 3D is a dead end, and are instead focusing their attention and development resources on Ultra HD (UHD). UHD doesn't require glasses and doesn't cause the headaches and dizziness that some people get with 3D. The movie industry is already well along with its transition to 4K production and post-production; the biggest remaining issue is coming up with cost-effective ways to deliver 4K movies. Television producers are further behind because of the lack of a UHD broadcast standard, but the cost of 4K equipment is dropping rapidly, and television shows can be produced in 4K but down-converted to 2K for broadcast.

3D in theaters won't go away, but if the average percentage of box office revenues from 3D falls to 25% or so, studios will have to become much more selective as to which movies are produced in or converted to 3D. Eventually, studios will be forced to drop 3D if the cost of production and post-production is equal to or greater than the incremental revenue that they earn from 3D ticket sales.
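
To see where that break-even point sits, here's a rough Python sketch. Every figure in it is hypothetical, and it ignores the exhibitors' cut of the gross for simplicity:

```python
# Hypothetical numbers: when does a 3D release stop paying for itself?
domestic_gross = 200_000_000  # total domestic box office
share_3d = 0.25               # fraction of revenue from 3D showings
premium_3d = 0.30             # 3D tickets cost ~30% more than 2D tickets

# Revenue those same admissions would have produced at 2D prices
gross_3d = domestic_gross * share_3d
gross_if_2d = gross_3d / (1 + premium_3d)
incremental = gross_3d - gross_if_2d  # what 3D actually added

conversion_cost = 15_000_000  # assumed cost to shoot in or convert to 3D

print(f"Incremental 3D revenue: ${incremental:,.0f}")
print(f"3D still pays for itself: {incremental > conversion_cost}")
```

With these made-up inputs, a 25% 3D share adds only about $11.5 million in incremental revenue--less than the assumed conversion cost, which is exactly the squeeze described above.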

Update, July 5, 2013: Yesterday, the BBC announced that it will suspend 3D broadcasts "indefinitely" because of a lack of viewer interest. Coming on the heels of ESPN's decision to discontinue ESPN 3D, the BBC's action makes it even clearer that, for now at least, consumer big-screen 3D in homes is dead. That doesn't mean that 3D is dead everywhere, however. Theatrical 3D remains viable, although audiences have gotten much more selective about which 3D releases they're willing to pay for. Oculus's immersive Rift 3D visor is getting positive reviews and lots of interest from game developers, but the real test will be when Oculus releases its consumer-grade hardware and publishers release 3D games for the device.

Tuesday, June 18, 2013

Set-top boxes and game consoles: The $99 option

Last week, I wrote about installing my OUYA game console, which lists for $99 (U.S.). OUYA isn't the only Android game console--there are several available or in development, all of which sell in the $99-$129 price range and all of which run "casual" games. $99 is also the consensus price point for over-the-top (OTT) set-top boxes from Roku, Vizio, Apple and other vendors.

It's very unlikely that hardcore gamers will prefer Android consoles to the Xbox One or Sony's PlayStation 4. Even with a Tegra 4 chipset, which some of the newer Android consoles will have this fall, they won't have the performance to compete with Microsoft's or Sony's consoles. On the other hand, while the Xbox One will have features that the $99 set-top boxes don't, such as voice recognition and gesture control, it's equally unlikely that set-top box buyers will be willing to spend $400 more to get them. In short, serious gamers want high-end game consoles and are willing to pay for them, while Internet set-top box buyers who aren't gamers are unlikely to pay a high premium to get gaming features.

For the OUYA and similar devices to succeed, they have to prove that people want to play the same games that they already play on their smartphones and tablets on their big-screen televisions as well. Despite OUYA's successful Kickstarter campaign, that's still an unproven hypothesis. In addition, the Internet set-top box market is still far from proven--Logitech dropped its Google TV-based STB, Boxee is said to be looking for an acquirer or additional capital, Roku is increasingly emphasizing software licensing to HDTV manufacturers, and Apple TV has graduated from being a "hobby" but still contributes minuscule revenues to the company.

It's still too early to tell if there's a market for either low-cost game consoles or third-party Internet set-top boxes at any price. We may get more clarity by the end of this year's Holiday season.

Tuesday, June 11, 2013

OUYA: Lots of potential, but not quite ready for prime time

Last week, I got my OUYA game console, in one of the last batches of devices to go out to Kickstarter supporters. I've got it connected and working (sometimes), but the device's potential is considerably greater than what the current console can deliver.

At the outset, let me make clear that I'm not dumping on either OUYA's device or the company itself. I want to see OUYA succeed, but to say that the current device is "rough around the edges" is an understatement:

  • The instruction manual that comes with the console explains how to connect and power up the device, but it says nothing about how to add batteries to the controllers. It's easy enough to find information on the Internet on how to take off the side panels in order to add batteries, but users shouldn't have to do that.
  • When I first turned the console on, all I got was the "circle U" logo on screen, which stayed there quite a while. I assumed that the console was locked up, so I held down the power button until the console powered down. Then, I powered it back up, and had the same problem, so I pulled the power plug, waited a while, and turned it back on. Same problem. It took several tries before the console successfully booted up and came to the home screen.
  • Once in the user interface, OUYA bounces between customized screens and conventional Android menus and dialog boxes. That's not all that confusing for people who are familiar with stock Android, but it's likely to be a problem for novice users. For example, when downloading and installing a game, it looked as though I'd installed the game based on OUYA's prompts, but then I got an Android dialog box which asked if I wanted to install the game. So, I installed it (or installed it again.)
  • My console has locked up a number of times, usually when restarting it or when searching for a game to install. Lockups require that the power cord be removed for several minutes.
  • If you want to purchase a game, you have to first save your credit card information in an OUYA account. There's no way to enter a credit card from within games. At this point, I don't feel comfortable keeping my credit card permanently on file with OUYA, and I don't want to go through the process of first entering and then deleting it every time I want to buy a game or make an in-app purchase.
  • There's also no way to manage your account from the OUYA website. It would be much easier to enter credit card information from a web browser, but other than creating an account, everything has to be done from the console.
  • I love the fact that OUYA is open to any game developer, but that sometimes results in inconsistent or missing controls. For example, I've been unable to figure out how to completely exit from several games. In one case, the game has a menu with an "Exit" option that's available by pressing the power button on the controller, but all it does is take you back to the game's home page. There's no way to completely exit the game (at least that I've found).
All of these problems are fixable, but taken together, they make it clear that the OUYA console is, or should still be, in beta testing. However, the console is scheduled to go on sale to the general public on June 25th, through Amazon, Best Buy, GameStop and Target. As of today, I can't recommend that you buy an OUYA. If you wait a few months, the most serious bugs should be worked out, and there should be a much better selection of games available. OUYA would probably have been better off delaying general retail availability until the fall.

As I said, I'm a big fan of the ideas underlying OUYA, and I think that it will eventually be a great casual gaming console with lots of potential for set-top box applications as well. It's just not there yet.

It's a can, it's a coffeemaker, it's the new Mac Pro

Over the last few weeks, rumors have been flying about a new Mac Pro with a dramatically different physical design than the current model, which has been on the market with fairly minor changes for the last seven years. Yesterday, at Apple's Worldwide Developers Conference, the company took the wraps off the new Mac Pro. It's a prototype for which Apple gave neither pricing nor a delivery date, but the industrial design seems solid enough that the final product isn't likely to change very much from the prototype.

Compared to today's big, bulky, heavy Mac Pro that's more of a "deskside" tower than a desktop PC, the new Mac Pro looks a lot like an oversize, black beer can. It's a 9.9" high cylinder that's 6.6" in diameter--small and light enough to easily fit into a backpack. To get to that size, Apple had to make almost all expansion options external. The new Mac Pro will come with one or two Xeon processors with as many as 12 cores, and two AMD FirePro GPUs that can support as many as three 4K displays. Internal storage is provided by flash on a PCIe interface; there's simply no room for hard drives.

Six Thunderbolt 2 and four USB 3 interfaces will provide high-speed connections to mass storage arrays, audio and video I/O devices, PCI expansion chassis, big-screen displays, etc. The Mac Pro will also come with dual FireWire 800 interfaces, a Gigabit Ethernet interface, HDMI 1.4 for an external display, an 802.11ac Wi-Fi interface, and Bluetooth 4.0, which Apple expects will be used for keyboards, mice and similar devices.

The primary design center for the new Mac Pro is video post-production. It can certainly be used for other applications, but video editing, color correction and special effects are where the Mac Pro really fits. The ability to support three 4K displays with even the entry-level model says that the new Mac Pro is aimed at motion picture-quality image processing.

That having been said, I have no doubt that we'll see other PC workstations in packages similar to the Mac Pro's, probably as soon as next January's CES. Just as with the iMac and the legion of all-in-one computers that followed it, the MacBook Air and Ultrabooks, the iPhone and the big-screen button- and keyboard-free smartphones, and even tablets--where Apple was far from the first vendor but was the first to make them a huge business--the new Mac Pro represents a form factor that's first validated by Apple and then adopted by many other companies.

We don't know when the new Mac Pro will be released or how much it will cost. I suspect that it'll be released in late Q3 or early Q4, and it'll be priced comparably to the existing Mac Pro--which means that it won't be cheap, but also won't be priced out of line with other Xeon-based workstations.

Sunday, May 26, 2013

If you want to make movies, start small

The movie business is going through a dramatic transition: The major movie studios are focusing on expensive action- and special effects-based productions that they can sell worldwide, and are giving short shrift to the kinds of smaller, more plot- and character-driven movies that independent filmmakers do best. On the other hand, the tools necessary for professional filmmaking are less expensive and more available than ever. Thus, the paradox: It's easier than ever to produce an independent film, but it's as hard or harder than it's ever been to get it distributed.

That hasn't stopped people from trying; one estimate is that as many as 5,000 independently-produced films are available for U.S. distribution each year. However, only a tiny fraction ever get any theatrical distribution. In 2012, the Motion Picture Association of America reported (based on Rentrak figures) that 677 movies opened in at least one theater, of which 128 were distributed and/or produced by one of the major studios. Some of the remaining movies were distributed to the home video market via DVD, Blu-Ray or streaming video--but there's not much demand for full-length movies that never got theatrical distribution.

Many, if not most, people who produce independent films do so in order to get the attention of a movie distributor. They neither plan to nor have the resources to distribute the film themselves. However, if they can't get theatrical distribution, there's very little chance of ever recouping even a fraction of the money they spent on production and post-production.

If you're thinking of producing your own movie, you should consider producing a short film rather than a full-length feature. Here's why:
  • It costs much less to produce a short film than it does to produce a feature-length one, so you can spend less time raising money and more time producing your film.
  • From inception of the project to completion, it can take as long as three years to produce a feature-length movie. On the other hand, you can produce a short film in a few months.
  • A short film requires the same elements as a feature film: Writing, acting, directing, producing, cinematography, editing, special effects, etc. It demonstrates talent just as well as a full-length movie.
  • It's much easier to self-distribute a short film; you can post it on YouTube or Vimeo, make it available to film festivals, license it to be included in short film collections, etc.
A short film is much easier to finance, produce and distribute than a full-length movie--and it's far more likely to be seen. 

Wednesday, May 22, 2013

Thoughts on Microsoft's new Xbox One

Yesterday, Microsoft unveiled its new Xbox One game console to an assembly of press, analysts and Microsoft employees on its Redmond, WA campus. The Xbox One has faster processors and more memory than the Xbox 360. A new version of its Kinect 3D digitizer with a 1080p camera is included as standard equipment. The Xbox Controller has also been redesigned, although the changes are mainly cosmetic. In addition, the Xbox One has an HDMI input, so that selected cable, satellite and IPTV set-top boxes can be connected to and controlled by the Xbox One.

Microsoft spent the first half of the presentation focusing on the Xbox One's TV-related features. For example, the Xbox One will have a built-in Electronic Program Guide (EPG) that supports many video operators. Users will be able to change channels and look for shows to watch by voice. In addition, the Xbox One will enable navigation via Kinect gestures.

The second half of the presentation was devoted to games. Only a handful of game publishers were represented on stage, and none of them showed actual game play; instead, they showed trailers. To my eyes, the most impressive trailer was for Forza Motorsport 5, which is the only game title that's been confirmed to be released day-and-date with the Xbox One. It looked great, with visual elements such as metallic paint and realistic depth-of-field rendering that would have been possible only in pre-rendered cutscenes not long ago. Unfortunately, that wasn't the case with the demos from the other game publishers. Electronic Arts, for example, appears to be using the Xbox One's additional horsepower to add more intelligence to the game play in its sports titles rather than for improving how its games look.

Microsoft is apparently concerned about the future of the Xbox given the falloff in sales of console games and the rise of casual games on smartphones and tablets. As a result, it's trying to position the Xbox One as both a set-top box (one that can both connect directly to content over the Internet and indirectly through cable, satellite and IPTV set-top boxes) and a high-performance game console. The problem is that those are two very different markets, with different use cases and consumer expectations. For example, Google TV provides most of the same non-game functionality as the Xbox One, albeit without voice recognition or gesture control. On the other hand, you can buy a Google TV-based set-top box from Vizio for $99, while I expect the Xbox One to be priced around $399. You can also buy an Apple TV or Roku set-top box for $99 or less. I simply don't see very many people buying the Xbox One for its set-top box features, since they can get most of its functionality from less expensive competitors. That means that most of the Xbox One's buyers will be hard-core or moderate gamers, which won't expand the potential market for the device at all.

I suspect that Microsoft's corporate leadership has fallen victim to Shimmer Syndrome (named after the combination floor wax and dessert topping in the famous Saturday Night Live commercial parody). As with Windows 8, which works both on tablets and on conventional PCs but is compromised on both platforms, Microsoft is trying to make the Xbox One work as both a set-top box and a game console. The compromise on the set-top box side is clearly price; we don't yet know what the compromise is on the game side, but it may be a lack of attention that opens the door for Sony to offer a superior developer and gaming experience.


Saturday, May 11, 2013

Does a Microsoft purchase of Nook Media make sense, and to whom?

Earlier this week, TechCrunch reported that it received private documents describing a $1 billion offer made by Microsoft to acquire Barnes & Noble's digital businesses from its Nook Media business unit, in which Microsoft invested $300 million last year. Nook Media also includes Barnes & Noble's college bookstore unit, which Microsoft doesn't want and which would most likely be reintegrated with B&N's retail business.

Here's what Microsoft would be acquiring:
  • Barnes & Noble's eBook business, including its publisher contracts, self-publishing business, eCommerce websites, online order fulfillment infrastructure and customer lists.
  • The Nook hardware line (both eReaders and tablets,) and Barnes & Noble's hardware design operation in Silicon Valley.
  • B&N's other digital product lines (apps, magazines, newspapers, audiobooks and video.)
The deal, if it goes through, would make Microsoft the second-largest reseller of eBooks in the U.S., behind only Amazon. It would save Microsoft the time needed to build its own publisher relationships and eBook distribution infrastructure. However, the other things it would buy might not be all that valuable:
  • Barnes & Noble's tablet business, which was once a viable competitor for Apple and Amazon, has been declining since last year's Holiday sales season. B&N has been running a series of promotions to try to sell off its inventory of Nook HD and HD+ tablets.
  • The document received by TechCrunch states that Barnes & Noble intends to shut down its tablet business by the end of its 2014 fiscal year. That's a huge "red flag" to B&N's Silicon Valley-based hardware and software engineers, who'll have no trouble finding jobs with other companies. By the time a Microsoft acquisition closes, most of Barnes & Noble's top engineers are likely to be gone.
  • The existing Nook tablet line is of no interest to Microsoft, and in fact will represent a customer support liability.
  • Microsoft already has its own app stores for Windows 8 and Windows Phone 8. It has no interest in maintaining the Nook's Android-based app store.
  • Microsoft already sells videos and music through its Xbox Marketplace; it doesn't need Barnes & Noble's content.
That's what Microsoft gets for its one billion dollars, but what does the deal mean for Barnes & Noble? A billion dollars could fund a more serious reorganization of Barnes & Noble's retail business. The company is planning to reduce its store count largely by allowing leases for less-profitable locations to expire. Microsoft's money could enable Barnes & Noble's management to buy out leases and reduce its total number of stores much more quickly. It could also be used to redesign the stores in order to make them more profitable--but there's no evidence to date that Barnes & Noble knows how to turn its stores around.

Selling its eBook business to Microsoft also leaves Barnes & Noble with a big problem. eBooks represent as much as 30% of the sales of the Big 6 publishers; for some genres, such as romance, eBooks comprise 50% of sales. B&N's eBook sales are profitable and growing. So, Barnes & Noble needs to continue to offer eBooks to its customers. It could do so by referring its customers to Microsoft's eBookstore and getting a commission. However, Barnes & Noble would no longer be able to use its eBook sales to negotiate steeper discounts from publishers, since Microsoft would actually be the reseller for those publishers.

So, is Barnes & Noble's eBook business really worth a billion dollars (71% of the company's market capitalization as of this writing) to Microsoft? Is that billion dollars worth it to B&N if it means getting out of the only segment of the book business that's continuing to grow in both revenue dollars and units? In the long run, will selling its eBook business save Barnes & Noble's retail bookstores, or will it only buy the company a little more time?
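
For scale, here's the valuation arithmetic implied by that 71% figure:

```python
# The offer as a share of B&N's market capitalization at this writing
offer = 1_000_000_000
share_of_market_cap = 0.71
implied_market_cap = offer / share_of_market_cap
print(f"Implied B&N market cap: ${implied_market_cap:,.0f}")  # about $1.41 billion
```

In other words, Microsoft's offer for the digital businesses alone is worth nearly three-quarters of what the market says the entire company is worth.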

Monday, May 06, 2013

Adobe drops software sales in favor of subscriptions

Earlier today, Adobe announced a major change in strategic direction. Its Creative Suites, which bundle software such as Photoshop, Dreamweaver, Illustrator, Premiere Pro and After Effects into multiple packages for applications such as video post-production and web design, will be discontinued as of June, when the company introduces the next version of its Creative Cloud service. CS6, the current version of Adobe's Creative Suite, will remain on the market in both physical and downloadable versions, but will not be updated. In addition, individual applications will only be available in physical and downloadable versions in their CS6 form. As of June, the only way to get the latest version of Adobe's software will be to subscribe to either the complete Creative Cloud or to individual applications.

As with the current version of Creative Cloud, the new version will be priced at $49.99/month, and individual applications will be priced at $19.99/month. Existing Adobe customers who own either a complete copy of Creative Suite 3 or greater, or one of the applications in Creative Suite 3 or greater, can take advantage of three different pricing models:
  • They can license individual applications for $9.99/month.
  • If they own CS6, they can license the complete Creative Cloud collection for $19.99/month.
  • If they own CS3, CS4, CS5 or CS5.5, they can license the complete Creative Cloud collection for $29.99/month.
After the first year, the price of individual applications will increase to $19.99/month, and the price of Creative Cloud will increase to $49.99/month.

Customers who use most of Adobe's applications, such as those in the former Production Premium and Master Collection suites, will end up saving money with Creative Cloud versus buying annual updates. On the other hand, customers who use fewer applications, such as those in the Design Standard and Design & Web Premium suites, and customers who've typically skipped versions of Creative Suite in the past, will end up paying more on an annual basis for Creative Cloud subscriptions.
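
Here's a quick Python sketch of that trade-off, using Adobe's announced subscription prices. The boxed-upgrade price and the every-other-version upgrade cadence are my own assumptions for illustration:

```python
# Cumulative cost: Creative Cloud subscription vs. buying boxed upgrades.
# Subscription prices are Adobe's announced rates; the upgrade price and
# the skip-a-version cadence are assumptions, not Adobe's figures.
cc_first_year = 29.99 * 12  # promo rate for CS3-CS5.5 owners
cc_full_year = 49.99 * 12   # regular Creative Cloud rate thereafter

upgrade_price = 549.00      # assumed boxed-suite upgrade price
upgrade_every = 2           # assumed: customer buys every other version

for years in range(1, 6):
    subscription = cc_first_year + cc_full_year * (years - 1)
    upgrades = upgrade_price * (years // upgrade_every)
    print(f"Year {years}: subscription ${subscription:,.0f} vs. upgrades ${upgrades:,.0f}")
```

Under these assumptions, the version-skipper's cumulative subscription bill pulls well ahead of the boxed-upgrade bill within a couple of years, which is exactly why that group is likely to balk.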

As it stands today, Creative Cloud is a good deal for many Adobe customers, but what we don't know are Adobe's future pricing plans. What's $49.99/month this year could be considerably more in a few years. My sense is that there's going to be a lot of resistance to the end of Adobe's Creative Suites and individual purchase options, and a fair number of creative professionals will begin evaluating open source and lower-cost replacements for Adobe's software.

Monday, April 29, 2013

A new approach to ENG field transmission

There are two approaches that most local television stations use to get audio and video from their news gathering trucks to their studios:
  1. For decades, ENG trucks have been equipped with microwave transmitters and antennas mounted on masts that range from 14 to 42 feet high when fully extended. These systems provide very reliable transmission, but they require that the ENG truck be parked, the mast extended and the antenna aimed at one of the station's receivers. Extending and retracting the mast takes time.
  2. In the last few years, companies such as LiveU, Dejero, TVU, Streambox and Teradek have offered Wi-Fi- and 3G/4G/LTE-based broadband transmitters, all of which are small enough to be carried in a backpack, and some of which are small enough to be mounted on top of or behind a camcorder. These systems are light, portable and can go live very quickly. They also operate from moving vehicles. On the other hand, these broadband transmitters are at the mercy of available mobile phone bandwidth. In a situation such as the recent bombing in Boston, mobile phone networks may become gridlocked, resulting in blocky video as the system is forced to use less bandwidth, or the connection may be completely dropped.
TVNewsCheck reports that Gray Television has developed its own approach to ENG transmission that combines many of the benefits of the microwave and broadband approaches. Its new system, called GrayMax, uses a single steerable antenna in a dome on top of an SUV, which connects to base stations with 18-inch antennas located around the city. Gray says that four base stations should be sufficient to cover a medium-sized city. The operator in the truck can use GPS to steer the antenna to align with one of the base stations with a single button push, but the antenna can also be steered manually. Gray will use the 2 GHz Broadcast Auxiliary Service (BAS) band to send audio and video to the studio, and to receive audio instructions from the studio. The antenna can dynamically track the base stations while the vehicle is moving, so it can continue to feed content back to the studio.
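
Gray hasn't published the details of its steering algorithm, but the heart of GPS-based antenna pointing is just a great-circle bearing from the truck's position to a base station. Here's a minimal Python sketch with entirely hypothetical coordinates:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing, in degrees clockwise from true north."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

# Hypothetical truck and base-station positions (latitude, longitude)
truck = (42.3601, -71.0589)
base_stations = {"north": (42.4184, -71.1062), "south": (42.3188, -71.0846)}

# One button push: compute the antenna heading for each candidate base station
for name, (lat, lon) in base_stations.items():
    print(f"{name}: steer antenna to {bearing_deg(*truck, lat, lon):.1f} deg true")
```

A production system would re-run this calculation continuously against the truck's live GPS fix, which is what lets the dome track a base station while the vehicle is moving.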

A fully-equipped system, including the transmitter, base stations and vehicle, could cost as little as $80,000. Gray believes that it can eventually reduce the size of GrayMax so that it will fit into a backpack. In short, the system should offer the reliability of microwave systems with broadband's much faster set-up and ability to operate while in motion. In addition, by using the BAS band, it's not impacted by mobile phone congestion.

GrayMax won't replace broadband systems, because broadband systems are much less expensive and more flexible, albeit at the cost of lower reliability. However, for stations that want to replace existing microwave systems, GrayMax is likely to be less expensive to acquire, easier to use and more flexible than simply upgrading what they already have.


Thursday, April 25, 2013

Post-NAB business idea, 2013 edition

Every year, I return from NAB with a buzz from seeing new products and meeting new people. I get energized with lots of ideas--and then start penciling them out, which usually results in a bad case of reality setting in. Here's my 2013 idea:

A lot of people are repelled by the atrocious state of television news in many markets: A focus on crime and accidents, along with shoving cameras and microphones in the faces of victims' family members. My premise is borne out by the ever-older audiences for broadcast news: People who grew up watching it keep watching, but younger viewers get their news from the Internet. There has to be a market (albeit a small one) for people who want more serious local news, and they're likely to be both better educated and higher income than the population in general--prime targets for upscale advertisers.

The costs of building out a streaming-based (not over-the-air) news-focused television station are a fraction of what it would have cost to build a broadcast station just a few years ago. In fact, you can build eight or ten streaming stations for the cost of a single broadcast station. You don't need transmitters, antennas, studio-to-transmitter links, or any of the overhead required to fulfill FCC requirements. LED lighting and low-powered equipment make it feasible to use former retail space for production and post-production, and there's plenty of retail space available for lease in most major markets.

Start with a streaming station in a single major market to test the concept and identify what works and what doesn't. Then, over time, build out additional stations in other large markets, and create a network the way it was done in the early radio days--one station at a time. At a minimum, each station would produce two daily newscasts; as the network grows, those newscasts and additional stories would be fed to the network to create two national newscasts. In addition, some of the local stations would produce their own programming, such as talk and children's shows. The best of that programming would also run on the network. All programming, both local and network, would be available both live and on demand.

That covers equipment and real estate, but one area where you can't save much money is labor. A streaming station doesn't require as many people as a comparable broadcast station, but if the goal is to provide a superior news alternative to existing stations, you've got to hire experienced journalists. Yes, lots of stations and networks are laying off personnel, as are newspapers, but good people need to be paid appropriately. I'm a strong believer that if you're running a for-profit business, you should pay a living wage to the people who work for you, even if there's some way to get around it with interns and freelancers. People can't eat "exposure."

Staffing a seven-day-a-week news operation producing two daily newscasts would, by my count, require 42 people at various salary levels. That's more than $2 million per year in salaries and benefits, even in a fairly small market. Advertising revenues aren't assured until the station can demonstrate that it has a loyal and worthwhile audience.
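
The arithmetic behind that figure looks something like the following sketch; the headcounts per role and the salary levels are my own rough assumptions, not quotes from any station:

```python
# Back-of-envelope payroll for a seven-day, two-newscast streaming station.
# Role headcounts and salaries are rough assumptions for illustration.
staff = {
    "anchors": (4, 75_000),
    "reporters": (10, 50_000),
    "photographers/editors": (10, 45_000),
    "producers": (6, 55_000),
    "technical/engineering": (6, 50_000),
    "sales": (4, 60_000),
    "management/admin": (2, 90_000),
}

benefits_load = 1.25  # assume benefits add ~25% on top of salary

headcount = sum(n for n, _ in staff.values())
payroll = sum(n * salary for n, salary in staff.values()) * benefits_load
print(f"{headcount} people, about ${payroll:,.0f}/year")
```

Shuffle the assumptions however you like; it's hard to get a credible newsroom of this size much below the $2 million mark.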

I believe that this idea has tremendous promise for someone who's willing to invest with the expectation that 1) Break-even may be five years out, and 2) To fully capitalize on the opportunity, the network will have to be built out. I can't fund it, and it's unlikely that I can find someone who will, so it goes into the drawer, likely to be pulled out again after NAB 2014.

Update, 4/26/2013: Google would be the perfect company to launch streaming stations, for several reasons:
  1. It fits very well with YouTube, and adds a live, local news component that YouTube doesn't have.
  2. The local stations can double as production and post-production space for YouTube's "creators," expanding beyond their existing facilities in Los Angeles, New York and London.
  3. Google TV would get the streaming stations onto big-screen TVs.
  4. Other than Netflix, Google is probably the biggest buyer of Internet bandwidth in the world, and it operates its own international fiber network that rivals the major telcos. Its bandwidth costs, a big part of any streaming network, would be the lowest in the industry.
  5. As Google expands Google Fiber, locating streaming stations in Fiber cities would allow Google to compete with cable operators that have their own local channels.
  6. Google already has 12,000 advertising salespeople worldwide, so it's well-equipped to sell advertising for a network of streaming stations.