Monday, September 03, 2012

Bogus reviews and how to spot them

About a week ago, The New York Times wrote an article about someone who sold bogus positive book reviews, which he posted under a variety of identities on Amazon and, presumably, other websites. His company lasted only a few months before it was "outed" by a disgruntled author who didn't like the review the company posted of her book. The Times used this one company, and one piece of research, to insinuate that there's a torrent of fake book reviews on Amazon, and that all four- and five-star reviews should be considered fake unless proven otherwise.

The Times not only drew a sweeping conclusion from relatively scant evidence, but it also "buried the lede": The problem of fake reviews on the Internet pervades every product category, not just books. It also discounted the fact that fake reviews can be both positive and negative. I chalk up the Times' article to lazy reporting and sloppy editing, but there's a very real problem with fake reviews.

(Update, September 8, 2012: It does appear that Amazon is full of fake book reviews, but according to The Guardian, the practice isn't limited to self-published authors. Author Jeremy Duns figured out that best-selling crime author R.J. Ellory was posting breathtakingly positive reviews of his own books under the pseudonyms "Jelly Bean" and "Nicodemus Jones," and was trashing competing authors with 1-star reviews, including Stuart MacBride and Mark Billingham. Ellory has since apologized, but I suspect that the only thing he's really sorry for was getting caught. The Guardian's article also notes other examples of best-selling authors getting caught giving themselves positive reviews, and in some cases, trashing their competitors. Amazon is going to lose a huge amount of credibility unless it comes up with a way to confirm the identities of reviewers. Publishers could also add clauses to their contracts that prohibit writers from posting reviews under any name other than their own or paying third parties to post reviews.)

It's important to keep in mind that reviews inherently skew toward very positive and very negative ratings, because people are more motivated to review things that they're very happy or very unhappy about. Think about going out for dinner at a modestly priced restaurant and getting an "okay" meal at the price you expected. You're unlikely to write a review about that "meh" experience. On the other hand, what if you get one of the best meals you've ever eaten, or what if the food is bad, the service is worse, and the night ends with the waiter spilling a hot cup of coffee in your lap? Either way, you're much more likely to write a review, and it's likely to be very positive in the former case and very negative in the latter. So don't automatically assume that great or terrible reviews are fakes.

There's been some research published on how to spot fake reviews. For example, MIT's Technology Review reported that researchers at the State University of New York at Stony Brook used the TripAdvisor site to come up with rules for identifying fake reviews. They started by assembling a group of "likely valid" reviewers: each had written at least ten reviews, had posted their reviews more than a day or two apart, and had given ratings that didn't deviate too far from the average ratings for all hotels.

The researchers then compared reviews from their "likely valid" group with those of one-time reviewers to see whether the one-time reviewers gave a significantly higher number of five-star ratings. They also looked at the ratio of high to low ratings given by different groups of reviewers, as well as sudden bursts of reviews (multiple reviews posted over a few days) that might indicate a deliberate marketing campaign. Then they compared their results with a previous study they'd conducted, in which they hired people to write fake positive reviews, to identify tell-tale clues such as the overuse of superlatives. The researchers found that they could identify fake TripAdvisor reviews "in the wild" around 72% of the time.

The SUNY Stony Brook research focused on fake positive reviews, but a couple of years ago, Consumer Reports' The Consumerist website asked its readers for suggestions on how to spot both positive and negative fake reviews, and they came up with 30 "tells". Here are a few:

  • The reviewer only has a single review on the site.
  • There's little or no information about the reviewer in their site profile.
  • You can't find any information about the reviewer on other sites, such as LinkedIn.
  • The reviewer uses a pseudonym that has more than three numbers at the end.
  • Multiple reviews, either very positive or very negative, show up about the same subject in a very short period of time (a day or a few days).
  • The wording of multiple reviews is very similar.
  • The review uses the "official" name of the product or service. If it keeps using the official name over and over, it may be an attempt to game search engines.
  • There are no details, just a broad statement that the subject is great or terrible.
  • They use "marketing speak"--no one would write conversationally the way that the review is written.
  • There's a "conversion story"--the reviewer thought that they would hate the product or service, but then they tried it, and now they love it.
  • If the subject has multiple locations (such as a chain store or restaurant), the exact same review can be found for multiple locations.
  • The review is very negative about the subject and strongly recommends a competitor by name.
  • There's a link to the subject's website, or a third party's website, in the review.
  • The spelling and grammar in the review are poor--it suggests that the review may have been written by an offshore review mill.
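
A few of the textual tells above lend themselves to a simple automated check. Here's a rough sketch that scores a review against four of them: a pseudonym ending in four or more digits, heavy use of superlatives, a lack of detail, and near-duplicate wording. The word list and thresholds are my own guesses, not anything from the Consumerist list.

```python
import re
from difflib import SequenceMatcher

# Illustrative scoring of a few "tells"; the superlative list and all
# thresholds are assumptions, chosen for demonstration only.
SUPERLATIVES = {"best", "worst", "amazing", "incredible", "perfect",
                "awful", "unbelievable", "greatest"}

def suspicion_score(review_text, reviewer_name, other_reviews=()):
    """Return a count of how many tells the review trips (0-4)."""
    score = 0
    # Pseudonym with more than three numbers at the end (e.g. "reader4821").
    if re.search(r"\d{4,}$", reviewer_name):
        score += 1
    words = re.findall(r"[a-z']+", review_text.lower())
    # Too many superlatives relative to review length.
    if words and sum(w in SUPERLATIVES for w in words) / len(words) > 0.1:
        score += 1
    # No details -- just a short, broad statement.
    if len(words) < 15:
        score += 1
    # Wording very similar to another review of the same subject.
    for other in other_reviews:
        if SequenceMatcher(None, review_text, other).ratio() > 0.8:
            score += 1
            break
    return score
```

A high score doesn't prove a review is fake, any more than a low score proves it's genuine; it just tells you where to look more closely.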

Reviews can save you a lot of money and aggravation, but you have to look for obvious signs of fakery. A fake review can be worse than having no review at all.
