4

WHY WE’RE BULLISH ABOUT ABSOLUTE VALUES

THERE WAS NOTHING special about a review written by a Gary O’Reilly, who loved the book The Mistress, by British actress Martine McCutcheon. “It was funny and moving in parts and well worth the time spent on it,” O’Reilly wrote, awarding the book five stars on Amazon’s British site. Over the years O’Reilly has written a few more reviews, all positive and all for books from the publishing company Pan Macmillan. In summer 2012, the Sunday Times reported that Jeremy Trevathan, head of Pan Macmillan’s adult division, admitted that he was O’Reilly.1

As we were working on this book, similarly alarming stories kept popping up. Here are three more: In 2012, David Streitfeld of the New York Times wrote about a man named Todd Jason Rutherford from Oklahoma who had started a website called GettingBookReviews.com with a simple offer for authors who wanted their books to be reviewed on Amazon.com. Pay $499, and twenty reviews will be written about your book. For $999, you’ll get fifty reviews. It didn’t take long before orders started pouring in and added up to $28,000 a month, according to the Times.2

Another related story was aired on ABC’s 20/20 in 2010 about how Hamas got an A-minus rating with the Better Business Bureau in Santa Monica, California. Beyond the obvious question of how an organization that the United States considers a terrorist group can operate in California (let alone get almost a perfect score), the program raised serious concerns when a blogger claimed that all it took to get this rating was for someone to call with a credit card and pay $425 to the Better Business Bureau.3

Another story we came across was about a woman from Salford, England, who admitted writing negative reviews of a vegetarian restaurant that she had never visited. She just had a grudge against the owner. “Staff cold and unattentive. The vegan option wasn’t vegan. There were hairs in my quiche,” one of her reviews on TripAdvisor read.4

Stories about vendors that try to game the system, vindictive customers, or corrupted rating systems appear frequently in the press, raising justified concerns about the credibility of these information sources. There are companies that sell followers on Twitter, views on YouTube, or Likes on Facebook. (We came across a company that offers five hundred Facebook Likes for $19. If you go on the monthly plan, the price goes down to $15.) Indeed, online reviews and other forms of user-generated content raise many questions: If reviews can be manipulated by unscrupulous marketers, how can they serve as proxies for quality? And can’t competitors distort the picture to their advantage? What about disgruntled employees or consumers with unreasonable expectations?

These are serious concerns, which may call into question a core premise of our book. If enough attempts to game the system are successful, consumers’ ability to assess the absolute value of products and services will be seriously reduced, and consumers will not enjoy the benefits of the new information environment. In an extreme scenario in which manipulation spins out of control across the board, our book may be remembered as just an intriguing idea—a dream that never materialized.

Despite these valid concerns, we are bullish about the trends we describe in this book. In this chapter we explain why. Before we delve into the stories of Hamas in Santa Monica and fake review services, here’s a summary of why these trends are almost inevitable:

  • While it’s easy to fake some reviews, gaming the system (without being caught) is harder than one thinks, especially as participation in rating systems grows.
  • There are cases where manipulations are successful. In these cases, a gap between positive reviews and negative experiences is likely to generate frustration, which can erode consumer trust in a review site.
  • When attempts to game the system are caught, the press and bloggers alert the public, which may further erode consumer trust in a review site.
  • Losing consumer trust is bad business for a review site. As a result, the review site will either try to improve by curbing manipulation, or will lose users to alternative solutions.
  • Reviews are far from perfect, but the one solution consumers are not turning to is ignoring reviews altogether and relying on marketers as their main source of information about quality.
  • If consumers lose trust in certain review sites, they are much more likely to migrate to review sites they do trust, to opinions of experts, and/or to recommendations from friends and acquaintances. All of these are much more accessible in the new, socially intensive information environment.

Let’s start with the last two points. While we’ll focus in this chapter on reviews written by strangers, it is important to reiterate that consumers’ ability to better assess absolute values is also driven by unprecedented access to experts, friends, and acquaintances. A decade ago, most people’s access to experts was limited to magazines or newspaper columns. Today, top experts are a few clicks away, and their recommendations are amplified through social media (as we write this, tweets regarding Consumer Reports’ review of the iPhone 5 are spreading). It is also radically easier to get feedback from people you know. Post a question on Facebook or Twitter (“Can anyone recommend a good moving company?”) and you are likely to get advice in minutes. Using Facebook’s Graph Search you can find out what people you know use or say about different products and services.5 In Chapter 13 we’ll look at new tools that facilitate the assessment of quality even further. The bottom line is this: If consumers lose their trust in reviews (and currently, there are no signs of that), they are likely to seek information from more trusted sources and new tools. They are highly unlikely to turn to marketers as the main source for information regarding quality. Still, even though reviews written by strangers are just part of what drives the trends we discuss in this book, they are an important part, and most of the concerns we’ve heard surround them, so these reviews will be our main focus for the remainder of this chapter.

FAKING IS EASY. TIPPING THE SCALE IS USUALLY MUCH HARDER

Writing a fake review is very easy, and when there are not enough genuine reviews to counterbalance fake ones, the latter can have some impact. But the more reviewers participate in a rating system, the harder it becomes to tip the scale without being caught. Even though there are disturbing cases in which manipulation succeeds, many manipulation attempts don’t yield real results. We can’t dismiss the problem, especially when there are only a few reviews, but we need to keep things in perspective, and we need to differentiate between the alarming ethical issues these actions raise and the actual impact that some fake reviews have.

While it’s simple for a product manager to post a few fake reviews about a new gadget without getting caught, it’s much harder to do on a larger scale. A Tennessee company found this out the hard way when it had to pay the Federal Trade Commission (FTC) $250,000 to settle charges that it used misleading online “consumer” and “independent” reviews.6 Or consider the case of Jeremy Trevathan of Pan Macmillan, who posted a review of The Mistress, a book published by his own company. The harm caused to unsuspecting consumers who rely on fake reviews is obvious, and in some cases can lead to costly mistakes, but let’s examine the impact of this review. A single review is just that—one review. In this case there were about fifty additional reviews, twenty-five of which gave the book one star, some calling it “a disaster” or “Painfully bad.” So The Mistress had a 2.5-star average rating.7 Of course, Trevathan could have asked employees and friends to write additional glowing reviews, but there was a risk associated with it—getting caught can hurt a marketer’s reputation. Benjamin Franklin famously said that three people can keep a secret if two of them are dead. If dozens of people are involved, it’s very hard to prevent leaks. Such group behavior is also easier to detect by algorithms employed by review sites.8

It would be naïve to think that manipulations never succeed. The New York Times story about Todd Rutherford and his paid-review service mentioned the case of John Locke, who confirmed to the reporter that he had paid Rutherford for three hundred reviews. According to the Times, Locke had sold a few thousand e-books before he signed up with GettingBookReviews.com. Then, in December 2010, after he commissioned the reviews from Rutherford, things picked up significantly and Locke sold fifteen thousand e-books. Locke attributed his success to other factors and said that reviews are the smallest part of being successful. He seems to be an effective promoter who connects with readers through his blog, tweets, and personalized emails. Pricing his e-books at ninety-nine cents didn’t hurt, either. Eventually he had sold more than a million e-books through Amazon, becoming a poster child for self-publishing. It seems, however, that other authors who paid for reviews were not as successful as Locke. Not even close. When we checked the rankings of some of these books, one was at number 5,121,624 despite the fact that it had about thirty incredible reviews. Another book (with eighteen glowing five-star reviews) was ranked at 1,254,944. Evidently, Rutherford wasn’t producing an endless stream of bestselling books. If Rutherford made $28,000 a month by providing positive reviews, it proves that some authors are willing to pay good money to see their books reviewed. It doesn’t necessarily prove that readers are fooled by those reviews. We will later present research that links higher review ratings with higher revenues, but this doesn’t mean that faking your way to success is easy, and it is certainly not a sustainable business model.

REVIEW SITES CAN CURB MANIPULATIONS

When a rating system consistently disappoints consumers in assessing the quality of products or services, it will have to improve, or consumers will look for alternative solutions. A failure to control fake reviews can eventually harm a review site. A reader from Chicago wrote in response to the New York Times story, “I enjoyed buying obscure and interesting books on Amazon that I couldn’t find anywhere else. That changed in about 2009 when I started getting burned by 5 star books that were utter garbage once I started reading them myself.”9 A similar thing happened to business traveler Michelle Madhok with hotels she stayed at. “I read reviews of hotels that I’ve stayed at,” she told a reporter, “and they’re just wrong. I wonder if they’ve really stayed at the hotel.” What happened as a result? She became increasingly skeptical of online reviews.10

People who don’t trust reviews are bad business for Yelp, TripAdvisor, and other review sites. Yelp, for example, relies on advertising as its main source of revenue. If people stop trusting the site, they will find alternative sources of information, and Yelp’s main source of revenue will dry up. Similarly, people who don’t trust reviews are bad business for Amazon, because reviews are a big attraction to the site. As a result of the New York Times article, Amazon removed some of Rutherford’s reviews. In the months that followed, Amazon took some further measures to remove fake reviews (sometimes raising criticism for eliminating legitimate ones).11 Google, which also has a stake in the reviews business, suspended Rutherford’s advertising account, because the company does not approve of ads for favorable reviews.

In order to keep its audience, a suspect rating system is likely to try to regain people’s trust. The question is: Can review sites curb manipulations? The short answer is yes. It’s an endless cat-and-mouse game, but there are many examples of tactics that can cumulatively reduce successful manipulations. For example, in October 2012 Yelp ran a sting operation in which employees pretended to be reviewers and offered reviews for sale to businesses. Yelp caught about a dozen companies in this operation, and these companies had their Yelp pages tagged for three months with an alert: “We caught someone red-handed trying to buy reviews for this business.”12 The results were highly publicized in the press and on TV and are likely to make some business owners think twice before they buy fake reviews.

Many review sites employ algorithms to weed out bogus reviews. Yelp, for example, displays about 80 percent of the reviews that are submitted. Review sites don’t publicize their algorithms, for obvious reasons, but in general they are trying to detect anything unusual. One advantage that a rating system has in this battle is its knowledge of normal patterns. For example, one of the books that was backed by Rutherford’s reviews was published in 2009 and had eighteen five-star reviews. A quick glance at the dates of these reviews reveals something odd: Sixteen of them were entered in a span of ten days in January 2011. It is highly unusual for a book to get such a sudden burst of reviews two years after it was published. This is clearly inconsistent with usual patterns and should have raised a red flag. Sites can also detect suspicious patterns in the content of reviews or in a specific user’s behavior (a user who for months obsessively visits the same restaurant’s page on a rating site is not displaying normal behavior; a review from such a user is suspicious).
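To make the pattern-detection idea concrete, here is a minimal sketch of a burst detector of the kind described above. It is purely illustrative: review sites do not publish their rules, and the window size, threshold, and function name here are our own assumptions, not any site’s actual algorithm.

```python
from datetime import date

def flag_review_bursts(review_dates, window_days=10, threshold=15):
    """Flag stretches where an unusually large number of reviews lands
    within a short window. Illustrative only: real sites combine many
    such signals and tune thresholds against site-wide historical data."""
    dates = sorted(review_dates)
    flagged = []
    start = 0
    for end, current in enumerate(dates):
        # Shrink the window from the left until it spans at most window_days.
        while (current - dates[start]).days > window_days:
            start += 1
        count = end - start + 1
        if count >= threshold:
            flagged.append((dates[start], current, count))
    return flagged

# A book published in 2009 gets a handful of organic reviews, then
# sixteen reviews within ten days in January 2011.
organic = [date(2009, 6, 1), date(2009, 11, 20), date(2010, 4, 2)]
burst = [date(2011, 1, 5 + i % 10) for i in range(16)]
print(flag_review_bursts(organic + burst))
```

The same sliding-window idea extends to the other anomalies mentioned above, such as one account posting repeatedly about a single business.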

Accepting reviews only from verified buyers is another method that is likely to work in certain domains. Think about two review sites that rate hotels: TripAdvisor and Expedia. In order to post a review on TripAdvisor, you don’t need to prove that you stayed at the hotel you’re reviewing. In contrast, to review a hotel on Expedia, you need to have actually stayed there. In which rating system do you expect to see more fake reviews? Myle Ott, a computer scientist from Cornell University, and his colleagues Claire Cardie and Jeff Hancock tested this question by comparing six online review sites that rate hotels: Expedia, Hotels.com, Orbitz, Priceline, TripAdvisor, and Yelp. They focused on the relative differences in the rate of deception between the sites, and their results suggest (as you might have guessed) that deception is more prevalent in sites with a low “signal cost,” like TripAdvisor or Yelp, where the requirements for posting are minimal.13

The key point from Ott’s work is that deception rates vary among sites and have to do with what the site does to prevent deception. After talking to people who manage rating sites, we have no doubt that much can be done to curb fake reviews and that not all sites are created equal when it comes to handling manipulations. Consider Angie’s List. In order to post a review on Angie’s List, you need to be a paid subscriber of the service, and although your name is not revealed to visitors of the site, it is known to the folks at Angie’s List and to the contractor you’re reviewing (so that he or she can respond to your review). Posting fake reviews is obviously harder on Angie’s List than on a site where you can post anonymously and without any other commitment.14
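To put the “signal cost” idea in the simplest possible terms, here is a hypothetical sketch of a review gate that accepts a hotel review only from a user with a completed stay on record. The data structures and rule are our own illustration, not Expedia’s or Angie’s List’s actual systems, which layer identity, payment, and many other checks on top.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Stay:
    user_id: str
    hotel_id: str

class ReviewGate:
    """Accept a review only if the reviewer has a verified stay on record."""

    def __init__(self, completed_stays):
        self._stays = set(completed_stays)

    def can_review(self, user_id, hotel_id):
        return Stay(user_id, hotel_id) in self._stays

gate = ReviewGate([Stay("alice", "seaside-inn")])
print(gate.can_review("alice", "seaside-inn"))    # True: verified guest
print(gate.can_review("mallory", "seaside-inn"))  # False: no stay on record
```

Raising the cost of posting in this way does not make faking impossible (a determined manipulator can still book a cheap stay), but it turns a free, anonymous action into a paid, traceable one.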

How prevalent are fake reviews? There is no easy answer, though all the experts we asked agreed on two points: First, it’s hard to tell. And second, sites can take measures to fight manipulation, so the answer can vary among sites. We read in the New York Times that Bing Liu, a computer scientist from the University of Illinois, estimates that about one-third of all consumer reviews on the Internet are fake. But when we asked him about it, he clarified that this percentage refers to fake reviews before a site takes any measures to curb manipulation.15 Myle Ott from Cornell doubts that the exact percentage of fake reviews can be determined, and as we discussed, his research shows that rating sites can take measures to curb manipulation.16 The research firm Gartner estimated that by 2014, 10–15 percent of social media reviews would be fake.17 Jenny Sussin, one of the researchers behind the study, noted that Bazaarvoice (which manages the reviews for sites like Expedia, Walmart.com, Costco, and Best Buy) does a good job of detecting fake content. When we talked to Brett Hurt, cofounder of Bazaarvoice, he estimated that only 1 percent of all content gathered across client sites is rejected as inauthentic by the company’s anti-fraud technology and team of authenticity analysts.

The bottom line is this: Manipulation can be curbed. Review sites have strong economic incentives to curb manipulations, and they increasingly address the problem.

CHECKS AND BALANCES

In the new environment the reviewers are under review as well. The true nature of things is likely, over time, to be revealed, and this applies not only to manipulation attempts by outsiders, but to the integrity of the rating sites themselves. The story of Hamas and the Better Business Bureau illustrates this. Millions of people every year check the reliability of businesses through the Better Business Bureau, but in 2010 the organization itself came under fire on ABC’s 20/20 when business owners accused the BBB of letting companies pay to improve their ratings. Wolfgang Puck, for example, argued that the BBB was running a pay-to-play operation (one part of his food empire got an F): “If you become a member you’re sure to get an A, but if you don’t pay, it’s very difficult to get an A,” he said. Terri Hartman, manager of an antique hardware store in Los Angeles, said she was told by a BBB telemarketer that she had to pay a membership fee if the store’s C grade was to be improved. (That grade was based on an old complaint that had been resolved.) Hartman said she paid the membership fee, and shortly after that the C was upgraded to an A+ and the old complaint no longer appeared in the store’s record.

The Hamas listing was actually a publicity stunt arranged by an anonymous blogger and a group of business owners who wanted to make a point. They listed Hamas with a nonexistent address in Santa Monica. They claimed that about twenty-four hours after paying $425 for membership, Hamas got an A- rating. In a similar act, the group said the BBB awarded an A+ to a racist website. Again, they said all it took was a call with a credit card.18

Yelp has faced similar allegations in the past few years from business owners who say sales reps from the company put pressure on them to advertise, and link advertising to the display order of bad reviews. For example, Stacy Oltman, a restaurant manager from the Seattle area, told the Seattle Times she got a call from a Yelp salesperson who gave her a hint: “You have gotten a terrible review online. We would love to help you remove it.” (Yelp maintains that businesses cannot pay to remove or reorder bad reviews.)19

So far, people have not been influenced too much by these stories. The 20/20 story about Hamas was damaging to the Better Business Bureau, but the organization took some measures that seem to have improved the situation. Yelp’s monthly visitors continue to go up,20 and in general, despite an ongoing stream of stories in the media, the public shows pretty high confidence in reviews written by other consumers. A Nielsen study conducted among 28,000 Internet respondents in fifty-six countries found that online consumer reviews are the second most trusted source of information about products, with 70 percent of respondents indicating they trust messages from this source “completely” or “somewhat” (a 15 percent increase in four years).21 The only source that was trusted more was recommendations from friends and family.

If a tech company flies a blogger to a trade show across the world and pays for his hotel, he may be biased in the way he reports about its new products. This, again, may raise some doubts regarding consumers’ ability to assess the quality of products. Yet here, too, there are checks and balances. Some bloggers (like many other people) love perks, but there’s one thing they usually love even more: They love to have readers. And readers look for spin-free answers. If they start to sense that a blogger sticks too much to the party line of a certain company, they will look elsewhere. And these days, there’s no shortage of “elsewhere.” That is another reason why manipulating the outcome of reviews is harder than one might think. In order to game the system, a marketer would have to “bribe” most of the experts, most of the reviewers, and most of the bloggers, and that starts to get expensive and is probably impossible anyhow. Another piece of the puzzle of checks and balances is governments and organizations that can fight manipulation attempts. In the United States, for example, the FTC requires bloggers to disclose any material connections (such as payment or free product) they share with a company.22

Of course, even with the best safeguards, reviews will continue to be imperfect quality indicators. As we said up front, knowing the absolute value (assuming it exists and is unambiguous) is the extreme utopian case that will not be achieved. So we are talking about getting closer to that extreme, and this already seems to be happening. Peter Rojas, the founder of Engadget, Gizmodo, and gdgt.com, pointed out to us that reviewers—bloggers, journalists, and other expert reviewers—generally reach a broad consensus about a new gadget. “And it’s not some conspiracy. It’s just that the products tend to get the reviews they deserve,” he added.23 A report by a biased blogger (just like a fake review) is one piece of the puzzle. Any incorrect facts are pointed out quickly by readers, and opinions that go against the majority’s point of view have to be well argued or they are dismissed.

Further evidence that reviews help people assess the quality of products is the growing body of research showing that user reviews and expert reviews usually move in the same direction, a pattern that cannot be explained simply by the effect of experts on consumers. Michael Luca from Harvard found a strong positive correlation between expert and consumer opinions on Rotten Tomatoes, a movie-rating site.24 Luca also found a link between Yelp reviews of restaurants and hygiene grades (lower grades by city inspectors are associated with low ratings by Yelp reviewers). In another study, Luca and coauthors Loretti Dobrescu and Alberto Motta compared reviews of books on Amazon with reviews by professional critics and found that expert ratings are correlated with Amazon ratings (although experts tended to favor more established authors and award winners).25 Tim and Nina Zagat (old hands in the battle against fake reviews; they started their New York City guide in 1979) told Emanuel in a 2007 interview that they employ food critics in different cities as one of their many methods to detect unreasonable ratings.26

Joanna Langfield is the owner of The Good Life, a small vegetarian restaurant in Shrewsbury, England. Everything was going well until the summer of 2011, when the restaurant started getting very negative reviews on TripAdvisor and other review sites. “It started off quite extreme,” she told a reporter. “Someone posted a review calling me ‘arrogant’ and making other nasty references. TripAdvisor actually took that one down.” The reviews didn’t stop, though, and TripAdvisor was not willing to remove other reviews. Langfield felt powerless.

Beyond the emotional stress, the damage to the vegetarian restaurant was real. Before Christmas, the restaurant owner got a statement from her accountant showing an unusual dip. According to Langfield, profits fell by about 25 percent.27

After a long time, a man who worked for one of the review sites gave the police the IP address associated with the reviews, which eventually led them to the person behind the posts. It was a woman whose husband was a former partner of Joanna Langfield. The woman received a police caution for harassment and published a public apology in national newspapers.

In the same way that people are not becoming smarter in this new era, they are not becoming more (or less) honest because they have access to Yelp or TripAdvisor. The stories that opened this chapter are ricochets from an endless battle that most likely will continue into the future in the same way that the battle against shoplifting, credit card fraud, or crime in general will never reach an end. The anguish of people (like Joanna Langfield) who are victims of such manipulations is very real, and we should all fight back and try to curb attempts to game the system. Yet we doubt that manipulations will ever disappear. There’s always someone who stands to gain from distorting the truth—the marketer or his competitors—and there are angry or unreasonable people who will lie for a few bucks or for a variety of other reasons.

But such concerns and bumps on the road cannot reverse what is inevitable—user and expert reviews have the potential to provide essential information about quality and to help consumers make better decisions. If trust in a review site erodes beyond a certain threshold, that site will have a strong incentive to take action. If it doesn’t, consumers will migrate to sites they do trust and increase their dependence on experts, friends, and acquaintances. Aside from objective, verifiable specs and facts, the one source consumers are not turning to for information about quality is marketers. If anything, the opposite is happening: Consumers are looking for new and better ways to get closer to the absolute value of things (and as we discuss in Chapter 13, these tools keep coming). This is why we’re bullish about the trends that we described in the first part of the book. Now let’s turn to the second part, which examines how the shift from relative to absolute changes marketing forever.
