I was thinking about buying a probe thermometer because I would like to better assess the doneness of my meat, so I went to Amazon to take a look. I typed in “probe thermometer” and Amazon returned a set of thermometers. They vary in price and functionality, but they all have one thing in common: 3.5 stars.
The other day, I decided I wanted to try a new Indian restaurant in my area other than the few I always frequent. Lo and behold, they all have 3.5 stars on Yelp.
It seems obvious that, given enough reviewers, customer reviews should tend to mellow out around, well, 3.5 stars. But there are notable exceptions (consider, for example, IMDB).
What is it about product/service reviews in particular that seems to promote the averaging out around 3.5 stars? Let’s take one heavily reviewed Amazon product — an excellent book that I recently read called The Art of Fielding (in my opinion, the best book ever written about baseball):
5 star: (100)
4 star: (34)
3 star: (35)
2 star: (29)
1 star: (50)
It’s an odd distribution, to say the least, but it highlights what is perhaps the real problem: the kind of person who writes online reviews. Their ratings are overwhelmingly very high (“I enjoyed this book so much that it warranted a review”) or overwhelmingly low (“this book was so bad that I decided to review it”). Compare this with the reviews for the digital probe thermometers and the effect is similar:
5 star: (41)
4 star: (23)
3 star: (10)
2 star: (12)
1 star: (28)
Again, lots of 5-star reviews and lots of 1-star reviews. And again, this points to the kind of person willing to write a review of the product. By buying the product, you had already assessed it and expected it to be worth your money. If you were extremely impressed or disappointed, you reviewed it. If you were meh, why bother reviewing?
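The back-of-the-envelope arithmetic here is easy to check: take the star counts listed above and compute a weighted average. (The counts are the ones from the two distributions in this post; the function name is my own.)

```python
def average_rating(counts):
    """Weighted average star rating from a dict of {stars: review count}."""
    total_reviews = sum(counts.values())
    total_stars = sum(stars * n for stars, n in counts.items())
    return total_stars / total_reviews

# Star counts from the two products discussed above
fielding = {5: 100, 4: 34, 3: 35, 2: 29, 1: 50}      # The Art of Fielding
thermometer = {5: 41, 4: 23, 3: 10, 2: 12, 1: 28}    # probe thermometer

print(round(average_rating(fielding), 2))     # 3.42
print(round(average_rating(thermometer), 2))  # 3.32
```

Despite the very different shapes of the two distributions, both land in the mid-threes, which a storefront that rounds to half stars will happily display as 3.5.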
Thus, all popular products inevitably end up in the meh bin.
Weirdly, this does not apply to non-book pop culture. Adele’s 21, for example, sports a shocking 4.5-star rating. But this has to do with the ability to sample the wares before you buy (the same goes for movies). Before buying an album, you have usually heard the songs enough times to know whether you like it, so the purchase itself filters out the would-be 1-star reviewers.
What’s unnerving about this tyranny of the 3.5-star review is that it makes the customer reviews essentially worthless. The whole promise of crowd-sourced reviews was that they would break the monopoly of professional product reviewers and open everything up to the masses. Instead, the incentives around who bothers to respond mean that the reviews provide little to no value to the consumer.
Whether it’s recipes or Zadie Smith novels, the “pretty good” averaging out of reviews has hurt their ability to tell us much about the product we’re buying. In the end, we either end up trusting the qualitative reviews over the quantitative (a bad proposition, if you ask me) or we buy the product and hope for a generous return policy.
As with so much of our new information-heavy world, these reviews don’t make our decisions any easier; they’re just more information, with no understanding of what it really means.