A recent study of some of Amazon.com’s top 1,000 reviewers has PC Magazine writer John Dvorak questioning the validity of their reviews:
In the first academic study of its kind, Trevor Pinch, Cornell University professor of sociology and of science and technology studies, independently surveyed 166 of Amazon’s top 1,000 reviewers, examining everything from demographics to motives. What he discovered was that 85 percent of those surveyed had been approached with free merchandise from authors, agents or publishers.
Pinch also found that the median age range of the reviewers he surveyed was 51 to 60, a surprise, he said, because the image of the internet is more of a young person’s thing. Amazon is encouraging reviewers to receive free products through Amazon Vine, an invitation-only program in which the top 1,000 reviewers are offered a catalog of free products to review…
This is the fraud aspect of the process that cannot be tolerated. And now to find out they are in a much older demographic makes me think they are just product hoarders who will say what they need to say to get more products. This conclusion is hinted at by the professor.
I do not like man-on-the-street reviews. I never have, and I’ve always thought they could be easily corrupted by smart public relations folks who have already dived into what they call social media. This includes phony personas on Twitter and Facebook that are used to sway public opinion, shipping free goodies to “influential” bloggers, and things like this Amazon scandal.
Dvorak is not really arguing that reviews are not valuable but rather that because Amazon does not fully disclose how these reviewers operate, customers could be duped. The problem here is trust: Dvorak and others might assume that reviewers are writing out of the goodness of their hearts when instead they are “professionals” who are effectively being “paid” in free merchandise. This is a classic gatekeeper problem: how do you know that a reviewer is trustworthy and giving unvarnished opinions? There are plenty of critics these days at various media outlets and websites. I suspect many average citizens have to read through multiple reviews from a single critic to see whether the critic’s tastes line up with their own or whether the critic is consistent.
Of course, Amazon relies on a crowdsourcing approach, much like aggregator websites such as Rotten Tomatoes or Metacritic. Do these top reviewers really sway people’s opinions about products when there are often many others reviewing the same products?
Why not ask Amazon whether critical reviewers have been kicked out of these programs? Dvorak is suggesting that these reviewers speak positively about products just to receive more; couldn’t Amazon fight back against this?
My first thought when I saw this study a while back was to wonder how confident Pinch could be in his findings based on 166 reviewers. Why not draw a larger sample from the top 1,000 reviewers?
(Side note: at the end, Dvorak applauds Pinch for tackling this topic.
By the way (and off topic), you should read my writings over the past 30 years, because I have been hounding sociologists around the world to begin to study these sorts of computer and Internet activities. Give Professor Pinch an award, will you! Maybe that will encourage more studies.)