This article is part of the On Tech newsletter. You can sign up here to receive it on weekdays.
This week, Amazon acknowledged reality: it has a problem with fake reviews.
The problem is that Amazon blamed almost everyone involved in untrustworthy reviews, and not nearly enough the company itself. Amazon criticized Facebook but failed to acknowledge that the two companies share an underlying weakness that could undermine people's trust in their services: the inability to effectively monitor their own sprawling websites.
Learning from the crowd is a promise of the digital age that has not come true. Before we buy a product, book a hotel or see a doctor, it can be great to weigh feedback from others. But it has become so common and lucrative for businesses to pay for reviews on all kinds of websites, or to manipulate them in other ways, that it's hard to trust what we see.
The persistence of fake reviews raises two big questions for Amazon: How much attention does Amazon really give to stopping fake customer feedback? And would buyers be better off if Amazon rethought its identity as the store for (almost) everything?
Amazon's rules prohibit companies from offering people money or other incentives in exchange for reviews. Amazon says it blocks most fake reviews and works to stay a step ahead of rule breakers. Yet the global review-fraud industry is active on Amazon, and everyone knows it.
According to the Vox publication Recode, Amazon appears to have been prodded by the Federal Trade Commission and by journalists into taking action against manipulated reviews.
After a Wall Street Journal columnist wrote this week about buying a RAVPower charger that came with a postcard offering a $35 gift card in exchange for a review, the seller said Thursday that it had been banned by Amazon. (The statement is in Chinese, and I read it on Google Translate.) That was followed by bans on several other large sellers that have apparently been buying reviews for years.
If government lawyers and newspaper columnists can find sellers who openly manipulate reviews, how hard is the company really looking for them?
Perhaps you think the world works like this: buyer beware. When I read reviews of products on Amazon or of doctors on Zocdoc, the feedback is helpful, but I take it with a grain of salt.
But unfortunately, a lot of people are harmed by fake reviews, and they are not always easy for us to spot. The Washington Post recently wrote about a family that was misled by paid-for Google reviews for an alcohol addiction treatment center. I wrote last year about research that found Amazon does catch many purchased reviews, but often only months later, after shoppers have already been misled into buying a product.
I wish Amazon took more responsibility for the problem. In its statement earlier this week, the company blamed social media companies and weak regulatory enforcement for fake reviews. Amazon is right. Fraudulent online reviews are big business with many enablers. Facebook and China's WeChat app aren't doing enough to shut down forums where companies coordinate review manipulation.
But Amazon didn't say much about what it could do differently. For example, the University of California researchers I spoke with last fall found that purchased reviews are far more common among China-based sellers and for products where many vendors sell a nearly identical item. Perhaps that means Amazon should monitor China-based sellers more closely? Or that it would help to limit the number of sellers allowed to offer essentially the same product?
Strong ratings also help sellers appear prominently when people search for products on Amazon, which creates a huge financial incentive to commit fraud. Should Amazon reconsider how it weighs ratings in search results? The company didn't say.
Most of all, it is disappointing that Amazon won't acknowledge that fake reviews are a consequence of its decision to prioritize quantity over quality.
People can buy almost anything on Amazon, from almost any seller. That can be great for buyers, but it comes with trade-offs. Being the store for everything, and one that tries to operate with as little human intervention as possible, makes it harder for Amazon to stamp out fake or dangerous products and purchased reviews.
Before we go …
- No more "speed filter": NPR reports that Snapchat is phasing out an app feature that let people record and share how fast they were moving. Road safety advocates say the feature encouraged young people for years to drive recklessly in pursuit of bragging rights.
- Myth-busting with WhatsApp: During the pandemic, government health workers in rural India used WhatsApp to counter misinformation about the virus, The Verge reports. Debunking claims on the app is time-consuming for health workers, but the combination of online messaging and in-person conversations seems to be keeping many people safe.
- LOOK AT THE GIANT BUNNY: My colleague Amanda Hess spoke with people who post online videos of their numerous and exotic animals. The PetTube niche caters to our love of sight gags, like a pile of snakes slithering over a piano, but these people love animals, too, "even potentially repulsive flocks of animals," Amanda wrote.
Hugs to this
A baby seal tests the water. The little one goes from uncertainty to joy at lightning speed.
We want to hear from you. Tell us what you think of this newsletter and what else you would like to learn from us. You can reach us at ontech@nytimes.com.
If you don't already get this newsletter in your inbox, please sign up here. You can also read past On Tech columns.