Five Stars? Think Again


In theory, ratings and reviews should improve both customers’ shopping experiences and the quality of products and services in a marketplace. With the boom of e-commerce and ratings sites, this has largely happened: the cost of “slipping up” for businesses and online vendors has risen substantially. Scanning through reviews has become second nature when choosing restaurants or hotels, as shown by the popularity of Yelp and Tripadvisor, sites that provide detailed customer feedback on service-based establishments. However, fake reviews have grown not only more prevalent but also more sophisticated. Amazon uses an algorithm to identify and delete fake reviews, but in response, the fakes simply become harder to detect.

One flaw in the system of online customer feedback may be the imbalance of incentives: businesses have a strong reason to collect ratings, while customers have little reason to give them. Since providing feedback is usually voluntary, it is perceived as extra time and effort that most people are unwilling to expend. As a result, typical customers don’t bother to leave ratings (much less write reviews) unless their experience was so unpleasant that they feel compelled to give honest, but negative, feedback. In this sense, leaving a helpful review about a hotel stay or dining experience is an essentially selfless act. On the flip side, one study found that each additional star in a business’s Yelp rating corresponds with noticeably higher revenue.

So it is unsurprising when schemes to cheat the system take place, as one did in the South Bay area of Los Angeles in 2012. Members of a regional business-networking association agreed to exchange positive reviews on Yelp in order to mutually benefit from greater customer traffic. Fortunately, Yelp’s built-in algorithms were able to identify these reviews as fake and delete them, exposing everyone involved in the scheme. Nonetheless, the incident shows how easy and potentially profitable it is for businesses to falsely inflate their reputations on Yelp. Similarly, in 2015, Amazon sued vendors who had used the freelance site Fiverr to recruit strangers to write seemingly authentic paid reviews of their products. This was arguably more malicious because it deceived customers who weigh the “helpfulness” and overall quality of written reviews, not just a product’s aggregate rating out of five stars.

Although the internet has the potential to make online marketplaces more transparent, evidence like the rise of fake news shows that it is often exploited for selfish ends. For this reason, technology designed to detect false content has been advancing rapidly, to the point where it can outperform humans at spotting fakes. Such algorithms look beyond the review itself and into the profile of the reviewer, noting whether the account was recently created, whether it has a history of reviewing products or businesses, and other characteristics most humans wouldn’t bother to investigate. Another kind of algorithm ranks reviews by how detailed, relevant, and helpful they are to readers.
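To give a sense of what these detectors look at, here is a minimal, hypothetical sketch in Python of a rule-based suspicion score that combines reviewer-profile and review-text signals. The specific features, thresholds, and weights are invented for illustration; they are not Yelp’s or Amazon’s actual rules, and real systems typically learn such signals with machine-learning models rather than hand-coded checks.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class Review:
    text: str
    rating: int            # 1-5 stars
    account_created: date  # when the reviewer's account was made
    review_count: int      # how many reviews the account has posted
    posted_on: date        # when this review was posted


def suspicion_score(review: Review) -> float:
    """Return a rough 0-1 score; higher means more likely to be fake.

    Illustrative heuristics only; weights and cutoffs are made up.
    """
    score = 0.0

    # Brand-new accounts that immediately post reviews are a common red flag.
    account_age_days = (review.posted_on - review.account_created).days
    if account_age_days < 7:
        score += 0.4

    # Accounts with little or no review history carry less credibility.
    if review.review_count <= 1:
        score += 0.2

    # Very short, generic text ("Great product!!!") is weak evidence
    # of a real experience.
    if len(review.text.split()) < 10:
        score += 0.2

    # Extreme ratings are disproportionately common among paid or
    # retaliatory reviews.
    if review.rating in (1, 5):
        score += 0.2

    return min(score, 1.0)


if __name__ == "__main__":
    r = Review(
        text="Amazing!!! Best ever!",
        rating=5,
        account_created=date(2023, 6, 1),
        review_count=0,
        posted_on=date(2023, 6, 2),
    )
    print(f"suspicion score: {suspicion_score(r):.2f}")  # flags this one as suspicious
```

The point of the sketch is simply that the strongest signals come from context, such as account age and review history, rather than from the review text alone, which is why these checks are so hard for an ordinary reader to perform by hand.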

Still, it is estimated that around 30% of Amazon reviews and 10-20% of hotel and restaurant reviews are fake. The best option for customers may be to simply ignore reviews that are short and uninformative and instead look for ones that include photos and objective descriptions. Yelp discourages businesses from rewarding customers for reviewing them, but the site should at least offer some incentive, such as the kind of gratification that social media provides. Just as posting stories on Snapchat and photos on Instagram have become normalized, so should writing reviews that others can benefit from. After all, the internet is, above all, an open forum that, for better or worse, gives everyone a voice.