Ranjan here, talking about datasets stuck in time, Rotten Tomatoes, and internet reviews.
My co-host Can is heading to Beirut this weekend. I brought up how, when I visited, I was shocked at how good Foursquare was for recommendations. We agreed it can still deliver better results than a TripAdvisor or Yelp, all over the world. There are a few things that make it a really intriguing dataset:
I'm guessing the only demographic that ever really used it has similar tastes to mine. For a few glorious years you had moderately affluent, tech-forward, urban professionals roaming the world and actively reviewing places.
It was never overrun with spammers because it never became that important a platform. I'm guessing no black hat marketers ever spent time going after Foursquare. That's what Yelp and Google Reviews are for.
The company never had a massive, operational advertising model built on the ratings. They seemed to live in the pre-monetization purgatory of late-stage growth funding for a while, and now, I think they're selling my location data to hedge funds, or something like that. But it never got to an IPO-level revenue push that forced them to build in aggressive ad products.
It's like this weird dataset, trapped in amber, built by a very specific audience that was never corrupted by external influences, and is allowed to currently exist courtesy of an exogenous business model.
Rotten Data
On the other side of the dataset purity spectrum, you have Rotten Tomatoes. For a while, if a movie had a Rotten Tomatoes rating above 90%, I watched it without hesitation. Regardless of the genre, it almost always delivered. Not just a decent movie, but one so good it makes you spend a few hours on Reddit afterwards, trying to learn more.
It's not like that anymore. A 90%+ movie is still pretty good, but not the lightning strike it once was. This recent study from a movie marketing consultant validated my suspicion that there has been ratings inflation:
The average rating from 1997-2010 for all movies that were widely distributed (1000+ screens) was 44.6. In 2018, it was 57.2, steadily increasing since 2010 (with a slight dip in 2012-13).
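For the curious, the math behind that kind of number is nothing exotic. Here's a minimal sketch of the year-over-year average for wide releases, assuming a hypothetical CSV with `year`, `screens`, and `score` columns (the file and column names are my own, not the consultant's data):

```python
import csv
from collections import defaultdict

def yearly_average_scores(path, min_screens=1000):
    """Average Tomatometer score per year for releases shown on min_screens+ screens."""
    scores_by_year = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["screens"]) >= min_screens:
                scores_by_year[int(row["year"])].append(float(row["score"]))
    # One average per year, sorted chronologically
    return {
        year: round(sum(scores) / len(scores), 1)
        for year, scores in sorted(scores_by_year.items())
    }

# yearly_average_scores("wide_releases.csv") would surface the same drift
# the study describes: mid-40s averages through 2010, high-50s by 2018.
```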
My technically minded co-host has pointed out that I usually ascribe all product outcomes to underlying business models. The revenue channels, the capital structures and the corporate ownership end up driving the product. But hey, I'm an ex-finance MBA turned business development type.
I knew Rotten Tomatoes is currently owned by Fandango, but in writing this post I learned the longer-term story (from Wikipedia):
Rotten Tomatoes is an American review-aggregation website for film and television. The company was launched in August 1998 by three undergraduate students at the University of California, Berkeley: Senh Duong, Patrick Y. Lee, and Stephen Wang. The name "Rotten Tomatoes" derives from the practice of audiences throwing rotten tomatoes when disapproving of a poor stage performance.
Since January 2010, Rotten Tomatoes has been owned by Flixster, which was in turn acquired by Warner Bros. in 2011. In February 2016, Rotten Tomatoes and its parent site Flixster were sold to Comcast's Fandango. Warner Bros. retained a minority stake in the merged entities, including Fandango.
Correlation doesn’t always equal causation, but… right after the purchase of the site by Flixster (a venture-backed social-networking site for movie buffs) in 2010, the average rating started climbing. Things are up and to the right after the 2016 Fandango purchase.
Of additional note: in 2015 (prior to the acquisition), Walt Hickey studied how Fandango ratings were already inflated relative to IMDb, Metacritic, and Rotten Tomatoes.
Fandango's own ratings were already biased positive, and after the Rotten Tomatoes acquisition, Rotten Tomatoes' scores started climbing as well.
They sell movie tickets. They want us going to the movies.
Follow the Money
If you read my Margins newsletter on stock index construction, you may have noticed that I think about the business models which underlie "neutral" ratings systems. But it is skepticism, not cynicism. If the company that owns the rating system sells movie tickets, and everyone else in the movie industry has a strong economic incentive for the ratings to be better, they're going to be better.
I'm more fascinated by the delicate balance of trust that makes these reviews valuable to a platform. What level of bias can be injected before that trust is broken, and consequently, when is there an impact on the overall business?
Rotten Tomatoes is an incredibly effective and cheap marketing channel for Fandango. They also make some money on display ads, and they might get paid to feed their ratings into services like Apple TV (I’d think so). If we all stopped going to RottenTomatoes.com, how does that change Fandango's marketing cost structure? What does that trust provide to their bottom line? Is there a moment that it’s significant enough that they would finally have an incentive to fix ratings inflation?
Fake Amazon Reviews
A much bigger related story is the clusterfuck that Amazon reviews currently are. The Hustle had a fantastic piece on going "Inside the fake Amazon review complex" and there has been a ton of related coverage. Most of my friends who work in ecommerce look at them as a joke. Both Amazon and 3rd-party sellers have a strong incentive to see higher ratings - higher ratings necessarily mean more transactions. Amazon's incentives to crack down only kick in if people start to leave the platform because of too many misleading purchases.
I've mostly lost trust in aggregate Amazon ratings but I still use them a lot, just not for product discovery or research. This could one day start to matter to them. Customers ignoring the Amazon platform for discovery because of a lack of trust in ratings could hurt them at some point, especially in their burgeoning ad business. But we're not quite there yet.
Side note: I’ll once again recommend the Fakespot Chrome Extension, which overlays an adjusted rating right into the Amazon site - and no, I am not compensated for this :)
Opportunity
That slow erosion of trust in reviews does present a big opportunity for startups and competitors. I religiously use The Wirecutter to begin the majority of my product research. They still mostly transact through Amazon affiliate links, but I have increasingly seen them linking to other commerce sites (Target, Jet/Walmart). While that beautiful Foursquare dataset saves me from the cesspool of Yelp, I have increasingly begun using The Infatuation for restaurants. Market entrants can capitalize by taking over product discovery and recommendation elements in creative ways if the trust in reviews continues to break down. Maybe one day I'll be going back to the NYTimes for movie reviews.
Human curation, FTW.
Related: How Rotten Tomatoes Ratings Work
Rotten Tomatoes, explained: This Vox piece does a great job laying out the critic-driven Tomatometer scoring system. It feels constructed reasonably enough to avoid significant bias, unlike the audience meter, which is prone to sexist or racist spam attacks (which actually led to a recent methodology change):
The opinions of about 3,000 critics — a.k.a. the “Approved Tomatometer Critics” who have met a series of criteria set by Rotten Tomatoes — are included in the site’s scores, though not every critic reviews every film, so any given score is more typically derived from a few hundred critics, or even less. The scores don’t include just anyone who calls themselves a critic or has a movie blog; Rotten Tomatoes only aggregates critics who have been regularly publishing movie reviews with a reasonably widely read outlet for at least two years, and those critics must be “active,” meaning they've published at least one review in the last year. The site also deems a subset of critics to be “top critics” and calculates a separate score that only includes them.
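To make that scoring concrete, here's a toy version of the logic: the Tomatometer is essentially the share of approved-critic reviews that are classified as positive ("fresh"). The `Review` class and field names below are my own illustration, not Rotten Tomatoes' actual code:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Review:
    critic: str   # an "Approved Tomatometer Critic"
    fresh: bool   # True if the review is classified as positive

def tomatometer(reviews: List[Review]) -> Optional[float]:
    """Score = percentage of approved-critic reviews that are fresh."""
    if not reviews:
        return None  # no approved reviews yet, no score
    fresh_count = sum(1 for r in reviews if r.fresh)
    return round(100 * fresh_count / len(reviews), 1)

# A film with 270 positive notices from 300 critics lands at 90.0
sample = [Review(critic=f"critic_{i}", fresh=(i % 10 != 0)) for i in range(300)]
print(tomatometer(sample))  # 90.0
```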
A Song
Soldi - Mahmood: I just discovered you can embed Spotify tracks in Substack so I'll try ending this newsletter with a song - there's still a geopolitical and identity media element.
A rapper named Alessandro Mahmoud won Sanremo, the big song festival in Italy. He was born in Italy, to an Egyptian father, and that was enough for the right-wing government to say this song was not genuinely Italian.
It's pretty good.