"Let's watch a movie tonight," you say to your significant other, your best friend, your cat, inadvertently setting yourself on a path to discovering how Rotten Tomatoes lies to you. But whatever will you choose? With so many options, you've got to find a way to narrow down your choices. You quickly check Rotten Tomatoes to see what's "Fresh." Surely, if 87% of people enjoyed a film, you will as well. Right? Well, not so fast, because there are some deceptive things about Rotten Tomatoes.
Do you really know how Rotten Tomatoes works? Or, for that matter, does anyone? Film criticism in the age of corporate sponsorship is a tricky thing. After all, Warner Bros. owned Rotten Tomatoes outright for a time and still holds a minority stake, while Fandango owns the majority. Is Rotten Tomatoes really ensuring these films receive fair and meaningful reviews?
Perhaps you think all of this is a bit moronic. Does anyone really care about Rotten Tomatoes and whether or not we understand what it means? Well, yeah, actually. Because Rotten Tomatoes scores can make or break a movie, and are even used in advertising campaigns.
The Term "Consensus" Is Used Very Loosely
Rotten Tomatoes ostensibly offers a consensus of film critic reviews. Some of these reviews are negative, some positive. With enough volume, a consensus usually tips one way or the other. However, the sample size used by Rotten Tomatoes varies wildly, and is therefore not always indicative of any meaningful cultural consensus.
By way of example, Teacher of the Year, a 2015 mockumentary starring Keegan-Michael Key as the titular teacher, has a 100% on Rotten Tomatoes as of April 2017. Apocalypse Now, some movie from a guy called Francis Ford Coppola, has a 97%. You might see this and think, "Holy sh*t, Teacher of the Year is better than Apocalypse Now! I gotta see this movie!"
Not so fast. Teacher of the Year has seven reviews posted on Rotten Tomatoes. Only one of these is from a so-called Top Critic (more on that later). Two of them are from the same site (Kaplan vs. Kaplan) and one is from Cleveland.com, a publication of dubious repute.
Apocalypse Now has 79 reviews, 16 of which are from Top Critics. You get a lot more meaningful consensus from 79 reviews than you might from, say, seven. If seven people tell you a movie is good, and you only trust one of them, you might think, "Well, maybe." If 77 people tell you a movie is good (there are, apparently, at least two people in the world who don't like Apocalypse Now), and you trust 16 of them, you will see that movie.
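To make the arithmetic concrete, here's a minimal sketch, using the review counts quoted above, of how a Tomatometer-style score reduces to positive reviews divided by total reviews. The function name is mine; this is an illustration of the math, not Rotten Tomatoes' actual code.

```python
def tomatometer(positive: int, total: int) -> int:
    """Percent of reviews judged positive, rounded to the nearest point."""
    return round(100 * positive / total)

# Teacher of the Year: 7 reviews, all judged positive (April 2017 figures)
teacher = tomatometer(7, 7)       # 100

# Apocalypse Now: 79 reviews, 2 of them judged negative
apocalypse = tomatometer(77, 79)  # 97

print(teacher, apocalypse)  # 100 97
```

Note that the formula is blind to sample size: seven unanimous reviews beat 77-out-of-79, even though the larger sample is far more trustworthy.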
The plot thickens when a Google search reveals there are reviews in the wide world of the Internet for Teacher of the Year not included in the consensus, and not just random stupid blog posts. Why? Well, that's really at the discretion of Rotten Tomatoes.
Everyone’s A Critic, Literally
So who counts as a critic on Rotten Tomatoes? Are you reading a consensus based on reviews written by people who really know what they're talking about? Who work for top publications, have seen thousands of movies, studied film or film studies, know all the correct terminology and points of reference, and understand how the medium works and what makes a genuinely, objectively good movie? Or are you reading a bunch of drivel peddled by fans who used the Internet to weasel their way into critical discourse by espousing the opinions of fellow fans and thereby gaining readership, support, and, because of that, ad revenue, studio approval, and legitimacy?
Reviews from print publications across America make the cut, but how does the site deal with Internet scribbling? According to Rotten Tomatoes, "Online publications must achieve and maintain a minimum 500,000 unique monthly visitors according to comScore, Inc or Nielsen Net Ratings and reviews must have an average length of at least 300 words." Sure, 500,000 seems like a big number, but with sites pulling down tens of millions of unique visitors every month, it isn't, really. And you can apply to be a Rotten Tomatoes-approved reviewer right there on the site.
Everyone sees movies. You see a movie, you think, "I didn't like that." That's your subjective, probably uninformed (no offense, but it's true) opinion, not film criticism. Still, you might be inclined to tell the world about it via the Internet. If other people with your subjective opinion agree with you and you earn yourself a lot of readers because of that, you can get your reviews featured right there on Rotten Tomatoes alongside people with 20 years' experience and a degree in film studies. You might not be a Top Critic, but who actually pays attention to those distinctions? Suddenly, your subjective, uninformed opinion is just as valid as far more objective reviews from actual film critics.
As box office analysis reveals, negative Rotten Tomatoes scores have an impact on the success of a film. Let's say you and all your fellow fan reviewers give a movie negative reviews. That's tantamount to saying a filmmaker should've done something different, right? That you know better than filmmakers what makes a good film. And maybe that's right. Who knows. But it probably isn't.
To put this in perspective, pretty much everyone except the homeless lives in a building of some kind, right? You write your movie reviews at a desk in an apartment. Say there's something that annoys you about the layout of your building. You haven't studied architecture or structural engineering, have you? Probably not.
Do you feel emboldened to state your subjective opinion on these perceived problems with your building in a public forum, in such a way that would have a negative impact on an architect or engineer? Do you think you should really be telling these people what to do, or will you just sit silently and think, "I don't like this, but I should let the people who know what they're talking about do their jobs"?
The Site's Historical Perspective Is Super Screwy And Might Mess With Your Head
Rotten Tomatoes has all the movies. There's stuff from the 1920s on there and stuff that hasn't come out yet. And all these movies are lumped together under the same grading system, despite some of them being rated via glowing retrospectives offering historical context and social significance, and others being rated on the basis of a single viewing (maybe on a day during which a reviewer saw five other movies), with no chance at full social and historical perspective.
Here's where Rotten Tomatoes lies to you: according to an analysis of Rotten Tomatoes data by the good folks at Slate,
"Tomatometer rating is strongly influenced by its age. Films from the 1920s, for instance, have an average Tomatometer rating around 91 percent, while films from the 1990s average around 55 percent. Movies might have gotten worse since the Great Depression, but not that much worse. The golden-oldies effect may be explained by a bias toward reviewers reviewing, or Rotten Tomatoes scoring, only the best movies from bygone eras. Rotten Tomatoes includes a score for Casablanca from 1942, for example, but leaves out clunkers from the same year like The Corpse Vanishes and Lady Gangster."
On top of this, there's a huge difference between a review, written as an immediate reaction to something and with no real purpose other than to tell a readership whether or not to see a movie and why, and a piece of film criticism. You can see these differences when comparing scores and consensus on movies.
Everyone loves the Toy Story films. Only a cynic could hate them. From a critical perspective, they're well-structured, visually innovative, and have solid jokes and performances. They're also damn entertaining. It's easy to overlook any flaws.
The first Toy Story has a Rotten Tomatoes rating of 100%. The consensus reads "Entertaining as it is innovative, Toy Story reinvigorated animation while heralding the arrival of Pixar as a family-friendly force to be reckoned with." That's what you call historical perspective, and film criticism. The film changed animation, and is rightfully regarded as a watershed moment.
Toy Story 2 also has a Rotten Tomatoes rating of 100%. The consensus reads "The rare sequel that arguably improves on its predecessor, Toy Story 2 uses inventive storytelling, gorgeous animation, and a talented cast to deliver another rich moviegoing experience for all ages." That is a review. You see the difference, right?
La Dolce Vita, widely regarded as one of the greatest movies of all time, has a Rotten Tomatoes rating of 97%. Dave Kehr of the Chicago Reader and Stanley Kauffmann of The New Republic are the sole dissenters, and both raise philosophical objections to what director Federico Fellini says in the film. Should you believe, because two film critics take exception to the point Fellini makes, that La Dolce Vita is a lesser film than Toy Story or Toy Story 2? And, in the case of Toy Story 2, is it fair to compare relatively short reviews to the lengthy, thoughtful, philosophical film criticism that La Dolce Vita faced?
The Percentage Doesn't Work Like Percentages Usually Do In Your Life
You want to see a movie. You head on over to Rotten Tomatoes. Say you see The Martian on there. Sure, who doesn't like Matt Damon? And space? And Ridley Scott? Wow, look at that! A 92%! Hells to the yeah! Buying a ticket on Fandango immediately (by the way, did you know Fandango owns Rotten Tomatoes? Fun fact. Anyway).
You look at a score expressed as a percentage and you probably think of one of two things: a school test score or an SAT percentile. Remember when you took the SAT and it told you, alongside your score, your percentile? That tells you what percentage of test-takers you outscored. If you landed in the 92nd percentile, that's crazy! You did better than 92% of people who took the test. At school, if you got a 92 on a test, that meant you only got 8% of the answers wrong. Hella smart.
That's not how Rotten Tomatoes works. Rather, someone (probably a team of people) goes through reviews for a film (or, in some cases, reviewers post for themselves, which presents a host of other problems, as you'll read later). For each review, someone decides whether the review is, on balance, positive or negative. In many cases, reviewers do not provide grades or ratings, so making this decision can turn into a subjective crapshoot. The film's percentage is then simply the share of its reviews judged positive, not an average of the grades those reviews gave.
For The Martian, a 92% means 92% of reviewers decided the movie was more good than bad. It's possible a number of these reviewers thought it was mediocre, but entertaining, and therefore worthy of a positive review. The number does not reflect a grade in the way it would on a test.
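One way to see why the percentage isn't a grade: imagine five critics who each rate a movie a lukewarm 6 out of 10. A hypothetical sketch follows; the 6/10 ratings and the "anything above 5/10 is fresh" threshold are illustrative assumptions, not Rotten Tomatoes' published methodology.

```python
# Hypothetical: five critics each rate the film 6/10 -- mediocre but positive.
ratings = [6, 6, 6, 6, 6]

# Tomatometer-style score: share of reviews judged positive.
# Assume anything above 5/10 counts as positive (an illustrative threshold).
fresh = sum(1 for r in ratings if r > 5)
tomatometer = round(100 * fresh / len(ratings))            # 100

# A grade-style average of the same reviews tells a different story.
average = round(100 * sum(ratings) / (10 * len(ratings)))  # 60

print(tomatometer, average)  # 100 60
```

Five shrugs of "eh, it was fine" produce a perfect 100% Tomatometer, while the same reviews averaged as grades yield a C-minus. That gap is the whole trick.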