On 25th of May 2011, a young bachelor downloaded every score from review aggregator Metacritic in an attempt to educate, entertain and get laid.
These are the MetaCritique Graphs.
The 7-9 Scale: Fact or Fiction?
7 is average, 8 is good, 9 is great. Working magic like a magician, adding up like a mathematician, colloquially it feels like video game reviewers transform a 10-point scale into a 3-point one.
But how much truth is there to this old nerd's tale? Metacritic averages select publications' review scores into a standardised measure, the MetaScore. The 7-9 scale jokes got old a long time ago, but can Metacritic prove to us whether it's funny because it's true?
The distribution of MetaScores is indeed significantly shifted toward 70, with a mean MetaScore of 67.9 but a most probable score in the low 70s.
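If you fancy poking at that mean-versus-mode gap yourself, here's a minimal Python sketch. The scores below are made up for illustration - not the real scraped dataset - but they share the real distribution's long left tail:

```python
from statistics import mean, mode

# Made-up MetaScores for illustration only -- not the real scraped data.
# A left-skewed pile like this resembles the shape of the real distribution.
scores = [20, 35, 50, 60, 68, 70, 71, 71, 72, 72, 72, 75, 80, 85]

avg = mean(scores)          # the "average MetaScore"
most_common = mode(scores)  # the single most probable score

print(f"mean: {avg:.1f}, mode: {most_common}")
```

A handful of low stinkers drags the mean below the mode, exactly the way the long left tail does on the real graph.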
Metacritic officially launched in 2001 but contains a smattering of older reviews dating back to 1994's Doom II, which scores 83. Scrolling through the years we see that the skewed, not-quite-normal distribution develops early on and is fully established after the 2001 official launch. So it appears that reviewers do in fact favour the top half of the scale, and have been doing so for at least a decade. The 7-9 scale joke is ten years old! Time to grow up and move on?
In some ways Metacritic's secondary traffic-light grades correctly adjust for, and try to move past, the much-maligned game score inflation. In contrast to the film, TV and music grades of 0-39 Unfavourable, 40-60 Mixed and 61-100 Favourable, the game bands are elevated respectively to 0-49, 50-74 and 75-100.
I'm paraphrasing MetaScore's grades slightly: they further subdivide the greens and reds into universal acclaim and overwhelming dislike. For more bore you can check out their official explanations and conversion snore factors: MetaScores Explained.
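For the curious, the traffic-light conversion boils down to a couple of thresholds. Here's a toy sketch (the function name is mine, and it ignores the acclaim/dislike sub-bands, as above):

```python
def metacritic_grade(score, medium="game"):
    """Traffic-light grade, per the thresholds quoted above.

    Games use the elevated 0-49 / 50-74 / 75-100 bands; film, TV and
    music use 0-39 / 40-60 / 61-100. Simplified: the 'universal acclaim'
    and 'overwhelming dislike' sub-bands are ignored.
    """
    lo, hi = (50, 75) if medium == "game" else (40, 61)
    if score >= hi:
        return "favourable"
    if score >= lo:
        return "mixed"
    return "unfavourable"

print(metacritic_grade(74))          # a 74 for a game...
print(metacritic_grade(74, "film"))  # ...versus a 74 for a film
```

The same 74 is merely Mixed for a game but comfortably Favourable for a film - which is the inflation adjustment in a nutshell.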
Growing Consensus: ...but we know what we like!
Another frustrating aspect of game reviewers is their chorus-like approval when praising God's latest gift to gaming. I know I can't be the only person to notice this.
And I'm not the only person. Sean Houghton performed the above study of similar review aggregator GameRankings, looking into individual publications' scores in relation to the averaged GameRankings score.
Vividly he shows that as the GameRankings score of a game increases, the distribution of publications' scores tightens around the mean. As games get "better", the independent reviewers - completely independently, probably in hermetically sealed chambers - converge on the same score. Everyone agrees on the absolute best, and few publications disagree when everyone else does. Why rock such a good-looking boat?
Is such a unanimous verdict in reviews of these critically praised games a good thing? These are just numbers we're looking at here, but I find that the disappointingly homogenised, often gushy, reviews of critical darlings all read the same. Mainly they read like the back of the box - but that's a topic for another graph, and I don't think I have all the Zs I'd need to make it.
More variety in publication scores is found in the broader distributions for games with a lower GameRankings score (40-60). It's when games "divide the critics" like this that I find interesting and varied reviews.
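The consensus effect Houghton charts is essentially a statement about spread. A toy sketch, with invented scores standing in for real publication data, of how the standard deviation shrinks for a darling and balloons for a divisive game:

```python
from statistics import mean, stdev

# Invented publication scores for illustration: each entry is one game's
# individual review scores (not real GameRankings data).
games = {
    "critical darling": [94, 95, 96, 93, 95],
    "divisive title":   [45, 70, 55, 80, 40],
}

for name, scores in games.items():
    # A small standard deviation = the hermetically sealed chambers agree.
    print(f"{name}: mean {mean(scores):.0f}, spread {stdev(scores):.1f}")
```

The darling's reviewers sit within a couple of points of each other; the divisive game's scores are scattered across half the scale.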
Care and attention are given to differentiating between amazing and outstanding shades of green when a game delights. But if it fails to entertain, when a game gets slammed by reviewers, the same courtesy is far too often forgone in place of ridicule and dismissal. Numerically we see the lower half of the scale is not even fully realised.
The lowest scoring game on Metacritic is Big Rigs: Over the Road Racing with a lowly MetaScore of 8 from 5 publications. I shudder to think of all the corny trucking jokes those reviews contain.
The Inner Circle: Greatest Game, Ever
Let's leap up to the top of the scale and have a look at the highest rated games of all time.
Pretend you're a publisher or game developer and it's your dream to make the greatest game of all time. Woah! Stop right there! Who told you to think such crazy thoughts? I know this fictional career change has been short-lived, but game development is clearly not for you. Unless you work at Nintendo - or, better yet, you're the latest incarnation of Link - the dream of making the greatest game of all time is probably not going to happen.
Or is it? There's one surprising entry amongst the Hall of Famers: Out of the Park Baseball 2007. Apparently people exist who have disputed this baseball management sim's suspiciously high MetaScore, citing its low number of publications - only 5 - as a valid reason for its exclusion.
But it seems unjust to be affronted by the inclusion of a good baseball game because of its low number of reviews when, not four paragraphs ago, we all chuckled at Big Rigs: Over the Road Racing. Remember: numbers are meaningless without context. So I'm sure we can all take MetaScores a little more seriously now we've seen, mathematically, that Out of the Park Baseball 2007 is as good as Half-Life 2.
Publishers: I am not a number, I am a free man
It tops so many pointless Top 100 nostalgia fests it's unsurprising to discover Legend of Zelda: Ocarina of Time is Metacritic's highest rated game of all time. In many ways it’s to be expected when Nintendo are involved: They’re a large, wealthy, experienced and powerful company who are also very, very good at what they do. They’re one of the Big Guys.
So let’s take into account publisher experience and organise their average scores by the size of their catalogue.
You may have heard the online hullabaloo caused when Metacritic decided to begin rating developers in a similar manner to the games they produce. They've since abandoned this plan. Well if you wanted to chart which publisher is the cutest, this is one of the ways we could visualise such a high-school popularity system.
Now we can also quantify how good the big publishers actually are. Pretty good and big, eh? Here the 'Number of Games' counts multi-platform releases of the same title as separate games - which, in terms of programming, they are.
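For the record, the per-publisher averaging is nothing fancier than a group-by. A toy sketch with invented rows (publishers and titles are made up; multi-platform releases counted separately, as above):

```python
from collections import defaultdict
from statistics import mean

# Invented (publisher, platform, title, metascore) rows standing in for
# the scraped dataset. Note "Shooter V" appears twice: one entry per
# platform, so it counts as two games in the catalogue.
rows = [
    ("BigPub",  "PC",   "Shooter V", 82),
    ("BigPub",  "X360", "Shooter V", 80),
    ("BigPub",  "PC",   "Racer 3",   71),
    ("TinyPub", "PC",   "Indie Gem", 90),
]

by_publisher = defaultdict(list)
for publisher, platform, title, score in rows:
    by_publisher[publisher].append(score)

for publisher, scores in by_publisher.items():
    # Catalogue size and average MetaScore per publisher.
    print(publisher, len(scores), round(mean(scores), 1))
```

From there it's one scatter plot away: catalogue size on one axis, average score on the other.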
There are a staggering number of smaller publishers who are doing well regardless of my never having heard of them. Congratulations! I retroactively bestow upon you my blessings. In many cases the less experienced, often first-time, publishers are outperforming the industry veterans. The little fish can outswim the big fish.
Even without the bloated Activisions and Ubisofts, the average MetaScore still hovers around the 65-70 mark. So if anything it almost appears the larger publishers are raising the standard for everybody. To you I bestow my permission to continue pumping out product.
Taking a closer look at the high-scoring, smaller publishers, we see the shooting-star games which set the sky on fire first time. id Software's Quake and 2D Boy's World of Goo share the highest score in this round, both at 94. We also now see the emergence of the comparatively conservative giants like Rockstar, Blizzard, Valve and, our new favourite publisher, Out of the Park Developments.
If you've made it this far, hopefully you've realised you can interact with most of these graphs. These two in particular are pretty nifty.
Releases: That Time of the Week/Month/Year
This viz stopped working and I can't figure out why. Arguably it works better as a picture anyway. But if you do want to argue then by all means go ahead.
Turns out there is no correlation between high MetaScores and release date. Boy, this is awkward. Here's what we know for sure: most games are released in the latter quarter of the year, right after the summer drought. The strategic thinking behind this may be based on a company's annual financial and market forecasting. It may involve a fat man in a red suit. Or, as Jamie Madigan points out on the highly recommended Psychology of Video Games, perhaps it's the opportunity to capitalise on fresh impressions of recent games for those Best of the Year lists.