Full Sail study attempts to shed light on Metacritic's weighting system
March 27, 2013 | By Simon Parkin

[Update: Metacritic has responded to the study, claiming that the information is "wildly, wholly inaccurate." The full statement can be found on Facebook.]

The website Metacritic weights the scores of different video game critics and publications when calculating its aggregate 'metascore' for any particular title.

Each critic/publication is assigned one of six weightings of 'importance', with some publications exerting considerably more influence over a game's final 'metascore' than others.

This was revealed by Adams Greenwood-Ericksen of Full Sail University in a talk titled 'A Scientific Assessment of the Validity and Value of Metacritic', delivered at the Game Developers Conference in San Francisco this afternoon.

Metacritic confirmed to Greenwood-Ericksen during the course of his research that the site applies different weightings to incoming critics and publications' reviews in order to calculate its 'averaged' numerical score for any particular title.
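Mechanically, a weighted metascore of this kind is just a weighted mean. Here is a minimal sketch; the outlets, scores, and weights are invented for illustration, not Metacritic's actual data:

```python
# Hypothetical weighted metascore: each review score is multiplied by
# its outlet's weight, and the total is divided by the sum of the
# weights. All numbers below are made up.
reviews = [
    ("Outlet A", 90, 1.5),   # (outlet, score out of 100, weight)
    ("Outlet B", 80, 1.0),
    ("Outlet C", 80, 0.5),
]

weighted_sum = sum(score * weight for _, score, weight in reviews)
total_weight = sum(weight for _, _, weight in reviews)
metascore = round(weighted_sum / total_weight)
print(metascore)  # 85 -- the heavily weighted 90 pulls the average up
```

Note that an unweighted mean of the same three scores would be 83; the weighting is what lets a high-tier outlet move the final number.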

However, it would not reveal how the weightings were assigned to different publications, or on what criteria one critic was given a greater weighting than another.

The course director and his students then set about modelling the weightings based on data pulled from the site. After six months of work, the researchers compared their modeled scores to the actual scores across the 188 publications that feed into Metacritic's video game metascores, and found that their model reproduced the published numbers with near-total accuracy.

Greenwood-Ericksen said his team wanted to carry out the research because Metacritic scores are "very important to a lot of people," and pointed out that, because publishers withhold financial bonuses when a game misses its Metacritic target, livelihoods are tied up in the site's work.

He also reminded attendees that a publisher's Wall Street stock can move on the basis of a Metacritic score, and as such the site's workings are of practical interest.

The findings will also be of interest to consumers as, if accurate, they reveal that some official magazines and sites (which are sponsored by platform holders in some cases) are assigned a greater weighting than independent sites and critics.

Here is the full listing of score weightings used by Metacritic according to Greenwood-Ericksen's research:

Weighting -- Critic/Publication

Highest (1.5) -- Dark Zero
Highest (1.5) -- Digital Chumps
Highest (1.5) -- Digital Entertainment News
Highest (1.5) -- Extreme Gamer
Highest (1.5) -- Firing Squad
Highest (1.5) -- Game Almighty
Highest (1.5) -- Game Informer
Highest (1.5) -- GamePro
Highest (1.5) -- Gamers Europe
Highest (1.5) -- GameTrailers
Highest (1.5) -- GotNext
Highest (1.5) -- IGN
Highest (1.5) -- IGN AU
Highest (1.5) -- IGN UK
Highest (1.5) -- Just Adventure
Highest (1.5) -- Machinima
Highest (1.5) -- Planet Xbox 360
Highest (1.5) -- PlayStation Official Magazine UK
Highest (1.5) -- PlayStation Official Magazine US
Highest (1.5) -- Telegraph
Highest (1.5) -- The New York Times
Highest (1.5) -- TheSixthAxis
Highest (1.5) -- TotalPlayStation
Highest (1.5) -- VGPub
Highest (1.5) -- Wired
Highest (1.5) -- Xboxic
Highest (1.5) -- Yahoo Games
Highest (1.5) -- ZTGames Domain

High (1.25) -- Absolute Games
High (1.25) -- ActionTrip
High (1.25) -- Adventure Gamers
High (1.25) -- Computer & Video Games
High (1.25) -- Console Gameworld
High (1.25) -- Da GameBoyz
High (1.25) -- Darkstation
High (1.25) -- Edge Magazine
High (1.25) -- EGM
High (1.25) -- EuroGamer Italy
High (1.25) -- EuroGamer Spain
High (1.25) -- G4 TV
High (1.25) -- Game Chronicles
High (1.25) -- GameDaily
High (1.25) -- Gameplayer
High (1.25) -- Gamer 2.0
High (1.25) -- Gamervision
High (1.25) -- Games Master UK
High (1.25) -- Gamespot
High (1.25) -- GameSpy
High (1.25) -- Gaming Age
High (1.25) -- Gaming Nexus
High (1.25) -- Maxi Consoles (Portugal)
High (1.25) -- Pelit
High (1.25) -- PlayStation Universe
High (1.25) -- PlayStation Official AU
High (1.25) -- PSM3 Magazine UK
High (1.25) -- PS Extreme
High (1.25) -- RPG Fan
High (1.25) -- Strategy Informer
High (1.25) -- Team Xbox
High (1.25) -- The Onion (AV Club)
High (1.25) -- Totally 360
High (1.25) -- WonderwallWeb
High (1.25) -- XGN

Medium (1.0) -- 1Up
Medium (1.0) -- CPU Gamer
Medium (1.0) -- Cubed3
Medium (1.0) -- Cynamite
Medium (1.0) -- D+Pad Magazine
Medium (1.0) -- DailyGame
Medium (1.0) -- Destructoid
Medium (1.0) -- Eurogamer
Medium (1.0) -- Game Revolution
Medium (1.0) -- Game Shark
Medium (1.0) -- GameKult
Medium (1.0) -- Gamereactor Denmark
Medium (1.0) -- Gamers' Temple
Medium (1.0) -- GameShark
Medium (1.0) -- GamesNation
Medium (1.0) -- GameStar
Medium (1.0) -- GameTap
Medium (1.0) -- Gaming Target
Medium (1.0) -- Gamereactor Sweden
Medium (1.0) -- The Guardian
Medium (1.0) -- Hardcore Gamer Magazine
Medium (1.0) -- HellBored
Medium (1.0) -- NiceGamers
Medium (1.0) -- Joystiq
Medium (1.0) -- Just RPG
Medium (1.0) -- Level
Medium (1.0) -- Modojo
Medium (1.0) -- MondoXbox
Medium (1.0) -- N-Europe
Medium (1.0) -- Netjak
Medium (1.0) -- NGamer Magazine
Medium (1.0) -- Nintendo Life
Medium (1.0) -- Nintendo Power
Medium (1.0) -- Nintendojo
Medium (1.0) -- Nintendo World Report
Medium (1.0) -- NZGamer
Medium (1.0) -- Official Nintendo Magazine UK
Medium (1.0) -- Official Xbox 360 Magazine UK
Medium (1.0) -- Official Xbox Magazine
Medium (1.0) -- Official Xbox Magazine UK
Medium (1.0) -- PALGN
Medium (1.0) -- PC Format
Medium (1.0) -- PC Gamer (Germany)
Medium (1.0) -- PC Gamer UK
Medium (1.0) -- PC Gamer
Medium (1.0) -- PC Powerplay
Medium (1.0) -- PGNx Media
Medium (1.0) -- Play Magazine
Medium (1.0) -- PlayStation LifeStyle
Medium (1.0) -- Pocketgamer UK
Medium (1.0) -- PT Games
Medium (1.0) -- Real Gamer
Medium (1.0) -- SpazioGames
Medium (1.0) -- Talk Xbox
Medium (1.0) -- The Escapist
Medium (1.0) -- Thunderbolt
Medium (1.0) -- Total VideoGames
Medium (1.0) -- Worth Playing
Medium (1.0) -- X360 Magazine UK
Medium (1.0) -- Xbox World 360 Magazine UK
Medium (1.0) -- Xbox World Australia
Medium (1.0) -- Xbox360 Achievements
Medium (1.0) -- Xbox Addict

Low (0.75) -- 360 Gamer Magazine UK
Low (0.75) -- 3DJuegos
Low (0.75) -- Ace Gamez
Low (0.75) -- Atomic Gamer
Low (0.75) -- BigPond GameArena
Low (0.75) -- Console Monster
Low (0.75) -- Deeko
Low (0.75) -- Eurogamer Portugal
Low (0.75) -- Game Focus
Low (0.75) -- Gameplanet
Low (0.75) -- Gamer Limit
Low (0.75) -- Games Radar (in-house)
Low (0.75) -- Games TM
Low (0.75) -- Gamestyle
Low (0.75) -- GameZone
Low (0.75) -- Gaming Excellence
Low (0.75) -- Gaming Trend
Low (0.75) -- Impulse gamer
Low (0.75) -- Kombo
Low (0.75) -- MEGamers
Low (0.75) -- Metro Game Central
Low (0.75) -- MS Xbox World
Low (0.75) -- NTSC-uk
Low (0.75) -- PS Focus
Low (0.75) -- PSW Magazine UK
Low (0.75) -- Video Game Talk
Low (0.75) -- VideoGamer

Lower (0.5) -- Armchair Empire
Lower (0.5) -- Cheat Code Central
Lower (0.5) -- Game Over Online
Lower (0.5) -- Game Positive
Lower (0.5) -- Gamer's Hell
Lower (0.5) -- Gamereactor Sweden
Lower (0.5) -- Giant Bomb
Lower (0.5) -- RPGamer
Lower (0.5) -- Vandal Online

Lowest (0.25) -- 9Lives
Lowest (0.25) -- Boomtown
Lowest (0.25) -- Computer Games Online RO
Lowest (0.25) -- GamerNode
Lowest (0.25) -- GamingXP
Lowest (0.25) -- IC-Games
Lowest (0.25) -- Jolt Online Gaming
Lowest (0.25) -- Kikizo
Lowest (0.25) -- LEVEL
Lowest (0.25) -- Meritstation
Lowest (0.25) -- My Gamer
Lowest (0.25) -- Official PlayStation 2 Magazine UK
Lowest (0.25) -- Play UK
Lowest (0.25) -- WHAM! Gaming


Robert Boyd
Giant Bomb & RPGamer both use 5 star rating systems - I wonder if that's a reason why Metacritic discounts them (Giant Bomb is a hugely popular site and RPGamer is about as popular as RPGFan which has a much heavier Metacritic weight).

K Gadd
Another possibility would be that they pick weights based on how often a site tends to be 'correct'; that is, how often it tends to give scores that closely match the average from sites that are already highly rated. In that case, a site like Giant Bomb would probably do poorly, since they tend to score games more subjectively and use their whole scoring range (3 stars on Giant Bomb doesn't necessarily mean a game is awful and shouldn't be purchased).

Some of the weights are still really bizarre though. Yahoo games?!

Robert Boyd
That's a definite possibility. On RPGamer, anything 3 stars or higher is considered a good game, but when converted to the 100% system Metacritic uses, 3 stars would be considered 60% (aka "this game is horrendous" on most sites).
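The conversion Boyd describes is a plain linear rescaling, which is why a "good" 3/5 reads as a dismal 60/100. A quick sketch (the 3/5 → 60 mapping is from the comment; the function itself is hypothetical):

```python
def stars_to_percent(stars, max_stars=5):
    # Linear rescaling of an N-star rating onto a 0-100 scale,
    # which is how a 3-out-of-5 ends up reading as a 60.
    return round(stars / max_stars * 100)

print(stars_to_percent(3))  # 60
```

The rescaling ignores each outlet's own calibration: it assumes a 3/5 and a 60/100 mean the same thing, which is exactly the mismatch the commenters are pointing at.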

Lewis Wakeford
Wow. Really? Yahoo games at the top? If these are accurate metacritic is doing something wrong.

It seems like the highest weights go to outlets that follow the status quo the closest. Sites like IGN basically give everything that isn't terrible 8s and 9s, while the sites further down the list are less consistent or don't use the same rating system as everyone else. Like Kevin said, Giant Bomb will give average games 3/5, while other sites might give average games 7/10.

Andrew Traviss
Metacritic isn't doing anything wrong, the game industry is doing something wrong by giving Metacritic so much importance.

Lewis Wakeford
Well yeah. Metacritic is only meant to provide a (very) general idea of how well received a game was by critics. What we do with that information is our responsibility.

However, when Yahoo Games is apparently one of the most important review outlets in determining overall critical reception, it's pretty obvious that Metacritic isn't even doing that job correctly.

Arnaud Clermonté
Andrew, what's your method of quantifying the quality of your games?
Or do you think that it should not even be measured?

Daniel Campbell
As a student of Full Sail I can honestly say I'm not that surprised at the inaccuracy. I remember finding the questions (word for word) of a statistics test on Google that were taken from another university's program. There are also classes that consist entirely of making students watch Buzz 3D tutorial videos (which are available for free) and then turning in the results of following along with the videos as the final assignment. That's right, they are charging students thousands of dollars to go watch freely available YouTube videos.

Zack Hiwiller
Why were you Googling the test questions?

Daniel Campbell
I choose not to answer that question on the grounds it will incriminate me. :-)

Marc Mullen
If you're attending online then it's up to you to put in the extra effort to ensure you actually learn something. You do use Buzz 3D tutorials but you're also going above and beyond what they do in those. If you apply yourself, you'll be better prepared for your future classes and hopefully be a better candidate for a job in the industry.

Liz Canacari-Rose
As with any class or knowledge you take on (online or otherwise), the more you utilize what you've learned or go above and beyond the assigned work, the better prepared you will be. No educator can hold your hand through the learning process, you have to want to learn it and find the niche that you are best at. We can only guide you into new territory.

I also would think one would have learned a few things about posting in a public forum: 1) Don't post drunken or salacious material on Facebook/Twitter/social sites. 2) Don't admit to possible lying/cheating in any way. 3) Be very careful with slanderous material against any person, place or company.

Remember possible employers Google to find out all about you.

will onhead
Daniel, I agree with you. Once a university's exam questions have been issued and the exam is finished, that information is free, and should be helpful for revision and discussion. This is why questions need to be written anew for each institution's tests, no matter the subject. A university course where the coordinator has simply thrown together a list of free URLs that could have taken ten minutes, and where the only thing required to pass is following the tutorial videos exactly, is an exploitative sham. It is exactly what much of the gaming media has worried these courses can be: a fast, unvetted buck for an institution that could be doing more involved and valuable work, such as game jams in whatever tools students want, plus more direct supervision and research. Proficiency in interface understanding and in the deeper principles of modelling and 3D space, which transfer to the ever-changing world of software packages just as sketching does, is more important.

I did not realize it was acceptable to threaten all open discussion and criticism of educational institutions with zero future employment and the full force of the law. Daniel has done nothing but state facts, so where is the slander? Do his statements make you uncomfortable? When you say "no educator can hold your hand through the learning process," does this extend to "no educator can educate"? You are too quick to assume that employers won't want someone who is realistic and serious about the way they spend their money and time to make something compelling. It is an employer's market, and an educator's too, clearly.

Hobbyists are making a resurgence this year and it is not without reason.

Paul Laroquod
4) Be a meek citizen and don't ever rock the boat, lest you be punished by the capitalists that rule the world. DON'T YOU KNOW THIS YET? /s

Spyder ONeil
The fact that IGN has not just one, but three sites weighted in the heaviest tier... saddens me. It is widely known in the industry that they have the worst reviews. They've been contacted on more than one occasion by developers and publishers to remove their review because there was evidence that the reviewer hadn't even played the game.

Wendy Jones
I'd be perfectly fine with gaming sites eliminating score ratings completely and expressing their opinions purely in text or video form. Tell me why the game is good or bad, give examples, compare and contrast against other games.

Metacritic has changed how game companies internally view both game concepts and game teams. Bonuses are withheld based on metacritic scores, development teams are broken up and placed on other games. Since the metacritic system is not transparent, you have no idea how your game is really going to be received.

Adams Greenwood-Ericksen
I'm the researcher who presented the study, and I'm sorry that Metacritic appears to be upset. We were careful to stress that the data presented were *modeled;* that is, they were an attempt to explain observed data in the absence of full internal knowledge of the system. All models are more or less inaccurate, but if ours was "more," rather than "less," we'd love for Metacritic to tell us how so we can improve upon it. I should point out that our internal checking and validation (which we also presented at the talk) showed our model was accurate in most cases to within a few tenths of a point. That being said, we'd love to see the actual weights to use for comparison. We got into this mostly as an intellectual exercise in statistical modeling, and we'd be interested to see how close we got (or didn't, if that's the case).

Adams Greenwood-Ericksen, PhD

Chuong Ngo
I would honestly love to see a follow-up in 6 months or a year. It will be interesting to see just how accurately this model will "predict" the metacritic scores.

Steven An
Could you make your scraped data publicly available? That would allow us to easily confirm/deny you and/or Metacritic :)

Steven An
It would also help if you released the source code for your data scraper, along with a technical paper on your methodology (did you use a least squares fit, etc.?). Unless Metacritic just changes their scores today, it would be damn near trivial to reproduce your results.

Thanks for the work!
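For what it's worth, the least-squares fit Steven alludes to is straightforward to sketch on synthetic data: generate per-outlet scores, compute weighted-mean "metascores" from a hidden weight vector, then recover the weights with `numpy.linalg.lstsq`. Everything below is invented, not scraped Metacritic data:

```python
# Sketch of recovering outlet weights by least squares. Because a
# weighted mean is unchanged if all weights are scaled, we can only
# recover the weights up to a constant; here we fix the scale at the
# end purely to compare against the known ground truth.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([1.5, 1.0, 0.5])            # hidden weights
scores = rng.uniform(50, 100, size=(200, 3))  # 200 games x 3 outlets
meta = scores @ true_w / true_w.sum()         # published weighted means

# Solve meta ~= scores @ v; the exact solution is v = true_w / sum(true_w).
v, *_ = np.linalg.lstsq(scores, meta, rcond=None)
recovered = v / v.sum() * true_w.sum()        # rescale for comparison
print(np.round(recovered, 3))
```

With noise-free synthetic data the recovery is exact; against real, rounded, occasionally re-scored Metacritic data, a fit like this can only ever approximate, which is presumably why the study describes its output as a model.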

Steven An
I'm also curious whether or not these weights change over time. Did you account for that in your model? Sorry I missed your talk - will check the vault video.

Niero Gonzalez
Why would Gamasutra run this article without talking to someone at Metacritic first?

That list is so riddled with mistakes at a glance. C'mon guys.

Rob Wright
Niero, I'm curious -- does Metacritic let you know where Destructoid resides in its weighting ranks?

Like others, I'm also a little baffled by these rankings. For example, I'd think Destructoid would rank higher than, say, Da GameBoyz or Game Almighty, which for crying out loud isn't even around anymore.

But the notion of weighted review scores bothers me greatly. How does MC determine which publications/sites/outlets get higher rankings versus lower ones? Is it based on site traffic/readership? (I'm guessing not.) Is it based on review writing quality? (Again, that's got to be a no.) Is it based on number of games reviewed? When those reviews are posted? What scores are given to top-tier titles?

Rob Wright
Thanks for the post, Christina. I've seen that, but it still doesn't answer my questions. How does Metacritic judge "prestige"? Is it based on brand awareness? Do they commission surveys to judge what sites or publications are held in high regard? (if so, their rankings are WAY off)

As for the writing quality, I find this point to be beyond ridiculous. Yes, some critics do write consistently better reviews than others -- critics being the key word, not sites or outlets. If a site has one superstar reviewer and a bunch of other mediocre writers, how does that affect the ranking? Does a rising tide lift all boats, or does the higher ratio of mediocre reviews to stellar reviews decrease the weight? And how does MC judge writing quality? Do they have an editorial review board? How often do they really examine the quality of the content they are aggregating? Is it like the ESRB, where they just take a small slice of the content without consuming the entire breadth? So many questions...

Andreas Ahlborn
The weighting system metacritic uses is pretty much the core of their business model.

No wonder they want to protect it, and while the study is interesting, it should be clear from the start that there will be so much dynamic adjusting of the different input variables that it's as pointless as revealing the secret formula Google uses for its search engine.

With its dominant grip on the games-review market, it's also clear that AAA developers have already jumped on the Metacritic-score-farming bandwagon, which is easy enough to achieve: secure a deal with one leading games platform (exclusive pre-release review) and watch the score avalanche roll in.

Even such a highly esteemed studio as Irrational didn't shy away from using "the highest metacritically scored shooter" as a catchphrase in their BioShock Infinite campaign. It's also highly possible that the deal Levine made with IGN (which published the Infinite review 48 hours before any other magazine) included an NWTUA clause ("not worse than universal acclaim").

You only have to imagine the fallout Infinite could have suffered from this one review if it had not gotten the "right" numbers.

Arnaud Clermonté
I'm surprised that the platform-specific publications would even be taken into account.
From what I remember, they grossly over-rate whatever runs on their platform, especially the exclusives.

David Tarris
There are a couple of unknowns in this equation. First, we don't know the weights of each reviewer, but we also don't know the way scores themselves are weighted (is an F a 60 or a 0?). I don't deny that you can use this model to get reasonably accurate predictions of the Metacritic score, but I don't think you can then regressively conclude that this is the weighting scheme Metacritic uses.

To me, a Metacritic score should represent the percentile of the game in relation to all other games. Meaning a 50 should be average, an 80 the top 20%, a 90 the top 10%, etc. To that end, it would make sense to me to weight each site's score in terms of its PageRank (only for incoming links to its game reviews subpages, in the case of multimedia outlets like the NYT). The reviewer weight would then be applied to a weighted score for the game: the percentile of the score in the population of scores by the reviewer. So, instead of using GameTrailers' 93 as their score in the calculation, if we determine that a 93 from GameTrailers is only higher than 80% of their scores, then we use 80 in the final calculation.

There would probably be some issues with scales that use very few bins (like the 1 to 5 star scale), but overall it seems reasonable.
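A rough sketch of this percentile normalization, using made-up numbers (the 93-from-GameTrailers figure comes from the comment above; the score history and function are invented for illustration):

```python
def percentile_score(score, outlet_history):
    # Replace a raw score with the fraction of the outlet's own past
    # scores it beats, expressed on a 0-100 scale. This calibrates
    # away each outlet's habitual grade inflation or deflation.
    below = sum(1 for s in outlet_history if s < score)
    return round(100 * below / len(outlet_history))

# Invented history for one outlet: a raw 93 beats 8 of these 10 past
# scores, so it would enter the aggregate as an 80.
history = [70, 74, 78, 82, 85, 88, 90, 92, 95, 98]
print(percentile_score(93, history))  # 80
```

As the comment notes, coarse scales like 1-to-5 stars would land many games on identical percentiles, so a real implementation would need some tie-breaking or interpolation.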

Matthew Downey
You could probably solve the question using a system of equations (yay, matrix mathematics!) that factors in the standard deviation from the norm of all previous scores (historic, otherwise old scores would fluctuate over time). This means that more inaccurate reviewers tend to be weighted less.

The only question (if that's even remotely true) is then "How is the standard deviation related to the weight?" My guess is that they take the standard deviation relative to the average standard deviation of all reviewers, but I can only hypothesize.

They probably have multipliers for prestige as well, even though that is horrendous practice. (Although this multiplier would be reasonable for reviewers who give more honest, non-sugar-coated reviews, but that is highly unlikely in my opinion.)

[edit:] I made an equation and it would give scores higher than 100, so I'm definitely missing variables. Not an easy question, although you could probably find decent approximations by just solving a system of equations of all reviews in the last year.
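As a footnote to the matrix idea: if the metascore really is a weighted mean, then with as many games as outlets the (scale-normalized) weights drop out of one exact linear solve. The numbers below are invented; note the division by the weight sum, without which computed "scores" can indeed drift above 100, which may be the missing variable described above:

```python
# Miniature of the system-of-equations approach: 3 games x 3 outlets,
# exactly determined, solved with numpy.linalg.solve. Weights are only
# recoverable up to scale, so we rescale at the end to compare them
# with the known ground truth.
import numpy as np

S = np.array([[90.0, 80.0, 70.0],   # invented per-outlet scores
              [60.0, 75.0, 85.0],
              [88.0, 66.0, 77.0]])
w = np.array([1.5, 1.0, 0.75])      # hidden outlet weights
meta = S @ w / w.sum()              # weighted means stay within 0-100

v = np.linalg.solve(S, meta)        # recovers w / sum(w)
print(np.round(v * w.sum() / v.sum(), 3))
```

In practice one would use many more games than outlets and a least-squares fit, since published metascores are rounded and the weights may change over time.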

Paul Laroquod
If Metacritic doesn't like this kind of statistical analysis to fill gaps in our information about them, they have a perfect solution: fill in the gaps themselves. Any other response is a waste of space.