Gamasutra: The Art & Business of Making Games
Metacritic is here to stay, but can we fix it?
July 10, 2012 | By Mike Rose

More: Console/PC, Business/Marketing



Love it as a quick reference, hate it for reducing your work to a two-digit number, or sweat bullets over it when your bonus is on the line, but there is one unavoidable truth: Metacritic is a powerful force in the video games industry.

The review aggregate has, for better or worse, become our standard measure of a game's performance. But is Metacritic killing the video game review? And if so, what can be done about it?

"Not as many young and inexperienced reviewers are confident enough to go with how they actually feel," journalist Keith Stuart, of The Guardian, said in a discussion on the subject at Tuesday's Develop Conference in the UK.

Instead, he says, new critics will often use Metacritic to gauge the scores that are already out, and then base their score off the general consensus, leading to copycat reviews that fellow panelist Paul Wedgewood of Brink developer Splash Damage called "fundamentally banal."

Sameness is only one complaint lodged against Metacritic. Another is that critics race to post the first, hasty review of a game on the site (something that makes panelist James Binns of Network N "always suspicious"). A third problem, facing developers, is that today's games -- particularly PC games -- are constantly being improved and updated, while their review scores are not.

And of course there is the absolute fallacy of including a Metacritic score in a game development contract, something that Wedgewood and another panelist, Andy Payne of publisher Mastertronic, both outlaw at their companies.

So what can be done? The consensus -- at least in this panel -- was absolutely nothing. The service is just too popular: journalists depend on it for exposure, gamers give it absolute authority, and publishers show no signs of letting up on using it as a metric.

"Even Steam has Metacritic embedded into it," said Payne.

"Everyone is just adding to the problem."


Comments


Sean Kiley
What can be done? Change the rating system from a 100 point scale to a 5 point system (no half points!).

What is the difference between a game with a rating of 89 vs 90? A 30 vs 40? I think the 100 point scale was adopted because that's how we grade school papers, but the two are not interchangeable. With a test paper you have a right and a wrong answer so trying to apply this to a game makes no sense.

A 5 point system is clear: people know the difference between a 3 and a 4.

jin choung
i disagree. xplay was notorious for just giving everything a 3. that 3 gives you a lot of wiggle room and can be prefaced with anything from "it's pretty good for what it does" down to "there's no real reason to play this".

a bad game with pressure coming from publishers, give it a 3. 3 gives you an escape. we don't want the reviewers to have an escape. make a JUDGMENT.

better would be a letter grade. everybody knows the implications of a C versus a B. C is not good.

or if it must be numerical, choose a scale with no middle. 6 point system. 3 above average, 3 below. no hemming and hawing at the middle. a decision has to be made.

Sean Kiley
I think a 3/5 describes most games, so it's fitting that a lot of games would earn it. 3 does give a reviewer an escape, but at the same time, it de-fangs them. The individual taste of every player creates wiggle room around that 3 anyway.

Doug Poston
I agree with Jin.

It's too easy to say something is average even when it is slightly better or slightly worse. If the game is truly average then, by definition, with enough reviews the scores will average out to the middle.

Reviewers need to take a stand (even if it's just a small one) otherwise what's the point of listening to them?

Paul Marzagalli
The biggest problem is on the publishing side: tying in anything business-related to the metacritic score. That makes the scoring system more life & death than it needs to be.

There has been some pushback on media sites recently that I've noticed: folks strenuously noting that 6s and 7s are *not bad*, recognizing that their readership collectively seems to be cutting off anything less than an 8. I like the idea above of switching to a 4 or 5 point scale, like in movies. I suspect that most games would end up in that 2.5 range, where people are more apt to take a chance if the game already appeals to their interests.

Christian Nutt
IMO a 5 scale should not have halves. A 5 scale with halves is a 10 scale, and then the reviewers start using it like a 10 scale, and it loses its purpose because it goes back to the whole "70 is average" thing once again.

In 2003 I helped spearhead GameSpy's review scale reboot (went from a 100-point scale to 5 stars with no halves). 3 stars was "good", 4 "great". I gave several big games 3 star reviews and got some flak from PR for it. This was primarily driven by the fact that 3 stars = 60 on Metacritic, a failing score.

I left GameSpy in 2004. Management which came in -- ad-sales focused management, not editorial staff -- forced the editorial staff to add halves to the scale to placate advertisers, and even in one case I am aware of forced a staffer to change a review score due to advertiser complaints. This was all because of Metacritic (or maybe, given this was 2005, GameRankings.)

I should note explicitly in this comment that the person who forced this scale and score change is no longer with GameSpy, and the site is under the management of EIC Dan Stapleton, who was not there at the time this stuff went down (it was 7 years ago after all), so as far as I am aware none of this applies to current-day GameSpy.

Later I went to GamesRadar (Future), where my manager would not post reviews that he felt were score outliers until he saw what the Metacritic consensus was, sometimes holding reviews for days. He'd change scores if he felt they were too far from consensus.

Again, I will say that this person is no longer there and the site is now under the management of EIC Gary Steinman, so this info no longer applies.

What these two examples do illustrate, however, is how the Metacritic system can warp the decision making process on both the editorial and non-editorial management sides for different reasons.

Paul Marzagalli
For what it's worth, Christian, I'm definitely in favor of more general review policies such as Kotaku's "Yes/No/Maybe" combined with their willingness to revisit those reviews post-early patching (if a game has issues at launch). Like you said, though, the problem is that there's a traditional synergy between marketing and these "4.5 out of 5!!!!", "10 out of 10!!!" reviews, along with the stigma issues as the scores get lower. And of course, we can't forget that Giant Bomb was born out of just the kinds of controversy you talked about.

It's unseemly.

Christian Nutt
There's definitely a tremendous amount of groupthink in today's "mainstream" enthusiast game reviews.

E Zachary Knight
I like Netflix's rating system:

Hated it
Didn't like it
Liked It
Really Liked it
Loved it.

No question on your feelings toward it.

John Flush
...Other than the subjective nature of how much hype you took into it. Many games get little to no hype, and people will dive in and say they really liked it or loved it, while a very similar game they were extremely excited for didn't live up to every promise and now gets a 'Didn't like it'. Or maybe the genre is getting tired for the reviewer, and later games get a 'Didn't like it' simply because they are more of the same, despite being better in every way than previous iterations of similar games.

I'm just not sure review scores have any value at all, except maybe piquing one's interest to read why they did or didn't like it and letting one judge from there.

Mike Griffin
There's also the semi-secretive internal "weighting" system used by Metacritic, whereby review scores from certain media outlets are granted additional weight in the averaging, while the scores from other outlets contribute less, at Metacritic's discretion.
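The weighting Mike describes is simple to sketch mechanically, even though Metacritic's actual outlet weights are secret. A minimal Python illustration, with invented outlets and weights purely to show how a weighted aggregate can diverge from a plain average:

```python
# Sketch of a weighted review aggregate. The real Metacritic weights are
# not public; the outlets and weights below are made up for illustration.
def weighted_average(reviews, weights, default_weight=1.0):
    """reviews: {outlet: score 0-100}; weights: {outlet: relative weight}."""
    total = sum(weights.get(o, default_weight) * s for o, s in reviews.items())
    norm = sum(weights.get(o, default_weight) for o in reviews)
    return total / norm

reviews = {"BigSiteA": 90, "BigSiteB": 85, "SmallBlogC": 60}
weights = {"BigSiteA": 1.5, "BigSiteB": 1.5, "SmallBlogC": 0.5}

plain = sum(reviews.values()) / len(reviews)   # unweighted mean: ~78.3
weighted = weighted_average(reviews, weights)  # weighted mean: ~83.6
```

With the big outlets weighted up, the same three scores produce an aggregate about five points higher, which is exactly why the opacity of the weights matters.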

Jeffrey Crenshaw
It sounds like the problem isn't metacritic so much as people using it poorly. All the aggregation and weighting that metacritic does is a tool; it is neither good nor evil (assuming no bias like being paid to leave off bad reviews), and anyone with enough time could do the same thing without metacritic. Inasmuch as this "thing" -- aggregation and weighted averaging -- is useful, metacritic is providing a service for many that should be seen as a Good Thing.

Of the remaining issues mentioned in the article and by Christian, the only thing that strikes me as evil is coercing good reviews by threatening to pull advertising. This might seem like a gray area because "hey, it's their ad money, they don't have to spend it on a publication that bashes them", but let's be honest -- this lowers the trustworthiness of game journalism, hurts customers with inaccurate reviews, hurts other companies that took the time to make a good game but don't have the advertising budget to bribe journalists, and minimizes the incentive for companies to actually make good games (when they can just as easily strong arm the sales they want, why bother with that irksome act of improving quality?). So no, not gray; evil all around. The other issues like metacritic-related bonuses instead of sales target bonuses sound more like mistakes than acts of malice.

Mike Griffin
Perhaps it's a better service for consumers than creators.

Sadly both performance clauses are highly commonplace, especially in AAA:
Team-wide Metacritic target bonuses and sales target bonuses, equally weighted.

I've witnessed Metacritic averages affect distant DLC and sequel approval decisions before final sales figures. I've seen teams shuffled or outright dismantled within weeks of release, largely over Metacritic performance.

It can be inspiration to work really hard, or it can feel like a knife pressed against your neck.

It's an inescapable metric for big publisher marketing, and they frequently remind team members about those pressure targets during development -- inevitably adding to internal stress. And all too often it becomes a useful "escape plan" for big publishers to clean house on a studio after critic targets aren't met on a title.

Policy that relies on questionable external feedback metrics to influence grim choices that affect dozens or hundreds of workers. Policy nonetheless.

I do admire a guy like Paul Wedgewood, who I've chatted with in the past, for outlawing any Metacritic-based performance clauses from his studio's development contracts. It's one less shadow to fear over the course of an arduous production, and certainly if the studio head doesn't trust the system, he is justified to circumvent its potential shackles for the good of the team.

I suspect it's not a lack of confidence in his team to hit positive scores; it's a lack of confidence in an aggregate scoring service that his team shouldn't be forced to worship, lest they be sacrificed at the altar.

Eric Geer
I use the site to get a general view of a game, but I don't generally pay attention to the published critic reviews -- rather, the User reviews, because I know I can generally trust someone who is posting their thoughts about it and isn't getting a dime for it. Yes, there are trolls, but these are generally easily pointed out.

I do believe, and in fact know, that there are great critical reviewers, but lots of times this groupthink, or money incentives, ruins the critical reviews.

Michael DeFazio
these ideas need a blog post (but for the sake of brevity, here are some bullets):

1) allow metacritic users to get personal scores based on reviewers they subscribe to (and this score "ignores" reviews from reviewers they don't share common interests with or have "unsubscribed" from).

2) have "profiles" for each reviewer (not the publication mind you, but the actual reviewer). If a game I am interested in gets a bad review (let's say GoW: Ascension) from a hipster who doesn't like action games, I want to ignore it (but that doesn't make their opinion invalid, just irrelevant to my tastes). Also, I'd like to not only know a little more about the reviewer's previous reviews, but also their favorite games, genres, and franchises, which will help me determine the context of their review.

3) in addition to having "profiles" for professional reviewers, have a community of "users" who can create profiles and review games... I'd be more interested in hearing the opinion of someone who shares many of my "top 10 games of all time". It would also be nice to see, if a game got a (low or high) score from a user, what other games they loved or hated... (trusting anything related to the existing "User Score" on Metacritic is a joke.)

People always say they want "unbiased" reviews. Hogwash -- I want biased reviews, just from people who share my interests and values (so I can determine whether a game is worth my money/time/interest).

Fabio Macedo
I'm not sure that parts of 3) and (especially) 2) are exactly viable, as they are maybe too complicated to compile. But 1) alone would be *fantastic*. I believe it's something gamers with half a brain probably already do to some extent, at least in their minds. And your last paragraph is spot-on: "unbiased" reviews are not only undesirable but ultimately impossible. You can only achieve an *illusion* of impartiality in any form of journalism, and in critical journalism even more so, as your views will always be coloured by your personal experience at the very least - no one has seen every movie/listened to every song/played every game out there, and that throws any pretense of pure objectiveness out of the window right at the outset. So let us know WHO is writing that review, and if possible let us subscribe to the reviewers we do care about.


Matthew Downey
I don't think #2 is as hard as Fabio Macedo makes it out to be. All you really have to do for this one is label each game under a genre (which is hard, but doable for most cases; anything too hard to classify can be put under "other" and is thus ignored by the rating system). Then, for any genre a reviewer gives consistently low reviews, that genre can be ignored on a per-user basis. The only issue would be if a reviewer has only reviewed one game in a genre, but even then you can see how far below the median that person rated the game.

For the franchise and game categories, there's no real problem, since that isn't a very gray area.

Point being: The reviewers will express how much they like a game, franchise and even genre through how they rate them.

You could simply filter reviews in or out manually when they average X% or X points below or above the median score for a game/franchise/genre. You could also filter out critics for inconsistently scoring a franchise/genre compared to the median.

I honestly think for franchises, reviewers should give two scores, one that is intended for newbies to the franchise, and a second for how fresh it feels to veterans of the franchise.
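The per-genre filter sketched in this comment fits in a few lines of Python. A hypothetical illustration -- the reviewer names, games, and 15-point threshold are all invented:

```python
import statistics

# Sketch of the filter described above: flag a (reviewer, genre) pair
# when that reviewer's scores sit consistently far from each game's
# median score. Names and the threshold are invented for illustration.
def genre_outliers(scores, threshold=15):
    """scores: {(reviewer, game, genre): score 0-100}.
    Returns the set of (reviewer, genre) pairs whose mean deviation
    from the per-game median exceeds the threshold."""
    by_game = {}
    for (_, game, _), s in scores.items():
        by_game.setdefault(game, []).append(s)
    medians = {game: statistics.median(v) for game, v in by_game.items()}

    deviations = {}
    for (reviewer, game, genre), s in scores.items():
        deviations.setdefault((reviewer, genre), []).append(s - medians[game])
    return {key for key, ds in deviations.items()
            if abs(statistics.mean(ds)) > threshold}

scores = {
    ("Alice", "RPG One", "rpg"): 50, ("Bob", "RPG One", "rpg"): 80,
    ("Carol", "RPG One", "rpg"): 85, ("Alice", "RPG Two", "rpg"): 55,
    ("Bob", "RPG Two", "rpg"): 82, ("Carol", "RPG Two", "rpg"): 84,
}
outliers = genre_outliers(scores)  # Alice consistently scores ~28 below median
```

Here "Alice" rates the genre roughly 28 points below the median on both games, so she is flagged; an aggregator could then ignore (or down-weight) her scores in that genre on a per-user basis.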

Jason Pineo
"...gamers give it absolute authority..."

No. Even accounting for hyperbole: No. I only trust game reviews expressed in words, not numbers. Takes a bit longer to read them, but the results are typically dependable. Assigning a number from 1-100 reads the same to me as a number from 1-5: I see it and start looking around for the text review.

TC Weidner
I always get a kick out of reviewers who are biased right out of the box. Nothing like a FPS fan reviewing a puzzle game.

Anyway, I agree with E Zachary: the Netflix rating system makes sense.

Matthew Cooper
As a consumer, when I'm researching a potential game purchase, Metacritic almost never fails in ball-parking the quality of the product. I for one am pretty darn happy with the service they provide.

Baking it into development contracts is another matter, but I don't see how one would view that as Metacritic's problem.

Jerry Curlan
It's amusing that Keith Stuart from The Guardian seems to be opposed to the idea of review aggregation, yet he chooses to have his reviews appear on all the major aggregators. And it seems to me that the proper target of the "problems" here should be the critics themselves. This panel is essentially accusing gaming critics of having no spine - of folding to the powers that be that produce these games. Are game critics today throne sniffers? Do they have the guts to speak truth to power? And if not, are they called out for it? I don't see it happening. I see a ton of blog ink spilled when a critic disagrees with the herd by giving a low score to the latest iteration in a beloved franchise, but when it goes the other way - when a critic gives a perfect score to a crap game, they seem to be given a free pass because "developer bonuses depend on good reviews." Poppycock. Let's grow a pair, game critics, and start acting like critics in every artistic field since the days of Shakespeare. Be true to yourself and judge these games as they should be judged. If everyone were to do this - ignoring all outside forces - then the aggregators would reflect the quality of games properly. And this stupid grade inflation would cease to exist.

Bob Stevens
The only way to fix Metacritic is to stop giving points, percentages, or stars for reviews. But we could never get the entire industry to band together to do something this sensible.

I'm actually curious where this practice started so if I ever have the ability to go back in time I could stop them after I'm done stopping Hitler.

I've read some of Truffaut's Cahiers du Cinema reviews from the 50s and unsurprisingly there were no stars attached, just thoughtful commentary.

John Flush
The aggregate score is the problem, though. If your bonus depends on how the game scores on Metacritic, how does that help the art team know if they did a good job? Or the gameplay engine team? "5 stars, no halves" or "liked it / disliked it" doesn't really help; it is all subjective BS. I miss the days when I knew that the game was done technically well, controlled well, had great art, performed to genre norms, etc. - but that takes effort, effort that ad payments don't require.

How do you give out technical and other yearly or generational awards when no one actually scores those things anymore? You don't; you take the publisher's ad money and slap an 8+ on it.

Review scores are a pile of BS unless they come en masse from people not paid by ad money.

I would also like user scores to be taken into account in some regard. This year has some games with really high "professional" scores and horrible user scores (see Diablo 3 and Mass Effect 3). The user scores point out that while the game might be 'technically' good, it really didn't do much for fans or for expanding the brand. Looking down the list, when you see a big difference between the two score sets, you can almost correlate those with the fact that there was a lot of ad money backing the "professional" scores.

Gary LaRochelle
"I would also like user scores to be taken into account in some regard."

Agreed. I also like to take into consideration the users' reviews. But nowadays, some game makers flood review sites with phony jacked-up reviews. These studios include pop-up review requests that default to a five star review. If I see multiple five star reviews that contain just a one-word comment, I know it's a jacked-up review. I would like to see an end to automatic five star reviews.

Robert Bevill
I wrote a blog about this a while back, but it boils down to this: scores, on their own, are harmless. They give the consumer some at-a-glance advice on a product. For your average AAA release, it's just a safety net to make sure what you're buying is worth the money. For smaller developers, a great score can mean far more attention for your product than if scores weren't there at all.

However, because publishers rely on Metacritic for bonuses, and people are scared to be the outlier, reviews have become meaningless. One of the reasons Giant Bomb is my favorite gaming site is its five-star rating, which gives you an at-a-glance impression of a game's quality while making it more important that you actually read the text. That said, I get far more consumer advice listening to gaming podcasts, as well as from videos and word of mouth. If a friend likes it, that means far more to me than a 95 somewhere.

Jeffrey Crenshaw
Interestingly, if publishers can use ad threats to invalidate the legitimacy of a review score and artificially inflate it, could they use similar tactics to keep review scores low to avoid bonus payoffs? Like, running the numbers, they decided they'll sell almost as many units with a 74 metacritic as a 75, but when they cross the 75 boundary there is a large payoff to the developer that they want to avoid. So they keep their ad staff on standby monitoring metacritic ready to pull ads for bad reviews _unless_ a bad review prevents this sort of pay off.

Perhaps there is something inherently wrong with a bonus system that can incentivize one party (the publisher) to want slightly worse metacritic scores. Not likely to happen, but the more I work in this industry and hear/see how greedy publishers are and how little they care for developers, the more I learn to look for exploits like this.

Addison Siemko
Yeah, I instantly wondered if something like that was possible after hearing about the poor New Vegas team being shafted for missing the 'mark' by a point....

For shame.

A S
Metacritic is an extremely effective solution to a problem: how do I judge if this game is worth my money? Defining it as a problem is a pretty blinkered viewpoint, as it seems superior to a single critic's opinion.

A couple of case studies -

FNV. Current Metacritic score for FNV is 84 on PC, which I think is a good representation of that game. It's very good, but it isn't great, and when it came out it was heavily bugged, meaning this score feels good, if maybe a little generous.

Master of Orion 3. The current Metacritic score on PC is 64. Again, it was an OK game contending with an incredible legacy from MOO2; this feels like a pretty accurate score.

I haven't found any cases where metacritic has really dropped the ball and rated a good game poorly or a terrible game highly, but I have seen plenty of individual reviewers do those things.

It comes down to this: you get rated in life, and it's not always enjoyable. It's not a huge surprise that it's mainly the developers and publishers whose income is directly affected that don't like Metacritic, and yet consumers continue to use it in droves. This is the nature of a competitive market, and the solution is to make better games, not to complain about the rating mechanism.


wes bogdan
While NCAA comes out every year, I have never bought it, as I really don't care for football, basketball, hockey, or Tiger.

Do I have Mario Tennis and Golf? Sure, and most racers are fun.

I just downloaded Rainbow Moon and Quantum Conundrum off PSN and can't wait for Fall of Cybertron and Borderlands 2.

My point is you wouldn't want me reviewing major sports games, as I simply don't care, whereas I'm up for most other titles and love underappreciated gems like Beyond Good & Evil or Brütal Legend.

Jakub Majewski
I'm continually amazed by how much people (i.e., the publishers and developers) care about Metacritic. It's a system that levels Gamespot with Joe's Irregularly Published Game Review Blog (note: that's a website name I just made up, it doesn't exist). It's a system that levels people who have been reviewing games for a decade with amateurs who really always wanted to review a game, and so they created a blog to do so.

Then again, for the last couple of years, I've been getting the impression that increasingly, no one gives a damn about Metacritic. Maybe this debate is a few years too late?

One sidenote - I'm amused to no end by the suggestion that one problem with review scores is that games improve, while review scores do not. Well, dash! How unfair it is that those pesky reviewers don't take into account the pressures we developers face! Why don't they understand that if we release a buggy product only to patch it three months later, they should pay more attention to our feelings (hurt by a low review score) than to the feelings of millions of customers (hurt by the low quality of the game). If anything, I get the impression that game reviews don't pay enough attention to the fact that some games are released in non-releasable form only to be patched later. Take Skyrim, for instance - yes, it is a great game, and yes, it eventually was (will be?) patched enough to deserve those 95% review scores... but right off the bat, all reviewers across the world should have subtracted 20% from their scores as punishment for releasing a game that many players found to be barely playable.

wes bogdan
With Metacritic, a reviewer's score weight should come from what they care about, not because it's Game Informer.

If I reviewed a sports game that I didn't care about, my score would be based on mechanics and playability, but it should count less than that of a sports-heavy reviewer who knows those games inside and out.

Mechanics and playability are important in every game, because the more invisible the controls, the deeper the play, whereas the more visible and broken the controls, the more you're pushed away and have to struggle with the game.

Maria Jayne
The amount of 1s and 10s that Metacritic gets from user reviews suggests to me that the majority of users do not understand the concept of a scale. Therefore any review site that uses a number isn't really saying anything more than "good/bad", where good is in the top 25% and bad is everything else, according to bonus payout clauses.

The problem with verbose scales like Netflix's is that Metacritic can still reduce them down to a number.

As an example

Hated it = 1
Didn't like it = 2
Liked it = 3
Really liked it = 4
Loved it = 5

Just because a site uses words doesn't mean they won't be twisted into a numerical score. As long as we care more about arbitrary numbers than we do about experiences, this won't change. We don't care enough about somebody's experience to actually read what they say; often we just want a number so we can decide if it's worth bothering with or moving on.

Michael Joseph
It's like gamification of reviews... what's not to like?

John Byrd
No. http://en.wikipedia.org/wiki/Betteridge's_Law_of_Headlines

Luis Guimaraes
http://www.metacritic.com/game/pc/hunted-the-demons-forge
http://www.metacritic.com/game/pc/borderlands
http://www.metacritic.com/game/pc/half-life-counter-strike
http://www.metacritic.com/game/xbox-360/naruto-shippuden-ultimate-ninja-storm-2
http://www.metacritic.com/game/pc/unreal-tournament-iii
http://www.metacritic.com/game/playstation-3/unreal-tournament-iii

William Anderson
Personally I like the system Rotten Tomatoes has, with a Reviewer Score and a Public Score side by side. I often find myself going by the Public Score and not the scores posted by professional movie reviewers, as they tend to be far more critical than the general public.

Kimo Maru
Metacritic's grading system is, unfortunately, not a great indicator of game quality. I never pay attention to it. Neither are the Critic Scores (truly, it boggles the mind that anyone would read what a "critic" thinks of a game.) However, the feedback in the User Scores section is valuable - even when the gaming community becomes bent on trashing the score by stuffing the ballot with zeros. The overwhelming negative backlash on Mass Effect 3 and Modern Warfare 3 spoke volumes for those titles (both are turkeys, imo.)

So, yeah, Metacritic's not perfect but it's hard to get better feedback on a new release anywhere else at the moment. I'm sure that'll eventually change.

Patrick Haslow
I can sum up my feelings about metacritic best in one word:

"journalists"

Sorry to say, but the credentials of far too many reviewers are quite questionable. It's part of a larger issue of credibility for web-based journalism in general, and it takes a lot of sorting to find the good amongst the amateurs. Sometimes when I read game reviews online, they seem like they were written by a college freshman, or worse.

When Metacritic creates an aggregate score with so many frankly junk reviews mixed in, the final score is a really weak measure. In addition, Metacritic's weighting system is in itself a kind of review of the reviewer. That's a lot for me to take on faith, and I am not pleased having my bonuses rely on it either.

Jerry Curlan
Problem is, it's tough for the average gamer to know who The New York Times or Roger Ebert of gaming is. I've seen people react to a low EDGE score by saying that it must be a blog or a fringe site. EDGE? One of the most respected magazines in the world. So sure, everyone knows Game Informer, but do they know who Tom Chick is? Do they know the writers who comprise the new Polygon?

Jonathan Jou
As a mathy person, I think it's worth noting that score aggregation sites like Metacritic aren't the "problem," they just don't go *far enough*!

The average score is meaningless without the standard deviation, and telling me how a publication's reviews compare to other publications' ignores the ever-changing roster of reviewers that publication has.

I believe Metacritic needs to go all the way! Specifically, these sorts of things would give me the "filter" I'd need to really understand what reviewers think about a game:

1. Range, Median, Standard Deviation, Average. Why rely on averages when you can see the distribution directly? You can do this without listing confusing values, just show me a bell curve with all the scores, or a heatmap on a bar from 0 to 100. I want to know if a game is controversial, or if a game is undisputed in its quality. I want to know if someone hated it, and I want to know who's giving it a perfect score. I like reading extreme reviews, because they usually have something to say, even if I don't agree with it.

2. Reviewer profiles: Based on each reviewer, I'd like a tag cloud of words most commonly used in reviews, review counts by genre, recent reviews... the works! Is this reviewer a cynic, who uses words like "lackluster" or "mediocre" where other people are using "standard"? Is this reviewer an avid FPS player, who was for whatever reason given the thankless job of reviewing a JRPG? What does this reviewer think about other games in the genre, and games outside the genre? How many reviews has this reviewer written? If I need to know whether I should agree with a reviewer, these are the things I go through, and this is what would make Metacritic a critique of critics!

3. Publication trends: Show me how the scores are trending! Is Gamespot suddenly trying to give out scores that actually span the whole spectrum evenly? Are some words popping up unusually often in recent reviews? (Tag clouds are fun, and being confused by too much information is better than being misled by oversimplification!) Are some genres being treated with obvious mercy while other genres are being hammered? Are some reviewers giving wildly different scores than they normally do? Are they playing genres they've never reviewed before?

4. What about sales? Can Metacritic get that data? Can it compare the correlation between a publication's reviews and sales? Can it tell me if a game that is being panned is still selling better than the critical darling? It's always fun to see if reviews drive sales, and so including sales in the aggregate data will probably really help people give some games a chance!

I appreciate that Metacritic might be sneakily weighting their reviews. Maybe they could get Metametacritical, and have gamers be able to upvote or downvote reviews to adjust those weights for them?

Anyway, that's my spiel. I like statistics. Show me more statistics, don't hide everything behind a number.
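Point 1 of this wish list is easy to sketch with the standard library. A minimal Python illustration, with invented score sets, showing how two games with respectable averages can have very different distributions:

```python
import statistics

# Sketch of the distribution summary proposed in point 1: report the
# spread, not just an average. All scores below are invented examples.
def score_summary(scores):
    return {
        "mean": statistics.mean(scores),
        "median": statistics.median(scores),
        "stdev": statistics.pstdev(scores),  # population std deviation
        "range": (min(scores), max(scores)),
    }

consensus = [78, 80, 79, 81, 80, 82]       # undisputed quality: tight spread
controversial = [40, 95, 60, 100, 55, 90]  # divisive game: wide spread
```

`score_summary(consensus)` and `score_summary(controversial)` have similar means, but the range and standard deviation immediately distinguish the undisputed game from the controversial one, which a single aggregate number hides.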

Matt Cratty
If it wasn't for user reviews on metacritic, I'd never be able to make a confident game-buying decision again.

NOT BECAUSE the user average is worth the digital bits it's made of, but because I can read through them and find people who express similar gaming influences and loves, and then get a picture of what gamers like ME think of this game. In other words, I can get context.

I couldn't care less about what a commercial review site/magazine thinks, for the most part. I'm not about to suggest that they are on the take like many do, but they have simply led me into a bad decision one too many times, because they are now speaking to a much larger audience than when they were relevant and a useful part of the decision-making process. But there isn't a way to fix this with just additional hoops to jump through, because it will be the same people providing the input.

One note: reviewer profiles are not a good idea as only the mass market brain rot would hold sway. Just like in real reviews.

Yes, my solution does nothing for the mass market and the people that don't want to waste time filtering through reviews for context. But, like Jason Pineo said above, numbers are nothing and context is everything (paraphrase).

Ara Shirinian
The problem with things like Metacritic is that the industry (publishers) always does its best to most efficiently exploit such systems to the fullest extent. Metacritic isn't the biggest problem. The biggest problem is an industry that sees things like it, reviews, journalists and press as nothing more than tools to be ruthlessly exploited. If the press is not constantly vigilant and fighting against such forces, they will continue to be overwhelmed, and their meaning will continue to be compromised.

Metacritic didn't kill the video game review, that's been dead for some time already. They've just stuffed it to cover up the smell and attached puppet wires to it so we can all look at it, see it move and reassure ourselves that everything's still fine.


