Study: Review Scores Directly Impact Consumer Purchasing, Opinion
July 7, 2010 | By Leigh Alexander



The relationship between review scores and sales is often questioned, but a groundbreaking new study finds that high review scores do indeed push video game sales -- even when players make their own quality assessments at the same time.

The Guildhall at SMU, together with analyst group EEDAR, studied a total of 165 people who'd never before played PopCap's popular Plants vs. Zombies -- the only game chosen for the study, due to what was described as its unique combination of high quality and mass appeal.

Study participants were split into three groups: one exposed, prior to playing, to high-scoring reviews of the game; a second exposed beforehand to negative reviews; and a control group that was shown no reviews. After playing for 20 minutes, players were asked to give their own evaluation of the game.

On completing the study, participants got a choice: Take $10, or take a free copy of the game. Fascinatingly, "participants exposed to higher review scores were twice as likely to take a copy of Plants vs. Zombies over the $10 cash, and 85 percent more likely to take the game than the control group," says the study. And they were 121 percent more likely to take the game than were those who had seen poor scores beforehand. Players who asked about the game's real-world online or retail pricing didn't get an answer.
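
As a rough sketch of how those relative figures fit together -- assuming "twice as likely to take a copy over the $10" means two-to-one odds, an interpretation the study summary doesn't spell out -- the implied take rates can be reconstructed in a few lines of Python:

    # Back-of-the-envelope reconstruction of the take rates implied by the
    # article's relative figures. Treating "twice as likely to take the game
    # over the $10 cash" as 2:1 odds is an assumption, not something the
    # study states outright.
    odds_positive = 2.0
    p_positive = odds_positive / (1 + odds_positive)  # share taking the game

    p_control = p_positive / 1.85   # "85 percent more likely" than control
    p_negative = p_positive / 2.21  # "121 percent more likely" than low-score group

    print(f"positive-review group: {p_positive:.0%}")  # ~67%
    print(f"control group:         {p_control:.0%}")   # ~36%
    print(f"negative-review group: {p_negative:.0%}")  # ~30%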

"The EEDAR/SMU study posits that the relationship between video game sales and professional review scores are not correlative but causal," it suggests.

Critical opinion affects not only purchase willingness but also a player's own opinion: the group that saw high scores offered their own scores that were, on average, 20 percent higher than those of the group shown low scores, but only 6 percent higher than those of the control group, which had not seen any scores.

"Consumer review scores had a greater variance from the mean than professional critic review scores, which had a tighter clustering around the mean," says the study. "The review score standard deviations of all the experimental groups were significantly higher than the standard deviation of professional reviews. Participant review scores ranged from 40 to 100, whilst the range for professional reviewers ranged from 60 to 100."

Being exposed to positive reviews also has a strong effect on whether players will recommend a game to a friend: "91% of participants exposed to high review scores for Plants vs. Zombies would recommend the product to a friend if they were asked to recommend a 'good game to play,' compared to only 65 percent from Group B (low review scores) and 80 percent from the control group."


Comments


Chris Remo
This seems consistent with similar psychological studies on other topics. I'd be interested to see research on how many video game consumers are actually exposed to game reviews on a regular basis; that's probably a very meaningful metric.

Robert Hale
It won't be restricted to just reviews, though, but to any opinions you've been exposed to about the game. I may not read any reviews for a game, but if my friends have told me they think it's bad, then I'm already tainted and predisposed to think the game is bad when I eventually play it.



The article suggests that it's the score of the reviews that affected people's opinions, but participants were required to read the entire review, not just look at the score. The score alone will have an effect, but opinions will be largely formed by the content of the reviews. A review that praises the good parts of a game and ignores the bad will likely elicit a similar response from a player who read only that review, while one that focuses on the bugs will influence the player to also focus on the bugs, as they are then looking out for them in an effort to reconcile the content of the review with their own experience.

Todd Boyd
This is just another incarnation of the "cookie scarcity" experiment: two jars of cookies, one mostly empty and one full, were placed in front of subjects. An overwhelming number of subjects rated the "scarce" cookies higher than the "abundant" cookies--even though they were the same cookies from the same box.



Preconceptions go a long way.

Sean Parton
The interesting part is that this is a casual game (so there's a high chance of appeal regardless, as the study noted), but also that they used professional reviews for the study. Considering the usual stereotype that casual gamers don't care about professional reviews, it's an interesting find indeed.



I wonder what would happen if user reviews were used instead of professional reviews? Or if they used a more core game instead of a (still excellent but nonetheless) casual game?

Jonathan Gilmore
I think that game purchases, at $60, involve a lot more risk than a movie ticket purchase, and that's why reviews matter so much more. It's much rarer for a critic favorite to bomb, or for a critically trashed game to be a hit, than it is with movies, where people are a lot more likely to ignore critics.

Michael Clarkson
The title of the article appears to be misleading, if the content of the article is accurate. From the description of methods, it appears that positive reviews, not positive scores, influenced the opinions of the various groups. Numerical scores and written reviews are distinct influences, and given the ongoing discussion about Metacritic's (possibly undue) influence it is of value to distinguish them carefully in both the science and the reporting.



This article extensively quotes the study but leaves out vital information. Was this study peer-reviewed and published in a journal? If so, the article should provide as full a citation as possible, and if not, discuss the conditions under which this information was released. Is the study available online? If so, the article ought to provide a link. Did Guildhall and EEDAR fund this study themselves, or was there external support (and from whom)?



In addition, it's best, and I would even argue it's absolutely necessary, to get a comment from someone outside the study as to its implications and scope of validity. There are a number of reputable psychologists (with online presence, even!) who could discuss this intelligently. Such an expert might offer some guidance, for instance, on the question of whether the 6% difference in participant scores between the positive review group and control is statistically significant. I note that these are pervasive problems with contemporary science journalism; I don't mean to single Leigh out.



On the science side, I feel like this study needed some additional conditions. In particular, there should have been groups that read positive/negative reviews of a different game, in order to determine whether the priming effect is general or specific. Also, to address the Metacritic question, some groups that were just exposed to scores (without the reviews) would have been a welcome addition. Adding groups with a delay between reading reviews and playing the game would have been difficult logistically, but would have provided an interesting perspective on whether this was a long-term effect or simple priming.
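
As an illustration of the significance check Clarkson raises, one standard approach would be a two-sample t-test on the participant scores of the positive-review and control groups; the numbers below are synthetic stand-ins, since the study's per-participant scores aren't published here:

    # Sketch of how one might test whether the ~6 percent difference between
    # the positive-review group and the control group is statistically
    # significant. These score lists are synthetic stand-ins, not study data.
    from scipy import stats

    positive_group = [80, 85, 90, 75, 95, 88, 70, 92, 85, 78]
    control_group = [75, 80, 85, 70, 90, 82, 65, 88, 80, 72]

    t_stat, p_value = stats.ttest_ind(positive_group, control_group)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
    # With roughly 55 participants per group (165 / 3), a 6 percent gap could
    # fall on either side of p = 0.05 depending on variance -- which is why
    # the raw data would matter.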

Kim Pallister
Chris makes a good point. Interesting experiment that seems to point to positive reviews being better than none or negative ones, but doesn't look at whether consumers read reviews in general. Still, an interesting data point.

Michael Smith
According to their data, negative reviews have a more significant impact than positive reviews. That's interesting.



However, without looking at the percentage of gamers who are exposed to reviews, you can't estimate the real impact of reviews on purchases. It seems, anecdotally, that only a small portion of gamers is exposed to reviews before a purchase decision, and that those who are exposed most often see just a numerical score. I think they're more likely to be exposed to a casual review than a professional one. So, if we're questioning the value of professional game critics in purchase decisions, I think it's much lower than this study suggests. But yes, a score does have enough impact on purchasing that it would matter to publishers. It's too bad, because numerical scores are misleading.
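
As a quick check of the asymmetry Smith describes, using the recommendation rates the article quotes (91 percent for the high-score group, 80 percent for the control group, 65 percent for the low-score group):

    # Lift and drop in recommendation rates relative to the control group,
    # using the percentages quoted in the article.
    control, positive, negative = 0.80, 0.91, 0.65

    print(f"positive lift: {positive - control:+.0%}")  # +11 points
    print(f"negative drop: {negative - control:+.0%}")  # -15 points
    # The drop from low scores (15 points) exceeds the lift from high scores
    # (11 points), consistent with negative reviews weighing more.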

