The quality of a video game ultimately depends on the developer's creative vision. However, evaluating how close the final game experience comes to that original intent is often difficult. Games User Research (GUR) is an emerging field that aims to bring the product closer to the developer's intent in terms of player experience. GUR analyzes the interaction between players and games to gather insights from players that improve game design (see here for a Gamasutra feature on what GUR is).
Games user researchers use many different methods, tools, and techniques to gather information about players and their gameplay experience. This information is used to support game developers' design decisions (see here for another Gamasutra feature on GUR methods).
When conducting user test sessions (whether internally or through contractors), applying the correct methods and thorough data analysis improves nothing if the final GUR results are (1) not communicated well, (2) not convincing, or (3) not actionable for the development team.
In this feature we focus on a little-explored area of GUR: reporting. We look at how user test (UT) findings are communicated among game development teams, and how game development can benefit from storytelling and storyboarding techniques (which are common in web development).
To explore storyboarding, we interviewed six game development professionals from midsize UK game design studios. We wanted to know how they communicate UT findings, what the limitations of current approaches are, and how those approaches can be improved.
For our developers, the main value of UT sessions is to see: (1) areas of frustration, (2) areas that are difficult to pass (blockers), (3) whether players are having fun, and (4) whether players understand the game and are using all of its features. They mentioned that an ideal report would be a process that captures a massive amount of UT data and presents it in a way that is easy to make sense of.
For example, on a previous title (a racing game), one of the developers collected game metrics to generate a crash heatmap for each track. They added: "from heatmaps we could see the crashes, but we know they can lead to different experiences. Some of them lead to enjoyment and some lead to frustration; the heatmaps won't show this difference [...] [an ideal report] is somewhere between only seeing the heatmaps and talking to the actual players." They suggested: "for some issues you wouldn't feel a text report could put them in a right context and time line. For example, when interpreting from a text report, there is no way to see the change of pace and enjoyment."
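To make the heatmap idea concrete, here is a minimal sketch of how crash telemetry might be binned into a per-track grid. This is not the studio's actual pipeline; the track name, coordinate range, grid resolution, and event format are all illustrative assumptions.

```python
# Hypothetical sketch: aggregating crash telemetry into per-track heatmap grids.
# Coordinate extent, grid size, and event tuples are illustrative assumptions.
from collections import defaultdict

GRID_SIZE = 10          # assumed resolution: 10x10 cells per track
TRACK_EXTENT = 1000.0   # assumed track coordinates range from 0 to 1000 units

def crash_heatmap(events):
    """Count crash events per grid cell for each track.

    `events` is an iterable of (track, x, y) tuples, e.g. from telemetry logs.
    Returns {track: {(cell_x, cell_y): count}}.
    """
    heatmaps = defaultdict(lambda: defaultdict(int))
    for track, x, y in events:
        # Map world coordinates to a grid cell, clamping to the last cell.
        cx = min(int(x / TRACK_EXTENT * GRID_SIZE), GRID_SIZE - 1)
        cy = min(int(y / TRACK_EXTENT * GRID_SIZE), GRID_SIZE - 1)
        heatmaps[track][(cx, cy)] += 1
    return heatmaps

# Usage: the hottest cell on a track marks a candidate spot to review on video.
events = [("coast", 120, 340), ("coast", 125, 333), ("coast", 980, 50)]
maps = crash_heatmap(events)
hottest = max(maps["coast"], key=maps["coast"].get)
```

As the developer notes, a count like this shows *where* crashes cluster but not whether they were fun or frustrating; that distinction still requires watching the session video or talking to players.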
Let's summarize the most important aspects of UT reports identified from these interviews:
The report summary is the section they all read and found most useful. All of our developers think a good UT report should show an at-a-glance summary of a level.
The location of issues in each level is an important factor in prioritizing fixes. Our developers want to see exactly where the issues occurred. For example, they said the "UT report should show me where my good and worst parts are, this can help me to prioritize what to fix. It is more concerning if a negative experience is happening at the beginning of the game, because people can easily drop out there."
Trust is vital for UT reports; if the development team does not trust the results, the problems will remain. Our interviews suggest that the most convincing case is when developers personally attend UT sessions and have a face-to-face conversation with players, or watch the gameplay video. They also said "to trust a UT report, it should provide evidence of why something is wrong with the game; only stating the problems won't be enough."
Our developers want UT reports that enable them to compare the player's actual experience with the experience they intended to design. They think it would be great if a report could show how closely the game matches their design intent. They suggested: "if using the UT report as a comparison tool, it has to distinguish between usability, user experience, and pace in the game." For example, a usability issue is never intended, and it is not a legitimate way of pacing a game.