Gamasutra: The Art & Business of Making Games
The Player Becomes The Producer: Natural User Interface Design Impact


February 24, 2011 (Page 3 of 4)

Iconography

The communication of the game, and the player's subsequent interpretation of that communication, both rely heavily on the effectiveness of iconography. Traditionally, we are used to selecting graphical representations for a game and using them until release.

The effectiveness of the iconographical mechanisms can and should be tested exhaustively with a usability program. It is no exaggeration to say that choosing one graphic over another can prevent untold numbers of problems. Truly, a picture can be worth a thousand words!

Iconography Example 1

For a demo featuring a driving mechanic, we tried a number of techniques to instruct the player how to steer a car. Everything from onscreen animated arrows to speech was attempted. Getting the player to understand that they should extend their arms and steer in mid-air proved to be a tricky problem.

One of the artists had an idea and simply placed a graphic of a steering wheel onscreen. In tests, this worked perfectly; clearly, using the right image to trigger real-world actions is the most compelling way to interact with a NUI.

Iconography Example 2

During the development of Yoostar 2, the usability testing yielded a lot of very useful information. On the first screen we had a slider button to start the whole experience. The design went through several iterations until we settled on the single 'hover' button solution.


[Figure: Evolution of the START button]

The development of this button was driven by user feedback from the usability testing. To illustrate the kind of feedback we were working from, here are a couple of excerpts from the usability reports:

“How to interact with the 'handles and rails' UI component is not intuitive. Visually, it resembles a VCR-style play button with corresponding descriptive text. As a result, most users tried to push or grab the button as there is no indication that it should slide.”

“Many users hovered over the handle, then began the swipe action to select their choice. However, if they didn't move in a horizontal line, it was common for the action to be cancelled, as the cursor had moved too far in the vertical direction. Typically, users gesture in a diagonal motion, not horizontal.”

“Some users had difficulty staying within a handle. The 'gravity well' effect worked well in snapping them to the handle, but it was sometimes not strong enough to keep them contained within it.”
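The two behaviours described in these reports, cancelling a swipe that drifts too far vertically and snapping the cursor to a handle with a 'gravity well', can be sketched roughly as follows. Every name, coordinate convention, and threshold here is an illustrative assumption, not the actual Yoostar 2 code:

```python
# Illustrative sketch only: thresholds and the coordinate system
# (normalised hand positions as (x, y) tuples) are assumptions.

SWIPE_DISTANCE = 0.30      # horizontal travel needed to confirm a swipe
VERTICAL_TOLERANCE = 0.10  # vertical drift allowed before cancelling

def update_swipe(start, current):
    """Return 'confirmed', 'cancelled', or 'pending' for a swipe in progress."""
    dx = current[0] - start[0]
    dy = abs(current[1] - start[1])
    if dy > VERTICAL_TOLERANCE:
        return "cancelled"          # drifted too far off the rail
    if dx >= SWIPE_DISTANCE:
        return "confirmed"
    return "pending"

def gravity_well(cursor, handle_centre, radius=0.15, strength=0.5):
    """Pull the cursor part-way toward the handle centre while inside the well."""
    dx = handle_centre[0] - cursor[0]
    dy = handle_centre[1] - cursor[1]
    if (dx * dx + dy * dy) ** 0.5 > radius:
        return cursor               # outside the well: no snapping
    return (cursor[0] + dx * strength, cursor[1] + dy * strength)
```

The reports above map directly onto the two tunables: widening `VERTICAL_TOLERANCE` forgives the diagonal motion users naturally make, and raising `strength` makes the well better at keeping users contained in the handle.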

Speech Control

One of the jewels in the crown of NUI systems is the ability to use voice control. Whether it's phonetics-based or waveform analysis, there is no doubt that speech recognition is the area that requires the most investment in terms of design and technology as compared with the other sensors in Kinect.

The key to success here is twofold: first, make sure that all active words or phrases are very different from one another; second, ensure that the system is only trying to identify a limited number of words or phrases at any one time. Too many words and phrases, or similar-sounding ones, always result in much lower recognition confidence, which inevitably leads to recognition errors. In terms of the player's experience, remembering the phrases, saying them at the right time, and accent issues all complicate matters further.
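The first half of that rule can even be checked mechanically during development. As a rough illustration only (a real speech system would compare phoneme sequences rather than spellings, so treat the threshold and names here as assumptions), a simple edit-distance pass can flag active phrases that are too alike:

```python
# Illustrative confusability check for an active voice vocabulary.
# Plain edit distance on spellings stands in for a proper phonetic
# comparison; the min_distance threshold is an arbitrary assumption.

def edit_distance(a, b):
    """Classic Levenshtein distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def confusable_pairs(phrases, min_distance=3):
    """Return pairs of active phrases likely to be mistaken for one another."""
    return [(a, b)
            for i, a in enumerate(phrases)
            for b in phrases[i + 1:]
            if edit_distance(a, b) < min_distance]
```

Running such a check over each screen's active vocabulary, rather than the whole game's, also enforces the second half of the rule: the recogniser only ever sees a small phrase set at a time.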

Gestures

Gestures are a cornerstone of NUI interaction. Again, restricting the number of gestures that can be recognized at any one time is the best approach. From user testing it is also apparent that players can only remember a certain number of gesture shapes.

These shapes need to be regularly repeated by the player, or at least there must be a non-invasive memory jogger system such as environment shapes or tracing shapes. Additionally, gestures often work best in games where there is a good context for them, for example a wizard game where you are casting gesture "spells".
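One way to picture the "restrict the recognizable gestures" advice is a small registry keyed by game context, so the recogniser only ever considers a handful of shapes at once. The contexts, gesture names, and recogniser interface below are all hypothetical:

```python
# Hypothetical sketch: limiting the active gesture set per game context.

ACTIVE_GESTURES = {
    "spell_casting": {"circle", "zigzag", "triangle"},  # in-game "spells"
    "front_end":     {"swipe_left", "swipe_right"},     # menu navigation
}

def recognise(context, candidate_shapes):
    """Accept only shapes active in the current context.

    candidate_shapes: (shape_name, confidence) pairs from some recogniser.
    Returns the most confident active shape, or None if nothing matches.
    """
    active = ACTIVE_GESTURES.get(context, set())
    matches = [(name, conf) for name, conf in candidate_shapes if name in active]
    return max(matches, key=lambda m: m[1])[0] if matches else None
```

Because inactive shapes are filtered out before the confidence comparison, a stray front-end swipe detected mid-game can never outcompete a legitimate spell gesture.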

Front-end navigation is also a very obvious and viable use of gestural control. As more titles are released, some control systems are clearly more "usable" by a greater part of the gaming population than others. Development guidance started with handles and rails, but seems to be polarising between the pointer-and-'hover'-button system and the arm-sweep menu and swipe system. Interestingly, approximately 20 percent of players seem to have difficulty with even the best gesture systems, indicating that there is still much to learn in this area.
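As a rough illustration of the 'hover' button pattern mentioned above: the cursor must dwell over the button for a fixed time before the selection fires, and moving off resets the timer. The dwell time, geometry, and frame-update interface here are illustrative assumptions:

```python
# Minimal sketch of a dwell-to-select 'hover' button. The 1.5 second
# dwell time and circular hit region are arbitrary illustrative choices.

DWELL_TIME = 1.5  # seconds the cursor must stay over the button to select

class HoverButton:
    def __init__(self, x, y, radius):
        self.x, self.y, self.radius = x, y, radius
        self.hover_elapsed = 0.0

    def contains(self, cx, cy):
        """True if the cursor position lies inside the button's circle."""
        return (cx - self.x) ** 2 + (cy - self.y) ** 2 <= self.radius ** 2

    def update(self, cursor, dt):
        """Advance the dwell timer each frame; return True once the button fires."""
        if self.contains(*cursor):
            self.hover_elapsed += dt
            if self.hover_elapsed >= DWELL_TIME:
                self.hover_elapsed = 0.0
                return True
        else:
            self.hover_elapsed = 0.0  # moving off the button resets the timer
        return False
```

In practice the accumulating `hover_elapsed` value would also drive a visible fill animation, so the player can see the selection charging and back out before it fires.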

