[In this analysis piece, Gamasutra news director Leigh Alexander looks at recent gaming interface trends, suggesting audiences always want the most abstract route possible -- with notable implications for social media and gamification.]
The shortest distance between two points is a straight line. Or, that's what people say, generally when they mean that the most direct way to solve a problem is ideal.
Although the 1980s gave us novelty power gloves and game pads, the most popular games were the ones that required only a D-pad and a couple of buttons: press A to do this, B to do that. Simple. And yet the direct route isn't always very exciting.
We want to know where the flying cars are, of course. We were supposed to have our flying cars, despite the fact that their introduction onto the landscape would cause absolute logistical nightmares for existing cars, roads, trains, air travel, you name it.
So we play with interfaces. We're preoccupied with ways to make real life more exciting, to use technology to make our behaviors feel magical. So while pressing B is probably the most efficient way to jump, the past several years have seen a surge in simulated interfaces -- ones where the inputs are imitations of real behaviors.
These include plastic guitars; motion-control wands that let you "hold" tennis rackets, golf clubs and, naturally, wands; or virtual spaces navigated by an avatar.
The latest frontier, camera-controlled gaming as pioneered by Microsoft's Kinect, removes abstraction almost entirely, in favor of letting the player use his or her own body to literally simulate behaviors that will be acted out in the game.
Getting your average person -- one not particularly versed in gaming, for example -- to understand that a hand wave translates into an in-game behavior might be easier than asking that person to learn a controller button combination with the same effect. But while literal simulation may be more immediately comprehensible, the idea that it makes for a more efficient interface is largely fallacious.
The '3D Web' Fad
For recent years' most obvious example, look at the sharp rise and sudden implosion of the "virtual worlds" craze as it stood as recently as five years ago: Second Life made magazine covers, an entire industry sprang up around "virtual events," and whole schools of thought envisioned a "3D Web" in the immediate future.
According to that vision, nobody would simply open a web browser and click around boring web pages anymore. Stores, media sites and social utilities would be represented as 3D virtual spaces that users would navigate with an avatar.
There would no longer be "walled garden" virtual worlds; everything would be interconnected, with a "universal avatar" that could freely wander the web, meeting other avatars, creating art and merchandise. Your avatar strolls into a bustling virtual marketplace, purchases a T-shirt, and dons the virtual version while the real version gets shipped to your home for you to wear.
It sounds exciting, and even logical. As people become ever more entrenched in the internet, from commerce to personal social behavior, it makes sense to assume they'd want the experience to be ever more lifelike, ever more immersive.
But as it turns out, the only real market for virtual worlds consisted of those early adopters -- futurists and consumers of science fiction who enjoyed playing with the concepts they could only read about as kids. There might have been a rush of excitement, but the virtual world phenomenon never grew as big as expected.
As it turned out, that's because your average web user doesn't want to enter a virtual environment, create a customizable avatar, walk through a virtual plaza and interact with the avatar of a shopkeeper in order to make a purchase. Instead, they can just type in "Amazon.com", click what they want, and be done. All of that simulation just complicates things; an abstracted "shop" is much easier.
So virtual environments remained constrained to that limited audience of enthusiasts or people who had found very individualized, specific applications for the concept. The Second Life craze passed by; people flocked to Facebook, where a largely text-based interface that was essentially a network of interconnected, glorified bulletin boards turned out to be more accessible and more effective.
Waving At The Games Space
If we look at the video game industry, products that use simulated rather than abstracted interfaces have largely proven to be fads too, as in the case of instrument-based music games. Or at best, the addressable audience turns out to be much smaller than initial reception suggests.
There's no denying that the Wii reached an unprecedented userbase and revolutionized gaming by attracting brand-new audiences that had previously been unreachable, providing a huge market to explore new game design ideas.
But the hardware falloff has been steep since the Wii reached market saturation, and software sales have been challenging, particularly among the most active market of gamers. Although it's impossible to prove, the theory that a good chunk of Wii buyers bought the console as a toy or on a fitness promise, then let it gather dust once the novelty wore off, is at least viable.
Even developers and publishers that have thrived in the casual software market now see less opportunity on the Wii platform these days. Aside from Nintendo's own Wii Sports, sports titles sell less briskly, and the music craze appears officially over. It's arguable that Ubisoft stumbled onto such an enormous hit with its Just Dance brand only because that particular category had not yet been explored.
Even in the absence of data, it's evident that literal-interface products follow a quick spike-and-falloff pattern: a curiosity gains strong initial attention, then loses it once people realize it's much easier to push a button than to swing their arms around.
Social Media And Gamifiers Take Heed
Now that social media's migration from the province of the tech-savvy to an everyman phenomenon is well underway, lessons in how mainstream audiences naturally gravitate toward abstract, not literal, interfaces become especially important.
Numerous new companies are popping up hoping to find exciting, engaging ways of using frameworks like Twitter, Facebook and mobile networks to engage audiences in play and commerce. But unless they can keep the abstraction concept in mind, it's going to be tough to permanently engage anyone on a meaningful scale.
Because socialization is basic. The most direct way to engage with your friend is to turn to her and talk. Some of your friends will be interested in gaming or digital culture and some won't.
Even if you're with someone from the former group, will it genuinely enrich your interaction with them to play a geolocation game where you have to complete "challenges" against each other in public to win digital badges or whatever? Me, I'd say, "uh, let's just go see the movie, dude."
And your friend who's less plugged in? Will he appreciate that you're not really present for his conversation over lunch because you're trying to check in on Foursquare to let everyone know that you're at this restaurant having lunch with him?
Ultimately, these interfaces aren't "social", as much as it sounds like they are in presentations to venture capital firms. And if it's simpler to connect with someone directly, people are going to want to do that -- to abstract the interface -- rather than to use these tools.
Sometimes the digital world looms too large in our minds. We live in, work in and love this space, and we forget that, as interconnected with our lives as it's becoming, it's not our whole lives. Game design concepts engage and motivate people when they've chosen to take time to sit down and play.
But try to implement them all over a person's working and personal life, install ideas of achievement, reward and connectivity all over everything, and people will naturally start to resist. There are a lot of ways to engage people, but creating or elaborating interfaces where none are needed creates a product or trend that's fun in concept, but has surprisingly little staying power in practice.