Valve's wearable computing ace discusses the challenges facing VR
 

May 16, 2013   |   By Kris Ligman

More: Console/PC, Programming, Art, Design





Following up on his virtual reality talk at this year's Game Developers Conference, Michael Abrash has written a new blog post on some of the technical challenges of making hardware like the Oculus Rift feel "real" to the human brain.

There are three broad factors that affect how real – or unreal – virtual scenes seem to us, as I discussed in my GDC talk: tracking, latency, and the way in which the display interacts perceptually with the eye and the brain. Accurate tracking and low latency are required so that images can be drawn in the right place at the right time; I’ve previously talked about latency, and I’ll talk about tracking one of these days, but right now I’m going to treat latency and tracking as solved problems so we can peel the onion another layer and dive into the interaction of head mounted displays with the human visual system, and the perceptual effects thereof.
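Abrash's point that accurate tracking and low latency are needed "so that images can be drawn in the right place at the right time" can be made concrete with a small sketch. This is our illustration, not code from Abrash's post: a naive constant-velocity predictor that extrapolates head orientation forward by the motion-to-photon latency, so the frame is rendered for where the head will be when the display lights up.

```python
def predict_yaw(yaw_deg, angular_velocity_dps, latency_s):
    """Extrapolate head yaw to the moment the frame reaches the display.

    Constant-velocity prediction: render the scene for where the head
    will be when the photons arrive, not where it was when the tracker
    was last sampled.
    """
    return yaw_deg + angular_velocity_dps * latency_s

# A brisk 200 deg/s head turn with 20 ms of motion-to-photon latency
# leaves a 4-degree registration error if nothing is predicted:
uncorrected_error_deg = 200.0 * 0.020          # 4.0 degrees
predicted = predict_yaw(30.0, 200.0, 0.020)    # 34.0 degrees
```

Even this toy model shows why latency matters: a few tens of milliseconds translate into several degrees of error during ordinary head motion, and prediction only helps when the motion estimate itself is accurate, which is why Abrash treats tracking and latency as prerequisites before discussing display perception.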

The article offers an in-depth discussion of the actual nuts and bolts of human visual perception and how head-mounted devices like the Oculus Rift need to address them.

"More informally, you could think of this line of investigation as: 'Why VR and AR aren't just a matter of putting a display an inch in front of each eye and rendering images at the right time in the right place,'" Abrash writes.

The post is the first in a series and a valuable read. Readers may also be interested in watching Abrash's 25-minute talk on the GDC Vault, or in the slides available from Abrash's blog.
 
 
 
Comments

Simon Ludgate
Why does no one ever talk about the strain of forcing the eye's lens to focus on an image an inch away, never mind the fact that the lens never changes focus when objects are meant to be moving closer or farther away? Depth perception does not derive from stereoscopic vision alone; depth of focus matters just as much.

Until devices can accurately deliver deep focus images, 3D will never really work properly.

