Gamasutra: The Art & Business of Making Games
Mixed-reality VR Twitch streaming
by Sarah Northway on 01/11/16 01:46:00 pm


The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

 

Mixed-reality streaming is as cool as it sounds. We've started live streaming our VR building game Fantastic Contraption every Thursday at noon PST on our Twitch stream, and the response has been pretty exciting. Here's a highlight from our first official stream:

Twitch isn't just for e-sports and speedruns anymore; it's becoming downright mainstream as a marketing tool: a way for people to check out games before buying, participate in events, and obsess over games at work or any other moment they can't be playing them (guilty!). But for months we've been asking: how the heck do you stream virtual reality games? Especially room-scale VR on the HTC Vive?

The standard picture-in-picture game footage + webcam technique doesn't do VR justice. The first-person in-game feed from VR games gives at best a cropped, distorted view of what the player is actually seeing, and talking heads wearing VR headsets are even duller than regular talking heads. After recording some YouTube gameplay sessions and a few experiments with combining real life and VR feeds in Premiere, we did some brainstorming.

Then we geared up:

Our green screen studio

Our Twitch studio:

  • 400 sq ft of green muslin + 2 layers of black muslin
  • a 4-piece 2,000-watt light kit
  • Logitech C930e webcam with tripod
  • Blue Yeti mic

Total: $1,000 CAD.

Our living room has huge windows on two sides, so it was a challenge to keep the green screen lighting consistent (bed sheets and cardboard were involved). But we discovered that our webcam feed has considerably less lag during the day when all that natural light lowers exposure time.

Three-camera OBS approach

Our first trials used OBS to combine three views. We stuck a webcam on a tripod and synced its position with two in-game third-person cameras by lining up the real and virtual hand controllers.

One game camera only sees foreground objects located between the Vive headset and the camera, and the other sees the sky, ground, and objects behind the headset. To divide the two, we first tried using a clipping plane (easily done in Unity), but that caused issues with our semi-transparent objects.

A better solution was to blip entire game objects between two visibility layers by testing their positions at each frame update. We compensated for Unity's brief delay between the two in-game camera renders by having objects briefly appear on both layers at once.
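The per-frame test behind the layer blip can be sketched as follows. This is a Python illustration only (the game itself is Unity C#, and every name here is hypothetical): an object counts as "foreground" when it lies between the physical camera and the headset along the camera-to-headset axis.

```python
# Illustrative sketch of the per-frame foreground/background test used to
# "blip" objects between the two visibility layers. Positions are (x, y, z)
# tuples; all function names are hypothetical.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def classify(obj_pos, camera_pos, headset_pos):
    """Return 'foreground' if obj lies between the camera and the plane
    through the headset perpendicular to the camera->headset axis."""
    axis = sub(headset_pos, camera_pos)            # camera -> headset direction
    head_depth = dot(axis, axis)                   # headset depth along the axis
    obj_depth = dot(sub(obj_pos, camera_pos), axis)
    return "foreground" if 0 <= obj_depth < head_depth else "background"
```

In Unity, the result of this test would drive which layer each object sits on (and hence which of the two cameras' culling masks renders it), with the one-frame overlap described above to hide the render-order delay.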

We output the in-game cameras side-by-side, each taking up 1/4 of the regular screen, then stacked them together in OBS with the chromakeyed webcam layer in the middle:

OBS Screenshot showing input and output streams

We wanted to have spectators in the stream, so we added a couch on the left side of the shot. They can watch the final feed on a monitor across the room, giving the eerie impression that they actually see what's going on in VR right in front of them.

The whole effect is also not half bad if you film in front of a white wall instead of a green screen. In an early test, we had OBS overlay the background game camera onto the webcam view at 50% transparency, then add the foreground on top at full visibility. The brighter white room also improves webcam latency, which can be an issue if the webcam starts to lag noticeably behind the game cameras.
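The white-wall compositing order can be written out per pixel. This is just a sketch of the blend OBS performs internally, assuming the foreground layer carries an alpha mask where game objects are drawn:

```python
# Per-pixel sketch of the white-wall fallback: background game camera at 50%
# over the webcam, then the foreground layer at full visibility on top.
# Pixels are (r, g, b) tuples of floats in [0, 1]; purely illustrative.

def composite(webcam_px, bg_px, fg_px, fg_alpha):
    """fg_alpha is 1.0 where a foreground object is drawn, else 0.0."""
    # Step 1: background game camera overlaid at 50% transparency.
    mid = tuple(0.5 * b + 0.5 * w for b, w in zip(bg_px, webcam_px))
    # Step 2: foreground objects at full visibility on top.
    return tuple(fg_alpha * f + (1 - fg_alpha) * m for f, m in zip(fg_px, mid))
```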

Moving plane in-game processing approach

For our next stream we're going to try piping the live webcam feed into the game itself, so we can display it on a moving plane in the game after doing the chromakeying ourselves. Thanks to Edwon for this suggestion and help! This single-game-camera technique should remove the glitches caused by blipping objects between visibility layers, and will increase our output resolution because we can use the full screen instead of 1/4. Here's how it'll work (courtesy of Edwon):
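Doing the chromakeying ourselves boils down to a per-pixel greenness test before the webcam texture is drawn on the plane. In the game this would live in a Unity shader; the Python sketch below only shows the thresholding idea, and the threshold value is an assumption, not a tuned number from our setup:

```python
# Minimal chromakey sketch: make "green enough" webcam pixels transparent.
# Channels are floats in [0, 1]; the 0.15 threshold is an illustrative guess.

def chroma_key_alpha(r, g, b, threshold=0.15):
    """Return 0.0 (transparent) when green dominates both red and blue by
    more than `threshold`, else 1.0 (opaque)."""
    greenness = g - max(r, b)
    return 0.0 if greenness > threshold else 1.0
```

A real shader version would also soften the edge (a smoothstep between two thresholds) and suppress green spill, but the cut itself is this simple.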

The added challenge for us is to include the ground in our shot without feet disappearing under Contraption's grass floor. We're attempting this by cutting the webcam footage in half: the top half is displayed facing the camera as above, and the bottom half is laid flat just above the ground, skewed based on its angle to the camera.
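One way to approximate that skew (a sketch of our own devising, not the final math) is to stretch the flat half to undo the foreshortening of the ground as seen from the camera, which grows as the camera's grazing angle to the ground shrinks:

```python
import math

def ground_stretch(camera_height, distance):
    """Stretch factor for the flat (bottom) half of the webcam feed laid on
    the ground: foreshortening goes as 1/sin of the camera's grazing angle
    to the ground at the player's feet. Illustrative approximation only."""
    grazing = math.atan2(camera_height, distance)  # angle between line of sight and ground
    return 1.0 / math.sin(grazing)
```

At a 45-degree grazing angle the flat half needs roughly a 1.4x stretch; as the camera drops toward floor level the factor blows up, which matches the intuition that a near-horizontal camera barely sees the ground at all.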

Other tools

We've added some in-game tools to use while streaming, including a floating Twitch comments feed that only the player can see, so they can respond live to everyone watching. We also have director controls that let our "couchies" swap the view between various game cameras: 3rd person mixed, 1st person view, and 3rd person flyable mode which shows the player as an avatar. Like the Twitch comments panel, the current camera is visible only to the player (it looks like a flying eyeball).

We're hoping to see more VR games join Twitch's new Virtual Reality game category soon, and are excited to see what new ideas come out of this. Stay tuned for more live mixed-reality experiments!






