Owlchemy Labs is changing the way viewers watch VR gameplay

October 3, 2016 | By Alissa McAloon

More: VR, Console/PC, Indie, Production



Job Simulator and Rick and Morty Simulator: Virtual Rick-ality developer Owlchemy Labs is working on a way to fuse VR gameplay and green-screen footage in-engine, without the need for any video editing. Known as depth-based realtime in-app mixed reality compositing, the process uses a stereo depth camera, a custom shader, a custom Unity plugin, and a green screen to place users directly and seamlessly into a VR environment.

Typically, watching someone play a VR game is a strange and jittery secondhand experience. Viewers either see the game through a shaky first-person perspective or watch someone else experience VR against a flat, clearly green-screened background. Technology like this in-engine mixed reality process aims to change that and show potential users a smoother, more immersive side of virtual reality.

Owlchemy Labs’ mixed reality tech is able to sense depth and show users moving throughout the environment naturally. Anyone viewing the footage sees the VR player moving behind and under objects in real time, without the need for any external programs or editing.
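Owlchemy Labs has not published its shader, but the occlusion behavior described here can be sketched as a per-pixel depth test: a live-action player pixel is drawn only where the depth camera reports it nearer to the camera than the virtual geometry rendered at that pixel. The sketch below is an illustration under that assumption; all names are hypothetical, and the depth maps are assumed to already be aligned to the in-game camera.

```python
import numpy as np

def depth_composite(player_rgb, player_depth, scene_rgb, scene_depth, player_mask):
    """Per-pixel depth test (illustrative only, not Owlchemy's actual shader).

    player_rgb / player_depth : HxWx3 color and HxW depth from the stereo depth camera
    scene_rgb  / scene_depth  : HxWx3 render and HxW depth buffer from the game camera
    player_mask               : HxW bool, True where green-screen keying kept the player
    """
    # The player is visible only where they were keyed in AND are nearer
    # than the virtual object rendered at that pixel.
    player_in_front = player_mask & (player_depth < scene_depth)
    out = scene_rgb.copy()
    out[player_in_front] = player_rgb[player_in_front]
    return out
```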

The process uses a stereo depth camera to record video and depth data of a user against a green screen. The data is then sent through a custom Unity plugin and shader, which produce the in-engine footage of the user in the game environment.
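Since the plugin and shader themselves are not public, the step below is only a stand-in for the keying half of that per-frame flow: a deliberately crude chroma key that produces the player mask consumed by the depth-composite sketch above. The function name and threshold are assumptions for illustration.

```python
import numpy as np

def key_green_screen(rgb, threshold=0.15):
    """Crude chroma key (placeholder for whatever keyer is actually used):
    keep pixels where green does not clearly dominate red and blue.
    rgb is an HxWx3 float array in [0, 1]; returns an HxW bool mask."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (g - np.maximum(r, b)) < threshold

# One frame of the hypothetical flow: the mask from the key and the depth
# frame from the stereo camera feed the depth_composite() sketch above.
#   mask  = key_green_screen(camera_rgb)
#   frame = depth_composite(camera_rgb, camera_depth, scene_rgb, scene_depth, mask)
```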

Owlchemy Labs notes that its own Job Simulator was a particularly difficult application of the technology. “Essentially, Job Simulator is the worst case scenario for mixed reality, as we can’t get away with simple foreground / background sorting where the player is essentially static and all the action happens in front of them (a la Space Pirate Trainer). If a solution can work for Job Simulator, it can likely be a universal solution for all content.”

The tech isn’t quite ready to make its way out into the wild yet, but Owlchemy Labs is still developing its take on mixed reality and plans to share some of the technology in the future.


