[Gamasutra and GDC are sibling companies under UBM. GDC takes place March 14-18 at the Moscone Center in San Francisco. This is a paid advertisement.]
In the run-up to the show, the Game Developers Conference is highlighting major exhibitors. Check out this series, which includes Q&As with Unity, Epic Games, and Imagination Technologies.
Q: First off, give us a preview of your developer day! What subjects will you be covering?
Marcos Sanchez: The Unity Developer Day, held March 15 from 10:00am-5:00pm PT at Moscone West, will bring together hundreds of Unity developers, executives, and influencers to share best practices and learn how to get the most out of the Unity engine. The event will cover a broad range of topics and provide guidance on how to use various Unity tools and services, including building VR experiences, targeting WebGL, streamlining team workflows with Cloud Build, and monetizing with Ads & IAP, among many others.
Q: Unity 5 has had a lot of significant upgrades, from improved shader rendering, to improved 2D support. What are some features that developers might not be as aware of?
Marcos Sanchez: Unity's latest version, 5.3, has added a variety of new updates and features, ranging from new multi-scene editing tools and automated unit testing, to new VR improvements and samples. We are building a true game development platform and engine, and have made great strides to accommodate all types of games and experiences. And as we keep up this pace of innovation, we will have much more in store in the coming months, such as new graphics rendering, WebGL improvements, and the integration of in-app purchases. More information on our upcoming tools and features can be found on the Unity roadmap.
Q: Now that AppleTV is supported, how many platforms does Unity deploy to? How important is ease of porting to multiple platforms to Unity's success?
Marcos Sanchez: Unity now supports 24 platforms, including iOS, Android, Windows, Windows Phone 8, Windows Store Apps, Tizen, Mac, Linux/Steam OS, Web Player, WebGL, PS3, PS4, PS Vita, Xbox One, Xbox 360, Wii U, Android TV, Samsung SMART TV, Apple TV, Oculus Rift, Gear VR, Microsoft HoloLens, PlayStation VR, Universal Windows Platform, and the New Nintendo 3DS.
Multiplatform support is at the heart of Unity's engine, and is a critical component of our success. With Unity, developers can easily create multiple builds and publish across platforms. Using Unity's Cloud Build, they can automate build creation for multiple platforms, saving significant time and money and letting them focus on developing the best game or experience possible.
Q: Unity 5 comes with a lot more animation tools than in the past. How do these make developers' lives easier?
Marcos Sanchez: Unity 5 took game creation to a new level and includes more features than ever. Our mission is to democratize development and make it easier to make great content, and many of the features we've introduced in Unity 5 do just that. That includes new features - like Physically Based Shading and Global Illumination - that allow developers to create the visually enticing experiences consumers crave. A great example of what's possible now is The Blacksmith, a recent demo developed by Unity that showcases the advanced graphics features that come with Unity 5. These tools allow developers to create impactful games and experiences faster and more cheaply than ever before.
Q: How much has developer feedback affected Unity 5's updates as it moves forward? Any examples?
Marcos Sanchez: Unity wouldn't exist without the developer community, and we are always looking to improve our product, so dev feedback plays a large role in the tools and features we focus on.
We have teams here at Unity that focus solely on engaging developers and responding to feedback, and we always make an effort to incorporate their thoughts into future updates. Our goal is to create a full suite of tools that serves all developer needs, so we encourage the community to reach out with any questions, thoughts or concerns.
Q: How is UE4 supporting VR?
Nick Whiting: UE4 has been in the VR space since 2013, when the first Oculus HD prototype headset was unveiled with an interactive version of our Elemental demo at E3. Since then, we've evolved as the VR space has evolved, and we support all the major platforms, like Oculus, HTC Vive, PlayStation VR, and Samsung Gear VR, out of the box. VR is a technically challenging platform, and our primary goal is to worry about those technical issues, so that content creators can focus on solving the myriad design challenges of the first new medium to come along in a very long time. To do that, we make sure that using UE4 for VR is simple, and works the same across all of the platforms out there. Make your content once and use that same content on the different devices!
In order to get ready for what's next in VR, we've been investing a lot of time in improving our rendering performance to further raise the bar for visuals. We've recently used Bullet Train to push the limits of what the engine can do in VR, and out of that came some huge performance gains in our renderer. We have even more in store, and are working hard with all of the VR platform creators and graphics card manufacturers in order to make sure that anyone can use UE4 to make great VR content.
Q: How have Blueprints evolved over the last few months? We've seen more new developers and non-technical designers using them to create experiences with less need for programmer involvement.
Nick Whiting: Blueprints have come a long way recently. Originally when we were just starting out with UE4 and designing the Blueprint visual scripting system, we weren't expecting people to make entire games with them! Today, all of Epic's internal projects make heavy use of Blueprints. In fact, all of our VR content is entirely Blueprint-based! We do that so that engineers and designers have a common language to speak; when everyone can contribute to prototyping and implementation, games can be much more interactive and alive than previously possible. Because we've gone so broad with our usage of Blueprints, our efforts for the past few months have been focused on squeezing more and more performance out of them by converting them from script code into C++ code. That allows them to run super fast on all platforms, while maintaining the democratized workflow that's become such a big part of our game making process.
Q: UE4 works great on the mobile platforms it supports - when will we see more phones supported by the engine?
Ray Davis: With each update to Unreal Engine come improvements to mobile compatibility and performance, plus additional features that continually broaden the set of devices developers can target. In the last release of UE4 we added several scalability settings which allow developers to automatically scale down their content to maintain great performance. This allows a project to run beautifully on a Galaxy S6 Edge alongside a much older device such as a Galaxy S3, for example. We're also working closely with the teams driving the next generation of mobile graphics, primarily Apple's efforts around Metal, as well as the OpenGL ES successor Vulkan for Android devices. This effort, along with close partnerships with mobile GPU manufacturers, will ensure that UE4 always supports the latest mobile platforms for quality game development.
Q: What do you want developers to take away from your Developer Day at GDC? What are the big points?
Nick Whiting: For VR, we strongly believe that sharing is the best way to make sure VR is truly the revolution that we want it to be. That's why we're so open with sharing not only our successes -- which we roll right into UE4 for everyone to take advantage of -- but also what didn't work out. We are all invested in VR succeeding, and the best way to do that is to help others. UE4's open source model makes sharing our advancements much easier and quicker.
Q: Imagination Technologies has a Developer Day at GDC. We know you primarily as a technology R&D house - what will the company be talking about during your Developer Day?
David Harold: Technology, sure. But crucially, how it affects developers. From what you need to be prepared for, like ray tracing and the new Vulkan API from Khronos, through to how to code to get the best from the technology in new devices. For example, we have a great session on SPIR-V, Vulkan's new intermediate language. Plus, as ever, we'll have some cool guests, including Epic and Codeplay.
Imagination Technologies has a 20-year history of "making worlds better" by creating the revolutionary PowerVR GPU family, which has been used in many arcade, console, OTT, tablet and mobile gaming platforms.
Q: How does your hardware suite scale for developers who might want to create something more custom, like a VR headset or a Kickstarter-funded micro console?
David Harold: Well, for VR we have everything you need in order to split the display and make sure the images work for the eyes with a decent refresh rate and low latency. Those are the fundamental bits you have to get right. We are in lots of VR helmets coming through in 2016, and because PowerVR can deliver good performance at low cost, a lot of those will be very mainstream devices. You'll be able to see some at our booth in the GDC Expo Hall.
For things like Kickstarter, we actually have a part of our business now, IMG Systems, that can help those people get into production and make sure they have a solid product. The thing that kills Kickstarter companies isn't getting up-front interest; it's when they've made the hardware and a couple of months later it all gets returned because the quality wasn't there. Imagination, which has been doing high-volume, consumer-grade manufacturing for decades, can really help there.
Q: You offer support for developers working on PowerVR-enabled mobile devices. Which are these, and how can you help?
David Harold: My gosh! The internet is big, right? So we have space to list them all? PowerVR has shipped in billions of devices across iOS, Android, Windows, and more. We are in numerous handsets, tablets and OTT boxes today - as well as some TVs. Hundreds of devices are shipping. In terms of how we help, we have a great team in-house who work very closely with the engine companies, so that for most developers the work of optimizing for PowerVR has already been done at the engine level.
However, we also have an amazing toolchain, which we give away for free. How can it be free? Well, we make our money from hardware sales, and we know there won't be hardware sales without great content, so we do everything we can to enable that. Add to all that our education program - which includes our IDC and online events, our forums, as well as in-house teaching for larger companies - and you have a really world-class support program.
Q: Seriously, though, can we get a new PowerVR chip to make the Dreamcast 2?
David Harold: Okay, let's be serious about it then. We have GPU IP that is highly scalable and would be ideal for a console. We're in a current handheld console - PS Vita - as well as in the three top OTT boxes with gaming capabilities from the industry giants. So we have some play in the space. However, with our hybrid ray tracing technology we have what I believe is the technology that can set a bold console maker well ahead of the competition. So now we have the question: who is ready to be different? To have games which actually look different from the competition in real ways, not just a few frames faster or with less blur. And to have technology that sets artists free from a lot of the drudgery, and lets them focus on creativity. We'll see, but I think there's a decent chance that someone will have the nerve to make that step before too long.