-smash-
Content Director
Speaking at the Vision Summit 2016, Joe Ludwig gave an overview of the SteamVR, OpenVR, and Unity VR APIs, and what power they bring to VR application developers. I've gone ahead and basically done a transcription of his presentation in order to provide a document for future reference. I think this is important because Joe does outline some long-term goals of SteamVR and OpenVR, both very important sets of standards for the VR ecosystem.
In the spoiler tag below, you can find a video of Joe Ludwig's presentation, as well as my transcript. Some of the information is new, some is a refresher, but most of it is technical jargon, so you have been warned. Enjoy!
In the beginning of his presentation, Joe Ludwig first took some time to explain the differences between SteamVR and OpenVR.
SteamVR is considered to be the work that Valve is doing in VR. That includes the technology they've developed that's shipping with the HTC Vive, the APIs that are provided to application and hardware developers, and Steam itself running in VR.
SteamVR provides an in-application VR dashboard. Steam itself is included in this dashboard, and the Steam client uses public APIs to provide its overlay in the dashboard. The set of operations that Steam provides are what you would expect: launch games, browse the store, buy games, chat with friends, etc. The dashboard also provides the user with access to VR settings from inside of the VR experience. It also provides access to controls of the whole system, such as turning off controllers or exiting the VR system. SteamVR also provides the render-model API, which gives access to high-quality models of whatever device the user is holding in their hand at the moment. This includes animation data for the device, so the model animates to match the current state of the controller, e.g. a button is pressed, or a finger is touching a point on the track pad. All of this mesh and texture data is provided to the application so that it can recolor or light the controller model in an appropriate way.
SteamVR also provides a system to allow the user to define the boundary of obstacles that exist around them in their room - this is part of the Chaperone system. If a user gets too close to the boundary, an outline will appear in their HMD to warn them that they're about to collide with the real world. For headsets that have a forward-facing camera, such as the HTC Vive, Chaperone includes the camera view of what the user is about to run into. Inset within the Chaperone bounds is the safe-play area, which gives applications a guide for where to place objects for the user to interact with. If your application puts something inside of that blue rectangle, the user will be able to reach it. This all scales from a seated experience up to a full-room experience.
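The boundary warning described above boils down to a proximity test between the tracked position and the edges of the play-area rectangle. Here's a minimal, self-contained sketch of that idea - this is not the actual OpenVR Chaperone API, and the sizes and warning distance are assumptions for illustration:

```cpp
#include <algorithm>
#include <cmath>

// Axis-aligned play area centered on the origin, like the "blue
// rectangle" safe-play area described above (dimensions are assumed).
struct PlayArea {
    float halfWidth;   // meters along X
    float halfDepth;   // meters along Z
};

// Distance from a tracked position to the nearest boundary edge.
// A negative result means the user has already crossed the boundary.
float DistanceToBounds(const PlayArea& area, float x, float z) {
    float dx = area.halfWidth - std::fabs(x);
    float dz = area.halfDepth - std::fabs(z);
    return std::min(dx, dz);
}

// The warning outline fades in as the user approaches the edge.
bool ShouldShowChaperone(const PlayArea& area, float x, float z,
                         float warnDistance = 0.5f) {
    return DistanceToBounds(area, x, z) < warnDistance;
}
```

A system like this would run the check against every tracked device (head and both controllers), showing the bounds if any of them gets close.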
The application interface also exposes a notifications API. Right now Valve is using it for Steam toast alerts, but VR applications can use this API for whatever they want. These notifications can appear wherever the user is, and the user can interact with them however the developer wants. There's also support for a VR keyboard that allows the user to enter text with the controllers.
In addition to these user-facing tools, SteamVR also provides performance timing tools to developers. These tools graph how much CPU and GPU time is being spent by each component of the system, which helps developers determine where the bottlenecks are.
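The reason per-component timing matters so much in VR is the hard frame budget: at the Vive's 90 Hz refresh rate, everything must finish in roughly 11.1 ms or the frame is dropped. A simple bottleneck check in the spirit of that tool - the component names and numbers here are illustrative, not SteamVR's actual output - might look like:

```cpp
#include <string>
#include <vector>

struct ComponentTiming {
    std::string name;
    float cpuMs;
    float gpuMs;
};

// At 90 Hz each frame must complete in ~11.1 ms, or the compositor
// misses vsync and the HMD drops a frame.
constexpr float kFrameBudgetMs = 1000.0f / 90.0f;

// Returns the name of the most GPU-expensive component if the frame
// blows its budget, or an empty string if the frame fits.
std::string FindGpuBottleneck(const std::vector<ComponentTiming>& timings) {
    float totalGpu = 0.0f;
    const ComponentTiming* worst = nullptr;
    for (const auto& t : timings) {
        totalGpu += t.gpuMs;
        if (!worst || t.gpuMs > worst->gpuMs) worst = &t;
    }
    if (totalGpu <= kFrameBudgetMs || !worst) return "";
    return worst->name;
}
```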
OpenVR is the pair of APIs that Valve provides for interacting with the VR system.
The first API is the API that's used for developing VR applications. This includes providing object transforms, an interface to the compositor to send textures to display on the HMD, up-to-date input state of the controllers, access to device models at runtime, access to the user's Chaperone configuration, and more. Supporting this API gives developers the flexibility of accessing not only current VR hardware in an abstract way, but also hardware to come from new and existing manufacturers.
The other OpenVR API is for devices - the driver API. Hardware developers use this API to add new devices to the set of things that work with OpenVR. When hardware developers use this API, existing applications have immediate access to these new devices. So, for example, as Joe Ludwig said, if 100 OpenVR applications ship this year, and next year a hardware vendor releases a new OpenVR driver for their hardware, that hardware will immediately gain access to all 100 of those applications. And this happens without application developers having to update their titles.
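The decoupling Joe describes - new drivers instantly working with existing applications - is a classic abstract-interface pattern: applications code against an interface, vendors ship implementations, and a runtime registry connects the two. Here's a hedged, self-contained sketch of that idea; these class names are illustrative and are not the real OpenVR driver interface:

```cpp
#include <memory>
#include <string>
#include <vector>

// Applications are written against this abstraction only; they never
// reference a concrete hardware type.
class ITrackedDevice {
public:
    virtual ~ITrackedDevice() = default;
    virtual std::string Name() const = 0;
    virtual void GetPose(float out[3]) const = 0;
};

// A hardware vendor ships a new driver by implementing the interface.
// No application needs to be recompiled to support it.
class NewVendorController : public ITrackedDevice {
public:
    std::string Name() const override { return "NewVendorController"; }
    void GetPose(float out[3]) const override {
        out[0] = out[1] = out[2] = 0.0f;  // stub pose for the sketch
    }
};

// The runtime's registry: applications enumerate whatever devices the
// installed drivers expose at the time they run.
std::vector<std::unique_ptr<ITrackedDevice>>& DeviceRegistry() {
    static std::vector<std::unique_ptr<ITrackedDevice>> devices;
    return devices;
}
```

An application that only iterates DeviceRegistry() automatically supports any device registered after it shipped, which is exactly the 100-applications scenario Joe describes.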
So in conclusion, where OpenVR is the API, SteamVR is the customer-facing name that users actually install as part of a larger system.
Valve used the plugin firsthand to develop the Secret Shop VR demo that debuted at The International 2015. Secret Shop uses characters from Dota 2 and pulls them into a 5-minute interactive story. Those assets were pulled in straight from Dota 2, and the demo was built up from there. The plugin was also used to build:
- The Room Setup tool that the user runs to tell the system where the physical obstacles are in their environment.
- Demo-transition content.
Valve faces two challenges with the SteamVR Unity plugin, and both have to do with performance.
For one thing, traversing the game scene is slow. Because SteamVR is supported through a plugin, it doesn't have access to the VR-specific optimizations that Unity has added to the engine. With a plugin, you have to render the scene from two independent cameras, so the scene is traversed twice, effectively doubling the cost of that work every frame. To fix this problem, Valve has a few things they'd like to do.
First, OpenVR is being added to the native VR API in Unity 5.4. Valve has been working with Unity on this, and it should be in the 5.4 beta in a few weeks. This will be a free integration for all Unity developers. This means that the SteamVR plugin is going to change. Some of the work that the plugin does in Unity 5.3 will be moved over to the native Unity VR API in 5.4 - specifically rendering and tracking. Features that are not supported by the Unity VR API will continue to be supported by the plugin, and that includes controller input, overlays, and render models.
When you write an application to the Unity VR API, it selects the appropriate backend: the Oculus SDK, OpenVR, or a mobile or console VR SDK. But if you are writing your application for a platform that is not yet supported, you will continue to go through the SteamVR plugin.
Lighting, specifically dynamic lighting, is a big part of the Enhanced Rendering plugin. Level designers and artists want to include as many dynamic lights as possible because it increases the richness of a scene. But having many dynamic lights has a cost, and that's normally where deferred rendering comes in. Unfortunately, deferred rendering does not support MSAA, which is very important for VR experiences. So with the Enhanced Rendering plugin, Valve is taking a different path: dynamic lighting is still the goal, but instead of deferred rendering, they're going to specifically provide better support for dynamic lights in Unity's forward renderer. The plugin supports up to 10 shadow-casting lights per draw call, an upgrade from current Unity, which supports only 4 per draw call. And because the plugin still uses the forward renderer, MSAA remains available.
The Enhanced Rendering plugin itself is easy to use. It adds a camera component to the camera properties, which lets you control shadows and also hide materials that already use the faster shader, making it easy to find the ones that haven't yet been switched over to the new model. There's also a new realtime light component to set lighting parameters. Finally, there's a new material shader. The Enhanced Rendering plugin should arrive on the Unity Asset Store for free around GDC 2016, in early March.