Imagine you’re developing a video game for a virtual reality headset. The design document lays it all out: after the player overthrows the Zombie Queen, they will make their way to the safe house, where a simple door inside remains the only obstacle between peace of mind and a zombie chomping on a piece of their mind. Your work is cut and dried: make a door that the player can open and close.
This seemingly straightforward interaction has a surprising amount of depth. Should the door be a push door, or a pull door? Should there be a handle, or a knob? Can the player interact with the door at all? Is interacting with a door even fun? Kerry Davis from Valve has a talk that spans nearly an hour on this single topic, and for good reason; VR development is a unique intersection between Human-Computer Interaction and game design, and requires careful consideration to balance the two.
If you’re coming from a development background in another domain, such as app development, you already know the tune: things that look simple can be very complex to implement. This is no less true in VR development. In this article, we’ll discuss the essential considerations that will affect the scope of your VR game early on, how they differ from traditional game development, and how best to meet them head-on.
1. Design for Your Platform’s Tech Specs
The performance of your platform is one of the first things you should consider when designing a VR game. It is typical in game development to save optimization for last, but this can be a trap in VR. The last thing you want is to reach the end of your dev cycle only to realize that, because of your choice of platform, many of your game ideas are either extremely difficult or impossible to get running at an acceptable performance level. The hazards of poor performance are too serious to ignore: if your game suffers from frequent frame drops, you can make your users physically ill—a less-than-fun gaming experience.
Slight variances in platform power don’t matter too much for your game design. What matters is that VR platforms now come in two flavors with a fairly wide gap in computing power between them, and this gap will shape your game design: tethered and untethered platforms.
On tethered platforms (e.g., Valve Index, Oculus Rift), processing happens on a dedicated computer, and video is sent to the headset via a wire. This offers performance at the level of a dedicated gaming computer or high-end game console. On the other hand, untethered platforms have the computing hardware in the headset itself (e.g., Oculus Quest). This hardware is typically about as powerful as a good smartphone or gaming handheld.
Essentially, this choice is the same as when you’re deciding whether to target mobile phones for your game. The difference in power between a gaming computer and a mobile platform can be wide; this can affect the fidelity of your visuals, the number of objects that can appear on-screen, the richness of your physics simulations, and more. For example, if our zombie game calls for thousands of zombies that have complex AI and can bump into each other, it may not be feasible to accomplish on a mobile platform. Prototype early on to evaluate whether your game design fits your target platform. And if you want to deploy to multiple platforms, remember this: it’s a lot easier to scale a game from a low-power platform to a high-power platform than the other way around.
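One way to make “scale up, not down” concrete is to define your quality budget per platform tier from the start. The sketch below is purely illustrative—the tier names, settings, and numbers are all hypothetical—but it shows the idea: author your baseline for the untethered (mobile-class) budget, then opt in to richer settings on tethered hardware.

```python
# Hypothetical quality-tier table. The untethered tier is the baseline;
# the tethered tier scales it up, never the reverse.
QUALITY_TIERS = {
    "untethered": {"max_zombies": 50,  "shadows": False, "physics_substeps": 1},
    "tethered":   {"max_zombies": 500, "shadows": True,  "physics_substeps": 4},
}

def settings_for(platform: str) -> dict:
    """Look up the quality settings for a platform tier.

    Unknown platforms fall back to the conservative untethered budget,
    so a new port never accidentally ships with high-end settings.
    """
    return QUALITY_TIERS.get(platform, QUALITY_TIERS["untethered"])
```

Because the low-power tier is the default, prototyping against it early tells you whether ideas like “thousands of zombies with complex AI” survive contact with your weakest target.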
Once this decision is made, consider the refresh rate of your target platforms’ displays and ensure you meet this number as closely as possible. For reference, the Oculus Quest 2 supports refresh rates of 72, 80, and 90 Hz, while the Valve Index supports 90, 120, and 144 Hz. To reiterate, you must match this refresh rate without frequent frame drops, or you’ll make your users physically ill. The platform makers know this, and accordingly some distribution platforms have a minimum framerate target that you have to meet in order to use them. For example, Oculus requires a minimum of 45 FPS to sell a game in their store. A VR game running at this level is generally considered a poorly performing game, though. Strive to meet the platform’s minimum supported refresh rate at all times.
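A refresh rate translates directly into a per-frame time budget, which is the number your profiler actually has to beat. The helper below is a minimal sketch (the function names are hypothetical, not any engine’s API) showing the arithmetic: at 90 Hz you have roughly 11.1 ms to simulate and render each frame, and at 72 Hz roughly 13.9 ms.

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Per-frame time budget in milliseconds for a given display refresh rate."""
    return 1000.0 / refresh_hz

def dropped_frames(frame_times_ms, refresh_hz) -> int:
    """Count the frames whose render time exceeded the budget.

    In a real engine you'd read these timings from the profiler or the
    runtime's frame-timing API; here they're just a list of floats.
    """
    budget = frame_budget_ms(refresh_hz)
    return sum(1 for t in frame_times_ms if t > budget)
```

Watching this count during playtests, rather than an average FPS figure, is what tells you whether you’re actually holding the refresh rate or merely hovering near it.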
2. Design for Your Platform’s Tracking Volume
You’ll next want to consider how the player will move around the digital world you’ve created. To do this, look at the tracking system your platform uses. This is what allows the headset to know how it’s moving around in real-world space. There are two types of tracking systems: outside-in and inside-out.
Platforms such as the HTC Vive and Valve Index utilize outside-in tracking, wherein the user places sensors in the room that track how the headset and controllers are moving. This offers a limited tracking volume, or the area within which players can physically move and still be picked up by the sensors. This type of tracking system is typically found on tethered VR platforms, where large amounts of computing resources are available for your game to use.
Platforms such as the Oculus Quest utilize inside-out tracking. These devices have the sensors for tracking movement contained within the headset itself, typically using an array of cameras. The lack of external sensors means that the tracking volume for inside-out tracking is theoretically unlimited, allowing for VR experiences that can fill large spaces. Because there are no external sensors to mount or calibrate, inside-out platforms are more suited for high-portability applications such as on-site demonstrations or trade shows.
However, if you’re designing an experience that’s tied to a physical location, you’ll need to make sure the space is well lit and full of details for the cameras to pick up and track—this type of tracking system doesn’t work well in a room with bare walls and a simple floor. Also note that inside-out tracking typically features on untethered platforms, meaning you’ll need to pay careful attention to performance in exchange for portability.
Based on your platform’s tracking volume, the next decision is whether to support a “standing” or “room-scale” experience. In the former, the player moves by abstract means, such as a joystick or a few clicks of a pointer, rather than by physically moving their body (this is known as artificial locomotion). This option works best for systems that don’t offer a large tracking volume, such as the Oculus Rift or PlayStation VR. With a large enough tracking volume, however, you can offer room-scale experiences in which the player physically walks around the tracking volume (known as natural locomotion).
To a certain extent, this decision has been made for you: Always design for a standing experience with artificial locomotion first. It’s impossible to know just how big a player’s gaming space will be—unless you’re checking out their Zillow listing—so consider the player’s ability to walk a nice add-on if they have the means. Obviously, ignore this if your game design is built around room-scale natural locomotion, but remember that you’re losing out on a very large audience by doing so.
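At its core, artificial locomotion is just advancing the player rig each frame based on controller input. The sketch below is a simplified 2D version with hypothetical names (a real implementation would also apply a dead zone, rotate the input by the headset’s facing direction, and handle comfort options like snap turning), but it captures the basic step.

```python
def artificial_locomotion_step(position, joystick_xy, speed_mps, dt):
    """Advance the player's (x, y) position by one frame of joystick input.

    position    -- current (x, y) in meters
    joystick_xy -- joystick axes, each in [-1.0, 1.0]
    speed_mps   -- movement speed in meters per second
    dt          -- frame delta time in seconds
    """
    x, y = position
    jx, jy = joystick_xy
    return (x + jx * speed_mps * dt, y + jy * speed_mps * dt)
```

Scaling by `dt` is what keeps movement speed consistent across the 72–144 Hz refresh rates discussed earlier, so the player doesn’t glide faster on a faster headset.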
3. Consider Your Interactions
If all VR games simply required players to walk around, your work would be done. But now you have to consider how your players will interact with your game using their controllers. For example, in a game where the player paints on a canvas, what actually constitutes the action of painting? These considerations become easier if you decide early on whether you want the controllers to act as the player’s hands or as tools.
As hands, you’ll largely model controller interactions after real-world mechanisms. In our painting example, the player would pick up a brush and “just paint.” This is a very natural way to model interactions, but it comes with a lot of extra steps. For example, if a player can pick up a brush, then they can set it down, drop it or even lose it. What then? There’s such a thing as too realistic—is the user going to find it fun to drop the brush? Be sure to thoroughly consider the implications of this interaction model for your game.
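One common answer to “what then?” is a safety net: let the player drop the tool realistically, but snap it back to a home position if it ends up somewhere unreachable. The class below is a hypothetical sketch of that pattern (names and the flat-floor check are illustrative, not any engine’s API).

```python
class Brush:
    """A hand-held tool that respawns at a home position if the player loses it."""

    def __init__(self, home_position):
        self.home = home_position       # (x, y) respawn point
        self.position = home_position
        self.held = False

    def grab(self):
        self.held = True

    def drop(self, position):
        self.held = False
        self.position = position

    def update(self, floor_y=-1.0):
        # If the dropped brush has fallen below the reachable floor,
        # snap it back home so the player is never stuck without a tool.
        if not self.held and self.position[1] < floor_y:
            self.position = self.home
```

Variants of this idea—respawn timers, a tool belt the brush returns to, or simply making the tool impossible to release—are all ways of keeping realism without letting it strand the player.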
If you’re treating the controllers as tools, you don’t necessarily need to stick to real-life conventions. For example, the controller could be some kind of multi-tool with a UI to change brush settings, as is the case with Google’s Tilt Brush. You could also treat the controller like a mouse of sorts, where the player aims a laser at the canvas and presses a button to apply paint. This functionality is easier to develop, but due to its abstract nature, it will require more front-loading of information to the user before they start painting.
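The laser-pointer variant reduces to a ray-plane intersection: cast a ray from the controller and find where it crosses the canvas. The sketch below assumes a canvas lying on a plane of constant z (a simplification; names are hypothetical, and a real engine would use its physics raycast instead).

```python
def laser_hit(origin, direction, canvas_z):
    """Return the (x, y) point where a controller ray hits the canvas plane
    at z = canvas_z, or None if the ray points away from or parallel to it."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None                 # ray is parallel to the canvas plane
    t = (canvas_z - oz) / dz
    if t <= 0:
        return None                 # canvas is behind the controller
    return (ox + t * dx, oy + t * dy)
```

With the hit point in hand, “painting” is just applying color at that canvas coordinate while the trigger is held—which is exactly the kind of behavior you’d explain to the player up front, since it has no real-world analogue.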