CASE STUDY: STAR TREK BRIDGE CREW
BRIAN TATE, GAME DIRECTOR AT UBISOFT’S RED STORM ENTERTAINMENT STUDIO, EXPLAINS HOW VR ENABLED HIS TEAM TO BOLDLY GO WHERE NO ONE HAS GONE BEFORE
HOW DID THE DECISION TO CREATE A VR-SPECIFIC GAME AFFECT THE GAME’S DEVELOPMENT?
From day one we planned it so that players on all platforms, and with different types of VR headsets and controllers, could all play together. The decision to focus on a seated experience fell out of those commitments, and was an immediate win for us. It allowed us to make the game really comfortable with no compromises. It also happens to work beautifully for the game’s setting, and for the social aspects.
THE USER INTERFACE IS A KEY PART OF THE EXPERIENCE. HOW DIFFICULT WAS THIS TO GET RIGHT?
We support gamepads and three different types of hand-tracking controllers, plus we had to build virtual starship controls and character animation that worked really well with all of them. We experimented early on to find what felt good, where to place controls so players could operate them easily, what types of buttons and sliders are easy for anyone to use, and so forth. And all the virtual controls had to have an authentic Star Trek look, feel and sound.
HOW DID THE APPROACH TO CHARACTER MODELLING AND ANIMATION DIFFER FROM THAT OF A REGULAR 3D TITLE?
The game uses a mix of different technologies, many of which were created specifically for this game and our other VR title, Werewolves Within. Very little of it is conventional game animation. The head and upper body are driven by the player’s own head movements, as tracked by the headset. We blend animations to make the avatar’s head position and orientation closely match the player’s own. This is part of the magic of creating what we call ‘social presence’. On top of this, we animate the hands and arms using IK to match the player’s own hand movements if they have hand-tracking controllers. The hands then animate to respond both to what the player is doing with their real hands on the controls and to what their avatar is doing as buttons and sliders are manipulated. The blending is all programmatic, but the raw animations themselves are created in Motion Builder, from a mix of motion capture and hand-keyed content.
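To illustrate the kind of programmatic blending described above, here is a minimal Python sketch of pulling an avatar's authored head rotation toward the player's tracked headset rotation. This is not the studio's actual code: the quaternion layout, function names and the fixed `tracking_weight` are all illustrative assumptions.

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                       # flip to take the shorter arc
        q1 = tuple(-c for c in q1)
        dot = -dot
    dot = min(dot, 1.0)
    theta = math.acos(dot)
    if theta < 1e-6:                    # nearly identical: linear blend is fine
        return tuple((1 - t) * a + t * b for a, b in zip(q0, q1))
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def blend_avatar_head(animated_rot, tracked_rot, tracking_weight=0.9):
    """Blend the authored animation pose toward the player's tracked head
    orientation, so the avatar closely mirrors the real head movement."""
    return slerp(animated_rot, tracked_rot, tracking_weight)
```

A weight near 1.0 makes the avatar follow the headset almost exactly, while leaving a little authored animation in the mix keeps the character from looking rigidly puppeteered.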
WHAT APPROACH DID YOU TAKE WITH THE FACIAL ANIMATION?
We created a real-time voice analysis system, so that phoneme and stress detection drives the avatar’s mouth shapes and facial expressions as a player speaks. Those various shapes and expressions are all blend shapes, and so are the customisation options players use to personalise their avatars. There’s no skeletal animation in the faces. Getting good results required close cooperation between the artists sculpting the shapes, the engineers developing the voice-analysis algorithm, and traditional animators providing critical feedback along the way. We also used a package called Realistic Eye Movements. All we have to do is detect when a player is looking at another player or character, and this system does the rest of the work to make the eyes move naturally.
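The phoneme-and-stress pipeline described above can be sketched roughly as follows. This is a simplified Python illustration, not the shipped system: the phoneme-to-viseme table, the shape names, and the per-frame smoothing factor are all hypothetical stand-ins.

```python
# Hypothetical phoneme-to-viseme mapping; the shape names are illustrative,
# not the game's actual blend-shape assets.
VISEME_FOR_PHONEME = {
    "AA": "jaw_open", "IY": "wide", "UW": "pucker",
    "M": "lips_closed", "B": "lips_closed", "P": "lips_closed",
    "F": "lip_bite", "V": "lip_bite",
}

def update_mouth_weights(current, phoneme, stress, smoothing=0.35):
    """Move blend-shape weights toward the detected viseme each frame.
    `stress` (0..1) scales how strongly the shape is applied; the smoothing
    factor eases weights in and out so the mouth doesn't pop between shapes."""
    target_shape = VISEME_FOR_PHONEME.get(phoneme, "neutral")
    new_weights = {}
    for shape in set(current) | {target_shape}:
        target = stress if shape == target_shape else 0.0
        prev = current.get(shape, 0.0)
        new_weights[shape] = prev + (target - prev) * smoothing
    return new_weights
```

Because every shape that isn't the current target eases back toward zero, consecutive phonemes blend into one another rather than snapping, which is the behaviour the animators' feedback loop would be tuning.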
WHAT PROMPTED YOU TO GO WITH UNITY 3D AS YOUR GAME ENGINE?
When we started out, Unity had a tremendous lead on everyone else in VR support, and our team had already had positive experiences using it to prototype other game concepts, so it was an easy decision.
WHAT OTHER TOOLS WERE USED IN THE PRODUCTION OF THIS GAME?
In addition to Unity, we used 3D Studio Max, Motion Builder, ZBrush, and Visual Studio. Every team member here has an Oculus Rift, HTC Vive or PlayStation VR at their desk. Looking at or testing your work on a 2D monitor just doesn’t give you a good sense of what it’s like in VR.
WHAT ARTISTIC CHALLENGES DID VR INTRODUCE?
There’s no frame to put your artistic composition in, or to anchor your interface elements to, so all the best practices of visual composition and interface design have to be rethought. Things we took for granted as trivial in non-VR games, like displaying simple error messages, can be surprisingly tough to do well in VR. That said, VR made some things easier. The sense of being in another place is incredibly powerful in VR, allowing artists to make stunning environments that people are thrilled just to be in. It’s hard to describe how compelling this can be. And in VR, the focus isn’t on achieving photorealistic detail.