
Adam in real time

Sammy Maine speaks to Oats Studios’ VFX supervisor Chris Harvey about the team’s work on their short film series, ADAM, built with the Unity game engine


Back in 2016, the team behind the Unity game engine created a proof-of-concept short film, titled ADAM, to show off the engine’s cinematic abilities. While it was never intended that the story of a robot escaping a prison would eventually turn into a fully fledged series, the film garnered enough attention that Unity Technologies knew they had to continue the tale. They turned to Neill Blomkamp, a director who has gained critical acclaim for his depictions of dystopian futures and their robots, and his team at Oats Studios to help complete the task.

The District 9 director launched Oats Studios only last year. The idea behind the studio was to create a space where he and his team could experiment with short films and release them for free online, allowing him to connect with his audience directly.

ADAM, built in a game engine that allowed them to render graphics in real time, was the perfect experiment for the team, so naturally they picked up the project, with Blomkamp writing the rest of the story. Alongside him was Oats Studios’ VFX supervisor Chris Harvey, who was tasked with bringing the ADAM world to life using a game engine he had never worked with before.

“I’ve been wanting to do this for probably over 15 years. I’ve been telling colleagues that eventually one of these days we’ll be using game engines [for] all CG stories and then use them for the stuff we do at VFX,” Harvey explains. “It really was an experiment. Can we tell our stories using a real-time engine? Because if we can, it opens up certain avenues that we can do. We put the limitation on ourselves, because you can use a game engine as an offline renderer like you would any other renderer. But we said no, we want it to actually run at 30 frames per second in real time and you know, it opens up doors like VR. It opens up being able to tune things very quickly and respond to social political commentary. It opens up a lot of interesting avenues. Oats is an experiment in and of itself, so the idea of trying something that hasn’t been done very much before was appealing.”

With Unity being so new to the team, Harvey says they would often learn as they went along. Rendering in real time meant they had to shift their ingrained working methods into unknown territory, throwing out dailies and watching footage on their computers instead of rendering out the files and compiling them in QuickTime. “We could render the whole film out at 4K in I think two to three hours,” he adds. “The film’s just running live the whole time, it just exists as a whole. You can look at it, you can move around in it. It really does change things, you can experiment with creativity. There are trade-offs and certain qualities that you just can’t do in a real-time engine; it’s just not there yet but at the same time, all these tools have a place. It’s insane not to really consider a real-time engine for all CG.”

Working in a real-time engine was new to everyone, so Oats Studios decided to stick to what they knew in other aspects of the filmmaking. “We approached it from a live-action standpoint,” Harvey says. “We were able to adapt the engine to our workflow rather than the other way around, which I think is a big stumbling block for some people. The reason we always do it that way is because starting with real life sort of anchors it in reality. You can stray in making creative choices and do some weird things you can’t do in real life with CG, but if you at least have it anchored in something that’s real, I think it helps with the believability.”

Harvey and his team went out and did real location scouting, blocking their cameras in a desert in California. They shot 30,000 photos of it and, using photogrammetry, digitised that world and brought it into the engine. They also created the props and clothing for the characters using the same photogrammetry process, with Oats Studios’ costume designer building all the clothes practically. Harvey and his team would then scan the clothes and photograph them, later shooting a reference video of how they moved. “We’d replicate all of that via VFX and stream them back into the engine,” he says. “It was definitely a learning curve.”
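
Photogrammetry pipelines like the one described here generally start by detecting and matching features across overlapping photos before solving for camera positions and a dense textured mesh. The article doesn’t detail which software Oats used, so the Python sketch below only illustrates that first matching step with OpenCV; the image filenames are placeholders, not files from the production.

```python
# Illustrative first step of a photogrammetry pipeline: find and match
# features between two overlapping photos. Generic sketch only, not
# Oats Studios' toolchain; the file names are placeholders.
import cv2

img_a = cv2.imread("desert_0001.jpg", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("desert_0002.jpg", cv2.IMREAD_GRAYSCALE)
assert img_a is not None and img_b is not None, "update the placeholder paths"

sift = cv2.SIFT_create()
kp_a, desc_a = sift.detectAndCompute(img_a, None)
kp_b, desc_b = sift.detectAndCompute(img_b, None)

# Match descriptors and keep only confident matches (Lowe's ratio test).
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(desc_a, desc_b, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

print(f"{len(good)} reliable correspondences between the two photos")
# A full pipeline repeats this across all the photos, solves for camera
# poses (structure from motion), then reconstructs a dense textured mesh
# that can be brought into the engine.
```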

When it came to capturing the faces of their actors, things got a little more difficult. To get the highest resolution and the best capture data, the actors sat still while nine cameras captured 60 frames per second as they spoke. As the actors weren’t allowed to move their heads, their eyes wouldn’t dart around like they would during their stage performances. “We had a reference video on the stage with head-mounted cameras so animators could go and match their eye performance, but then you have an eyelid tracking problem and it actually looks quite disturbing when parts of a body aren’t moving the way we have been trained since birth to expect them to,” Harvey says. “We had to come up with some fancy rigging solutions. Our rigger Eric Legare came up with some amazing workarounds so that stuff would all bind back together.

“He’s a genius rigger. He’s got the body rig that you bring the motion capture onto, and he’s got the head rig which is actually just a per-vertex animated model that gets stitched, and then he’s got a third rig that goes underneath in the head,” Harvey continues. “Often in rigging you run into these circular loop problems where you want one thing to affect something, but that thing it’s affecting needs to affect the thing that’s affecting itself. You have these weird circular dependencies and he was able to work around that somehow; he had a bunch of blend shapes as well as bones and then he was able to have them follow and yet drive the same mesh. He created some circular dependency loops that actually functioned properly and we were able to override some of the anomalies in the data.”
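
The stitching Harvey describes, where a per-vertex animated head has to sit on a motion-captured body, essentially comes down to blending two sources of vertex positions across the neck seam. The NumPy sketch below is a loose, hypothetical illustration of that blend rather than the actual rig Legare built; the function name and toy data are invented for the example.

```python
# Hypothetical illustration of stitching a per-vertex animated head cache
# onto a motion-captured body: blend the two position sources across the
# neck seam with per-vertex weights. Not the actual Oats rig logic.
import numpy as np

def stitch_head_to_body(body_positions, head_positions, seam_weights):
    """Blend two (N, 3) vertex position arrays for the same mesh.

    seam_weights is an (N,) array: 1.0 where the head cache should win
    (the face), 0.0 where the body rig should win (the torso), and a
    smooth falloff across the neck seam in between.
    """
    w = seam_weights[:, None]               # broadcast to (N, 1)
    return w * head_positions + (1.0 - w) * body_positions

# Toy example: 5 vertices running from torso (weight 0) up to face (weight 1).
body = np.zeros((5, 3))
head = np.ones((5, 3))
weights = np.array([0.0, 0.25, 0.5, 0.75, 1.0])

print(stitch_head_to_body(body, head, weights))
```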

In order to help with the hurdles of working with an entirely new engine, Oats Studios hired five new team members who had worked in real time before – “one environment guy, one character guy, one technical animation TD guy who could write some little tools, a back-end guy and an effects guy. The rest of the team was just standard VFX; they’d never worked in a real-time engine before so the Alembic tool was a big deal. The Alembic importer allowed us to work in a way that we were used to working; you just save it out as an Alembic file, which is like a geometry cache file, and it would stream back into the engine. That single tool was a key component in the whole thing, in allowing us to do what we did.”
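
A geometry cache stores the finished deformation as baked per-frame vertex positions, so the engine only has to stream and display them rather than evaluate a rig. The sketch below mimics that idea with a hand-rolled NumPy cache; it is not the Alembic format or Unity’s actual importer, just an illustration of what saving a cache out and streaming it back into the engine means.

```python
# Toy geometry cache: bake per-frame vertex positions to disk, then stream
# them back one frame at a time. Illustrates the idea behind an
# Alembic-style cache; NOT the Alembic format or Unity's importer.
import numpy as np

def bake_cache(path, frames):
    """frames: list of (N, 3) vertex position arrays, one per frame."""
    np.savez_compressed(path, *frames)

def stream_cache(path):
    """Yield one frame of vertex positions at a time, in baked order."""
    with np.load(path) as cache:
        # Arrays are saved as arr_0, arr_1, ...; sort them numerically.
        for key in sorted(cache.files, key=lambda k: int(k.split("_")[1])):
            yield cache[key]

# Bake 3 frames of a 4-vertex mesh sliding along x, then play them back.
baked = [np.array([[i, 0.0, 0.0]] * 4, dtype=np.float32) for i in range(3)]
bake_cache("shot_010_cache.npz", baked)

for frame, verts in enumerate(stream_cache("shot_010_cache.npz")):
    print(f"frame {frame}: first vertex at {verts[0]}")
```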

Harvey adds that Unity Technologies were paramount to the success of ADAM, offering support and a few custom tools.

Cascaded shadow mapping (CSM) was used on the environment geometry, while characters were handled by a set of spotlights

Below right: Harvey and his team did real location scouting, shooting 30,000 photos of a Californian desert for photogrammetry

Below left: Most of the environment design was done in Maya, allowing Oats Studios to have a working version of the world in Unity

Left: Oats perfected their shadow quality by overriding the internal deferred shading and pushing PCF filtering to 7×7 resolution
