3D World

Meet the artist: Tim Doubleday

The experienced product manager shares his insight into the world of visual effects and motion capture


The VFX product manager at Vicon talks us through his career journey

Tim Doubleday is the VFX product manager at Vicon, helping develop their motion capture tools such as Shogun. We chatted to him about his career journey, studio setup, and predictions for the future of the industry.

What got you into the industry?

I feel lucky to have grown up alongside the introduction of the internet and PCs becoming available on a commercial scale. I remember the introduction of bulletin boards and the availability of early 3D rendering software like V-Ray and 3ds Max, and being amazed at the possibilities that computer graphics could offer.

It wasn’t until a year into an Art Foundation course that I realised how this emerging industry could not only be amazingly fun but also offer a potential career. This was further cemented while doing a three-year BA in Computer Visualisation & Animation at Bournemouth University. At the time there were only a couple of university courses, and I delayed a year to do the Art Foundation to make sure I got a place. In hindsight this was the right decision, as it introduced me to motion capture along with numerous other animation techniques.

While we didn’t have access to a motion capture system at Bournemouth, it was starting to be used more in videogames like Virtua Fighter and films like The Matrix. It was actually one of my lecturers, John Vince, who introduced me to Vicon and I owe him so much because of this.

What is your daily work life like?

I’m actually in my third stint working at Vicon, after gaining invaluable production experience at companies like The Imaginarium and Audiomotion Studios. Vicon is a great place to work, especially having grown up in Oxford. I get to cycle to work, which is fantastic, and Vicon has a weekly frisbee game which is a great way to split the week. Having done crunch and long days on set, the more relaxed hours are a godsend, although I do miss those days; being on a film set is an incredible feeling and boy do I miss the free snacks from Craft Services! While I have my fair share of meetings, I try to offset these with product-focused days and organise mini shoots when possible. We also try to bring something new to trade shows like GDC and SIGGRAPH, so it’s exciting getting to collaborate on these projects. I’m also lucky in that I get to visit customer sites and see the amazing work they are creating using our software.

What’s your setup and what kind of software do you use?

Since I travel a lot I use a 13-inch Surface Book the majority of the time: having a dedicated GPU means it can run all the 3D software, game engines and Shogun, which is the name of our motion capture software. I then have a desktop machine for any heavy lifting, including project work for shows. I’ve recently upgraded this to an AMD Ryzen 9 3950X and an RTX 2070, which absolutely flies. Our current Digital Human project doesn’t make use of raytracing yet, but I hope that we can add it in some form in the future!

I was trained in Autodesk Maya and then picked up Filmbox, which became MotionBuilder. At the time MotionBuilder was the only 3D animation software to offer anything close to real-time performance.

Now game engines have taken that further by adding support for complex shaders, lighting and even real-time raytracing! We support both Unity and Unreal in our motion capture pipeline, but personally I’ve had more experience using Unreal. While I’m no expert, I can at least set up animation blueprints and create interesting environments and characters for the motion capture to take place in.

Motion capture is often seen as the enemy of keyframe animation. How do you feel about this?

I can see both sides of this argument really. I think over the last 20 years, as the VFX and game markets have exploded, motion capture has become a necessity as a means of delivering enough human-looking motion. This has left traditional animators feeling like work is being taken away from them. I can see their point, but I think there is enough animation work to go round. There are also always things that you can’t motion capture, like creatures and other non-human characters. Not to mention the huge range of games with biped motion that needs to fit into a precise animation loop. The Souls games, for example, might use motion capture as a base but then be heavily keyframed to fit the gameplay so it feels tight and satisfying.

“I SEE MACHINE LEARNING HAVING AN EVEN BIGGER IMPACT ON THE MOCAP INDUSTRY”

How do you see motion capture and keyframe animation working together?

Having delivered final animation on a number of videogames including Battlefield V, I can definitely state that no matter how good your motion capture data is, it’s always going to need an artist to add their touch. Whether it be face or body animation, there are always going to be areas of the animation that don’t hit the exact look you are after. The role of a motion editor has become a well-known part of the motion capture pipeline. It requires both a technical understanding of how the retargeting process works and the ability to massage the motion to get the look you are after. The process of going from motion capture data onto a specific character rig often includes manipulating weights and offsets, which is a real skill in itself.

Where do you see the mocap industry going in the next ten years?

As motion capture has become more widely adopted, the tools have had to become easier and quicker to use. A big part of this is the switch to real time on set, and I see this only becoming more important in the future. Being able to visualise the motion capture shoot in real time within a game engine, using close to final quality assets, has become common in the last five years or so. This requires a lot of prep before the shoot and removes the idea of fixing things in post, although this can still take place if required. TV shows like The Mandalorian are using live camera tracking, driven by optical motion capture, to help remove the need for traditional film sets. By mixing live CG elements and backgrounds rendered across huge LED walls, the need for costly lighting crews and set dressing is greatly reduced.

This allows productions to save money and work within the confines of TV budgets while delivering film-level quality VFX.

I also see machine learning having an even bigger impact on the motion capture industry over the coming years. The de-aging and facial capture work that ILM did on The Irishman is a great example of how machine learning can be used to train a facial rig based on thousands of photos and create much younger versions of the actors. This, combined with removing the head-mounted camera (HMC) and tracking dots from the actor’s face, feels like a real game changer for performance capture. Machine learning is also being used by Ubisoft in their videogames to help train motion models based on motion capture. This helps with delivering incredibly realistic movement to characters at runtime – as an avid gamer I love this technique!

Right: Vicon produce the mocap hardware as well as the software

Avid gamer Doubleday has a workspace personalised with a variety of neat gaming memorabilia, particularly from the likes of the Legend Of Zelda series

Doubleday is equally at home on stage or in the saddle
