What does the rise of virtual production mean for the CG community? 3D World finds out from the experts
Whether you’re a 3D artist, VFX professional, filmmaker or student, you can’t go far these days without hearing the words ‘virtual production’. Studios and creators are constantly telling us that virtual production is revolutionising filmmaking, but what does it mean? What tools and techniques does it utilise? And, perhaps most importantly, why is it important to the 3D community at large? Will these high-end techniques become democratised? To answer these questions, 3D World spoke to three very different creators, each using virtual production in unique ways.
Satore Studio, masters of visual design and immersive experiences, created a unique workflow using virtual production. Ryan Garry, film director and founder of Unlimited Motion Ltd, is utilising virtual production techniques to create the world’s first micro-budget motion capture feature film. Finally, innovative production studio Flipbook is perfectly placed to tell 3D World about the use of real-time engines and virtual production in the future of film and TV production.
WHAT IS VIRTUAL PRODUCTION?
Put simply, virtual production is the process of mixing traditional live-action filmmaking with computer graphics in real time. It means that filmmakers can see or interact with digital elements in-camera and on set, and it encompasses a wide range of cutting-edge technology, from the huge LED screens used to transport audiences to a galaxy
far, far away in The Mandalorian to the VR hardware that brought Disney’s The Lion King to life.
“The term is used very loosely at the moment,” explains Tupac Martir, founder and creative director of Satore Studio. “In the simplest terms, virtual production is the ability to use technology to bring environments to life, which can then be caught with traditional camera setups.”
Ryan Garry is quick to point out that virtual production is much more than the headline-grabbing advancements we’re used to seeing. “It’s the blending of physical and digital worlds,” he tells us. “This could be in the form of mega-sized LED walls, but also includes more affordable options like camera tracking – wherever you move the camera in the real world it moves in the virtual. By doing this, you can see VFX elements live on the camera monitor as if they exist in the real world.”
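The camera-tracking idea Garry describes can be sketched in a few lines of code. This is purely illustrative pseudocode in Python, not the API of any real tracking product: each frame, the pose reported by the physical camera's tracker is copied onto the virtual camera, so the CG elements are rendered from the same viewpoint the operator sees on the monitor.

```python
from dataclasses import dataclass

# Illustrative toy model of camera tracking, not a real tracking
# system's API. The tracked pose of the physical camera is mirrored
# onto the virtual camera every frame, so VFX elements appear to sit
# in the real world when viewed on the camera monitor.

@dataclass
class Pose:
    position: tuple  # (x, y, z) in metres
    rotation: tuple  # (pitch, yaw, roll) in degrees

class VirtualCamera:
    def __init__(self):
        self.pose = Pose((0.0, 0.0, 0.0), (0.0, 0.0, 0.0))

    def sync_to(self, tracked: Pose):
        # Mirror the physical camera's movement in the virtual scene
        self.pose = tracked

# One frame of (hypothetical) tracking data arriving from the set
tracked_pose = Pose((1.2, 0.0, 3.5), (0.0, 45.0, 0.0))
cam = VirtualCamera()
cam.sync_to(tracked_pose)
print(cam.pose.position)  # → (1.2, 0.0, 3.5)
```

In a real pipeline this per-frame sync is what engines such as Unreal handle for you, with latency compensation and lens calibration layered on top.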
These capabilities are having a monumental effect on film production. “It enables VFX to be seen and changed in real time on set,” adds Garry. “If the director doesn’t like the size of an explosion, you can just modify it in-engine. It might not be the final render that’s used, but it massively speeds up the post-production workflow because a lot of the creative decisions have already been made. As these technologies advance and become more affordable, less imagination has to be spent wondering what the film is going to look like after post-production. You can focus more on the world, both real and virtual, in front of you.”
As virtual production continues to become the norm across VFX it promises to have several positive effects for artists and the industry as a whole. “In addition to allowing for more creativity,” says Martir, “virtual production offers flexibility. Game engines have become an essential piece of the puzzle. We can combine many of the techniques used in gaming, VR, AR and MR to create wonderful environments, trigger effects and movements that can react to the actors or the camera – leading to more intricate scenes and projects.”
For indie creators like Garry, virtual production is already proving a game-changer. “In my film, for instance, I can 3D scan environments and recreate scenes from the film in VR,” he adds, “or, I could scan and animate actors and bring them to your living room with AR. As the technology becomes more accessible and the skills more widespread, this will be able to be done on more productions and at a lower cost.”
Ben Haworth, co-founder and creative director of Flipbook, predicts that artists will become more integrated into the entire production, rather than just in post. “Ultimately, this will help the final project, especially during compositing, and will open up new job opportunities for the artists too,” he adds. “It’s actually a lot of fun being on set when you can be, and this will allow for more creative direction from people who previously didn’t have a chance.”
LEVELLING THE PLAYING FIELD
Whilst virtual production is still an emerging technology, that doesn’t mean it is reserved for large
studios with infinite resources. Satore Studio is a multidisciplinary creative studio that sees virtual production as an opportunity to level the playing field. Satore was experimenting with virtual production workflows long before The Mandalorian highlighted what LED screen technology could do. “For us, it’s about taking everything we’ve learned over the past six years and using it to create a compelling story, whether that be in film, TV, broadcast or live events,” says Tupac Martir.
Satore recently partnered with MBS Equipment to demonstrate the power of virtual production in a lifelike demo. The short film was built on the idea that virtual production makes any number of realistic locations achievable. Despite that, the team kept most of the locations behind the actor fairly ordinary, from a street corner and nightclub to a forest and a car. Only at the end when the camera pulls back, revealing a film crew and the large LED wall, does the audience realise they’ve been fooled.
The short was shot over two days, with just a handful of props and furniture pieces to flesh out certain scenes. CG backgrounds did much of the heavy lifting, even fooling the experts. “Everything you see in the background of the demo was final pixel VFX, which is simply incredible,” says Martir. “After the demo was done,
I was watching it and I couldn’t remember seeing a market stall on set. It turns out even I was fooled!”
To visualise their convincing environments, Satore created two CG backgrounds using a combination of tools that included Maya, Disguise, Houdini, ZBrush and Substance. Six more scenes previously created by an archviz company were then optimised for use in the demo and rendered using RenderMan, Octane and Arnold. The environments were housed on Universal Pixels servers running Unreal Engine. Each virtual background was created to run at 25fps, using a proprietary workflow developed by Satore. It was all tied together by an Ncam tracking system.