3D World

All Grown Down

Three experts weigh in on the past, present and future of de-aging technology and its implications for digital human creation


We take a look at some incredible examples of de-aging technology in film and TV, and what the future may hold

Digital de-aging techniques are a hot topic in Hollywood right now. From The Irishman to Gemini Man, de-aging is becoming a widely accepted way of telling stories that require younger versions of characters. It is also just one aspect of the wider conversation surrounding digital humans and their role in the modern world. 3D World has gathered Digital Domain’s Darren Hendler, Chris Nichols from Chaos Group Labs and Olcun Tan of Gradient Effects to deliver a comprehensive guide to the current state of de-aging in VFX.

2019 has undoubtedly seen the burgeoning technique make more headlines than ever before, with everything from superhero blockbusters to prestige dramas utilising de-aging. “I believe we are seeing so much more of this work primarily because it is becoming more affordable,” says Darren Hendler, director of the Digital Human Group at Digital Domain. “While the process is still very expensive, it is entering a range where big-budget movies can afford it.”

COMING OF AGE

De-aging may be setting the film world alight right now, but its origins can be traced back to the earliest days of cinema, as far back as filmmakers have wished to show younger or older versions of their characters on screen. “In the past, many filmmakers would simply cast a younger version of an actor and we all just understood that this character was playing the main actor at a different age,” reflects Hendler.

“Before 3D and 2D methods burst to the forefront, aging was done with makeup,” says Chris Nichols, director of Chaos Group Labs, “pretty successfully, I might add.” In 1983 David Bowie was convincingly aged up for The Hunger and just a year later F. Murray Abraham was similarly transformed for Amadeus.

“However, it’s much harder to de-age someone practically if the difference is more than ten years, which is one of the reasons modern methods have gained ground,” adds Nichols. “The techniques have only gotten better too. A younger Patrick Stewart might have looked like smooth plastic in X-Men: The Last Stand, but Samuel L. Jackson looks like he just walked off the set of Pulp Fiction in Captain Marvel and it’s only going to get better.”

Nichols highlights David Fincher’s 2008 film The Curious Case Of Benjamin Button as a landmark moment in digital de-aging, adding: “In my mind, it is the first time the Uncanny Valley was crossed via Digital Domain’s work on the older version of Brad Pitt.” In the 11 years since the film’s release, the technique has become increasingly prevalent on the big screen. “Sometimes new technology inspires people to try things that were not possible before, especially when it’s successful,” Nichols continues. “Some stories might have been shelved for years not knowing how they could pull it off, so when a new technique appears, those projects come off the shelf.”

When digitally altering the age of a performer there are two main approaches. “There is a fully 3D approach where a body double plays the actor’s younger self,” says Hendler. “In this approach, the actor’s head is removed and replaced with an entirely CG-performed version.” The second is a 2.5D approach in which plates are shot with the actor, before their head is smoothed and warped, removing wrinkles and making them appear younger.

Each of these approaches has evolved over time and brings its own set of challenges. The 2.5D approach has been in use for a long time and can be incredibly convincing. “In the hands of a skilled artist they are hard to spot,” adds Nichols. “The artist is key though, because if these effects aren’t done right, the skin will look too soft or high contrast.”

Tracking and paint tools have evolved considerably on the 2.5D side, making it easier to produce high-quality work with less manpower. “In the past, this approach could only be used for a few shots and was extremely laborious,” explains Hendler. “It’s still very manual, but now, with new tools, companies are able to create several hundred shots using this process.”

A wholly 3D approach allows artists more control over the end result. “Massive amounts of detail can be preserved with a full head replacement,” Nichols explains. “You get all the pores and even sub-pore level detail on someone’s face. You can also make a head do whatever you want. So the challenge really comes down to animation. Preserving subtle movements can be tough.”

The benefits of this technique can be seen in The Curious Case Of Benjamin Button, with an aged-up version of Brad Pitt, who plays the titular character. “One of the benefits of aging someone who isn’t old is that the audience doesn’t have a mental reference for them at that age,” adds Nichols. “This gives the artists more freedom in their work. Nowadays, an actor would be scanned before a head replacement, but on that project, a maquette was created of his head. The sculpt was incredibly realistic and the skin looked spot on. From there, the model was scanned and the team had their reference.”

When it comes to this approach, skin shading techniques have evolved alongside scanning and facial capture, allowing artists to create more and more realistic digital humans. “Methods are continuously evolving but there have also been a lot of great technology improvements,” Hendler continues. “When creating a younger version of an actor, having a moving 3D mesh of the actual actor is massively helpful. Additionally, everything along the workflow from animation to lighting to rendering, including the tools, has gotten better and the result is more realistic. Overall, it is getting easier and more cost-effective to do this type of work.”

THE FOUNTAIN OF YOUTH

Gradient Effects have added to the continued evolution of de-aging techniques with their work for HBO’s The Righteous Gemstones, which saw them de-age John Goodman for an entire episode, delivering 30 minutes’ worth of photorealistic VFX. For this, they used Shapeshifter, an AI-assisted tool that allows them to ‘reshape’ an individual frame and the performers in it, before extending the results across the rest of a shot.

Designed to simplify the traditionally complex and time-consuming process of de-aging, Shapeshifter was developed over a number of years. “Gradient is all about rethinking existing workflows,” says Olcun Tan, owner of Gradient Effects. “I would say we are more like a technology company that is using VFX as a stomping ground.” Considerable time is devoted to Gradient’s research and development, which grows with each new project. “Shapeshifter is a great representation of that mindset.”

Shapeshifter uses filmed footage as its base, maintaining the actor’s performance after they have been digitally de-aged. “Our technology breaks down the performance into sub-motion data,” Tan explains, “which can then be modified in 3D. Once the reshaping is done in 3D, it translates back to the filmed plate.”

In the fifth episode of The Righteous Gemstones viewers are transported back to 1989, necessitating a younger version of Dr. Eli Gemstone, played by John Goodman. Shapeshifter began the de-aging process by analysing the underlying shape of Goodman’s face, before extracting important anatomical characteristics, like skin details, stretching and muscle movements. With these extracted elements saved as layers to be reapplied at the end of the process, artists could start reshaping Goodman’s face without breaking the original performance. Although artists could tweak additional frames in 3D as needed, they often found it unnecessary, making the de-aging process more or less automated.
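
To make the layering idea concrete, here is a deliberately simplified sketch in Python: a frame is split into an underlying shape and a fine-detail layer, only the underlying shape is edited, and the saved detail is reapplied afterwards. This is a frequency-separation analogy assumed purely for illustration; the synthetic image, the sigma values and the ‘reshape’ step are all stand-ins, not Gradient’s actual sub-motion pipeline, which works on 3D data extracted from footage.

```python
# Toy sketch of the layer idea described above: split a frame into an
# underlying shape plus a fine-detail layer, edit only the underlying shape,
# then reapply the saved detail so the original "performance" survives.
# Everything here is a stand-in; it is not Gradient Effects' actual code.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
frame = gaussian_filter(rng.random((128, 128)), sigma=4)  # stand-in for a filmed plate

base = gaussian_filter(frame, sigma=6)        # underlying shape of the "face"
detail = frame - base                         # skin-detail layer, saved for later

reshaped_base = gaussian_filter(base, sigma=3)  # stand-in for the artist's reshape
deaged = reshaped_base + detail                 # reapply the saved detail layer

print(float(np.abs(deaged - frame).mean()))     # how much the frame changed
```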

“While most productions are limited by time or money, we can turn around award-quality VFX on a TV schedule,” says Tan. “One of the first shots of the episode shows stage crew walking in front of John Goodman. In the past, a studio would have recommended a full CGI replacement for Goodman’s character because it would be too hard or take too much time to maintain consistency across the shot. With Shapeshifter, we can just reshape one frame and the work is done.”

This is possible because Shapeshifter continuously captures the face and all of its essential details, using the source footage as its guide. With the data being constantly logged, artists can extract movement information from anywhere on the face, whenever they want, negating the need for expensive motion capture stages, equipment and makeup.

“YOU GET ALL THE PORES AND EVEN SUB-PORE LEVEL DETAIL ON SOMEONE’S FACE” Chris Nichols, director of Chaos Group Labs

“I think it’s consistency and maintaining quality throughout the work,” says Tan, reflecting on how the studio achieved such realistic de-aging results. The end result depends on more than just technology; it takes a combination of artistic skill and technical vision, which is why the quality of de-aging effects can vary so wildly. “There is this huge misconception that knowing how to use a piece of software qualifies someone as an artist,” Tan adds. “Unfortunately, the industry and schools nowadays create more operators who know the software, but are unable to create anything outside of its boundaries.”

He continues: “Once someone told me, you cannot be an artist and be strong in engineering. I disagree. Without the artistic imagination it’s impossible to create new tools that will work for other artists, as well as scale up the process to hundreds of VFX shots.”

FOREVER YOUNG

Shapeshifter’s AI-driven sophistication offers some clues to where the future of de-aging effects might go. “I think AI and neural networks will play a massive role,” admits Hendler. “Currently, all these de-aging processes are very manual, but we are starting to see glimmers of how machine learning techniques can revolutionise this whole process.”

“AI is going to change a lot of things in the computer graphics world,” says Nichols, “and there is no reason why deepfakes can’t be used for de-aging.” Instead of swapping one performer’s face for another, artists might soon be able to simply replace an actor’s face with that of their younger self.

Making an actor younger is only half the problem, however. The de-aged character also has to act and behave in a way that convinces the audience.

“While not directly related to de-aging, some of the technology that Digital Domain uses to interpret motion via deep learning could possibly be applied to make their facial animation younger,” adds Nichols. “By training a system on a variety of different faces at different ages, the system could learn how to act a certain age.”

AI is just one example of emerging technology influencing the field of de-aging. Tan believes that eventually VFX teams will be working with full digital actors, with studios developing their own performers and creating films entirely in CG. “The reason for this is simply economic,” he explains. “A digital actor doesn’t ask to get paid, doesn’t need to sleep, doesn’t need to join a union, and won’t quit on you.”

“There will come a time where a computer model can be trained on images of an actor now and at a certain age, and learn what it takes to change the images of the current actor into images of their younger selves,” adds Tan. “In this case, we are training a system that becomes good at recognising whether the performance looks like a realistic young version, and then iterating through millions of variations of the generated young actors until we have fooled the system into believing it is the younger version.”
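
The loop Tan sketches is, in spirit, an adversarial setup of the kind used in GANs: one network generates candidate ‘young’ images while a second network judges whether they look like genuine young-actor footage, and the generator iterates until the judge is fooled. The minimal Python sketch below uses random tensors in place of real footage; the network sizes, losses and learning rates are assumptions for illustration, not a description of any studio’s system.

```python
# Minimal adversarial (GAN-style) loop illustrating the idea above: a
# "de-ager" network tries to turn current-age features into young-looking
# ones, while a "judge" network learns to tell reference young footage from
# the generated results. Toy tensors stand in for real images.
import torch
import torch.nn as nn

deager = nn.Sequential(nn.Linear(256, 512), nn.ReLU(), nn.Linear(512, 256))
judge = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 1))
opt_g = torch.optim.Adam(deager.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(judge.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    old_faces = torch.randn(32, 256)   # stand-in for current-age footage
    young_refs = torch.randn(32, 256)  # stand-in for reference young footage

    # Train the judge: real young footage should score 1, generated faces 0.
    fake_young = deager(old_faces).detach()
    loss_d = bce(judge(young_refs), torch.ones(32, 1)) + \
             bce(judge(fake_young), torch.zeros(32, 1))
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

    # Train the de-ager: push its output towards fooling the judge.
    loss_g = bce(judge(deager(old_faces)), torch.ones(32, 1))
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()
```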

Despite recent advancemen­ts, de-aging remains an incredibly complex process and one that is far from the finished article. “New technology is constantly showing up and surprising people,” says Nichols. “I don’t think people could have predicted that in a few years technology like deepfakes would not only be possible but available for nearly anyone to use.”

He continues: “There is still a lot we can learn and a lot of fields we can collaborat­e with, such as psychology, forensics, and virtual assistants. The world of digital humans will go far beyond trying to make older actors look younger.”

“OUR TECHNOLOGY BREAKS DOWN THE PERFORMANCE INTO SUB-MOTION DATA, WHICH CAN THEN BE MODIFIED IN 3D”

Olcun Tan, VFX supervisor and owner of Gradient Effects

Right: Digital Domain’s pioneering de-aging work on Jeff Bridges for Disney’s 2010 film Tron: Legacy

Below: Doug Roble of Digital Domain’s TED Talk on Digital Humans, photography by Bret Hartman

Digital Domain earned an Academy Award for visualising a man who ages backwards in The Curious Case Of Benjamin Button

The team at Digital Domain delivered de-aging effects for 52 minutes of Benjamin Button and over 325 shots

Below: Gradient Effects’ de-aging work on John Goodman for HBO comedy series The Righteous Gemstones

Gradient Effects took just six weeks to turn around 30 minutes of footage featuring a de-aged John Goodman

Their film division has previously utilised Shapeshifter’s technology on projects like Thor

Gradient has studios around the world in Los Angeles, Munich and Montreal
