Variety

DIRECTING FOR DIGITAL

Helmers have to change how they interact with actors as CGI takes center stage

- STORY BY TODD LONGWELL

Josh Brolin hardly looked tough shooting his role as supervillain Thanos in Marvel’s “Avengers: Endgame,” dressed as he was in a skintight motion-capture bodysuit with multicolored tracking markings, two HD cameras attached to headgear pointing at his dotted face, and a pole sticking up from the back of his vest holding a cardboard cutout of his character’s countenance above his head. But the sibling directing team of Anthony and Joe Russo still acted as if he were a badass.

“Brolin would love it that we would treat Thanos like he was a gangster character,” says younger brother Joe Russo. “We’d use terminology that would be reflective of that and say he’s a psychopath and he wants control, not he’s a giant purple creature who relates to the universe this way, so he could correlate it to a genre and character motivation that he could access.”

Whether it’s superhero epics such as “Aquaman” and “Venom” or period fantasies including “Mary Poppins Returns” and “Christopher Robin,” today’s directors frequently find themselves trying to elicit convincing performances from actors as they talk to tennis balls representing CG characters to be added later, or, in Brolin’s case, to actors in goofy motion-capture suits, on soundstages with little or nothing in the way of sets or props.

The Russos, who have directed four Marvel VFX spectaculars beginning with 2014’s “Captain America: The Winter Soldier,” say the key to success in this environment is simplicity.

“It’s already abstract working with a green screen and green props that are standing in for digital props,” says Joe Russo. “And if you’re dealing with abstraction and your direction is abstract, I think it tends to turn into mush. So you have to create an emotional life in the simplest way possible that will translate on screen.”

Today, directors typically use real-time rendering on set to monitor how the mocap performances will look after their digital transformation, in an approximation of the virtual environments they’ll inhabit in the finished film, which is especially vital on a movie such as Robert Rodriguez’s “Alita: Battle Angel.”

“Jackie Earle Haley is only 5 feet 5 inches tall and his character, Grewishka, is 10 feet tall, so Robert needed to see what it was going to look like in the shot so he could frame the camera for it and get it to look like he would expect it to look,” says Weta Digital’s Eric Saindon, VFX supervisor on “Alita: Battle Angel.”

Shooting “Ready Player One” at Leavesden Studios in England, director Steven Spielberg and his actors took it one step further, using VR goggles to view the mocap renderings and the virtual sets.

“If you’re doing a live-action movie, the actors are inspired by their surroundings, as is the director in terms of how he’s staging the shots,” says ILM’s Roger Guyett, VFX supervisor on “Ready Player One.” “So we built most of the environments beforehand, or proxies of them, so Steven could set up a location, whether it was the castle or Planet Doom, put on the VR goggles and move around that environment, so he was doing a virtual scout. By doing that, he knew how he could stage some of that action, and we could do the same thing with the actors so they could see where they were as a real place.”

Spielberg also made use of unaltered live-action footage of the actors.

“Steven made sure he got very specific views of all the actors’ faces from the camera operators, because he didn’t want to miss any of the nuance that they were putting into the performance,” Guyett says. “So when he was reviewing their performances, not only was he reviewing them in the virtual world, he was reviewing the actual performances of the actors.”

Motion capture has come a long way since actor Andy Serkis made the technology famous with his performance as Gollum in “The Lord of the Rings” trilogy in the early 2000s.

Back then, “it was very rudimentary,” Saindon says. “We used the motion capture to sort of gather the idea of the performance, and then animators would have to go in and keyframe his facial animation and redo his body animation. On ‘Avatar,’ we really stepped up a notch and started capturing much more of the performance and the facial, and it was more refined to match the actor, but it still required doing keyframe animation and interpreting the data.”

The current state of performance capture technology gives a more direct transfer of an actor’s movements to their digital counterparts. Of course, directors are still free to digitally tweak them in post, but Rodriguez feels it’s best not to.

“When Rosa [Salazar, ‘Alita’ star] kicks a table, her face moves a certain way, synced to her body,” says Rodriguez. “So we tried to not do a lot of keyframing, because the quirks are what make it real. [But] when you’re playing a character that doesn’t resemble you, there’s an adjustment that needs to be done because some things don’t translate one-to-one. Alita’s mouth is smaller than Rosa’s and her eyes are larger.”

On “Avengers: Endgame,” Thanos and his henchmen in the Black Order were largely created using motion capture, but characters such as anthropomorphized raccoon Rocket (voiced by Bradley Cooper) relied more on keyframe animation.

On set, Rocket was played for the motion capture cameras by 6-foot-tall Sean Gunn, who “would just curl down as low as he could go,” says “Avengers” VFX supervisor Dan DeLeeuw of ILM. “Then we’d bring in Bradley [to record his dialogue] and we’d have him wear the helmet with the two little cameras on the front. We’re not dotting his face, but it’s still something that’s super-helpful for all the animators.”

Deciding how Rocket and other largely keyframe-animated characters such as tree creature Groot would move their faces, bodies and appendages was a collaborative effort involving a group that included the Russo brothers, DeLeeuw, fellow VFX supervisors Russell Earl (ILM) and Kelly Port (Digital Domain), and animation director Jan Philip Cramer (Digital Domain).

“There’s a consortium of generals that get together and talk through style, tone and behavior, then the group communicates to their army of thousands,” says Joe Russo. “We usually have a list of rules about how a character moves, which will derive from stunt players who do movement studies for us where we all go, ‘That’s really interesting, let’s apply that across the board.’ But we’ll often get things back [from the team] that are twice as good as the original concept.”

Generally, they try to follow the laws of physics and rules of movement, “but these are fantastical creatures, so who’s to say what’s right?” says Anthony Russo. “It’s finding the sweet spot between what feels interesting or surprising or unusual and what is also naturalistic and plausible.”

Even when filmmakers use AI crowd creation software, as Spielberg did to populate a virtual world known as the Oasis with hundreds of thousands of inhabitants in “Ready Player One,” there are scores of directorial decisions to be made.

Says Guyett: “You’re genuinely building a world and then putting these characters in it and saying, ‘Hey, if you’re a skeletal army from a Ray Harryhausen movie, how do you fight six of them?’ ”

All the Right Moves: The directors treated Josh Brolin as a badass on the set of “Avengers: Endgame.”
Virtual Helming: Robert Rodriguez directs his stars on “Alita: Battle Angel.”
