3D World

Journey’s end

Trevor Hogg spoke to Weta Digital to discover the incredible artistry needed to make the latest installment of the Apes saga, War for the Planet of the Apes, believable

™ and © 2017 Twentieth Century Fox. All rights reserved

Weta Digital look back at how the Planet of the Apes films have evolved, and the technological advancements that enabled the team to push 3D boundaries in War for the Planet of the Apes

Answering the question of how primates came to conquer the Earth in the 1968 science-fiction classic, Planet of the Apes, is a prequel trilogy that features Rise of the Planet of the Apes (2011), Dawn of the Planet of the Apes (2014) and War for the Planet of the Apes (2017). The three movies introduced the world to a genetically modified chimpanzee named Caesar who left behind his lab experiment origins to become a powerful leader. The films also propelled Weta Digital to the forefront of conveying emotion through entirely synthetic characters.

“When we did Rise of the Planet of the Apes, Caesar was being created for the first time,” states Weta Digital Senior Visual Effects Supervisor Joe Letteri, who has received Oscars for his work on The Lord of the Rings, King Kong and Avatar. “We were trying to walk the line between an ape that looks like an ape and an ape that starts to show leadership and human intelligence. We were broadening that out on Dawn of the Planet of the Apes. Caesar becomes the leader of this group of apes out on their own. We started having to introduce the idea of language and apes speaking, and tried to keep that believable. On War for the Planet of the Apes, we were past those hurdles. We’re in the heart of the story and looking at what that leadership means for Caesar within his own group and how that plays out against the remaining humans in the world.”

Technological revolution

In addition to the evolution of the character, Weta Digital has been constantly adapting and developing technology.

“We’ve been pushing the technology to the point where it becomes the same kind of tools as if you were making a live-action film,” notes Joe. “When we did Rise of the Planet of the Apes, the techniques that we had for lighting fur were approximate. Now we have pushed that all into the software, which behaves as if you were outside in the real world photographing nature.” The Jungle Book provided an opportunity to field test the proprietary renderer Manuka, which marked a transition from spherical harmonics to ray tracing. “Often when fur is backlit you get sparkly glints as the flex and imperfections in the hair pick up little micro reflections of sunlight,” states Weta Digital Visual Effects Supervisor Dan Lemmon, who won an Academy Award for his contributions to the live-action adaptation of the Disney animated classic. “That was something that our hair simulation didn’t model correctly. At the beginning of War for the Planet of the Apes, we completely rewrote our hair shading model and went to a dual-core shading model. It models the cuticle but also the medulla inside hair follicles, allowing us to get a more realistic specular breakup.”
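Dan’s description of a dual-cylinder fibre, with a reflective cuticle wrapped around a scattering medulla, can be illustrated with a deliberately simplified sketch. The function below is a hypothetical toy, not Weta’s Manuka hair shader: it combines a sharp, per-fibre-jittered cuticle lobe (the source of the backlit glints Dan mentions) with a broad term standing in for medulla scattering. All names, parameters and the Gaussian lobe shape are assumptions made purely for illustration.

```python
import numpy as np

def fiber_specular(theta_i, theta_o, cuticle_tilt=np.radians(3.0),
                   cuticle_roughness=0.05, medulla_fraction=0.3, rng=None):
    """Toy longitudinal response of a single hair fibre (hypothetical).

    A sharp 'cuticle' reflection lobe centred on the tilted scale angle,
    plus a broad low-energy term standing in for light scattered by the
    medulla. Illustrates the dual-cylinder idea only.
    """
    if rng is None:
        rng = np.random.default_rng()
    # Per-fibre jitter of the cuticle scale tilt is what makes individual
    # backlit hairs catch the sun as sparkly glints.
    tilt = cuticle_tilt + rng.normal(0.0, np.radians(1.0))
    half = 0.5 * (theta_i + theta_o)              # longitudinal half angle
    cuticle = np.exp(-((half - tilt) ** 2) / (2.0 * cuticle_roughness ** 2))
    medulla = medulla_fraction * np.cos(half) ** 2
    return (1.0 - medulla_fraction) * cuticle + medulla

# Three fibres lit the same way respond differently because of the jitter,
# which is the per-hair specular breakup a single-lobe model cannot produce.
for seed in range(3):
    print(fiber_specular(np.radians(35), np.radians(35),
                         rng=np.random.default_rng(seed)))
```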

“On the first one it was hard enough to do the fur and get the simulations, interaction and reaction to the skin and lighting in those kinds of environments,” remarks Joe. “Caesar was at home, in an office building and outdoors for a little bit. On the second film Matt Reeves [Let Me In] pushed it. He took it out to the wet forest. We had to deal with the water interaction and more elements. In this third film, Matt took it as far as he could. This is a harsh environment. We were shooting out in the snow and looking at what that’s doing to the actors and thinking how are we going to transfer that to the behaviour the apes need to have and how that snow accumulates in the fur.”

Over half of War for the Planet of the Apes takes place in snowy environments. “It’s a big technical challenge because of the complexity of the fur system,” remarks Dan. “Each ape has millions of individual hairs on their body, in some cases as many as five to 10 million; that’s a whole lot of processing you have to do to make each of those hairs move around and dynamically adjust to the wind and the motion of the ape, pick up snow and release snow.” Snow can be powdery, wet, heavy, sticky, and crusty. “Oftentimes you’ll get mixes of different kinds of snow within the same environment,” explains Dan. “You might have a great setup that does a particular kind of snow, but having to match into the live-action footage means that you need to make your snow look exactly like what is already there.”
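The bookkeeping Dan outlines, with each hair picking up and releasing snow as the ape moves, can be sketched at a toy level. The snippet below tracks a snow mass per guide hair, accumulating in proportion to snowfall and the hair’s exposure and shedding a fraction whenever the hair is shaken hard. Every name, rate and threshold is a made-up assumption for illustration; this is not Weta’s fur or snow system.

```python
from dataclasses import dataclass

@dataclass
class GuideHair:
    exposure: float        # 0..1, how directly the hair faces falling snow
    snow_mass: float = 0.0

def update_snow(hair: GuideHair, accel: float, snowfall_rate: float, dt: float,
                pickup_rate: float = 0.02, shed_accel: float = 4.0,
                shed_fraction: float = 0.5, max_mass: float = 1.0) -> None:
    """Toy per-hair snow accumulation and release (hypothetical parameters)."""
    # Snow builds up in proportion to how much is falling and how exposed the
    # hair is; a hard shake (high acceleration) dumps a fraction of it.
    hair.snow_mass += pickup_rate * snowfall_rate * hair.exposure * dt
    if accel > shed_accel:
        hair.snow_mass *= 1.0 - shed_fraction
    hair.snow_mass = min(hair.snow_mass, max_mass)

# Example: an exposed hair gathers snow over two seconds of calm walking,
# then loses half of it the moment the ape shakes itself.
hair = GuideHair(exposure=0.9)
for frame in range(48):
    update_snow(hair, accel=0.5, snowfall_rate=1.0, dt=1 / 24)
update_snow(hair, accel=6.0, snowfall_rate=1.0, dt=1 / 24)
print(round(hair.snow_mass, 3))
```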

Believable traits

With each film, the apes develop more human traits, which presented its own set of challenges. “For us, the difficult trick played out over the course of three films is the idea of introducing language,” notes Joe. “In the first film, Caesar has only one word. By the second film there was more of it, and by the third film more still. It became part of the natural progression of the story that the evolved intelligence of the apes would involve speaking.” A new character called Bad Ape, voiced by Steve Zahn, took some time to work out in terms of his speech. “Part of the challenge with apes, with their bigger muzzles, is that when they would articulate as much as the actors, it meant that the lips moved a greater distance,” explains Dan. “It looked like the lips were moving faster than they physically could. That was something we had to tone down. Steve does a great job on him.”

As important as speech is, eyes are fundamental in making CG characters believable, as Joe observes: “It’s not just the detail of how much is in the structure of the eye, it’s also the detail of the movement of the eye and how that is coordinated with everything else that is going on in the face. Because that’s where you get your clues as to what the character is thinking. Then of course, the lighting plays into that because eyes are incredibly complex. They’re like little jewels and you have to understand that the subtle shading of light within the eye amplifies all of the subtle movement and detail that you’re looking for to give you those ideas.” Dan adds, “If the eyes work you can forgive a lot of other things. That’s why we spent more time redesigning the way the eyes and eyebrows work, particularly on Caesar. We snuck bits of Andy [Serkis]’s anatomical details from his eyes, eyelids and eyebrows back onto Caesar to help us to more closely match the same facial expressions and shapes that he makes.”

“Quite a lot of this movie was filmed practically and that grounds the movie,” notes Joe. “But we bookend it with the Hidden Fortress situated behind a giant waterfall, where we have to do a lot of that digitally because it’s impractical to find a location like that to shoot in. As we get to the prison camp, that landscape is all built out digitally as well.” LIDAR scans were taken of anything that was physically built. “We photographed everything to understand the materials and shot a lot of lighting reference to understand how the lighting affects those materials. Then we created a digital version of whatever we had in live-action.” Battle scenes were a mix of practical and digital elements. “Matt wanted it to be gritty but to also have that sense of bows and arrows versus heavy artillery. There were a lot of practical effects and those gave us the basis for the kinds of digital effects that we needed to augment it.”

A huge challenge for the team was the avalanche that crashes down through the mountains, knocking over trees and covering the entire prison camp. “An avalanche isn’t something that you get asked to do on every film, so it required research as to what an avalanche looks like and what the components are that we want to hit,” notes Dan. “There are a lot of different avalanches out there, so what does ours need to do? Figuring out the character of the avalanche and then trying to get it to behave in the way we wanted it to behave was difficult.” To make a success of the challenge, the proprietary tree system software had to be revamped. “Lumberjack constructs growth patterns and branching structures that closely mimic a tree in the real world. It builds a simulation system at the same time, so as branches grow they start to be affected by gravity and begin to droop.”
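Lumberjack itself is proprietary, but the idea Dan describes, growing branching structures while simultaneously letting them droop under the weight they carry, can be shown with a small hypothetical sketch. The recursion, branching counts and the simple linear droop rule below are all assumptions made purely for illustration; they are not Weta’s tool.

```python
import math
import random

def grow_branch(length, depth, angle=math.pi / 2, droop_per_weight=0.02):
    """Recursively grow a branch, then droop it under the weight it carries.

    Returns (weight, segments), where each segment is an (angle, length)
    pair. Purely illustrative: a production growth model would be driven by
    measured species data and a real simulation rig.
    """
    if depth == 0:
        return length, [(angle, length)]
    child_weight = 0.0
    segments = []
    for _ in range(random.randint(2, 3)):        # a few child branches
        spread = random.uniform(-0.6, 0.6)       # branching direction offset
        w, segs = grow_branch(length * 0.7, depth - 1, angle + spread)
        child_weight += w
        segments.extend(segs)
    weight = length + child_weight
    # Gravity pulls the branch down in proportion to the weight it supports.
    drooped_angle = angle - droop_per_weight * weight
    return weight, [(drooped_angle, length)] + segments

random.seed(1)
weight, tree = grow_branch(length=4.0, depth=3)
print(f"total weight {weight:.1f}, {len(tree)} segments")
```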


Building character

But however incredible the environments, the apes are the vital component. “This is a character-driven story,” observes Joe. “It always comes down to those small moments where you understand that the fate of the apes and where the story is going can change based on what a character is thinking. We’re always trying to look for the ability to read that thought process in the character so you can be right there with them as the story happens.”

“There is some fantastic acting that the performers did on-set and the animators at Weta Digital were able to carry across onto the digital characters,” remarks Dan. “It’s a real testament to their craft and artistry that those scenes work as well as they do.” Joe concludes, “It has been fantastic to be able to take Caesar from this innocent little baby chimp all the way through to this leader with such a burden on his shoulders, and having to understand and face up to the responsibility of what is happening in the world.”

Image captions:
The avalanche that takes place in the third act was a major technical and creative challenge for Weta Digital
The waterfall situated in front of the Hidden Fortress was created digitally
Where live-action shots were impossible, Weta Digital would step in and create what was needed
A close-up shot of the motion capture performers travelling through the harsh winter environment
All of the human actors are replaced with digital representations of the apes
Making the snow’s interaction with the fur believable was a particular challenge
The motion capture performances of Terry Notary and Steve Zahn are incorporated into Rocket and Bad Ape
Fur simulations were pushed further as, for the first time, the apes needed to interact realistically with snow
Inflatable green screens measuring 50 feet high by 100 feet wide were deployed during principal photography
