3D World

THE MANDALORIAN

ILM DISCUSS THE VISUAL EFFECTS OF BOUNTY HUNTING

3D World sits down for a conversation with Richard Bluff, visual effects supervisor, and Hal Hickel, animation supervisor, both at ILM in San Francisco, to unpack the creative adventure of telling the story of a galactic bounty hunter.

Werner Herzog, world-renowned filmmaker and occasional actor, takes a key role in the new Star Wars TV series The Mandalorian. Speaking in 2019 about the series, Herzog made the key point that it deploys what he described as mythic images, and his description speaks to the visually compelling quality of the series and the rich tradition of Star Wars. It’s a pop-culture phenomenon that has captivated the imagination in two ways: through the story unfolding on screen, and through the story of the creative impulses, choices and challenges involved in putting those stories in motion.

Richard Bluff begins our conversation by identifying the project’s landmark approach to environment creation: “I think the biggest challenge was wrapping our heads around how we wanted to utilise real-time game technology in collaboration with the LED screens, effectively prototyping what that technology would look like, and then of course executing a production-ready tool for the first day of shooting. That was far and away one of the greatest challenges that I’ve faced in the visual effects industry.”

Fascinatingly, ILM has a connection to game engine use that dates back to their work on Steven Spielberg’s dazzling science-fiction fairy tale, A.I. Artificial Intelligence, where it was used for virtual production approaches during filming of the Rouge City sequence.

Bluff sketches out the longstanding relationship between ILM and LED technology: “There had been an awful lot of work done with LED screens at ILM and Lucasfilm prior to The Mandalorian: they’d been used on Rogue One, for example, and the game engine technology, particularly Unreal Engine for season one, had been used extensively by X Lab, our immersive development department at ILM, on various augmented reality and VR projects. So, various pieces of the whole pipeline had been utilised in visual effects, or by Jon Favreau [series creator] on his past projects, including The Lion King and The Jungle Book. I think the biggest challenge was pulling all of that together, but more than that it was the goal we set ourselves of shooting half of the season in the LED volume, and within that amount of work making sure that above 50 per cent of every take would constitute an in-camera final. As a result, that would have meant us building over 110 real-time environments that in theory had to be photoreal and play in-camera. Nothing that existed prior to The Mandalorian had attempted anything near what we tried to do. Up until then it was isolated to one or two shots or scenes, with content intended as previz only, or for dynamic lighting, whereas we were attempting in-camera set extensions – effectively taking the post-production aspect and putting it in prep.”

Bluff goes on to offer context for ILM’s work in applying game engine technology to their production and visual effects collaboration: “ILM has a rich history of projects, plus supervisors and artists, working in new media. Prior to Kim Libreri joining Epic he was a visual effects supervisor at ILM, and he had been the lead supervisor behind an ambitious project to take a video game environment (that was never used) and imagine how we could utilise that world and those characters in a real-time game engine to generate content. So, he’d already been pursuing game engines for television or theatrical content for a long time.

“The same goes for various artists – take Landis Fields: he and I had worked on the Millennium Falcon: Smugglers Run ride [for Disneyland] a year or two prior to The Mandalorian starting, and the list goes on in terms of the projects that various folks at ILM had been involved in, so I think people saw that this was a possibility on the horizon. There were few people, though, who saw the link with the LEDs. We were also pursuing game engines as a back-end post-production renderer to see what leverage we could get out of that. But this was certainly the project that pulled all of those ideas together, and it goes without saying that without Jon Favreau, his vision and his thorough understanding of the technology, we wouldn’t have done what we did. Various pieces of this tech have existed for a long time, but there’d never been a filmmaker that ILM and Lucasfilm had come across since George [Lucas] who was so willing to stake the entire project on a theory of what we could do with the technology. And for Jon it was his ‘Holy Grail’ shooting methodology that would allow him to reduce the stage footprint and keep the project in LA, as well as increase the speed of production, while advancing the technology and giving his actors, directors and DP something to shoot against aside from just green screens.”

The Mandalorian offers us new characters, sometimes seen in familiar settings, including the desert planet of Tatooine (the planetary lynchpin of a galaxy far, far away): “One of the interesting things about Tatooine, of course, is that it’s incredibly iconic, and knowing that we were going to go back there, everybody was very sensitive to the fact that we wanted to put it in the best light possible. Because we were styling the show on a Western, and mimicking a lot of the aesthetics of the original movies, we felt it was important to show those same locations when we arrive at Tatooine. So, for example, when the Razor Crest flies in towards Mos Eisley there are two establishing shots, and we actually went back to the archives at Skywalker Ranch, pulled out the original matte paintings and rephotographed them digitally to use them again in the show. For one of them, we weren’t able to get the kind of resolution that you’d expect these days, so we tracked down the real location, in Death Valley, where the high angle was shot, and rephotographed elements of that. We worked those back into the original matte painting to increase the resolution and enhance what was already there, ensuring that it would hold up for today’s viewing.

"WE HAD TO GET OUR CG VERSION OF THE BABY ALIGNED WITH THE PUPPET" Hal Hickel, animation supervisor, ILM

“And then the second thing I would add is the Mos Eisley cantina, which of course is a very iconic environment that everybody’s very familiar with, and one that Jon Favreau felt very strongly we should try to put in the volume. But it presented some very unique challenges, because the actual bar itself is very small, so it would only take up a third of the volume. We had to slide our physical set over to one side of the volume, so that when one angle across the bar was all LED, we had to make sure [the LED] was at the correct depth to maintain the right sort of focus. But it also meant that if you did a reverse to the right of the physical booth where Toro was sitting, the actual booth extension was another 30 feet back on the opposite side of the LED screen. This presented some very unique problems with focus, of course: we knew we couldn’t present something that was anything less than what was there in A New Hope, and because we were using this new technology we couldn’t show our hand. It had to be perfect as far as we were all concerned. So that was a scene that we pre-vizzed to within an inch of its life, so that we knew exactly where the camera was going to go in every single setup. [We had to] make sure that the camera field of view saw exactly what was intended, to ensure that any blending between the physical set and the LED matched perfectly. I’m happy to say that it went wonderfully well, due to the collaboration between the DP, production design, construction and visual effects, to the point where every single shot of the environment within that scene is all in-camera: LED and physical set. The only additional visual effects ever done in that scene were the addition of the hologram on the table and painting out the puppet rods for the bartender. That was it.”

“WE TRACKED DOWN THE REAL LOCATION IN DEATH VALLEY AND REPHOTOGRAPHED ELEMENTS” Richard Bluff, visual effects supervisor, ILM

Since its inception, Star Wars on screen has always fused live action with forms of animation: stop motion, cel animation and animation rendered using digital technology. The Mandalorian’s animation ethos draws together all of these approaches. Hal Hickel notes that, “One of the big challenges on the show was the diminutive character known only as The Child. At that point in time, Legacy was still building and finishing the puppet, and while they were doing that we were building the CG version – because of scheduling, both of those things had to happen at the same time. We also didn’t know yet how much the puppet would play. We thought it might be like some shows where the puppet ends up being really good lighting reference and we end up replacing it. Or it could be the other extreme – and it turned out to be the other extreme – where the puppet plays most of the time and does the heavy lifting, and our job was to match it really well. So, that was the main thing when I came on. That was the hot item: to get our CG version of the baby aligned with the puppet, and really figure out how to animate it so that it still felt like the puppet even when we were doing all-CG shots.

“So, that was the big thing, and it grew from there as I got involved with more and more of the work; ultimately I supervised all of the animation. As I said, the main focus when I first joined was getting the baby right, and then we went on to the Mudhorn, IG-11, pit droids, Jawas, the sandcrawler, TIE fighters and so on. Basically, everything you expect in Star Wars.”

Hickel then details his team’s contribution to the creation of The Child (the diminutive and mysterious character that The Mandalorian is charged with bringing in): “In terms of the handshake between CG and puppeteering, The Child is a really good character to focus on, because while the puppet plays in the majority of shots, there’s still a whole range of CG participation. [There are times when] it’s basically the puppet and we’ve just nudged an expression slightly, adjusted the facial performance a tiny bit, or given the fingers a little more movement. The next step up might be shots where we do a head replacement: we keep the real body but replace the head with CG, because later in editorial we might really need a slightly different reaction than what we got on the day, so we need to change it. Even then, we would make sure we animated the head in a way that was consistent with what the puppet could do. And then there are shots where he’s completely CG, from head to toe, and those typically were shots where he was active or walking in a way that was a little hard to get with the rod puppet or the physical puppet: like when he’s at Kuiil’s ranch and the baby is chasing after a frog, shots like that. I love characters where we ground them in something that’s real, and here we partnered with the brilliant folks at Legacy to create this thing. The audience feels like it’s a real, physical thing, but occasionally we can push it in ways that were hard to get with the puppet. In this case, the CG part made up the last ten per cent of the character’s overall presence in the series.

“When we were designing, if the eyes [of The Child] had too much sclera – too much white around the iris, which is something you’d ordinarily want on a character – Jon was like, ‘I don’t really want to see very much. A little tiny sliver’s okay, but those big dark eyes look more animal, almost’, and to him that was better. That was surprising, because my tendency, partly because I come from an animation background, was to say, ‘Let’s have a nice clear sclera so we have really strong eye direction’, but Jon said, ‘No, let’s not.’ I learnt a lot from him.”

For the series, Hickel and his team moved back and forth between animation and on-set puppetry, and he uses the example of the droid IG-11 to illustrate the relationship: “With IG-11, we had an actor on set in a grey suit, but the intention wasn’t to capture his movements; he was there to indicate eyeline and give the actors something to react to and play off of. We had a physical mock-up of IG-11 built from the waist up, and it was really intended as lighting reference, but it looked great, so the puppetry team rigged the arms up with some rods and the puppeteers could do basic stuff with him in close-ups. So, while the vast majority of shots are CG, there are a handful of shots where we’re using the physical, puppeteered mock-up.

“And then with IG-11, because of his peculiar design, we didn’t want to go with mocap. We thought that it would be boring to take such an outrageously skinny and rectilinear tin-robot design and just put naturalistic human motion on him; it didn’t seem right. So, he’s 100 per cent keyframed, and the work was led by Ken Steel, the animation supervisor at Hybride who led the charge on the IG stuff. ILM did some IG shots as well, but Ken’s group really defined his style of movement, and so we had a lot of conversations about how he’s kind of like a tin robot, kind of stiff and comical in that way. But once he starts shooting, because he’s got eyes and sensors all over his head, he can shoot in any direction at any time. And we definitely wanted his body to do things that humans couldn’t do, in terms of how the arms operate and how the torso can swivel 360 degrees over the hips. That was super fun, and it sat at the far end of the spectrum: 100 per cent keyframed animator work.”

Of the pipeline involved in creating the digital beasts named blurrgs, Hickel recalls that, “It was a pretty conventional pipeline. We did walk and run cycles up front and used those animation curves to drive a motion base – sort of the top part of the saddle was built, and Misty Rosas [in the Kuiil costume] would sit up on that thing and do anything from a walk to a fast trot. It rocked and moved in conjunction with the animation curve, so that when we plugged in the animation of the rest of the creature later, it lined right up. And, I have to say, this was my first show where we had that pipeline work all the way through. This was the first time that it was done properly, and it worked gangbusters, and I’m going to fight harder [for it on other projects].”

Synonymous with Star Wars are its visually dynamic and intriguing spaceship designs, often imbued with the feel of what George Lucas always described as a ‘used universe’; their beauty often resides in the fact that they are not always sleekly built vessels.

Hickel led the animation work involved in bringing the Razor Crest to the screen, and he identifies an ‘old-school’ approach that worked well within the digital production mode: “I animated a landing shot myself: it’s the first time we land on Arvala. That was our first animated Razor Crest shot, and I just wanted to define how this ship should land. Beyond that, we got into a discussion where Jon said, ‘Let’s build this model as lighting reference’, because it’s not finished like any Star Wars ship we’ve seen to date: it’s not a mirror finish like Padmé’s ship [from The Phantom Menace] and it’s not a matte finish like the Millennium Falcon or an X-wing; it’s somewhere in between.

“So we started talking about that, and then Jon pushed it a little further: ‘Well, could we do some motion-control work and shoot it as a miniature?’ That shifted it into a whole other gear. We had John Goodson, a long-time model builder at ILM on both the physical modelling side and the CG side – he built the Razor Crest in his shop – and John Knoll hand-built his own motion-control rig just for this. He sourced as much as he could from available parts, he hand-machined the parts he couldn’t buy, he built a camera mover and a model mover, and then he wrote his own software to run it. That’s classic John.

“WE’VE MANAGED TO INCREASE SCALE AND SCOPE FAR BEYOND ANYTHING WE THOUGHT WAS POSSIBLE” Richard Bluff, visual effects supervisor, ILM

“So, we set it up on our motion-capture stage at ILM; we had about 40 feet of track, and it worked great. To design those shots, I had a CG version of the motion-control rig well in advance, and I would animate the shots in Maya using that virtual rig. In other words, I couldn’t design anything that the real motion-control rig couldn’t do. It was sort of like doing tech-viz, and part of the design process was not just adhering to what the rig could do for technical reasons, but also trying to enforce a style guide that matched shots from the Original Trilogy.”

Hickel offers a reflective, big-picture statement that captures the spirit of the work: “I think there are times when filmmakers want to use an old technique for nostalgia, but this really informed what we were doing in terms of the movement of the ship. It was really worth doing. It was great.”

Bluff brings the conversati­on to a conclusion, hinting at the scope of things to come: “With the experience of season one, we gained a huge level of confidence and with that we’ve managed to increase the scale and scope far beyond anything we thought was possible. I hear from a lot of people that season one didn’t feel like a TV show at all. Well, season two will make season one look like a TV show in comparison.”

A practical set piece of the Razor Crest cockpit positioned in the Stagecraft volume for real-time interactive lighting

Kuiil rides a blurrg mount (to be integrated with an animated CG blurrg in wide shots) towards the Razor Crest in the Stagecraft volume for real-time interactive lighting

Below: John Knoll prepares a shot of the Razor Crest, adjusting the bounce light onto the model

Above: The Razor Crest miniature built and photographed for season one of The Mandalorian

Above: Kuiil rides the blurrg across the desert. The environment is realised using a wrap-around LED screen

The Mandalorian and The Child in the cockpit of the Razor Crest

Digital version of The Child for animation

Top: Final shot showing the Razor Crest composited with a Stagecraft-realised location

Opposite: Models of the Razor Crest landing gear for animation

Above: The Mandalorian and IG-11 get caught up in a skirmish that’s steeped in the style of a Western shootout
