Shooting Dry for Wet
After extensive talks about how Charlie would appear on screen, Berardi soon understood that there was a lot of water work in the story, and that the water would have to act as a player in the film, too.
“The opening shot of the film, which was two minutes long, is entirely in a water environment,” says Berardi. “We had shot reference in Georgian Bay, north of Toronto, because I’m a big believer in reference. We got an underwater photographer and we had him swim through various rocky underwater areas and got beautiful work.”
The reference work helped them get a sense of what they wanted, but ultimately del Toro wanted greater control over the look, so everything that surrounds Charlie is digital.
Putting together such extensive water work wasn’t new for Mr. X, since they’d already done lots of it on shows like Vikings, but the detail of it was unique. Berardi soon pulled together his Houdini effects team and reached out to the team at SideFX (makers of Houdini) to create something new for the film. Mr. X also had proprietary in-house tools.

A Lengthy Post for Charlie
After filling their toolbox with the most powerful things they could find, the team inched through the scenes and made choices on a shot-by-shot basis about what to use. The director was interested in the entire process and joined the crew in casting which animators would work on specific scenes. Many of the most complex water scenes took eight to 15 hours to render just one frame.
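Those per-frame times add up quickly. As a rough back-of-the-envelope sketch (the 8-to-15-hour range comes from the article; the 24 fps rate and the 500-node farm size are our illustrative assumptions), here is what the two-minute opening shot alone would cost in render time:

```python
# Back-of-the-envelope render budget for the two-minute opening shot.
# The 8-15 hours per frame is quoted in the article; the frame rate
# and farm size below are illustrative assumptions, not Mr. X figures.

SHOT_SECONDS = 2 * 60
FPS = 24                    # standard theatrical frame rate (assumption)
HOURS_PER_FRAME = (8, 15)   # quoted range for the heaviest water frames
FARM_NODES = 500            # hypothetical render farm size

frames = SHOT_SECONDS * FPS  # 2,880 frames
for hours in HOURS_PER_FRAME:
    total = frames * hours
    print(f"{hours} h/frame -> {total:,} render-hours "
          f"(~{total / FARM_NODES / 24:.1f} days on {FARM_NODES} nodes)")
```

Even spread across a sizable farm, a single two-minute sequence can tie up days of continuous rendering, which is part of why the team triaged its water approach shot by shot rather than simulating everything at maximum quality.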
“Normally on a feature you have 20 to 26 weeks to do all the post-production work, but we couldn’t have made this film on that schedule,” says Berardi. “Luckily we had longer. We had 45 weeks in post, and we had a crew of over 200 people working on the film at Mr. X for the entire duration of the show, and we just barely made it. I think we delivered our final shot on the day before Guillermo jumped on a plane to screen the film at the Venice Film Festival.”
Now that it’s done, Berardi is happy to look back on his time with Charlie submerged in digital water. “During shooting you’re always exhausted or borderline exhausted, and then in post-production you get redemption, because you see the fruits of your labor,” he says. Fox Searchlight releases The Shape of Water in theaters on December 8.
“Not that it is a new format, but the number of IMAX movies is increasing, and they certainly need to be shot in a certain way and then re-framed for more standard cinema screens, because the aspect ratios are so different and because of how an audience physically views an image as large as IMAX,” Lockley says. “We most recently completed Dunkirk for IMAX and 70mm 5-perf at a resolution of 6.1K. Most movies we work on are still rendered between 2K and 3K, so 6.1K is a big jump, and requires a complete re-think on how to build and texture assets for the movie.”
Lockley adds that “texturing needs a lot of work, and gathering texture photography at the highest resolution and detail is critical to avoiding CG assets looking mushy in high-resolution frames. Our texture artists really have to keep an eye on whether the detailing is going to hold up on such a huge projection screen.”
Such issues “come up all through our vfx pipeline,” he adds. “Compositing has to be immaculate — every hair on an actor’s head is enormous. Edge work comes under a lot of scrutiny, and roto and paint departments are pushed to a breaking point to keep every bit of detail. From a technical point of view, the processing power needed to render and view these large images is colossal, not to mention the disc space required to store it. Just getting 24FPS playback on 6K EXR frames was quite a challenge.”
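Some rough arithmetic shows why that playback problem is so stubborn. The frame dimensions and channel layout below are assumptions for illustration (the article specifies only “6.1K” and 24 fps EXR playback), but the order of magnitude is the point:

```python
# Rough data-rate math for real-time playback of 6K EXR frames.
# Frame size and channel depth are illustrative assumptions; real
# delivery specs vary by show and vendor.

WIDTH, HEIGHT = 6144, 4320   # hypothetical ~6.1K frame near IMAX's 1.43:1
CHANNELS = 3                 # RGB only; alpha and AOVs would add more
BYTES_PER_CHANNEL = 2        # 16-bit half-float EXR, uncompressed
FPS = 24

frame_mb = WIDTH * HEIGHT * CHANNELS * BYTES_PER_CHANNEL / 1e6
print(f"~{frame_mb:.0f} MB per frame")                    # ~159 MB
print(f"~{frame_mb * FPS / 1e3:.1f} GB/s at {FPS} fps")   # ~3.8 GB/s
print(f"~{frame_mb * FPS * 60 / 1e3:.0f} GB per minute")  # ~229 GB
```

Sustained reads of several gigabytes per second are well beyond a single ordinary drive, so review playback at this resolution stresses storage and networking as hard as it stresses render power.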
Virtual Production
In terms of virtual reality, it’s not yet clear what the size, scope and nature of the market for actual VR content might eventually be, or how that might impact the visual effects industry. But what is clear is that tools for previsualization, production and post-production of major movies have flowed out of that sector and are now central to the visual effects industry. Various combinations of virtual cinematography systems, high-end rendering engines, live-action camera tracking tools, real-time compositing methods on set, and much more are increasingly becoming part of the regular toolkit of visual effects professionals.
Further, in some cases, such techniques are the foundation of entire productions in which live-action people and elements are delicately woven into photoreal CG environments, as the original Avatar (2009) and its upcoming sequels, 2016’s The Jungle Book, the upcoming The Lion King (2019) and others illustrate.
Such technology and methods are another of those game-changers that, used correctly, help filmmakers better realize their ultimate vision. Legato, a pioneer of such techniques who is now using them as visual effects supervisor on the upcoming The Lion King, says the whole point is to root such animated movies in reality. As such, virtual production techniques are evolving, even as we speak, into “a new way of making a film altogether — live-action versions of movies that do not look at all computer generated.”
“You can now make a movie — a total movie, not just a sequence, but a movie — rooted in reality, even though it was artificially created,” he says. “You go to great pains to make sure the artificial part is removed from the audience’s purview. Anyone can now walk onto elaborate sets and learn how to light and shoot them, what the best way of telling their story is, without being limited by budget and time in the same way they would be limited by [scouting or shooting at a real location].”
Scott Meadows, visualization department supervisor at Digital Domain, agrees, adding that the previsualization side of the industry has been greatly aided by the arrival of VR-related tools.
“Obviously, utilizing game engines, just being able to pipe this stuff in real time, being able to work with the cinematographer and director in real time to visualize scenes ahead of time, and to provide lots of information to actors who are working on a blue-screen or mocap stage, is incredibly helpful,” Meadows says. “We now have the tools to allow [filmmakers] and actors to see what [performances] would really be like in the digital world.”
Hendler adds that sophisticated GPU rendering/game engines are allowing “preview renders for our animators” to now include “full global illumination, physically correct lighting and detailed facial performances.”
“Newer motion-capture suits, like the Xsens system that we use, allow the actor to put the suit on and go anywhere,” Meadows says. “I was recently on a project working with a stunt team, and basically, on their off hours, we had them throw on the suits and act out some fight choreography, and then quickly, within a day or two, we were able to move in and put cameras on it. The motion [on location] may not be as high fidelity as on a traditional mo-cap stage, but the flexibility that you get by simply being able to throw the suit on, wireless, running it off a laptop, or even controlling it with an iPad or iPhone to record, is tremendous.
“Mix that with the way they are using scanning devices today to scan sets during shooting…”
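The “put cameras on it” step Meadows describes, framing virtual cameras around a performance after it has been recorded, is conceptually simple. The sketch below is a toy illustration rather than any studio’s pipeline: it synthesizes a stand-in for recorded root motion (in practice this would be loaded from a mocap export such as BVH or FBX) and orbits a simple look-at camera around the performer. All names and numbers here are hypothetical.

```python
# Toy sketch: aiming a virtual camera at pre-recorded performer motion.
# The "performance" here is synthesized; a real pipeline would load it
# from a mocap export (BVH/FBX). Everything below is illustrative.

import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return tuple(x / n for x in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def camera_basis(cam_pos, target, up=(0.0, 1.0, 0.0)):
    """Forward/right/up vectors for a camera at cam_pos framing target."""
    forward = normalize(tuple(t - c for t, c in zip(target, cam_pos)))
    right = normalize(cross(forward, up))
    return forward, right, cross(right, forward)

# Stand-in for a recorded take: performer root walks along x for 10 s.
FPS = 24
take = [(0.05 * f, 0.0, 0.0) for f in range(10 * FPS)]

for frame, root in enumerate(take):
    # Orbit the camera around the performer while always framing them.
    angle = 2 * math.pi * frame / len(take)
    cam = (root[0] + 3 * math.cos(angle), 1.7, root[2] + 3 * math.sin(angle))
    forward, right, up = camera_basis(cam, (root[0], 1.5, root[2]))
```

Because the motion is already on disk, camera choices can be iterated for days after the stunt team has gone home, which is exactly the flexibility Meadows is pointing at.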