Academy Sci-Tech Awards
The Academy of Motion Picture Arts and Sciences has selected 18 scientific and technical award recipients (34 individuals), as well as five organizations, to be honored at the annual Scientific and Technical Awards Presentation, set for Feb. 11 at the Beverly Wilshire Hotel.

Technical Achievement Awards

Thomson Grass Valley for the design and engineering of the pioneering Viper FilmStream digital camera system.

Larry Gritz for the design, implementation and dissemination of Open Shading Language.

Carl Ludwig, Eugene Troubetzkoy and Maurice van Swaaij for the pioneering development of the CGI Studio renderer at Blue Sky Studios.

Brian Whited for the design and development of the Meander drawing system at Walt Disney Animation Studios.

Mark Rappaport for the concept, design and development; Scott Oshita for the motion analysis and CAD design; Jeff Cruts for the development of the faux-hair finish techniques; and Todd Minobe for the character articulation and drive-train mechanisms, of the Creature Effects Animatronic Horse Puppet.

Scientific and Engineering Awards

ARRI for the pioneering design and engineering of the Super 35 format Alexa digital camera system.

RED Digital Cinema for the pioneering design and evolution of the RED Epic digital cinema cameras with upgradeable full-frame image sensors.

Glenn Sanders and Howard Stark for the design and engineering of the Zaxcom Digital Wireless Microphone System.

David Thomas, Lawrence E. Fisher and David Bundy for the design, development and engineering of the Lectrosonics Digital Hybrid Wireless Microphone System.

Parag Havaldar for the development of expression-based facial performance-capture technology at Sony Pictures Imageworks.

Nicholas Apostoloff and Geoff Wedig for the design and development of animation rig-based facial performance-capture systems at ImageMovers Digital and Digital Domain.

Kiran Bhat, Michael Koperwas, Brian Cantwell and Paige Warner for the design and development of the ILM facial performance-capture solving system.

Sony for the development of the F65 CineAlta camera with its pioneering high-resolution imaging sensor, excellent dynamic range and full 4K output.

Panavision and Sony for the conception and development of the groundbreaking Genesis digital motion picture camera.

Marcos Fajardo for the creative vision and original implementation of the Arnold Renderer, and to Chris Kulla, Alan King, Thiago Ize and Clifford Stein for their highly optimized geometry engine and novel ray-tracing algorithms which unify the rendering of curves, surfaces, volumetrics and subsurface scattering, as developed at Sony Pictures Imageworks and Solid Angle SL.

Vladimir Koylazov for the original concept, design and implementation of V-Ray from Chaos Group.

Luca Fascione, J.P. Lewis and Iain Matthews for the design, engineering and development of the FACETS facial performance capture and solving system at Weta Digital.

Steven Rosenbluth, Joshua Barratt, Robert Nolty and Archie Te for the engineering and development of the Concept Overdrive motion control system.
It’s a good time for the animation, visual effects and related industries. Business is growing, thanks to the world’s increasing demand for entertainment and information, and animation’s unique ability to enlighten any subject.
But with growth comes many challenges: How can individuals find the best jobs for themselves and their talents and interests? Where can employers find people qualified and talented enough to execute their visions?
These aren’t always easy questions to answer, but Animation Magazine is making it easier with the debut Feb. 13 of its new Career Center, to be found online at www.animationmagazine.net.
The new Career Center is designed to meet employers' recruitment needs, allowing them to easily post jobs, search for highly qualified professional candidates using prescreen filters, and pay only to see the resumes of job seekers interested in the positions they need to fill.
Job seekers can similarly target openings that fit their experience and interests. They also can post resumes anonymously and cre-
flat and dead. So with this creature you don’t want a face with eyes that have nothing behind them because the audience will turn off and lose interest.”
The VFX supervisor made sure that, even as the CG creatures focused on a conversation with another character, their eyes never stayed static. Instead, the eyes were given little darting movements here and there; that motion makes it appear the creatures are thinking and taking in the whole scene, and it's a more natural way to bring these large, engaging beings to life.
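The darting-eye idea can be prototyped outside any animation package. Below is a minimal sketch of how an animator's tool might generate those irregular little jumps (saccades) as per-frame rotation offsets; the function name, the timing windows and the angle ranges are all illustrative assumptions, not anything from a specific VFX pipeline.

```python
import random

def saccade_offsets(duration_s, fps=24, seed=1):
    """Generate per-frame eye-rotation offsets (degrees, as (yaw, pitch))
    with small random darts at irregular intervals, so the eyes never
    sit perfectly still. All numbers are illustrative."""
    random.seed(seed)
    frames = int(duration_s * fps)
    offsets = [(0.0, 0.0)] * frames
    f = 0
    target = (0.0, 0.0)
    while f < frames:
        # hold the current fixation for roughly 0.3-1.2 seconds
        hold = int(random.uniform(0.3, 1.2) * fps)
        for i in range(f, min(f + hold, frames)):
            offsets[i] = target
        f += hold
        # a dart is a quick jump of a degree or two, not a slow drift
        target = (random.uniform(-2.0, 2.0), random.uniform(-1.0, 1.0))
    return offsets

darts = saccade_offsets(3.0)
print(len(darts))  # 72 frames at 24 fps
```

The key design choice mirrors the article's point: the offsets hold a fixation and then jump, rather than drifting continuously, which is what reads as "thinking" rather than "floating."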
“You try to make people forget there’s a CG character there for a moment,” says Aithadi. “They know it’s a CG character, but if you can switch off for a minute that they know that, then you can get people to enjoy the character for the character itself. So, in my opinion, the eyes are the most important thing and if you have good eyes, you have a way to make everything happen for the audience.”

Karen Idelson is a freelance entertainment and tech writer who lives in the South Bay. She’s pretty excited for the next Blade Runner movie.
RealFlow has been used for fluid simulations for a long time now, and Next Limit continues to push its boundaries with new tools for controlling the simulations, tweaking things under the hood to make them faster and more stable while enabling ever more complex setups.
The biggest advance is a beefed-up Dyverso multiphysics solver, which has been optimized to take advantage of both the GPU and the CPU in your workstation. Introduced in RealFlow 2015, Dyverso is meant to handle many different kinds of objects that would normally require separate solvers and get them working together. So even though RealFlow has long had fluid and rigid-body solvers that could interact, the Dyverso solver makes that interaction faster, more accurate and more robust.
But then you add the latest craze of granular stuff — teeny tiny grains. Sand. Snow. Tiny ball bearings. Frozen’s dynamic snow-clumping simulations come to mind, as does the hyper-macro world of Pixar’s short Piper, with its wet sand. And most recently, an Imperial AT-AT (or I guess, AT-ACT to be accurate) in Rogue One stepping onto the beaches of Scarif.
Yeah, granular simulations are a thing. And Next Limit has not only implemented them but, because they live within the Dyverso universe, they can interact with other materials like water and rigid bodies.
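The appeal of a unified multiphysics solver is easiest to see in miniature. Here is a toy sketch, in no way RealFlow or Dyverso code, of the idea that granular and liquid particles can share one integration loop and differ only in material parameters (a grain has friction, a liquid droplet essentially doesn't); every constant and field name is an assumption for illustration.

```python
# Toy unified particle step: grains and liquid share the same loop,
# differing only in material parameters. Illustrative only.
GRAVITY = -9.8
DT = 1.0 / 60.0

def step(particles):
    for p in particles:
        p["vy"] += GRAVITY * DT
        p["x"] += p["vx"] * DT
        p["y"] += p["vy"] * DT
        if p["y"] < 0.0:  # hit the ground plane
            p["y"] = 0.0
            p["vy"] = -p["vy"] * p["restitution"]
            # grains lose tangential speed to friction on contact;
            # a frictionless liquid particle keeps sliding
            p["vx"] *= (1.0 - p["friction"])

sand  = {"x": 0.0, "y": 1.0, "vx": 1.0, "vy": 0.0,
         "friction": 0.6, "restitution": 0.05}
water = {"x": 0.0, "y": 1.0, "vx": 1.0, "vy": 0.0,
         "friction": 0.0, "restitution": 0.3}

for _ in range(120):  # simulate two seconds
    step([sand, water])
print(sand["vx"] < water["vx"])  # friction slows the grain: True
```

One loop, two behaviors: that is roughly the promise of folding granular materials into the same solver family as fluids and rigid bodies, since interaction between materials then comes for free rather than through a bridge between separate systems.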
But Next Limit hasn’t ignored the Hybrido solver, which handles the larger fluid simulations, such as boats plowing through waves or water crashing up on craggy rocks. Its speed, stability and memory management have all improved.