The Best Tech and Tools of 2019
I have to say that 2019 was a very good year for tech in animation and visual effects — probably more so than any year I have put this list together for. Let us not tarry:
Unreal Engine 4.23. Always a popular one, and with the incorporation of RTX tech there is no place to go but up. Most significant to me are the virtual production tools that Epic has developed or supported, streamlining the move of what was once post-production work directly into the production and pre-production process.
Nvidia RTX Series. It was announced two SIGGRAPHs ago, but the technology is spreading like a virus through gaming and production alike, in cards like the 2080 Ti, Titan RTX and the industry-level Quadro RTX 6000.
Isotropix Clarisse. It may not be new, but it made the list nevertheless, and I had my first chance to dig into it a bit this year. The amount of data Clarisse can chomp on without a significant slowdown is astounding.
Apple ARKit 3 and RealityKit. Here you have an accessible augmented reality development toolset right on your iOS device. The latest release includes people occlusion and motion capture, which detect when people are in your camera frame and allow them to interact with, or be immersed in, the AR environment.
Oculus Insight’s inside-out tracking. Offers the ability to determine the position of a VR headset without external sensors or equipment by using SLAM (Simultaneous Localization and Mapping).
Cached Playback in Maya 2019. This speeds up viewport animation playback. There is nothing more important to an animator than seeing animation in real time, and if you don’t have to pre-render for review, you save gobs of time.
Blender 2.8. Blender has been around for a long time, and given its open source nature, it’s taken a while for it to mature. It is now starting to take business away from the big guys and democratize the 3D world.
Rust programming language. This open-source programming language, similar in scope to C++, has an emphasis on memory safety and concurrency. It has been Stack Overflow’s most-loved language for the last four years running.
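Those memory-safety and concurrency guarantees come from Rust's ownership model: every value has a single owner, and the compiler rejects use-after-move and cross-thread data races at compile time. A minimal sketch (the function and variable names here are illustrative, not from any particular codebase):

```rust
use std::thread;

// Count items on a worker thread. Ownership of `tools` moves into the
// closure, so the compiler guarantees no other thread can race on it.
fn count_on_thread(tools: Vec<String>) -> usize {
    let handle = thread::spawn(move || tools.len());
    handle.join().expect("worker thread panicked")
}

fn main() {
    let tools = vec![
        "Maya".to_string(),
        "Houdini".to_string(),
        "Blender".to_string(),
    ];
    // let peek = &tools; // fine: borrowing before the move is allowed
    let n = count_on_thread(tools);
    // println!("{:?}", tools); // would NOT compile: `tools` was moved above
    println!("counted {} tools", n);
}
```

The commented-out last line is the whole pitch: the class of bug it represents (use-after-move, and by extension use-after-free) simply never reaches a running build.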
USD 19.11. Universal Scene Description was developed by Pixar and is available as open source. It’s a way to define the elements of a scene, and the parameters of those elements, in a form that remains consistent from department to department. It’s not new this year, but the amount of traction it has gained in the industry is phenomenal. The 19.11 release is pretty, pretty robust.
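To give a flavor of what that department-to-department consistency looks like, here is a minimal hand-written USD layer in its ASCII `.usda` form (the prim names and values are illustrative): a transform containing a sphere, with geometry attributes that a layout, lighting or lookdev department would all read identically.

```usda
#usda 1.0
(
    defaultPrim = "Ball"
)

def Xform "Ball"
{
    def Sphere "Geom"
    {
        double radius = 2.0
        color3f[] primvars:displayColor = [(1, 0, 0)]
    }
}
```

Because layers like this can be composed and overridden non-destructively, each department can contribute its own opinions (a new shader, a moved light) without stomping on upstream work.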
SideFX Solaris and LOPS. This is a new context within Houdini to support look dev, lighting and layout tasks using the aforementioned USD — and it’s a huge deal.
Deepfake. Yes, this technology is providing us with lots of smiles. How can you not want to see Nicolas Cage in every movie? But as it advances — and in deep learning, the more information you feed it, the more advanced it gets — the more mind-blowing and frightening the results are. I’m using it in an actual production to change news interviews — for a fictitious timeline.
3Delight + NSI API. 3Delight is a RenderMan-compliant render engine, and its Nodal Scene Interface (NSI) API is what is used to describe the 3D scene. It replaces the outdated API with a simple but powerful way to access the engine.
Autodesk Bifrost. Starting out as a fluid simulator in Maya, Bifrost has grown into an ICE-like node-based workflow that Maya users have been lusting for since the dawn of man, and that ex-Softimage users have been missing for nearly as long.
Google Maps AR. It uses the camera on your device to allow Google Maps to better determine where you are by recognizing architecture around you, and it also places signs and directions into the environment (through the viewport of your device) to help show where you need to go. We just need to incorporate Pokémon GO! and the world will be perfect.
Unity 2019 HDRP. While you can dev for multiple platforms, Unity has set up different pipelines targeting specific platforms. The High Definition Render Pipeline is for your PCs and consoles, where you can push the performance closer to photoreal levels. It’s the ideal tool for developing and/or converting projects for high fidelity in a real-time environment.
Azure Kinect DK. This is the next step up from the Kinect, using AI to assist with the vision and hearing systems, utilizing a 12-megapixel RGB camera along with a 1-megapixel depth camera, an orientation sensor and a seven-microphone array. While the original Kinect was mainly focused on games and entertainment, the Azure Kinect DK is robust enough to make its way into more professional and industrial applications. The volumetric capture allows for interactive VR experiences with performances. Pretty fancy! ◆