Animation Magazine

Tech Reviews


Autodesk used to release its M&E products at the same time, giving you a burst of new features in Maya, Max, MotionBuilder, etc. However, now it appears that with the migration to a subscription-based model, Autodesk is pushing out releases and updates throughout the year without all the fanfare that existed in the past. For instance, Maya 2018 had its debut at this year’s SIGGRAPH, and within two months, 2018.1 is already out and available for subscribers. It is a tiny, almost utilitarian point release in comparison to 3DStudio Max 2018 (which we reviewed last month), but it is indicative of a development approach that puts forth the idea that some features or fixes don’t need to sit around for a year waiting for other, farther-reaching features to be ready. You just quietly send out new stuff to the subscriber base.

So what about this subscriber model, anyway? I’ve heard pros and cons for the shift ever since Adobe really went big with it, and I can see both sides. You have those who would prefer to buy a car and own it rather than renew a lease every two years and continually be driving the latest model. But from a developer standpoint, it seems like everyone is moving toward the subscription model — and enough users don’t mind that it isn’t going to change anytime soon. Autodesk is sweetening the deal beyond the consistent updates with Cloud Rights, which gives you the ability to launch UI-less licenses of Max or Maya in the Cloud in order to expand your computing power. The license (or licenses, if you have a multi-user subscription) covers renders, simulations or caching.

But enough with the boring stuff. What’s new in Maya 2018? Well, it seems like everyone is excited about a new UV workflow with tools that make laying out UVs faster and more efficient. And if you’ve read my past articles, you know how much I’m not a fan of UVs. Anything to speed along the process is a godsend.

The love is spread out through the process. XGen keeps getting more robust with clumping in interactive grooming. Animation and rigging have some UI updates to make the process less cluttered, with rig controllers that turn on and off based on cursor proximity. And mographers are still hitting their stride with tools like advanced text tools, direct dynamics through MASH and, most significantly in my book, a Live Link to After Effects — which up to now has really been the domain of Cinema4D.

Of course, I won’t be able to fit everything in this space. And now with updates coming more frequently, I’m not going to be able to stay on top of everything. But I promise to do my best!

Website: www.autodesk.com/products/maya
Price: $1,470 for annual subscription; $185 monthly

those characters. If you are using headgear, then the performer can move around — because physical motion does affect the performance — without the camera losing the facial features. We now need to capture the body motion!

This is where the Xsens MVN suit comes in. For my demonstration, we had the latest Awinda suit, which consists of a shirt and a bunch of straps that you loop around your limbs at key points. Each strap holds a matchbook-size wireless tracker with a bunch of technology in it that measures acceleration and magnetic fields and stuff to assess where it is in space. All that data is fed back to a receiver which interfaces with iClone! The higher-end MVN Link suit is a complete Lycra outfit with the trackers wired together, which can sample at a higher rate for more fidelity. But we aren’t assessing pro athlete dynamics or planning for the best prosthetic to replace a leg — I’m just running and jumping around my house like a fool, so the Awinda is perfectly fine. No matter where I ran around, back at my HP Mobile Workstation, we were recording both my antics and my face, and applying it to the 3D character on the desktop.

That recorded data can now be post-processed after the performance — either through sliders, which tweak the performance on the fly as you watch it play, or, if you want to dig in deep, you can edit at a keyframe level. We can now refine the data within iClone itself and get some cool stuff going — especially since the latest iClone is getting into the Physically Based Rendering game, and integrating real camera data. But what if we have a broader pipeline that uses Maya, or Max, or we are using Unreal or Unity? Reallusion has 3DXchange for both importing assets and data, and exporting them out to a plethora of other 3D packages.

The overall setup time for the whole system was maybe a half hour, which included making sure that all the software was set up on my Mobile Workstation with the latest and greatest builds; getting the right-sized headgear for my big head; calibrating the cameras; and then getting the suit situated. (I totally recommend getting a friend to help out with this. Putting the suit on alone is a bit frustrating. Maybe not as frustrating as, say, tying a bow tie, but it’s way easier with someone to help out.) The headgear fits in a small Pelican case, including the cables, batteries, adapters and Teradek video transmitter. The Xsens fits in a backpack, and iClone fits inside your laptop. You are quite literally a walking motion-capture studio.

My mind is swirling with applications from previz and pre-production on films, to planning shoots in VR (blocking out actors in a space before even getting to the space), as well as even non-film applications (God forbid), like dance analysis, or martial arts schools — any place where feedback on physical performance is necessary to bring people to the next level. And the portability of the whole system makes it convenient and accessible.

All of these tools fitting together gets me excited about how this can benefit independent artists who may not have access to a large motion-capture volume, or animation teams that just want to quickly block out beats in a performance and then add on top of it. We are living in the future, people!

[This demonstration was accomplished using an HP ZBook 15 G3 running Windows 7.]

iclone.reallusion.com
facewaretech.com
www.xsens

Todd Sheridan Perry is a visual effects supervisor and digital artist who has worked on features such as The Lord of the Rings: The Two Towers, Speed Racer and Avengers: Age of Ultron. You can reach him at todd@teaspoonvfx.com.

says there are too many factors that are beyond our control. “What I want for VR is a different story,” he notes. “I want it to be part of our daily lives. I want it to be used as a tool to enhance our compassion. I believe VR can help us create a world to our measure: more compassionate, more integral, more profound … and I’d love to see us take full advantage of all that it has to offer.”

The director also points out that camera movement is going to play a huge role in the future of VR. “That passive interactivity is something we should start to work on more deeply,” he says. “There is huge potential in the fact that we can have narratives that completely immerse you in the story even if they are animated. I also learned a lot about the importance of sound and music, and the tremendous ability of VR to generate engagement with the characters.”

Melita, which is available exclusively for the Oculus Rift VR headset, has already received praise from both the tech and animation communities. Alcalá says he has been quite touched by the heartwarming response the short has generated. “It leads me to believe that we are reaching people on an emotional level, which is what you strive for as a storyteller,” he says. “The fact that people are having discussions after seeing our short, about climate change, about our future as a species on this Earth, as well as our relationship to AI … that is profoundly rewarding.” For more information, visit futurelighthouse.com/melita.

In the early 16th century, a warrior-monk vanquished a powerful demon by pinning it to a rock with the Beast Spear, an enchanted weapon forged in China centuries earlier. Five hundred years later, high school student Ushio Aotsuki (David Matranga), a descendant of the monk, discovers the demon is still transfixed to the cellar wall of his family’s Shinto shrine. Although he looks like a cross between a man and a lion, the demon’s ochre color and stripes earned him the name Tora (“tiger,” voiced by Brett Weaver).

After Tora persuades Ushio to free him, the teenager faces two grave problems: a monster intent on devouring him and a powerful magical weapon he doesn’t know how to wield. Fortunately, the Beast Spear has a mind of its own, and Ushio is able to keep his foe in check. But freeing Tora has attracted hordes of minor demons and yokai (“monsters”). They begin attacking people, including Ushio’s two friends: dark-haired, straightforward Asako Nakamura (Allison Sumrall) and gentle Mayuko Inoue (Luci Christian). To protect his friends and defeat the weird-looking yokai, Ushio must work with Tora, despite their mutual misgivings.

Tora has to cope with the changes 500 years have brought to life in Japan. He learns to dodge cars and buses, and discovers that while they’re not as tasty as humans, he likes hamburgers. Although Tora insists he’ll devour Ushio, and

