LIVECGX brought CG-enhanced improvisational performance to London Fashion Week – and it was all controlled by the performer
The ILMXLAB technical PM talks motion capture in the fashion world
A little over a year ago, my manager pitched an idea to me. He said, “Just hear what I have to say with an open mind. If you’re not in, okay, but it’s impossible to imagine you won’t be.” The result was LIVECGX, which represents the first step towards a completely new type of live performance – one that connects real-time visual effects and human-driven expression. It’s an evolution of Lucasfilm’s rich history with technology and storytelling.
LIVECGX has the potential to be used for any film franchise, as well as non-movie applications like sports, music and fashion. The latter shaped our first public deployment: a fashion and storytelling collaboration for 2018’s London Fashion Week (LFW) with designer Steven Tai (steventai), arranged via Matthew Drinkwater of the University of the Arts London’s Fashion Innovation Agency and the GREAT British campaign.
Steventai’s Autumn/Winter 2018 fashion presentation marked the global debut of LIVECGX. It was used to digitally transform the venue as well as pieces from steventai’s Macau-inspired collection. On a giant LED display in Durbar Court, located in the Foreign and Commonwealth Office in Westminster, London, we saw the setting transform in real time with elements from Macau layered onto the environment. While live models showed off steventai’s latest Autumn/Winter collection on stage, another model performed in a motion-capture suit. Her avatar was visible within the environment on screen, modelling two steventai-designed digital garments.
While historically much has been done with sophisticated projection in highly choreographed performances, what made this presentation unique was that the digital elements were responding in real-time to an improvisational performance. Rather than the digital presentation controlling the performance, now the performer was driving the presentation.
Microsoft Kinects were used to capture the depth buffer of the people on stage and in the audience, because we wanted to composite the CG elements with the live video feed of Durbar Court in real time. This allowed for convincing interaction between the avatar and the models – they could hug each other, look at each other and pass by one another. Eric Landreneau piped the depth buffer into Unreal Engine’s compositing module, Composure, to accomplish this. Peter Malnai also devised a waypoint-driven traversal system so that a CG character could auto-locomote between waypoints without colliding with real-life objects. The system was set up so that a gestural trigger started the auto-movement, which was shown in the finale when the avatar walked among the on-stage models.
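The two ideas described above – nearer-surface-wins depth compositing, and gesture-triggered waypoint walking – can be sketched in a few lines. This is a minimal illustration only, not ILMXLAB’s actual code: the function names, the flat per-pixel lists and the 2D coordinates are hypothetical stand-ins for the engine-side buffers and navigation data.

```python
import math

def depth_composite(live_rgb, live_depth, cg_rgb, cg_depth):
    """Per-pixel occlusion test: whichever surface is nearer wins.
    live_* stands in for the Kinect feed, cg_* for the engine render."""
    return [lp if ld <= cd else cp
            for lp, ld, cp, cd in zip(live_rgb, live_depth, cg_rgb, cg_depth)]

def step_toward(pos, target, speed):
    """Advance a 2D position one tick toward a waypoint.
    Returns the new position and whether the waypoint was reached."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:
        return target, True
    return (pos[0] + dx / dist * speed, pos[1] + dy / dist * speed), False

def auto_locomote(pos, waypoints, speed, gesture_triggered):
    """Walk the waypoint chain, but only once a gestural trigger fires.
    Waypoints are assumed to be pre-placed clear of real-world objects."""
    if not gesture_triggered:
        return pos
    for wp in waypoints:
        reached = False
        while not reached:
            pos, reached = step_toward(pos, wp, speed)
    return pos
```

Note that in this sketch the collision avoidance lives entirely in where the waypoints are placed, which mirrors the show setup: because the stage layout was known in advance, the character only ever walks pre-cleared paths between waypoints.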
The garments were created in Marvelous Designer, with cloth simulation in Unreal Engine. In Unreal, Omar Skarsvaag simulated the movement and drapery of the items of clothing. Yoon Kim and Mohammad Modarres modelled and look-dev’d the Origin and Destination garments shown in the presentation.
The environment enhancements were created by Ben Nadler and Tommy Alvarez Rodriguez. Throughout the show, the audience could see the environment change from Macau’s jungle to Macanese neon signs. We wanted to show that we could transform both the environment and garments in real time.
Lucasfilm was extremely fortunate to have access to the ILM Chiswick Stage. Matt Rank, Chris Jestico, Jack Brown and Laura Millar were all very helpful, not only in loaning us their set of Vicon Vero cameras, but also in setting up and testing the cameras at Durbar Court. The Vicons were used to build a motion-capture stage at the show so that Vita Oldershaw, our mocap artist, could drive the performance of the avatar.
The next step for LIVECGX will be to explore letting audiences view the content through handheld devices. We would also like to get this sandbox toolset into performers’ hands to create a unique experience for their audience members.
We would also like to scale LIVECGX up to be part of a live sports event or concert in a larger venue. Ron Radeztsky is already planning a refined code architecture that can scale to deployments at events like these, and we’d like to enhance the sandbox nature of LIVECGX as a platform. LFW was exhilarating for the team, and we would love to continue our fashion collaborations in the near future as well.
About the LFW 2018 Collaborators ILMXLAB is Lucasfilm’s immersive entertainment division, based in San Francisco, California. Steventai, London College of Fashion’s Fashion Innovation Agency and the GREAT British campaign collaborated with ILMXLAB to bring this unique fashion presentation to life.