Vicon: Masters of Motion Capture
Tim Doubleday tells us how Vicon are continuing to push the boundaries of motion capture, including their work on the real-time digital character Siren
“Motion capture or the idea of recording someone’s motion has been around for a very long time,” says Vicon’s VFX product manager Tim Doubleday. He refers to rotoscoping, a technique that involves a human tracing over film footage to produce realistic movement. “It was invented by Max Fleischer all the way back in 1915,” he adds.
Doubleday defines mocap as the ability to record a person’s movement, including everything from facial to finger movement. “The kind of optical motion capture that we know today came to prominence in 2001 thanks to Andy Serkis’s performance as Gollum in The Lord of the Rings,” he explains. “Nowadays, motion capture is used not only in the majority of video games and animated movies but also in technologies like robotics, VR and even drone tracking.”
Vicon has been leading the charge in motion capture for the past 30 years, having grown out of research undertaken by two PhD students at the University of Strathclyde in the late Seventies. The company itself was then established in 1984 as a management buy-out from the Oxford Instruments Group.
“It was the introduction of realistic 3D graphics that propelled the use of motion capture in the entertainment industry,” explains Doubleday. “The popularity of gaming systems like the PlayStation and Xbox meant that game developers needed realistic human motion in their games. Whether it’s FIFA, Call of Duty or Fortnite, the majority of today’s video games use motion capture.”
As they continue to pioneer the technology, Vicon have been involved in bringing the real-time digital human Siren to life, along with Epic Games and Tencent. “Originally designed as a demo for a Chinese audience, Siren was re-imagined as a digital human showcase at GDC 2018,” continues Doubleday. “This meant finding a new English-speaking actress, Alexa Lee, who performed as Siren during the show. Cubic Motion delivered stunningly realistic facial animation all in real-time and Vicon supplied the body and finger animation. All this technology brought Siren to life and helped deliver a believable and engaging digital human live on stage.”
As Doubleday explains, Siren has grand implications for the future of mocap: “Traditionally, high-end performance capture has been used to create digital characters like Supreme Leader Snoke in the new Star Wars films or the more stylised characters seen in Ready Player One. These movies all require a ton of work in post and take hours to render each frame. Video games now feature hyper-realistic characters, like those seen in Detroit: Become Human. While these are rendered in real-time they still require a lot of animation work to make them look so believable.”
He continues: “Siren was designed to not only run in real-time, but also be driven in real-time thanks to 3Lateral’s stunning facial rig that runs, in-engine, at 60fps as well as Cubic Motion’s low-latency facial solver. This allowed Siren to deliver a high level of performance all in real-time. While the quality won’t be as high as in some film and game characters, it’s a big leap forward and allows animators and artists to focus on adding their final touches to the performance.”
Vicon were responsible for animating Siren’s hands and body throughout the project, utilising 18 of their own low-latency cameras and their motion-capture platform, Shogun. “Latency was key for this project. If the animation was out of sync with Lexi’s performance the whole piece would fall apart,” Doubleday admits. “To reduce latency further, we solved the motion-capture markers directly onto the Siren custom skeleton. This was then streamed directly into Unreal Engine using Epic’s new Live Link system.”
Producing realistic finger animation required Vicon to employ an entirely new solution: “We placed markers on each of Lexi’s fingers in a zig-zag pattern. This allowed for each finger to be animated individually without letting the markers get so close together that they would be hidden. This helped deliver high-fidelity finger animation running in real-time.”
Doubleday continues: “Each morning at GDC we would adjust the Siren skeleton based on the new marker positions used on Lexi’s suit. To do this, we would get Lexi to stand in an ‘A pose’ with her hands facing forwards. Since we were streaming in real-time, we could see the results in both Shogun and Unreal. We could then pause the real-time stream, make adjustments to the skeleton and see the results update live in Unreal.”
Siren represents a milestone in motion-capture technology, but the technology continues to develop at an astonishing rate. Doubleday has ideas about where it might be headed: “While motion-capture systems are still considered high-end solutions, the cost has dropped dramatically over the last five years. It’s not quite at the point of having a Vicon system in your living room, but the introduction of location-based VR means you might see one in your local cinema complex.
“One recent example is Dreamscape Immersive’s Alien Zoo virtual reality experience, recently launched in Los Angeles, which allows six people to go on a virtual safari and interact with each other as well as the animals they encounter on the way. Each participant wears a cluster of markers on their hands and feet, with additional clusters on the backpack PC and VR headset. These six clusters are used to create a realistically proportioned 3D character within the virtual world. This allows people to shake hands and interact with each other, making for an incredibly immersive experience. Location-based virtual reality is a way for you and a load of your friends to all go and experience virtual reality in a social environment.”
Far left: Cubic Motion provided the facial performance capture, tracking, solving and animation for Siren
Left: Siren’s facial rig and underlying controls were both provided by 3Lateral, which also handled all of the 3D and 4D scans of the performance
Below: Since the 1980s much of Vicon’s work has been in motion capture for clinical gait analysis
The character of Siren was created using the likeness of Chinese actress Bingjie Jiang