Produce interactive volumetric videos
Learn how to use Holosuite, a collection of post-production tools developed by Arcturus Studios, with this quick guide
Volumetric video captures reality as it really is – but instead of being locked to a fixed point of view, you can examine a performance from any angle.
For users looking to record a performance in three dimensions, volumetric video offers realistic capture with 6DOF – if you have the right tools. You can record nuanced performances without needing to rely on artistic interpretation, or worrying about bumping up against the uncanny valley. With the right capture system, you can record the world with all the fidelity of film, and still take full advantage of both classic and cutting-edge 3D tools.
This tutorial will introduce the basics of working with volumetric video, and highlight how Holosuite can make that process much easier, starting with editing the performance of a volumetric capture using automatically generated skeletons. Adding a skeleton to a clip paves the way to tracking the positions of an actor’s body or limbs, animating in props or adding VFX around the actor, and applying real-time IK retargeting to their head or limbs. We will show you how an actor recorded on a capture stage (at Crescent Studio in Tokyo) looks when they become a moving target in a Unity scene.
WHAT IS ARCTURUS’ HOLOSUITE?
Holosuite is a collection of post-production tools developed by Arcturus Studios, consisting of two pieces of software: Holoedit and Holostream. Together they make it possible to edit, compress and stream volumetric video. Holosuite can be used as a standalone tool, or through plugins built to work with volumetric data in Maya, Mari, Unreal and Unity.
Holoedit is designed for post-production and compression of volumetric video. You can preserve the visual quality of the capture while gaining the ability to dynamically manipulate it – something that previously required a full motion capture setup. Using Holoedit’s non-linear editing features, you can compress and edit volumetric video by taking advantage of the Holocompute cloud computing services to process parallel jobs.
The other tool in the suite, Holostream, delivers on-demand, adaptive streaming for volumetric video. Users can deliver content to desktop PCs, mobile devices or VR headsets like Oculus Quest, without a loss of quality due to an unstable broadband or wireless connection. These streaming APIs offer the flexibility to integrate volumetric streaming experiences on the web, in-engine or in AR/VR.
WORK WITH VOLUMETRIC DATA
Traditionally in 3D animation, a single model is manipulated in one of three ways: directly by the artist, through motion capture data that sets its poses, or procedurally, with poses interpolated over a range of frames. Volumetric video, however, generally contains independent mesh information and texture information for each frame, more like traditional film.
One of the most important things to be aware of when you are working with volumetric video is whether or not your volumetric data is stabilised. Stabilised data, also known as filtered or tracked data, is volumetric data that contains consistent topology and UVs over ranges of frames in a clip. Even though the topology and UVs might match over the range of frames that makes up a segment, each mesh is still its own unique element in each and every frame.
Working with stabilised data is an important element for achieving great compression with volumetric data. Holosuite uses stabilisation to reduce the manual work required from an artist, by allowing edits to segments of data, rather than having to edit every frame in a clip. Many capture stages provide volumetric data that is already stabilised, but if you’re working on unstabilised data, we recommend you use the ‘Stabilize Mesh’ stage in Holoedit to stabilise it.
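The property that stabilisation guarantees can be sketched in a few lines of code. The check below is not Holoedit’s implementation – just a minimal illustration of what it means for a segment to be stabilised: every frame shares one set of triangle indices and one UV layout.

```python
import numpy as np

def is_stabilised(segment_faces, segment_uvs):
    """Check whether every frame in a segment shares the same topology and UVs.

    segment_faces: list of (F, 3) integer arrays (triangle indices), one per frame.
    segment_uvs:   list of (V, 2) float arrays (UV coordinates), one per frame.
    """
    ref_faces, ref_uvs = segment_faces[0], segment_uvs[0]
    for faces, uvs in zip(segment_faces[1:], segment_uvs[1:]):
        if faces.shape != ref_faces.shape or not np.array_equal(faces, ref_faces):
            return False  # topology changed mid-segment
        if uvs.shape != ref_uvs.shape or not np.allclose(uvs, ref_uvs):
            return False  # UV layout changed mid-segment
    return True
```

Only the per-frame vertex positions are allowed to vary within a stabilised segment – which is exactly what makes segment-level edits and compression possible.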
Working with volumetrics can take up a lot of digital storage space. Being able to offer compact deliverables is the first step in making the format accessible to a broad audience. Knowing your delivery target and planning your projects with it in mind will also help you make the best decisions for texture resolution and mesh density.
Texture padding
Having good texture padding will give you more flexibility for resizing your texture resolutions. At least three pixels of padding will keep your materials looking great when downsizing your texture resolution.
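As a rough illustration of why three pixels is a sensible floor: padding shrinks in proportion to the texture, so a quick bit of arithmetic (purely illustrative, not part of Holoedit) shows how much padding survives a downsize.

```python
def padding_after_downsize(padding_px, src_res, dst_res):
    """Effective padding (in pixels) after resizing a square texture.

    Padding scales linearly with resolution, so halving the texture
    halves the padding around each UV island.
    """
    return padding_px * dst_res / src_res
```

Three pixels at 4,096 × 4,096 still leaves 1.5 px after one halving to 2,048 × 2,048 – enough to avoid bleeding at UV seams – whereas a single pixel of padding would already be gone.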
01 GET STARTED IN HOLOEDIT
Holoedit uses project structures called ‘Workspaces’. Your data will be stored here locally, and both mesh and texture files, as well as Holoedit compositions, can be accessed within the active workspace while using Holoedit.
Creating a clip in Holoedit is as easy as dragging a folder containing volumetric data into your Holoedit workspace, directly in the explorer window, and then pressing the Make Clip button. That’s it. Now you are able to view your clip in the viewport and timeline when you drag it into the composition window.
Holoedit’s stages allow for the non-linear editing of volumetric video. Each stage has a unique process that affects the mesh, texture and/or animation data from the stages above it. Stages can be added to your clip’s track in the composition window, and each processed stage will affect the subsequent stages that use the same data type.
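Conceptually, a track of stages behaves like a small pipeline. The sketch below uses hypothetical class names – it is not Holoedit’s API – but it captures the ordering rule described above: each stage receives the output of the stages above it.

```python
# Hypothetical sketch of a stage track; not Holoedit's actual API.
class Stage:
    def process(self, clip):
        """Return a modified copy of the clip data."""
        raise NotImplementedError

class StabilizeMesh(Stage):
    def process(self, clip):
        clip = dict(clip)
        clip["stabilised"] = True  # placeholder for real mesh tracking
        return clip

class GenerateSkeleton(Stage):
    def process(self, clip):
        clip = dict(clip)
        clip["skeleton"] = "auto-generated"  # placeholder for real skeletonisation
        return clip

def run_track(clip, stages):
    """Apply stages top to bottom; each consumes the previous stage's output."""
    for stage in stages:
        clip = stage.process(clip)
    return clip
```

Because later stages only see the processed output of earlier ones, re-running a stage automatically invalidates everything below it on the track.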
02 GENERATE A SKELETON
Using Holoedit you can automatically generate a skeleton for any humanoid subject in a volumetric clip. Add a ‘Generate Skeleton’ stage using the Add Stage button in the composition window. In the timeline you can then create an interval for the duration of the clip; just click and drag to select the relevant frame range, then right-click and select Create Interval.
Intervals establish which frames will be processed with interval-specific parameters, and they hold the resulting data once processing has been completed.
Executing an interval will upload a job to the holographic processing servers (HPS) for quick, parallel processing. Once this is complete, the results will return the skeleton data for the clip. Frame-to-frame jitter in the skeleton can be smoothed by using the ‘Stabilize Skeleton’ stage.
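Smoothing skeletal jitter is typically a low-pass filter over the joint positions. The following is a generic moving-average sketch, not the actual ‘Stabilize Skeleton’ implementation, but it shows the basic idea: average each joint’s position over a small window of neighbouring frames.

```python
import numpy as np

def smooth_joints(positions, window=5):
    """Smooth per-frame joint positions with a centred moving average.

    positions: (frames, joints, 3) array of joint positions.
    window:    odd number of frames to average over.
    """
    kernel = np.ones(window) / window
    # Pad with edge values so the output keeps the same number of frames.
    pad = window // 2
    padded = np.pad(positions, ((pad, pad), (0, 0), (0, 0)), mode="edge")
    out = np.empty_like(positions, dtype=float)
    for j in range(positions.shape[1]):
        for axis in range(3):
            out[:, j, axis] = np.convolve(padded[:, j, axis], kernel, mode="valid")
    return out
```

A wider window removes more jitter but also softens fast, deliberate head and limb motion, so the window size is a trade-off.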
Next, we’ll compress the data, then generate the head retargeting skin weights.
03 COMPRESSION OF SKIN WEIGHTS
After skeletonisation, the clip can be compressed using SSDR. To add compression, SSDR needs to run before the ‘Generate Skin Weights for Head Retargeting’ stage. The SSDR stage encodes each stabilised segment as one animated mesh, and stores its poses as a skeletal animation that can then be efficiently decoded during playback.
You are storing a single mesh per segment in the output OMS (Open Mesh Sequence) file, so you’ll see significant compression. The final compressed mesh stream is now ready for retargeting skin weights to be generated.
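SSDR (Smooth Skinning Decomposition with Rigid bones) encodes a mesh sequence as a rest mesh, per-vertex bone weights and per-frame bone transforms; decoding is standard linear blend skinning. The sketch below shows the decode side under that assumption – it is illustrative, not the shipping codec.

```python
import numpy as np

def lbs_decode(rest_verts, weights, rotations, translations):
    """Reconstruct one frame from an SSDR-style encoding via linear blend skinning.

    rest_verts:   (V, 3) rest-pose vertices, stored once per segment.
    weights:      (V, B) per-vertex bone weights (each row sums to 1).
    rotations:    (B, 3, 3) per-bone rotation matrices for this frame.
    translations: (B, 3) per-bone translations for this frame.
    """
    # Each bone's rigid transform applied to the whole rest mesh: (B, V, 3)
    transformed = np.einsum("bij,vj->bvi", rotations, rest_verts) \
        + translations[:, None, :]
    # Blend the per-bone results by the skin weights: (V, 3)
    return np.einsum("vb,bvi->vi", weights, transformed)
```

Storing one mesh plus a handful of bone transforms per frame, instead of a full mesh per frame, is where the significant compression comes from.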
Using the ‘Generate Skin Weights for Head Retargeting’ stage, the weights can be automatically generated in Holoedit. For non-head skin weights, or especially complex clips, you can export your clip data and use our Maya plugin to paint and automatically propagate the skin weights across the clip. Once that’s done, you’re ready for export.
04 EXPORT AN OMS
The composition can be exported as an OMS file containing the mesh and animation data, plus an MP4 containing the texture data. OMS supports fast, compressed playback of any volumetric capture, with playback targets for the web, streaming and popular game engines, as well as local playback.
OMS maximises compression using temporal compression, mesh compression and sequence bytestream compression.
Pre-shoot preparation
A bit of extra planning on shoot day will go a long way to achieving the best results. When shooting a clip that will have head retargeting, make sure the neck is not obstructed by the wardrobe and that hair is not sitting on or past the actor’s shoulders.
05 PLAYBACK IN UNITY OR UNREAL
Holosuite offers plugins to support playback for Unity and Unreal in-engine, as well as streaming support. To achieve dynamic retargeting, we’ll use either the Unreal or Unity OMS player plugin to support IK head retargeting.
With an OMS player, retargeting the head can be set to target an object or camera, and the rotation values clamped to deliver an interaction with the actor in the clip that feels natural and real.
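Clamping can be as simple as limiting the yaw and pitch of the look-at rotation. The helper below is a generic sketch, not the plugin’s API, and assumes a Y-up coordinate system with the head initially facing down +Z.

```python
import math

def clamped_head_yaw_pitch(head_pos, target_pos,
                           max_yaw_deg=60.0, max_pitch_deg=30.0):
    """Yaw/pitch (degrees) aiming the head at a target, clamped to a natural range.

    Assumes Y-up coordinates with the head's rest direction along +Z.
    The limits are illustrative defaults, not values from the OMS player.
    """
    dx = target_pos[0] - head_pos[0]
    dy = target_pos[1] - head_pos[1]
    dz = target_pos[2] - head_pos[2]
    yaw = math.degrees(math.atan2(dx, dz))                  # turn left/right
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))  # look up/down
    # Clamp so the head never twists beyond a believable range.
    yaw = max(-max_yaw_deg, min(max_yaw_deg, yaw))
    pitch = max(-max_pitch_deg, min(max_pitch_deg, pitch))
    return yaw, pitch
```

Without the clamp, a target passing behind the actor would spin the head through anatomically impossible angles; with it, the head simply stops tracking at the limit.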
With photoreal captures and real-time retargeting of volumetric video, the Holosuite collection of tools powers not just immersive, but truly interactive, video.•
Delivery targets
Keep your delivery targets in mind when exporting volumetric video. Be aware of platform limitations for texture resolution or performance limitations when planning your clips.