3D World

Produce interactive volumetric videos

Learn how to use Holosuite, a collection of post-production tools developed by Arcturus Studios, with this quick guide


Volumetric video captures reality as it really is, but instead of having a fixed point of view, you can examine a performance from any angle.

For users looking to record a performance in three dimensions, volumetric video offers realistic capture with six degrees of freedom (6DOF) – if you have the right tools. You can record nuanced performances without needing to rely on artistic interpretation, or worrying about bumping up against the uncanny valley. With the right capture system, you can record the world with all the fidelity of film, and still take full advantage of both classic and cutting-edge 3D tools.

This tutorial will introduce the basics of working with volumetric video, and highlight how Holosuite can make that process much easier, starting with editing the performance of a volumetric capture using automatically generated skeletons. Adding a skeleton to a clip paves the way to tracking the positions of an actor’s body or limbs, animating in props or adding VFX around the actor, and applying real-time IK retargeting for their head or limbs. We will show you how an actor recorded on a capture stage (at Crescent Studio in Tokyo) looks when they become a moving target in a Unity scene.

WHAT IS ARCTURUS’ HOLOSUITE?

Holosuite is a collection of post-production tools developed by Arcturus Studios, consisting of two pieces of software: Holoedit and Holostream. Together they make it possible to edit, compress and stream volumetric video. Holosuite can be used as a standalone tool, or through plugins built to work with volumetric data in Maya, Mari, Unreal and Unity.

Holoedit is designed for post-production and compression of volumetric video. You can preserve the visual quality of the capture while gaining the ability to manipulate it dynamically – a capability that previously required a full motion capture setup. Using Holoedit’s non-linear editing features, you can compress and edit volumetric video by taking advantage of the Holocompute cloud computing services to process parallel jobs.

The other tool in the suite, Holostream, delivers on-demand, adaptive streaming for volumetric video. Users can deliver content to desktop PCs, mobile devices or VR headsets like Oculus Quest, without a loss of quality due to an unstable broadband or wireless connection. The streaming APIs offer the flexibility to integrate volumetric streaming experiences on the web, in engine or in AR/VR.

WORK WITH VOLUMETRIC DATA

Traditionally in 3D animation, a single model is manipulated in one of three ways: directly by the artist, through motion capture data that sets the poses, or procedurally, with poses interpolated over a range of frames. Volumetric video, however, generally contains independent mesh information and texture information for each frame, more like traditional film.

One of the most important things to know when working with volumetric video is whether or not your data is stabilised. Stabilised data, also known as filtered or tracked data, is volumetric data that maintains consistent topology and UVs over ranges of frames in a clip. Even though the topology and UVs might match over the range of frames that makes up a segment, each mesh is still its own unique element in every frame.
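To make the idea concrete, here is a minimal sketch of how segment boundaries could be detected in a mesh sequence by checking where the triangle index list changes between frames. The per-frame dictionary layout is an invented illustration, not Holoedit's internal format:

```python
# Sketch: find "stabilised" segment boundaries in a mesh sequence by
# checking where the triangle index list changes between frames.
# The frame representation here is hypothetical, not Holoedit's format.

def segment_boundaries(frames):
    """frames: list of dicts with a 'faces' key (tuple of index triples).
    Returns the frame indices where a new segment starts."""
    starts = [0]
    for i in range(1, len(frames)):
        if frames[i]["faces"] != frames[i - 1]["faces"]:
            starts.append(i)  # topology changed -> a new segment begins
    return starts

# Tiny example: three frames share topology, then it changes.
tri_a = ((0, 1, 2), (1, 2, 3))
tri_b = ((0, 1, 2),)  # different topology
frames = [{"faces": tri_a}] * 3 + [{"faces": tri_b}] * 2
print(segment_boundaries(frames))  # -> [0, 3]
```

Within each segment the meshes share topology and UVs, which is exactly what lets an edit made once be propagated across every frame of that segment.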

Working with stabilised data is key to achieving great compression with volumetric data. Holosuite uses stabilisation to reduce the manual work required from an artist, by allowing edits to segments of data rather than to every frame in a clip. Many capture stages provide volumetric data that is already stabilised, but if you’re working with unstabilised data, we recommend you use the ‘Stabilize Mesh’ stage in Holoedit to stabilise it.

Working with volumetrics can take up a lot of digital storage space. Being able to offer compact deliverables is the first step in making volumetric video broadly accessible. Knowing your delivery target and planning your projects with it in mind will also help you make the best decisions for texture resolution and mesh density.
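A back-of-envelope estimate shows why those decisions matter. The vertex count, texture size and byte costs below are illustrative assumptions, not Holosuite figures:

```python
# Back-of-envelope sketch: raw, uncompressed size of a per-frame
# mesh + texture sequence. All figures are illustrative assumptions.

def raw_frame_bytes(vertices, texture_px, bytes_per_vertex=32, bytes_per_texel=4):
    # position/normal/UV data per vertex, plus an uncompressed RGBA texture
    return vertices * bytes_per_vertex + texture_px * texture_px * bytes_per_texel

per_frame = raw_frame_bytes(vertices=20_000, texture_px=2048)
per_minute_gb = per_frame * 30 * 60 / 1e9  # 30fps for 60 seconds
print(round(per_minute_gb, 1))  # roughly 31GB per minute, uncompressed
```

Numbers of that order are why stabilisation, SSDR and texture compression are not optional extras but the core of a volumetric pipeline.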

Texture padding
Having good texture padding will give you more flexibility when resizing your textures. At least three pixels of padding will keep your materials looking great when downsizing your texture resolution.

01 GET STARTED IN HOLOEDIT

Holoedit uses project structures called ‘Workspaces’. Your data is stored here locally, and both mesh and texture files, as well as Holoedit compositions, can be accessed from the active workspace while using Holoedit.

Creating a clip in Holoedit is as easy as dragging a folder containing volumetric data into your Holoedit workspace, directly in the explorer window, and then pressing the Make Clip button. That’s it. You can now view your clip in the viewport and timeline when you drag it into the composition window.

Holoedit’s stages allow for the non-linear editing of volumetric video. Each stage has a unique process that affects the mesh, texture and/or animation data from the stages above it. Stages can be added to your clip’s track in the composition window, and each processed stage will affect the subsequent stages that use the same data type.

02 GENERATE A SKELETON

Using Holoedit you can automatically generate a skeleton for any humanoid subject in a volumetric clip. Add a ‘Generate Skeleton’ stage using the Add Stage button in the composition window. In the timeline you can then create an interval for the duration of the clip; just click and drag to select the relevant frame range, then right-click and select Create Interval.

Intervals establish which frames will be processed with interval-specific parameters, and they hold the resulting data once processing is complete.

Executing an interval uploads a job to the holographic processing servers (HPS) for quick, parallel processing. Once this is complete, the results will return the skeleton data for the clip. Frame-to-frame jitter in the skeleton can be smoothed using the ‘Stabilize Skeleton’ stage.
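As an illustration of the kind of smoothing involved, here is a centred moving average over one coordinate of a joint's position track. Holoedit's 'Stabilize Skeleton' stage is a built-in process; this sketch just shows the general idea:

```python
# Sketch: smooth frame-to-frame jitter in a joint track with a
# centred moving average. Illustration only, not Holoedit's algorithm.

def smooth(track, radius=1):
    out = []
    for i in range(len(track)):
        lo, hi = max(0, i - radius), min(len(track), i + radius + 1)
        out.append(sum(track[lo:hi]) / (hi - lo))  # average the window
    return out

noisy = [0.0, 0.2, 0.0, 0.2, 0.0]  # one joint coordinate over 5 frames
print(smooth(noisy))  # jitter amplitude is roughly halved
```

A larger radius gives a smoother track at the cost of softening genuine fast motion, which is the usual trade-off with any temporal filter.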

Next, we’ll compress the data, then generate the head retargeting skin weights.

03 COMPRESSION AND SKIN WEIGHTS

After skeletonisation, the clip can be compressed using SSDR. To add compression, SSDR needs to run before the ‘Generate Skin Weights for Head Retargeting’ stage. The SSDR stage encodes each stabilised segment as one animated mesh, and stores its poses as a skeletal animation that can then be efficiently decoded during playback.

You are storing a single mesh per segment in the output OMS (Open Mesh Sequence) file, so you’ll see significant compression. The final compressed mesh stream is now ready for the retargeting skin weights to be generated.
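Playback of this kind of encoding boils down to linear blend skinning: each vertex of the single stored mesh is deformed by a weighted blend of per-frame bone transforms, v' = Σ_b w_b (R_b v + t_b). The 2D bones and weights below are invented for illustration; they are not Holosuite's data layout:

```python
import math

# Sketch: linear blend skinning playback of an SSDR-style encoding.
# One rest-pose mesh is stored once per segment; each frame stores only
# per-bone transforms (here 2D: rotation angle plus translation).
# Bone count and weights are invented for illustration.

def skin(vertex, weights, bones):
    x = y = 0.0
    for w, (angle, tx, ty) in zip(weights, bones):
        c, s = math.cos(angle), math.sin(angle)
        bx = c * vertex[0] - s * vertex[1] + tx  # rotate then translate
        by = s * vertex[0] + c * vertex[1] + ty
        x += w * bx  # blend each bone's result by its weight
        y += w * by
    return (x, y)

# Two bones: identity, and a pure translation by (1, 0).
bones = [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
print(skin((0.0, 0.0), [0.5, 0.5], bones))  # -> (0.5, 0.0)
```

The saving comes from the bookkeeping: a handful of bone transforms per frame replaces a full vertex buffer per frame.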

Using the ‘Generate Skin Weights for Head Retargeting’ stage, the weights can be automatically generated in Holoedit. For non-head skin weights, or especially complex clips, you can export your clip data and use our Maya plugin to paint and automatically propagate the skin weights across the clip. Once that’s done, you’re ready for export.

04 EXPORT AN OMS

The composition can be exported as an OMS file containing the mesh and animation data, plus an MP4 containing the texture data. OMS supports fast, compressed playback of any volumetric capture, with playback targets for web, streaming and popular game engines, as well as local playback.

OMS maximises compression by combining temporal compression, mesh compression and sequence bytestream compression.

Pre-shoot preparation
A bit of extra planning on shoot day will go a long way towards achieving the best results. When shooting a clip that will have head retargeting, make sure the neck is not obstructed by the wardrobe and that hair is not sitting on or past the actor’s shoulders.

05 PLAYBACK IN UNITY OR UNREAL

Holosuite offers plugins to support in-engine playback in Unity and Unreal, as well as streaming support. To achieve dynamic retargeting, we’ll use either the Unreal or Unity OMS player plugin to support IK head retargeting.

With an OMS player, head retargeting can be set to target an object or camera, and the rotation values clamped to deliver an interaction with the actor in the clip that feels natural and real.
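The clamping idea is just angle maths: work out how far the head would have to turn to face the target, then limit it to a believable range. The function below is a generic sketch of that calculation, not the plugin's API:

```python
import math

# Sketch: clamp the head's yaw towards a target so the actor only
# turns within a natural range. Generic angle maths, not the OMS
# player plugin's API; the 60-degree limit is an invented example.

def clamped_yaw(head_pos, target_pos, max_yaw_deg=60.0):
    dx = target_pos[0] - head_pos[0]
    dz = target_pos[1] - head_pos[1]
    yaw = math.degrees(math.atan2(dx, dz))  # 0 degrees = facing +Z
    return max(-max_yaw_deg, min(max_yaw_deg, yaw))

print(clamped_yaw((0, 0), (1, 1)))   # about 45 degrees, within range
print(clamped_yaw((0, 0), (5, -1)))  # behind the actor, clamped to 60.0
```

In practice you would also smooth the yaw over time, so the head eases towards the viewer rather than snapping.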

With photoreal captures and real-time retargeting of volumetric video, the Holosuite collection of tools powers not just immersive, but truly interactive, video.

Delivery targets
Keep your delivery targets in mind when exporting volumetric video. Be aware of platform limitations for texture resolution, and of performance limitations, when planning your clips.

Holosuite makes the process of creating and editing high-quality volumetric videos much easier
