Design-driven virtual production with Notch
Learn how to design a virtual environment for an LED stage
Virtual production utilises real-time graphics to capture a blend of virtual and physical environments live in-camera. Used by the events, broadcast and film industries, these technologies and techniques create immersive experiences that blend performers, presenters, actors and products inside photorealistic or imaginative virtual worlds.
3D designers are often faced with the challenge of working within the limitations of existing live-action footage or having to pre-empt any number of variables when producing content for use in live broadcasts. There’s a tricky balance to be maintained when blending the real and virtual worlds. While virtual production opens the door for creative decisions to be made on set and in the moment, creating a good experience for performers and the wider production team involves preparation and design thinking from the early stages.
Virtual production includes AR, MR and XR techniques and is created using either a greenscreen or an LED stage. Working as a real-time designer on a virtual production requires a solid understanding of the technical requirements and limitations of the process. In this tutorial, we’re going to walk you through the process of designing a virtual environment for an LED stage. We’ll cover the practical considerations you’ll need to make and how to create a comprehensive pre-visualisation that will set you up for on-site success. We’ll look at how using front and back plates adds depth, and how to optimise your virtual environment to run in real time.
We’ll also look at tips and techniques for working in a studio environment, including Notch’s Network Editing feature which enables designers to make changes remotely.
01 VIRTUAL PRODUCTION WORKFLOW
Virtual production combines conventional TV and film production with real-time graphics creation. When designing a scene for an LED stage, it’s important to first understand the physical environment and the hardware used. Virtual productions use motion-tracked cameras to gather live-action footage and positional data at once. This data is fed into the media server and Notch to produce an image on the LED stage that’s relative to the camera's perspective. AR elements can be applied in front of the LED screen and the subject, appearing in the foreground of the output feed. All of this happens in real time and appears as a composited image in the camera monitor.
02 WHAT ARE LED STAGES?
LED stages are made up of many LED panels typically laid out in a diamond configuration or as a sweeping curve. The stage will also have a floor made up of walk-on LEDs. It is in this illuminated space that performers and props are filmed. Using a tracked camera and real-time rendering, the subject will be immersed in an environment that matches the perspective of their position on stage. The real stage and performers are lit corresponding to the virtual scenery. In this way, real and virtual elements blend into a single, comprehensive scene. It is common to augment the projected imagery with other physical lights for increased realism. These can also be controlled from within Notch.
03 TECHNICAL CONSIDERATIONS
Just because you can move your virtual camera in Notch to the perfect position, doesn’t mean the on-set camera can move to match that location. Physical constraints on-site are one of the biggest causes of on-set changes to virtual scenes. While you can use Notch’s live editing features to make adjustments on-site, the best way to approach any virtual production shoot is to keep the physical constraints in mind when designing the virtual set. Knowing these constraints and being able to design around them will greatly improve the shoot.
04 KNOW YOUR SPACE
You will need the physical dimensions of the stage so you can build an accurate pre-visualisation. When pre-production begins, ask ‘How big is the room?’ and ‘What is the distance from the edge of the LED stage to the walls of the sound stage?’. It’s good to understand the resolution of the LED surfaces, so you know the level of fidelity you might achieve in close-ups. Sometimes the LEDs used on the floor and the walls can differ, affecting the reflectiveness of the surface. Finally, consider the camera and its movements and whether there is anything that might obstruct the camera from moving in a certain way. Planning is key to a good shoot!
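As a back-of-the-envelope check during pre-production, you can work out a wall’s pixel resolution from its physical size and pixel pitch (the spacing between LED centres, in millimetres), and apply the common industry rule of thumb of roughly one metre of camera distance per millimetre of pixel pitch to avoid visible pixel structure or moiré. The wall dimensions and panel pitch below are hypothetical example values, not a recommendation:

```python
def wall_resolution(width_m: float, height_m: float, pitch_mm: float):
    """Pixel resolution of an LED wall for a given pixel pitch."""
    px_per_m = 1000.0 / pitch_mm  # pixels per metre of wall
    return round(width_m * px_per_m), round(height_m * px_per_m)

def min_camera_distance_m(pitch_mm: float) -> float:
    """Rule of thumb: keep the camera at least ~1 m away
    per 1 mm of pixel pitch to hide the pixel structure."""
    return pitch_mm

# e.g. a 12 m x 4.5 m wall built from 2.6 mm-pitch panels
w, h = wall_resolution(12.0, 4.5, 2.6)
print(f"Wall resolution: {w} x {h} px")                      # → 4615 x 1731 px
print(f"Camera distance: ~{min_camera_distance_m(2.6)} m+")  # → ~2.6 m+
```

Numbers like these tell you early on how tight your close-ups can be before the screen itself becomes visible in shot.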
05 CAMERA AND LENS SELECTION
The camera and lens selection has a significant impact on the creative and framing choices in Notch. Know which shots you’re going to need early on! Make sure you and the wider production team are on the same page. Ask what lenses will be used – at some point you will need to match the focal length of the physical and virtual cameras. If using non-prime zoom lenses, every lens will need to be calibrated to the camera tracking system; failing to do so leaves lens distortion uncorrected, so the virtual and physical images will drift out of alignment. Calibrating the zoom lenses can be a lengthy process, even lengthier if done manually; however, a few tracking system vendors have created calibration profiles for specific lenses.
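Matching the physical and virtual cameras usually comes down to converting the real lens’s focal length and sensor width into a field of view. A minimal pinhole-model sketch (the 35 mm lens and the ~24.9 mm Super 35 sensor width are example values; real lenses add distortion that the tracking system’s calibration accounts for):

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal field of view of a pinhole camera:
    FOV = 2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# e.g. a 35 mm prime on a Super 35 sensor (~24.9 mm wide)
print(f"{horizontal_fov_deg(35.0, 24.9):.1f} degrees")
```

The same formula works in reverse when you need to check whether a framing you’ve designed in the virtual scene is achievable with the lenses on the truck.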
06 PRE-VISUALISATION
Creating an accurate, to-scale pre-visualisation of the real-world stage will enable you to scale your virtual content correctly, and ensure that you are framing shots from positions the camera can reach. Many designers use a ‘greyboxing’ technique, which is when you create the scene using basic geometry (like cubes and spheres) at the correct size/scale along with basic materials. There are numerous advantages to greyboxing, but most importantly it will help you understand your project's layout, what is seen by the camera, where the points of interest are, and where you can add detail.
07 SPLIT BACKGROUND AND AR FOREGROUND
Front and back plate elements are at the foundation of an extended reality setup. The front plate is whatever goes in front of your subject, and the back plate is what appears on the LED screens in the background. Front plates are sometimes referred to as AR elements. At this stage of the design, it’s useful to use to-scale human models in place of talent as it will help you identify which elements will need to be in the foreground and which in the background. When the final scene is rendered, the back plate and front plate will be composited in real time, creating a three-dimensional appearance.
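Conceptually, that real-time composite is the standard ‘over’ operator: the front plate’s alpha decides how much of the camera feed shows through at each pixel. A toy per-pixel sketch, assuming non-premultiplied alpha and colour values normalised to the 0–1 range:

```python
def over(fg_rgb, fg_alpha, bg_rgb):
    """Porter-Duff 'over': front-plate pixel composited over the
    camera feed. Non-premultiplied alpha, channels in 0..1."""
    return tuple(f * fg_alpha + b * (1.0 - fg_alpha)
                 for f, b in zip(fg_rgb, bg_rgb))

# A half-transparent red AR pixel over a mid-grey background pixel
print(over((1.0, 0.0, 0.0), 0.5, (0.5, 0.5, 0.5)))  # → (0.75, 0.25, 0.25)
```

In production this happens per frame on the GPU, but the arithmetic is the same, which is why clean alpha on your AR elements matters so much.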
08 LIGHTING
Dynamic lighting can be performance heavy, and in virtual production, every millisecond of performance matters! A good virtual production lighting setup will have static lights that are baked into the scene’s geometry, with additional dynamic lights on top. Your static lights by no means need to be boring – baking provides the opportunity to use high-quality, path-traced lighting without the rendering overhead. Baking removes rendering overhead from the GPU by converting reflections, bounces and shadows into texture memory. Once you’ve successfully baked the static lights, you can use dynamic lights to add life and detail into your scene.
09 PERFORMANCE CONSIDERATIONS
Baked lighting is an integral part of optimisation, but there are many other tools in Notch to help you hit your target framerate. Start by removing unused nodes. Render speed will vary depending on what’s in view of the camera, so move your camera around your scene and monitor performance. Rendering reflections is expensive, so reflective surfaces should only be added where needed.
As the designer, you always need to make sure the project runs at or above the final resolution and FPS requirements. Using physical props such as a rug or chair will ground the subject and create natural shadows when used with physical lighting. It can make all the difference.
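When optimising, it helps to think in milliseconds rather than frames per second: the render budget per frame is simply 1000 / FPS, and every reflective surface or dynamic light spends some of it. A quick sketch with common broadcast and film frame rates:

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render each frame at a target frame rate."""
    return 1000.0 / fps

for fps in (25, 30, 50, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
```

At 60fps you have under 17ms for everything, which makes it obvious why shaving fractions of a millisecond off expensive effects is worth the effort.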
10 ON LOCATION
Once you are on location, your design should already be solid. As a Notch designer or virtual production technician, most of your work will be in everything we’ve talked about already: planning, designing, testing and optimising. When on location, your focus will shift to refining and adjusting the scenes you’ve created according to real-world effects like lights or changes from art and technical directors. This is where your experience comes into play, and you are bound to be asked to make last-minute changes. Remember the optimisation techniques and be flexible.
Notch has a built-in feature called Network Editing which makes it easy to change your blocks after you've loaded them into the media server.
11 NETWORK EDITING
The Network Editing feature allows you to connect Notch Builder to a running block over a network connection. This means you can make edits from anywhere in the world and they will update in real time. It is an incredibly powerful feature and a life-saver when doing on-location work. Common issues that come up while on location are misalignments between the virtual and physical lighting and requests to move or remove props, but these changes are easy to make via Network Editing. There are also a few optimisations you'll be able to make on-site, such as reducing the number of dynamic lights and reducing the number of objects affected by reflections.
12 CONCLUSION
The virtual production workflow is one of preparation. Everything that is normally planned for post-production shifts into pre-production. It’s unusual for a 3D designer to kick their feet up once the cameras have stopped rolling, but that is what makes this workflow innovative. Virtual production enforces a symbiotic workflow between the 3D designer and the wider production team, bringing the designer closer to the process.•