3D World

Games march on

Simon Fenton, head of games at Escape Studios, explores how the latest technology and a convergence of techniques mean that game graphics are quickly catching up with film

- FYI Escape Studios’ courses include game art and VFX: www.bit.ly/escapestudios


Convergence has been a key buzzword across the industry over the last few years. The worlds of games and VFX are advancing, and with this change a natural crossover is happening: the production methods used to create digital art for film and for games are now very similar. This has been partly triggered by the rise of new technologies such as VR, and by both fields using the same tools, such as Maya and ZBrush. Yet it wasn’t that long ago that a single plant in the film Avatar had more polygons than an entire game environment.

Motion capture is a big area where we are seeing more and more convergence. I still get excited teaching PBR texturing to my students, knowing I am using implementations of the Disney GGX shader in real-time engines. Motion capture in games has allowed a more complex world of storytelling to evolve, with technology enabling game makers to create characters who are relatable and human in their facial expressions and physical mannerisms. Just look at the work of Imaginarium on Squadron 42, Naughty Dog’s The Last of Us and Guerrilla Games’ Horizon Zero Dawn. By developing game characters through motion capture, studios can explore wider human themes.
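For readers who want to peek under the bonnet, here is a minimal sketch of the GGX normal distribution term that sits at the heart of those shaders – written in Python for clarity rather than as actual shader code, and based on the standard published isotropic form (with alpha taken as roughness squared, as Disney proposed) rather than any particular engine’s implementation:

```python
import math

def ggx_distribution(n_dot_h: float, roughness: float) -> float:
    """Isotropic GGX (Trowbridge-Reitz) normal distribution term,
    as popularised by Disney and used in real-time PBR shaders.
    n_dot_h: cosine of the angle between the surface normal and
    the half vector. roughness: perceptual roughness in [0, 1];
    Disney remaps it as alpha = roughness squared."""
    alpha = roughness * roughness
    a2 = alpha * alpha
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# Example: a half-rough surface sampled near the mirror direction
print(ggx_distribution(0.95, 0.5))
```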

One of the other big developments in this space – seen most recently at events such as the Game Developers Conference (GDC) – is the power of real-time rendering. Epic’s Unreal Engine has really stolen a march here, and the company recently teamed up with The Mill and Chevrolet to demonstrate the engine’s potential with the short film The Human Race. Merging live-action storytelling with real-time visual effects, the film showcases how these technologies are pushing the limits of real-time rendering in a game engine.

The fact that ILM used a similar approach for some scenes in Star Wars: Rogue One, with a tweaked version of Unreal Engine 4, only adds to its credentials. The team used this technology to bring the droid K-2SO to life in real time. While at the moment this technique is predominantly being used for hard-surface objects, it is surely only a matter of time before it supports more diverse subjects. We are also seeing companies such as performance capture studio Imaginarium expanding to adapt to this change, with Andy Serkis recently unveiling Imaginati Studios, a game development studio focused on real-time solutions using Unreal Engine 4.

The pull of photogrammetry

Another area where there is real convergence and a crossover of talent is the use of photogrammetry, which involves taking photographic data of an object from many angles and converting it into stunningly realistic, fully textured digital models. Creating game assets from photographs may not be new, but the process has now reached the kind of standard we are used to seeing in film production. From DICE’s incredibly realistic recreation of the Star Wars universe in Star Wars Battlefront to Crytek’s Ryse: Son of Rome, the bar for video game graphics keeps rising. It might be hyperbolic to suggest the visuals of Ryse are on a par with the classic film Gladiator, but it is nevertheless a stunning realisation of ancient Rome.
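At the core of that photos-to-models conversion is triangulation: once the same feature has been matched across two photos and the camera positions recovered, its 3D location can be solved with linear algebra. The sketch below is a toy illustration of that single step (the classic direct linear transform), with hypothetical hand-built cameras standing in for what a real pipeline would estimate from thousands of matched features:

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of a single 3D point from two views.
    P1, P2: 3x4 camera projection matrices, assumed already recovered
    (real pipelines estimate these from feature matches across photos).
    x1, x2: the point's pixel coordinates (u, v) in each image."""
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector
    # belonging to the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # dehomogenise to (x, y, z)

# Toy check with two hypothetical pinhole cameras one unit apart:
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
world = np.array([0.5, 0.2, 4.0, 1.0])
u1 = (P1 @ world)[:2] / (P1 @ world)[2]
u2 = (P2 @ world)[:2] / (P2 @ world)[2]
print(triangulate_point(P1, P2, u1, u2))  # ~ [0.5, 0.2, 4.0]
```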

The Vanishing of Ethan Carter is another game that uses photogrammetry to great effect – artist Andrzej Poznanski details how this was done on the Astronauts blog (www.bit.ly/visualrev). Epic Games’ Paragon, where photoshoots were used to capture HDR lighting on hair and skin, is another fantastic example. Some of the most compelling-looking graphics in games were created with photographic processes, and photogrammetry has played a huge part in driving game graphics forward.

Epic Games used Agisoft PhotoScan to process its captured images, but there are many issues native to the photogrammetry process that need to be solved. Reflections in photos of objects and poor lighting can be a real challenge, and can reduce the realism of the final output. But this is where technology and artistry come together. In the fantastic blog post Imperfection for Perfection (www.bit.ly/epicprocess), technical artist Min Oh outlines Epic Games’ process, detailing the use of colour checkers and the capture of lighting conditions using VFX-standard grey and chrome balls. Further inspiration comes from the team at DICE, who overcame lighting issues when capturing Darth Vader’s helmet by removing the light information from the source images.
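To give a flavour of what “removing light information” means in practice, here is a deliberately naive sketch – not DICE’s actual pipeline, whose details the studio hasn’t fully published. It assumes a diffuse surface (observed colour = albedo × irradiance) and a per-pixel irradiance estimate of the kind a grey-ball capture can provide, then simply divides the two to approximate the lighting-free albedo texture:

```python
import numpy as np

def delight(photo: np.ndarray, irradiance: np.ndarray,
            eps: float = 1e-4) -> np.ndarray:
    """Toy de-lighting: approximate albedo by dividing a photograph
    by a per-pixel irradiance estimate (e.g. derived from a grey-ball
    capture), assuming a purely diffuse surface.
    photo, irradiance: float arrays in [0, 1], same shape (H, W, 3)."""
    albedo = photo / np.maximum(irradiance, eps)  # avoid divide-by-zero
    return np.clip(albedo, 0.0, 1.0)

# Hypothetical example: a flat red surface under left-to-right falloff
h, w = 4, 8
true_albedo = np.tile(np.array([0.8, 0.2, 0.2]), (h, w, 1))
falloff = np.linspace(1.0, 0.3, w).reshape(1, w, 1)
photo = true_albedo * falloff
lighting = np.broadcast_to(falloff, photo.shape)
print(np.allclose(delight(photo, lighting), true_albedo))  # True
```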

Back in 2012, famed VFX supervisor Kim Libreri, who is now CTO at Epic Games, predicted that graphics would be indistinguishable from reality within a decade. A few years on, it seems we’re well on our way.

“Motion capture in games has allowed a more complex world of storytelling to evolve” – Simon Fenton, head of games, Escape Studios

Far left: game developer DICE recreated the Star Wars universe to great effect for Battlefront
Above: Crytek’s Ryse: Son of Rome is a stunning realisation of ancient Rome
Left: Photogrammetry was used heavily in horror adventure game The Vanishing of Ethan Carter
