When Reel Gets Real – The Cost of CGI

HWM (Singapore) – Text by Marcus Wong, Art Direction by Ken Koh

When was the last time you watched a movie and wondered if that explosion actually occurred, or if it was created in a computer? Computer-generated imagery (CGI) has reached a point where, for the audience at least, the "reel" thing is as good as the real thing. We take a look at how CGI got to this point, and where it's going from here.

Start

Particles, grids, collisions, light simulations. You'd be forgiven for thinking we were talking about astrophysics or some other school of science, but those are just some of the considerations that CGI artists grapple with in their attempts to bring things to life on the big screen.

"We used to have to fake the bouncing of light, or the way things reflected off each other, because it was too computationally expensive to do it 'correctly'," says Mr Philip Miller, Director of the Professional Solutions Business Unit at Nvidia, as he explains how "classic" CGI used to be done.

When CGI first started, it worked on the concept of point lights and spot lights. These could be placed wherever you pleased, and could both add and subtract light. The only other control was whether the objects in the scene cast shadows or not. With so little to work with, artists had to resort to tricks, adding a lot of extra CG lights and elements just to make things look realistic.
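
The "classic" model described above can be sketched in a few lines. This is not any studio's actual code, just a toy illustration of direct-only shading: each light contributes independently by Lambert's cosine law, nothing bounces between surfaces, and a negative intensity is how a light "subtracts" brightness.

```python
import math

def shade_direct(point, normal, lights):
    """Classic direct-only shading: each point light contributes
    Lambertian intensity with inverse-square falloff; nothing
    bounces between surfaces. A negative intensity darkens an
    area instead of brightening it."""
    total = 0.0
    for pos, intensity in lights:
        # Direction from the surface point to the light.
        d = [p - q for p, q in zip(pos, point)]
        dist = math.sqrt(sum(c * c for c in d))
        d = [c / dist for c in d]
        # Lambert's cosine law: brightness scales with the angle
        # between the surface normal and the light direction.
        cos_theta = max(0.0, sum(a * b for a, b in zip(normal, d)))
        total += intensity * cos_theta / (dist * dist)
    return total

# One light of intensity 10 placed 2 units above a floor point:
print(shade_direct((0, 0, 0), (0, 1, 0), [((0, 2, 0), 10.0)]))  # prints 2.5
```

Because every light is independent and nothing reflects, mimicking real indirect illumination means piling on many such lights by hand, which is exactly the trick described above.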

It is right if it looks right

CG in movies essentially started as a mix of visual tricks and optical illusions to create a sense of "reality". However, as the quality of displays began to increase, it became harder and harder to get away with faking it, and even harder to find people with the expertise and knowledge to do it.

Philip recounts a demonstration by Pixar on how they used to render scenes in Toy Story – using hundreds of lights to simulate a basic daylight interior. Because there was no way to make objects reflect or absorb light naturally, you needed hundreds of lights acting on the room in unintuitive ways – shining from underneath the floor, bouncing up off the ceilings, acting in negative (making areas darker instead of brighter) – just to mimic what natural light would do.

Using the laws of nature

Today, that same scene would probably take only six lights, because studios can now apply a physically-based approach: they simply allow the light to bounce the way it does in the real world, interacting with objects based on the way they absorb and reflect light. It's something that's only possible now, thanks to the recent increases in computing power.
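
The core idea can be shown in miniature (a hypothetical sketch, not how any production renderer is written): start from the real light sources and let the energy bounce, with each surface passing on a fraction (its albedo) of what it received. Indirect light then emerges on its own instead of being faked with hundreds of extra lights.

```python
def bounced_light(direct, albedo, bounces):
    """Physically-based idea in miniature: light starts at the real
    sources, and each bounce a surface re-emits a fraction (albedo)
    of the energy it received. Summing the bounces gives direct plus
    indirect illumination automatically."""
    total = 0.0
    energy = direct
    for _ in range(bounces + 1):
        total += energy   # light seen after this many bounces
        energy *= albedo  # the next bounce carries a fraction onward
    return total

# 1.0 unit of direct light in a room with 50% reflective surfaces:
# direct + first bounce + second bounce = 1 + 0.5 + 0.25
print(bounced_light(1.0, 0.5, 2))  # prints 1.75
```

Tracking every bounce for every light path is what made this approach too computationally expensive in the past – the work grows with each extra bounce – which is why the industry had to fake it until hardware caught up.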

This was something the industry knew how to do even back in the 90s, but the level of processing then was too slow – it would have taken a whole weekend just to complete the rendering! Today, we're at a level where the processing is fast enough to be interactive, and that's where things are really changing.

Everything speeds up but render times

"In 1997, the CG on The Fifth Element used 256MB of RAM, and today most big shots require 32 to 48GB. Back then, an entire show would take around 2TB of disk space, and today, textures on a single asset alone can take up that much space," says Christopher Nichols, Creative Director of the Chaos Group, as he explains why the increases in computing power haven't led to a decrease in the time needed for rendering.

The difference is that now a single talented artist can achieve what used to take a team of people, and that computing facilities are being drawn from the cloud thanks to faster internet connectivity, allowing smaller studios to also get into the mix. Eight years ago, a large movie might have been on the order of 500 VFX shots and taken two years to create. Today, the number is closer to two thousand, and they have to be delivered in less than a year! The demand for high-quality CG content is huge, and it's not just in movies but in broadcast as well. Christopher says the visual effects in broadcast rival many films, and some shows have drastically shorter deadlines, so the need for CGI – and thus for faster software and hardware – will only increase.

Directing the virtual

In movies, the ability to get real-time previews of what a scene will look like enables directors to be more efficient in making decisions. Where they would previously have to wait days for the rendering process to complete, today the rendering can be handled by the system's GPUs, making it a much faster process.

An example of this is Chaos Group's use of V-Ray RT for MotionBuilder to create the graphics for director Kevin Margo's Construct. As he demonstrates in a YouTube clip*, the system is able to play back a low-resolution path-traced version of a cut in real time. At any moment, they can hit pause, and the image on screen resolves to full quality, allowing Margo to see if he needs to adjust lighting or shading. This also presents valuable feedback to the actors being filmed, as they are able to view their takes immediately. The result is movies with a better sense of realism, because the actors are better able to visualize what they're interacting with.
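
The pause-and-resolve behaviour comes from progressive refinement, which can be sketched as a running average of noisy samples (a toy illustration only – V-Ray's actual sampling is far more sophisticated). A real-time preview averages a handful of path-traced samples per pixel; pausing simply lets the accumulator keep running until the noise averages out.

```python
import random

def render_sample():
    """Hypothetical stand-in for one path-traced sample of a pixel:
    a noisy estimate whose true mean here is 0.5."""
    return random.random()

def progressive_estimate(n_samples):
    """Progressive refinement: keep a running mean of samples.
    Few samples -> fast, noisy preview; many samples -> the image
    'resolves' toward the true value while paused."""
    acc = 0.0
    for i in range(1, n_samples + 1):
        acc += (render_sample() - acc) / i  # incremental running mean
    return acc

random.seed(0)
preview = progressive_estimate(8)        # real-time preview: quick but noisy
refined = progressive_estimate(100_000)  # paused frame: converges near 0.5
```

The key property is that the preview and the final frame are estimates of the same quantity, so what the director approves at low quality is faithful to what the finished render will show.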

Old school is new school

Because the physically-based approach allows you to treat CG like the world around you, it's much easier for people to relate to, and Philip tells us that this is affecting the way studios are working too.

Instead of hiring a lighter – someone who specializes in creating the illusion of light and depth in a graphics program – studios now hire someone who is used to physically lighting a set (like a gaffer). In this way, the learning curve is a lot gentler, and the predictability a lot higher.

The thing that makes it all possible is the evolution of the processors in computers today. No longer do studios have to make do with a few approximations of hair. Now, they have software that can manage the complexity to render every single strand, and the hardware to make it happen fast enough to be practical.

Moving forward

The trend is for rendering to be 100% interactive, and rendering appliances like NVIDIA's Iray Visual Computing Appliance (VCA) are making it possible for designers to interact with their ideas as if they were already real. The Iray VCA packs eight of NVIDIA's most powerful GPUs in one machine, each with 12GB of graphics memory, and is built to be scalable, thus allowing companies to build rendering clusters specific to their needs.

One such company is Honda, which, as an early adopter, uses a total of 25 Iray VCA machines working together to refine styling design on its cars in real time. "Our TOPS tool, which uses NVIDIA Iray on our NVIDIA GPU cluster, enables us to evaluate our original design data as if it were real," says Daisuke Ide, System Engineer at Honda Research and Development.

High-quality photorealistic CGI will soon be seen in everything from product design to advertising, and it won't be too long before you can do augmented reality. Just as Honda's designers can view their new cars virtually, soon you too will be able to put your tablet in your living room and – using just the built-in camera – see how furniture pieces would look superimposed in the space.

Given that apps like Interior Design for iPad and Autodesk's Homestyler are already offering you ways to create 3D floor plans that you can populate with virtual furniture, and apps like iTracer and Mandelbulb Raytracer HD are bringing raytracing to the GPU on the iPad, it certainly seems like it won't be long till the scenario above is realized. Call it the next step in reel-world techniques crossing over to the real world.

HWM
