APC Australia

System News

Will we finally get the ray tracing future we’ve desired for so long? Mark Williams ponders the RTX question.


Rasterization has been with us since seemingly time immemorial for 3D game scene rendering. Ray tracing has been around almost as long, but not for real-time rendering, as it is exceptionally compute-intensive. So much so that, in the past, only the likes of Pixar with supercomputers could render movie-quality scenes using the technique.

It wasn’t until 2008, when we saw Intel’s project Larrabee demonstrated running Quake 4 with ray-traced lighting in real time, that consumers had a taste of possibly getting ray tracing into their humble home PCs. Unfortunately, Larrabee never made it to mainstream consumers, and the hopes and dreams of ray tracing seemingly disappeared with it for another decade.

Fast forward to today, and Nvidia with its new 2000-series graphics cards is so confident of a ray-tracing future that it has renamed its venerable GTX brand to RTX, the R standing for ray tracing, of course. Quite a bit of silicon in Nvidia’s new Turing architecture is dedicated solely to ray-tracing (RT) work. With RT cores embedded into each SM of the architecture, Turing really is designed from the ground up to accelerate this function in hardware.

Nvidia’s ray-tracing technique at this stage appears to be a proprietary implementation, though, with no announcement of an open API, framework or standard for the ray-tracing tech used in the RTX line-up. It’ll be interesting to see whether AMD can follow suit with a compatible offering or will require its own code path for 3D engines to use. We could see a new front open in the GPU wars if AMD needs to forge its own path. And who knows what Intel will bring with its coming discrete card offering in 2020.

Then there’s the big performance question. Just like any new graphics technology that requires new hardware (think unified shaders or PhysX), the first hardware implementation is usually just capable enough for game developers to try it out and implement something, but not to run it competently fast as well. It’s typically the second iteration of the hardware that sees the new technology really start to take off and become more than just a novelty.

With any luck, ray tracing this time around will be more than just a novelty and will run at a good enough speed to help early-adopter uptake and kickstart the ray-tracing revolution GPU makers have been salivating over for so very long.

Ignoring all the ray-tracing talk for a moment, Nvidia has been very cagey and careful to date in how it presents the new RTX 2000 series’ performance relative to the GTX 1000 series. Nvidia did not release any apples-to-apples game benchmarks at the Turing launch, just vague bar graphs with no scales or numbers on them. Claims of anywhere between 13% and 50% faster circulated prior to the media embargo.

But our independent performance tests and Chris’ full review are here in this issue for you to soak up the numbers, along with his insightful opinion. Nvidia has at least improved the bread-and-butter rasterization performance enough to make the 2000 series worthy in its own right for gamers and enthusiasts over the 1000 series. All the ray-tracing and AI compute hardware that’s also built in can then be the cream on top, fostering early adoption of all this new tech and bringing games into the next generation.

