Nvidia’s chips keep growing brainier

Nvidia’s processors are powering breakthroughs in deep learning. “There’s a lot of promising stuff … in the coming year”

Bloomberg Businessweek (Asia) | By Jack Clark and Ian King

Nvidia’s microprocessors have long been the chips of choice for computer game addicts who crave realistic graphics as they chase aliens or battle enemy soldiers. The same powerful semiconductors are now being put to new uses at companies including Alibaba, Facebook, Google, and Microsoft. Nvidia’s graphics chips underpin speech recognition systems, software to develop gene therapies, and programs that transform satellite photos into detailed maps.

Researchers at DeepMind, a Google-owned lab in London, harnessed thousands of Nvidia’s K40 graphics processors, which cost $3,000 apiece, to train a computer to play Go, an ancient board game. In what was praised as a milestone in artificial intelligence, DeepMind’s machine beat a European Go champion in five out of five matches last year. In March it will take on the world’s top-ranked professional player.

Artificial intelligence’s big advance over traditional software is that it can learn and improve without the assistance of human programmers: An AI program designed to pick out cars from random images gets better the more pictures it’s exposed to. Graphics processing units, or GPUs, are well suited for this kind of pattern recognition work because they can perform thousands of simple calculations at the same time. In contrast, standard central processors made by Intel perform more complex calculations very quickly but are limited when it comes to doing multiple things in parallel.
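The workload shape described above can be sketched in plain Python (a toy illustration only, not GPU code): the same trivial operation is applied independently to every data element, so on a GPU each element could be handled by its own thread simultaneously, while a CPU core walks the data one element at a time.

```python
# Toy sketch (plain Python): the kind of workload that suits a GPU is
# one simple operation applied independently to many data elements --
# here, thresholding every pixel of an image.

def serial_score(pixels, threshold):
    """CPU-style: one complex core visits the data one element at a time."""
    out = []
    for p in pixels:
        out.append(1 if p > threshold else 0)
    return out

def data_parallel_score(pixels, threshold):
    """GPU-style, conceptually: each element is scored independently.
    map() expresses the per-element independence; on a real GPU the
    thousands of applications would execute at the same time."""
    return list(map(lambda p: 1 if p > threshold else 0, pixels))

pixels = [12, 200, 97, 255, 3]
assert serial_score(pixels, 100) == data_parallel_score(pixels, 100) == [0, 1, 0, 1, 0]
```

Both functions compute the same answer; the point is that the second form has no ordering dependency between elements, which is exactly what lets a GPU spread the work across thousands of cores.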

The concept of using graphics chips for AI got a big boost in 2012, when a team of researchers at the University of Toronto used Nvidia’s GPUs to build an award-winning image classification system. The breakthrough was helped by the chipmaker’s support of a programming language called CUDA, which lets developers repurpose GPUs for uses other than graphics. Rival Advanced Micro Devices hasn’t made a comparable investment, which has hampered the adoption of its graphics chips in this emerging field. Nvidia says about 3,500 businesses and organizations are using its GPUs for AI and data analysis, up from 100 a couple of years ago. AI plays a role in everything from Google searches to self-driving cars, which is “one reason we’re optimistic on [Nvidia’s] data center business,” says Craig Ellis, an analyst at B. Riley, a boutique investment bank. “Their parallel-processing architecture is just naturally superior on an increasing number of workloads, which includes AI,” he says.

$3k
The price of an Nvidia K40 GPU
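CUDA itself is a C-like language, but its core idea can be sketched in plain Python (the names below are illustrative, not the real CUDA API): a developer writes one "kernel" function, the GPU launches it once per thread, and each thread derives a global index from its block and thread coordinates to pick the single element it handles.

```python
# Conceptual sketch, in plain Python, of CUDA's execution model.
# (Real CUDA is C-like; function and parameter names here are
# hypothetical stand-ins, not Nvidia's actual API.)

def saxpy_kernel(block_idx, thread_idx, block_dim, a, x, y, out):
    """One 'thread' of out = a*x + y, the classic GPGPU example."""
    i = block_idx * block_dim + thread_idx  # global element index
    if i < len(x):                          # guard: last block may overrun
        out[i] = a * x[i] + y[i]

def launch(kernel, n, block_dim, *args):
    """Stand-in for a CUDA kernel launch: invoke the kernel for every
    (block, thread) pair. A real GPU would run these in parallel;
    here we simulate them serially."""
    blocks = (n + block_dim - 1) // block_dim  # ceil(n / block_dim)
    for b in range(blocks):
        for t in range(block_dim):
            kernel(b, t, block_dim, *args)

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [10.0, 10.0, 10.0, 10.0, 10.0]
out = [0.0] * 5
launch(saxpy_kernel, len(x), 2, 2.0, x, y, out)
assert out == [12.0, 14.0, 16.0, 18.0, 20.0]
```

The bounds check inside the kernel is the telling detail: because work is issued in fixed-size blocks of threads, the final block may contain threads with no element to process, and each thread must check its own index rather than rely on a loop counter.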

Data centers are a relatively new area for Nvidia, which draws the bulk of its $5 billion annual revenue from its PC graphics business. While it’s eked out growth as computer gamers continue to shell out for more powerful components, the company needs to counteract a four-year slump in PC sales. “Our GPU is now moving from software development into hyperscale data center production. That’s quite exciting,” says Chief Executive Officer Jen-Hsun Huang. Once a company figures out how to apply AI to its business, it tends to buy a lot of GPUs, he says. Still, luring customers away from Intel’s Xeon processors, the heart of more than 99 percent of the world’s servers, may prove difficult.

Nvidia will also face competition from startups, such as Movidius and Nervana, that are building AI-optimized chips. Nvidia’s chief scientist, Bill Dally, says some large companies, which he won’t name, are looking to do the same, but they don’t pose a threat. “Nvidia really took a bet on this type of computation, and they invested in this field before it was obvious there was a market there,” says Serkan Piantino, director of engineering for AI Research at Facebook, which uses thousands of Nvidia GPUs for AI. Still, Piantino is keeping his eyes peeled for new developments. “There’s a lot of promising stuff that’s going to land in the coming year,” he says.

The bottom line: Nvidia’s chips are being used to teach machines to think like humans, which could provide the company with a new line of business.
