The Lowdown: AI in chips

AI is increasingly finding its way into chipsets, but what effect will it have?


New chipsets are using artificial intelligence to boost their capabilities, but will it benefit you?

AI ON CHIPS?

Think of artificial intelligence and you will probably think of robots or clever software backed up by the power of the internet and cloud computing. And you'd be right. But AI tech is also being embedded into the heart of various processors and chipsets, and entering smartphones and headsets. Take Qualcomm's new Snapdragon 845 mobile chip (see News, Shopper 363): it not only offers gutsy processor and graphics performance, it's also tuned to run machine-learning algorithms – effectively the clever code behind AI systems – on the chipset.

It uses a combination of chip parts, such as the Adreno GPU and the Hexagon vector processor, to power Qualcomm's AI engine. This allows smart features such as natural-language voice commands to be processed at quite a lick, and image recognition to be carried out on the smartphone itself, without relying on a consistent internet connection.

Having the tech on a widely available chipset gives developers more scope to put smart AI features into their apps and services.

HOW CAN A SINGLE CHIP POWER AI?

That question needs answering in two parts. First, AI systems are trained using machine-learning algorithms to spot patterns in data, so they can answer questions or solve problems rather than follow a hard-coded set of instructions. This training can either be done under the supervision of human experts, using examples that have already been labelled, or the AI can be left to draw its own conclusions from raw data sets.
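If you're wondering what 'spotting patterns rather than following instructions' looks like in code, here's a deliberately tiny Python sketch. It's not how any chipset actually does it – the fruit measurements are invented and the algorithm (nearest neighbour) is about the simplest supervised learner going – but it shows a program deciding from labelled examples instead of a hard-coded rule.

```python
# A toy 'learner': instead of a hard-coded rule, it works out the answer from
# labelled examples. The fruit measurements below are made up for illustration.

# Training data: (weight in grams, skin smoothness from 0 to 1) -> label
training_data = [
    ((140, 0.90), "apple"),
    ((150, 0.95), "apple"),
    ((170, 0.30), "orange"),
    ((180, 0.25), "orange"),
]

def classify(sample):
    """Label a new fruit by finding the most similar training example."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest_measurements, nearest_label = min(
        training_data, key=lambda example: distance(example[0], sample)
    )
    return nearest_label

print(classify((172, 0.28)))  # prints "orange": the examples, not a rule, decide
```

Swap the fruit for pixels or sound samples, and the handful of examples for millions, and you have the flavour of what the real algorithms are doing, albeit with far cleverer maths.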

More advanced AI systems use a process called deep learning, which is reliant on an artificial neural network.

WAIT, A NEURAL WHAT?

In much the same way that the human brain is a network of neurons connected by synapses, an artificial neural network is made up of layers of nodes.

When presented with data – for example, an image – the neural network breaks it up into many pieces, or indeed pixels, and each node tries to work out what it's seeing, such as the colour of an image segment or its shape.

As the data flows through the network, each node tries to decide how relevant the specific piece of data it's looking at is to solving a query – figuring out whether a dog is in a picture, for example – and then assigns it a corresponding weight, or value.

Often data will filter through many layers of nodes, hence the term deep learning. The process is repeated, with the weightings of the nodes adjusted each time, until the system learns to produce an answer that's more than likely the correct one.
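For the curious, that loop can be boiled down to a few dozen lines of Python. The sketch below is a toy, and nothing to do with any particular chip or vendor: it uses the NumPy library to train a tiny two-layer network on a classic test problem (XOR, where the output should be 1 only when exactly one of the two inputs is 1), nudging the weights on every pass until the answers come out right.

```python
import numpy as np

rng = np.random.default_rng(42)

# The four possible inputs, each with an extra constant 1 so every node gets a bias
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
# The answers we want the network to learn (XOR of the first two columns)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(3, 4))   # weights from the inputs to 4 hidden nodes
W2 = rng.normal(size=(5, 1))   # weights from the hidden nodes (plus a bias) to the output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(inputs):
    hidden = sigmoid(inputs @ W1)                            # each hidden node weighs the input
    hidden = np.hstack([hidden, np.ones((len(inputs), 1))])  # tack on the bias node
    output = sigmoid(hidden @ W2)                            # the output node weighs the hidden nodes
    return hidden, output

for step in range(20_000):
    hidden, output = forward(X)

    # How wrong was each answer? Pass the blame back through the layers...
    error_out = (output - y) * output * (1 - output)
    error_hidden = (error_out @ W2[:-1].T) * hidden[:, :-1] * (1 - hidden[:, :-1])

    # ...and nudge the weights so the next pass is a little less wrong
    W2 -= 0.5 * hidden.T @ error_out
    W1 -= 0.5 * X.T @ error_hidden

print(np.round(forward(X)[1], 2))  # should end up close to [0, 1, 1, 0]
```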

This process takes a lot of data and requires servers with hefty amounts of processor power to carry out AI training at scale. But once a machine-learning model is smart enough to solve a base level of tasks, the AI features it supports can be put into action. This stage is called inference: the AI still learns as it goes along, but it's more likely to tailor its capabilities to individual users, such as learning to better understand their voice or what they like to buy or eat.

At this level, AI doesn't need nearly as much compute power to figure things out, so a powerful mobile chip is enough to support it. That being said, more chipmakers and tech firms are adding specific hardware to better power AI features and allow smart devices to get smarter by themselves.
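To show why inference is so much lighter, here's a hypothetical continuation of the earlier sketch: the weight files and their names are made up, but the point stands – once the training has happened elsewhere, answering a query is just one forward pass through the network, a handful of multiply-adds that a phone-class chip can comfortably handle.

```python
import numpy as np

# Hypothetical weight files: imagine they were produced by training in a data
# centre and downloaded with the app. (The file names here are made up.)
W1 = np.load("model_layer1.npy")
W2 = np.load("model_layer2.npy")

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(features):
    # One forward pass: a couple of matrix multiplies, no training loop,
    # no round trip to a server
    hidden = sigmoid(features @ W1)
    return sigmoid(hidden @ W2)

# e.g. score = predict(np.array([0.0, 1.0, 1.0]))  -> a number the app can act on locally
```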

SOUNDS INTRIGUING. WHO'S WORKING ON SUCH TECH?

A whole raft of tech firms. The likes of Intel, AMD, ARM and Nvidia are all working on chipsets that can either handle AI algorithms or have dedicated processors for powering smart software.

Microsoft's next HoloLens augmented-reality headset is set to contain a coprocessor dedicated to running deep-learning neural networks, with the goal of delivering better movement tracking and voice recognition on the device.

And Apple's latest A11 Bionic chip, found in the iPhone X and iPhone 8, features the Neural Engine, which has circuits dedicated to accelerating AI-based software, image processing and speech recognition, as well as the advanced Face ID facial-recognition feature. Basically, there are a whole load of tech firms looking at putting AI into people's hands and gadgets rather than leaving it stuck in a data centre.

GREAT, BUT HOW DOES ALL THIS AI TECH BENEFIT ME?

It not only makes a lot of existing features smarter, such as the automatic organisation of photos, but it also has a few other tricks up its sleeve. For example, researchers from the Massachusetts Institute of Technology have designed a neural-network chip that could cut the power consumed by AI calculations by up to a massive 95 per cent, something smartphone owners with long commutes would welcome.

Google is using its AI tech to power Lens, a feature on the latest Pixel phones that can recognise what's in a picture or camera feed and serve up useful information about it – literally judging a book by its cover, for example, to offer a precis on the author and where the book can be purchased.

In a nutshell, with the right training AI can handle all manner of human-assisting tasks, from things we need to things we don't normally notice. And no, AI is not yet advanced enough to rise up and enslave us all.

