Embedded AI to take off by year-end

Rising technology steers away from cloud, but experts say it is 'in infancy'

Global Times - Weekend - TECH - By Li Xuanmin

One year ago, when the artificial intelligence (AI) program AlphaGo rose to prominence after it unexpectedly defeated legendary player Lee Se-dol, tech giants and venture capital firms rushed to pump money into the rapidly rising sector, most of which implements cloud-only computation.

But now, a new trend is driving the development of embedded AI, a technology that can process data and run AI algorithms on devices without transferring data to cloud servers.

A cellphone equipped with embedded AI can recognize food and provide real-time details on calories, helping dieters choose healthy dishes. Embedded AI also enables consumers to use microwave ovens without setting a timer, as the appliance itself instantly judges a food product's necessary cooking time. Also, with embedded AI, people are free to install cameras at home to check on the safety of elderly people and children without worrying about data being leaked.

Those are just a few scenarios in which embedded AI can be widely applied in daily life, as pointed out by industry insiders during a forum on embedded AI held over the weekend in Beijing.

Technology application

"So far, most of the development in the AI sector focuses on cloud AI, or computation that is connected to the cloud, but there is a stream of scenarios where on-device computation has an edge [over cloud computation]," Geng Zengqiang, chief technology officer of China-based operating system (OS) provider Thundersoft Software Technology Co, also the event organizer, told the Global Times in an exclusive interview.

Sun Li, vice president of Thundersoft, told the Global Times that in some cloud AI applications, transferring data via the Internet and then processing it on a cloud server causes a series of issues.

For example, the operation of a Boeing 787 jet airliner generates 5 gigabytes of data every second – more than the maximum capacity of almost any commercial wired network, making it "a mission impossible to complete AI computation in the cloud," said Sun.

Another application scenario is autonomous driving, which produces almost 1 gigabyte of data every second and requires real-time algorithms and intelligent decision-making.

"Connecting to the cloud and transmitting data back to the vehicle would cost a great amount of time, pushing up driving risks," Geng said.

Besides those limitations, as an increasing number of domestic users raise concerns over the privacy of intelligent home appliances, on-device AI, which can function without linking to the Internet, guarantees their personal privacy, Sun noted.

Against this backdrop, "the year of 2017 is promising for embedded AI technology to take off – and that has become an industry consensus," Geng said, pointing to a huge market potential.

He predicted that AI applications would be dominated neither by cloud nor by on-device computation, but instead by a combination of both.

In the future, "on-device AI should be able to detect and process raw data and run algorithms beforehand, and after filtering, more valuable data will be transferred to the cloud, forming a big database," Geng said.
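The hybrid pipeline Geng describes can be illustrated with a minimal sketch (not from the article; the function name, data fields and threshold are hypothetical): the device scores raw readings locally and only uploads those deemed valuable, cutting the bandwidth that pure cloud AI would consume.

```python
# Minimal sketch of on-device filtering before cloud upload.
# All names and values here are illustrative assumptions.

def filter_on_device(readings, threshold):
    """Keep only readings whose local AI score clears the threshold;
    low-value frames are handled (and discarded) on the device."""
    return [r for r in readings if r["score"] >= threshold]

# Hypothetical sensor output: most frames carry little information.
raw = [
    {"frame": 1, "score": 0.10},
    {"frame": 2, "score": 0.92},  # e.g. a recognized object
    {"frame": 3, "score": 0.05},
    {"frame": 4, "score": 0.88},
]

to_cloud = filter_on_device(raw, threshold=0.8)
print([r["frame"] for r in to_cloud])  # only frames 2 and 4 are uploaded
```

In this sketch only two of four frames ever leave the device; the cloud still accumulates the "more valuable data" for its big database while the device absorbs the rest.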

Commenting on the trend, Sun Gang, global vice president of US-based tech giant Qualcomm Technologies, also proposed a model that utilizes deep learning through the cloud, with the device executing intelligent decisions.

He also said at the forum that the smartphone, with an expected global shipment of 8.5 billion units in the next five years, is likely to be the first type of mobile device to widely employ embedded AI technology.

China is a pioneer, or at least not a latecomer compared with foreign rivals, in on-device AI smartphone applications, despite gaps in the underlying on-device AI platforms – chips and OS – a field currently led by US companies Qualcomm and Google, Geng said.

Domestic telecom heavyweight Huawei in October unveiled its new Mate 10 model, powered by the Kirin 970 processor with on-device AI capabilities.

The handset, with its sensors and cameras, can provide real-time image recognition and language translation, and can heed voice commands.

Yu Chengdong, CEO of Huawei's terminal service department, said that the Mate 10, with its on-device AI capability, is 20 times faster in image recognition than foreign vendors' smartphones, news website ifeng.com reported in October.

For example, "it takes only 5 seconds for the Mate 10 to recognize 100 photos, but for the iPhone 8 Plus and Samsung Note 8, such time skyrockets to 9 seconds and about 100 seconds, respectively," Yu was quoted as saying in the report.

Geng also highlighted China's abundant AI talent, most of whom have studied and been trained overseas, in bridging the gap with foreign counterparts.

Barriers ahead

At the forum, industry insiders also took note of a number of technological barriers, stressing that the development of embedded AI is still in its "infant period" with a limited scope of applications.

This is because "the power, thermal and size constraints of mobile devices limit the operating efficiency of embedded AI," Sun from Qualcomm said.

So far, even the highest-performance chips are unable to accommodate the growing on-device AI workloads, Geng added.

Geng's comment was echoed by Chen Yunji, co-founder of Beijing-based AI chip start-up Cambricon Technologies Corp.

"Back in the 1990s, a similar problem of insufficient computing capacity also appeared in the graphics processing sector… it was not until the invention of a specialized graphics processing unit chip that the problem was addressed," Chen said at the forum, urging chipmakers to step up efforts to develop a chip capable of stronger deep learning abilities.

But research and development of such chips costs a lot, and that has led to another headache.

"Are consumers willing to pay for a more expensive mobile device with on-device AI capacity? Tech companies still need to balance the costs and revenues," Geng said.

Another way to deal with the issue is to revise on-device algorithms, which could then lower the requirement for chip efficiency, Tang Wenbin, chief technology officer of Face++, a Beijing-based tech start-up that specializes in facial recognition, said at the forum.

Tang noted that Face++ is now working on a revised version of ShuffleNet, with the aim of speeding up data computation tenfold from the current level.

Photo caption: The Huawei Mate 10 model with on-device AI capabilities recognizes food at a display area in Beijing over the weekend.
