Google I/O 2017: AI, privacy and you

An AI-rst world is still rst pow­ered by ad­ver­tis­ing.

HWM (Singapore) - Feature - By Alvin Soon

Last year, almost a year into his new role as CEO of Google, Sundar Pichai boldly declared that Google would shift from a mobile-first strategy to AI-first (artificial intelligence).

This year’s I/O developer conference demonstrated the fruits of that AI-first focus, where Pichai once again emphasized, “ … in an AI-first world, we are rethinking all our products and applying machine learning and AI to solve user problems, and we are doing this across every one of our products.”

One of the most impressive of these AI-first products is the new Google Lens. As part of Google Assistant, Lens can analyze the real world via a smartphone camera and return relevant results on it. Google demonstrated pointing the camera at a flower, and Lens then identified its species.

But Lens’ AI smarts aren’t just limited to visual recognition; it can also suggest next actions based on what it sees. In one example, it correctly identified text as a Wi-Fi SSID and password, and then asked if the user wanted to connect to the network using those credentials.

Google Home and Photos also received new smart features, and even Gmail for iOS and Android got an AI boost, with an automated-response feature called Smart Reply (previously available on the web) that lets you choose from up to three automated responses based on the email you received.

Google’s advances in machine learning require brilliant people and smart coding, but they also require large sets of data that an AI can use to improve itself over time. Lens, Photos and Assistant harvest more data about the world, and by extension, more of your personal information.

So these new AI-enabled features, while undeniably helpful, also raise scores of privacy concerns. What does Google do with the login password it just identified? How much of your email does Google parse in order to autosuggest responses? Does Google log your location information when you upload images to Lens?

We know that Google already culls your search habits to sell ads; it’s not uncommon, for example, to see an ad on YouTube advertising the very thing you’d just been searching for. It wouldn’t be a stretch to think that your speech and image queries would also be used in the same way.

That’s because the more AdSense, Google’s advertising arm, knows about you and your world, the more it can sell micro-targeted ads that you’re more likely to click. It pays to remember that advertising has always formed the bulk of Google’s profits; in 2016, for example, advertising brought in US$79 billion of its total US$89 billion in revenues.

Google’s AI advances have vast potential to help the public good. But Google is a for-profit company, not a public service provider; R&D funding has to come from somewhere, and even AI has to answer to the bottom line.

In the past, Google could only track what you did online, but consider that with an app like Lens, Google will be able to see what you’re doing in the physical world. As our online and offline worlds encroach on one another, so does the surveillance economy threaten to pervade more of our ‘real’ lives – even as it comes in the form of an AI-powered app.
