Social media’s power demands an advertising rethink

Given that machine learning can take human manipulation from powerful to nearly perfectible, we need strong limits

The Peterborough Examiner - Opinion - Christopher Barrington-Leigh is an associate professor at McGill University’s Institute for Health and Social Policy and its School of Environment.

It’s time for a major rethink of advertising. In fact, an enormous rollback of this runaway industry is in order. It is clear that advances in machine learning and data harvesting have changed the game completely.

But at the same time, knowledge about human limitations has also progressed. We now know how human vulnerabilities and psychological quirks make us susceptible to manipulation.

This ranges from obvious ways, like making us feel inadequate about our bodies, to less obvious ones, such as hijacking the primitive reward circuitry in our brains to make us do and buy things that don’t actually make us happier.

In fact, we now even know from work in economics, including my own, that people can become misguided in their explicit beliefs about what makes for a better (happier) life, overall.

Economists have also recently emphasized the idea of a “dark side of capitalism”: just as our capitalist system is remarkably successful at filling niches when and where a need arises, it is equally able to create fake needs wherever there exist human weaknesses that can be exploited. The market is certain to be just as good at that task, which hurts people but makes a profit, as it is at the task of helping people while making a profit.

So there is a good basis for being careful about what we allow advertisers to do and to tell people. Canada is among those countries that already have some protections against advertising to children, and stronger online privacy laws than exist in the United States.

Those give us some protection against the frightening intrusions on personal autonomy and privacy for which Facebook CEO Mark Zuckerberg spent two days this week defending his firm before the U.S. Congress.

But is Zuckerberg’s opinion on regulation really a good starting point?

Given that machine learning can take human manipulation from powerful to nearly perfectible, we need strong limits. In fact, targeting ads to individuals’ profiles should simply become illegal.

Displaying ads that are tailored to a particular outlet, such as this newspaper, is fine, and is how things used to work.

Displaying selective ads on a website based on the search term used to summon that particular website is also not terribly intrusive. But beyond that, the line should be drawn, so that ads may not make use of information from our past, our other online activity or any other behaviour not related to the immediate moment.

This would mean that advertising would become slightly less efficient. Ads would be less perfectly targeted to us, but they would still be effective enough. Advertising revenue might also be spread back out a little more, rather than captured narrowly by whoever has the biggest database and neural network.

What would this mean for Facebook? It would mean the phase-out of the Facebook model. In the future, new micropayment systems may make it more feasible to pay for some of our online services, rather than sell our attention and minds to providers. And other online services may become public infrastructure.

But if targeting ads to individuals were forbidden, there would be no more incentive for the creation of ethical time-bombs like Facebook.

Mark Zuckerberg knows who Facebook’s customers are. They are the advertisers, not its users. And Zuckerberg’s responsibilities are primarily to stockholders. Facebook’s profit is maximized when it does whatever is necessary to create a machine with as much power as possible to control the thoughts, beliefs and desires of every one of us, everywhere.

That sounds like a time-bomb we want to defuse early on.

