THE ROADMAP TO FIX SOCIAL MEDIA

The platform monopolies Facebook and Google are so powerful that when they are manipulated, there is real damage to our public discourse. High time to reintroduce serious antitrust regulation.

METROPOLE - Vienna in English - TECHNOLOGY

Awareness of the role of Facebook, Google and others in Russia's interference in the 2016 election has increased dramatically in recent months, thanks in large part to the congressional hearings of Oct. 31 – Nov. 1, 2017. This has led to calls for regulation, including the Honest Ads Act, sponsored by Senators Mark Warner, Amy Klobuchar and John McCain, which would extend the current regulation of political ads on broadcast networks to online platforms. Facebook and Google are opposed, insisting that government regulation would kill innovation and hurt global competitiveness, and that regulation should be left to the industry. But we've seen where self-regulation leads; this problem is just too complicated to leave to the industry.

First, we must address the filter bubbles. Polls suggest that about a third of Americans believe Russian interference is fake news, despite unanimous agreement to the contrary by the country's intelligence agencies. Helping them accept the truth is a priority. To do this, Facebook must be required to contact each person touched by Russian content with a personal message saying, "You, and we, were manipulated by the Russians. This really happened; here is the evidence," and include every Russian message the user received. There's no doubt Facebook has the capacity to do this. No matter the cost, they must absorb the price of their carelessness.

Second, the chief executives of Facebook, Google and Twitter – not just their lawyers – must testify before congressional committees in open session. This is particularly important for the employees. While the bosses are often libertarians, the people who work there tend to be idealists who want to believe what they're doing is good. Forcing tech CEOs like Mark Zuckerberg to justify the unjustifiable would go a long way toward puncturing their cults of personality.

REGULATORY FIXES: A FEW IDEAS

1) Digital bots must not impersonate humans. Bots distort the "public square" in a way never before possible in history. At a minimum, the law on bots should require explicit labeling, the ability to block them, and liability on the part of platforms for the harm they cause. Platforms must be accountable.

2) New acquisitions must be blocked until platforms have addressed the damage, taken steps to prevent future harm and allowed open competition. Platform growth has often depended on gobbling up smaller firms to extend their monopoly power.

3) Transparency about the sponsors of political and issue-based communication. The Honest Ads Act is a good start, but it should also cover issue-based messages.

4) Transparency about the algorithms. Users deserve to know why they see what they see in their news feeds and search results. If Facebook and Google had to be up-front about the reason you're seeing conspiracy theories – namely, that it's good for business – they would be far less likely to stick to that tactic.

5) Equitable contracts with users. Facebook and Google have asserted unprecedented rights in their terms of service, which can change at any time. All software platforms should be required to offer a legitimate opt-out, increasing transparency and consumer choice, and forcing more care in every new rollout. It would limit the risk that platforms would run massive social experiments on millions of users without prior notification.

6) Limits on the commercial exploitation of consumer data. Currently the platforms are using personal data in ways consumers do not understand, and might not accept if they did. And they will use that data forever, unless someone tells them to stop.

7) Consumer ownership of their own data. Users created this data, so they should have the right to export it to other social networks. The likely outcome would be an explosion of innovation and entrepreneurship. Startups and established players would build new products that incorporate people's existing social graphs, forcing Facebook to compete.

8) Return to traditional antitrust law. Since the Reagan era, antitrust law has focused on prices for consumers, allowing Facebook and Google to dominate several industries – not just search and social media but also email, video, photos and digital ad sales. This approach ignores the social costs of addiction, manipulated elections and reduced innovation. All of these costs are evident today.

Increasing awareness of the threat posed by platform monopolies creates an opportunity to reframe the discussion about concentration of market power. Limiting the power of Facebook and Google not only won't harm America or Europe, it will almost certainly unleash levels of creativity and innovation that have not been seen in the technology industry since the early days of, well, Facebook and Google. Before you dismiss regulation as impossible in the current climate, consider this: Nine months ago, when Tristan Harris and I joined forces, hardly anyone was talking about these issues. Now lots of people are, including policymakers. And while it's hard to be optimistic, that's no excuse for inaction. There's far too much at stake.

"It reads like the plot of a sci-fi novel: A technology celebrated for bringing people together is being exploited to drive us apart."

ROGER MCNAMEE is the managing director and a cofounder of Elevation Partners, an investment partnership focused on media and consumer technology. He is the brother of METROPOLE Editor in Chief Dardis McNamee.
