Why web regulation makes us less safe

The world around us is changing, but taking away our right to privacy is not the way to solve it, says Verity Burns

net magazine · ESSAY · Illustration by Kym Winters

When something bad happens, the kneejerk reaction of ‘blame the internet’ is one that is overused and rarely effective. In the wake of several devastating terror attacks on British shores, the government has once again wheeled out its favourite scapegoat, calling for yet more red tape and regulation online to stop the bad guys doing wrong.

While the desire to find a villain in a terrible situation is understandable, providing such a simplistic answer to one of the world’s most complex questions is shortsighted at best and dangerous at worst. Moreover, it shows just how out of touch our politicians are when it comes to the wide-ranging issues of the web.

Internet regulation has been on Prime Minister Theresa May’s to-do list since she first became home secretary in 2010. While the original Draft Communications Data Bill was blocked by the Liberal Democrats in 2013, a revised version – the Investigatory Powers Act – came into force under her premiership on 30 December 2016, widely backed by both Labour and Conservative MPs.

Nicknamed the Snooper’s Charter, it gives increased government access to a user’s internet history for up to 12 months without the need for a warrant, alongside new powers for investigators to hack the devices of suspects in order to collect data.

“That’s fine”, many of you will say. “I’m not doing anything illegal, I’ve got nothing to hide!” You might not, but as whistleblower Edward Snowden put it: “Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.”

Privacy is a right we’re all entitled to, and a right we enjoy daily when we close our curtains at night. And yet with this new legislation, it’s not just the police or the Ministry of Defence who have free rein over it – the Food Standards Agency and the NHS also have unfettered access to your data. With no warrant required, no questions asked, it’s a system that is wide open to abuse.

You don’t even have to be doing anything illegal. You could find yourself on the receiving end of a search by association, and never even know about it. A metaphorical rifling through your knicker drawer – they might not find what they were looking for, but it doesn’t stop them having a good gawk at your granny pants.

That’s before we even get into the privacy concerns of storing such a vast amount of information that hackers could have a field day with. You only have to consider TalkTalk’s data breach of 21,000 customers’ details to know that companies don’t always have their security act together.

Despite this, following the London Bridge terror attack, Theresa May announced she wanted to extend such powers further still, accusing the internet and its companies of allowing “safe spaces” for such ideology to breed.

That’s fighting talk, but it also ignores the fact that the vast majority of terrorist incidents in the UK over the last 15 years have been carried out by people known to the security services.

Forcing more regulation will only drive such toxic mindsets deeper into the darkest corners of the internet and make them harder to observe, all while giving up the privacy of millions of innocent people in the process.

There’s also a fundamental lack of understanding of how the internet works here. Following the revelation that Westminster attacker Khalid Masood had used WhatsApp in the minutes before his attack, Home Secretary Amber Rudd said it was “completely unacceptable” that its contents were inaccessible due to end-to-end encryption. Her suggestion? Backdoor access when the government requires it. The problem there, of course, is that it’s impossible. Any hole in encryption means encryption no longer exists, leaving private communications open to abuse and exploitation.
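To make that point concrete, here is a deliberately toy sketch in plain Python: a one-time-pad XOR cipher standing in for a real end-to-end protocol (WhatsApp uses the far more sophisticated Signal protocol, but the structural point is the same). Only the two endpoints hold the key, so a “backdoor” can only mean an escrowed copy of that key, and whoever obtains the copy reads everything.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy one-time pad: NOT real cryptography, just an illustration."""
    return bytes(a ^ b for a, b in zip(data, key))

message = b"meet at the usual place"
key = secrets.token_bytes(len(message))  # known only to the two endpoints

ciphertext = xor_cipher(message, key)    # all a relaying server ever sees

# Only key holders can recover the message...
assert xor_cipher(ciphertext, key) == message

# ...so a government "backdoor" can only mean a second copy of the key.
# Anyone who obtains that copy, lawfully or not, decrypts everything:
escrowed_copy = key
assert xor_cipher(ciphertext, escrowed_copy) == message
```

The hole is structural: there is no mathematical way to make the escrowed copy work for investigators yet fail for criminals who steal it.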

Tim Berners-Lee gave a stark warning in the wake of Rudd’s comments. “I know that if you’re trying to catch terrorists, it’s really tempting to demand to be able to break all that encryption”, he told the BBC.

“But if you break all that encryption, then guess what? So could other people, and guess what? They may end up getting better at it than you are.”

It also opens up more concerning questions. What would happen to any foreign company (like American-owned WhatsApp) refusing any such co-operation with the British government? Could we see apps and websites banned, as happens behind the Great Firewall of China?

Perhaps. Theresa May has already failed to rule out Chinese-style cyber-blocking when questioned further on the topic, but even then, what’s to stop terrorists (and other wrongdoers) making their own encrypted communications if they’re pushed to?

That’s not to say the internet couldn’t pull its socks up a bit too. Continuing to improve the methods and effectiveness of self-regulation is key to avoiding it being taken out of our control entirely and replaced with a much heavier-handed approach.

Unfortunately, some of that could already be in the pipeline. In June, the UK government announced a joint campaign with France to take stronger action against web companies that fail to remove “unacceptable content” from their pages. That could be anything from child pornography to hate speech. While tech companies undoubtedly have a role in preventing it, placing a legal liability on them is unfortunately not that simple.

Every minute, some 400 hours of video are uploaded to YouTube and 939,000 pieces of content are posted to Facebook. No matter how big a company you are, policing that amount of content manually is impossible; it relies on an engaged community reporting unsavoury material alongside a company’s own measures. Even then, things inevitably slip through the net.

To help, the government has said it wants to work with companies to produce tools to identify and remove harmful material automatically. While that sounds good on paper, as Ed Johnson-Williams of the Open Rights Group, a privacy campaigning group, points out, it also comes with its own concerns.

“First things first, how would this work?” he said in a blog post. “It almost certainly entails the use of algorithms and machine learning to censor content.

“Given the economic and reputational incentives on the companies to avoid fines, it seems highly likely that the companies will go down the route of using hair-trigger, error-prone algorithms that will end up removing unobjectionable content too.

“There are some that will say this is a small price to pay if it stops the spread of extremist propaganda, but it will lead to a framework for censorship that can be used against anything that is perceived as harmful.

“All of this might result in extremists moving to other platforms to promote their material. But will they actually be less able to communicate?”
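The false positives Johnson-Williams warns about are easy to demonstrate. Here is a hypothetical, deliberately naive substring filter (the same failure mode behind the well-known ‘Scunthorpe problem’); real moderation systems are more sophisticated, but the hair-trigger behaviour is what he is describing.

```python
# Hypothetical, deliberately naive moderation filter: flag any text
# containing a banned substring.
BANNED_TERMS = {"bomb"}

def is_flagged(text: str) -> bool:
    lower = text.lower()
    return any(term in lower for term in BANNED_TERMS)

print(is_flagged("instructions for building a bomb"))      # True: intended catch
print(is_flagged("the new play bombed on opening night"))  # True: false positive
print(is_flagged("a bombastic election speech"))           # True: false positive
```

Tune the filter tightly enough to avoid those false positives and it starts missing genuine material instead; there is no threshold that removes only the content everyone agrees is harmful.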

Ideas like this also set a difficult precedent for companies with a worldwide presence, like Facebook. Do May and Macron expect every country to accept our views of what is and isn’t harmful, or can individual governments – less democratic governments – set their own guidelines? To say it’s opening a can of worms is putting it lightly.

There’s no easy solution, and perhaps unhelpfully, this piece doesn’t seek to offer one. But as a web community, we need to recognise the dangers facing us and come together to ensure our rights and freedoms aren’t taken away under the guise of keeping us safe. The current and proposed legislation will do nothing of the sort – in fact, it will do the opposite.

After the Manchester attack, Theresa May reminded us that the terrorists will never win; that they cannot, because “our values… our way of life, will always prevail”. We must remember that in our response. The best way to deal with an attack on our core principles of justice, tolerance and freedom is to strengthen them further, not to take them away.


Verity Burns (@verityburns) is a technology journalist writing about the highs and lows of consumer technology. Also: dog enthusiast. verityburns.com.
