The Guardian (USA)

This thought experiment captures Facebook’s betrayal of users’ privacy

- Richard Ashby Wilson

Imagine that right now the postman is reading your mail and making a note of your most private thoughts and preferences. He notices that you lean slightly to the right and read the Wall Street Journal. He begins hawking your intimate information all over town and sells it to a newspaper further to the right of the WSJ. He observes you reading that new, more rightwing publication and then starts hawking again, and this time he sells your private information to someone more rightwing: a publication like, say, Breitbart.

Year after year, the postman continues selling to everyone, and one day you start receiving a far-right extremist magazine intent on destroying democratic institutions.

By selling your information, the postman becomes rich, exceedingly rich – the richest person in your community. You confront him to say that you don’t consent to being a product to be sold to the highest bidder, and find out he is reading the mail of half the people on the planet. You want to change postal carriers and communicate privately with your friends and family, but there is only one service linking you to the outside world.

When experts talk about social media, you’ll hear a lot about abstract and fuzzy concepts such as the algorithm, transparency and privacy, but they all boil down to whether you want to be a commodity and have no control over who you are sold to. If you are tired of being held hostage digitally, then there are three possible fixes.

First, we can encourage old-fashioned competition. Currently, social media companies are squashing competitors by not letting you take your content to a rival platform if you want to leave. New legislation must compel them to allow “interoperability”: a fancy word that means you can move between platforms and take your followers and posts and photos with you. Many of us would leave Facebook for a new platform that does not read our messages, sell our information, throw away mail from family and friends with different political views, or introduce us to dangerous extremists. Some might be fine with a robust political debate format, but others may prefer family-oriented discussions, and we should be allowed to choose in the open market.

Second, even if new legislation enabled interoperability, problems such as hate speech and disinformation amplified by company algorithms would remain. The current discussion focuses excessively on content removal, and it might be more helpful to think instead about how social media algorithms elevate extremist positions and foster polarization. The algorithm is simply code written by an engineer to ensure that you see posts that grab your attention, and nothing grabs our attention like violence, abuse and hatred.

Section 230 of the US Communications Decency Act stipulates companies can’t be held liable for content, but immunity for algorithms fomenting extremism is increasingly being challenged in courts. In the Second Circuit case of Force v Facebook, the parents of Taylor Force, a young American who was stabbed in Israel in 2016 by a Hamas supporter who had been radicalized online, sued Facebook for supplying their son’s killer with ever-more extremist content and introducing him to a network of supporters of terrorism. Judge Katzmann wrote in his dissent that Section 230 protects content but “does not protect Facebook’s friend‐ and content-suggestion algorithms”.

Third, we should amend Section 230 to permit civil suits against companies that elevate terrorist content and allow government agencies like the Federal Communications Commission and Federal Election Commission to examine the algorithms used by companies. We have a legitimate interest in looking inside the black box – the black box, after all, is nothing but the postman getting away with abusing his power and access. New legislation must expose companies to lawsuits when they are grossly negligent by industry standards and do not remove posts that could cause imminent harm.

In short, the US needs to rein in the power of some of the largest tech companies on the planet in order to protect populations and democratic institutions. Germany did just this in 2017 with NetzDG, a law requiring platforms to remove content that is manifestly illegal under German criminal law. German democracy did not lurch into censorship and oppression and is indeed thriving. The UK’s parliament is currently considering an Online Safety Bill to rein in terrorist and child sexual abuse content and allow its regulator, Ofcom, to review the algorithms of social media companies.

It is high time Congress and the Biden administration placed reasonable democratic constraints on online advocacy of violence and extremism. The choice is clear: we can either protect our democracy from extremism or lose it.

In the real world, your postal carrier is prevented by law from reading your mail and selling your information to recruiters who wish to spam you with violent extremist material. Those same protections must be extended to Facebook and other companies.

Richard Ashby Wilson is associate dean for faculty development & intellectual life and distinguished professor of law and anthropology at the University of Connecticut School of Law

Photograph: Evelyn Hockstein/Reuters. ‘In the real world, your postal carrier is prevented by law from reading your mail and selling your information to recruiters who wish to spam you with violent extremist material.’
