The Guardian Australia

Regulate, break up, open up: how to fix Facebook in 2022

- Aisha Gani

This year the public saw an alarming side of Facebook, after a huge leak of internal documents revealed the extent of vaccine misinformation and extremism on the platform, a two-tier system of who gets to break the rules, and the toxic effects of Instagram for teens.

Digital rights activists around the world have warned about these issues for years, but with the company facing mounting pressure, next year could provide an unprecedented opportunity for action.

We spoke to researchers, activists, and tech experts about how Facebook can be reined in in 2022 and beyond, and the innovative solutions that could bring about change.

Regulation

In the US, the path towards regulation is likely to be a long one. But this year has seen rare bipartisan calls to tighten the rules on big tech.

Section 230 of the Communications Decency Act, which protects Facebook from lawsuits if users post anything illegal, has once again come under scrutiny. Rashad Robinson, president of the civil rights group Color of Change, who led a corporate boycott of Facebook in July 2020, says amending it is a critical first step.

“I believe that there needs to be a removal of the Section 230 immunity when it comes to paid advertising and when it comes to things that are connected to product design,” Robinson said.

Meanwhile, lawmakers have introduced bills – including the Children and Media Research Advancement Act and the Algorithmic Justice and Online Platform Transparency Act of 2021 – that would, respectively, fund research into the platform’s effects on young people and tackle Facebook’s often inscrutable algorithms.

Robinson says such laws would address “the ways in which Facebook makes money and refuses to be held accountable”.

In Europe, 2022 will see a final decision by the European court of justice (ECJ) in a German online gaming case that could pave the way for Facebook to face legal ramifications for privacy violations.

Javier Pallero, the policy director at the digital civil rights organisation Access Now, says any regulation must consider human rights, particularly when it comes to content moderation in the global south. Facebook’s current moderation model is flawed, he says. “They either allow too much or they take down too much and they end up basically censoring entities, activists, and so on around the world. So you need human moderators, ergo, you need more investment, you need more people.”

Breaking it up

Facebook’s sheer size and market dominance remain a major barrier to change, and a growing chorus of lawmakers and others are calling for a simple solution – break it up.

Matt Stoller, research director at the American Economic Liberties Project, says Facebook’s vast power is the greatest threat to democracy. “He’s operating like a sovereign,” Stoller says of Zuckerberg. “And that’s what a monopolist is. Somebody who has control, governing power over a market.”

First, Stoller urges breaking up Facebook’s grip on the social media market. Once Facebook took over all its competitors, he says, “they just started surveilling and doing anything that they wanted, and there was really no way around it”.

Second, Stoller proposes bringing criminal charges against Zuckerberg and his leadership team over allegations of fraud and insider trading. (Facebook has rejected those claims.)

Third, Stoller recommends imposing rules on the social media marketplace so companies such as Facebook can’t be financed by or engage in advertising that is driven by hyper-personalized surveillance.

Fixing Facebook from within

Some of the strongest pushes for change are coming from Facebook’s own workforce or former workers, including Frances Haugen, the former product manager at Facebook’s civic integrity department who disclosed tens of thousands of the company’s internal documents to the Wall Street Journal and the US Securities and Exchange Commission.

Jeff Allen and Sahar Massachi are a former data scientist and data engineer at Facebook who helped build the company’s election and civic integrity team and now run a non-profit organization called the Integrity Institute. They believe the solution is empowering integrity professionals who deal with issues such as trust, security and detecting fake activity.

Massachi says Facebook’s culture currently incentivises the opposite: one team will flag harmful content and recommend driving down engagement, while another team will find a trick to increase engagement with the harmful content.

To fix this, he proposes introducing a monthly metric that ranks companies on integrity. Regulators could monitor companies against this metric, and he envisions them being able to take concrete action if a company’s score slips.

Katie Harbath, founder and CEO of the tech policy consultancy Anchor Change, said the lack of empowerment for integrity teams was a structural problem at Facebook. “The fact that the integrity team reports into the growth team is problematic,” she said, leading to prioritising growth. “One way to think about this would be to actually put integrity and growth on the same level within the company.”

Open the company up to researchers

When Facebook promised to collaborate on a research initiative with academics after the Cambridge Analytica scandal, there were hopes it would shed light on how Facebook affects society. Instead, researchers were met with flawed and incomplete data, with only a handful of scholars granted access.

Nate Persily, professor at Stanford Law School and the director of the Stanford Cyber Policy Center, has worked with Facebook in an academic capacity but became increasingly frustrated with the amount of data the company shared with researchers. Since then he has drafted text for a law – the Platform Transparency and Accountability Act – which would grant scholars access to information the social media company holds, while protecting user privacy.

“These companies have thrived in secrecy and we are now seeing that from the Frances Haugen revelations,” Persily said.

The impact of opening the data up would be twofold: first, it would educate academics and the public about what’s happening on the platform, including the role of algorithms, apps targeting kids, and rates of disinformation, Persily said. Second, Facebook would behave differently if it knew it was being watched.

Facebook headquarters in Menlo Park, California. Photograph: Jeff Chiu/AP
The whistleblower Frances Haugen and Rashad Robinson, president of Color of Change, testify on Capitol Hill on 1 December. Photograph: Alex Wong/Getty Images
