The Guardian (USA)

TechScape: UK online safety bill could set tone for global social media regulation

- Dan Milmo

Even before the arrival of Facebook whistleblower Frances Haugen, social media companies were feeling the heat from regulators and politicians. It is white-hot now.

We were well past the tipping point of mild concern from governments and watchdogs anyway, but Haugen’s document leaks and Senate testimony have given greater legitimacy and impetus to those who say something has to be done.

In the UK, that means clarifying and attempting to toughen up the draft online safety bill, a landmark piece of legislation that could set the tone for social media regulation around the world. The debate over the bill moves on tomorrow with the resumption of hearings by a joint committee of MPs and peers into the proposed legislation.

The online safety bill covers tech firms that allow users to post their own content or to interact with each other – a net that sweeps up a host of recognisable names from Facebook and Instagram to Twitter, Snapchat and YouTube. Search engines such as Google will also be included, as well as commercial pornography sites like OnlyFans and video games that allow users to talk to each other. The bill imposes a duty of care on these companies to protect their users from harmful content, and the communications regulator, Ofcom, will be tasked with overseeing compliance.

The duty is split into three areas: preventing the proliferation of illegal content and activity such as child pornography, terrorist material and hate crimes such as racial abuse; ensuring children are not exposed to harmful or inappropriate content; and, for the big players such as Facebook, Twitter and YouTube, ensuring that adults are protected from legal but harmful content.

Is Zuckerberg ready for an £8bn fine?

A failure in maintaining that duty of care could result in a fine of up to £18m or 10% of annual global turnover, which, in the case of Facebook, would be more than £8bn. The legislation also contains provisions for a deferred power – if companies fail to follow the line – to impose criminal sanctions on executives if they do not respond to information requests from Ofcom accurately, fully and in a timely manner.

The testimony of Haugen, and the revelations in the Wall Street Journal that were driven by documents leaked by her, have undoubtedly given an extra charge to proceedings. Haugen told senators last week that Facebook puts “astronomical profits before people”, knows its systems lead teenagers to anorexia-related content and knew its main platform was being used to incite ethnic violence in Ethiopia.

Damian Collins, the Conservative MP and chair of the joint committee on the bill, told TechScape that Haugen’s testimony confirmed the need for the sort of regulatory system that the bill proposes. “I think Frances Haugen’s evidence underlines the need for that, a regulator with powers to fine and audit is what we need.”

Collins adds, though, that there are still issues to be clarified, such as whether advertising should be included in the scope of harmful content and whether more should be done on anonymity.

A law with teeth

This does not feel like a piece of legislation that is going to be watered down. Haugen’s testimony, and interjections from the senators interviewing her, emphasised the line that Facebook puts profit over safety (Facebook denies this, saying it has invested vast sums in safety systems and moderators). The UK committee’s first evidence session made the same point, in a memorable appearance by Edleen John, the director of corporate affairs and co-partner for equality, diversity and inclusion at the Football Association. John told the committee that online abuse was a “golden goose” for social media firms because it amplified their audiences. She added: “What we are seeing from social media companies is significant resistance and a desire to focus on a business model and making money.”

John’s session included powerful testimony on the impact of racist abuse by Rio Ferdinand, while Sanjay Bhandari, the chair of the Kick It Out campaign group against racism in football, raised important points around regulating anonymous social media accounts. The three evidence sessions so far have featured contributions from Stonewall, the Antisemitism Policy Trust, the Center for Countering Digital Hate, Wikipedia founder Jimmy Wales, the information commissioner, Elizabeth Denham, and esteemed academics. Taken together, their testimonies will at the very least bring clarity to areas of the bill that need it (defining legal but harmful content, for instance).

Witnesses giving evidence over the next few weeks will include current and former employees of social media companies and Haugen herself, on 25 October. The committee will publish its report before the end of the year and the formal bill is likely to be introduced to parliament next year.

It is worth running through some aspects of the bill as it stands in draft form (the scope of it is such that you’re going to see plenty of stories about it once it becomes law, trust me). The bill is clear that all companies within its remit need to protect people from illegal content such as hate crime, harassment and threats. There is also an emphasis on protecting children from inappropriate content and sexual exploitation and abuse.

Some companies, the ones you’d expect, will get closer scrutiny than others. Firms such as Facebook, Instagram, TikTok, YouTube and Twitter will be in “category 1”, which means they will need to tackle content that is “lawful but still harmful”. This tricky area applies to abuse that is not criminal, to encouragement of self-harm and to misinformation (with Covid vaccines a particular issue right now).

Freedom of speech

Generally, all the companies covered by the bill need to put in place “safeguards for freedom of expression”. These safeguards will be put in place by Ofcom but might involve devolving difficult decisions to moderators. Users must be able to appeal any content removal and companies must reinstate that content if it has been removed unfairly (presumably in contravention of the freedom of expression safeguards). Users will be able to appeal to Ofcom, whose content appeals desk will be busy, you would imagine.

The category 1 businesses will have to publish reports on their impact on freedom of expression, as the government strives to ensure that they don’t “over-remove” content. The government acknowledges that artificial intelligence systems will be used to moderate content, but wants companies to ensure that use of AI doesn’t lead to the removal of posts that are mistakenly deemed as harmful, such as satire.

On top of that, category 1 companies will need to protect “democratically important” content such as posts promoting or opposing government policies, and not discriminate against particular political viewpoints. Facebook, Twitter and co will need to set this out in their terms and conditions, to be policed by Ofcom. A “high level” of protection must be given to content when it is democratically important. In one example cited by the government, a social media or video-sharing platform could let graphic content stay up if it raises awareness about violence against a specific group.

Indeed, Twitter has said in its submission to the joint committee that the bill is not clear enough on “what speech is and is not allowed online”. Facebook says it wants the internet to be safer too, while “maintaining the vast social and economic benefits it brings”.

So there is a lot to get through. Just defining “democratically important” and “citizen journalist” is going to keep the writers of Ts & Cs, and Ofcom, busy. Expect a political push on monitoring the algorithms that tailor the content viewed by social media users and video site viewers. The WSJ and Haugen revelations have given algorithms a villainous aspect that will not escape the attention of regulators. There is a provision in the bill to demand access to information on companies’ algorithms, and that will not be the last we hear of that clause over the next few months.

If you want to read the complete version of this newsletter please subscribe to receive TechScape in your inbox every Wednesday.

Frances Haugen testifies before a Senate subcommittee last week. The former employee told senators Facebook puts ‘astronomical profits before people’. Photograph: Lenin Nolly/NurPhoto/Rex/Shutterstock
