National Post

‘It’s unworkable’

‘Online harms’ bill draws international ire

- DAVID REEVELY

OTTAWA • The world is watching as Canada’s federal government prepares legislation to fight online harms such as hateful speech and the non-consensual sharing of sexual images. The world does not like what it sees.

“Even if a system like the one that’s proposed could work in Canada, which I don’t think it could, it would right away get transposed to any number of countries that don’t have Canada’s checks and balances and due process [and] rule of law,” said Nathalie Maréchal, of the Washington-based think tank Ranking Digital Rights. “And the line would be, ‘But Canada does it, so why can’t we?’”

The Liberals have promised that a bill to crack down on online hate, terrorist plotting and sexual exploitation will come in their first 100 days, and they signalled their thinking in proposals published at the end of July, before Prime Minister Justin Trudeau called the September election.

Those include:

❚ Putting a legal obligation on platforms like Facebook and Twitter to “take all reasonable measures … to make [harmful content] inaccessible to persons in Canada,” including by applying their own AI tools to sniff it out and by suppressing it within 24 hours of receiving a notice from an outsider;

❚ Establishing “robust flagging, notice, and appeal systems for both authors of content and those who flag content”;

❚ Notifying police and CSIS of proscribed content, with details of what justifies calling law enforcement to be determined later;

❚ Creating a new Digital Safety Commission, with a commissioner empowered to enforce the rules (including through inspections and raids) and a Digital Recourse Council for appeals of platforms’ takedown decisions; and

❚ Fines for violations topping out at three per cent of a platform’s global annual revenue or $10 million, whichever is more.

The regime the proposals describe has been condemned by the Citizen Lab at the University of Toronto (“Rewrite the Proposal from the Ground Up,” reads the headline on the conclusion of its formal submission to the consultation), and the Canadian Internet Policy and Public Interest Clinic at the University of Ottawa (“the current proposal threatens fundamental freedoms and the survival of a free and open internet in Canada and beyond,” says the introduction to its submission).

But outside the country, it’s drawn condemnation not just from Maréchal’s group, which issues report-card-style rankings of big digital players on how well they respect free expression and privacy. The venerable Electronic Frontier Foundation attacked the plan in August. Daphne Keller, director of Stanford University’s program in platform regulation, published a top-five list of its flaws.

“Human rights groups like Human Rights Watch, Access Now, and Article 19 have been fighting requirements like these one at a time in countries like India, Turkey and Russia. Canada’s proposal combines them all together in one package,” Keller wrote.

The Global Network Initiative, which aims to bring governments and corporations together to protect free expression in the digital world (its members include Facebook, Google, Microsoft and Yahoo!, finance-industry companies such as BMO and civil-society groups like Human Rights Watch), filed a submission saying it’s “concerned that some aspects of the proposed approach appear to be inconsistent with international human rights principles, regulatory best practice, and Canada’s leadership on internet freedom.”

“We, as a community of rights-respecting … advocates, including governments, are substantially weakened in our ability to push back not just in Vietnam or Russia, but also in Brazil, Turkey, if the rights-respecting governments — the ones that stand up and posture themselves as the defenders of human rights — are themselves putting in place laws that look quite similar to the approaches that some of these less rights-respecting countries are taking,” said the Global Network Initiative’s director of policy and strategy, Jason Pielemeier, in an interview with The Logic from Washington.

It’s particularly galling, he said, because in 2022, Canada is chairing the Freedom Online Coalition, a group of 34 countries that promotes liberty of expression and democratic rights on the internet. Through a spokesperson, Heritage Minister Steven Guilbeault — who, despite the recent election, retains his office unless a new minister is named — declined a request for an interview on the barrage of criticism.

Twitter’s manager of public policy in Canada, Michele Austin, said in a statement relayed to The Logic by a spokesperson that Twitter wants the proposals gutted to the studs: “Our sincere hope is that the Government of Canada takes an entirely new approach to these issues after reviewing and analyzing the submissions.”

Twitter responded to the consultati­on, but isn’t sharing its submission publicly, she said.

Facebook is more circumspect. “Facebook supports the creation of a common set of rules to combat harmful content that would apply to all social media companies,” said Facebook Canada spokesperson Lisa Laventure. But, she said, the company did not submit a response to the government consultation.

Mindgeek, the Montreal-based, Luxembourg-headquartered operator of numerous pornography sites, did not reply to an email from The Logic.

Each of the international critiques of the Canadian plan is different, but they have common elements:

❚ The proposal says the final law will use definitions of harmful content that “borrow from the Criminal Code but are adapted to the regulatory context,” and in some cases will deliberately go beyond what’s already illegal;

❚ Platforms will decide what meets those definitions;

❚ They will have to make those decisions very fast;

❚ They’ll be ordered to use algorithms to detect problematic content without a requirement that those algorithms be made public, or that they can distinguish between, say, news coverage or satire and actual harmful material; and

❚ The requirement to report some types of flagged content to law enforcement turns private companies into agents of the state and could snare innocent people.

Maréchal pointed to the demand for algorithmic enforcement as a particular problem. U.S. Sen. Amy Klobuchar proposed a bill seeking to crack down on pandemic misinformation on social networks, and it had the same flaw, she said.

“It’s unworkable as a proposal,” Maréchal said. “How do you identify misinformation at scale and at speed without a huge, huge percentage of error? I think a lot of these proposals hinge on wishful thinking that AI would be better than — much better than — it actually is. And unfortunately, you can’t wish that kind of algorithmic prowess into being.”

In practice, algorithmic enforcement mechanisms in any field have often over-targeted minority groups, many of the consultation submissions pointed out. And the steep fines for non-compliance will push platforms to err on the side of taking content down.
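Maréchal’s objection about error at scale is, at bottom, a base-rate argument. As a rough sketch only, using hypothetical figures that come neither from the article nor from any platform, even a detection system with a one per cent false-positive rate would generate millions of mistaken flags a day on a large service, and most of its flags would be wrong:

```python
# Rough base-rate sketch with hypothetical numbers (not from the article or
# any real platform) of why automated flagging at scale produces large
# absolute error counts even when the error *rate* sounds small.

posts_per_day = 500_000_000       # assumed daily post volume for a big platform
harmful_share = 0.001             # assume 0.1% of posts are genuinely harmful
false_positive_rate = 0.01        # assume 1% of benign posts get wrongly flagged
false_negative_rate = 0.05        # assume 5% of harmful posts get missed

harmful = posts_per_day * harmful_share
benign = posts_per_day - harmful

wrong_flags = benign * false_positive_rate            # benign posts flagged in error
correct_flags = harmful * (1 - false_negative_rate)   # harmful posts caught

print(f"Benign posts wrongly flagged per day: {wrong_flags:,.0f}")
print(f"Harmful posts correctly flagged per day: {correct_flags:,.0f}")
print(f"Share of all flags that are mistakes: {wrong_flags / (wrong_flags + correct_flags):.1%}")
```

Under these assumed numbers, more than nine in 10 flags would be mistakes — the “huge, huge percentage of error” she describes — and a 24-hour takedown clock leaves little room to catch them by hand.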

“Companies basically have to be in the position of determining when something is not illegal, but nevertheless is harmful enough that you have to [restrict and report it]. And if you get that decision wrong, there will be potentially quite significant consequences for you,” Pielemeier said.

Hardly any platform really wants to be in the business of promoting harmful content, he said, and we shouldn’t try to regulate the whole internet to get at the few bad actors that do.

“I think the bigger challenge is, how do we encourage and facilitate the companies that are trying to do better at addressing this content, without pushing them so far in the direction of responsibility and liability that they effectively take over state functions in terms of detecting and determining when content becomes illegal and needs to be actioned?” he said. “Because that raises some very deep questions and concerns in terms of democratic accountability and responsibility.”

Maréchal said she believes the approach of targeting specific instances of harmful content is wrong-headed.

“One thing is to focus first of all on the business of a tech company, and to focus more on process than on results, and to shift the incentive structures that lead them to make the product design choices and the business decisions that they do,” she said.

Hyper-targeted advertising, combined with algorithmic systems designed to hold people’s attention by serving them content without regard for what that content is, has produced bad results, Maréchal said.

“If you reformed that and changed the incentives under which companies make decisions, you can improve the outcome without opening this door for autocratic regimes, or even just authoritarian-curious regimes, to make bad decisions,” she said.

Some moves would require action by platforms’ home countries — the United States, in the cases of Facebook and Twitter. Facebook’s corporate structure, with Mark Zuckerberg as CEO, chair of the board and key shareholder, doesn’t lend itself to accountability, Maréchal said, and many platform companies are almost as centralized.

Requirements for platforms to carry out human rights assessments of their plans before introducing new products or entering new markets would make them think ahead rather than trying things and seeing what happens, she said.

Pielemeier said governments’ failure to act earlier on online harms — maybe due to an understandable belief that they were better off leaving alone systems they didn’t really grasp — has left them without the capacity to do this sort of regulation well.

“Now that governments are feeling politically motivated to regulate these spaces, they don’t have necessarily a strong relationship in civil society or in companies, in regulators and authorities,” he said. “There’s more of this sense of antagonism that maybe there wouldn’t have been if they had taken a different approach.”

For more news about the innovation economy, visit www.thelogic.co

Minister of Heritage Steven Guilbeault
