Toronto Star

DISINFORMATION

- Jennifer Yang

The issue: In recent years, the world has rudely awakened to the proliferation of hate speech and “fake news” on social media — especially during election cycles, when online content can be used to disrupt democratic processes.

Regulators and social media companies have come under pressure to stem the flow of online hate speech and malicious “disinformation.” But these are complicated problems with no easy fixes; the sheer volume of harmful content online is overwhelming, and the notion of regulating speech will always provoke free speech concerns while drawing fierce opposition.

And while some countries have legal definitions for hate speech, defining “fake news” is a tricky proposition; it can also be a dangerous one, especially in the hands of authoritarian regimes that want to censor information or crack down on the free press.

Regulators have struggled to find workable solutions. “No one’s figured it out,” says Heidi Tworek, an assistant professor of history at the University of British Columbia who studies media, democracy and the digital economy. “We continue to have to really evaluate many of these different schemes that are being proposed. These are such large and complex questions.”

What others are doing: Several European countries are now moving toward regulating social media platforms, deploying strategies that have drawn both criticism and praise. In France, the government passed a law in November that empowers judges to order the removal of “fake news” during elections; violators face a penalty of one year in prison or a fine of 75,000 euros.

In the United Kingdom, a new parliamentary report — released last week following an 18-month investigation — is also calling for platforms like Facebook to be brought under regulatory control. The report proposes several new regulations, including a mandatory code of ethics and an independent regulator who can bring legal proceedings against social media companies.

Germany has taken a particularly bold approach to regulating hate speech on social media. In January 2018, the country’s Network Enforcement Act — better known as the NetzDG or “Facebook law” — came into full force. It forces tech companies to remove hate speech within 24 hours of illegal content being reported. (When it’s unclear whether content is actually illegal under German law, tech companies have seven days to consult and decide.)

The penalty for breaking this law? Up to 50 million euros in fines. “This is probably the furthest anyone has gone in trying to get large social media companies to adapt their policies to local laws,” Tworek says.

What Canada is doing: “Canadians, and the Canadian government, are alive and alert to the issues,” says Dwayne Winseck, a journalism professor at Carleton University and director of the Canadian Media Concentration Research Project.

He points to Canada’s recently passed electoral reform bill, C-76, as a positive step. Online platforms like Facebook and Google must now create a registry of digital advertisements placed by political parties or third parties during elections and ensure they remain visible for two years. The law also bans the use of foreign money by “third party” advocacy groups during campaign periods — meaning social media companies can’t knowingly accept advertisements paid for with foreign funds.

Another provision prohibits making false statements about a candidate in order to influence an election. However, this applies only to a narrow set of statements (for example, claims about whether a candidate has broken the law, or about their place of birth).

In January, the federal government unveiled plans for safeguarding the upcoming election, including a $7 million initiative to improve the public’s ability to detect “online deceptive practices” and a team of five bureaucrats who will alert the public whenever they find evidence of election interference.

But where Canada could be more innovative is on social media regulation generally, says journalist Chris Tenove, a PhD candidate at the University of British Columbia who studies global governance and digital politics.

He would like to see Canada follow the lead of jurisdictions like the European Union, which worked with social media companies to develop a code of conduct. As a result, platforms have voluntarily committed to “quickly and efficiently” addressing hate speech, and early reports have shown good results, he says.

One promising sign is Canada’s involvement with an international effort called the Grand Committee on Disinformation and Fake News. Comprising parliamentarians from nine countries, the committee is scheduled to meet in Ottawa this May and has called for social media executives — including Facebook’s Mark Zuckerberg and Google CEO Sundar Pichai — to appear and explain what they’re doing to stop the spread of disinformation.
