New tack needed on online harms
OTTAWA • A panel appointed by the Liberal government to help rework its controversial proposal to regulate online harms supports a new approach that would put responsibility on platforms, though members disagreed on whether that should include contentious measures like content takedowns.
The group also warned not to expect the problem to be fully solved. “Experts emphasized that the legislative and regulatory framework should strive for meaningful improvement, not perfection,” says a summary of the group's final meeting in June. The model wouldn't fix all incidents of harmful content online, but “would improve online services' overall practices in addressing risk.”
The 12-member expert advisory group was appointed by Heritage Minister Pablo Rodriguez to advise him on redesigning the government's initial proposal, which a variety of organizations and experts warned would result in the blocking of legitimate content and censorship, and would violate Canadians' constitutional and privacy rights.
The Liberal government then put a revised approach to the advisory group this spring, the National Post reported last month. The group hasn't released formal recommendations to the government, but Heritage Canada has published summaries of the opinions expressed in their meetings. The National Post obtained a copy of the final summary, scheduled to be released Friday.
“Experts advised that a risk-based approach to regulation, anchored in a duty to act responsibly, would be most appropriate,” the summary says. The three-step process would see platforms first identify and assess the risks posed by their services, then mitigate those risks, and finally report on what they're doing.
The idea is that the “approach should involve regulated services continually testing the effectiveness of certain tools and adapting them based on their findings.”
While the group was in agreement on the general approach, there was little consensus on some of the most controversial aspects of the government's initial proposal. That included forcing platforms to proactively monitor content and take down flagged posts within 24 hours.
“Some experts voiced concern over mandating removal of any form of content, except perhaps content that explicitly calls for violence and child sexual exploitation content,” the summary said.
“Other experts voiced preference for obligations to remove a wider range of content. They explained that it would be better to err on the side of caution. They expressed a preference for over-removing content, rather than under-removing it.”
A portion of the group warned against forcing or even encouraging platforms to monitor posts on their platforms. “They stated that it is very challenging to justify such a scheme as it introduces risks to fundamental rights and freedoms under the Charter,” the document said.
But other members of the advisory group said “services should be compelled to proactively monitor their platforms, as such monitoring, in many cases, could effectively prevent a violent attack.”