Weekend Herald

Why there may be a smarter way for tech firms to tackle online hate

- Gillian Tett comment

How should tech companies handle hateful or dangerous content? If you were to ask most web users that question, many might answer with one word — “delete”.

We constantly use the delete button on our own screens — and so do the internet giants.

Take, for example, the way Facebook, Twitter and YouTube scrambled last weekend to remove video footage of the terrorist attack on Muslims in New Zealand mosques. Or how the same companies have hired armies of so-called “content moderators” to take down offensive material every day (arguably one of the 21st century’s most horrible new jobs).

But as this race to press delete intensifies, there is a rub — it is usually doomed to fail. Even as the tech giants scrambled to remove the horrific Christchurch footage from the web amid a public outcry, the material kept resurfacing because users were constantly republishing it. Deleting content is like chasing a bar of soap in the bath — it keeps slithering away.

Behind the scenes, as the online hate speech problem worsens, some techies are starting to suggest that instead of just reaching for delete, it’s time to focus on another set of tactics beginning with “D” — demotion, dilution, delay and diversion.

This may sound odd, but the idea was laid out well a couple of weeks ago in a powerful essay by Nathaniel Persily, a law professor at Stanford University, who wrote it to frame the work of the Kofi Annan Commission. Convened last year by the former UN secretary general, the initiative aims to advise on best practice in protecting democracy from fake news, online extremism and the like.

Persily’s essay starts by pointing out that in the early days of the internet, techies sometimes claimed that their platforms were like a “town square” — a place where people could, through meeting and talking, contribute to the wider good. It’s an appealing image, but it’s wrong.

A physical town square has fixed contours and can, in principle, be seen by everyone. A digital “town square”, by contrast, is constantly being redefined by participants — and nobody sees the same space. That partly reflects self-selection — we all customise what we view. But it is also due to the myriad of algorithms that search engines use to shape our view, prioritising the content we see according to our consumption patterns.

“The choices platforms make as to the relative priority of certain types of content are, in many respects, more important than the decisions as to what content to take down,” Persily says.

In some ways, this is a terrifying thought. We have no idea how these algorithms work, so most of us cannot measure how they are clouding our vision of the “square” — or realise what we are not seeing as a result of a tech company’s “priorities”.

Never before has such extraordinary power sat in the hands of private-sector behemoths.

But there is an upside. If companies such as Google or Facebook want to tackle massacre videos and incitements to hatred, they don’t always need to “delete” them. Another tactic is to use algorithms to drown them out or delay how quickly they surface. “Demotion remains a powerful tool for platforms to address problematic content without taking the more extreme step of deleting it from the site,” Persily notes.

Another possible tactic is diversion/dilution. Consider the “Redirect” project established by Jigsaw, a division of Google. Three years ago, it set out to identify the key words used by internet users to search for Isis recruitment videos — and then introduced algorithms that redirected those users to anti-Isis content.

“Over the course of eight weeks, 320,000 individuals watched over half a million minutes of the 116 videos we selected to refute Isis’s recruiting themes,” the Redirect website explains.

Jigsaw does not say whether the strategy “worked” in the sense of actually slowing recruitment to Isis. But it clearly considers the approach a valuable one.

These days, the Redirect web page offers a blueprint for anyone wanting to copy the tactic (check it out on redirectmethod.org). No doubt some western governments are considering its potential against rightwing extremists.

Is this a good idea? Some might argue that it depends on the context. The same tools being used to counter Isis videos — or white-supremacist hate — could also be misused by abusive governments. China, for example, is adept at using these Ds. And, less dramatically, you only need to look at Facebook to see how manipulating algorithms can backfire. A recent report from NewsWhip, a social media tracker, notes that when Facebook changed its algorithm last year to give more priority to content that “engages” users, this caused a rise in the visibility of news stories tailored to provoke anger or outrage.

But, as hate-driven content proliferates online, the use of these Ds is likely to intensify.

Think of that the next time you press delete on your keyboard. Removing content is not the only way to shape our minds. The most powerful censorship tactics are those we never see — for good and ill.
