The New Zealand Herald

FILTH & FURY

Why filtering the internet is so hard

- Juha Saarinen comment

Another live-streamed mass murder, this time in Thailand, where a soldier went on a shooting rampage. That’s something nobody needs to see, but how to prevent monstrous acts from being transmitted and viewed by people remains an impossible challenge.

It is true that the internet provides access to both great and awful material; however, bear in mind that there is no “raw” or unfiltered internet where anything goes.

On the contrary, most of us have filtered internet access already. Email would be even less usable than it is without spam filtering, and internet providers drop connections from risky networks; that’s considered best-practice network engineering, and it would be unsafe not to do so.
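To give a sense of what “dropping connections from risky networks” looks like in practice, here is a minimal Python sketch; the CIDR ranges are documentation examples, not a real reputation list.

```python
# Toy sketch of reputation-based network filtering: drop connections
# arriving from networks known to be risky. The ranges below are
# invented documentation addresses, not an actual blocklist.
import ipaddress

RISKY_NETWORKS = [
    ipaddress.ip_network("198.51.100.0/24"),  # hypothetical "bulletproof host"
    ipaddress.ip_network("203.0.113.0/24"),   # hypothetical spam source
]

def should_drop(source_ip: str) -> bool:
    """Return True if the connection comes from a listed risky network."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in RISKY_NETWORKS)
```

Real providers do this at scale with constantly updated reputation feeds, but the principle is the same membership test.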

Google filters search results, and its Chrome browser blocks access to malicious sites. Facebook and Instagram happily let people and bots post horrendous disinformation, but any nude bits are a no-no.

Even so, filtering out bad stuff on the internet works poorly. If it worked well, the internet would be a safe environment, without hackers, ransomware criminals, viruses, government and corporate surveillance, you name it.

Despite access restriction efforts struggling to keep up with malicious actors abusing technology, the Government now appears to be looking for more of the same to stop objectionable content from finding its way to New Zealanders’ devices.

Let’s call the proposed new law what it is: censorship of the internet.

As we’ve seen with past “internet laws” like the Telecommunications (Interception Capability and Security) Act and the anti-file-sharing amendment to the copyright law, the Government put the onus on providers to work out how to comply with the regulations, or face huge fines.

Censoring the internet shouldn’t be the providers’ job though.

For starters, the law would need a rock-solid definition of what constitutes bad things on the internet. That covers existing banned material, and bad things that haven’t yet popped up but should be filtered out when they do. Some things everyone can agree are bad, but so many areas aren’t clear cut.

For example, I would be happy to see all anti-vaccine sites and Facebook groups blocked.

Children are dying because of the dangerous anti-science nonsense, but even then I can guarantee that thousands of people would disagree and not want anti-vax disinformation banned. Ditto anti-1080, anti-5G and anti-fluoride campaigners, nutters advocating drinking bleach to cure novel coronavirus infections, and other crazed nonsense.

Asking internet provider staffers to make snap decisions as to what’s a bad thing and what is not simply isn’t fair on them. What if they get it wrong? They’d have to watch and hear horrible things, and risk being traumatised in the process.

Technically, devising censorship filtering would be complex and risky, and would require constant monitoring for errors and problems. On top of that, should providers become NetCops as well, keeping tabs on which users are actively trying to bypass censorship, and then reporting them to the authorities?

There’s no single solution available that does all the required filtering effectively. Instead, providers would have to block and filter at several levels, each with its own set of problems.

You could block links or web addresses leading to objectionable content, but they are easy to change. You’d also have to be careful not to drop traffic to, say, microsoft.com in the process.
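A minimal Python sketch shows why address blocking is so brittle; the URLs here are invented. An exact-match blocklist catches the known address but misses the same content republished one path over, which is why such lists need constant updating.

```python
# Toy URL blocklist with exact matching. The addresses are hypothetical.
# Exact matching is trivially evaded: the host just renames the page.
from urllib.parse import urlparse

BLOCKLIST = {"badsite.example/stream1"}  # invented objectionable address

def is_blocked(url: str) -> bool:
    """Block only if host + path exactly matches a blocklist entry."""
    p = urlparse(url)
    return f"{p.hostname}{p.path}" in BLOCKLIST
```

Loosening the match (say, blocking every URL that merely contains a keyword) goes wrong in the other direction, catching legitimate sites, which is the microsoft.com risk mentioned above.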

Blocking internet protocol addresses is another way to do it, but the risk of collateral damage is enormous. The Australian Securities and Investments Commission meant well but got it wrong a few years ago, accidentally blocking access to 250,000 sites hosted on one IP address. Oops.
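The collateral-damage problem comes from shared hosting: many unrelated sites sit behind one address, so blocking the address blocks them all. A small sketch, with all hostnames and addresses invented:

```python
# Why IP blocking over-blocks: one shared address serves many sites.
# All names and addresses below are invented for illustration.
DNS = {
    "objectionable.example":     "192.0.2.10",  # the intended target
    "school-newsletter.example": "192.0.2.10",  # innocent, same shared host
    "local-cafe.example":        "192.0.2.10",  # innocent, same shared host
    "unrelated.example":         "192.0.2.99",
}

BLOCKED_IPS = {"192.0.2.10"}  # regulator meant to block one site

def reachable(hostname: str) -> bool:
    """A host is reachable only if its IP address is not blocked."""
    return DNS[hostname] not in BLOCKED_IPS
```

Here one rule aimed at one site takes three sites offline; ASIC’s mishap was the same mechanism at a scale of 250,000.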

Content filtering relies on agreement as to what constitutes bad things, and is subject to the Scunthorpe Problem, which needs no further explanation.
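For those who want the explanation anyway, the Scunthorpe Problem fits in a few lines of Python: a naive substring filter flags the Lincolnshire town’s name because of four letters it happens to contain.

```python
# The Scunthorpe Problem in miniature: substring matching cannot tell
# a rude word from an innocent word that merely contains it.
BANNED_SUBSTRINGS = ["cunt"]

def naive_profanity_filter(text: str) -> bool:
    """Return True if the text is 'profane' by naive substring match."""
    lowered = text.lower()
    return any(bad in lowered for bad in BANNED_SUBSTRINGS)
```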

I’m told that one idea is to take the existing Department of Internal Affairs NetClean filter against child abuse material, and extend it to block objectionable content.

That wasn’t ever within the scope of the filter. Pushing all of New Zealand’s data traffic through that filter would likely turn it into a single point of failure, overwhelmed by gigabit-per-second fibre connections at home and fast 4G/LTE networks.

Then there’s the inconvenient fact that most internet connections are strongly encrypted.

This means content filters can’t see what’s being sent or viewed. To do that, data streams would have to be decrypted so that bad things could be identified. That is, you’d have to remove an essential security feature and put people’s information and privacy at risk.
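A toy illustration of the point, using a simple XOR stream cipher as a stand-in for TLS (this is not real cryptography, just a sketch): a keyword scanner sitting in the middle of the network matches the plaintext easily, but sees only noise once the traffic is encrypted.

```python
# Toy stream cipher standing in for TLS, to show why a network-level
# filter cannot inspect encrypted traffic. Not real cryptography.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random byte stream from the key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR with the keystream; the same call encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def middlebox_scan(payload: bytes, keyword: bytes) -> bool:
    """All a network content filter can do: look for bytes it recognises."""
    return keyword in payload
```

Only the endpoints hold the key, so to scan the content the filter would have to break or strip the encryption, which is exactly the security trade-off described above.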

It is almost impossible to moderate and censor content when millions or even billions of people can create whatever they like and post it on social media. That’s even harder for live streaming when there’s no clue in advance as to what will be broadcast.

The tech solution touted here is machine learning and artificial intelligence, which are getting better all the time at recognising images and audio.

Taking that idea to the devices people use to create content is plausible: even mid-range smartphones now contain powerful AI processors.

Software running on smartphones that uses those AI chips to recognise objectionable images and videos, or for example hate speech, and block them from being recorded is entirely possible.

Would such tech be infallible? No, and there’s every chance AI-based censorship on phones would be abused by governments to restrict freedom of expression and to track users.

Not taking action is not an option, and we’ll probably end up with a broad-brush general requirement that leaves the difficult techie bits for someone else to work out.

Whatever comes, the sad thing about it is that none of it addresses the most important issue, which is what the hell is wrong with people who post and watch awful stuff?

Photo / AP: Filtering for live streams of mass murders like that at Thailand’s Terminal 21 mall is near impossible.
