Business Day

Eye in sky lands on our streets, as policing around world gears up

• Gauteng provincial government to boost surveillance through Vumacam partnership

Minority Report ● KATE THOMPSON DAVY. Thompson Davy, a freelance journalist, is an impactAFRICA fellow and WanaData member.

If I had a rand for every time I felt compelled to reference Minority Report as a technology journalist in the 16-plus years I’ve been doing this, I’d have ... OK, maybe enough for a beer, and not of the craft variety.

Still, this Philip K Dick novella turned early-aughts action flick has proven eerily prophetic. We see similar tech all around us today, from (virtually) autonomous cars to biometric-based security, from touch screen interfaces to persistent and personalised advertising, stalking us as we move about in the real and digital world.

We see drones being deployed by both public and private entities, elements of gestural computing at play in gaming rigs and augmented reality tools, and — thanks to Elon Musk’s Neuralink announcement earlier this month — brain-computer interfaces and implants are back in the spotlight.

The tech predictions of the movie treatment weren’t born in a bubble. Director Steven Spielberg did his homework, working with a panel of experts to make the 2050s tech depicted at least plausible, not merely fantastical. And, judging by the pace of innovation, we’re ahead of schedule.

But it wasn’t my love of skop skiet en donner that set the agenda for my column this week. It was a flurry of news stories crossing my screens on the potential and pitfalls of the next generation of security tech and tech-enabled policing.

In late January Wired and The Verge ran a story that seems to encapsulate some of the worst ways to use such tools, detailing how police in the US have leapt from creating DNA-generated 3D models of suspects’ faces to running said models through facial recognition tools.

The intent was to crack cold cases. The result is something akin to the version of future policing in Minority Report. It not only shows a disregard for ethics, but an alarming overconfidence in the tech itself.

We know that technology can introduce bias and that people have used these systems poorly. In Minnesota, as MIT Technology Review reported, technology was used to identify protesters at Black Lives Matter protests. Protesting, I shouldn’t have to say, isn’t a crime, even in the US. And the US and UK have both used algorithms in post-conviction processes, in attempting to predict recidivism and in recommending sentences accordingly.

ProPublica has reported that one such system guessed re-offence likelihood wrong twice as often for black people as for white people. Systems built on incomplete or biased data replicate bias.

That’s not to say the systems without tech intervention are flawless. We know humans have bias. Globally, I can point to any number of death row inmates proved innocent, judges skirting sentencing guidelines to extract more milligrams of flesh, police officers jumping to (wrong) conclusions, and so on.

I’d like to say that the nice thing about human error is that it isn’t networked, but that’s simply not true. A hard-line police captain or prosecutor will slowly shape a local workforce in his likeness. If your coworkers endorse your casual racism, it reinforces such beliefs, and that’s how workplace cultures develop and dig in, like grout mould.

So it is with interest, and caution, that I saw a local story of tech-enabled policing in my inbox. On Tuesday the Gauteng provincial government (GPG) and Vumacam announced a partnership involving “Vumacam’s extensive camera network and advanced crime-fighting technologies”.

Vumacam boasts SA’s largest private CCTV network, with over 6,000 cameras in Gauteng and “access” to some 5,000 partner cameras nationwide. The GPG-Vumacam partnership also comes with the promise of “extending camera coverage to underserved areas, particularly within ... townships, informal settlements and hostels”.

It is really an extension of existing relationships. Vumacam works with the Joburg metro police, the SA Police Service and Business Against Crime through various projects and integrated operations centres, using tech to monitor the incoming video feeds from the extensive network, flag incidents and track vehicles.

It’s an announcement that is sure to divide opinions around the braai this weekend. In the face of relentlessly poor crime stats and the emotive power of victim stories, it will feel like a great, even necessary, step to some. Others object to surveillance on principle, arguing that it is inherently an invasion of privacy, something many believe trumps the law and order justification.

I sit — rather painfully — on the (electrified) fence: an idealist about privacy, a pragmatist about crime. I worry that such systems are open to being turned into the CCTV equivalent of the SIM surveillance scandal, where loopholes in SIM card registration laws enabled state surveillance of investigative journalists.

In fact, the Johannesburg Roads Agency (JRA) and Vumacam got into it — legally — a few years back, when the JRA declined to process some of Vumacam’s “wayleaves” (a type of permit or permission) applications, temporarily halting the rollout of a network of high-definition security cameras. Right To Know was an amicus in the case. The judge ultimately sided with Vumacam.

I asked Vumacam what measures it had in place to address the obvious privacy concerns surveillance raised. CEO Ricky Croock replied that the “system security and data privacy standards [are] of a world class standard”.

“Dark screen technology with system-driven alerts means nobody is ever able to watch a feed constantly, but we are always alerted to crime incidents with the help of AI,” he explained.

Furthermore, he said the data was anonymised and “only drawn when needed for investigations — under highly secure and audited conditions. Where feeds and data are not required, it is disposed of after 30 days. Those with access to the system must pass rigorous checks and use the system under highly regulated conditions, and where footage is used for investigations it is stored securely with stringent measures in place for access.”

Vumacam has certainly ticked the boxes here, hopefully enough to prevent abuses of power, even if the measures will never appease the diehard privacy pundit. So, with potential for bias in the hardware, software and “wetware” (aka people), we are the butter in the rock-and-hard-place sandwich.

We will have to remain constantly vigilant, for the bad guy in the shadows as much as in policies.

Privacy concerns: People who access a security system must use it under highly regulated conditions. Picture: 123RF
