POLICING THE FUTURE with drones, AI and facial recognition
Police are trialling facial-recognition cameras, human-hunting drones and artificial intelligence. Nicole Kobie reveals the tech on test
The police van looks like any other, but on its roof sits a pair of large cameras. Along the side, above the blue and yellow reflective paint, is the explanation: “Adnabod Wynebau Wedi Ei Gosod”. Or, in English: “facial recognition fitted”.
This innocuous statement may well be the future of British policing – and the South Wales Police force’s facial recognition tech has already paid off with its first arrest. A man wanted for an unspecified crime was spotted by the cameras at the UEFA Champions League final in Cardiff in June.
It’s not the only trial of future tech by British police. AI is helping decide who to bail and who to keep in jail, drones are hunting for missing people, and body-worn cameras are soon to be the norm. The future is coming faster to forces elsewhere, with Dubai boasting its own “RoboCop” and a driverless car that chases down criminals (see p126).
Alongside reducing danger to the public and officers, police are eyeing the same benefits from technology as any other organisation – saving cash, time and effort – but the downsides are potentially much larger (see p127), with concerns over privacy invasion and algorithmic bias. Rollouts must therefore be carefully trialled and considered, with British police forces frequently working with academics for robust research into what works and what knock-on effects there may be to any new technology.

For example, body-worn cameras are growing in popularity – the Metropolitan Police force is rolling out cameras made by Taser-maker Axon to 22,000 frontline staff – so the University of Cambridge studied 2,000 officers sporting the wearable recorders. Its findings? Their use cut complaints from the public by 93%.
“Cooling down potentially volatile police-public interactions to the point where official grievances against the police have virtually vanished may well lead to the conclusion that the use of body-worn cameras represents a turning point in policing,” said
Cambridge criminologist and study author Dr Barak Ariel, at the time of publication.
AI-boosted decisions
Another such project is a joint effort between police in Durham and the University of Cambridge, who are trialling the use of artificial intelligence to help decide which people accused of crimes should be kept behind bars or sent home on bail. Working with Cambridge, Durham Constabulary has created the Harm Assessment Risk Tool, which uses data known about a suspect to decide if they’re likely to pose a risk of violent crime if released from police custody.
The system gives a rating of low, medium or high risk, although a final decision is made by a custody sergeant. The system was tested for two years before being allowed to influence decisions; during that time, the researchers who developed it say it was accurate more than nine times out of ten.
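The core idea is simple to sketch: a model scores a suspect, the score is banded into low, medium or high, and a custody sergeant always has the final say. The following Python sketch is purely illustrative; the thresholds, function names and inputs are hypothetical, not those of the real Harm Assessment Risk Tool.

```python
from typing import Optional

# Illustrative sketch of a HART-style triage step. A model's estimated
# probability that a suspect commits a violent offence if released is
# banded into an advisory rating; the thresholds here are invented for
# the example, not taken from the real tool.

def risk_band(p_violent: float) -> str:
    """Map a model probability to an advisory risk band."""
    if p_violent >= 0.7:
        return "high"
    if p_violent >= 0.3:
        return "medium"
    return "low"

def custody_decision(p_violent: float,
                     sergeant_override: Optional[str] = None) -> str:
    """The model only advises; a custody sergeant's call takes precedence."""
    advisory = risk_band(p_violent)
    return sergeant_override if sergeant_override else advisory
```

The key design point, reflected in the source's description, is that the algorithm's output is advisory: the human decision always wins when the two disagree.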
However, the rise of AI and intelligent cameras in police work has more to do with boosting officers’ abilities than with replacing them, notes Dr Anne Adams, senior lecturer in innovation at the Open University. “The interesting and intriguing way forward with technology is not so much in the technology itself, but how the technology merges with the intelligence of us,” she told PC Pro.
Crime prediction
It’s nigh on impossible to mention police technology without referencing Minority Report, the 2002 film based on Philip K Dick’s short story about crime prediction. Police already throw AI at datasets for predictive policing, but not in the way that story suggests.
Instead, data is analysed and mapped to predict where crime might flare up. This not only means police can be deployed to that area but also helps us understand and address the reasons behind such crime hotspots.
That idea is being trialled by Imperial College London and the Metropolitan Police, alongside a host of other universities and police forces around the country, with mathematicians and police teaming up to develop algorithms to spot potential problem areas.
“These models are based on crime data and seem to work well, but we can make improvements to provide greater levels of statistical validity of model-based predictions,” Professor Mark Girolami of Imperial College London said at the project’s launch earlier this year. “With more powerful models we can start to predict not just where, but when, and what type of crime is likely to occur.”
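The “where” half of such predictions can be illustrated with a deliberately crude sketch: bin past incident coordinates into grid cells and rank cells by recent incident count. This is a toy stand-in for the far more sophisticated statistical models the Imperial/Met project is developing; every number and name below is illustrative.

```python
from collections import Counter

# Toy hotspot mapping: divide an area into square grid cells, count recent
# incidents per cell, and flag the busiest cells. Real predictive-policing
# models are much richer, but the core idea (mapping past crime data to
# likely future hotspots) is the same.

CELL_KM = 0.5  # grid cell size in kilometres; illustrative

def to_cell(x_km: float, y_km: float) -> tuple:
    """Snap a coordinate (in km) to its containing grid cell."""
    return (int(x_km // CELL_KM), int(y_km // CELL_KM))

def hotspots(incidents, top_n=3):
    """Return the top_n grid cells ranked by incident count."""
    counts = Counter(to_cell(x, y) for x, y in incidents)
    return counts.most_common(top_n)

# Hypothetical incident coordinates: three cluster near the origin,
# two near (5, 5), one is isolated.
incidents = [(0.1, 0.2), (0.3, 0.4), (0.2, 0.1),
             (5.1, 5.2), (5.3, 5.4), (9.9, 0.1)]
print(hotspots(incidents, top_n=2))
```

As Girolami notes, the statistically interesting (and ethically fraught) work lies beyond this kind of counting, in models that also predict when and what type of crime is likely.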
He warned that there are ethical issues to consider, particularly when such models are extended from predicting which neighbourhoods could become crime hotspots to spotting potential serial offenders. “For example, some police forces would like to be able to predict who might become a serial offender, and make an intervention at an early stage to change the path followed,” he noted. “The ethical issues are really huge there – should we even be thinking about such interventions?”
For that reason, he said the project will include not only mathematicians “working on theorems and proofs” but will also include psychologists and social scientists.
Where humans are better
Such tools may help support police, but Dr Adams notes they can’t outright replace officers. As one example, she points to a subset of police workers known as “super-recognisers” – people who can spot a face in the crowd and identify it as one of hundreds or thousands of wanted criminals.
While super-recognisers can be helped in their work by better-connected image databases, making it easier to flag criminals roaming the country, there’s simply no computer system as good at recognising faces as these people – including the Welsh facial-recognition van. “They can literally see somebody for two seconds and recognise them far better than technology,” Adams explained.
That’s especially true as police forces face challenges not only in crime, but in managing other areas of social services, such as mental health issues. “They have to deal with everything,” Dr Adams said. “It’s hard enough for a police officer to understand issues around health care… it’s not so simple that a computer can deal with it.” Drones, facial recognition and AI may make officers’ jobs easier, but policing still needs a human touch.