Facial recognition proposed for police body cameras
Taser parent company wants to pursue technology; privacy advocates concerned
The country’s biggest seller of police body cameras on Thursday convened a corporate board devoted to the ethics and expansion of artificial intelligence, a major new step toward offering controversial facial-recognition technology to police forces nationwide.
Axon, the maker of Taser electroshock weapons and the wearable body cameras now used by most major American city police departments, has voiced interest in pursuing face recognition for its body-worn cameras. The technology could allow officers to scan and recognize the faces of potentially everyone they see while on patrol. A growing number of surveillance firms and tech startups are racing to integrate face recognition and other AI capabilities into real-time video.
The board’s first meeting will likely presage an imminent showdown over the rapidly developing technology. Shortly after the board was announced, a group of 30 civil rights, technology and privacy groups, including the American Civil Liberties Union and the NAACP, sent members a letter voicing “serious concerns with the current direction of Axon’s product development.”
The letter urged an outright ban on face recognition, which it called “categorically unethical to deploy” because of the technology’s privacy implications, technical imperfections and potentially life-threatening biases. Most facial-recognition systems, recent research found, perform far less accurately when assessing people with darker skin, opening the potential to an AI-enabled officer misidentifying an innocent person as a dangerous fugitive.
Axon’s founder and chief executive, Rick Smith, said the company is not currently building facial-recognition systems but said the technology is “under active consideration.” He acknowledged the potential for “bias and misuse” in face recognition but said the potential benefits are too promising to ignore.
“I don’t think it’s an optimal solution, the world we’re in today, that catching dangerous people should just be left up to random chance, or expecting police officers to remember who they’re looking for,” Smith said. “It would be both naive and counterproductive to say law enforcement shouldn’t have these new technologies. They’re going to, and I think they’re going to need them. We can’t have police in the 2020s policing with technologies from the 1990s.”
Axon held the board’s first meeting Thursday morning at its Arizona headquarters with eight company-selected experts in AI, civil liberties and criminal justice. The board, whose members are paid volunteers and have no official veto power, will be asked to advise the company on “future capabilities Axon’s AI Research team is working on to help increase police efficiency and efficacy,” the company said in a statement.
Face recognition has long had major appeal for law enforcement and government surveillance, and recent advances in AI development and declining camera and hardware costs have spurred developers to suggest it could be applied for broader use. Roughly 117 million American adults, or about half the country, can be found in the vast facial-recognition databases used by local, state and federal law enforcement, Georgetown Law School researchers estimated in 2016.
Faces are regarded as a quick, reliable way to identify someone from video or afar — and, in some cases, seen as easier to acquire than other “biometric identifiers,” such as fingerprints, that demand close proximity and physical contact. The Department of Homeland Security scans the faces of international travelers at many of the biggest airports and plans to expand the program to every traveler flying overseas.
But critics say facial-recognition systems are still unproven in their ability to uniquely identify someone. Faces age and change with circumstance, and they aren’t always that unique. Identical twins, for instance, have fooled the facial-recognition system used to unlock Apple’s iPhone X.
“Real-time face recognition would chill the constitutional freedoms of speech and association, especially at political protests,” the letter from the dissenting groups states. It “could also prime officers to perceive individuals as more dangerous than they really are and to use more force than the situation requires. No policy or safeguard can mitigate these risks sufficiently well for real-time face recognition ever to be marketable.”
Axon has moved aggressively to corner the market on police technologies, offering free one-year trials of its body cameras and online storage to police departments nationwide. The company said in February that more than half of the major city law-enforcement agencies in the United States have bought Axon body cameras or software.
The company, which changed its name last year from Taser International, also advertises itself as “the largest custodian of public safety data in the U.S.,” saying more than 20 petabytes — or 20 million gigabytes — of police photos, body-camera video and other criminal-investigation documents have been uploaded to its cloud-storage service.
Police video is seen as a major growth market for AI-development firms, both for real-time surveillance and after-crime review: One company, BriefCam, allows city officials and police investigators to narrow hours of video down to seconds by filtering for only the footage of, for instance, red trucks or men with suitcases. Axon’s long-established contracts with police forces could push the technology’s real-world deployment rapidly forward. Instead of signing new deals with tech firms, police departments with Axon body cameras could push facial-recognition features to their officers in much the same way they apply a software update.
Face recognition is one of the most competitive and hotly debated subsets of AI, with Apple, Facebook and Google all devoting teams to expanding its use in security, photo tagging and search.