The Guardian (USA)

Did you protest recently? Your face might be in a database

- Evan Selinger and Albert Fox Cahn

In recent weeks, millions have taken to the streets to oppose police violence and proudly say: “Black Lives Matter.” These protests will no doubt be featured in history books for many generations to come. But, as privacy researchers, we fear a darker legacy, too. We know that hundreds of thousands of photos and videos of protesters have been recorded and uploaded online. They could remain there indefinitely, only to be dredged up decades later. It is for this reason that we must ask whether those photos could end up in a facial recognition database.

We know that, in the United States, at least one in four law enforcement agencies is able to use facial recognition technology – considered one of the most dangerous surveillance tools by privacy researchers – with little oversight. While it may take months, even years, to know the full scope of how facial recognition has been used in the most recent protests, police departments have deployed everything from military-grade drones to body cams with live facial recognition capability.

In New York City alone, the NYPD used facial recognition more than 8,000 times last year, including in conjunction with its so-called “gang database” of 42,000 New Yorkers, overwhelmingly New Yorkers of color. Police could potentially retaliate against protesters by adding their names to databases and singling them out for unjustified follow-up monitoring and “selective enforcement of unrelated matters”, like minor traffic offenses.

Aside from the ethics of diminishing people’s obscurity when they are in public and stripping away their right to do lawful things like protest anonymously, there is a real risk of misidentification through this technology.

In recent weeks, we’ve begun to hear from victims of facial recognition – people like Robert Williams, who was wrongfully put behind bars because police were swayed by a biased and broken facial recognition algorithm that falsely matched him to a crime he didn’t commit. Mr Williams’ case highlights how facial recognition can produce results that are prejudiced against Black and Latinx Americans, creating disproportionately false “matches” and a higher risk of wrongful arrest. And just as importantly, Mr Williams explains that bias is only part of the problem: “Even if this technology does become accurate … I don’t want my daughters’ faces to be part of some government database. I don’t want cops showing up at their door because they were recorded at a protest the government didn’t like.”

Back in 2016, police reportedly used facial recognition to find and arrest some people protesting Freddie Gray’s death who they believed had outstanding arrest warrants. Today, police departments around the country and the FBI are asking for “videos or images” that can link protesters to violence and destruction. These requests are happening even though it’s well documented that law enforcement agencies, including the Minneapolis police department, have used Clearview AI’s facial recognition technology.

This noxious company scraped the internet to compile a name-face database of 3bn faces, which is why Senator Ed Markey recently wrote to the company’s chief executive to “ensure its product is not being used to monitor protests against police brutality.” While IBM announced it’s out of the facial recognition technology business, Amazon won’t sell facial recognition technology to the police for a year, and Microsoft won’t sell facial recognition to the police “until there is a strong national law grounded in human rights”, Clearview AI remains all in.

Are the police definitely using facial recognition right now to track protesters? Nobody knows. Since law enforcement has been criticized for not being transparent about its use of facial recognition technology, and since the FBI and protesters alike are shining a spotlight on a lack of transparency as a systemic policing problem, every protester at a Black Lives Matter protest and every journalist covering one should assume they could be.

What can be done? Facial recognition technology should be banned. This agenda needs as much support as can be mustered. Calls to defund the police and stop providing them with facial recognition technology are gaining momentum, which is a good first step. But as Tim Maughan rightly argues: “We must not allow private contractors and technology companies to seep in, fill the void, and repeat – or even exacerbate – the same disastrous mistakes.”

This leaves risk-mitigation strategies in the hands of two groups. Protesters can help protect one another by using tools to obscure faces and erase metadata. And journalists shouldn’t publish any images that the police can use to track a protester’s identity unless they have explicit consent to do so.

Journalists might be wary of stepping up. After all, outdated legal doctrines hold that people lack a reasonable expectation of privacy when they’re in public. As a result, journalists have a legal right to photograph whomever they choose at these newsworthy events. Furthermore, journalists might believe they are ethically barred from manipulating “the content of a photograph in any way”.

But this restriction conflicts with their duty to “give special consideration to vulnerable subjects” and “minimize harm”. Journalists have the privilege and responsibility of doing what they can to protect protesters who are living in a society that has yet to come to terms with the fact that analog assumptions about what’s private and public no longer hold in the face of modern police surveillance.

This isn’t the first time protesters are risking their safety and wellbeing standing up for justice. Sadly, it won’t be the last. Since facial recognition technology poses an unprecedented threat, every possible precaution needs to be taken.

Evan Selinger is a professor of philosophy at Rochester Institute of Technology. Albert Fox Cahn is the founder and executive director of the Surveillance Technology Oversight Project (Stop) at the Urban Justice Center, a New York-based civil rights and privacy group, and a fellow at the Engelberg Center for Innovation Law & Policy at NYU School of Law

‘In New York City alone, the NYPD used facial recognition more than 8,000 times last year.’ Photograph: David McNew/AFP/Getty Images