‘You need the tried and true police work’
Measure would limit use of facial recognition technology amid concerns about its accuracy
The Hartford City Council is considering a measure to protect city residents from the use of facial recognition technology by police to establish probable cause for a criminal charge.
City councilman Nick Lebron, who proposed the resolution, said state law allows police to use facial recognition software as an investigatory tool, but his measure makes clear that the technology can’t be “the smoking gun” in a case.
“You need the tried and true police work,” Lebron said. “This is to protect citizens from facial recognition software.”
Hartford Lt. Aaron Boisvert said the Hartford Police Department doesn’t currently have or use facial recognition software.
Lebron said the resolution would protect residents, as well as the city, from lawsuits that could stem from misidentification.
The resolution describes facial recognition as “a biometric technology that uses distinguishable facial features to identify a person.” The technology is used in numerous ways in the world, including to unlock phones, go through security at the airport and identify and tag people on social media.
Lebron’s resolution acknowledges that technology helps to find missing people, protects businesses against theft by strengthening safety and security measures and helps law enforcement to identify criminals.
“While the algorithms that steer the facial recognition technology are effective to different degrees, it is important that the use of facial data require guidelines and human oversight to narrow the margin of error and bias,” the resolution states.
Lebron said bias is a major flaw of the technology: while it can identify white males with about 99% accuracy, its accuracy in identifying African-Americans and women plummets to about 70%. For Black women, the error rate is even higher, he said.
University of New Haven assistant professor of computer science Vahid Behzadan, who researches artificial intelligence, said the technology uses algorithms to extract identifiable facial characteristics and store them as numbers; those numbers come up again when someone searches the database for matching digital records.
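The process Behzadan describes can be sketched in a few lines of code. This is an illustrative toy, not any vendor's actual system: the vectors, names and threshold below are invented, and real systems use embeddings with hundreds of dimensions produced by neural networks.

```python
# Toy sketch of the matching step: each face is reduced to a vector of
# numbers, and a search compares a probe vector against stored records.
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def search(database, probe, threshold=0.95):
    """Return names of records whose vectors resemble the probe.
    A high score is an investigative lead, not proof of identity."""
    return [name for name, vec in database.items()
            if cosine_similarity(vec, probe) >= threshold]

# Hypothetical two-record database with made-up three-number "faces".
db = {"record_a": [0.9, 0.1, 0.3], "record_b": [0.1, 0.8, 0.5]}
print(search(db, [0.88, 0.12, 0.31]))  # the probe resembles record_a
```

Note that the threshold is a tunable trade-off: lower it and more true matches surface, but so do more false ones, which is where the accuracy concerns in the resolution come in.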
He said facial recognition technology is “not definitive” and can be “manipulated intentionally.” The technology is also inherently prone to misrecognition.
There’s a reason the identifications are more accurate for white males, Behzadan said: the faces used to build the data sets are mostly white male faces.
Facial recognition technology is a biometric tool like DNA and fingerprints, which are run through a database to identify people.
Faces age over time — in a way that DNA and fingerprints don’t — and that further decreases accuracy over time, he said. After four years, the error rate of an image “goes high,” Behzadan said.
Lighting and angles can also affect the accuracy of identification.
Mismatches “increase greatly” the larger the database, Behzadan added. Accuracy decreases when searching a database covering all of Connecticut or all of the United States.
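Behzadan's point about database size follows from simple probability: even a tiny chance of a false match on any single comparison compounds across every record searched. The rate and database sizes below are assumptions chosen for illustration, not measured figures for any real system.

```python
# Illustrative arithmetic only: if each comparison has a small, independent
# chance of a false match, the chance of at least one false match somewhere
# in the database grows quickly with the number of records searched.
def p_any_false_match(per_comparison_rate, database_size):
    """Probability of at least one false match across the whole database."""
    return 1 - (1 - per_comparison_rate) ** database_size

rate = 1e-5  # assumed false-match rate for a single comparison
for size in (1_000, 100_000, 3_600_000):  # town, city, roughly statewide
    print(f"{size:>9} records -> {p_any_false_match(rate, size):.1%}")
```

Under these assumed numbers, a search of a thousand records rarely produces a spurious hit, while a statewide-scale search almost always does, which is one way to understand why a facial recognition hit alone is a weak basis for probable cause.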
Connecticut lawmakers regulated the technology in 2016, allowing law enforcement to use it, but the law does not address the probable cause issue.
The heart of Lebron’s measure, as received by the council, states: “if and after identification has been made through facial recognition, further investigation is needed to develop probable cause to arrest.”
That line gives a “shimmer of hope,” said Behzadan. “There’s quite a bit of room for error.”
Lebron said cameras already raise a lot of suspicion in the community, and the resolution is “protection for our citizens from Big Brother watching.”
“We have to improve relations between police and the community,” Lebron said.