Houston Chronicle Sunday

Masks thwarting facial recognition technology

Companies are creating software to make identifications using features still visible despite face coverings

- By Taylor Telford

Masks are confusing many commercial facial-recognition systems, a new study finds, leading to error rates as high as 50 percent.

A preliminary study published last week by the National Institute of Standards and Technology found that facial-recognition algorithms could be tripped up by such variables as mask color and shape. But industry players are already working on software that adjusts for masks — a requirement in many public spaces to contain the spread of COVID-19 — which NIST plans to study this summer.

“With respect to accuracy with face masks, we expect the technology to continue to improve,” said Mei Ngan, a NIST computer scientist and co-author of the report, produced alongside U.S. Customs and Border Protection and the Department of Homeland Security.

Ngan and other researchers tested how 89 top facial-recognition algorithms performed "one-to-one matching," which compares two photos of the same person — a common verification method for such tasks as unlocking a smartphone or checking a passport. They used more than 6 million pictures of a million individuals and added masks digitally, accounting for variations by using a range of colors, shapes and nose coverage.

Without masks, the top-performing algorithms usually have error rates of about 0.3 percent. But when the most accurate algorithms were confronted with the highest-coverage masks, error rates jumped to about 5 percent.

"This is noteworthy given that around 70 percent of the face area is occluded by the mask," the report reads. "Some algorithms that are quite competitive with unmasked faces fail to authenticate between 20 percent and 50 percent of images."

Companies are rushing to develop software that can make identifications based only on facial features that are still visible with a mask — a challenge, given that such algorithms depend on getting as many data points as possible. Researchers have been combing social media for masked selfies to create data sets to train facial-recognition algorithms, CNET reported in May.

Though controversial, the use of facial-recognition software by federal and local investigators has become routine, turning the technology into a ubiquitous presence in people's lives, whether they are aware of it or not. Authorities harness it to scan hundreds of millions of Americans' photos, often drawing on state driver's license databases or booking photos. It's deployed to unlock cellphones, monitor public areas and guard entrances to schools, workplaces and housing complexes.

Even retailers have taken tentative steps into the arena. A recent Reuters investigation found that Rite Aid had been quietly adding facial-recognition systems to its stores for eight years. The technology was installed in 200 locations, mostly in lower-income urban areas, in what the report called one of the largest rollouts for an American retailer. The drugstore chain, after being presented with the findings, told the news organization the cameras had been turned off.

"This decision was in part based on a larger industry conversation," the company told Reuters in a statement, adding that "other large technology companies seem to be scaling back or rethinking their efforts around facial recognition given increasing uncertainty around the technology's utility."

A growing chorus of lawmakers and privacy advocates says the technology threatens to erode American protections against government surveillance and unlawful searches, and that inaccuracies in the systems could undermine criminal prosecutions, unfairly target people of color and lead to false arrests. In a landmark 2019 study, NIST found that facial-recognition systems misidentified people of color more often than white people: Asian and African American people were up to 100 times more likely to be misidentified than white men, depending on the particular algorithm and type of search.

In January, a Michigan man was wrongfully arrested based on a faulty facial-recognition match in the first known case of its kind, the New York Times reported. The case was later dismissed, and the county prosecutor's office said the man's case and fingerprint data could be expunged.

Some facial-recognition software makers are rethinking their relationship to the technology. IBM discontinued its facial-recognition software in June on the grounds that it promoted racism. The following day, Microsoft said it would stop selling its software to law enforcement until the technology is federally regulated. Soon after, Amazon, the largest provider of facial-recognition systems to law enforcement, said it would place a one-year moratorium on police use of the technology.

Last month, Democratic lawmakers introduced legislation that would ban federal agencies from using facial recognition and encourage state and local law enforcement to follow suit by making bans a requirement for certain grants.

Photo: B. Hayes / Associated Press. Researchers digitally added masks to pictures of people and found that the error rates of top-performing algorithms jumped from about 0.3 percent to 5 percent.
