The Washington Post

FBI, DOD immersed in facial recognition research

Files reveal their hand in academic work that could aid surveillance

BY DREW HARWELL

The FBI and the Defense Department were actively involved in research and development of facial recognition software that they hoped could be used to identify people from video footage captured by street cameras and flying drones, according to thousands of pages of internal documents that provide new details about the government’s ambitions to build out a powerful tool for advanced surveillance.

The documents, revealed in response to an ongoing Freedom of Information Act lawsuit the American Civil Liberties Union filed against the FBI, show how closely FBI and Defense officials worked with academic researchers to refine artificial-intelligence techniques that could help in the identification or tracking of Americans without their awareness or consent.

Many of the records relate to the Janus program, a project funded by the Intelligence Advanced Research Projects Agency, or IARPA, the high-level research arm of the U.S. intelligence community modeled after the Pentagon’s Defense Advanced Research Projects Agency, known as DARPA.

Program leaders worked with FBI scientists and some of the nation’s leading computer-vision experts to design and test software that would quickly and accurately process the “truly unconstrained face imagery” recorded by surveillance cameras in public places, including subway stations and street corners, according to the documents, which the ACLU shared with The Washington Post.

In a 2019 presentation, an IARPA program manager said the goal had been to “dramatically improve” the power and performance of facial recognition systems, with “scaling to support millions of subjects” and the ability to quickly identify faces from partially obstructed angles. One version of the system was trained for “Face ID . . . at target distances” of more than a half-mile.

To refine the system’s capabilities, researchers staged a data-gathering test in 2017, paying dozens of volunteers to simulate real-world scenarios at a Defense Department training facility made to resemble a hospital, a subway station, an outdoor marketplace and a school, the documents show. The test yielded thousands of surveillance videos and images, some of which were captured by a drone.

The improved facial recognition system was ultimately folded into a search tool, called Horus, and made available to the Pentagon’s Combating Terrorism Technical Support Office, which helps provide military technologies to civilian police forces, the documents show.

The Horus tool has since been offered for use to at least six federal agencies, and their feedback is “continuing to be used to refine the tool,” Department of Homeland Security officials said last year.

The internal emails, presentations and other records offer an unmatched look at the way the nation’s top law enforcement agency and military have aggressively pursued a technology that could be used to undermine Americans’ privacy and already has a counterpart in mass surveillance systems in London, Moscow and across China.

The documents also show that federal officials were more closely involved in the technology’s development than was previously known, even as three states and more than a dozen cities passed laws banning or restricting the technology’s use by local police.

No federal laws regulate how facial recognition systems can be used. Sen. Edward J. Markey (D-Mass.) said Tuesday he intends this year to push again for a bill, first introduced in 2020, that would restrict how federal agencies can tap facial recognition and other biometric search techniques.

“Americans’ ability to navigate our communities without constant tracking and surveillance is being chipped away at an alarming pace,” Markey said in a statement to The Post. “We cannot stand by as the tentacles of the surveillance state dig deeper into our private lives, treating every one of us like suspects in an unbridled investigation that undermines our rights and freedom.”

The tool’s use in domestic mass surveillance would be a “nightmare scenario,” said Nathan Wessler, a deputy director at the ACLU. “It could give the government the ability to pervasively track as many people as they want for as long as they want. There’s no good outcome for that in a democratic society.”

The FBI said in a statement it “is committed to responsible use of facial recognition technology ensuring it appropriately respects individuals’ privacy and civil liberties.” A Defense Department official acknowledged a request for comment but did not respond to a list of questions by the time of publication. An IARPA spokeswoman said the agency is focused on developing the technology rather than how it is applied.

Federal officials have often argued that the technology is an irreplaceable tool for fighting terrorism and crime.

The documents are as recent as 2019, when the ACLU requested and then sued for the records’ release, and they offer no detail on how the research is currently used or deployed. In the years since, facial recognition technology has become widely used by federal investigators and local police.

A Government Accountability Office audit in 2021 found that 20 federal agencies, including the U.S. Postal Service and the Fish and Wildlife Service, had used facial recognition in some capacity, though most of the agencies did “not have awareness” of which tools employees were using and had “therefore not fully assessed the potential risks.”

The documents offer an intimate look at the kinds of daily technical decisions researchers have made in recent years to capitalize not just on breakthroughs in artificial intelligence and computer imaging but also on the fast-growing trove of data related to Americans’ personal lives.

In some emails, FBI scientists talk with academic researchers and technical specialists about the facial recognition tool in close detail, including discussing how the system processed information about a photo of a face using attributes such as “face rectangle x start coordinate,” “pitch of the head” and “probability of being male.”
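Those attribute names hint at the kind of per-face record such a system passes between components. Below is a minimal, hypothetical Python sketch of what a single detection might look like; apart from the three attributes quoted in the emails, every field name, type and value is an illustrative assumption, not the actual Janus or Horus data format.

```python
from dataclasses import dataclass

# Hypothetical per-face attribute record. Only "face rectangle x start
# coordinate," "pitch of the head" and "probability of being male" come from
# the documents; the rest is assumed for the sake of a self-contained example.
@dataclass
class FaceDetection:
    rect_x_start: int          # face rectangle x start coordinate, in pixels
    rect_y_start: int          # assumed companion y coordinate
    rect_width: int            # assumed bounding-box width
    rect_height: int           # assumed bounding-box height
    head_pitch_degrees: float  # pitch of the head (up/down tilt)
    probability_male: float    # probability of being male, from 0.0 to 1.0

# Example: one detection as a downstream matching step might receive it.
detection = FaceDetection(
    rect_x_start=412,
    rect_y_start=108,
    rect_width=96,
    rect_height=120,
    head_pitch_degrees=-12.5,
    probability_male=0.83,
)
print(detection)
```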

And in a presentation given at a forensic-science luncheon in Baltimore in 2019, an FBI senior scientist said that some of the “biggest enablers of better face recognition” included “cellphones with cameras” and “social media.”

Named for the two-faced Roman god of beginnings and gateways, Janus launched in 2014 with the goal of “radically expanding the scenarios in which automated face recognition can establish identity,” the documents show.

At that time, federal investigators seeking to use facial recognition were limited largely to databases of “constrained” photos from passports or driver’s licenses to help identify suspects, victims and witnesses recorded near the scene of a crime, using slow and imprecise algorithms that tended to “severely underutilize and under-exploit all available face information in a video,” as one research filing said.

Research teams were tasked with developing new algorithms that could help investigators tap into a new generation of surveillance footage, allowing for instant identification and the ability to track the same person’s face across multiple videos and camera angles. The goal was to “change video from an impediment to an advantage,” one document states.

Erik Learned-Miller, a University of Massachusetts at Amherst professor who was part of one of the research teams, said federal officials involved in the program were careful to make “a distinction between things they were willing to do for American citizens and capabilities they wanted to develop for use in the rest of the world.”

The research, he said, was aimed at improving the performance of a technology that was already seeing increasing use by law enforcement at the local, state and federal levels. And the mission was hard to refuse: An FBI official at one point spoke to the researchers about how the system would be used to identify the perpetrators in videos of child sexual abuse.

But the research’s lofty goals did at times leave him wondering how the work might be put to broader long-term use. “The question always in the back of my mind was: What does the intelligence community really want to do with this stuff?” he said.

The Janus project officially ended in 2020, though its work was then folded into the web-based interface Horus, named for another deity, the falcon-headed Egyptian god of the sky. IARPA said in public filings that the Janus program had helped advance “virtually every aspect of fundamental face recognition research” and led to algorithms that were “twice as accurate as the most widely used government off-the-shelf systems.”

The Janus research marked only a fraction of the FBI’s sizable technical interest in facial recognition. The agency’s Interstate Photo System uses a facial recognition search tool, available to state and local police, that can scan through tens of millions of jail booking photos as well as images of people’s scars and tattoos.

The photo system is part of a broader FBI biometric database, called Next Generation Identification, that contains the fingerprints, palm prints, face photos and eye patterns collected from millions of people applying for citizenship, getting booked into jail or requesting job background checks.

Among the documents revealed by the ACLU, one Interstate Photo System how-to guide tells investigators that they must use it for “investigative purposes only” but that “it is the responsibility of the user agency to develop appropriate usage policies.”

The documents also include forms that local police officers can use to submit a photo to the FBI’s Facial Analysis, Comparison and Evaluation (FACE) Services Unit, which then runs it through a facial recognition search and returns possible matches. Officers can use the form to request the photos also be run through a biometric database of foreign citizens and combatants run by the Defense Department and the passport and visa photos managed by the State Department, the documents show.

In 2019, government auditors said the FBI had access to more than 640 million face photos and that the FACE unit had run more than 390,000 facial recognition searches over the previous eight years.

Beyond government-sponsored research, federal agencies have also paid for access to private facial recognition systems. The FBI signed a $120,000 contract earlier this year with Clearview AI, maker of a facial recognition tool that uses face photos taken without consent from across social media and the public internet. FBI officials said in the contract they were paying for “a search engine of publicly available images . . . to be used in ways that ultimately reduce crime.”

The Defense Department last year also awarded a nearly $730,000 contract to the video firm RealNetworks for facial recognition software that could be used on autonomous drones for “identification and intelligence-gathering” purposes, a contract shows.

Civil liberties advocates have warned that facial recognition research could hasten the technology’s rollout for real-time public surveillance in the United States. Facial recognition systems have also been shown in research to perform worse when assessing the faces of people of color, and they have been blamed for several recent cases in which Black men were wrongfully arrested for crimes they did not commit.

Proposals to use facial recognition for mass-surveillance efforts in the United States, as they are used in China, have faced public backlash. FBI Director Christopher A. Wray said in January that he was concerned about the Chinese government’s development of similar technologies, saying, “AI is a classic example of a technology where I have the same reaction every time. I think, ‘Wow, we can do that?’ And then I think, ‘Oh, God, they can do that.’”

Clare Garvie, an attorney with the National Association of Criminal Defense Lawyers who has studied facial recognition, said the technology has become “a ubiquitous, routine forensic investigative technique” in the years since the Janus program began.

But the lack of transparency into how widely the technology is used, along with ongoing questions of how reliable it is when used to identify criminal suspects, raises the risk of dangerous misuse.

“It’s one thing for a company or research entity to say the technology discretely performs in [some] capacity in a lab. It is quite another to say it’s ready to be used from drones or on low-quality surveillance images for the purpose of making arrests,” Garvie said. “We’re essentially beta-testing technology on real people with real-world consequences.”


Photo caption: A surveillance camera hangs above a subway platform in Brooklyn in 2020. Documents revealed in response to a lawsuit show that the Defense Department and the FBI were actively involved in research and development of facial recognition software that they hoped could be used to identify people from footage captured by such cameras. (Mark Lennihan/Associated Press)
