Face-photo databases proliferate
Critics worry about privacy, potential for images’ misuse
SAN FRANCISCO — Dozens of databases of people’s faces are being compiled without their knowledge by companies and researchers, with many of the images then being distributed around the world in what has become a vast ecosystem fueling the spread of facial-recognition technology.
The databases are pulled together with images from social networks, photo websites, dating services such as OkCupid, and cameras placed in restaurants and on college campuses. While there is no precise count of the data sets, privacy activists have pinpointed repositories that were built by Microsoft, Stanford University and others, with one holding more than 10 million images while another held more than 2 million.
The compilations are being driven by the race to create leading facial-recognition systems. This technology learns how to identify people by analyzing as many digital pictures as possible using “neural networks,” complex mathematical systems that require vast amounts of data to learn to recognize patterns.
Tech giants such as Facebook and Google have most likely amassed the largest data sets, which they do not distribute, according to research papers. But other companies and universities have widely shared their image troves with researchers, governments and private enterprises in Australia, China, India, Singapore and Switzerland for training artificial intelligence, according to academics, activists and public papers.
Questions about the data sets are rising because the technologies that they have enabled are being used in potentially invasive ways. Documents released earlier this month revealed that Immigration and Customs Enforcement officials employed facial-recognition technology to scan motorists’ photos and identify migrants in the country illegally.
There is no oversight of the data sets. Activists and others said they were angered by the possibility that people’s likenesses had been used to build ethically questionable technology and that the images could be misused. At least one facial database created in the United States was provided to a company in China that has been linked to ethnic profiling of the country’s minority Uighur Muslims.
Over the past several weeks, some companies and universities, including Microsoft and Stanford, removed their facial data sets from the Internet because of privacy concerns. But given that the images were already so well distributed, they are most likely still being used in the United States and elsewhere, researchers and activists said.
“You come to see that these practices are intrusive, and you realize that these companies are not respectful of privacy,” said Liz O’Sullivan, who oversaw one of these databases at the artificial-intelligence startup Clarifai. She said she left the New York-based company in January to protest such practices.
Google, Facebook and Microsoft declined to comment.
One database, which dates to 2014, was put together by researchers at Stanford. It’s called Brainwash, after a San Francisco cafe of the same name, where the researchers tapped into a camera.
According to research papers, it was used in China by academics associated with the National University of Defense Technology and Megvii, an artificial-intelligence company that, The New York Times previously reported, has provided surveillance technology for monitoring Uighurs.
The Brainwash data set was removed from its original website last month after Adam Harvey, an activist in Germany who tracks the use of these repositories through a website called MegaPixels, drew attention to it.
“As part of the research process, Stanford routinely makes research documentation and supporting materials available publicly,” a university official said. “Once research materials are made public, the university does not track their use, nor do university officials.”
Stanford researchers who oversaw Brainwash did not respond to requests for comment.
Matt Zeiler, founder and chief executive of Clarifai, said his company had built a facial database with images from OkCupid, a dating site. He said Clarifai had access to OkCupid’s photos because some of the dating site’s founders invested in his company.
He added that he had signed a deal with a large social media company — he declined to disclose which — to use its images in training facial-recognition models. The social network’s terms of service allow for this kind of sharing, he said.
An OkCupid spokeswoman said Clarifai contacted the company in 2014 “about collaborating to determine if they could build unbiased AI and facial recognition technology” and that the dating site “did not enter into any commercial agreement then and have no relationship with them now.”
Clarifai used the images from OkCupid to build a service that could identify the age, sex and race of detected faces, Zeiler said.
Zeiler said Clarifai would sell its facial-recognition technology to foreign governments, military operations and police departments, provided the circumstances were right. It did not make sense to place blanket restrictions on the sale of technology to countries, he added.