Look who’s watching you now
A small startup in Manhattan is – depending on your viewpoint – providing police with the ultimate hi-tech crime-fighting tool or posing an existential threat to citizens’ privacy.
Clearview AI has created a powerful facial recognition tool by scraping billions of images of individuals from publicly available sites, such as Facebook and YouTube, which can be searched in seconds.
Its technology is being used by about 600 law enforcement agencies in the US to identify criminal suspects, and the company says its new research tool has helped authorities ‘‘track down hundreds of at-large criminals, including paedophiles, terrorists and sex traffickers’’ by comparing images of individuals with its vast database of faces.
Tim Berners-Lee, the British founder of the world wide web, has warned in the past of how the internet risks becoming a ‘‘tool for the automated surveillance and control of its users’’. Is the technology developed by Clearview what he had in mind?
Founded by Hoan Ton-That, an Australian computer expert, Clearview’s website boasts of its successes, quoting a testimonial from a detective in a sex crimes unit who said police were able to identify eight offenders and victims in just over a week and a half, thanks to the tool.
A Clearview spokesman told The Daily Telegraph the tool has an accuracy rating of 99.6 per cent.
‘‘We are the only facial-recognition provider with a data set of billions of photos generated from publicly available sources.
‘‘Clearview search results can never be used as the sole evidence for the purpose of an arrest.
‘‘Search results established through Clearview and its related systems and technologies are indicative and not definitive.’’
So far, police have used the technology to help them solve everything from shoplifting and fraud to murder and child exploitation. But while it may be known to US law enforcement, Clearview’s technology has largely slipped under the public radar for the past three years.
That is beginning to change, however. The New York Times recently revealed the extent of the company’s operations. It also analysed the computer code underlying Clearview’s app and uncovered details of how the tool could be used alongside augmented reality glasses – potentially allowing users to identify every stranger they meet.
Clearview claims it has no plans to release such a feature.
Nevertheless, the discovery has caused a stir among privacy advocates.
‘‘This is a disturbing demonstration of the dangerous reality of face recognition technology today, and the urgent need for US lawmakers to immediately halt law enforcement use of it,’’ said Nathan Freed Wessler, staff attorney at the American Civil Liberties Union. ‘‘Police should not be able to investigate and prosecute us by secretly running an error-prone and unregulated technology provided by an untested startup that has shadily assembled a database of billions of face scans of everyday Americans,’’ Freed Wessler said.
Some believe the app has considerable potential. ‘‘I believe it certainly can be useful,’’ a retired senior US police officer told The Telegraph. ‘‘Clearly there are concerns from the privacy standpoint. But we have courts which can balance the needs of law enforcement with those of the individual.’’
Adam Scott Wandt, at John Jay College of Criminal Justice, said it could open up new opportunities for law enforcement. Traditionally, he said, facial recognition had been used only by Homeland Security and the most elite of law enforcement agencies. Now, local police forces are getting access to quality facial recognition technology – and that is proving to be a game changer.
‘‘Here in New York a couple of months ago, we had a kid put some fake pressure bombs in the subway,’’ he said.
‘‘Police knew almost instantaneously who that person was.
‘‘It gave them a head start; they knew who he was before they knew the bombs were fake.’’
Anil Jain, professor of computer science at Michigan State University, said there were some unanswered questions. For instance, since there are multiple images of the same face, it is difficult to know how many unique identities there are in the database. ‘‘What happens if the query does not have a mate in the database? How often will the query be matched with a non-mate?’’ Jain asked.
Backlash to the company’s business model is likely to be particularly strong in Europe, where lawmakers are considering banning facial recognition technology in public areas for up to five years, to give regulators time to work out how to prevent abuses.
Britain has been more open to the idea. In September, the High Court ruled in favour of South Wales police, allowing the force to use automatic facial recognition technology.
In London, the Metropolitan Police is still evaluating how it will proceed with facial recognition, having carried out a trial in which the technology was used on 10 occasions since 2016.
A Scotland Yard spokesman said: ‘‘We know from the trials that live facial recognition technology could assist our officers to locate criminals wanted for serious and violent offences, such as knife and gun crime, and the sexual exploitation of children.’’
Amid the debate over its use, Clearview’s technology has highlighted how our digital footprints can be used against us.
And, if a small startup can unmask strangers using artificial intelligence, a larger company may be able to reveal far more than we could ever imagine. – The Daily Telegraph