The Post

Look who’s watching you now

- United States

A small startup in Manhattan is – depending on your viewpoint – providing police with the ultimate hi-tech crime-fighting tool or posing an existential threat to citizens’ privacy.

Clearview AI has created a powerful facial recognition tool by scraping billions of images of individuals from publicly available sites such as Facebook and YouTube, building a database that can be searched in seconds.

Its technology is being used by about 600 law enforcement agencies in the US to identify criminal suspects, and the company says its new research tool has helped authorities ‘‘track down hundreds of at-large criminals, including paedophiles, terrorists and sex traffickers’’ by comparing images of individuals with the vast database of faces.

Tim Berners-Lee, the British founder of the world wide web, has warned in the past of how the internet risks becoming a ‘‘tool for the automated surveillance and control of its users’’. Is the technology developed by Clearview what he had in mind?

Clearview was founded by Hoan Ton-That, an Australian computer expert, and its website boasts of its successes, quoting a testimonial from a detective in a sex crimes unit who said police were able to identify eight offenders and victims in just over a week and a half, thanks to the tool. A Clearview spokesman told The Daily Telegraph it has an accuracy rating of 99.6 per cent.

‘‘We are the only facial-recognition provider with a data set of billions of photos generated from publicly available sources.

‘‘Clearview search results can never be used as the sole evidence for the purpose of an arrest.

‘‘Search results established through Clearview and its related systems and technologies are indicative and not definitive.’’

So far, police have used the technology to help them solve everything from shoplifting and fraud to murder and child exploitation. But while it may be known to US law enforcement, Clearview’s technology has largely slipped under the public radar for the past three years.

That is beginning to change, however. The New York Times recently revealed the extent of the company’s operations. It also analysed the computer code underlying Clearview’s app to uncover details about how the tool could be used alongside augmented reality glasses, potentially allowing users to identify every stranger they meet.

Clearview claims it has no plans to release such a feature.

Nevertheless, the discovery has caused a stir among privacy advocates.

‘‘This is a disturbing demonstration of the dangerous reality of face recognition technology today, and the urgent need for US lawmakers to immediately halt law enforcement use of it,’’ said Nathan Freed Wessler, staff attorney at the American Civil Liberties Union. ‘‘Police should not be able to investigate and prosecute us by secretly running an error-prone and unregulated technology provided by an untested startup that has shadily assembled a database of billions of face scans of everyday Americans,’’ Freed Wessler said.

Some believe the app has considerable potential. ‘‘I believe it certainly can be useful,’’ a retired senior US police officer told The Telegraph. ‘‘Clearly there are concerns from the privacy standpoint. But we have courts which can balance the needs of law enforcement with those of the individual.’’

Adam Scott Wandt, at John Jay College of Criminal Justice, said it could open up new opportunities for law enforcement. Traditionally, he said, facial recognition had been used only by Homeland Security and the most elite of law enforcement agencies. Now, local police forces are getting access to quality facial recognition technology – and that is proving to be a game changer.

‘‘Here in New York a couple of months ago, we had a kid put some fake pressure bombs in the subway,’’ he said.

‘‘Police knew almost instantaneously who that person was.

‘‘It gave them a head start; they knew who he was before they knew the bombs were fake.’’

Anil Jain, professor of computer science at Michigan State University, said there were some unanswered questions. For instance, since there are multiple images of the same face, it is difficult to know how many unique identities there are in the database. ‘‘What happens if the query does not have a mate in the database? How often will the query be matched with a nonmate?’’ Jain asked.

Backlash to the company’s business model is likely to be particularly strong in Europe, where lawmakers are considering banning facial recognition technology in public areas for up to five years to allow time to work out how to prevent abuses.

Britain has been more open to the idea. In September, the High Court ruled in favour of South Wales police, allowing the force to use automatic facial recognitio­n technology.

In London, the Metropolitan Police is still evaluating how it will proceed with facial recognition, having carried out a trial in which the technology was used on 10 occasions since 2016.

A Scotland Yard spokesman said: ‘‘We know from the trials that live facial recognition technology could assist our officers to locate criminals wanted for serious and violent offences, such as knife and gun crime, and the sexual exploitation of children.’’

Amid the debate over its use, Clearview’s technology has highlighted how our digital footprints can be used against us.

And, if a small startup can unmask strangers using artificial intelligence, a larger company may be able to reveal far more than we could ever imagine.


Clearview AI founder Hoan Ton-That.
