The Phnom Penh Post

In China, facial recognition is sharp end of drive for total surveillance

- Simon Denyer

FOR 40-year-old Mao Ya, the facial recognition camera that allows access to her apartment house is simply a useful convenience. “If I am carrying shopping bags in both hands, I just have to look ahead and the door swings open,” she said. “And my 5-year-old daughter can just look up at the camera and get in. It’s good for kids because they often lose their keys.”

But for the police, the cameras that replaced the residents’ old entry cards serve quite a different purpose.

Now they can see who’s coming and going, and by combining artificial intelligence with a huge national bank of photos, the system in this pilot project should enable police to identify what one police report, shared with the Washington Post, called the “bad guys” who once might have slipped by.

Facial recognition is the new hot tech topic in China. Banks, airports, hotels and even public toilets are all trying to verify people’s identities by analysing their faces. But the police and security state have been the most enthusiastic about embracing this new technology.

The pilot in Chongqing forms one tiny part of an ambitious plan, known as “Xue Liang”, which can be translated as “Sharp Eyes”. The intent is to connect the security cameras that already scan roads, shopping malls and transport hubs with private cameras on compounds and buildings, and integrate them into one nationwide surveillance and data-sharing platform.

It will use facial recognition and artificial intelligence to analyse and understand the mountain of incoming video evidence; to track suspects, spot suspicious behaviours and even predict crime; to coordinate the work of emergency services; and to monitor the comings and goings of the country’s 1.4 billion people, official documents and security industry reports show.

At the back end, these efforts merge with a vast database of information on every citizen, a “Police Cloud” that aims to scoop up such data as criminal and medical records, travel bookings, online purchases and even social media comments – and link it to everyone’s identity card and face.

A goal of all of these interlocking efforts: to track where people are, what they are up to, what they believe and who they associate with – and ultimately even to assign them a single “social credit” score based on whether the government and their fellow citizens consider them trustworthy.

At this housing complex in Chongqing, “90 percent of the crime is caused by the 10 percent of people who are not registered residents”, the police report said. “With facial recognition we can recognise strangers, analyse their entry and exit times, see who spends the night here, and how many times. We can identify suspicious people from among the population.”

Adrian Zenz, a German academic who has researched ethnic policy and the security state in China’s western province of Xinjiang, said the government craves omnipotence over a vast, complex and restive population.

“Surveillance technologies are giving the government a sense that it can finally achieve the level of control over people’s lives that it aspires to,” he said.

In this effort, the Chinese government is working hand-in-glove with the country’s tech industry, from established giants to plucky startups staffed by graduates from top American universities and former employees of companies like Google and Microsoft, who seem cheerfully oblivious to concerns they might be empowering a modern surveillance state.

The name of the video project is taken from the Communist slogan “the masses have sharp eyes”, and is a throwback to Mao Zedong’s attempt to get every citizen spying on one another. The goal, according to tech industry executives working on the project, is to shine a light into every dark corner of China, to eliminate the shadows where crime thrives.

The Sharp Eyes project also aims to mobilise the neighbourhood committees and snoopy residents who have long been key informers: now, state media reports, some can turn on their televisions or mobile phones to see security camera footage, and report any suspicious activity – a car without a licence plate, an argument turning violent – directly to the police.

To the eyes of the masses, in other words, add the brains of the country’s fast-growing tech industry.

By 2020, China’s government aims to make the video surveillance network “omnipresent, fully networked, always working and fully controllable”, combining data mining with sophisticated video and image analysis, official documents show.

It is China’s ambition that sets it apart. Western law enforcement agencies tend to use facial recognition to identify criminal suspects, not to track social activists and dissidents, or to monitor entire ethnic groups. China seeks to achieve several interlocking goals: to dominate the global artificial-intelligence industry, to apply big data to tighten its grip on every aspect of society, and to maintain surveillance of its population more effectively than ever before.

“Deep learning is poised to revolutionise the video surveillance industry,” Monica Wang, a senior analyst in video surveillance and security at research consultants IHS Markit in Shanghai, wrote in a report. “Demand in China will grow quickly, providing the engine for future market growth.”

In the showrooms of three facial-recognition startups in Chongqing and Beijing, video feeds roll past on big screens, with faces picked out from crowds and matched to images of wanted men and women. Street cameras automatically classify passersby according to gender, clothes and hair length, and software allows people to be tracked from one surveillance camera to the next, by their faces alone.

“The bigger picture is to track routine movement, and after you get this information, to investigate problematic behaviour,” said Li Xiafeng, director of research and development at Cloudwalk, a Chongqing-based firm. “If you know gambling takes place in a location, and someone goes there frequently, they become suspicious.”

Gradually, a model of people’s behaviour takes shape. “Once you identify a criminal or a suspect, then you look at their connections with other people,” he said. “If another person has multiple connections, they also become suspicious.”

The startups also showcase more consumer-friendly applications of their technology. Companies like SenseTime, Megvii and Cloudwalk provide the software that powers mobile apps allowing people to alter, “beautify” or transform their faces for fun.

Much of their business also comes from banks and financial companies that are using facial recognition to check identities, at ATMs or on phones. Some airports in China already employ facial recognition in security checks, and hotels are doing the same at check-in; a Chinese version of Airbnb promises to use it to verify guests’ identities, while China’s version of Uber, Didi Chuxing, is using it to verify those of its drivers.

Some of the applications have a slightly gimmicky feel. A lecturer at a Beijing university was said to be using a face scanner to check if his students were bored; a toilet roll dispenser at a public facility outside the Temple of Heaven in Beijing reportedly scans faces to keep people from stealing too much paper, while a Kentucky Fried Chicken in Hangzhou allows customers to simply “smile to pay”.

Other ideas are struggling to move beyond the pilot stage: a plan to identify jaywalkers in Chongqing has already been abandoned, while residents have responded to facial-recognition gates on some apartment buildings in Chongqing and Beijing by propping the doors open.

Yet facial recognition is not going away, and it promises to become a potent tool for maintaining control of Chinese society. So far, the technology doesn’t quite match the ambition: It is not foolproof.

“There will be false positives for the foreseeable future,” said Jim Dempsey, executive director of UC Berkeley’s Center for Law and Technology. This raises two questions, he said: Does a country’s due process system protect people from being falsely convicted on the basis of facial-recognition technology? And are the false positives disproportionately skewed towards certain minority groups?

In China, the tech companies claim many times greater accuracy rates than, for instance, the FBI, and probably justifiably so, experts say: after all, they have been able to draw on a huge pool of photos from government records to improve algorithms, without any pesky concerns about privacy.

More than anything else, experts say, deep learning technologies need huge amounts of data to come up with accurate algorithms. China has more data than anywhere else in the world and fewer constraints about mining it from its citizens.

“Now we are purely data driven,” said Xu Li, CEO of SenseTime. “It’s easier in China to collect sufficient training data. If we want to do new innovations, China will have advantages in data collection in a legal way.”

Smart technology backed by artificial intelligen­ce will be a tool to assist the police of the future. Chinese IT and telecoms giant Huawei says its Safe Cities technology has already helped Kenya bring down urban crime rates.

But who’s a criminal? In China, documents for the Police Cloud project unearthed by Human Rights Watch list “petitioners” – people who complain to the government about perceived injustices – as potential targets of surveillance, along with anyone who “undermines stability” or has “extreme thoughts”. Other documents cite members of ethnic minorities, specifically Muslim Uighurs from Xinjiang, as subjects of scrutiny.

Maya Wang, a researcher at Human Rights Watch, said what sets China apart is “a complete lack of effective privacy protections”, combined with a system that is explicitly designed to target individuals seen as “politically threatening”.

In Muslim-majority Xinjiang, where a spate of violent incidents has been blamed on separatists or Islamist radicals, facial-recognition cameras have become ubiquitous at roadblocks, petrol stations, airports, railway and bus stations, and at residential and university compounds and entrances to Muslim neighbourhoods, experts say. DNA collection and iris scanning add extra layers of sophistication.

At Megvii, marketing manager Zhang Xin boasts that the company’s Face++ program has helped police arrest 4,000 people since the start of 2016, including about 1,000 in Hangzhou, where a major deployment of cameras in hotels, subways and train stations preceded that year’s G-20 summit.

Very likely among that number: some of the dozens of dissidents, petitioners and citizen journalists who were detained around the city at that time.

Frances Eve, a researcher for Chinese Human Rights Defenders in Hong Kong, argues China’s tech companies are complicit in human rights abuses.

“It’s basically a crime in China to advocate for human rights protection,” she said. “The government treats human rights activists, lawyers and ethnic Uighurs and Tibetans as criminals, and these people are being caught, jailed and possibly tortured as a result of this technology.”

A CCTV display using the facial-recognition system Face in Beijing.

At Megvii offices in Beijing, a designer prepares marketing material for a facial-recognition product. GILLES SABRIÉ/THE WASHINGTON POST
