The Mercury News

How China is policing the future with data

Nation is gathering vast amounts of information to predict possible crimes

- By Paul Mozur, Muyi Xiao and John Liu

The more than 1.4 billion people living in China are constantly watched. They are recorded by police cameras that are everywhere, on street corners and subway ceilings, in hotel lobbies and apartment buildings. Their phones are tracked, their purchases are monitored, and their online chats are censored.

Now, even their future is under surveillance.

The latest generation of technology digs through the vast amounts of data collected on their daily activities to find patterns and aberrations, promising to predict crimes or protests before they happen. The systems target potential troublemakers in the eyes of the Chinese government — not only those with a criminal past but also vulnerable groups, including ethnic minorities, migrant workers and those with a history of mental illness.

They can warn police if a victim of a fraud tries to travel to Beijing to petition the government for payment or a drug user makes too many calls to the same number. They can signal officers each time a person with a history of mental illness gets near a school.

It takes extensive evasive maneuvers to avoid the digital tripwires. In the past, Zhang Yuqiao, a 74-year-old man who has been petitioning the government for most of his adult life, could simply stay off the main highways to dodge authorities and make his way to Beijing to fight for compensation over the torture of his parents during the Cultural Revolution. Now, he turns off his phones, pays in cash and buys multiple train tickets to false destinations.

While largely unproven, the new Chinese technologies, detailed in procurement and other documents reviewed by The New York Times, further extend the boundaries of social and political controls and integrate them ever deeper into people's lives. At their most basic, they justify suffocating surveillance and violate privacy, while in the extreme they risk automating systemic discrimination and political repression.

For the government, social stability is paramount and any threat to it must be eliminated. During his decade as China's top leader, Xi Jinping has hardened and centralized the security state, unleashing techno-authoritarian policies to quell ethnic unrest in the western region of Xinjiang and enforce some of the world's most severe coronavirus lockdowns. The space for dissent, always limited, is rapidly disappearing.

The details of these emerging security technologies are described in police research papers, surveillance contractor patents and presentations, as well as hundreds of public procurement documents reviewed and confirmed by the Times. Many of the procurement documents were shared by ChinaFile, an online magazine published by the Asia Society, which has systematically gathered years of records on government websites. Another set, describing software bought by authorities in the port city of Tianjin to stop petitioners from going to neighboring Beijing, was provided by IPVM, a surveillance industry publication.

China's Ministry of Public Security did not respond to requests for comment faxed to its headquarters in Beijing and six local departments across the country.

The new approach to surveillance is partly based on data-driven policing software from the United States and Europe, technology that rights groups say has encoded racism into decisions like which neighborhoods are most heavily policed and which prisoners get parole. China takes it to the extreme, tapping nationwide reservoirs of data that allow police to operate with opacity and impunity.

Often people don't know they're being watched. Police face little outside scrutiny of the effectiveness of the technology or the actions they prompt. Chinese authorities require no warrants to collect personal information.

At the most bleeding edge, the systems raise perennial science fiction conundrums: How is it possible to know the future has been accurately predicted if police intervene before it happens?

Even when the software fails to deduce human behavior, it can be considered successful since the surveillance itself inhibits unrest and crime, experts say.

“This is an invisible cage of technology imposed on society,” said Maya Wang, a senior China researcher with Human Rights Watch, “the disproportionate brunt of it being felt by groups of people that are already severely discriminated against in Chinese society.”

‘Nowhere to Hide'

In 2017, one of China's best-known entrepreneurs had a bold vision for the future: a computer system that could predict crimes.

The entrepreneur, Yin Qi, who founded Megvii, an artificial intelligence startup, told Chinese state media that the surveillance system could give police a search engine for crime, analyzing huge amounts of video footage to intuit patterns and warn authorities about suspicious behavior. He explained that if cameras detected a person spending too much time at a train station, the system could flag a possible pickpocket.

“It would be scary if there were actually people watching behind the camera, but behind it is a system,” Yin said. “It's like the search engine we use every day to surf the internet — it's very neutral. It's supposed to be a benevolent thing.”

He added that with such surveillan­ce, “the bad guys have nowhere to hide.”

Five years later, his vision is slowly becoming reality. Internal Megvii presentations reviewed by the Times show how the startup's products assemble full digital dossiers for police.

“Build a multidimensional database that stores faces, photos, cars, cases and incident records,” reads a description of one product, called “intelligent search.” The software analyzes the data to “dig out ordinary people who seem innocent” to “stifle illegal acts in the cradle.”

A Megvii spokesperson said in an emailed statement that the company was committed to the responsible development of artificial intelligence, and that it was concerned with making life safer and more convenient, “not about monitoring any particular group or individual.”

Similar technologies are already being put into use. In 2022, police in Tianjin bought software made by a Megvii competitor, Hikvision, that aims to predict protests. The system collects data on legions of Chinese petitioners, a general term in China that describes people who try to file complaints about local officials with higher authorities.

It then scores petitioners on the likelihood that they will travel to Beijing. In the future, the data will be used to train machine-learning models, according to a procurement document.

Local officials want to prevent such trips to avoid political embarrassment or exposure of wrongdoing. And the central government doesn't want groups of disgruntled citizens gathering in the capital.

A Hikvision representative declined to comment on the system.

Automating Prejudice

When police in Zhouning, a rural county in Fujian province, bought a new set of 439 cameras in 2018, they listed coordinates where each would go. Some hung above intersections and others near schools, according to a procurement document.

Nine were installed outside the homes of people with something in common: mental illness.

While some software tries to use data to uncover new threats, a more common type is based on the preconceived notions of police. In over 100 procurement documents reviewed by the Times, the surveillance targeted blacklists of “key persons.”

These people, according to some of the procurement documents, included those with mental illness, convicted criminals, fugitives, drug users, petitioners, suspected terrorists, political agitators and threats to social stability. Other systems targeted migrant workers, idle youths (teenagers without school or a job), ethnic minorities, foreigners and those infected with HIV.

Authorities decide who goes on the lists, and there is often no process to notify people when they do. Once individuals are in a database, they are rarely removed, said experts, who worried that the new technologies reinforce disparities within China, imposing surveillance on the least fortunate parts of its population.

In many cases the software goes further than simply targeting a population, allowing authorities to set up digital tripwires that indicate a possible threat. In one Megvii presentation detailing a rival product by Yitu, the system's interface allowed police to devise their own early warnings.

With a simple fill-in-the-blank menu, police can base alarms on specific parameters, including where a blacklisted person appears, when the person moves around, whether he or she meets with other blacklisted people and the frequency of certain activities. Police could set the system to send a warning each time two people with a history of drug use check into the same hotel or when four people with a history of protest enter the same park.

Yitu did not respond to emailed requests for comment.

Toward Techno Totalitarianism

Zhang first started petitioning the government for compensation over the torture of his family during the Cultural Revolution. He has since petitioned over what he says is police targeting of his family.

As China has built out its techno-authoritarian tools, he has had to use spy movie tactics to circumvent surveillance that, he said, has become “high tech and Nazified.”

When he traveled to Beijing in January from his village in Shandong province, he turned off his phone and paid for transportation in cash to minimize his digital footprint. He bought train tickets to the wrong destination to foil police tracking. He hired private drivers to get around checkpoints where his identification card would set off an alarm.

The system in Tianjin has a special feature for people like him who have “a certain awareness of antireconnaissance” and regularly change vehicles to evade detection, according to the police procurement document.

Whether or not he triggered the system, Zhang has noticed a change. Whenever he turns off his phone, he said, officers show up at his house to check that he hasn't left on a new trip to Beijing.

Even if police systems cannot accurately predict behavior, authorities may consider them successful because of the threat, said Noam Yuchtman, an economics professor at the London School of Economics who has studied the impact of surveillance in China.

“In a context where there isn't real political accountability,” having a surveillance system that frequently sends police officers “can work pretty well” at discouraging unrest, he said.

LAM YIK FEI — THE NEW YORK TIMES ARCHIVES: Security cameras in Hong Kong. The Chinese government is gathering data on individuals to see if they are considering committing a crime.
