‘An Invisible Cage’ of Surveillance
Chinese Authorities Use Technology to Police the Future
The more than 1.4 billion people living in China are constantly watched. They are recorded by police cameras that are everywhere, on street corners and subway ceilings, in hotel lobbies and apartment buildings. Their phones are tracked, their purchases are monitored, and their online chats are censored.
Now, even their future is under surveillance. The latest generation of technology digs through the vast amounts of data collected on their daily activities to find patterns and aberrations, promising to predict crimes or protests before they happen. It searches for potential troublemakers in the eyes of the Chinese government — not only those with a criminal past but also ethnic minorities, migrant workers and those with a history of mental illness.
It can warn the police if a victim of fraud tries to travel to Beijing to petition the government for payment or a drug user makes too many calls to the same number. It can signal officers each time a person with a history of mental illness gets near a school.
While mostly unproven, the new Chinese technologies, detailed in procurement and other documents reviewed by The New York Times, extend the boundaries of social and political controls and integrate them deeper into people’s lives. At their most basic, they justify suffocating surveillance and violate privacy, while in the extreme they risk automating systemic discrimination and political repression.
For the government, social stability is paramount and any threat to it must be eliminated. During his decade as China’s leader, Xi Jinping has hardened and centralized the security state, unleashing techno-authoritarian policies to quell ethnic unrest in Xinjiang and enforce severe coronavirus lockdowns. The space for dissent, always limited, is rapidly disappearing.
The algorithms are often trumpeted as triumphs.
In 2020, the authorities in southern China denied a woman’s request to move to Hong Kong to be with her husband after software alerted them that the marriage was suspicious, the local police reported. An ensuing investigation revealed that the two were not often in the same place at the same time and had not spent the Spring Festival holiday together. The police concluded that the marriage had been faked to obtain a migration permit.
The details of these security technologies are described in police research papers, surveillance contractor patents and presentations, as well as hundreds of public procurement documents reviewed and confirmed by The Times.
China’s Ministry of Public Security did not respond to requests for comment faxed to its headquarters in Beijing and six local departments across the country.
Often people do not know they are being watched. The police face little outside scrutiny of the effectiveness of the technology or the actions it prompts. The Chinese authorities require no warrants to collect personal information.
The systems also raise science-fiction conundrums: How is it possible to know the future has been accurately predicted if the police intervene before it happens?
Even when the software fails to deduce behavior, it can be considered successful since the surveillance itself inhibits unrest and crime, experts say.
“This is an invisible cage of technology imposed on society,” said Maya Wang, a senior China researcher with Human Rights Watch, “the disproportionate brunt of it being felt by groups of people that are already severely discriminated against in Chinese society.”
‘Nowhere to Hide’
In 2017, one of China’s best-known entrepreneurs, Yin Qi, told Chinese state media that a surveillance system made by Megvii, his artificial intelligence start-up, could give the police a search engine for crime. The system would analyze huge amounts of video footage to intuit patterns and warn the authorities about suspicious behavior. For instance, if cameras detected a person spending too much time at a train station, the system could flag a possible pickpocket.
“It would be scary if there were actually people watching behind the camera, but behind it is a system,” Mr. Yin said. “It’s like the search engine we use every day to surf the internet — it’s very neutral. It’s supposed to be a benevolent thing.”
He added that with such surveillance, “the bad guys have nowhere to hide.”
Five years later, his vision is slowly becoming reality. Internal Megvii presentations reviewed by The Times show how the start-up’s products assemble full digital dossiers for the police.
“Build a multidimensional database that stores faces, photos, cars, cases and incident records,” reads a description of one product, called “intelligent search.” The software analyzes the data to “dig out ordinary people who seem innocent” to “stifle illegal acts in the cradle.”
In 2022, the police in Tianjin bought software made by a Megvii competitor, Hikvision, that aims to predict protests. The system collects data on Chinese petitioners, a term that describes people who try to file complaints about local officials with higher authorities.
It then scores petitioners on the likelihood that they will travel to Beijing. Local officials want to prevent such trips to avoid political embarrassment or exposure of wrongdoing. And the central government does not want groups of disgruntled citizens gathering in the capital.
A Hikvision representative declined to comment.
Under Mr. Xi, efforts to control petitioners have grown increasingly invasive. Zekun Wang, a 32-year-old member of a group that for years sought redress over a real estate fraud, said the authorities in 2017 had intercepted fellow petitioners in Shanghai before they could even buy tickets to Beijing. He suspected that the authorities were watching their communications on the social media app WeChat.
In Tianjin, a Hikvision system analyzes individuals’ likelihood to petition based on their social and family relationships, past trips and personal situations, according to the procurement document. It helps the police create a profile of each, with fields for officers to describe the temperament of the protester, including “paranoid,” “meticulous” and “short-tempered.”
Many people who petition do so over government mishandling of a tragic accident or neglect — all of which goes into the algorithm. “Increase a person’s early-warning risk level if they have low social status or went through a major tragedy,” reads the procurement document.
Automating Prejudice
When the police in Zhouning, a rural county in Fujian Province, bought 439 cameras in 2018, they listed coordinates where each would go. Some hung above intersections and others near schools, according to a procurement document.
Nine were installed outside the homes of people with mental illness.
In more than a hundred documents reviewed by The Times, the surveillance focused on blacklists of “key persons.”
These people, according to some of the documents, included those with mental illness, convicted criminals, fugitives, drug users, petitioners, suspected terrorists, political agitators and threats to social stability.
The authorities decide who goes on the lists, and there is often no process to notify people when they do. Once individuals are in a database, they are rarely removed, said experts, who worried that the new technologies reinforce disparities within China.
In many cases, the software allows the authorities to set up digital tripwires that indicate a possible threat. In one Megvii presentation detailing a rival product by Yitu, the system’s interface allowed the police to devise their own early warnings. The police can base alarms on where a blacklisted person appears, when the person moves around, whether he or she meets with other blacklisted people and the frequency of certain activities. The police could set the system to send a warning each time two people with a history of drug use check into the same hotel or when four people with a history of protest enter the same park.
Yitu did not respond to emailed requests for comment.
In 2020 in the city of Nanning, the police bought software that could look for “more than three key people checking into the same or nearby hotels” and “a drug user calling a new out-of-town number frequently,” according to a bidding document.
In Yangshuo, the authorities bought a system to alert them if a foreigner without a work permit spent too much time at foreign-language schools or bars, an apparent effort to catch people overstaying their visas or working illegally.
Techno Totalitarianism
Zhang Yuqiao, 74, has been petitioning the government for most of his adult life for compensation over the torture of his family during the Cultural Revolution. He has also petitioned over what he says is police targeting of his family.
When he traveled to Beijing in January from his village in Shandong Province, he turned off his phone and paid for transportation in cash to minimize his digital footprint. He bought train tickets to the wrong destination to foil police tracking. He hired private drivers to get around checkpoints where his identification card would set off an alarm.
The system in Tianjin has a feature for people like him who have “a certain awareness of anti-reconnaissance” and regularly change vehicles to evade detection, according to the police procurement document.
Whether or not he triggered the system, Mr. Zhang has noticed a change. Whenever he turns off his phone, he said, officers show up at his house to check that he has not left on a trip to Beijing.
The technology has encoded power imbalances. Some bidding documents refer to a “red list” of people whom the surveillance system must ignore. One national procurement document from Guangdong Province stipulated that the red list was for government officials.
Mr. Zhang said that he still believed in the power of technology to do good, but that in the wrong hands it could be a “scourge and a shackle.”
“In the past if you left your home and took to the countryside, all roads led to Beijing,” he said. “Now, the entire country is a net.”