Huawei tested software that triggers ‘Uighur alarms’ to police, report says
The Chinese tech giant Huawei has tested facial recognition software that could send automated ‘‘Uighur alarms’’ to government authorities when its camera systems identify members of the oppressed minority group, according to an internal document that provides further details about China’s artificial-intelligence surveillance regime.
A document signed by Huawei representatives – discovered by the research organisation IPVM and shared exclusively with The Washington Post – shows that the telecommunications firm worked in 2018 with the facial recognition start-up Megvii to test an artificial-intelligence camera system that could scan faces in a crowd and estimate each person’s age, sex and ethnicity.
If the system detected the face of a member of the mostly Muslim minority group, the test report said, it could trigger a ‘‘Uighur alarm’’ – potentially flagging them for police in China, where members of the group have been detained en masse as part of a brutal government crackdown.
The document, which was found on Huawei’s website, was removed shortly after the Post and IPVM asked the companies for comment.
Such technology has in recent years gained an expanding role among police departments in China, human rights activists say.
But the document sheds new light on how Huawei, the world’s biggest maker of telecommunications equipment, has also contributed to its development, providing the servers, cameras, cloud-computing infrastructure and other tools undergirding the systems’ technological might.
John Honovich, the founder of IPVM, a Pennsylvania-based company that reviews and investigates video-surveillance equipment, said the document showed how ‘‘terrifying’’ and ‘‘totally normalised’’ such discriminatory technology has become.
‘‘This is not one isolated company. This is systematic,’’ Honovich said. ‘‘A lot of thought went into making sure this ‘Uighur alarm’ works.’’
Huawei and Megvii have announced three surveillance systems using both companies’ technology in the past two years. The Post could not immediately confirm whether the system with the ‘‘Uighur alarm’’ tested in 2018 was one of the three currently for sale. Representatives from Huawei and Megvii did not immediately respond to requests for comment.
Chinese officials have said such systems reflect the country’s technological advancement, and that their expanded use can help government responders and keep people safe.
But to international rights advocates, they are a clear sign of China’s dream of social control – a way to identify unfavourable members of society and squash public dissent. China’s foreign ministry did not immediately respond to requests for comment.
Maya Wang, a China senior researcher at the advocacy group Human Rights Watch, said the country has increasingly used AI-assisted surveillance to closely monitor the general public and oppress minorities, protesters and others deemed threats to the state.
‘‘China’s surveillance ambition goes way, way, way beyond minority persecution,’’ Wang said, but ‘‘the persecution of minorities is obviously not exclusive to China . . . And these systems would lend themselves quite well to countries that want to criminalise minorities.’’ – Washington Post