Toronto Star

Big Brother comes to high school

- DON LEE

HANGZHOU, CHINA— At first, it just seemed cool.

When facial recognition cameras were installed at a century-old high school here in eastern China, students got in and out of campus, picked up lunch, borrowed books and even bought drinks from a vending machine just by peering into the cameras.

No more worrying about forgetting to carry your ID card.

But last March, the cameras appeared in some classrooms — and they did a lot more than just identify students and take attendance.

Using the latest artificial intelligence software, the devices tracked students’ behaviour and read their facial expressions, grouping each face into one of seven emotions: anger, fear, disgust, surprise, happiness, sadness and what was labelled as neutral.

The cameras took the data on individual facial expressions and used that information to create a running “score” on each student and class. If a score reached a predetermined point, the system triggered an alert. Teachers were expected to take action: to talk to a student perceived to be disengaged, for example, or overly moody.

School administrators reckoned the data could provide feedback for teachers as well, about their lectures and classroom management, though they spoke of no immediate plans to use those details as part of their evaluations.

Most students came to hate the constant monitoring — and the consequences that followed when the machines reported scores suggesting individuals or entire classes weren’t paying attention. Some students went so far as to figure out how to game the system by feigning what the cameras’ designers wanted to see.

“If you feel angry, you need to control yourself,” said Zhu Juntao, 17, using his two forefingers to press up the ends of his mouth, as if smiling.

Parents had mixed reactions, but enough of them complained about what they saw as an intrusion on privacy that school administrators have hit the pause button on the cameras.

Not that those officials have given up on the system. It just needs further study and some tweaking, says Zhang Guanchao, the school’s deputy principal, who believes it’s a useful tool for teachers.

Facial recognition technology has been developing rapidly and is being deployed in more places around the world.

But no country has been employing facial recognition as aggressively as China. That reflects the central government’s intense focus on public security and monitoring of residents, particularly in China’s far west Xinjiang region, where Beijing is using highly sophisticated facial recognition, iris scanners and other artificial intelligence software to keep tabs on — and watch for any separatist activities from — its Muslim Uighur population.

At the same time, Beijing is making a big push in artificial intelligence. China has set a goal of being the world’s AI leader by 2030 and is investing heavily to support startups, research and more use of smart surveillance technologies. State media said recently that Beijing’s subway system plans to install facial recognition cameras along with palm scanners this year.

In Beijing and throughout China, closed-circuit cameras and other surveillance devices are so ubiquitous that they’ve become part of the landscape. If facial recognition helps with public safety, some say, that’s a good thing.

“Perhaps people would behave themselves more,” said Xia Chuzi, a 19-year-old student interviewed in Beijing.

Chen Hong, another Beijing resident, said his main worry is whether AI technology will work properly in identifying faces correctly. “I’m not concerned about privacy,” said the 24-year-old, who installs high-speed internet equipment for a living.

Hangzhou, a top tourist destination about 100 miles southwest of Shanghai, is now one of the country’s leading tech hubs, thanks in part to e-commerce giant Alibaba. Also based in the city is Hikvision, the world’s largest maker of video surveillance products.

Hikvision supplied the face-recognition devices to Hangzhou No. 11 High School. The partially state-owned company did not respond to requests for an interview.

Experts say technology that recognizes or verifies faces is one thing, but monitoring emotions with AI devices takes surveillance to a whole other level. Emotion-monitoring devices include not just cameras but hats and caps fitted with sensors that track brain waves to detect shifts in a person’s mood.

Human rights and privacy advocates see such emotional surveillance as part of China’s widening security control regime, an increasingly Orwellian world in which people can’t escape the eye of government.

“It’s an incredibly dangerous precedent to affix somebody’s behaviour or certain actions based on emotions or characteristics presented in their face,” said Clare Garvie, an associate at the Center on Privacy and Technology at Georgetown University Law Center.

Educators in China have been sharply critical of the Hangzhou school, not only for invading students’ privacy — neither they nor their parents were asked to give consent — but for charging ahead with an unproven system that purports to improve student performance.

Even assuming the machines can accurately read facial emotions, it’s far from clear how outward expressions are related to learning, says He Shanyun, an associate professor of education at Zhejiang University in Hangzhou. She thinks facial recognition is flawed in another way: It doesn’t account for different personalities and a Chinese culture that may be predisposed to a stoic face.

Zheng Suning, a 10th-grader at Hangzhou No. 11, speaks proudly of her school. It was founded in 1904, but is now one of the most high-tech in the country. “We have visitors regularly,” she said. School administrators, however, declined a request for a tour.

Zheng recalls how the cameras spared her trouble when she misplaced her school ID. “You show your face to the machine, and they bring out your lunch tray,” she said.

Still, the 16-year-old dreads a return of the AI cameras in the classroom. Like most other high school students in China, Zheng and her classmates are in school from early morning to late at night. On top of that, Zheng takes private tutorial lessons twice a week. She says maybe the cameras will help her be a better student, but she worries they will add more stress. She doesn’t know how she could avoid looking sleepy. “I’m already so tired,” Zheng said.

Facial recognition cameras haven’t been installed in every classroom at the school yet. And they monitored only 10th-graders — and only for about two months before their use was suspended.

Educators worry the emotion-monitoring will encourage excessive attention on outward behaviour or become an active means of social control. That’s partly why Xiong Bingqi, an education professor at Shanghai Jiaotong University, calls it “black technology.”

“The cameras have a very bad influence on students’ development,” he said.

GILLES SABRIÉ FOR THE WASHINGTON POST: A CCTV display using the facial-recognition system Face in Beijing. The devices can not only identify people, but track behaviour and read facial expressions.

DON LEE/TRIBUNE NEWS SERVICE: Zhu Juntao, a 10th-grader at Hangzhou No. 11 High School, says most of his classmates want to get rid of the school's emotion-tracking cameras.