Sun Sentinel Palm Beach Edition

Surveillance cams record students’ expressions

Cameras, software used to evaluate teenagers’ behavior

By Don Lee | don.lee@latimes.com

Information is used to create a running “score” on students in Chinese classrooms.

HANGZHOU, China — At first, it just seemed cool.

When facial recognition cameras were installed at a century-old high school here in eastern China, students got in and out of campus, picked up lunch, borrowed books and even bought drinks from a vending machine just by peering into the cameras.

No more worrying about forgetting to carry your ID card.

But last March, the cameras appeared in some classrooms — and they did a lot more than just identify students and take attendance.

Using the latest artificial intelligence software, the devices tracked students’ behavior and read their facial expressions, grouping each face into one of seven emotions: anger, fear, disgust, surprise, happiness, sadness and what was labeled as neutral.

Think of it as a little glimpse of the future.

While American schools, as well as students and parents, are worrying about the increased emphasis on standardized tests — and the loss of classroom freedom that comes with “teaching to the test” — China has carried things to a whole new level.

Here, the surveillance cameras took the data on individual facial expressions and used that information to create a running “score” on each student and class. If a score reached a predetermined point, the system triggered an alert. Teachers were expected to take action: to talk to a student perceived to be disengaged, for example, or overly moody.

School administrators reckoned the data could provide feedback for teachers as well, about their lectures and classroom management, though they spoke of no immediate plans to use those details as part of their evaluations.

Most students came to hate the constant monitoring — and the consequences that followed when the machines reported scores suggesting individuals or entire classes weren’t paying attention.

Some students went so far as to figure out how to game the system by feigning what the cameras’ designers wanted to see.

“If you feel angry, you need to control yourself,” said Zhu Juntao, 17, using his two forefingers to press up the ends of his mouth, as if smiling. He says he was never called out by a teacher, but others were.

Parents had mixed reactions, but enough of them complained about what they saw as an intrusion on privacy that administrators last month hit the pause button on the cameras.

Not that those officials have given up on the system. It just needs further studying and some tweaking, says Zhang Guanchao, the school’s deputy principal, who believes it’s a useful tool for teachers.

“Hopefully we will bring the system back to campus in September,” he said this week as students were wrapping up finals.

Facial identification technology has been developing rapidly and is being deployed in more places around the world. Some U.S. airports and law enforcement agencies now use such systems to screen travelers and detect wanted people.

But no country has been employing facial recognition as aggressively as China. That reflects the central government’s intense focus on public security and monitoring of residents, particularly in China’s far west Xinjiang region, where Beijing is using highly sophisticated facial recognition, iris scanners and other artificial intelligence software to keep tabs on — and watch for separatist activities from — its Muslim Uighur population.

At the same time, Beijing is making a big push in artificial intelligence. China has set a goal of being the world’s AI leader by 2030 and is investing heavily to support start-ups, research and more use of smart surveillance technologies. State media said recently that Beijing’s subway system plans to install facial recognition cameras along with palm scanners this year, ostensibly to ease congestion by allowing riders to gain faster entry — but also giving authorities another tool to monitor the population.

In Beijing and throughout China, closed-circuit cameras and other surveillance devices are so ubiquitous that they’ve become part of the landscape. If facial recognition helps with public safety, some say, that’s a good thing.

Chen Hong, a Beijing resident, said his main worry is whether AI technology will work properly in identifying faces correctly. “I’m not concerned about privacy,” said the 24-year-old, who installs high-speed internet equipment for a living.

Hangzhou, a top tourist destination about 100 miles southwest of Shanghai, is now one of the country’s leading tech hubs, thanks in part to e-commerce giant Alibaba. Also based in the city is Hikvision, the world’s largest maker of video surveillance products.

Hikvision supplied the face-recognition devices to Hangzhou No. 11 High School. Rolling them out to schools across the country would be highly lucrative. The partially state-owned company did not respond to requests for an interview.

Experts say technology that recognizes or verifies faces is one thing, but monitoring emotions with AI devices takes it to another level. Such devices include not just cameras but also hats and caps fitted with sensors that monitor brain waves to detect shifts in a person’s mood.

Human rights and privacy advocates see such emotional surveillance as part of China’s widening security control regime, an increasingly Orwellian world in which people can’t escape the eye of government and the pressures of conformity.

“It’s an incredibly dangerous precedent to affix somebody’s behavior or certain actions based on emotions or characteristics presented in their face,” said Clare Garvie, an associate at the Center on Privacy and Technology at Georgetown University Law Center.

Educators in China have been sharply critical of the Hangzhou school, not only for invading students’ privacy — neither they nor their parents were asked to give consent — but for charging ahead with an unproven system that purports to improve student performance.

Educators worry the emotion-monitoring will encourage excessive attention on outward behavior or become an active means of social control. That’s partly why Xiong Bingqi, an education professor at Shanghai Jiaotong University, calls it “black technology.”

“The cameras have a very bad influence on students’ development,” he said. “The cameras just shouldn’t be used any longer.”

Rana el Kaliouby, CEO of Boston-based AI firm Affectiva, demonstrates its facial recognition technology. A Chinese high school installed cameras using similar technology in 10th-grade classrooms to monitor students’ emotions. (Elise Amendola/AP)
