EMOTION DETECTION: A glimpse of the future?
Not motion sensors, but emotion sensors: at a Chinese school, Big Brother charts every smile or frown. Don Lee, of the Los Angeles Times, reports from Hangzhou.
At first, it just seemed cool. When facial recognition cameras were installed at a century-old high school here in eastern China, pupils got in and out of campus, picked up lunch, borrowed books and even bought drinks from a vending machine just by peering into the cameras.
No more worrying about forgetting to carry your ID card.
But last March, the cameras appeared in some classrooms — and they did a lot more than just identify pupils and take attendance.
Using the latest artificial intelligence software, the devices tracked pupils’ behaviour and read their facial expressions, grouping each face into one of seven emotions: anger, fear, disgust, surprise, happiness, sadness and what was labelled as neutral.
Think of it as a little glimpse of the future.
While American schools, as well as pupils and parents, are worrying about the increased emphasis on standardised tests — and the loss of classroom freedom that comes with ‘‘teaching to the test’’ — China has carried things to a whole new level.
Here, the surveillance cameras took the data on individual facial expressions and used that information to create a running ‘‘score’’ on each pupil and class. If a score reached a predetermined point, the system triggered an alert. Teachers were expected to take action: to talk to a pupil perceived to be disengaged, for example, or overly moody.
School administrators reckoned the data could provide feedback for teachers as well, about their lectures and classroom management, though they spoke of no immediate plans to use those details as part of their evaluations.
Most pupils came to hate the constant monitoring, and the consequences that followed when the machines reported scores suggesting individuals or entire classes were not paying attention.
Some pupils went so far as to figure out how to game the system by feigning what the cameras’ designers wanted to see.
‘‘If you feel angry, you need to control yourself,’’ said Zhu Juntao (17), using his two forefingers to press up the corners of his mouth, as if smiling. He says he was never called out by a teacher, but others were.
Parents had mixed reactions, but enough of them complained about what they saw as an intrusion on privacy that school administrators have hit the pause button on the cameras.
Not that those officials have given up on the system. It just needs further studying and some tweaking, says Zhang Guanchao, the school’s deputy principal, who believes it is a useful tool for teachers.
‘‘Hopefully we will bring the system back to campus in September,’’ he said late last month as pupils were wrapping up finals.
Facial identification technology has been developing rapidly and is being deployed in more places around the world. Some US airports and law enforcement agencies now use such systems to screen travellers and detect wanted people. Britain and Russia are among others trying the software as part of their overall policing and surveillance efforts.
But no country has been employing facial recognition as aggressively as China. That reflects the central government’s intense focus on public security and monitoring of residents, particularly in China’s far-western Xinjiang region, where Beijing is using highly sophisticated facial recognition, iris scanners and other artificial intelligence software to keep tabs on — and watch for any separatist activities from — its Muslim Uighur population.
At the same time, Beijing is making a big push in artificial intelligence. China has set a goal of being the world’s AI leader by 2030 and is investing heavily to support startups, research and more use of smart surveillance technologies. State media said recently that Beijing’s subway system planned to install facial recognition cameras along with palm scanners this year, ostensibly to ease congestion by allowing riders to gain faster entry, but also giving authorities another tool to monitor the population. In Beijing and throughout China, closed-circuit cameras and other surveillance devices are so ubiquitous that they have become part of the landscape. If facial recognition helps with public safety, some say, that is a good thing.
‘‘Perhaps people would behave themselves more,’’ said Xia Chuzi, a 19-year-old student interviewed in Beijing.
Chen Hong, another Beijing resident, said his main worry was whether AI technology would work properly in identifying faces correctly.
‘‘I’m not concerned about privacy,’’ said the 24-year-old, who installs high-speed internet equipment for a living.
Hangzhou, a top tourist destination about 160km southwest of Shanghai, is now one of the country’s leading tech hubs, thanks in part to e-commerce giant Alibaba. Also based in the city is Hikvision, the world’s largest maker of video surveillance products.
Hikvision supplied the face-recognition devices to Hangzhou No 11 High School. Rolling them out to schools across the country would be highly lucrative. The partially state-owned company did not respond to requests for an interview.
Experts say technology that recognises or verifies faces is one thing, but monitoring emotions with AI devices takes it to a whole other level. Such devices include not just cameras but hats and caps fitted with sensors that monitor brain waves to detect shifts in a person’s mood.
Human rights and privacy advocates see such emotional surveillance as part of China’s widening security control regime, an increasingly Orwellian world in which people cannot escape the eye of government and the pressures of conformity in social behaviour.
‘‘It’s an incredibly dangerous precedent to affix somebody’s behaviour or certain actions based on emotions or characteristics presented in their face,’’ Clare Garvie, of the Centre on Privacy and Technology at Georgetown University Law Centre, said.
Educators in China have been sharply critical of the Hangzhou school, not only for invading pupils’ privacy — neither they nor their parents were asked to give consent — but for charging ahead with an unproven system that purports to improve pupil performance.
Even assuming the machines could accurately read facial emotions, it was far from clear how outward expressions were related to learning, He Shanyun, an associate professor of education at Zhejiang University in Hangzhou, said.
He thinks facial recognition is flawed in another way: It does not account for different personalities and a Chinese culture that may be predisposed to a stoic face. Even if an AI device could help a teacher, he said, ‘‘you shouldn’t use it to punish pupils or put a simple label on them’’.
Zheng Suning (16), a 10th-grader at Hangzhou No 11, speaks proudly of her school. It was founded in 1904 but is now one of the most high-tech in the country. ‘‘We have visitors regularly,’’ she said.
School administrators, however, declined a request for a tour.
Zheng recalls the trouble she had when she misplaced her school ID. Now she is a little self-conscious about her face flashing before others but says the system is exceptionally convenient.
‘‘You show your face to the machine, and they bring out your lunch tray,’’ she said.
Still, she dreads a return of the AI cameras in the classroom.
Like most other high school pupils in China, Zheng is in school from early morning to late at night. On top of that, she takes private tutorial lessons twice a week, lasting two hours each. She said maybe the cameras would help her be a better pupil, but she worried they would add more stress. She did not know how she could avoid looking sleepy.
‘‘I’m already so tired,’’ Zheng said.
Then there is the matter of faking the expressions and behaviour pupils think the cameras look for. No matter how tired they were or how boring the lecture was, they said, the trick was to look straight ahead.
‘‘Some students pretend to be very focused,’’ Chu Haotian (17) said.
Fellow 10th-grader Zhu Juntao added: ‘‘Even though you’re a good student, you may not have a good expression.’’
Facial recognition cameras have not yet been installed in every classroom at the school. And they monitored only 10th-graders — and only for about two months before their use was suspended.
Educators worry the emotion-monitoring will encourage excessive attention on outward behaviour or become an active means of social control. That is partly why Xiong Bingqi, an education professor at Shanghai Jiaotong University, calls it ‘‘black technology’’.
‘‘The cameras have a very bad influence on students’ development,’’ he said. ‘‘The cameras just shouldn’t be used any longer.
‘‘It was a bad idea from the beginning . . . New technology shouldn’t be an excuse to do this kind of thing.’’