Otago Daily Times

EMOTION DETECTION: A glimpse of the future?

Not motion sensors, but emotion sensors: at a Chinese school, Big Brother charts every smile or frown. Don Lee, of the Los Angeles Times, reports from Hangzhou.


AT first, it just seemed cool. When facial recognition cameras were installed at a century-old high school here in eastern China, pupils got in and out of campus, picked up lunch, borrowed books and even bought drinks from a vending machine just by peering into the cameras.

No more worrying about forgetting to carry your ID card.

But last March, the cameras appeared in some classrooms — and they did a lot more than just identify pupils and take attendance.

Using the latest artificial intelligence software, the devices tracked pupils’ behaviour and read their facial expressions, grouping each face into one of seven emotions: anger, fear, disgust, surprise, happiness, sadness and what was labelled as neutral.

Think of it as a little glimpse of the future.

While American schools, as well as pupils and parents, are worrying about the increased emphasis on standardised tests — and the loss of classroom freedom that comes with ‘‘teaching to the test’’ — China has carried things to a whole new level.

Here, the surveillance cameras took the data on individual facial expressions and used that information to create a running ‘‘score’’ on each pupil and class. If a score reached a predetermined point, the system triggered an alert. Teachers were expected to take action: to talk to a pupil perceived to be disengaged, for example, or overly moody.

School administrators reckoned the data could provide feedback for teachers as well, about their lectures and classroom management, though they spoke of no immediate plans to use those details as part of their evaluations.

Most pupils came to hate the constant monitoring, and the consequences that followed when the machines reported scores suggesting individuals or entire classes were not paying attention.

Some pupils went so far as to figure out how to game the system by feigning what the cameras’ designers wanted to see.

‘‘If you feel angry, you need to control yourself,’’ said Zhu Juntao (17), using his two forefingers to press up the corners of his mouth, as if smiling. He says he was never called out by a teacher, but others were.

Parents had mixed reactions, but enough of them complained about what they saw as an intrusion on privacy that school administrators have hit the pause button on the cameras.

Not that those officials have given up on the system. It just needs further study and some tweaking, says Zhang Guanchao, the school’s deputy principal, who believes it is a useful tool for teachers.

‘‘Hopefully we will bring the system back to campus in September,’’ he said late last month as pupils were wrapping up finals.

Facial recognition technology has been developing rapidly and is being deployed in more places around the world. Some US airports and law enforcement agencies now use such systems to screen travellers and detect wanted people. Britain and Russia are among others trying the software as part of their overall policing and surveillance efforts.

But no country has been employing facial recognition as aggressively as China. That reflects the central government’s intense focus on public security and monitoring of residents, particularly in China’s far west Xinjiang region, where Beijing is using highly sophisticated facial recognition, iris scanners and other artificial intelligence software to keep tabs on — and watch for any separatist activities from — its Muslim Uighur population.

At the same time, Beijing is making a big push in artificial intelligence. China has set a goal of being the world’s AI leader by 2030 and is investing heavily to support startups, research and more use of smart surveillance technologies. State media said recently that Beijing’s subway system planned to install facial recognition cameras along with palm scanners this year, ostensibly to ease congestion by allowing riders to gain faster entry, but also giving authorities another tool to monitor the population. In Beijing and throughout China, closed-circuit cameras and other surveillance devices are so ubiquitous that they have become part of the landscape. If facial recognition helps with public safety, some say, that is a good thing.

‘‘Perhaps people would behave themselves more,’’ said Xia Chuzi, a 19-year-old student interviewed in Beijing.

Chen Hong, another Beijing resident, said his main worry was whether AI technology would work properly in identifyin­g faces correctly.

‘‘I’m not concerned about privacy,’’ said the 24-year-old, who installs high-speed internet equipment for a living.

Hangzhou, a top tourist destination about 160km southwest of Shanghai, is now one of the country’s leading tech hubs, thanks in part to e-commerce giant Alibaba. Also based in the city is Hikvision, the world’s largest maker of video surveillance products.

Hikvision supplied the face-recognition devices to Hangzhou No 11 High School. Rolling them out to schools across the country would be highly lucrative. The partially state-owned company did not respond to requests for an interview.

Experts say technology that recognises or verifies faces is one thing, but monitoring emotions with AI devices takes it to a whole other level. Such devices include not just cameras but hats and caps fitted with sensors that monitor brain waves to detect shifts in a person’s mood.

Human rights and privacy advocates see such emotional surveillance as part of China’s widening security control regime, an increasingly Orwellian world in which people cannot escape the eye of government and the pressures of conformity in social behaviour.

‘‘It’s an incredibly dangerous precedent to affix somebody’s behaviour or certain actions based on emotions or characteristics presented in their face,’’ Clare Garvie, of the Centre on Privacy and Technology at Georgetown University Law Centre, said.

Educators in China have been sharply critical of the Hangzhou school, not only for invading pupils’ privacy — neither they nor their parents were asked to give consent — but for charging ahead with an unproven system that purports to improve pupil performance.

Even assuming the machines could accurately read facial emotions, it was far from clear how outward expressions were related to learning, He Shanyun, an associate professor of education at Zhejiang University in Hangzhou, said.

He thinks facial recognition is flawed in another way: It does not account for different personalities and a Chinese culture that may be predisposed to a stoic face. Even if an AI device could help a teacher, he said, ‘‘you shouldn’t use it to punish pupils or put a simple label on them’’.

Zheng Suning (16), a 10th-grader at Hangzhou No 11, speaks proudly of her school. It was founded in 1904 but is now one of the most high-tech in the country. ‘‘We have visitors regularly,’’ she said.

School administra­tors, however, declined a request for a tour.

Zheng recalls the trouble she had when she misplaced her school ID. Now she is a little self-conscious about her face flashing before others but says the system is exceptionally convenient.

‘‘You show your face to the machine, and they bring out your lunch tray,’’ she said.

Still, she dreads a return of the AI cameras in the classroom.

Like most other high school pupils in China, Zheng is in school from early morning to late at night. On top of that, she takes private tutorial lessons twice a week, lasting two hours each. She said maybe the cameras would help her be a better pupil, but she worried they would add more stress. She did not know how she could avoid looking sleepy.

‘‘I’m already so tired,’’ Zheng said.

Then there is the matter of faking the expressions and behaviour pupils think the cameras look for. No matter how tired they were or how boring the lecture was, they said, the trick was to look straight ahead.

‘‘Some students pretend to be very focused,’’ Chu Haotian (17) said.

Fellow 10th-grader Zhu Juntao added: ‘‘Even though you’re a good student, you may not have a good expression.’’

Facial recognition cameras have not been installed in every classroom at the school yet. And they monitored only 10th-graders — and only for about two months before their use was suspended.

Educators worry the emotion-monitoring will encourage excessive attention to outward behaviour or become an active means of social control. That is partly why Xiong Bingqi, an education professor at Shanghai Jiaotong University, calls it ‘‘black technology’’.

‘‘The cameras have a very bad influence on students’ development,’’ he said. ‘‘The cameras just shouldn’t be used any longer.

‘‘It was a bad idea from the beginning . . . New technology shouldn’t be an excuse to do this kind of thing.’’


PHOTO: LOS ANGELES TIMES/TNS — Am I looking anxious? . . . Zhu Juntao, a 10th-grader at Hangzhou No 11 High School, says most of his classmates want to get rid of the school’s emotion-tracking cameras.
