Santa Fe New Mexican

Parkland to use program that can track students

Move sparks privacy concerns among parents, students, teachers

By Drew Harwell, Washington Post

Kimberly Krawczyk says she would do anything to keep her students safe. A year ago, the Parkland high school math teacher barricaded students behind her classroom door during one of the deadliest mass shootings in U.S. history.

But one of the unconventional responses the local Broward County, Fla., school district has said could stop another tragedy has left her deeply unnerved: an experimental artificial-intelligence system that would surveil her students.

The south Florida school system, one of the largest in the country, said last month it will install a camera-software system called Avigilon that will allow security officials to track students based on their appearance. With one click, a guard can pull up video of everywhere else a student has been recorded on campus.

The 145-camera system, which administrators said will be installed around the perimeters of the schools deemed “at highest risk,” will also automatically alert a school-monitoring officer when it senses events “that seem out of the ordinary” and people “in places they are not supposed to be.”

The supercharged surveillance network has raised major questions for some students, parents and teachers, like Krawczyk, who voiced concerns about its accuracy, invasiveness and effectiveness. Her biggest doubt: that the technology could ever understand a school campus like a human can.

“How is this computer going to make a decision on what’s the right and wrong thing in a school with over 3,000 kids?” said Krawczyk, a 15-year teacher who was on the third floor of what’s known as Marjory Stoneman Douglas High School’s “freshman building” when the shooting began. “We have cameras now every 2 feet, but you can’t get a machine to do everything a human can do. You can’t automate the school. What are we turning these schools into?”

The specter of student violence is pushing school leaders across the country to turn their campuses into surveillance testing grounds on the hope it’ll help them detect dangerous people they’d otherwise miss. The supporters and designers of Avigilon, the AI service bought for $1 billion last year by tech giant Motorola Solutions, say its security algorithms could spot risky behavior with superhuman speed and precision, potentially preventing another attack.

But the advanced monitoring technologies ensure the daily lives of U.S. schoolchildren are subjected to close scrutiny from systems that will automatically flag certain students as suspicious, potentially spurring a response from security or police forces, based on the work of algorithms that are hidden from public view.

The camera software has no proven track record for preventing school violence, some technology and civil-liberties experts argue. And the testing of the algorithms for bias and accuracy, meaning how confident the systems are in identifying possible threats, has largely been conducted by the companies themselves.

Elizabeth Laird, a former state education official in Washington, D.C., and Louisiana and current senior fellow at the think tank Center for Democracy and Technology, said systems such as Avigilon have faced little public testing for their validity or long-term impact. As they multiply across campuses, she fears they could cast a chilling effect over a place where kids are taught to think independently, express themselves and learn from their mistakes.

School officials, she added, often lack the experience or know-how to understand all the data these systems can gather, and the potential pitfalls if they get something wrong. Administrators pressured to do something, anything, to increase school security may regard this kind of technology as a cure-all, even when its implications aren’t entirely understood, she said.

“We’re seeing that the uses of AI and technology like this are coming with unintended consequences, things the education sector has not experienced before, that may endanger the students it intends to protect,” she said. Students could be mischaracterized as dangerous based on how they were dressed or where they were walking, she said. And security officials could be overwhelmed with false alarms, making it harder for them to focus on actual threats.

If the Avigilon contract wins final approval from county leaders in the coming weeks, the school district will spend more than $600,000 in federal and local funds activating the AI-powered system around the high school campuses “with the highest security incidents,” contracting records show. The camera system will run independently alongside the roughly 10,000 other cameras already recording across the county’s schools.

Many aspects of the program, however, remain a mystery, and it’s unclear how exactly the surveillance system’s data and performance will be regulated, measured or tested for potential flaws.

People gather last year at a memorial at Marjory Stoneman Douglas High School in Parkland, Fla., after a shooting on campus left 17 people dead. The school district said last month it would install a camera-software system called Avigilon that would allow security officials to flag students based on their appearance. MATT MCCLAIN/WASHINGTON POST
