Parkland to use program that can track students
Move sparks privacy concerns among parents, students, teachers
Kimberly Krawczyk says she would do anything to keep her students safe. A year ago, the Parkland high school math teacher barricaded students behind her classroom door during one of the deadliest mass shootings in U.S. history.
But one of the unconventional responses the local Broward County, Fla., school district has said could stop another tragedy has left her deeply unnerved: an experimental artificial-intelligence system that would surveil her students.
The south Florida school system, one of the largest in the country, said last month it will install a camera-software system called Avigilon that will allow security officials to track students based on their appearance. With one click, a guard can pull up video of everywhere else a student has been recorded on campus.
The 145-camera system, which administrators said will be installed around the perimeters of the schools deemed “at highest risk,” will also automatically alert a school-monitoring officer when it senses events “that seem out of the ordinary” and people “in places they are not supposed to be.”
The supercharged surveillance network has raised major questions for some students, parents and teachers, like Krawczyk, who voiced concerns about its accuracy, invasiveness and effectiveness. Her biggest doubt: that the technology could ever understand a school campus like a human can.
“How is this computer going to make a decision on what’s the right and wrong thing in a school with over 3,000 kids?” said Krawczyk, a 15-year teacher who was on the third floor of Marjory Stoneman Douglas High School’s “freshman building” when the shooting began. “We have cameras now every 2 feet, but you can’t get a machine to do everything a human can do. You can’t automate the school. What are we turning these schools into?”
The specter of student violence is pushing school leaders across the country to turn their campuses into surveillance testing grounds on the hope it’ll help them detect dangerous people they’d otherwise miss. The supporters and designers of Avigilon, the AI service bought for $1 billion last year by tech giant Motorola Solutions, say its security algorithms could spot risky behavior with superhuman speed and precision, potentially preventing another attack.
But the advanced monitoring technologies ensure the daily lives of U.S. schoolchildren are subjected to close scrutiny from systems that will automatically flag certain students as suspicious, potentially spurring a response from security or police forces, based on the work of algorithms that are hidden from public view.
The camera software has no proven track record for preventing school violence, some technology and civil-liberties experts argue. And the testing of the algorithms for bias and accuracy, including how confident the systems are in identifying possible threats, has largely been conducted by the companies themselves.
Elizabeth Laird, a former state education official in Washington, D.C., and Louisiana and current senior fellow at the think tank Center for Democracy and Technology, said systems such as Avigilon have faced little public testing for their validity or long-term impact. As they multiply across campuses, she fears they could have a chilling effect on a place where kids are taught to think independently, express themselves and learn from their mistakes.
School officials, she added, often lack the experience or know-how to understand all the data these systems can gather, and the potential pitfalls if they get something wrong. Administrators pressured to do something, anything, to increase school security may regard this kind of technology as a cure-all, even when its implications aren’t entirely understood, she said.
“We’re seeing that the uses of AI and technology like this are coming with unintended consequences, things the education sector has not experienced before, that may endanger the students it intends to protect,” she said. Students could be mischaracterized as dangerous based on how they were dressed or where they were walking, she said. And security officials could be overwhelmed with false alarms, making it harder for them to focus on actual threats.
If the Avigilon contract wins final approval from county leaders in the coming weeks, the school district will spend more than $600,000 in federal and local funds activating the AI-powered system around the high school campuses “with the highest security incidents,” contracting records show. The camera system will run independently alongside the 10,000 other cameras already recording across the county’s schools.
Many aspects of the program, however, remain a mystery, and it’s unclear how exactly the surveillance system’s data and performance will be regulated, measured or tested for potential flaws.