USA TODAY International Edition
Can AI prevent the next school shooting?
Companies have developed systems for finding warning signs
Schools are increasingly turning to artificial intelligence-backed solutions to stop tragic acts of student violence such as the shooting at Marjory Stoneman Douglas High School in Parkland, Florida, a year ago. Bark Technologies, Gaggle.Net and Securly Inc. are three companies that employ AI and machine learning to scan student emails, texts, documents and, in some cases, social media activity. They look for warning signs of cyberbullying, sexting, drug use, alcohol use and depression, and they flag students who may pose a violent risk not only to themselves but also to their classmates.
When potential problems are found, school administrators, parents and – in the most extreme cases – law enforcement officials are alerted, depending on the severity.
In fall 2017, Bark ran a pilot test with 25 schools. “We found some pretty alarming issues, including a bombing and school shooting threat,” says Bark chief parent officer Titania Jordan. A few months later, the Parkland shooting happened, and the company knew what it had and how it could help. But “we didn’t want to seem opportunistic or capitalize on a tragedy,” Jordan said.
The Bark product is free to schools in the U.S. in perpetuity. The company says it can afford to give the service away to schools because of the money it makes from a version aimed at parents.
There are limitations.
❚ None of the companies USA TODAY talked to for this story claims the ability to catch suspect behavior every time. Loosely defined, artificial intelligence describes machines that can mimic human behavior and learn from the data they digest. False positives sometimes arise.
❚ A school can’t police a student’s smartphone or other devices outside the ones it issued, unless the student signed into a social media or other account using the email or credentials the school provided.
❚ Students are often more tech-savvy than their parents and won’t tell them about every account they have.
If an issue is detected, Bark sends a text and/or email alert to parents and schools, with recommended steps for addressing it.
Bark’s parent product costs $9 per month, per family, or $99 per year, and includes monitoring across more than 25 social media platforms, from Twitter and Instagram to Snapchat and YouTube. Bark is used in more than 1,100 school districts, covering 2.6 million children. If it detects something considered exceedingly severe, such as a child abduction or school shooting threat, the issue is escalated to the FBI.
According to Jordan, Bark sends out between 35,000 and 55,000 alerts each day, many just instances of profanity. But 16 plausible school shooting threats have been reported to the FBI since Bark launched its school product last February, she says.
Preventing deaths by suicide
Gaggle, which has been around for 20 years, charges schools $6 per student, per year. Since July 1, the company claims to have prevented 447 deaths by suicide across the 1,400 school districts that use its service; last year, it says, it prevented 542 potential deaths by suicide.
Gaggle also says it stopped 240 instances last year in which a child brought a weapon to school to harm another student, or intended to do so. Under such circumstances, Gaggle will immediately alert an emergency contact at the school and, if needed, law enforcement.
“Studies have shown that kids will communicate before a violent act happens and they will communicate electronically. If you don’t have the means to hear those cries out for help, you’re going to have children in jeopardy,” said Bill McCullough, vice president of sales at Gaggle.
McCullough adds that the company doesn’t rely on machine learning alone to make threat determinations. If a Gaggle scan of school-issued emails and documents uncovers a child in crisis, the content is analyzed by trained human safety experts who verify whether the threat is legitimate and then determine its severity.
If the threat is deemed minor – the use of profanity, say – a student may be warned directly. If a student’s life is considered to be in jeopardy, an emergency contact at the school is notified immediately.
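The tiered escalation the companies describe – a machine-learning flag, human review, then notification routed by severity – can be sketched in a few lines of Python. The tier names, thresholds and recipients below are illustrative assumptions, not Gaggle’s (or any vendor’s) actual implementation.

```python
# A minimal, hypothetical sketch of severity-tiered alert routing.
# All tier names and recipient lists are assumptions for illustration.
from dataclasses import dataclass
from enum import Enum


class Severity(Enum):
    MINOR = 1      # e.g., profanity: warn the student directly
    SERIOUS = 2    # e.g., bullying: notify school staff and parents
    EMERGENCY = 3  # e.g., threat to life: emergency contact, law enforcement


@dataclass
class Alert:
    student_id: str
    severity: Severity
    excerpt: str


def route_alert(alert: Alert) -> list[str]:
    """Return which parties to notify for a flagged item, by severity."""
    if alert.severity is Severity.MINOR:
        return ["student"]
    if alert.severity is Severity.SERIOUS:
        return ["school_admin", "parents"]
    # EMERGENCY: the school's emergency contact and, if needed, police
    return ["school_emergency_contact", "law_enforcement"]


print(route_alert(Alert("s1", Severity.MINOR, "profanity")))
print(route_alert(Alert("s2", Severity.EMERGENCY, "threat to life")))
```

In a real deployment the severity itself would come from a classifier plus human review, as the vendors describe; only the routing step is modeled here.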
For its part, a third company, Securly, works with about 2,000 school districts. It charges $3 per student per year for a flagship product called Filter, with premium add-ons that can add about $2.50 per student to the cost. This past October, one of Securly’s premium services, known as 24, which combines AI with trained human analysts, flagged a student who had searched Google for “how to make a bomb” and “how to kill yourself.” The analyst contacted the school.