Predictive analytics could help prevent student suicides
Universities should use it to identify at-risk students and intervene before it is too late.
Hospitals, international schools and other organizations have identified a way to predict suicide attempts that could revolutionize suicide-prevention programs at universities: predictive analytics.
Predictive analytics is the crystal ball of a tech society, analyzing large quantities of data to make predictions about the future. It can even anticipate which individuals are at risk of suicide. The medical managed-care giant Kaiser Permanente, for example, developed an analytics model to predict which patients were at risk. The model drew on a range of data, including medical conditions, mental-health and substance-use-disorder diagnoses, current and past prescriptions, and patterns of health-care use. It allowed researchers to predict the likelihood of a suicide attempt within 90 days of a mental-health or primary-care outpatient visit.
Universities also collect a wide range of data about students throughout their years at the university, including health records, attendance, grades, student surveys, test scores and even what time a student swipes into or out of a campus building. Using this data, universities should create an algorithm to anticipate and prevent suicide.
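To make the idea concrete, here is a minimal sketch of the kind of risk-scoring algorithm such a program might use. Every feature name, weight, and threshold below is a hypothetical illustration, not a validated clinical instrument; a real model would be trained and evaluated by researchers on historical records.

```python
import math

# Hypothetical weights, as might be learned by a logistic-regression model
# trained on historical student data. These numbers are illustrative only.
WEIGHTS = {
    "prior_counseling_visit": 1.2,   # 1.0 if the student visited counseling
    "missed_class_rate": 2.0,        # fraction of classes missed this term
    "gpa_drop": 1.5,                 # drop in GPA versus the prior term
    "late_building_swipes": 0.8,     # 1.0 if frequent 2-4 a.m. swipe-ins
}
BIAS = -4.0  # baseline log-odds: most students are not at risk

def risk_score(features: dict[str, float]) -> float:
    """Return an estimated probability (0-1) that a student is at risk."""
    log_odds = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-log_odds))  # logistic function

def flag_at_risk(features: dict[str, float], threshold: float = 0.5) -> bool:
    """Flag a student for confidential follow-up if risk exceeds threshold."""
    return risk_score(features) >= threshold
```

A student with no warning signs would score near zero, while a combination of factors, say, a counseling visit, a sharp GPA drop and mostly missed classes, would push the score past the threshold and trigger the confidential notification described below. The threshold is a policy choice: set it low and the program catches more at-risk students at the cost of more false alarms.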
Once the algorithm identifies a student as at risk, professors and academic advisers should be notified. An at-risk notification can parallel the notifications professors already receive about students' individualized learning disabilities, and it should be handled with the same confidentiality and discretion. Professors and advisers could then alert the school and trained medical professionals when they see changes in an at-risk student's behavior.
The best way to implement new programs is to follow existing structures. Universities across the country are already designing and implementing suicide-prevention programs. OSU’s program, for example, trains professors and staff to recognize at-risk students and help them. OSU has trained more than 15,000 students, staff and faculty to recognize the signs of suicide. These programs can be a starting point to train faculty and staff to identify distinctive changes in an at-risk student’s behavior and immediately involve medical professionals.
Using student data to make predictions about a student’s future behavior raises concerns about student privacy. However, student privacy can be protected by placing restrictions on how an at-risk identification is used, without limiting the usefulness of a predictive-analytics model.
In consideration of student privacy, other students should not be notified about which of their peers are at risk, and neither should parents. Under the Family Educational Rights and Privacy Act, the university cannot discuss student account information and academic records with external third parties unless a student agrees. If an algorithm for identifying at-risk students uses information protected by FERPA, the university cannot disclose a student's at-risk status.
So long as third parties outside the university are not notified, identifying which students are at risk poses no greater threat to student privacy than other ways universities already use student data. From medical records to family history, universities collect student data for enrollment, scholarships, and even recruitment of prospective students.
Third-party providers also access student data to provide universities with services. The harm to student privacy in analyzing student data for the student’s benefit is minimal, and the benefit could mean everything to a student in need of help.
The top floors of parking garages are now closed. But these are reactive measures, not proactive ones. If we can use predictive analytics to stop another jump, why aren't we?