Why police struggle to detect online threats
The idea that authorities can catch shooters before they strike is a fantasy, experts say
In the aftermath of back-to-back mass shootings in the United States earlier this month, President Donald Trump called for greater vigilance of a place where violent intentions are, increasingly, forewarned: “the dark recesses of the internet.”
About 20 minutes before a gunman killed 22 people inside an El Paso Walmart, someone who identified himself as the shooter posted a manifesto on 8chan — an anonymous internet forum used by white supremacists — railing against immigrants and what he called “the Hispanic invasion of Texas.”
That narrow window of opportunity between the online threat and the outbreak of violence presents a chance to act, Trump suggested. Law enforcement and social media companies should develop technology to “detect mass shooters before they strike,” he said.
That call to action has been dismissed by tech experts and police alike as nearly impossible to achieve.
The notion that police could have “somehow seen these posts” and prevented these attacks is unrealistic, said Warren Bulmer, a recently retired Toronto police officer who is an expert in cybercrime and online investigations.
The idea is a “fallacy that only exists on Netflix or in a Hollywood movie,” he said.
But in a world where manifestos are posted online, crimes are live-streamed and evidence is shared on social media, the job descriptions of police must change. There is an expectation, and need, for officers to become more adept at working online, capable of everything from intervention to long-term investigations.
The proliferation of social media and online forums, such as 8chan or the video gamer chat app Discord, gives unprecedented opportunities to track potential perpetrators, uncover evidence and even intervene to prevent crime before it happens. But there are myriad challenges, too — where the digital aspect of “IRL” (in real life) crime creates major complications for law enforcement.
In other words: detecting mass shooters before they strike is easier said than done.
Among the reasons why real-time monitoring is infeasible is the sheer volume of information on even just mainstream social networking sites, such as Twitter and Facebook. On Facebook alone, there are 300 million photo uploads a day, five new profiles every second, and 510,000 comments and 293,000 status updates each minute, Bulmer said.
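Converting the figures Bulmer cites into per-second rates makes the scale of the problem concrete. A back-of-envelope calculation:

```python
# Per-second rates derived from the Facebook figures quoted above.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

photo_uploads_per_day = 300_000_000
comments_per_minute = 510_000
status_updates_per_minute = 293_000

photos_per_second = photo_uploads_per_day / SECONDS_PER_DAY
comments_per_second = comments_per_minute / 60
statuses_per_second = status_updates_per_minute / 60

print(f"Photos:   ~{photos_per_second:,.0f}/s")   # ~3,472/s
print(f"Comments: ~{comments_per_second:,.0f}/s") # ~8,500/s
print(f"Statuses: ~{statuses_per_second:,.0f}/s") # ~4,883/s
```

Even before counting private messages or smaller platforms, that is thousands of new items every second on one site, around the clock.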
“No law enforcement agency is equipped to handle these volumes of data,” he said.
Artificial intelligence may offer tools to help prevent crime, but those tools are far from straightforward.
In the U.S., companies are using artificial intelligence and machine learning — computer algorithms that “learn” to predict patterns in big data sets — in an attempt to prevent the next school shooting, in part by monitoring school-issued student emails, texts and social media for signs of bullying, depression and more, according to USA Today.
And earlier this month, the Federal Bureau of Investigation called for a proposal for what it called a “social media alerting subscription” — an “early alerting tool in order to mitigate multifaceted threats.”
“It is an acknowledged fact that virtually every incident and subject of FBI investigative interest has a presence online. Consequently, law enforcement gaining lawful access … to this data will result in early detection and/or containment of the magnitude of any harm caused by these threats,” reads the request for proposal.
But predictive technology enabled by AI is far from a perfect solution, in part because it remains nearly incapable of detecting tone or fully understanding context — in essence, adequately gauging a threat level.
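The context problem is easy to illustrate. The following is a minimal sketch (not any vendor's actual system, and the keyword list is invented for illustration) of why naive keyword-based detection produces false positives: the same words read as a threat in one post and as harmless slang in another.

```python
# A deliberately naive threat filter: flag any post containing a
# keyword, ignoring all tone and context.
THREAT_KEYWORDS = {"kill", "shoot", "attack"}

def naive_flag(post: str) -> bool:
    """Return True if the post contains any threat keyword."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    return not THREAT_KEYWORDS.isdisjoint(words)

benign_posts = [
    "I'm going to kill it at the tournament tonight",  # gaming slang
    "gonna shoot some hoops after school",             # basketball
    "planning to attack the boss level all weekend",   # video game
]

# Every one of these harmless posts gets flagged.
for post in benign_posts:
    print(naive_flag(post), "-", post)
```

At platform scale, a false-positive rate like this would bury investigators in benign posts, which is why gauging actual threat level remains the hard part.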
It also presents serious privacy considerations.
“Authorities still have to be able to enunciate they have a reason to intrude upon someone’s privacy,” said Chris Parsons, research associate at the University of Toronto’s Citizen Lab. “And so having an automated system that’s perpetually scanning the entirety of everything would probably run afoul of that.”
Equally, there are reasons to be skeptical of predictive policing and cautious about its disproportionate impact on racialized people, Parsons said. Although many may consider an algorithm to be free of bias, researchers and journalists are uncovering ways in which AI can discriminate. For example, a 2016 ProPublica investigation found a system being used to determine whether a convicted person was likely to reoffend was biased against Black people.
In the cases where police have found or been alerted to potential threats online — say, someone posting an intention to harm another person — there are also significant challenges in determining the location of the threat.
It takes time to reach out to a service provider, such as Facebook or Twitter, to access personal information, such as an IP address, an identifier that can help narrow down a user's approximate location.
That process typically requires a warrant — a step that’s more difficult if the third party does not have a physical presence in Canada, meaning it’s outside of a Canadian court’s jurisdiction. There is a way for police investigators to obtain evidence from a company that’s not in Canada — via what’s called a Mutual Legal Assistance Treaty Request — but the complicated system often takes months or even a year.
In situations where a life is on the line, investigators can fast-track that process, though there are still complicating factors, including the fact that third-party providers may have different definitions of a crisis in which a user's personal information, such as an IP address, should be handed over.
Additionally, there are a lot of ways for people to access the internet anonymously, such as through an encrypted connection or using a virtual private network to mask their location.
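A VPN breaks the trail because the platform never sees the user's real address. The following is a hypothetical sketch using Python's standard `ipaddress` module, with made-up addresses drawn from documentation-reserved ranges, of what a platform's logs would actually contain:

```python
import ipaddress

# Hypothetical addresses for illustration only (both come from
# IPv4 ranges reserved for documentation, RFC 5737).
user_real_ip = ipaddress.ip_address("203.0.113.45")  # user's actual ISP address
vpn_exit_ip = ipaddress.ip_address("198.51.100.7")   # VPN provider's exit node

def platform_log_entry(connects_via_vpn: bool) -> ipaddress.IPv4Address:
    """The source address the platform records and could hand to police."""
    return vpn_exit_ip if connects_via_vpn else user_real_ip

logged = platform_log_entry(connects_via_vpn=True)
print(f"Platform logged: {logged}")  # the VPN's address, not the user's
print(logged == user_real_ip)        # False: the trail ends at the VPN
```

Even with a warrant, the address investigators recover points at the VPN provider, which may be in yet another jurisdiction, restarting the whole legal process.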
The recent killing of four members of one Markham family demonstrates some of the challenges police face locating a threat: Several hours before York police discovered the crime, members of a niche video game community were desperately trying to figure out the location of a user who had sent a series of disturbing messages through the app Discord, so that they could call police.
Based on the forum members’ account, it took at least 16 hours between those first messages and the alleged killer’s arrest at the Markham home.
Considering the inherent challenges and the growing need, Parsons and Bulmer agree far more training is necessary for police officers. Although some officers who work in specialized units have access to education to help them do online probes, those who work in regular investigative roles don’t get the same opportunities, in part because the training is expensive, Bulmer said.
Earlier this month, Public Safety Canada provided further details about its National Cyber Security Action Plan, which includes the establishment, by next year, of a National Cybercrime Coordination Unit. The unit, comprising RCMP officers and civilians, will “work with law enforcement and other partners to help reduce the threat, impact and victimization of cybercrime in Canada,” according to an RCMP spokesperson.
Asked if one of the goals of this unit will be to monitor threats being posted online — for instance, a situation where a possible perpetrator of a mass shooting posts online about his intentions — RCMP spokesperson Cpl. Caroline Duval said the unit will work to combat “a wide range of cybercrime incidents.”
Duval added that in addition to the unit, the RCMP is “further developing its cybercrime enforcement capacity by establishing new federal investigative teams in strategic locations,” including in Montreal and Toronto.
In spite of the ever-shifting challenges of policing crime with online elements, Bulmer says police must do what they can to adapt.
“There’s an obligation on law enforcement to pursue evidence wherever that evidence goes, and exists,” he said.
“There are challenges. There are technological challenges; there’s also legal ones, when it comes to the jurisdiction. But at the end of the day, police have to be diligent and do whatever they are able to do.”