The Walrus

Safety in Numbers

Police are using algorithms to predict crime — and may be ushering in a new era of high-tech racial profiling

By John Lorinc

The rise of big data in predicting crime

Boyle Street is one of Edmonton’s oldest neighbourhoods. Sandwiched between a rail corridor and the North Saskatchewan River just east of the downtown core, the area is mostly unremarkable: a few vacant lots, some inexpensive apartments, and newcomers from around the world. It does, however, have a fairly high crime rate. On Edmonton’s online “crime map,” the neighbourhood is speckled with an assortment of coloured dots: red for robberies, green for break and enters, and cyan for assaults.

On a hot summer afternoon in 2015, about a dozen Boyle Street residents gathered in a local community centre to listen as police and city officials pitched a plan to tackle the seemingly Sisyphean task of putting an end to one category of offences: property crime. Car thefts, break and enters, and vandalism could soon be predicted and prevented, the representatives said, all thanks to a computer algorithm.

Staff explained how Edmonton’s chief analytics officer, Stephane Contre, and his colleague, Kris Andreychuk, had recently discovered that some of the area’s property crimes were correlated to seemingly unrelated features, including noise disturbances, broken street furniture, poor lighting, and the presence of a bar. They had compiled a list of 233 distinct environmental and geographical features within city limits and placed them on a giant digital map. Using two years of 911 calls and police data, they overlaid that map with the locations of crimes and disturbances and then looked at what combinations of features tended to show up in those areas. Some patterns stood out immediately: for example, in any area of Edmonton that had youth centres and noise complaints and abandoned automobiles, the predicted incidence of property crime rose to 100 percent. Parts of the Boyle Street area fit the formula precisely.
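For readers curious about the mechanics, a minimal sketch of that kind of feature-combination analysis follows. The map cells, features, and crime flags below are invented for illustration; Edmonton’s actual system drew on 233 features and two years of 911 and police records.

```python
# A minimal sketch of a feature-combination analysis. All data here is invented.
from itertools import combinations

# Each map cell: the environmental features present, and whether property
# crime was reported there during the study window.
cells = [
    {"features": {"youth centre", "noise complaints", "abandoned autos"}, "crime": True},
    {"features": {"youth centre", "noise complaints"},                    "crime": False},
    {"features": {"bar", "poor lighting"},                                "crime": True},
    {"features": {"poor lighting"},                                       "crime": False},
    {"features": {"youth centre", "noise complaints", "abandoned autos"}, "crime": True},
]

def incidence_by_combination(cells, size):
    """For every feature combination of the given size, return the share of
    cells containing that combination in which property crime occurred."""
    stats = {}
    for cell in cells:
        for combo in combinations(sorted(cell["features"]), size):
            hits, total = stats.get(combo, (0, 0))
            stats[combo] = (hits + int(cell["crime"]), total + 1)
    return {combo: hits / total for combo, (hits, total) in stats.items()}

# Combinations that always coincide with crime (like the one described above)
# show up with an incidence of 100 percent.
for combo, rate in sorted(incidence_by_combination(cells, 3).items(),
                          key=lambda item: -item[1]):
    print(f"{combo}: {rate:.0%}")
```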

But what did it all mean? The analysts knew that the noise complaints and the youth centres and the abandoned cars weren’t causing the crimes, but those factors seemed to be hinting at something else driving criminal activity. As Andreychuk puts it, “These are simply clues.”

Toward the end of the meeting, residents were told that the abandoned cars had been stolen, and an elderly woman at the back piped up: “That makes complete sense.” The woman said that, from her perch in her high-rise apartment, she saw “dial-a-dopers” drive in to see customers and dump their vehicles in a cul-de-sac. In other words, the abandoned cars were simply a symptom of a bigger underlying problem: dealers coming to the area and stirring up trouble. Andreychuk was awed by how well computer analytics lined up with the old-fashioned observations of an attentive resident — effectively proving to him that the predictive capabilities of their homemade system worked. “It was a very romantic moment,” he recalls.

Emboldened by the system’s performance, Edmonton police devised a plan: the city deployed another social worker to the youth centre and an extra cop to walk that neighbourhood beat. Officials also beefed up parking restrictions and reconnected that cul-de-sac to the street grid so it wouldn’t be such a dead zone. Soon, noise complaints — and crime — had dropped. According to a follow-up evaluation, Edmonton’s Boyle Street experiment in predictive policing produced $1.60 in savings for every $1 spent (based on the foregone costs of arrests, 911 calls, and other associated expenditures). Versions of this data-driven project have since been deployed in other neighbourhoods across the city.

By combining huge tranches of data and highly sophisticated algorithms, predictive policing appears to hold out the science-fiction promise that technology could, one day, spit out 100 percent accurate prophecies concerning the location of future crimes. The latest iteration of these analytics can’t ID a killer-to-be, but it can offer insight into what areas are potential sites for crime by drawing on information in everything from historical records to live social-media posts.

While predictive-policing software has recently become popular with law enforcement in the United States, Canadian services have been slower to adopt it. Last July, the Vancouver Police Department became the first service in the country to use the technology across its entire operation — a move made following a successful six-month trial in 2016 that showed a reduction in break and enters by as much as 27 percent compared to recent averages. (A VPD spokesperson declined to reveal how much the force had paid for the system, citing “privacy and contractual” reasons.) Other municipalities, including Saskatoon and London, Ontario, are now running pilot projects or preliminary deployments of their own.

The technology, however, has raised tough questions about whether hidden biases in these systems will lead to even more over-policing of racialized and lower-income communities. In such cases, the result can turn into a feedback loop: the algorithms recommend a heightened police presence in response to elevated arrest rates that can themselves be attributed to a heightened police presence.

Andrew Ferguson, who teaches law at the University of the District of Columbia and is the author of The Rise of Big Data Policing, goes further. He says that current predictive systems use social media and other deep wells of personal information to predict whether certain offenders may commit future crimes — an Orwellian scenario. Canadian governments and civilian oversight bodies, however, have done little to establish clear policies differentiating appropriate and inappropriate uses for these technologies. It is little wonder that critics are becoming increasingly concerned that police departments fitted out with big-data systems could use them to pre-emptively target members of the public. Can we really trust crime fighting to an algorithm?

While a crime may seem random to a victim, it is often the product of a particular set of circumstances: parking lots where some cars will be left unlocked, bars that disgorge drunken patrons, alleys or under-lit corners concealed from view. Police, therefore, have long been relentless counters. Department websites often contain a criminal box score — annual or year-to-date tallies of murders, stabbings, and burglaries. Cops also love visualizing this data. The old wall-sized precinct map, mounted in the briefing room and covered with push-pins, gave way to digital versions with more embedded information. But the idea remains the same: if you can see where crimes are being committed, your chances of being able to catch a perpetrator are better.

In the early 1990s, the New York Police Department created the CompStat system, which aimed to refocus local law enforcement from reacting to crimes toward preventing them from happening in the first place. CompStat allowed cops to put crime information — which, previously, had been sent off to the FBI but served little local operational use — into a massive database. CompStat would then generate “heat maps,” which vividly depicted hitherto invisible crime patterns, such as concentrations of particular types of incidents in certain locations. Officers would then use these heat maps to try to intuit where crimes would occur. As one report put it, “The past is prologue.”

After the 2007 financial crisis, many American municipalities were teetering on the brink of bankruptcy, and police departments had to drastically scale back spending. At about that time, Los Angeles Police Department officials were wondering if they could build on CompStat’s hot-spot analysis to actually anticipate crime. The force teamed up with Jeff Brantingham and George Mohler, a pair of California data scientists who were experimenting with new methods of forecasting property crime by adapting mathematical models developed to predict the frequency of earthquake aftershocks. To do this, Mohler and Brantingham took burglary data from three LA regions in the 2000s and generated estimates of the location and frequency of “follow-on” crime incidents, which were often second break-ins at the same or neighbouring houses. Surprisingly, those seismic formulas produced fairly accurate predictions for the secondary crimes that tended to occur.
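The adaptation is easier to picture than it sounds. Below is a rough sketch of the “aftershock” idea: each burglary temporarily raises the expected rate of follow-on break-ins nearby, with the effect fading over time and distance. The decay constants and background rate are invented placeholders, not the values the researchers actually fit to Los Angeles data.

```python
# A rough sketch of an aftershock-style (self-exciting) risk estimate.
# All constants below are assumed, for illustration only.
import math

BACKGROUND_RATE = 0.01   # baseline burglaries per house per week (assumed)
BOOST = 0.2              # extra risk contributed by a recent nearby burglary (assumed)
TIME_DECAY = 0.5         # how quickly the boost fades, per week (assumed)
DISTANCE_DECAY = 0.01    # how quickly the boost fades, per metre (assumed)

def follow_on_intensity(house_xy, now, past_burglaries):
    """Estimated burglary rate at a location, given past incidents as
    (x, y, time) tuples: a background rate plus decaying contributions
    from each earlier burglary."""
    x, y = house_xy
    rate = BACKGROUND_RATE
    for (bx, by, t) in past_burglaries:
        if t <= now:
            distance = math.hypot(x - bx, y - by)
            rate += (BOOST * math.exp(-TIME_DECAY * (now - t))
                           * math.exp(-DISTANCE_DECAY * distance))
    return rate

history = [(100.0, 200.0, 1.0)]                            # one burglary at week 1
print(follow_on_intensity((100.0, 200.0), 2.0, history))   # same house, a week later
print(follow_on_intensity((600.0, 200.0), 2.0, history))   # a house 500 metres away
```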

Soon after they began publishing their findings, Mohler and Brantingham set up a real-world experiment with the Santa Cruz Police Department. Drawing on eight years of local crime data, the system they built divided the city into 150-square-metre cells, each with a crime forecast profile based on previous incidents. Police supervisors could prioritize their patrol efforts by focusing on the cells with the greatest predicted likelihood of break and enter activity. The result was a 27 percent reduction in burglaries. Law enforcement officials in other cities started studying the deployment of such systems in the wake of breathless media accounts of these technologies’ ability to make the right guesses. Within months, the LA PD had rolled out a version and reported that areas with the new predictive-policing system had seen crime drop by 13 percent, even as the citywide stats trended upward.
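A toy version of that grid-based prioritization might look like the sketch below. The incident coordinates are made up, and a real deployment would draw on years of records rather than a handful of points.

```python
# A minimal sketch of grid-based patrol prioritization, using invented incidents.
from collections import Counter

CELL_SIZE = 150.0  # metres per grid cell, as in the Santa Cruz experiment

def cell_of(x, y):
    """Map a coordinate (in metres) to its grid cell."""
    return (int(x // CELL_SIZE), int(y // CELL_SIZE))

# (x, y) locations of past break and enters on a city grid (invented)
incidents = [(120, 80), (140, 90), (130, 60), (400, 410), (980, 975), (995, 990)]

scores = Counter(cell_of(x, y) for x, y in incidents)

# The cells with the most prior activity become the day's patrol priorities.
for cell, count in scores.most_common(3):
    print(f"cell {cell}: {count} prior incidents")
```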

Brantingham and Mohler realized that there was a huge potential market for the technology and set out to develop a robust system that could work with large amounts of data from all sorts of crimes. Their commercialized system, dubbed PredPol, raised $1 million from investors. Soon other companies started building additional predictive-policing wares, each with their own formulas, and tech giants including Microsoft and IBM entered the field.

Predictive-policing technology has now been adopted in other areas, including in Brazil, where the Igarapé Institute, a think tank, partnered with state authorities in the Rio de Janeiro region to launch a mapping-and-forecasting tool for crimes, including homicide. Robert Muggah, research director and co-founder of the Igarapé Institute, says the mapping algorithm, launched in 2016, works by using a historical crime database with 15 million incidents to forecast the probability of victimization within thousands of 250-square-metre blocks. Igarapé’s algorithm offered new insight into trends and showed that 90 percent of homicides occurred on or near just 2 percent of Rio’s intersections.
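That concentration finding lends itself to a simple calculation: rank locations by incident count and ask what share of all incidents falls on the busiest few percent. The sketch below uses invented counts, not Igarapé’s 15 million records.

```python
# A sketch of a crime-concentration statistic, using invented counts.
def concentration(counts, top_fraction):
    """Share of all incidents that occurred at the top `top_fraction` of locations."""
    ranked = sorted(counts, reverse=True)
    top_n = max(1, int(len(ranked) * top_fraction))
    return sum(ranked[:top_n]) / sum(ranked)

# Invented homicide counts for 50 intersections: a few hot spots, mostly zeros.
homicides_per_intersection = [40, 35, 2, 1, 1, 1] + [0] * 44

print(f"{concentration(homicides_per_intersection, 0.02):.0%} "
      "of incidents at the busiest 2 percent of intersections")
```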

In London, Ontario, meanwhile, the local police service acquired a commercial predictive system in 2013. The technology has been used to get a better handle on break and enter activity around neighbourhoods with a lot of off-campus student rentals. With the system’s predictions, the London Police Service intensified its patrols at times when break and enters tended to spike. However, Detective Andrew Whitford, supervisor of crime analysis, cautions that analytics can only provide educated guesses to help officers make informed decisions. What really matters are the actions of the officers. In London, the force created better prevention programs geared toward encouraging students and their families to take precautions. Two years in, the total value of stolen and damaged property in those areas had plunged. “These systems don’t give you the answer. They’re tools,” Whitford says. “You need to put them into the right hands.”

In 2016, Brian Rector, an executive director in Saskatchewan’s Ministry of Justice, was casting around for new approaches to deal with the factors that drive Indigenous youth to run away. He’d heard about US predictive-policing systems and wondered if they could be adapted to identify potential victims, instead of crimes. “Thirty percent of police activity deals with criminal offences, while 70 percent is social,” he explains. Social issues can refer to everything from calls concerning individuals with mental illness or addiction problems to calls about those who are homeless or vulnerable.

The Saskatoon Police Service had long had a poor relationship with Indigenous communities. In 1992, Shelley Napope disappeared. As Saskatoon police eventually discovered, the sixteen-year-old was a victim of John Crawford, a local serial killer. But police’s initial profile of Napope — an Indigenous teen officers believed was caught up in drugs and prostitution — meant that, even though her family made dozens of visits to the station to ask for help, her case didn’t merit an immediate response. Darlene Okemaysim-Sicotte, a Saskatoon-based activist whose cousin is Napope’s mother, says that the lack of police action was typical in many cases like Napope’s. “Ask any family,” she says. “They’ve had to report more than once to get action.”

The Saskatoon Police Service began overhauling its relationships with Indigenous residents after a 2003 inquiry, and part of this included changing how the force dealt with missing-persons reports. But even though it created a new policy to respond to each report immediately, there still was a problem: the service can see up to 2,700 cases each year, and the missing-persons unit is made up of just three officers. “Do the math,” says psychologist Stephen Wormith, director of the University of Saskatchewan’s Centre for Forensic Behavioural Science and Justice Studies. With so many cases, Wormith says, they still needed to triage.

Rector asked Wormith’s team to develop an algorithm that would help Saskatoon police find missing youth quickly, using predictions gleaned from analysis of historical data on previous disappearances, as well as other information from schools and welfare and children’s-aid case files. According to Daniel Anvari, a mathematician who worked on the lab’s algorithms, the system analyzes hundreds of variables in current and previous missing-persons cases for both youth and adults. So if a fifteen-year-old living in a foster home has been absent for two weeks and has gone missing before, the system can quickly scan earlier cases with similar facts, provide information on how those situations were resolved, and show where the teen turned up. The system is designed to rapidly generate leads that might otherwise take days for a cop riffling through case files to uncover. And in missing-persons cases, time is key. “The longer the person is missing,” Saskatoon deputy chief Jeff Bent explains, “the more likely they’ll get victimized.”
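A stripped-down illustration of that similar-case lookup appears below. The case fields and records are invented examples, not the Saskatoon system’s actual variables, but the principle is the same: rank past cases by how many facts they share with the current one and surface how they were resolved.

```python
# A minimal sketch of similar-case retrieval, using invented case records.
def similarity(case_a, case_b):
    """Count how many recorded facts two cases share."""
    shared = set(case_a) & set(case_b)
    return sum(case_a[field] == case_b[field] for field in shared)

past_cases = [
    {"age": 15, "in_foster_care": True,  "prior_disappearances": 2, "found_at": "friend's apartment"},
    {"age": 34, "in_foster_care": False, "prior_disappearances": 0, "found_at": "hospital"},
    {"age": 16, "in_foster_care": True,  "prior_disappearances": 3, "found_at": "relative's home"},
]

current = {"age": 15, "in_foster_care": True, "prior_disappearances": 2}

# Surface the closest historical matches and where those youths turned up.
for past in sorted(past_cases, key=lambda c: similarity(current, c), reverse=True)[:2]:
    print(f"{similarity(current, past)} shared facts; resolved at {past['found_at']}")
```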

Rector’s ambitions for the lab go well beyond more-efficient police investigations into missing people. He sees the technology as a means of developing social policies that can forestall various types of violence and victimization. The data, he says, can identify early warning signs — such as repeated incidents of self-harm in the months prior to a teen running away — and offer officials a better understanding of the particular conditions confronting young people who flee repeatedly.

Saskatoon isn’t the only jurisdiction experimenting with predictive algorithms aimed at anticipating victimization. In Greater Pittsburgh, the county’s children-and-family-services department has deployed an algorithm that can examine household occupants’ previous criminal and medical activity to help screeners make risk assessments on calls coming into a child-abuse hotline. Caseworkers can also rapidly access historical data about a family’s situation that used to be spread across multiple databases. The system, according to an eighteen-month New York Times investigation, is intended to allow workers to focus on the most precarious cases, which wasn’t always happening in the past. The two academics who developed the algorithm reviewed the outcomes of almost 80,000 earlier cases recorded between 2010 and 2014 and found that social workers had mistakenly screened out a large number of children who ended up being killed or grievously injured by a parent. “You can’t believe the amount of subjectivity that goes into child-protection decisions,” Rachel Berger, a physician and child-abuse expert, told the Times. “That’s why I love predictive analytics. It’s finally bringing some objectivity and science to decisions that can be so unbelievably life-changing.”
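In spirit, such screening-assist tools boil a household’s history down to a single number a hotline worker can weigh alongside the call itself. The sketch below is a deliberately crude, invented example of that kind of weighted score; the Pittsburgh-area tool was built from tens of thousands of historical cases and far more variables.

```python
# A crude sketch of a screening-assist risk score. Fields and weights are invented.
WEIGHTS = {
    "prior_referrals": 2.0,
    "household_criminal_history": 1.5,
    "parental_substance_treatment": 1.0,
}

def screening_score(household):
    """Combine weighted history flags into one number for the screener."""
    return sum(WEIGHTS[field] * float(household.get(field, 0)) for field in WEIGHTS)

call = {"prior_referrals": 3, "household_criminal_history": 1, "parental_substance_treatment": 0}
score = screening_score(call)
print(f"risk score: {score:.1f}",
      "(flag for priority follow-up)" if score >= 5 else "(routine handling)")
```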

There may be promise in predictive policing, but the rapid evolution of the technologies has led a number of civil libertarians and academics to raise grave reservations about its usage. According to Teresa Scassa, a University of Ottawa professor and the Canada Research Chair in Information Law and Policy, the technology is “moving much faster than government,” and safeguards surrounding privacy and civil liberties have not been able to keep up. After all, it’s one thing if Amazon’s algorithms incorrectly predict what kind of book a person is interested in, but it’s quite another if a neighbourhood becomes the subject of over-policing because of an opaque, and potentially flawed, set of statistics.

While the Ontario government passed legislation in 2017 to restrict carding — a tactic where police would stop people, most often young black men, to demand ID and an explanation for their whereabouts and, in some cases, to conduct street searches — departments still have the right to collect and store information about previous street checks in their databases, presumably for future use. These bits of information are now perfectly suited for predictive-policing applications and may potentially be fed into larger systems that combine disparate sources of personal data outside the criminal-justice system. As Andrew Ferguson explains, “Initial predictive-policing projects have raised the question of whether this data-driven focus serves merely to enable, or even justify, a high-tech version of racial profiling.”

Indeed, some experts say governments and the courts must tackle “algorithmic bias” before law enforcement agencies make further investments into these technologies. A 2016 investigation by ProPublica into bail and sentencing decisions in the US found that algorithms used to gauge risk of reoffending when setting parole terms are far more likely to label black defendants, rather than white defendants, high risk. In Canada, the Supreme Court is currently weighing a case in which an Indigenous man convicted of murder has alleged that anti-Indigenous biases have found their way into sentencing tests used by the judicial system.

According to Brenda McPhail, the Canadian Civil Liberties Association’s director of privacy, technology, and surveillance, much of today’s crime data is distorted by racial and class bias. Building a forecasting algorithm around historical 911 calls, carding information, arrests, and convictions may predict increased likelihood of crime in certain areas simply because they’ve been over-policed in the past. A few US jurisdictions, including Oakland, have rejected predictive policing specifically because of this concern and because of worries about how it could further isolate officers from the communities they serve.
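The feedback loop critics describe can be shown with a toy simulation: send extra patrols wherever past records are highest, let more patrols produce more recorded incidents, and watch the next round of forecasts lean even harder on the same neighbourhood. Every number below is invented.

```python
# A toy simulation of the over-policing feedback loop. All figures are invented.
import random

random.seed(1)
TRUE_CRIME_RATE = [0.05, 0.05]   # two neighbourhoods with identical underlying crime
recorded = [10, 12]              # but one starts with slightly more recorded incidents

for year in range(5):
    patrols = [0, 0]
    patrols[recorded.index(max(recorded))] = 50   # send extra patrols where records are highest
    for area in range(2):
        observations = 100 + patrols[area]        # more patrols, more chances to record incidents
        recorded[area] += sum(random.random() < TRUE_CRIME_RATE[area]
                              for _ in range(observations))
    print(f"year {year + 1}: recorded incidents {recorded}")
```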

Ferguson adds that if police officials rely too heavily on prediction generators, they run the risk of overlooking crimes such as fraud, sexual assault, and domestic violence that are taking place in communities that are less likely to be identified as risky by these algorithms. “There are other crimes that get undervalued because we’re not counting them,” he says. “It creates a hierarchy within the criminal-justice system.”

The quality of the data isn’t the only source of concern. Ferguson conducted a review of predictive-policing pilot projects in the US and found that early positive results were used to justify further investment but have proven difficult to replicate in other locales: a prediction engine that works well in the low-rise sprawl of LA may not work at all in the higher-density parts of Manhattan, he notes.

Yet, as with so many artificial-intelligence technologies now storming into markets, the hype has proven difficult to resist, especially for overstretched police forces. Predictive-policing firms dutifully say they’re not actually selling a real-life version of the system envisioned in the sci-fi thriller Minority Report, where crimes are uncannily prophesied before they occur and the criminals arrested before they can carry them out. Still, the technology’s implicit promise is that powerful computers can be trained to effectively do a significant amount of officers’ work for them, and it will be a long time until we know whether that will ever be the case.

Sarah Brayne, a Canadian sociologist at the University of Texas at Austin, recently spent two and a half years embedded with the LA PD. While there, she observed how its cops had come to rely on intelligence aggregated by a secretive company called Palantir. The firm, established by PayPal co-founder and early Facebook investor Peter Thiel, integrates huge sets of data — everything from licence-plate registries to addresses linked to cell phones to warrants and “unstructured data,” such as intercepted emails — for military, security, and law enforcement uses. Users simply search a name and receive an unprecedented look into the lives of private citizens, with no judicial oversight. “You’re able to build up a case with historical data without having a search warrant,” Brayne explains.

Brayne says that, during her time at the LA PD, she was witnessing the birth of “big-data surveillance,” which is developing parallel to predictive-policing technology. “Mathematized police practices serve to place individuals already under suspicion under new and deeper forms of surveillance, while appearing to be objective,” she observed in a 2017 paper recounting her experiences. In conversation, she describes it another way: “In the age of big data, law enforcement is starting to look more like intelligence gathering.”

Brayne recalls an encounter that revealed how potentially problematic these technologies can be: one algorithm showed, among other things, how many times an individual’s name had been searched previously. When she asked one detective why this kind of information was significant, he replied that the number of queries about an individual can be read as a proxy for their suspiciousness.

Perhaps this family of rapidly evolving algorithms offers a reminder about the old saying that when all you have is a hammer, every problem looks like a nail. Predictive policing and big data may be a new set of tools, but as critics point out, they are only as effective as the people who use them. “The problem with predictive policing,” Ferguson muses, “is the policing part.”
