Former cop puts himself in conflict over crime prediction
THERE’S a story Brett Goldstein likes to tell. It starts on a Friday night in 2010 with him sitting in a darkened Crown Victoria on a Chicago street, poring over maps. Goldstein was a commander at the Chicago Police Department, in charge of a small unit using data analysis to predict where certain types of crimes were likely to occur at any time.
Earlier that day, his computer models forecast a heightened probability of violence on a particular South Side block. Now that he and his partner were there, Goldstein was doubting himself.
“It didn’t look like it should be a target for a shooting,” he recalled. “The houses looked great. Everything was well manicured. You expect, if you’re in this neighbourhood, you’re looking for abandoned buildings, you’re looking for people selling dope. I saw none of that.”
Still, they staked it out. Goldstein’s wife had just given birth to their second child, and he was exhausted after a day in the office. He started to doze off. Goldstein’s partner argued that the data must be wrong. At 11pm, they left.
Several hours later, Goldstein woke up to the sound of his BlackBerry buzzing. There had been a shooting on the block where he’d been camped out.
“This sticks with me because we thought we shouldn’t be there, but the computer thought we should be there,” Goldstein said. He took it as vindication of his vision for the future of law enforcement. “I do believe in a policeman’s gut. But I also believe in augmenting his or her gut.”
Seven years after that evening, Goldstein threw on a grey suit and headed from his Manhattan hotel to New Jersey. Last spring he founded CivicScape, a technology company that sells crime-predicting software to police departments. Nine cities, including four of the country’s 35 largest cities by population, are using or implementing the software, at an annual cost from US$30,000 (RM135,000) for cities with fewer than 100,000 people to US$155,000 in cities with populations over one million. Goldstein was checking in on the two clients who were furthest along: The police departments in Camden and Linden.
Goldstein likes to harp on his own lack of charisma, but he’s well suited to be a pitchman for police departments. In Chicago he rose from patrol officer to the city’s chief data officer over a seven-year government career and regularly drops a few war stories into his conversations with cops.
He’s also peddling something that every department is after nowadays: Technological sophistication. The criminal justice system produces reams of data, and new computing methods may turn any pool of numbers into something useful. Today, almost every major US police department is using or has used some form of commercial software that makes crime predictions, to determine which blocks warrant heightened police presence or which people are most likely to be involved in crime. Technology is transforming the craft of policing.
Not everyone is rubbing their hands in anticipation. Many police officers still see so-called predictive policing software as mumbo jumbo. Critics outside law enforcement say it’s actively destructive. The historical information these programmes use to predict patterns of crime isn’t a neutral recounting of objective fact; it’s a reflection of socio-economic disparities and the aggressive policing of black neighbourhoods. Computer scientists have held up predictive policing as a poster child of how automated decision making can be misused. Others mock it as pseudoscience.
“Systems that manufacture unexplained ‘threat’ assessments have no valid place in constitutional policing,” a coalition of civil rights and technology associations, including the ACLU, the Brennan Center for Justice, and the Center for Democracy & Technology, wrote in a statement last summer.
A numbing progression of police shootings in the past several years serves as a reminder of what’s at stake when police officers see certain communities as disproportionately threatening. Over eight days in late June, juries failed to convict officers who killed black men in Minnesota, Ohio and Wisconsin. In each case, the officer’s defence relied on his perception of danger. The worst-case scenario with predictive policing software is deploying officers to target areas with their hackles raised, leading them to turn violent in what would otherwise be routine encounters.
The police departments Goldstein visited in New Jersey didn’t raise any questions about fairness during his recent trip, but there was scepticism nonetheless. He had barely started speaking to a group of top officers in the Linden Police Department when the man who handled the city’s procurement process confessed how wary he was of software vendors’ magical-sounding claims. Goldstein nodded. As a cop, he said, he hated sitting through “the vendor nonsense.” Goldstein launched into a singsong voice: “Oh, you’re going to have a flying car, and it’s going to stop people, and you’re going to be Super Po-Po! They’ll promise you anything.”
Goldstein’s company does make one unusual promise, which it thinks can satisfy sceptics in law enforcement and civil rights circles simultaneously. Other companies that make predictive software for criminal justice settings keep their algorithms secret for competitive reasons. In March, CivicScape published its code on GitHub, a website where computer programmers post and critique one another’s work. The unprecedented move caused an immediate stir among people who follow the cop tech industry.
“They’re doing all the things I’ve been screaming about for years,” said Andrew Ferguson, a professor at the University of the District of Columbia’s law school and author of the forthcoming book, “The Rise of Big Data Policing.”
Posting computer code online won’t erase the worries about predictive policing. There are still concerns about how CivicScape responds to perceived shortcomings, and there’s also the big question of what police departments do with the intelligence it produces. But more than any other company, CivicScape has turned itself into a test case for what it means for law enforcement to use artificial intelligence in a way that’s transparent and accountable, and whether that’s even possible.
Goldstein, 43, didn’t start off wanting to be a cop. He was director of information technology at OpenTable, the online restaurant reservation company, but after 9/11 he began to question the significance of that work. In 2004, Goldstein saw an advertisement for the Chicago Police Department’s entry exam, took it and did well. He left OpenTable in 2006.
After 13 months as a beat cop, Goldstein was promoted to commander and put in charge of a new unit running computer models to anticipate where crime would happen. The unit provided intelligence far beyond anything the department had relied on before, according to Michael Masters, who first met Goldstein during his academy days when Masters was an adviser to Mayor Richard M. Daley, then moved to the police department and now works at CivicScape.
“We were well ahead of our time,” said Masters. Goldstein was perfectly placed to build technology into the daily work of policing. “You don’t have people who were cops, and have ridden in squad cars, building these tools.”