minority report
John Lorinc is right to raise the issue of “algorithmic bias” in his explanation of predictive policing (“Safety in Numbers,” April). He is right, too, to call the term something of a misnomer: algorithms simply perpetuate the prejudices of the data they’re fed. Policing strategies guided by crime data, which often reflect existing inequalities, such as the over-policing of racialized communities, might only make matters worse. The features of the high-crime neighbourhood Lorinc describes are symptomatic of a community that is impoverished and underserved. The people who live there need more social programs and better economic prospects — not more police officers. An algorithm can’t address the root causes of crime.

Joshua Oliver
The Walrus Editorial Fellowship alumnus
London, UK