Chicago Tribune (Sunday)

ShotSpotter, touted for AI, vetted by humans

Confidential manual details verification process for gunshots

By Garance Burke and Michael Tarm

CHICAGO — In more than 140 cities across the United States, ShotSpotter’s artificial intelligence algorithm and intricate network of microphones evaluate hundreds of thousands of sounds a year to determine if they are gunfire, generating data now being used in criminal cases nationwide.

But a confidential ShotSpotter document obtained by The Associated Press outlines something the company doesn’t always tout about its “precision policing system” — that human employees can quickly overrule and reverse the algorithm’s determinations, and are given broad discretion to decide if a sound is a gunshot, fireworks, thunder or something else.

Such reversals happen 10% of the time by a 2021 company account, which experts say could bring subjectivity into increasingly consequential decisions and conflict with one of the reasons AI is used in law-enforcement tools in the first place — to lessen the role of all-too-fallible humans.

“I’ve listened to a lot of gunshot recordings — and it is not easy to do,” said Robert Maher, a leading national authority on gunshot detection at Montana State University who reviewed the ShotSpotter document. “Sometimes it is obviously a gunshot. Sometimes it is just a ping, ping, ping ... and you can convince yourself it is a gunshot.”

Marked “WARNING: CONFIDENTIAL,” the 19-page operations document spells out how employees in ShotSpotter’s review centers should listen to recordings and assess the algorithm’s finding of likely gunfire based upon a series of factors that may require judgment calls, including whether the sound has the cadence of gunfire, whether the audio pattern looks like “a sideways Christmas tree” and if there is “100% certainty of gunfire in reviewer’s mind.”

ShotSpotter said in a statement to the AP that the human role is a positive check on the algorithm and the “plain-language” document reflects the high standards of accuracy its reviewers must meet.

“Our data, based on the review of millions of incidents, proves that human review adds value, accuracy and consistency to a review process that our customers — and many gunshot victims — depend on,” said Tom Chittum, the company’s vice president of analytics and forensic services.

Chittum added that the company’s expert witnesses have testified in 250 court cases in 22 states, and that its “97% aggregate accuracy rate for real-time detections across all customers” has been verified by an analytics firm the company commissioned.

Another part of the document underscores ShotSpotter’s emphasis on speed and decisiveness, and its commitment to classify sounds in less than a minute and alert local police and 911 dispatchers so they can send officers to the scene.

Titled “Adopting a New York State of Mind,” it refers to the New York Police Department’s request of ShotSpotter to avoid posting alerts of sounds as “probable gunfire” — only definitive classifications as gunfire or non-gunfire.

“End result: It trains the reviewer to be decisive and accurate in their classification and attempts to remove a doubtful publication,” the document reads.

Experts say such guidance under tight time pressure could encourage ShotSpotter reviewers to err in favor of categorizing a sound as a gunshot, even if some evidence for it falls short, potentially boosting the numbers of false positives.

“You’re not giving your humans much time,” said Geoffrey Morrison, a voice-recognition scientist based in Britain who specializes in forensics processes. “And when humans are under great pressure, the possibility of mistakes is higher.”

ShotSpotter says it published 291,726 gunfire alerts to clients in 2021. That same year, in comments to the AP appended to a previous story, ShotSpotter said more than 90% of the time its human reviewers agreed with the machine classification but the company invested in its team of reviewers “for the 10% of the time where they disagree with the machine.”

ShotSpotter did not respond to questions on whether that ratio still holds true.

ShotSpotter’s operations document, which the company argued in court for more than a year was a trade secret, was recently released from a protective order in a Chicago court case in which police and prosecutors used ShotSpotter data as evidence in charging a Chicago grandfather with murder in 2020 for allegedly shooting a man inside his car. Michael Williams spent nearly a year in jail before a judge dismissed the case because of insufficient evidence.

Evidence in Williams’ pretrial hearings showed ShotSpotter’s algorithm initially classified a noise picked up by microphones as a firecracker, making that determination with 98% confidence. But a ShotSpotter reviewer who assessed the sound quickly relabeled it as a gunshot.

The Cook County Public Defender’s Office says the operations document was the only paperwork ShotSpotter sent in response to multiple subpoenas for any guidelines, manuals or other scientific protocols. The publicly traded company has long resisted calls to open its operations to independent scientific scrutiny.

Fremont, California-based ShotSpotter acknowledged to the AP it has other “comprehensive training and operational materials” but deems them “confidential and trade secret.”

ShotSpotter installed its first sensors in Redwood City, California, in 1996, and for years relied solely on local 911 dispatchers and police to review each potential gunshot until adding its own human reviewers in 2011.

Paul Greene, a ShotSpotter employee who testifies frequently about the system, explained in a 2013 evidentiary hearing that staff reviewers addressed issues with a system that “has been known from time to time to give false positives” because “it doesn’t have an ear to listen.”

As cities have weighed the system’s promise against its price tag — which can reach $95,000 per square mile per year — company employees have explained in detail how its acoustic sensors on utility poles and light posts pick up loud pops, booms or bangs and then filter the sounds through an algorithm that automatically classifies whether they’re gunfire or something else.

But until now, little has been known about the next step: how ShotSpotter’s human reviewers in Washington, D.C., and the San Francisco Bay area decide what is a gunshot versus any other noise, 24 hours a day.

“Listening to the audio downloads are important,” according to the document written by David Valdez, a former police officer and now-retired supervisor of one of ShotSpotter’s review centers. “Sometimes the audio is compelling for gunfire that they may override all other characteristics.”

Photo: CHARLES REX ARBOGAST/AP 2021. ShotSpotter data helped convict Michael Williams of murder; a judge later dismissed the case.
