San Francisco Chronicle

Algorithm grants, removes freedom

Tech calculations have major impact on prisons, policing

By Cade Metz and Adam Satariano

PHILADELPHIA — Darnell Gates sat at a long table in a downtown Philadelphia office building. He wore a black T-shirt with “California” in bright yellow letters on the chest. He had never been to the state, but he hoped to visit family there after finishing his probation.

When Gates was released from jail in 2018 — he had served time for running a car into a house in 2013 and later for violently threatening his former domestic partner — he was required to visit a probation office once a week after he had been deemed “high risk.”

He called the visits his “tail” and his “leash.” Eventually, his leash was stretched to every two weeks. Later, it became a month. Gates wasn’t told why. He complained that conversations with his probation officers were cold and impersonal. They rarely took the time to understand his rehabilitation.

He didn’t realize that an algorithm had tagged him high risk until he was told about it during an interview with the New York Times.

“What do you mean?” Gates, 30, asked. “You mean to tell me I’m dealing with all this because of a computer?”

In Philadelphia, an algorithm created by a University of Pennsylvania professor has helped dictate the experience of probationers for at least five years.

The algorithm is one of many making decisions about people’s lives in the United States and Europe. Authorities use so-called predictive algorithms to set police patrols, prison sentences and probation rules. In the Netherlands, an algorithm flagged welfare fraud risks. A British city rates which teenagers are most likely to become criminals.

Nearly every state has turned to this new sort of governance algorithm, according to the Electronic Privacy Information Center, a nonprofit dedicated to digital rights. Algorithm Watch, a watchdog in Berlin, has identified similar programs in at least 16 European countries.

As the practice has spread, U.N. investigators, civil rights lawyers, labor unions and community organizers have been pushing back.

They are angered by a growing dependence on automated systems that are taking humans and transparency out of the process. It is often not clear how the systems are making their decisions. Is gender a factor? Age? ZIP code? It’s hard to say, since many states and countries have few rules requiring that algorithm makers disclose their formulas.

They also worry that the biases — involving race, class and geography — of the people who create the algorithms are being baked into these systems, as ProPublica has reported. In San Jose, where an algorithm is used during arraignment hearings, an organization called Silicon Valley De-Bug interviews the family of each defendant, takes this personal information to each hearing and shares it with defenders as a kind of counterbalance to algorithms.

Two community organizing groups, the Media Mobilizing Project in Philadelphia and MediaJustice in Oakland, recently compiled a nationwide database of prediction algorithms. And Community Justice Exchange, a national organization that supports community organizers, is distributing a 50-page guide that advises organizers on how to confront the use of algorithms.

The algorithms are supposed to reduce the burden on understaffed agencies, cut government costs and — ideally — remove human bias. Opponents say governments haven’t shown much interest in learning what it means to take humans out of the decision-making. A recent U.N. report warned that governments risked “stumbling zombie-like into a digital welfare dystopia.”

Last year, Idaho passed a law specifying that the methods and data used in bail algorithms must be publicly available so the general public can understand how they work. In the Netherlands, a district court ruled Wednesday that the country’s welfare-fraud software violated European human rights law, one of the first rulings against a government’s use of predictive algorithms.

“Where is my human interaction?” Gates asked, sitting next to his lawyer in the boardroom of the Philadelphia public defender’s office. “How do you win against a computer that is built to stop you? How do you stop something that predetermines your fate?”

On a recent Thursday, Todd Stephens sat in a food court across the street from Citizens Bank Park, home of the Philadelphia Phillies. He was explaining the latest effort to remake state sentencing practices with a predictive algorithm.

Predictive algorithms, at their most basic, work by using historical data to calculate the probability of events, similar to how a sports book determines the odds for a game or pollsters forecast an election result.

The technology builds on statistical techniques that have been used for decades, often for determining risk. They have been supercharged thanks to increases in affordable computing power and available data.

The private sector uses such tools all the time, to predict how likely people are to default on a loan, get sick or be in a car wreck, or whether they will click on an internet ad. Governments, which hold vast amounts of data about the public, have been eager to tap their potential.

A Republican member of the Pennsylvania House of Representatives, Stephens is part of a state commission working to adopt the technology. Like many states, Pennsylvania has mandated that an algorithm be developed to help courts decide the sentence after someone is convicted.

The idea, Stephens said, was to predict how likely people were to commit another crime by collecting information about them and comparing that to statistics describing known offenders. That might include age, sex and past and current convictions.
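In rough code, the mechanics described above might look something like the sketch below, which fits a simple logistic regression to invented historical records and turns the resulting probability into a risk label. It is an illustration only: the inputs, data and 0.5 cutoff are hypothetical and are not drawn from Pennsylvania’s proposed system.

```python
# Minimal sketch, not any agency's actual model: estimate the probability of a
# future event from historical records, then bucket it into a risk label.
# All inputs, data and the 0.5 threshold below are invented for illustration.
from sklearn.linear_model import LogisticRegression

# Hypothetical historical records: [age, prior_convictions], paired with
# whether the person was later reconvicted (1) or not (0).
X_history = [[19, 2], [45, 0], [23, 4], [52, 1], [31, 3], [60, 0]]
y_reoffended = [1, 0, 1, 0, 1, 0]

model = LogisticRegression()
model.fit(X_history, y_reoffended)

# Score a new case: the model outputs a probability, which an agency might
# then translate into a label such as "low risk" or "high risk".
probability = model.predict_proba([[30, 1]])[0][1]
label = "high risk" if probability > 0.5 else "low risk"
print(f"Estimated probability of reoffense: {probability:.2f} -> {label}")
```

Even in a toy version like this, the choices that worry critics are visible: which inputs to include, which historical records to learn from and where to set the threshold are all decisions made by whoever builds the model.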

The commission had proposed a plan that would have leaned on information provided by county probation departments. But the American Civil Liberties Union and community groups protested this plan during public meetings in the fall. They worried it would expand the power of predictive algorithms used for probation, including the one that tagged Gates.

“We walked into a hornet’s nest I didn’t even know existed,” Stephens said.

In response to the protests, the state commission recommended a much simpler setup based on software already used by the state courts. But even this algorithm is difficult for a layperson to understand. Asked to explain it, Stephens suggested speaking with another commissioner.

Nyssa Taylor, criminal justice policy counsel with the Philadelphia ACLU, was among the protesters. She worries that algorithms will exacerbate rather than reduce racial bias. Even if governments share how the systems arrive at their decisions — which happens in Philadelphia in some cases — the math is sometimes too complex for most people to wrap their heads around.

Various algorithms embraced by the Philadelphia criminal justice system were designed by Richard Berk, a professor of criminology and statistics at Penn. These algorithms do not use ZIP codes or other location data that could be a proxy for race, he said. And although he acknowledged that a layperson couldn’t easily understand the algorithm’s decisions, he said human judgment had the same problem.

“All machine-learning algorithms are black boxes, but the human brain is also a black box,” Berk said. “If a judge decides they are going to put you away for 20 years, that is a black box.”

Last year in Rotterdam, Netherlands, a rumor circulating in two predominantly low-income and immigrant neighborhoods claimed that the city government had begun using an experimental algorithm to catch citizens who were committing welfare and tax fraud.

Mohamed Saidi learned about it from a WhatsApp message that he initially thought was a joke. Mohamed Bouchkhachakhe first heard from his mother, who had been told by a friend. Driss Tabghi got word from a local union official.

The rumor turned out to be true.

The Dutch program, System Risk Indication, scans data from government authorities to flag people who may be claiming unemployment aid when they are working or a housing subsidy for living alone when they are living with several others.

The agency that runs the program, the Ministry of Social Affairs and Employment, said the data could include income, debt, education, property, rent, car ownership, home address and the welfare benefits received for children, housing and health care.

The algorithm produces “risk reports” on individuals who should be questioned by investigators. In Rotterdam, where the system was most recently used, 1,263 reports were produced in two neighborhoods.
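The ministry has not disclosed how the scoring works. But the behavior residents describe, in which benefit claims are checked against other government registries, can be illustrated with a deliberately simplified rule check like the one below. Every field name and rule is invented for illustration; this is not the actual System Risk Indication software.

```python
# Hypothetical sketch of the kind of cross-referencing the article describes:
# comparing benefit claims against other registered data and producing a
# "risk report" when they appear inconsistent. All fields and rules invented.
def flag_for_review(person):
    reasons = []
    if person.get("claims_unemployment_aid") and person.get("registered_income", 0) > 0:
        reasons.append("claims unemployment aid but has registered income")
    if person.get("claims_single_occupancy_subsidy") and person.get("registered_residents", 1) > 1:
        reasons.append("claims single-occupancy subsidy but address lists several residents")
    return reasons

records = [
    {"id": "A", "claims_unemployment_aid": True, "registered_income": 1200},
    {"id": "B", "claims_single_occupancy_subsidy": True, "registered_residents": 1},
]
for record in records:
    reasons = flag_for_review(record)
    if reasons:
        print(f"Risk report for {record['id']}: " + "; ".join(reasons))
```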

“You’re putting me in a system that I didn’t even know existed,” said Bouchkhachakhe, who works for a logistics company.

Similar programs exist elsewhere. In North Carolina, IBM software has been used to identify Medicaid fraud. In London, local councils tested software to identify those who may be wrongly claiming a housing benefit. Systems are used to flag children who may be at risk of abuse.

In Rotterdam, opposition built after word about the techniques spread. Privacy rights groups, civil rights lawyers and the largest national labor union rallied citizens to fight the effort.

“They will not tell you if you are on the register,” said Tijmen Wisman, an assistant professor of privacy law who runs a Dutch privacy group. He helped organize a meeting for roughly 75 residents in the affected neighborhoods, many taking video on their smartphones to share with neighbors.

The district court that sided with the opponents ordered an immediate halt to the algorithm’s use. In the closely watched case, which is seen as setting a precedent in Europe about government use of predictive algorithms, the court said that the welfare program lacked privacy safeguards and that the government was inadequately transparent about how it worked. The decision can be appealed.

Once a week in Bristol, England, a team gathers in a conference room to review the latest results from an algorithm meant to identify the most at-risk youths in the city and review caseloads. Representatives from the police and children’s services and a member of the team that designed the software typically attend to scan the list of names.

With youth violence and crime on the rise, and many of the youth programs and community centers where young people once gathered now closed, the local government turned to software to help identify the children most in need. Officials there say the work provides evidence that the technology can work if coupled with a human touch.

Last year, Bristol introduced a program that creates a risk score based on data pulled from police reports, social benefits and other government records. The system tallies crime data, housing information, any known links to others with high risk scores, and whether the youth’s parents were involved in a domestic incident. Schools feed in attendance records.

“You can get quite a complete picture,” said Tom Fowler, 29, the data scientist who helped create the youth scoring system for the Bristol government.

The scores fluctuate depending on how recently the youths had an incident like a school suspension. The goal at the weekly meetings is to identify children at risk of being recruited into crime.
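Fowler’s team has not published its model, but the behavior described here, with scores that rise after new incidents and fade as those incidents recede into the past, can be illustrated with a simple decay-weighted sum. The incident types, weights and six-month half-life below are assumptions made for illustration, not Bristol’s actual parameters.

```python
# Hypothetical sketch of a recency-weighted risk score of the sort described
# in Bristol: recent incidents count for more, and their contribution decays
# as they age. Incident types, weights and half-life are invented.
from datetime import date

HALF_LIFE_DAYS = 180  # assumed: an incident's weight halves every six months
WEIGHTS = {"school_suspension": 2.0, "police_report": 5.0, "missed_school": 1.0}

def risk_score(incidents, today=None):
    """incidents: list of (incident_type, date) tuples."""
    today = today or date.today()
    score = 0.0
    for kind, when in incidents:
        age_days = (today - when).days
        decay = 0.5 ** (age_days / HALF_LIFE_DAYS)  # exponential decay with age
        score += WEIGHTS.get(kind, 0.0) * decay
    return score

history = [("school_suspension", date(2019, 11, 1)), ("police_report", date(2019, 3, 15))]
print(round(risk_score(history, today=date(2020, 1, 30)), 2))
```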

There’s evidence that the algorithm identifies the right people, but the city is still figuring out how to translate the data into action. Last year, a teenager who had one of the highest risk scores stabbed someone to death. In a review of the killing, city officials concluded that they had taken the right steps. Fowler said a person can’t be arrested simply because of the algorithm.

“He had a social worker and one-on-one coaching,” said Fowler, who now works for a data-analytics company. “He made a really bad decision.

“You can’t control for that. Data can only go so far. But it’s pretty much the worst thing that can happen. That makes you do a bit of soul searching whether you did everything you could.”

Sitting in the Philadelphia public defender’s office, Gates said he was an easy person to read, pointing to the tattoos on his arms, which were meant to look like the bones under his skin. He understands machines. From a young age, he enjoyed dismantling computers and smartphones before putting them back together.

But Gates, whom we met through the defender’s office, believed that a person could read him better than a machine.

“Does a computer know I might have to go to a doctor’s appointment on Friday at 2 o’clock?” he asked.

Visiting the probation office so often can prevent him from getting the rest of his life on track. “How is it going to understand me as it is dictating everything that I have to do?” Gates asked.

Several weeks after his interview with the Times, he was allowed to make a short trip to Puerto Rico after a personal appeal to a judge. He always felt comfortable in front of his judge. The experience showed him the importance of a human touch.

“I can’t explain my situation to a computer,” Gates said. “But I can sit here and interact with you, and you can see my expressions and what I am going through.”

Photos: Darnell Gates, 30, who is on probation in Philadelphia, was deemed “high risk” by a computer algorithm; University of Pennsylvania professor of criminology and statistics Richard Berk. (Jessica Kourkounis / New York Times)

Photo: A meeting of a Dutch union and a civil rights group to oppose an algorithm program that flags possible welfare fraud in the Netherlands. Algorithm opponents in many areas want more human oversight. (Dustin Thierry / New York Times, 2019)

Photo: Tom Fowler, a data scientist, helped create a risk scoring system for youths in Bristol, England. (Francesca Jones / New York Times, 2019)
