Gulf News

How algorithms (secretly) run the world

THE COMPLEX FORMULAS ARE PLAYING A GROWING ROLE IN OUR LIVES — FROM DETECTING SKIN CANCERS TO CHOOSING NEW FACEBOOK FRIENDS


When you browse online for a new pair of shoes, pick a movie to stream on Netflix or apply for a car loan, an algorithm likely has a say in the outcome.

The complex mathematical formulas are playing a growing role in all walks of life: from detecting skin cancers to suggesting new Facebook friends, deciding who gets a job, how police resources are deployed, who gets insurance at what cost, or who is on a “no fly” list.

Algorithms are being used — experimentally — to write news articles from raw data, while Donald Trump’s presidential campaign was helped by behavioural marketers who used an algorithm to locate the highest concentrations of “persuadable voters”.

But while such automated tools can inject a measure of objectivity into previously subjective decisions, fears are rising over the lack of transparency they can entail, with pressure growing to apply standards of ethics or “accountability”.

Data scientist Cathy O’Neil cautions about “blindly trusting” formulas to determine a fair outcome.

“Algorithms are not inherently fair, because the person who builds the model defines success,” she said.

O’Neil argues that while some algorithms may be helpful, others can be nefarious. In her 2016 book, ‘Weapons of Math Destruction’, she cites some troubling examples in the United States:

— Public schools in Washington DC in 2010 fired more than 200 teachers — including several well-respected instructors — based on scores from an algorithmic formula that evaluated performance.

— A man diagnosed with bipolar disorder was rejected for employment at seven major retailers after a third-party “personality” test deemed him a high risk based on its algorithmic classification.

— Some courts rely on computer-ranked formulas to determine jail sentences and parole, which may discriminate against minorities by taking into account “risk” factors such as their neighbourhoods and friend or family links to crime.

A White House report on big data warned that such algorithmic systems “rely on the imperfect inputs, logic, probability, and people who design them.”

The report noted that data systems can ideally help weed out human bias but warned against algorithms “systematically disadvantaging certain groups.”
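O’Neil’s court example and the report’s warning describe the same mechanism: a score that folds in where someone lives can penalise people for their postcode rather than their own record. The following is a minimal, purely hypothetical sketch in Python; the function name, weights, data and threshold are invented for illustration and do not describe any real sentencing or scoring tool.

```python
# Toy sketch only: a "risk score" that mixes an individual's own record with a
# neighbourhood-level factor. All names, weights and figures are hypothetical.

def risk_score(prior_offences: int, neighbourhood_crime_rate: float) -> float:
    # The builder of the model chooses these weights -- and so defines "risk".
    return 0.5 * prior_offences + 5.0 * neighbourhood_crime_rate

defendants = [
    {"name": "A", "prior_offences": 1, "neighbourhood_crime_rate": 0.1},
    {"name": "B", "prior_offences": 1, "neighbourhood_crime_rate": 0.6},
]

THRESHOLD = 2.0  # above this, the tool flags "high risk"

for d in defendants:
    score = risk_score(d["prior_offences"], d["neighbourhood_crime_rate"])
    label = "high risk" if score > THRESHOLD else "low risk"
    print(f"{d['name']}: score={score:.1f} -> {label}")

# Identical personal records, different postcodes: B is flagged and A is not,
# which is how a formula can systematically disadvantage one group.
```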

Zeynep Tufekci, a University of North Carolina professor who studies technology and society, said automated decisions are often based on data collected about people, sometimes without their knowledge.

“These computational systems can infer all sorts of things about you from your digital crumbs,” Tufekci said in a recent TED talk. “They can infer your sexual orientation, your personality traits, your political leanings. They have predictive power with high levels of accuracy.”

Part of the problem, she said, stems from asking computers to answer questions that have no single right answer.

“They are subjective, open-ended and value-laden questions, asking who should the company hire, which update from which friend should you be shown, which convict is more likely to reoffend.”

Frank Pasquale, a University of Maryland law professor and author of “The Black Box Society: The Secret Algorithms That Control Money and Information,” shares the same concerns.

He suggests one way to remedy unfair effects may be to enforce existing laws on consumer protection or deceptive practices.

Pasquale points to the European Union’s data protection law, set from next year to create a “right of explanation” when consumers are affected by an algorithmic decision, as a model that could be expanded.

This would “either force transparency or it will stop algorithms from being used in certain contexts,” Pasquale said.

Photo caption: Donald Trump’s presidential campaign was helped by behavioural marketers who used an algorithm to locate the highest concentrations of ‘persuadable voters’.
