Keeping a check on amoral algorithms

- Lagardien is a former executive dean of business and economic sciences at Nelson Mandela University and has worked in the office of the chief economist of the World Bank as well as the secretariat of the National Planning Commission.

Several years ago I played ice hockey. I was very bad at it. Nonetheless, at some point during an especially fast and furious match — we called it an “industrial” match — somewhere in Western Ontario, the blade of my hockey stick slashed the face of an opponent. The slash drew blood. My opponent, who was probably twice my size, was livid. He dropped his gloves, code-speak for being ready to fight.

I came really close to being beaten to within an inch of my life. Somewhat tragically, an ice hockey match in rural Canada is sometimes incomplete without a fight.

It was an accident, I told the referee. Your stick, he said, is an extension of your arm. You have to be in control of it at all times. I was sent off the ice and left the arena by the back door.

What is the point of all this, other than misguided icecapades? Well, in a highly competitive, fast-paced ice hockey match, where everyone has some basic form of protection, there are rules and regulations. In a hockey game, the stick is an instrument, a low-tech device, and its use has to be regulated.

The stick on its own can be trusted, but the person wielding it has to be responsible for its use at all times.

And so we get to artificial intelligence (AI), automation and robotics. From the algorithms that track and collect our data whenever we visit a website, to the drones that drop bombs on people in remote towns and villages — and will soon, probably, deliver our Amazon orders — there is enormous potential for abuse, for lapses in morality and ethics and, well, for accidents.

As in ice hockey, there have to be regulations, and someone has to be held accountable.

With good reason, many of us marvel at the novelties of information and communications technology and the Candy Crush revolution.

The “wow” of seeing special offers on your Facebook timeline just after your Google search for a new laptop grows ever bigger, until we accept it as normal.

The point is that Facebook and Google are two separate and independen­t companies. How does Facebook know that I searched for a new laptop on Google?

Without sounding conspiratorial, someone is watching our activities online. Impersonal, and by extension amoral, algorithms are tracking our tastes and preferences, where and when we travel, and probably our most intimate discussions. You can be rude, obnoxious or offensive to a person, but an algorithm that tracks your movements or preferences has no feelings.

One of my secret and seemingly incongruous intellectual passions is war, warfare, the philosophy of war and its impact on society. And, in the age of 21st-century robotics, I have developed an interest in the ethics of war. As a pacifist, I have tried not to share that too widely.

Which brings me to the use of drones and remote-controlled vehicles. Driverless cars are in the early stages of development. There are broad and deep discussions to be had about these.

The use of drones to drop bombs or in targeted assassinations is becoming cause for concern among scholars and thinkers on war and strategy. The general drift of discussions around the issue is that it decentres us from war and its consequences and, along the way, disassociates pilots from reality and removes soldiers from battlefields. It’s all in the game.

Most people will agree that as soon as a safety device is designed, the bad guys devise ways around it.

At some point in the not-too-distant future all the glitches will have been removed, and driverless cars, buses and cargo ships — and possibly aircraft — could be driven by AI systems and guided by satellites.

This will no doubt continue to drive the information and communication technology revolution and the “new globalisation”. But we cannot ignore the likelihood that bad guys will direct car bombs or drones into buildings and villages.

Worse still, if we contract out decision-making on, say, going to war to an automated system driven by amoral algorithms, we remove the human element. This is already happening on a small scale with targeted assassinations in western Asia.

A second concern, seemingly far-fetched but not unimaginable, is when machines make the actual decisions by themselves. We cannot roll back scientific discoveries or technological advances, but we can, and probably should, regulate the application of new technologies.

If I may use the hockey stick analogy, the algorithms and drones are extensions of states and firms. A machine has no feelings, morals or ethics, but its designers, users and owners do — and they have to be held accountable.
