Daily Mirror (Sri Lanka)

Advent of autonomous weapons

- BY ASANGA U. RANASINGHE

First, they ran. The ones who were not fast enough got killed. Then, they began to stand their ground. Fight back. Later, they attacked and hunted. Early man used sticks and tree branches – then animal bones, stones, rocks and metal. Later, gunpowder added fuel to the fire, literally.

Nuclear and chemical weapons proved that humans will not restrain their desire for power, even if it means killing mercilessly. Right now, humans are busy applying sophisticated knowledge and the latest available technology to gain more power and control. Autonomous weapons were once science fiction – but not anymore.

What’re they; what’s driving them?

Weapons operating without meaningful human intervention can be described as autonomous weapons. An autonomous weapon selects targets and launches an attack using the ‘intelligence’ it possesses, without human control. Where does this intelligence come from? This non-human intelligence originates from computer science.

The Organisation for Economic Cooperation and Development (OECD) describes Artificial Intelligence (AI) as a “field of computer science focused on systems and machines that behave in intelligent ways”.

Going deeper, it can be seen that a field of AI called Machine Learning (ML) exists: “ML is concerned with computer-based systems of algorithms and statistical models that are designed to learn autonomously, without explicit instructions.”

Exploring further brings us to Deep Learning (DL). The OECD goes on to explain: “DL is a form of ML that classifies input data based on a multi-step process of learning from prior examples.”
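The idea of learning from prior examples, without explicit instructions, can be illustrated with a minimal sketch. The toy classifier below (an illustrative example of my own, not drawn from the OECD material) labels a new data point by copying the label of the most similar training example, rather than following hand-written rules:

```python
# A minimal sketch of machine learning: a 1-nearest-neighbour
# classifier. It "learns" from labelled examples instead of
# following explicit, hand-coded rules. (Illustrative only;
# real ML and DL systems use far richer models.)

def predict(training_data, point):
    """Return the label of the training example closest to `point`."""
    def distance(a, b):
        # Squared Euclidean distance between two feature tuples.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = min(training_data, key=lambda ex: distance(ex[0], point))
    return nearest[1]

# Labelled examples: (features, label)
examples = [((1, 1), "small"), ((1, 2), "small"),
            ((8, 9), "large"), ((9, 8), "large")]

print(predict(examples, (2, 1)))  # "small"
print(predict(examples, (7, 9)))  # "large"
```

No rule saying “coordinates below 5 are small” was ever written; the behaviour emerges entirely from the labelled examples, which is the essential point of the OECD definition.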

The weapons industry is driven by the military-industrial complex, which absorbs the latest technology into weapons systems. Autonomous weapons are the lovechild of the marriage between AI and the arms industry, and their deployment seems only a matter of time.

The struggle among the superpowers for world supremacy drives the military-industrial complex to produce new and improved weapons systems. The production and use of autonomous weapons might seem appealing to the public when politicians and authorities present them as a way to minimise the human cost of war, or as a way to counter terrorism without sacrificing soldiers and troops.

Going further, man is trying to conquer the moon and Mars. This will also be a good justification for investing in autonomous weapons: to protect humans from unknown enemies.

Implications of use

Imagine sending an autonomous drone weapon back to the Stone Age. The instant killing machine would be hailed as a new god. It won’t be too different now. Autonomous weapons have the potential to undermine the value assigned to human lives. Their abilities can easily overpower human capabilities by overriding human agency. Numerous Hollywood movies have given us a sneak peek of a scary future in which machines have subjugated mankind.

The same struggle for world supremacy will also ensure that the military-industrial complex has ample funding for research and development (R&D) of autonomous weapons.

According to the Campaign to Stop Killer Robots, “Fully autonomous weapons would decide who lives and dies, without further human intervention, which crosses a moral threshold. As machines, they would lack the inherently human characteristics such as compassion that are necessary to make complex ethical choices.”

A chilling real-life incident was the alleged attempted assassination of President Nicolás Maduro of Venezuela on August 4, 2018. Two drones exploded in the vicinity of where the president was addressing a military parade. Seven soldiers were injured in the incident.

These drones were later found to be commercial ones with C4 plastic explosives attached to them. This was not a successful attack by a fully autonomous weapon. But the incident is sufficient to understand the implications of future autonomous weapons.

Closer to home, the unfortunate Easter Sunday attack in April of this year, which claimed the lives of about 250 innocent civilians in Sri Lanka, was carried out by humans. Imagine the devastation had it been caused by autonomous weapons. This incident also illustrates another implication of autonomous weapons.

The country’s system has still been unable to hold any individuals or groups responsible and accountable for this heinous crime. One could argue that this reflects the incompetence of the current system, since the attack was carried out by organised radical human terrorists. Humans can be traced and held accountable, but not autonomous weapons.

Accountability is a major issue for autonomous weapons. Let’s take a simple example. If a pet dog attacks someone, the owner can be held responsible. The victim can be paid compensation, including medical expenses, and if the dog is rabid, it can be put down.

The Campaign to Stop Killer Robots states, “It’s unclear who, if anyone, could be held responsible for unlawful acts caused by a fully autonomous weapon: the programmer, manufacturer, commander or machine itself. This accountability gap would make it difficult to ensure justice, especially for victims.”

The legal system and current laws are drafted by humans to prosecute humans who commit crimes against other humans – not for robots that kill humans. International conventions, such as the Geneva Conventions, urge warring factions to balance military necessities and humanitarian interests. But again, these were established for wars in which humans make the decisions.

It is also dangerous to assume that only organised militaries would have access to autonomous weapons. What happens when terrorists develop or take over autonomous weapon systems? What happens when criminal organisations start using these weapons? Since autonomous weapons are built on computer systems, hackers could take control of them, which could prove fatal for human civilisation.

Another alarming aspect is that the use of autonomous weapons is not foreseen only for military battles and wars. Authoritarian regimes can use autonomous weapons for their routine administration.

Referring again to the Campaign to Stop Killer Robots: “Fully autonomous weapons could be used in other circumstances outside of armed conflict, such as in border control and policing. They could be used to suppress protest and prop up regimes. Force intended as non-lethal could still cause many deaths.”

Also, the technologies underlying autonomous weapons may not be as perfect as they are presented. Some may argue that AI can take the fallible human element out of the equation and help make objective decisions. But would these machines have the compassion and ethical reasoning that humans possess?

Nuclear and chemical weapons are dangerous and inhumane, yet they have still been used in certain wars. The use of autonomous weapons would be an even greater nightmare.

How can we stop it?

We must ban the R&D, manufacture and use of autonomous weapons. Wars should not be fought at all; there is nothing humane about war. But if wars are inevitable and do get fought, let’s at least ensure that humans remain at the helm of decision-making.

AI should not be abandoned. Instead, it should be developed and used to benefit humanity. There are plenty of humanitarian circumstances in which technology can be applied to ease the suffering of fellow humans.

(Asanga U. Ranasinghe, a Sustainable Human Development and Innovation Specialist and founder of STAMPEDE SDGS Tech Accelerator, which was established to prevent future inequality created by technology, can be reached at asanga.ranasinghe@gmail.com)
