Sunday Times (Sri Lanka)

We’re running out of time to stop killer robot weapons

The fully autonomous AI weapons now being developed could disastrously transform warfare. The UN must act fast

- By Bonnie Docherty

It’s five years this month since the launch of the Campaign to Stop Killer Robots, a global coalition of non-governmental groups calling for a ban on fully autonomous weapons. This month also marks the fifth time that countries have convened at the United Nations in Geneva to address the problems these weapons would pose if they were developed and put into use.

The countries meeting in Geneva this week are party to a major disarmament treaty called the Convention on Certain Conventional Weapons. While some diplomatic progress has been made under that treaty’s auspices since 2013, the pace needs to pick up dramatically. Countries that recognise the dangers of fully autonomous weapons cannot wait another five years if they are to prevent the weapons from becoming a reality.

Fully autonomous weapons, which would select and engage targets without meaningful human control, do not yet exist, but scientists have warned they soon could. Precursors have already been developed or deployed as autonomy has become increasingly common on the battlefield. Hi-tech military powers, including China, Israel, Russia, South Korea, the UK and the US, have invested heavily in the development of autonomous weapons. So far there is no specific international law to halt this trend.

Experts have sounded the alarm, emphasising that fully autonomous weapons raise a host of concerns. For many people, allowing machines that cannot appreciate the value of human life to make life-and-death decisions crosses a moral red line.

Legally, the so-called “killer robots” would lack human judgment, meaning that it would be very challenging to ensure that their decisions complied with international humanitarian and human rights law. For example, a robot could not be preprogrammed to assess the proportionality of using force in every situation, and it would find it difficult to judge accurately whether civilian harm outweighed military advantage in each particular instance.

Fully autonomous weapons also raise the question: who would be responsible for attacks that violate these laws if a human did not make the decision to fire on a specific target? In fact, it would be legally difficult and potentially unfair to hold anyone responsible for unforeseeable harm to civilians.

There are also security concerns. Without any legal restraints on fully autonomous weapons, militaries could engage in an arms race, vying to develop deadly technology that may reduce the need to deploy soldiers – while possibly lowering the threshold for armed conflict.

The Campaign to Stop Killer Robots, which Human Rights Watch co-founded and coordinates, argues that new international laws are needed to preempt the development, production and use of fully autonomous weapons.

Many roboticists, faith leaders, Nobel peace laureates and others have reached the same conclusion, as is evident from their open letters, publications and UN statements: the world needs to prevent the creation of these weapons because once they appear in arsenals, it will be too late.

At the UN meeting this week -- one of two weeklong sessions that will take place this year -- countries were striving to craft a working definition of the weapons in question and to recommend options to address the concerns they raise. The countries have offered several possible ways to proceed. The momentum for a preemptive prohibition is clearly growing.

As of Monday, the African Group and Austria have joined 22 other countries voicing explicit support for a ban. Other countries have aligned themselves with a French/German proposal for a political declaration, a set of nonbinding guidelines that would be an interim solution at best. Still others have explicitly expressed opposition to a preemptive prohibition and a preference for relying on existing international law.

Despite this divergence of opinion, the discussion on the first day had a significant common thread. Almost all countries that spoke talked about the need for some degree of human control over the use of force. The widespread recognition that humans must have control over life-and-death decisions is heartening. If countries agree that such control needs to be truly meaningful, a requirement for human control and a prohibition on weapons that operate without such control are two sides of the same coin.

These developments are positive, but the countries meeting this week clearly have much work ahead of them. To stay in front of technology, they should negotiate and adopt a new legally binding ban by the end of 2019. Only then will they have a chance to prevent the creation of a weapon that could revolutionise warfare in a frightening way.

• Bonnie Docherty is a senior arms researcher at Human Rights Watch and associate director of armed conflict and civilian protection at Harvard Law School’s International Human Rights Clinic.

Courtesy the Guardian, UK

Mock killer robot in central London. ‘Countries that recognise the dangers cannot wait another five years to prevent such weapons from becoming a reality.’ Photograph: Carl Court/AFP/Getty Images
