Santa Fe New Mexican

Nations hesitant to set up rules on AI weapons

Smaller nations want guardrails on use as technology improves rapidly

- By Eric Lipton

It seems like something out of science fiction: swarms of killer robots that hunt down targets on their own and are capable of flying in for the kill without any human signing off.

But it is approaching reality as the United States, China and a handful of other nations make rapid progress in developing and deploying new technology that has the potential to reshape the nature of warfare by turning life and death decisions over to autonomous drones equipped with artificial intelligence programs.

That prospect is so worrying to many other governments that they are trying to focus attention on it with proposals at the United Nations to impose legally binding rules on the use of what militaries call lethal autonomous weapons.

“This is really one of the most significant inflection points for humanity,” Alexander Kmentt, Austria’s chief negotiator on the issue, said in an interview. “What’s the role of human beings in the use of force — it’s an absolutely fundamental security issue, a legal issue and an ethical issue.”

But while the U.N. is providing a platform for governments to express their concerns, the process seems unlikely to yield substantive new legally binding restrictions. The United States, Russia, Australia, Israel and others have all argued no new international law is needed for now, while China wants to define any legal limit so narrowly it would have little practical effect, arms control advocates say.

The result has been to tie the debate up in a procedural knot with little chance of progress on a legally binding mandate anytime soon.

“We do not see that it is really the right time,” Konstantin Vorontsov, the deputy head of the Russian delegation to the U.N., told diplomats who were packed into a basement conference room recently at the U.N. headquarters in New York.

Last week, officials from China and the United States discussed a related issue: potential limits on the use of AI in decisions about deploying nuclear weapons.

Against that backdrop, the question of what limits should be placed on the use of lethal autonomous weapons has taken on new urgency, and for now has come down to whether it is enough for the U.N. simply to adopt nonbinding guidelines, the position supported by the United States.

“The word ‘must’ will be very difficult for our delegation to accept,” Joshua Dorosin, the chief international agreements officer at the State Department, told other negotiators during a debate in May over the language of proposed restrictions.

Dorosin and members of the U.S. delegation, which includes a representative from the Pentagon, have argued that instead of a new international law, the U.N. should clarify that existing international human rights laws already prohibit nations from using weapons that target civilians or cause a disproportionate amount of harm to them.

But the position being taken by the major powers has only increased the anxiety among smaller nations, who say they are worried that lethal autonomous weapons might become common on the battlefield before there is any agreement on rules for their use.

“Complacency does not seem to be an option anymore,” Ambassador Khalil Hashmi of Pakistan said during a meeting at U.N. headquarters. “The window of opportunity to act is rapidly diminishing as we prepare for a technological breakout.”

Rapid advances in AI and the intense use of drones in conflicts in Ukraine and the Middle East have combined to make the issue that much more urgent. So far, drones generally rely on human operators to carry out lethal missions, but software is being developed that soon will allow them to find and select targets more on their own.

“This isn’t the plot of a dystopian novel, but a looming reality,” Gaston Browne, the prime minister of Antigua and Barbuda, told officials at a recent U.N. meeting.

EDMUND D. FOUNTAIN/NEW YORK TIMES FILE PHOTO — The experimental Kratos XQ-58 unmanned combat aerial vehicle this summer at Eglin Air Force Base in Florida. The drone uses artificial intelligence and has the capability to carry weapons, although it has not yet been used in combat. Nations are trying to establish rules for such vehicles.
