AI faces its ‘Oppenheimer moment’
Regulators who want to get a grip on an emerging generation of artificially intelligent killing machines may not have much time left to do so, governments were warned on Monday.
As autonomous weapons systems rapidly proliferate, including across battlefields in Ukraine and Gaza, algorithms and unmanned aerial vehicles are already helping military planners decide whether or not to hit targets. Soon, that decision could be outsourced entirely to the machines.
“This is the Oppenheimer moment of our generation,” said Austrian Foreign Minister Alexander Schallenberg, referencing J. Robert Oppenheimer, who helped invent the atomic bomb in 1945 before going on to advocate for controls over the spread of nuclear arms.
Civilian, military and technology officials from more than 100 countries convened in Vienna to discuss how their economies can control the merger of AI with military technologies — two sectors that have recently animated investors, helping push stock valuations to historic highs.
Spreading global conflict, combined with financial incentives for companies to promote AI, adds to the challenge of controlling killer robots, said Jaan Tallinn, an early investor in Alphabet Inc’s AI platform DeepMind Technologies.
Governments around the world have taken steps to collaborate with companies integrating AI tools into defense. The Pentagon is pouring millions of dollars into AI startups. The European Union last week paid Thales SA to create an imagery database to help evaluate battlefield targets.
Tel Aviv-based +972 Magazine reported that Israel was using an AI program called “Lavender” to come up with assassination targets. After the story — which Israel has disputed — United Nations Secretary-General António Guterres said he was “deeply troubled” by reports of AI use in the Gaza military campaign and that no part of life-and-death decisions should be delegated to the cold calculations of algorithms.
“The future of slaughterbots is here,” said Anthony Aguirre, a physicist who predicted the trajectory the technology would take in a short 2017 film seen by more than 1.6 million viewers. “We need an arms-control treaty negotiated by the United Nations General Assembly.” In the longer run, once the technology becomes accessible to nonstate actors and potentially to terrorists, countries will be forced to write new rules, predicted Arnoldo André Tinoco, Costa Rica’s foreign minister.