Robot wars are unethical, says MOD scientist
Artificial intelligence weapons systems are “absolutely unethical”, the Ministry of Defence’s chief scientific adviser has said. Simon Cholerton insisted that MOD research “has no plans to develop fully automated weapons” even if that risked putting the Armed Forces at a disadvantage.
TERMINATOR-STYLE robot soldiers and weapons systems that kill without human command are “absolutely unethical” and will not be developed by Britain, the Ministry of Defence’s chief scientific adviser has said.
In his first interview since his appointment earlier this year, Simon Cholerton insisted MOD research, which he oversees, was “doing no work and has no plans to develop fully automated weapons” – even if that risks putting the Armed Forces at a disadvantage.
“There will be a certain amount of asymmetry around this, and there always is when you face an enemy who doesn’t share our values,” he said. “But that makes no difference to our ethics.”
His intervention comes amid what has been called a new global arms race for lethal autonomous weapons (Laws).
Governments have repeatedly met, under the auspices of the United Nations, to try to thrash out guidelines for the regulation of such weapons, but progress has proved difficult: unlike chemical, biological or nuclear arms, which are hard to manufacture and stockpile in large numbers without detection, Laws can be created with just the flick of a switch.
“You might have a system that apparently has human control but then just needs a software patch to be fully automatable,” said Paul Scharre, a former special operations officer with the American military who worked in the office of the US secretary of defence.
At least a dozen states, including the United States, Russia, the UK and China, are involved in the development of partly automated systems, with research funding on robotics estimated to hit almost $200 billion by 2020.
The Terminator Hollywood film franchise, which starred Arnold Schwarzenegger as a military cyborg, pitted man against machine in a dystopian science-fiction future.
However, while automated defensive weapons already exist, Cholerton said Laws represented “a red line” for the UK. He promised that the British military would instead develop technologies to protect flesh-and-blood soldiers against Laws in future conflicts.
“On the battlefield, the question is how do you protect soldiers from systems that we consider absolutely unethical,” he said. “As we see threats emerge, we need to develop technologies and strategies that work within our ethical principles.”
Among the technologies he said would “protect our people” are lasers, or “directed-energy weapons”. “The wow factor for me is that these are primarily defensive systems,” he said.
The MOD announced yesterday that British troops had begun testing more than 70 different types of futuristic technology, including surveillance drones and unmanned vehicles, in the fields of Salisbury Plain. The four-week experiment, called “Autonomous Warrior”, allows troops to try out a range of prototype technology.