AUTONOMOUS DEATH
If there seems to be an arms race in applying machine learning in the civilian world, you can be sure there is one under way in the military as well. The US military has used remote-controlled machines and semi-autonomous hardware for years, but we are now on the cusp of death by fully autonomous machines: Lethal Autonomous Weapons Systems, or LAWS. Current systems still require human input for the final order to fire, although certain defensive systems, such as the Phalanx missile-defense system, are exempt.
Many programmers and researchers involved in machine learning have boycotted institutions and projects with military connections. Google has pledged not to work on military projects, as has Elon Musk, but this will not dent the research by much. Russia is deploying automated gun and missile systems to defend its bases and oil pipelines, and has developed the heavily armed Uran-9 drone tank. It has also said publicly that it will ignore any future ban on LAWS. Are we ready to be killed by an algorithm? The United Nations is currently considering a ban on any weapon that can kill without human intervention, but the talks are being delayed by the usual squabbling and objections. It might like to hurry: Russia is not alone. The South Korean arms company DoDAAM is already selling gun turrets that are technically capable of fully autonomous operation, and these are already deployed along the border with North Korea and in the Middle East. The concern is that if such weapons are not restricted, an arms race will follow, with each of the big powers scrambling to keep ahead. Once such systems are developed, it will be very difficult to go back.