Robotic soldiers march on
An intriguing question is being intensely debated in labs, boardrooms and ministerial gatherings in many countries — should robots be soldiers? Should soldiers be robots?
Rhetorical as it may seem, the fact is that robotic war systems are moving beyond concept to actual use. Remotely managed drone systems with weaponised capability have been in use for almost a decade now. The Obama administration will be remembered for triggering the deployment of weaponised drones in conflict zones. Other countries too have deployed remotely managed drones for attacks and assassinations.
This wave of killing machines is called lethal autonomous weapons systems (LAWs), or simply killer robots. The military-industrial complexes of various countries are eagerly working with labs to develop LAWs that span a spectrum of capabilities. Drones are now capable of using artificial intelligence to choose their targets. The decision to kill is thus moving from the human managing the drone to an algorithm.
Drones were the beginning. Similar weapons are being created and tested for ground assault. The US army already has remote-managed reconnaissance robots that can also be weaponised. These can sneak up to a target on the ground and shoot it while the operating soldier is safe at a distance. Broadly, the level of autonomy falls into three categories, depending on the configuration: remotely operated; remotely monitored, with the operator stepping in only to intervene; and fully autonomous land-based or flying machines.
The United Nations Convention on Certain Conventional Weapons has been holding meetings of experts and governments to consider a global treaty on LAWs. At the last meeting in April, many countries called for a ban on LAWs. But some others, including the US, UK, Russia and Israel, are focused on creating a treaty that establishes a framework for the use of LAWs.
For many countries, a robotic soldier is the ideal weapon. Attacking the enemy from a remote location or some distance offers the advantage of a reduced body count. The political system that may push for war also recoils from the prospect of body bags.
Some developed countries are eagerly but quietly investing in LAWs. This is creating a new category within the armaments industry, which is getting closer to new tech companies for collaborative efforts. Boston Dynamics has already stunned the world with its dexterous autonomous robots that can do backflips and open doors. It is just a matter of time before someone mounts a gun on one to sneak around battlefields.
However, some tech companies are now facing a dilemma: should they develop AI-based systems for the weapons sector? There are reports that a few thousand Google employees have protested the company’s collaboration with the US Department of Defense.
Such protests notwithstanding, the two industries are coming together to create the third big shift in weaponry. After gunpowder and nuclear weapons, AI and autonomous killing machines will soon decide the fate of armed conflict.
The question then is of framing rules that govern ethics, accountability and oversight of independent-minded robot soldiers.
Among emerging markets, China has invested in technologies that will enable it to build LAWs. China’s Anbot robo-cop, which looks like Star Wars’ R2-D2, is not far from being weaponised. India’s Centre for Artificial Intelligence and Robotics, under the Defence Research and Development Organisation, is experimenting with its own robotic soldiers. Most countries are keeping their plans under cover, but the direction and intention are clear. After the deterrence achieved by nuclear weapons, robotic warfare will be easier to unleash.