This means the delegation of “lethal authority”.
Quite a few such systems already exist, such as the Israeli Harpy drone, designed to linger over enemy territory and destroy any enemy radar it discovers. The US Navy developed the so-called “fire and forget” missile, which can be fired from a ship towards a remote area, where it automatically seeks and destroys an enemy vessel.
Currently, the US Department of Defense is developing a machine vision system based on AI technology, known as Project Maven, which analyses massive amounts of data and video captured by military drones in order to detect objects, track their motion and alert human analysts to patterns and abnormal or suspicious activity.
The data is also used to improve the targeting of autonomous drone strikes using facial and object recognition.
Ahead of other countries, the US is experimenting with autonomous boats that track submarines for very long distances, while China researches the use of “swarm intelligence” to enable teams of drones to hunt and destroy as a single unit. Russia is working on an underwater drone that could deliver powerful nuclear warheads to destroy entire cities.
Machine learning and drone swarms are among the technologies that could change the future of war. Some of these AI technologies are already being used on the battlefield. For example, Russia maintained that a drone swarm attacked one of its bases in Syria.
However, according to Manuel DeLanda, the advent of intelligent and autonomous bombs, missiles and drones equipped with artificial observation and decision-making capabilities is part of a much larger transfer of cognitive structures from humans to machines.
If we think about it, Microsoft’s Cortana, Google’s Now, Samsung’s Bixby, Apple’s Siri and Netflix’s streaming algorithms all combine user input, access to big data, and bespoke algorithms to provide decision support to human users based on their decision history and the decisions of millions of other users. This is nothing more than a slow and steady shift of authority from humans to algorithms. Similarly, the decision-making in war is being transferred to intelligent machines – unfortunately not without risk.
On March 29, 2018, five members of the Al Manthari family were travelling in a Toyota Land Cruiser in Yemen to the city of al-Sawma’ah to pick up a local elder. At about 2pm a missile from a US Predator drone equipped with object and facial recognition struck the vehicle, killing three of its passengers, while a fourth died later. The US took responsibility for the strike, claiming the victims were terrorists. Yet Yemenis who knew the family claim they were civilians.
Could it be that the Al Manthari family were killed due to incorrect metadata, which is used to select drone targets? Metadata is often harvested from mobile phones – conversations, text messages, email, web browsing behaviour, location, and patterns of behaviour. Attacks are based on specific predetermined criteria, for example a connection to a suspected terrorist phone number or a suspected terrorist location. But sometimes this intelligence can be wrong.
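The danger of such predetermined criteria can be made concrete with a minimal sketch. The snippet below is purely illustrative – the watchlist, criteria and data are invented for this example and bear no relation to any real targeting system – but it shows how rule-based matching on phone metadata inevitably produces false positives: a civilian who once dialled a flagged number satisfies the same rule as a genuine suspect.

```python
# Hypothetical sketch of rule-based selection from phone metadata.
# All numbers, locations and criteria are invented for illustration.

SUSPECT_NUMBERS = {"+967-555-0100"}   # hypothetical watchlisted number
SUSPECT_LOCATIONS = {"al-Hazm"}       # hypothetical flagged area

def flag_target(record):
    """Flag a phone record if it matches any predetermined criterion."""
    contacted_suspect = bool(record["contacted"] & SUSPECT_NUMBERS)
    in_suspect_area = record["location"] in SUSPECT_LOCATIONS
    return contacted_suspect or in_suspect_area

# A civilian who once called a flagged number (a wrong number, a shared
# taxi dispatcher) matches the same rule as a genuine suspect.
civilian = {"contacted": {"+967-555-0100"}, "location": "Sana'a"}
print(flag_target(civilian))  # True - a false positive
```

The rule cannot distinguish why a connection exists; it only sees that one does, which is precisely the gap between metadata and ground truth that the Al Manthari case illustrates.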
AI weapons may have many benefits, but they carry significant ethical concerns, in particular that a computer algorithm could both select and eliminate human targets without any human involvement.
However, based on history, I do not doubt that algorithms will run future wars and that “killer robots” will be used. Perhaps Ronald Arkin’s suggestion should be followed to fit all autonomous weapons with “ethical governors”.
One thing is for sure in the war of algorithms – domination will depend on the quality of each side’s AI and algorithms.
Professor Louis Fourie is the deputy vice-chancellor: Knowledge and Information Technology – Cape Peninsula University of Technology.
AN UNMANNED aerial vehicle (UAV) hangs on display above the Textron booth during the Special Operations Forces Industry Conference in Florida.