Khaleej Times

Keep an eye out for AI weapons

- Allan Jacob allan@khaleejtimes.com Allan is a news junkie and history buff who loves a debate

There’s this raging debate in my head about smart machines making people lazy — about Artificial Intelligence (AI) and common sense. Many fear AI will take our jobs. Others are fascinated by the prospect of devices soon taking up arms on our behalf and making our world a safer place. We’ve traversed different millennia and seem delighted to hand over our intelligence to gadgets. Once that task is complete, they can be weaponised with all the resources at our disposal in sweet, abject surrender.

We invented gunpowder, then developed nuclear, chemical and biological weapons, and are now giving machines the power to decide whom to locate, target and kill. Easy-peasy, or easy prey? It saves us the trouble of dragging our sorry, lazy selves to distant battlefields, fighting in dustbowls and trenches of hate, in wars we started in the first place. Boots on the ground doing the gory work is the stuff of legend from the World Wars. The modern age is dictated by the speed at which we outsource every remnant of what is left of our brains.

We are witnessing the Fourth Industrial Revolution, where technology is marching faster than anything we have witnessed before. This revolution is breeding what is known as the Third Revolution in Warfare, driven by Artificial Intelligence — a scary prospect that tech czars like Elon Musk and more than 100 AI CEOs have been warning about for more than a year now, yet they are doing nothing to curtail its development.

But what got me interested recently in the progress of Artificial Intelligence was a news report linked to Google, which many mainstream publications missed or played down. It concerned Project Maven, which the digital behemoth was undertaking for the US Department of Defence, or the Pentagon. Specifically, the tech giant was helping the US government analyse drone footage using AI. It sounded harmless, but Google employees — 3,000 of them — were up in arms. “Google should not be in the business of war,” they wrote in a letter to CEO Sundar Pichai. There were reports of resignations, and the company decided to pull out of the contract ahead of its renewal next year. The firm also said it would draft a set of principles to guide its work on AI for military use in future.

Now, here’s the fine print, and it’s telling. The company said the technology is intended to ‘save’ lives, that humans would ‘review’ it, and that it would spare people ‘tedious’ work. Simple, but the fact remains that Google downplayed the importance of Project Maven and other AI military ventures, for which it expected revenue to soar from $15 million to $250 million annually. That said, the company isn’t involved in drone warfare; it merely provides the optics — the images — for AI to analyse and help reduce collateral damage in conflict situations.

Location matters in warfare, and that’s where AI makes a grand entry. Google has the technology and was only progressing naturally to help the military. Profits are key to the future of any company, and what it did cannot be termed unethical. What the military could do with AI technology is what should concern us.

The question here is: are we concerned enough? Or are we blindsided by the comfort provided by the latest tech? It doesn’t help that 30 countries are developing autonomous weapons — killer robots and drones that are getting better at doing things their way, the locate-lock-and-target routine. Machines helping themselves to some human scalps is a scenario we may not wish upon ourselves, yet we appear meek. For the moment, governments are looking at all means (and machines) at their disposal to gain an edge in the great geopolitical game being played on different continents.

Online, the algorithms driven by AI are already waging cyberwars, gaining the upper hand against people and systems, with governments, corporations and even terrorists involved. In the real world, meanwhile, AI is making life easier as we interact more with machines than with people like ourselves. Rapid strides have been made in healthcare, education, travel and other sectors using this remarkable technology.

Less human intervention may speed up some processes and save lives in many cases, but we need more people to control the same technology we have put out there for surveillance, intelligence gathering and quick skills (or kills) in the pursuit of war and peace.

One such weapon is an autonomous gun, the Samsung SGR-A1 sentry gun, which South Korea deploys on its border with North Korea. It can fire on its own if it spots enemy activity close to the 38th Parallel (the line dividing the two countries), or even an incursion. Did you know that the United States is developing a ‘thinking’ warship that can conquer the waves and shoot enemies on its own? But the biggest concern in the defence community is the prospect of autonomous drone swarms that can cause damage on an unimaginable scale.

And people can sit back and say: Look, no hands. Military analysts can claim that the terrain demanded it. Only the bad guys got hurt and we kept our troops at home, they will say. Others will proclaim that a defensive reflex made us develop these weapons to keep us and future generations out of harm’s way. All these are valid excuses as we develop a closer rapport with machines while shunning people-to-people contact and so-called confidence building measures (CBMs). Who knows, self-propelled ICBMs (Intercontinental Ballistic Missiles) tipped with nuclear weapons could be the answer in future wars, all because we didn’t ban them when we could and merely regulated their use.

We could always sing, like Billy Joel: “We didn’t start the fire, it was always burning, since the world’s been turning.” Sure, we can’t blame ourselves for getting to the Third Revolution in Warfare in double-quick time. We’ve traded our intelligence for comfort, and for the sanctuary of security provided by machines that have a mind of their own. The intellect and wisdom we humans are so proud of have been given short shrift. With Project Maven, covert, weaponised AI is out of the bag. I’ve made my peace with it. You should, too.
