The Bakersfield Californian

An emphatic ‘yes’ to killer robots

- RICH LOWRY Rich Lowry is on Twitter @RichLowry.

In the mid-1990s, Cyberdyne Systems Corporation created an artificial intelligence-based defense system called Skynet. When the system achieved self-awareness on Aug. 29, 1997, it decided that humanity was the enemy and precipitated a devastating nuclear war.

And that’s pretty much why the San Francisco board of supervisors reversed an initial decision to allow its police force to deploy killer robots in extreme situations.

Of course, Skynet, the archvillain of the “Terminator” franchise, isn’t real. Yet when the topic is robots, very few people seem to care about the distinction.

There is indeed a large and entertaining body of movies about creepy and dangerous robots, from “Metropolis” to “Ex Machina,” from “The Day the Earth Stood Still” to “I, Robot,” but the key word in science fiction is “fiction.”

Taking cues from these films about how we should use robots is a little like trying to learn how to handle criminal gangs from “Minions: The Rise of Gru.”

The initial vote in San Francisco and its rapid reversal — plus the rollout of an AI bot that can write reasonably well — have brought more handwringing about the potential threats of our technological future.

The risk is that we’ll take outlandish dystopian scenarios seriously and allow a poorly informed Luddism, combined with the special pleading of potentially threatened incumbent industries, to crimp technological advance.

Robots have had terrible PR going on a century now with little or no justification. What have they ever done to anyone, besides vacuum the corners of our houses and maybe deliver a pizza? On the basis of the historical record, it is robots who should fear humans. We are guilty of every imaginable crime, sometimes on an unspeakable scale; the Roomba might occasionally startle the dog.

The phrase “killer robots” is irresistible to people and, of course, has, shall we say, negative connotations. Still, robots are only a tool like any other.

The police already avail themselves of all sorts of mechanical implements that assist their efforts to track down suspects and, if necessary, kill them, from radios to cars to battering rams to helicopters to, of course, firearms. If we trust a police officer with, say, a Glock 19 — a lethal weapon — there’s no good reason to deny him or her a killer robot during a mass shooting or hostage-taking.

It’s always easy to say someone else should put themselves in harm’s way. There will come a day when insisting the police not deploy robots will seem like insisting every mission to neutralize a terrorist be flown by a manned aircraft instead of a drone.

By the same token, we don’t ask members of the bomb squad to poke and prod potential bombs themselves when they can have robots do it for them.

In Dallas in 2016, police used a robot mounted with explosives to take out a sniper who had shot and killed five officers. What would have been more dystopian — more officers getting shot, or a killer robot getting the job done without exposing anyone else to harm?

The deepest fear about robots and AI is that they will become so sophisticated and advanced they will spiral out of our control.

Even if this were theoretically possible, we are extremely far away from the time when robots achieve human-like autonomy, or when AI matches our intelligence. Human intelligence is still such a mystery — and the variety of human interactions that we take for granted so subtle and vast — that truly replicating anything approaching it is like trying to send a manned mission to Proxima Centauri b.

It’s true that robots, like every other technological advance, destroy jobs. They also create new ones.

With the U.S. experiencing lackluster productivity growth since 2010, we need the best robots and AI that we can muster. We shouldn’t fear them just because — decades-old spoiler alert — HAL turns out to be a dastardly villain in “2001: A Space Odyssey.”

