China Daily (Hong Kong)

Fears killer bots on the way that may wipe us out


ON WEDNESDAY, more than 50 artificial intelligence researchers reportedly co-signed a letter announcing that they will boycott a university in the Republic of Korea until it pledges to refrain from developing AI weapons without "meaningful human control". Beijing News commented on Monday:

According to reports, a laboratory, co-sponsored by Korea Advanced Institute of Science and Technology and an unidentified enterprise, plans to develop AI-based weapons.

It should be noted that the AI researchers opposed "autonomous weapons", but many media outlets have mistaken these for "smart weapons" in their reports. In fact, there is a clear red line between the two, and it should never be crossed.

Smart weapons have long been in existence. For example, a smart missile launcher might be able to automatically locate its target, and a military unmanned aerial vehicle might identify its route with the help of sensors and positioning systems.

However, no matter how smart they are, they remain under human control. It is a person who decides when to pull the trigger.

Autonomous weapons are different. As the researchers said in their open letter, such weapons "lack meaningful human control", and it is the AI that decides whether to launch a strike on a military target.

That is, of course, extremely dangerous. Most military targets are humans or have humans inside, and it is a deep violation of human dignity if a machine is empowered to decide a human's fate.

Worse, AIs lack moral or ethical obligations. If a total war breaks out, they might destroy humankind altogether. That is why the majority of AI researchers oppose the development of autonomous weapons. Armed conflicts might not be totally eliminated, but at least that bottom line should not be crossed.
