Pre-emptive strikes

Researchers fear EU may fund killer robot work

THE (Times Higher Education) - FRONT PAGE - David.matthews@timeshighereducation.com

“EuroSWARM” sounds like something out of Nigel Farage’s nightmares.

In fact, it is a European Union-funded research project that has experimented with drones, remote-controlled cars and other sensors to create an autonomously behaving “swarm” of bots that can communicate with each other.

In a demonstration scenario, researchers set the swarm to check out a “suspicious-looking” vehicle, explained Hyo-sang Shin, reader in guidance, navigation and control at Cranfield University, one of the project partners. The idea is that the swarm could be used for scouting an area before troops are deployed, he said.

The project, which came to an end in November last year, did not equip any of the drones or cars with weapons. The swarm is rather about “maximising the information you can collect”, said Dr Shin.

But EuroSWARM’s military uses have critics worried. It is one of the first trial projects in a new era of EU-funded military research; the budget for similar activities is set to explode over the next decade.

This funding splurge, triggered by fears of European backwardness in military technology, has seen the global debate around research into “lethal autonomous weapons” (Laws) – colloquially known as “killer robots” – move to Brussels.

“Although the EU hasn’t given any funding (yet) to ‘killer robots’ in the strict sense,” said Bram Vranken, a researcher at Vredesactie, a Belgian peace organisation, “it is clearly prioritising robotic systems which are pushing the boundaries towards increasingly autonomous systems”, such as swarm systems or “integrated and autonomous surveillance technology”.

Vredesactie is one of several groups, hailing from Germany, Italy, the UK and Spain, that have formed Researchers for Peace to campaign against what they call the “further militarisation of the European research budget”. The group accuses the EU of developing autonomous weapons “without any public debate”. So far, more than 600 researchers have signed a petition in support.

Aside from EuroSWARM, Mr Vranken said that he was also worried about Ocean 2020, a €35 million (£30.8 million) project that aims to “integrate drones and unmanned submarines into fleet operations”. The project, led by Leonardo, an Italian weapons contractor, involves several European ministries of defence, plus the Fraunhofer Society, a German applied research network.

These projects are potentially just the beginning. Earlier this month, the EU announced its spending plans for 2021-27, and pledged €13 billion over the period for the European Defence Fund, even more than was expected. Of this funding, €4.1 billion will be set aside explicitly for research, a huge leap in resources compared with now, with the rest spent on development.

This will place the EU “among the top four” defence research and technology investors in Europe, according to the European Commission. However, this will still be peanuts compared with the US, where the Department of Defense is spending about $16 billion (£11.8 billion) a year on science and technology.

The fight in Brussels is now over how this money should be used. In 2014, the European Parliament was one of the first bodies to take seriously warnings about “killer robots”, calling on member states to “ban the development, production and use of fully autonomous weapons which enable strikes to be carried out without human intervention”.

In February this year, MEPs amended proposals from the commission – the EU’s executive arm – to prevent EU funds being spent on “fully” autonomous weapons that “enable strikes to be carried out without meaningful human intervention and control”. Asked whether it supports this prohibition, a European Commission spokeswoman declined to comment on the record. For now, it is not clear if the MEPs’ prohibition will stand.

Those pushing for increased EU-wide military research point out that the Continent lags behind rivals when it comes to developing new military technologies such as drones.

But this is not an argument that impresses Laëtitia Sédou, EU programme officer at the European Network Against Arms Trade. “One of the reasons [for the creation of the EU] is to try and prevent going back into this arms race,” she said.

Despite an international effort by the Campaign to Stop Killer Robots, governments are yet to agree to a ban on weapons where humans no longer have “meaningful control” over the use of force. What, if anything, can universities and researchers do in the meantime?

One option is to boycott institutions seen to be taking their research too far. In March, dozens of researchers threatened to boycott the Korea Advanced Institute of Science and Technology, a university in South Korea, after it opened a Research Center for the Convergence of National Defense and Artificial Intelligence with an arms company. This spurred a pledge from KAIST’s president that the university would avoid developing “autonomous weapon[s] lacking meaningful human control”.

But this poses the question of how far scientists should collaborate with research projects that get close to – but stop short of – creating a fully autonomous weapon; there is a huge range of processes that can be automated beforehand, some more ethically challenging than others.

The “biggest ethical issue” is automating the decision to fire, said Stuart Parkinson, executive director of Scientists for Global Responsibility, a UK-based organisation with about 750 members. But automatic take-off and landing for drones is arguably “less problematic”, he said.

These complexities mean that “it’s hard to say this project is ethical; this is not”, Dr Parkinson added. For this reason, universities need to make sure that researchers are ethically trained, while ethicists should be included in research teams, he said.

Blood on the lab floor

As with any area of fast-developing research, a decent proportion of research spending should be devoted to looking into how the technology might be misused, Dr Parkinson argued. And at the moment “we don’t have that”, he said.

And when in doubt over the ethics of a project, just look at the funders, Dr Parkinson advised. If your backers are military, “whatever you do will be sucked into that world”, he said.

Once an electrical engineer, Dr Parkinson left the field after concluding that it was simply too dominated by military research funders. For some academics, “maybe it’s time to look for a different direction”, he said.

But there will be no shortage of young researchers willing to take the place of the disenchanted, hence the need for the military funders themselves to abide by proper research ethics guidelines, Dr Parkinson pointed out.

For his part, Dr Shin acknowledged that his EuroSWARM project might one day be a building block of a lethal autonomous weapon system, but argued that “any technology can be dangerous”.

He said that he would “probably” agree to work on a research project that actually involved weapons. “But I would restrict myself to things that might benefit or reduce risk to human troops or [reduce] civilian casualties,” Dr Shin added. He is against drones ever using their own judgement to fire.

Proper regulation, rather than academic boycotts such as the one proposed against KAIST, is likely to be more effective, Dr Shin said.

It will be “years rather than decades” before drones are able to fire on their own initiative, said Dr Parkinson, although then their “reliability will be in the eye of the beholder”.

But in a sense, fully autonomous weapons are already with us: the Korean border already has machine-gun turrets that can in theory fire automatically on movement, Dr Parkinson said (although the South Korean military has reportedly made sure that a human has to authorise any attack). He warned: “That’s an example of where something is already happening.”

Self-ruling drones will fire on their own initiative in ‘years rather than decades’
