The Scotsman

The use of AI gives rise to huge potential legal issues

Maybe I can find a programmer to blame if software causes me harm, but how do I deal with the mysterious black box, asks Iain Mitchell


In October 2017, Saudi Arabia conferred citizenship on Sophia. The puzzling thing is that Sophia is a robot, even more human-looking than the archetypal female robot in Fritz Lang’s expressionist masterpiece, Metropolis. So, is Saudi Arabia leading the world in giving legal recognition to cyber life forms, or is it a triumph of hype over reality?

Sophia was developed and built by Hong Kong-based company Hanson Robotics, headed by David Hanson, a former Disney “Imagineer” who created performing mannequins such as the singing simulacrum of Jack Sparrow in Pirates of the Caribbean. With that pedigree, you might expect her to look, and even sound, convincing. According to Hanson, she uses artificial intelligence, visual data processing and facial recognition technology to answer questions, make conversation and imitate human gestures and facial expressions. So, does she have artificial intelligence?

It’s a very slippery and imprecise expression, “Artificial Intelligence”. According to the Oxford English Dictionary, it isn’t even a thing but, rather, a field of study.

Yet, everywhere we look, people are using “AI” to describe, well, what? If you are in the business of selling software for running an office, then AI is an on-trend way of making the stuff you are trying to sell sound really hi-tech. Ironically, even when applied to rather humdrum products, the description is not inaccurate: at its widest, AI can be used to describe the running of any computer program.

At the heart of all computer processes, whether a program on a general-purpose computer or the functioning of a narrowly specialised device, like an automatic door-opening sensor, is the use of algorithms: sets of rules to be followed in calculations or other problem-solving operations. Run an algorithm on a computer and that’s AI.

However, when most people use “AI”, they are thinking of either complex algorithms, capable, for example, of sifting job applications, or of what are called “neural networks”, which are closer to what we think of as robots. Essentially, neural networks are systems trained on a dataset, as with DeepMind’s Go-playing program, AlphaGo, or which even train themselves by working out all the combinations from the basic rules of the game, as with AlphaGo Zero. The problem with such neural networks is that, unlike a conventional program whose code can be analysed and understood, neural networks are black boxes: even the people who create them cannot determine how they make their decisions.

The all-pervasive use of AI in general, the particular opacity of neural networks, and the risk of flaws and biases in the datasets used to train such networks give rise to huge potential legal issues. What happens if an AI system goes wrong? What do we mean by “wrong” anyway? What if medical diagnosis software fails to pick up that I am suffering the early stages of cancer? What if I am discriminated against by an AI system when I apply for a job? Can AI be used to sift Big Data and end up infringing my human rights?

Maybe I can find a programmer to blame if conventional software causes me damage, but how do I deal with the mysterious black box?

These are the sorts of problems which lawyers are only now beginning to address. The Law Commissions for England and Wales and for Scotland are in the middle of a joint consultation on legal issues arising from the use of self-driving vehicles. The European Commission has a high-level expert group looking at the legal implications of AI, and is considering reform of product liability law to take account of AI systems. The IT industry, too, is becoming concerned with the ethical issues arising from AI.

Closer to home, the Faculty of Advocates, along with the Association of European Lawyers, the Scottish Society for Computers and Law, Edinburgh University’s SCRIPT Centre and the British Computer Society, is hosting a conference in the Faculty’s Mackenzie Building on 31 May, AI beyond the Hype, looking at the legal and ethical implications of AI.

It will be interesting to see what the experts make of it all, though one thing which they will not need to worry about is Sophia, for though she appears to be the self-aware robot of popular imagination, it’s only an illusion. We haven’t invented self-aware robots, at least not yet…

Iain Mitchell, QC, is a member of the Faculty of Advocates and Chair of the Scottish Society for Computers and Law
