The Scotsman

Smart thinking is required before deploying artificial intelligence in your business

There are key legal and contractual risks that must be considered as the courts develop common law principles for AI cases, says Phillip Kelly

With the business use of artificial intelligence (AI) on the rise, there are key legal and contractual risks that businesses using, or supplying, AI need to consider.

As with most contracts for the sale of products, any contract for the supply or provision of AI is likely to contain supplier – or developer – favoured allocations of risk.

Businesses supplying AI will try to protect themselves from potential liability by including a provision excluding liability for defective AI. The effectiveness of such clauses has not yet been tested, so the task will eventually fall to the courts to assess whether such a clause is reasonable.

The absence of case law makes it difficult to predict how a court would strike this balance, and this is a significant area of risk for suppliers looking to rely on an exclusion clause or, more importantly, purchasers hoping to overcome the clause to recover loss and damage.

Another likely avenue for a potential claimant would be the argument that there is fault with the quality or fitness of the AI, and that this does not satisfy the requirements of the Sale of Goods Act 1979 (SGA 1979) or Consumer Rights Act 2015 (CRA 2015).

These arguments are not without their own difficulties. Crucially, claims under both the SGA 1979 and CRA 2015 would be based on the classification of AI as a "good", which is contentious. For instance, in the context of computer software, the English courts have ruled that intangible computer software does not constitute a "good" for the purposes of the SGA 1979. Therefore, it is unclear whether the code underpinning an AI process would be similarly categorised.

In cases where there are barriers to relying on contractual liability, a potential claimant will usually look to the law of delicts to try to bridge this gap, and negligence is often viewed as an opportunity to impose liability on a party outside the reach of the contract.

The law on negligence is rooted in the principle of foreseeability of the loss and proving a chain of causation between the loss and the party being sued. The features of AI pose challenges in establishing a claim under this branch of law. There may be significant problems for a claimant establishing foreseeability or a causal nexus between the conduct of the suppliers and an outcome caused by aspects of AI developed by machine learning after the AI was initially programmed.

This uncertainty will benefit suppliers and developers in the short term, but there is a significant risk the courts will look to adapt the principles of negligence to fit the new paradigms created by AI; the existing product liability regime will likely come into play. However, difficulties in determining exactly where the defect occurred in the supply chain will be problematic if the complaint has stemmed from a feature of autonomous machine learning.

We expect that the most likely measures will include: introducing a strict liability regime to cover situations where remoteness or causation might otherwise prove a barrier to recovering against a supplier or developer; an adapted duty of care, for example obligations on a supplier of AI systems to monitor and maintain those systems to control for unexpected outcomes due to machine learning; express allocation of liability to manufacturers for damage caused by defects in their products; joint and several liability between manufacturers, developers, suppliers and retailers; and reversing the burden of proof, requiring the manufacturer or supplier to prove that the AI product was not the cause of the harm.

The UK has already taken steps to address some of the uncertainties around AI by introducing the Automated and Electric Vehicles Act 2018, which attributes liability to the insurer where damage has been caused by an automated vehicle driving itself.

We can expect that the law of delicts will continue to be shaped by legislative and regulatory reform, not to mention the more immediate prospect of legal developments in the courts as they develop existing common law principles through novel cases on a day-to-day basis.

Phillip Kelly is a Partner, DLA Piper
