THISDAY

The Role of AI Robotics in Arbitration

Emma Okonji examines the influence of artificial intelligence robotics on arbitration, against the backdrop of experts' views on whether emerging technologies will take on the role of humans in mediation

- Maxwell, Vannieuwenhuyse, Gouiffès

Until now, discussions have focused on how technologies like artificial intelligence (AI) robotics will take over the roles of workers in organisations and cause job losses for millions, given the rate at which organisations are adopting the technology. Although the prospect of job losses has raised fears among workers across global organisations, technology experts are still battling to convince the world that the evolution of AI robotics will instead create additional jobs, contrary to the general belief that it will cause job losses.

Just as the impact of AI robotics technology on company workers is being debated, some judicial experts who are arbitration specialists have also raised concerns that the same technology could influence arbitration cases. Recently, members of the International Court of Arbitration and Hogan Lovells partners, Mr. Winston Maxwell and Mr. Laurent Gouiffès, together with Senior Associate Gauthier Vannieuwenhuyse, met at the firm's Paris office to evaluate how AI, blockchain and other technologies are changing the process of arbitration. The experts discussed what new technologies mean for the future of arbitration, and whether humans or robots will play the primary roles.

Impact of AI technology on legal matters

At the recent Paris office debate, which was monitored by THISDAY, Gouiffès was of the view that AI builds greater efficiency and accuracy into the legal system, with capabilities that include natural language processing (NLP). He said blockchain's highly secure distributed ledger feature could transfer information or property without third parties, and that this has had an impact on contract law with the development of smart contracts. Capabilities like these spawned the LegalTech trend, which started in the United States, is now a fixture in Europe, and supports services such as automated contracts and online case management. But new technologies also create new challenges, such as compromised confidentiality and issues of due process, Gouiffès said.

Technology and arbitration

Looking at what types of technology can be applied to arbitration and in what ways they can be useful, Maxwell, viewing the matter from the perspective of what AI can do in the area of natural language processing, said it could analyse and extract meaning from thousands, tens of thousands, or hundreds of thousands of documents that may be relevant to litigation. This, he said, has been around for a long time in the form of e-discovery.

“But previously, AI was limited to looking for keywords, whereas now, it can actually extract meaning from written materials, e-mails, and voice conversations. So the most basic use of AI in arbitration or litigation is to help manage massive amounts of documentation that previously had to be reviewed and checked by junior lawyers,” Maxwell said.
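To make the shift Maxwell describes more concrete, here is a minimal, purely illustrative Python sketch contrasting plain keyword filtering with ranking documents by similarity to a query. The documents, the query and the scoring method are invented for illustration; real e-discovery tools use far richer NLP models than the simple word-count comparison used here.

```python
# Illustrative only: keyword filtering vs. ranking documents by similarity.
# The texts below are invented, and cosine similarity on word counts is a
# crude stand-in for the NLP models used in real e-discovery tools.
import math
import re
from collections import Counter

documents = {
    "email_001": "The delivery of the turbines was delayed by three months.",
    "email_002": "Lunch on Friday? The new bistro near the office is great.",
    "memo_017": "Counterparty acknowledges the late shipment and proposes damages.",
}

def tokens(text):
    """Lower-case word tokens, punctuation stripped."""
    return re.findall(r"[a-z]+", text.lower())

def keyword_filter(docs, keyword):
    """Older approach: keep only documents that contain the exact keyword."""
    return [name for name, text in docs.items() if keyword in tokens(text)]

def similarity(text_a, text_b):
    """Toy stand-in for 'extracting meaning': cosine similarity of word counts."""
    a, b = Counter(tokens(text_a)), Counter(tokens(text_b))
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

query = "delayed shipment and resulting damages"
print(keyword_filter(documents, "damages"))   # only the memo mentions 'damages'
ranked = sorted(documents, key=lambda name: similarity(documents[name], query), reverse=True)
print(ranked)                                 # every document ranked by relevance to the query
```

On this toy data, the keyword filter returns only the memo, while the similarity ranking also surfaces the first email, which never uses the word "damages" but plainly concerns the delayed delivery.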

Vannieuwenhuyse described these AI tools as “predictive justice,” where arbitrators could use AI to analyse arbitration or court decisions in order to statistically derive probabilities about how an individual case is going to be decided.
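The following short Python sketch shows the basic idea behind such "predictive justice" tools: past decisions are reduced to a handful of features, and a new case is assigned a probability based on how comparable past cases were decided. The claim types, features and outcomes below are invented, and real tools work from far larger corpora and far richer models.

```python
# Deliberately simple sketch of "predictive justice": estimate the chance a
# claim is upheld from how similar past cases were decided. Data is invented.

# (claim_type, late_performance_proven, claim_upheld) for past awards
past_awards = [
    ("construction", True, True),
    ("construction", True, True),
    ("construction", False, False),
    ("sale_of_goods", True, True),
    ("sale_of_goods", False, False),
    ("sale_of_goods", True, False),
]

def probability_upheld(claim_type, late_performance_proven):
    """Share of comparable past awards in which the claim succeeded."""
    matches = [upheld for ctype, late, upheld in past_awards
               if ctype == claim_type and late == late_performance_proven]
    if not matches:
        return None  # no comparable precedent to draw on
    return sum(matches) / len(matches)

print(probability_upheld("construction", True))   # 1.0 on this toy data
print(probability_upheld("sale_of_goods", True))  # 0.5
```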

According to Vannieuwenhuyse, “All these technologies may impact the key actors in arbitration proceedings.

“Take the example of counsel, or even the arbitrators themselves. When they use digital tools such as document management tools or NLP, this can save a lot of time and money. It’s especially relevant in the discovery phase. Sometimes we, as counsel, receive thousands of pages of documents, which would take a whole team a number of hours to review. But now we can have a tool or robot that can analyse the relevant data that is of crucial importance to our case,” he said.

“Another example is the digitalisation of the arbitration process, where arbitrators can use electronic submissions instead of sending hard copies. In arbitration cases, it is not unusual to sometimes have 300 exhibits and a brief of 200 pages, multiplied by five copies, because arbitrators need to send them to the whole tribunal, whose members may be located in New Zealand, Switzerland, and the United States, and also to the other counsel, which is a lot of documents to print,” Vannieuwenhuyse said.

He added: “You could also have hearings take place via a video platform. So instead of having a hearing located in Paris, the arbitrators will stay in, to take the same example, New Zealand, Switzerland, and the United States. No one is traveling, everybody stays in his or her office and uses the online platform to conduct the hearing. Of course, that saves on costs.”

This, he said, is extremely interesting for arbitration institutions, because it also expands the arbitration market to lower-value disputes, which historically have not really been the subject of arbitration, because arbitration can sometimes be costly.

Faster arbitration with AI robotics

In addressing how arbitrators could use AI robotics tools to make the discovery phase faster and more efficient, Maxwell said: “I think the most fascinating aspect of all this is whether arbitrators themselves can be robots. That gets into a philosophical question that’s not as absurd as it first sounds.”

He said: “With the development of blockchain, you have what they call ‘smart contracts,’ which automatically perform themselves. It is quite possible that you could agree, in a smart contract between Vannieuwenhuyse and myself, that if we have a disagreement, it will be referred to an outside artificial intelligence robot to resolve.”

AI and the fear of job loss

Looking at the fundamentals of AI robotics and whether arbitration is necessarily a human activity, Vannieuwenhuyse explained that the general view had been that it is extremely problematic from a legal standpoint.

“We can wonder whether it’s even lawful to have robots as arbitrators, first because there is no legislation that expressly addresses this possibility. It is not dealt with in existing legislation because, of course, this issue was not envisaged as a possibility at the time of its drafting. And this raises a problem with the composition of the arbitral tribunal: in some legislations, the arbitrators are defined as persons, so by definition they cannot be robots. But in others, there’s a gray area, and as such the question remains unanswered,” he said.

During the discussion, the issue was raised of whether a decision rendered by a robot arbitrator in the form of code can be considered an arbitral award. It was argued that in France, for instance, it would not be seen as a decision, because a decision needs to include legal reasons expressed in words to justify it.

Maxwell, however, explained that the overall limit, of course, is our constitutions and conventions on fundamental rights.

“The U.S. Constitution provides for due process and we have similar rights in Europe. Due process currently means that you have a right to a fair trial, and a fair trial currently means that humans are considering your situation, because humans combine strict applications of the law with more subtle considerations of equity. And I don’t think anyone would accept the legitimacy of robots as judges or arbitrators because they are not human, they don’t have a heart, and they don’t apply equity. So as soon as your arbitration needs to be enforced outside of the blockchain, an arbitral award by a robot currently will be considered null and void, and therefore unenforceable,” Maxwell said.

According to him, the more interesting question right now is, what if I don’t need to seek enforcement in these smart contracts?

“Because if the robot awards you 150 bitcoins, my account is automatically debited 150 bitcoins. It’s just done. There’s no court involved to enforce the award; it’s completely disconnected from the judicial system and the constitution,” Maxwell said.

But according to Vannieuwenhuyse, “In that case, you don’t need to enforce anything before any court because it will have been directly enforced. So it’s a completely closed circuit.”
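A toy Python model of that "closed circuit" might look as follows: both parties lock funds in a contract, a dispute is referred to an automated arbitrator, and the award is executed by the contract itself, with no court involved. The class, the parties, the amounts and the arbitrator's decision rule are all invented for illustration; an actual smart contract would run on a blockchain rather than in ordinary Python.

```python
# Toy model of a self-enforcing "smart contract" dispute. Everything here
# (names, amounts, the arbitrator's rule) is invented for illustration.

class SmartContractEscrow:
    def __init__(self, party_a, party_b, deposit):
        self.balances = {party_a: deposit, party_b: deposit}  # funds locked up front

    def resolve_dispute(self, arbitrator, claimant, respondent, claim):
        """Ask the (possibly automated) arbitrator and execute its award immediately."""
        awarded = arbitrator(claim)
        awarded = min(awarded, self.balances[respondent])     # cannot award more than is locked
        self.balances[respondent] -= awarded                  # debited automatically...
        self.balances[claimant] += awarded                    # ...and credited automatically
        return awarded

def robot_arbitrator(claim):
    # Stand-in decision rule: uphold documented claims in full, otherwise award nothing.
    return claim["amount"] if claim["documented"] else 0

escrow = SmartContractEscrow("maxwell", "vannieuwenhuyse", deposit=200)
award = escrow.resolve_dispute(robot_arbitrator, "maxwell", "vannieuwenhuyse",
                               {"amount": 150, "documented": True})
print(award, escrow.balances)  # 150 {'maxwell': 350, 'vannieuwenhuyse': 50}
```

The point of the sketch is only that enforcement happens inside the contract itself: by the time the award exists, the funds have already moved.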

Still on the debate over whether arbitration necessarily has to be human, Maxwell said the chairman of the ICC International Court of Arbitration, who attended Vannieuwenhuyse's and Gouiffès' event in January, was fascinated by the question: does arbitration necessarily have to be human? He said we are all sitting comfortably in this room, but we have to consider that, in 10 years' time, people will be thinking very differently, and the idea of having robot arbitrators may be considered acceptable.

“We are all conditioned by our own cultural and historic context, and those could evolve over time.”

Challenges of technology to the arbitration process

Looking at the challenges of new technology to the entire arbitration process, Vannieuwenhuyse said: “The challenges are not only related to blockchain, but to all new technologies. First, there is a challenge with confidentiality. It is a generally accepted principle that arbitration is confidential. However, recourse to digital technologies or AI will involve, to some extent, human input at the end.” He explained that humans who are completely external to the arbitral proceedings will programme and handle these technologies, adding that this is an issue that needs to be addressed by an arbitral tribunal. He however insisted that a simple confidentiality agreement would be enough to protect the confidentiality of the proceedings.

There is also an issue of due process, especially with the predictive justice tools. At present, these tools are not perfect, because they usually put the facts of the case and the reasoning of past courts at basically the same level. It was argued that this might prejudice the fundamental right to be heard to some extent. The experts agreed that if arbitrators blindly follow the results of predictive justice tools, it will prejudice the right of the parties to be heard, because there is a risk that the arbitrators give too much weight to precedents compared to the actual facts of the case.

In addressing this issue, Maxwell gave the example of a criminal case in the U.S., Loomis, in which the sentencing judge used an AI algorithm to compare his own sentencing decision with a computer-generated probability score of whether a given person would be a repeat offender. The decision was challenged before the Supreme Court of Wisconsin, which held that the judge's use of an AI tool was permitted because he used the tool only for information and did not rely on it for his decision. If the tool merely helps a judge gather information and guide his decision, then it is acceptable, Maxwell said, but he explained that it obviously cannot replace the judge's own decision, adding that the Loomis decision is highly controversial.

According to Gouiffès, one of the big challenges is the question of control of the proceedings. Some authors have described AI and these new technology tools as the "extra arbitrator". The question, he said, is whether these tools merely help arbitrators make a decision, or whether they make the decision themselves, with arbitrators simply following them because they trust them more than humans. In his response, Maxwell said that as AI tools get better and better, the problem vis-à-vis humans will increase, because we as humans will give more and more weight to what the robot says, as though the robot cannot be wrong.
