AI will make judicial decisions with avenue of appeal to humans, predicts top judge

Legal profession must develop mechanisms to deal with generative AI, says Master of the Rolls Sir Geoffrey Vos

AI could make judicial decisions in the future, but its use must be regulated, according to a top judge.

Addressing the Law Society of Scotland’s Law and Technology Conference last week, the Master of the Rolls, Sir Geoffrey Vos, said he believed AI will “at some stage be used to take some (at first, very minor) decisions”.

But he added there was a need to “develop mechanisms to deal with the use of generative AI within the legal system”, hopefully to “turn it to the advantage of access to justice and effective and economical legal advice and dispute resolution.”

If the technology were making judicial decisions, he said, the required controls would be that the parties know which decisions are taken by judges and which by machines, and that there is always the option of an appeal to a human judge.

Vos’s comments came amid an ongoing debate over the use of generative AI in the legal profession that gained momentum late last year with the release of OpenAI’s ChatGPT tool. 

Vos cautioned that tools like ChatGPT don’t enable lawyers “to simply cut corners”, pointing to the recent example of New York lawyer Steven Schwartz, who cited cases provided by ChatGPT in his submissions in a personal injury case; the cases turned out to be “bogus”, despite him asking the tool to confirm that they were real.

He noted that Schwartz “would have done well” to read an article published in May by City litigation firm Enyo Law, which observed that ChatGPT can be a valuable assistant for tasks like drafting, document review and predicting case outcomes, but that its results need to be carefully checked.

“I suspect that non-specialised AI tools will not help professional lawyers as much as they may think, though I have no doubt that specialised legal AIs will be a different story,” Vos said. 

He noted Spellbook already claims to have adapted GPT-4 “to review and suggest language for your contracts and legal documents” and that pressure for law firms to use such tools could come from clients. 

“If briefs can be written by ChatGPT and Spellbook and checked by lawyers, clients will presumably apply pressure for that to happen if it is cheaper and saves some of an expensive fee-earner’s time,” he said.

He added that ChatGPT’s assurance to Schwartz that the cases it gave him were real when in fact they weren’t showed that ChatGPT and AIs more generally “need to be programmed to understand the full import of a human question”, and humans using them “need to be savvier in checking their facts”.

This meant users asking more specific questions, and programmers explaining to the AIs they are creating “what humans mean when they ask something as open-textured as ‘is this a real case’”.

Vos’s views were echoed at a recent panel discussion at American Lawyer’s Legalweek event, which noted that generative AI is bound by the data on which it is trained. The current version of ChatGPT has limited knowledge of events – including cases – that took place after 2021, meaning it might be missing crucial legal developments.

Furthermore, OpenAI claims a factual accuracy rate of 70-80% for GPT-4, the latest version of ChatGPT, with that 20-30% below perfect being “significant” in law according to panel member Ilona Logvinova, associate general counsel at McKinsey & Company. The tool also suffers from “hallucinations,” meaning that sometimes it “predicts” facts that have no basis in reality. 

For tools like ChatGPT to realise their full potential for lawyers in providing accurate legal advice, predictions of legal outcomes and assistance with dispute resolution, Vos said, “it is going to have to be trained to understand the principles upon which lawyers, courts and judges operate. 

“As Mr Schwartz found to his cost, the present version of ChatGPT does not have a sufficiently reliable moral compass. In the meantime, court rules may have to fill the gap.”

In terms of litigation, he said, “one can envisage a rule or a professional code of conduct regulating whether and in what circumstances and for what purposes lawyers can: (i) use large language models to assist in their preparation of court documents, and (ii) be properly held responsible for their use in such circumstances.”

