Hallucinating artificial intelligence can tank a court case by creating fake case citations that leave the lawyers open to sanctions or the proceeding itself vulnerable to being overturned, a former litigator said.
Last month, a judge imposed a $5,000 penalty on a law firm that used ChatGPT to draft a legal brief in a lawsuit against Colombian airline Avianca Inc.; the brief included fabricated judicial decisions.
A similar case occurred in South Africa, and the judge and magistrate overseeing the two cases ripped the law firms in their rulings.
“There is potential harm to the reputation of judges and courts whose names are falsely invoked as authors of the bogus opinions and to the reputation of a party attributed with fictional conduct,” the judge presiding over the Avianca case wrote. “It promotes cynicism about the legal profession and the American judicial system.”
Jacqueline Schafer, CEO and founder of Clearbrief, an AI-powered platform that essentially fact-checks legal briefs, said this issue will continue to happen because of the time pressure that lawyers face.
“There’s a big temptation to use things that can just write it for you,” Schafer told Fox News Digital during a Zoom interview.
“We’re likely to see these stories continue to pop up. That’s why it’s critical for law firms to thoroughly review all of their pleadings before filing, even if they think they have banned ChatGPT in their firm.”
Schafer, who began her career as a litigator in New York before becoming an assistant attorney general for the states of Alaska and Washington, created Clearbrief in 2020 to catch mistakes or bogus cases in AI-written briefs.
WATCH: SCHAFER EXPLAINS HOW CLEARBRIEF WORKS
“The challenge we have with generative AI like ChatGPT that creates instant written work is that it will do things like completely make up fake case citations and invent facts,” she said.
“A user can, for example, ask AI to write them a legal analysis of Arizona law, and ChatGPT will write something that seems elegantly written, and it may even include citations that look totally real.”
It can trick even the most experienced lawyers if they don’t “take the time to painstakingly check over every case and statute and look it up manually,” Schafer said.
The magistrate presiding over the South African case said essentially the same thing in his ruling: “When it comes to legal research, the efficiency of modern technology still needs to be infused with a dose of good old-fashioned independent reading.”
Issues arise when legal professionals secretly use AI-powered programs like ChatGPT, Schafer said.
“Ironically, we need AI to help us detect the AI hallucinations,” Schafer said, adding that this problem was the genesis of Clearbrief.
“I meet with major law firms every day who are dealing with two problems,” she said. “They are terrified of using generative AI that writes the whole document for you if it introduces embarrassing errors that will get the firm sanctioned.
“But they also are facing pressure from their clients to use AI technology to be more efficient and cut down their bills. So the legal industry is doing a lot of work right now to identify tech that can solve both problems.”