
B.C. ruling on AI ‘hallucinated’ fake legal cases could set precedent, experts say

Video: A B.C. court case is being closely watched across the country. At issue: should a lawyer be held financially liable for submitting fake case law to the courts after using artificial intelligence? Rumina Daya reports – Feb 5, 2024

Industry experts say a pending B.C. Supreme Court case could provide clarity and perhaps even set precedent on the use of AI models like ChatGPT in Canada’s legal system.

The high-profile case involves bogus case law produced by ChatGPT and allegedly submitted to the court by a lawyer in a high-net-worth family dispute. It is believed to be the first of its kind in Canada, though similar cases have surfaced in the United States.

“It is serious in the sense that it is going to create a precedent and it’s going to give some guidance, and we’re going to see in a couple of ways,” Jon Festinger, K.C., an adjunct professor with UBC’s Allard School of Law, told Global News.

“There’s the court proceeding around costs … The other part of this is the possibility of discipline from the Law Society in terms of this lawyer’s actions, and questions around … law, what is the degree of technological competence that lawyers are expected to have, so some of that may become more clear around this case as well.”

Video: B.C. lawyer under fire for AI-generated fake case law

Lawyer Chong Ke, who allegedly submitted the fake cases, is currently facing an investigation by the Law Society of B.C.

The opposing lawyers in the case she was litigating are also suing her personally for special costs, arguing they should be compensated for the work required to uncover the bogus cases before they could enter the legal record.

Ke’s lawyer has told the court she made an “honest mistake” and that there is no prior case in Canada where special costs were awarded under similar circumstances.

Ke apologized to the court, saying she had not been aware the artificial intelligence chatbot could be unreliable and had not checked whether the cases actually existed.

Vered Shwartz, an assistant professor of computer science at UBC, said the public does not appear to be well enough informed about the limitations of new AI tools.

“There is a major problem with ChatGPT and other similar AI models, language models: the hallucination problem,” she said.

“These models generate text that looks very human-like, looks very factually correct, competent, coherent, but it might actually contain errors because these models were not trained on any notion of the truth, they were just trained to generate text that looks human-like, looks like the text they read.”
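
Shwartz’s description can be made concrete with a deliberately tiny sketch. The Python snippet below builds a toy word-level Markov chain from three invented citation strings (the case names and citation numbers are made up for illustration, and the toy bears no resemblance to ChatGPT’s actual architecture). Sampling from it produces strings that look like the citations it was trained on but can name cases no court ever decided.

```python
import random

# Toy word-level "language model". The case names, years and citation
# numbers below are invented for illustration; this is nothing like
# ChatGPT internally, but it shows the mechanism Shwartz describes:
# the model learns what citations LOOK like, not which citations EXIST.
training_citations = [
    "Smith v. Jones, 2015 BCSC 1234",
    "Doe v. Roe, 2018 BCCA 567",
    "Brown v. Green, 2020 BCSC 89",
]

# Record which word follows which across the training examples.
transitions = {}
for cite in training_citations:
    words = cite.split()
    for current, nxt in zip(words, words[1:]):
        transitions.setdefault(current, []).append(nxt)

def generate(start="Smith", max_words=6):
    """Sample a citation-shaped string one word at a time."""
    out = [start]
    while len(out) < max_words and out[-1] in transitions:
        out.append(random.choice(transitions[out[-1]]))
    return " ".join(out)

# Each run can splice fragments of the training patterns into a
# citation that reads as authentic but refers to no real decision,
# e.g. "Smith v. Green, 2020 BCSC 1234".
print(generate())
```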

Video: First Canadian court case over AI-generated court filings

ChatGPT’s own terms of use warn users that the content generated may not be accurate in some situations.

But Shwartz believes the companies behind tools like ChatGPT need to do a better job of communicating their shortcomings, and that the tools should not be used for sensitive applications.

She said the legal system also needs more rules about how such tools are used, and that until guardrails are in place, the best solution is likely an outright ban.

“Even if someone uses them just to help with the writing, they need to be responsible for the final output and they need to check it and make sure the system didn’t introduce some factual errors,” she said.

“Unless everyone involved would fact-check every step of the process, these things might go under the radar, it might have happened already.”
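
The kind of check Shwartz describes could look, in minimal form, like the Python sketch below: scan a draft for neutral-citation patterns and flag any that are not on a verified list. The citation pattern, the verified_cases set and the draft text are all invented for illustration; a real workflow would verify each citation against an authoritative legal database or the court registry rather than a hard-coded set.

```python
import re

# Stand-in for an authoritative source: a set of citations that have
# already been confirmed to exist. Entries here are invented examples.
verified_cases = {
    "2015 BCSC 1234",
    "2018 BCCA 567",
}

# Matches neutral-citation patterns such as "2020 BCSC 89".
CITATION_RE = re.compile(r"\b(\d{4}\s+[A-Z]{2,5}\s+\d+)\b")

def unverified_citations(draft_text):
    """Return citations in the draft that are not on the verified list."""
    return [c for c in CITATION_RE.findall(draft_text)
            if c not in verified_cases]

draft = "As held in Smith v. Roe, 2019 BCSC 42, special costs may follow."
for cite in unverified_citations(draft):
    print(f"WARNING: could not verify citation: {cite}")
```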

Festinger said education and training for lawyers on what AI tools should and shouldn’t be used for is critical.

But he said he remains hopeful about the technology. He believes more specialized AI tools dealing specifically with law and tested for accuracy could be available within the next decade — something he said would be a net positive for the public when it comes to access to justice.

B.C. Supreme Court Justice David Masuhara is expected to deliver a decision on Ke’s liability for costs within the next two weeks.

— with files from Rumina Daya
