
AI ‘hallucinated’ fake legal cases allegedly filed to B.C. court in Canadian first

WATCH: In the first case of its kind in Canada, a B.C. judge is hearing about a growing concern in the justice system: lawyers using artificial intelligence to create court filings that cite cases that don't actually exist. Rumina Daya reports – Jan 23, 2024

A B.C. courtroom is believed to be the site of Canada’s first case of artificial intelligence inventing fake legal cases.

Lawyers Lorne and Fraser MacLean told Global News they discovered fake case law submitted by the opposing lawyer in a civil case in B.C. Supreme Court.

“The impact of the case is chilling for the legal community,” Lorne MacLean, K.C., said.

“If we don’t fact-check AI materials and they are inaccurate, it can lead to an existential threat for the legal system: people waste money, courts waste resources and tax dollars, and there is a risk that the judgments will be erroneous, so it’s a huge deal.”

WATCH: Examining AI in the courtroom

Sources told Global News the case was a high-net-worth family matter, with the best interests of children at stake.


Lawyer Chong Ke allegedly used ChatGPT to prepare legal briefs in support of the father’s application to take his children to China for a visit — resulting in one or more cases that do not actually exist being submitted to the court.

Global News has learned Ke told the court she was unaware that AI chatbots like ChatGPT can be unreliable, and did not check to see if the cases actually existed — and apologized to the court.

Ke left the courtroom with tears streaming down her face on Tuesday, and declined to comment.


AI chatbots like ChatGPT are known to sometimes produce realistic-sounding but incorrect information, a phenomenon known as “hallucination.”

The problem has already crept into the U.S. legal system, where several incidents have surfaced — embarrassing lawyers, and raising concerns about the potential to undermine confidence in the legal system.

In one case, a judge imposed a fine on New York lawyers who submitted a legal brief with imaginary cases hallucinated by ChatGPT — an incident the lawyers maintained was a good-faith error.

In another case, Donald Trump’s former lawyer Michael Cohen said in a court filing he accidentally gave his lawyer fake cases dreamed up by AI.

WATCH: B.C. joins Ottawa’s ChatGPT privacy investigation

“It sent shockwaves in the U.S. when it first came out in the summer of 2023 … shockwaves in the United Kingdom, and now it’s going to send shockwaves across Canada,” MacLean said.


“It erodes confidence in the merits of a judgment or the accuracy of a judgment if it’s been based on false cases.”

Legal observers say the arrival of the technology — and its risks — in Canada should have lawyers on high alert.

“Lawyers should not be using ChatGPT to do research. If they are using ChatGPT, it should be to help draft certain sentences,” said Vancouver lawyer Robin Hira, who is not connected with the case.

“And even still, after drafting those sentences and paragraphs they should be reviewing them to ensure they accurately state the facts or they accurately address the point the lawyer is trying to make.”

Lawyer Ravi Hira, K.C., who is also not involved in the case, said the consequences for misusing the technology could be severe.

“If the court proceedings have been lengthened by the improper conduct of the lawyer, personal conduct, he or she may face cost consequences and the court may require the lawyer to pay the costs of the other side,” he said.

“And importantly, if this has been done deliberately, the lawyer may be in contempt of court and may face sanctions.”

WATCH: U.S. Congress holds hearing on risks, regulation of AI: ‘Humanity has taken a back seat’

Hira said lawyers who misuse tools like ChatGPT could also face discipline from the law society in their jurisdiction.


“The warning is very simple,” he added. “Do your work properly. You are responsible for your work. And check it. Don’t have a third party do your work.”

The Law Society of B.C. warned lawyers about the use of AI and issued guidance three months ago. Global News is seeking comment from the society on whether it is aware of the current case and what discipline Ke could face.

The Chief Justice of the B.C. Supreme Court also issued a directive last March telling judges not to use AI, and Canada’s federal court followed suit last month.

In the case at hand, the MacLeans said they intend to ask the court to award special costs over the AI issue.

However, Lorne MacLean said he’s worried this case could be just the tip of the iceberg.

“One of the scary things is, have any false cases already slipped through the Canadian justice system and we don’t even know?”

— with files from Rumina Daya
