B.C. lawyer who used fake, AI-generated cases faces law society probe, possible costs

Video: "B.C. lawyer under fire for AI-generated fake case law." The Law Society of B.C. has launched an investigation into a lawyer who submitted fake case law to the courts after using artificial intelligence. As Rumina Daya reports, it is believed to be the first case of its kind in Canada, and the lawyers who discovered the bogus information are suing. (Jan. 31, 2024)

A British Columbia lawyer alleged to have submitted bogus case law “hallucinated” by an AI chatbot is now facing both an investigation from the Law Society of B.C. and potential financial consequences.

Earlier this month, it was revealed that lawyer Chong Ke allegedly used ChatGPT to prepare legal briefs in a family law case at B.C. Supreme Court.

In reviewing the submissions, lawyers for the opposing side discovered that some of the cases cited in the briefs did not, in fact, exist.

Video: "First Canadian court case over AI-generated court filings."

Those lawyers are now suing Ke for special costs in the case.

Ke was called to the bar five years ago, and the court heard she is not a sophisticated computer user and has little experience with artificial intelligence.

According to an affidavit she filed, Ke had tried using AI for fun, but never in a professional capacity until November 2023.

“Imagine yourself as a young lawyer … she was mortified,” Ke’s lawyer John Forstrom told the court, describing the situation as a living nightmare.

“She has since educated herself about the issue and become aware of the dangers of relying on citations provided by AI.”

The court has heard that in November, Ke queried ChatGPT for a case about travelling overseas to visit parents.

The tool returned three cases, two of which Ke submitted to the court in support of her application in a high-net-worth separation dispute, on behalf of the father who was seeking to take his children on a trip to China.

But when the lawyers for the children’s mother, who opposed the trip, tried to look up the cases Ke had cited, they weren’t able to find any record of them.

The opposing lawyers, Lorne and Fraser MacLean, repeatedly asked for copies of the cases, but none were provided. Ke later said she was no longer relying on them.

The MacLeans ultimately discovered that the cited cases were not real.

Video: "Examining AI in the courtroom."

Ke’s lawyer says his client made an error, and that there is no evidence she intended to mislead the court by relying on fake cases.

But the MacLeans are pressing ahead with their claim for special costs against Ke over the incident.

Neither side would comment on Wednesday, but Lorne MacLean previously described the threat AI poses to the legal system as grave.

“The impact of the case is chilling for the legal community,” he told Global News on Jan. 23.

“If we don’t fact-check artificially-generated intelligence materials and they’re inaccurate, it can lead to an existential threat to the legal system.”

AI chatbots like ChatGPT are known to sometimes produce realistic-sounding but false information, a phenomenon known as “hallucination.”

Video: "B.C. joins Ottawa’s ChatGPT privacy investigation."

The problem has already cropped up several times in U.S. courts, including in filings from Donald Trump’s former lawyer Michael Cohen.

Meanwhile, the Law Society of B.C. has confirmed it is probing the incident.

“The Law Society is investigating the conduct of Chong Ke … who is alleged to have relied, in submissions to the court, on non-existent case law identified by ChatGPT,” the organization said in a statement.

“The Law Society has also issued guidance to lawyers on the appropriate use of AI in providing legal services and expects lawyers to comply with the standards of conduct expected of a competent lawyer if they do rely on AI in serving their clients.”

The Chief Justice of the B.C. Supreme Court also issued a directive last March telling judges not to use AI, and Canada’s Federal Court followed suit last month.

While the artificial cases were not ultimately relied on in the proceedings, Justice David Masuhara said the court is always concerned with the integrity of its process.

The MacLeans are slated to argue on Friday why Ke should be held financially accountable for the incident.

— with files from Rumina Daya
