A British Columbia lawyer alleged to have submitted bogus case law “hallucinated” by an AI chatbot is now facing both an investigation from the Law Society of B.C. and potential financial consequences.
Earlier this month, it was revealed that lawyer Chong Ke allegedly used ChatGPT to prepare legal briefs in a family law case at B.C. Supreme Court.
In reviewing the submissions, lawyers for the opposing side discovered that some of the cases cited in the briefs did not, in fact, exist.
Those lawyers are now suing Ke for special costs in the case.
Ke was called to the bar five years ago, and the court heard she is not a sophisticated computer user and has little experience with artificial intelligence.
According to an affidavit she filed, Ke had tried using AI for fun, but never in a professional capacity until November 2023.
“Imagine yourself as a young lawyer … she was mortified,” Ke’s lawyer John Forstrom told the court, describing the situation as a living nightmare.
“She has since educated herself about the issue and become aware of the dangers of relying on citations provided by AI.”
The court has heard that in November, Ke queried ChatGPT for a case about travelling overseas to visit parents.
The tool returned three cases, two of which Ke submitted to the court in support of her application in a high-net-worth separation dispute, on behalf of the father who was seeking to take his children on a trip to China.
But when the lawyers for the children’s mother, who opposed the trip, tried to look up the cases Ke had cited, they weren’t able to find any record of them.
The lawyers, Lorne and Fraser MacLean, asked repeatedly for copies, but they were never provided. Ke later said she was no longer relying on the cases.
The MacLeans ultimately discovered the cases were not real.
Ke’s lawyer says his client made an error, and that there is no evidence she intended to mislead the court by relying on fake cases.
But the MacLeans are now suing Ke for special costs over the incident.
Neither side would comment on Wednesday, but Lorne MacLean previously described the threat AI poses to the legal system as grave.
“The impact of the case is chilling for the legal community,” he told Global News on Jan. 23.
“If we don’t fact-check artificially-generated intelligence materials and they’re inaccurate, it can lead to an existential threat to the legal system.”
AI chatbots like ChatGPT are known to sometimes make up realistic sounding but incorrect information, a process known as “hallucination.”
The problem has already cropped up several times in U.S. courts, including in filings from Donald Trump’s former lawyer Michael Cohen.
Meanwhile, the Law Society of B.C. has confirmed it is probing the incident.
“The Law Society is investigating the conduct of Chong Ke … who is alleged to have relied, in submissions to the court, on non-existent case law identified by ChatGPT,” the organization said in a statement.
“The Law Society has also issued guidance to lawyers on the appropriate use of AI in providing legal services and expects lawyers to comply with the standards of conduct expected of a competent lawyer if they do rely on AI in serving their clients.”
The Chief Justice of the B.C. Supreme Court also issued a directive last March telling judges not to use AI, and Canada’s federal court followed suit last month.
While the fake cases were not ultimately relied on in court proceedings, Justice David Masuhara said the court is always concerned with the integrity of the process.
The MacLeans are slated to argue on Friday why Ke should be held financially accountable for the incident.
— with files from Rumina Daya