Australian lawyer caught using ChatGPT filed court documents referencing ‘non-existent’ cases


An Australian lawyer has been referred to a state legal complaints commission after it was discovered he had used ChatGPT to write court filings in an immigration case, and the artificial intelligence platform had generated case citations that did not exist.

In a ruling by the federal circuit and family court on Friday, Justice Rania Skaros referred the lawyer, who had his name redacted from the ruling, to the Office of the NSW Legal Services Commissioner (OLSC) for consideration.

The court heard that, in an appeal against an administrative appeals tribunal ruling, the lawyer had filed an amended application to the federal circuit and family court in October 2024, as well as an outline of submissions. Skaros said “both documents contained citations to cases and alleged quotes from the tribunal’s decision which were nonexistent”.

On 19 November, the lawyer wrote to the court stating the errors were unintentional, and that he deeply regretted them. At a hearing on 25 November, the lawyer admitted to using ChatGPT to write the documents.

“The [lawyer] stated that he had used AI to identify Australian cases, but it provided him with nonexistent case law,” Skaros said. “The court expressed its concern about the [lawyer]’s conduct and his failure to check the accuracy of what had been filed with the court, noting that a considerable amount of time had been spent by the court and my associates checking the citations and attempting to find the purported authorities.”

In an affidavit provided to the court, the lawyer said that due to time constraints and health issues, he decided to use AI.

“He accessed the site known as ChatGPT, inserted some words and the site prepared a summary of cases for him,” the judgment said. “He said the summary read well, so he incorporated the authorities and references into his submissions without checking the details.”

The lawyer was said to be deeply embarrassed about the incident and has taken steps to improve his knowledge of AI.

Counsel for the immigration minister argued the lawyer had failed to exercise adequate care and, given the public interest in the misuse of AI in legal proceedings, that it was in the public interest for such cases to be referred to the OLSC.

“It was submitted [by the minister] that such conduct would continue to occur and must be ‘nipped in the bud’.”


Skaros said the use of generative AI in legal proceedings was a live and evolving issue, and that it was in the public interest for the OLSC to be made aware of such conduct.

It is the second case in Australia in which a lawyer has been referred to a regulatory body over the use of AI, after a Melbourne lawyer was referred to the Victorian legal complaints body last year for admitting to using AI software, which generated false case citations, in a family court case.

A practice note issued by the NSW supreme court late last year, which comes into effect on Monday, puts limits on the use of generative AI by NSW lawyers, including a stipulation that it must not be used to generate affidavits, witness statements, character references or other material tendered in evidence or used in cross-examination.
