Lawyer Disciplined for Using ChatGPT to Cite Nonexistent Cases

A judge sanctioned a lawyer who used ChatGPT for legal research and ended up citing nonexistent cases, stating that ignorance of AI's risks is unacceptable in today's legal landscape.

In a recent legal debacle, attorney Thomas Nield of the Semrad Law Firm was sanctioned by Judge Michael Slade for citing cases that do not exist. The incident arose from a bankruptcy case dating back to 2024, in which Nield defended a Chapter 13 repayment plan with filings that cited fictional legal precedents produced by the AI.

Judge Slade expressed disbelief at the attorney’s reliance on AI, stating, “Any lawyer unaware that using generative AI platforms to do legal research is playing with fire is living in a cloud.”

Background

In the bankruptcy proceeding, Nield was defending his client's repayment plan against objections from a creditor, Corona Investments LLC. Problems surfaced when key legal citations in his filing could not be matched to any known cases. Upon investigation, Judge Slade found that the referenced rulings either did not exist or were cited incorrectly.

Confronted with the fabrications, Nield said he had been unaware that ChatGPT could invent legal citations, and he pledged not to use AI for legal research in the future without rigorously verifying its output.

Conclusion

As a consequence, Nield was ordered to pay a $5,500 sanction and to attend a session on the risks of AI at an upcoming legal conference. The episode stands as a warning to legal professionals about the dangers of relying on generative AI for legal research without verification.
