
In a recent legal debacle, attorney Thomas Nield of the Semrad Law Firm was penalized by Judge Michael Slade for using ChatGPT to cite cases that do not exist. The incident arose from a bankruptcy case dating back to 2024, in which Nield submitted a Chapter 13 repayment plan supported by fictional legal precedents produced by the AI.
Judge Slade expressed disbelief at the attorney’s reliance on AI, stating, “Any lawyer unaware that using generative AI platforms to do legal research is playing with fire is living in a cloud.”
Background
In the bankruptcy proceeding, Nield attempted to defend his client's repayment strategy against objections from the creditor, Corona Investments LLC. Problems surfaced when key legal citations he provided could not be matched to any known cases. Upon investigation, Judge Slade found that the referenced rulings either did not exist or were incorrectly cited.
In response, Nield claimed he had been unaware that AI tools could fabricate legal citations. He stated that he would refrain from using AI for legal research in the future without rigorously verifying its output.
Conclusion
As a consequence of his actions, Nield was ordered to pay $5,500 in sanctions and mandated to attend a session on AI risks at an upcoming legal conference. This case serves as a warning to legal professionals about the dangers of relying on generative AI for legal research without verification.