A Pennsylvania lawyer has been sanctioned for submitting fabricated legal cases generated by artificial intelligence in a product liability lawsuit involving Colgate-Palmolive Co. The lawyer cited nonexistent case law produced by ChatGPT, adding to judicial concern over AI "hallucinations" in legal filings. The problem is not isolated: in a separate suit, a plaintiff cited a fake AI-generated case in his complaint, Amazon flagged the fabrication in a motion to dismiss, and the plaintiff went on to cite more nonexistent cases in his opposition brief and again in his response to the court's order to show cause on sanctions. At the hearing, he admitted his filings had been drafted with AI, and the statement he read in court was itself AI-generated. These incidents have drawn attention to the risks of relying on artificial intelligence tools for legal research and case preparation.
Guy cites fake AI case in complaint. Amazon points that out in MTD. In opp brief, guy cites 6 more fake cases. Court issues OSC re sanctions. Guy responds with more fake cases. At hearing, reads statement saying that he used AI. Statement he read was also drafted by AI. https://t.co/KhB2DyUG3u
Plaintiff tells court that a motion he filed "was done by A.I." Court says "that is a bad idea" https://t.co/7TjJZEP4dY
A lawyer cites "nonexistent" rulings invented by ChatGPT, and the Italian justice system warns of AI "hallucinations" https://t.co/Ro84trBQy8