Anthropic’s lawyer was forced to apologize after Claude hallucinated a legal citation

techcrunch.com/2025/05/15/anthropics-lawyer-was-forced-to-apologize-after-claude-hallucinated-a-legal-citation

A lawyer representing Anthropic admitted to using an erroneous citation created by the company’s Claude AI chatbot in its ongoing legal battle with music publishers, according to a filing made in a Northern California court on Thursday.
Claude hallucinated the citation with “an…
