Anthropic has agreed to pay $1.5 billion to settle a class-action lawsuit brought by authors who said the artificial intelligence company used pirated copies of their books to train its models, in what plaintiffs describe as the largest publicly reported copyright recovery on record.
Court filings in San Francisco show the accord will compensate authors about $3,000 per work for an estimated 500,000 books and require Anthropic to destroy the pirated datasets it downloaded, including material from Library Genesis and Pirate Library Mirror. A federal judge must approve the agreement.
The settlement follows a June ruling by US district judge William Alsup that drew a line between legally obtained and pirated texts. The court found that training models on lawfully acquired books constituted fair use, while storing millions of books taken from shadow libraries was “inherently, irredeemably infringing.”
“This settlement sends a powerful message to AI companies and creators alike that taking copyrighted works from these pirate websites is wrong,” said Justin Nelson, a lawyer for the authors.
Mary Rasenberger, the chief executive of the Authors Guild, called the deal “a vital step in acknowledging that AI companies cannot simply steal authors’ creative work to build their AI.”
Anthropic said it remains focused on safety and will comply with the agreement. “We remain committed to developing safe AI systems that help people and organizations extend their capabilities, advance scientific discovery, and solve complex problems,” said Aparna Sridhar, Anthropic’s deputy general counsel.
Legal experts said the resolution could steer how other artificial intelligence firms approach data acquisition and licensing. “This is massive,” said Chad Hummel, a trial lawyer at McKool Smith who is not involved in the case. “This will cause generative A.I. companies to sit up and take notice.”
The case is among dozens testing how copyright law applies to model training. While some courts have indicated that training on legally obtained books can be fair use, the Anthropic litigation underscores potential liability tied to pirated sources. The company, backed by Amazon and Alphabet, did not admit liability under the settlement and may still face claims over outputs produced by its systems.