
Screw the money — Anthropic’s $1.5B copyright settlement sucks for writers

Around half a million writers will be eligible for a payday of at least $3,000, thanks to a historic $1.5 billion settlement in a class action lawsuit that a group of authors brought against Anthropic.

This landmark settlement marks the largest payout in the history of U.S. copyright law, but this isn’t a victory for authors — it’s yet another win for tech companies.

Tech giants are racing to amass as much written material as possible to train their LLMs, which power groundbreaking AI chat products like ChatGPT and Claude — the same products that are endangering the creative industries, even if their outputs are milquetoast. These AIs become more sophisticated as they ingest more data, but after scraping most of the open internet, these companies are running out of new text to feed them.

That’s why Anthropic, the company behind Claude, pirated millions of books from “shadow libraries” and fed them into its AI. This particular lawsuit, Bartz v. Anthropic, is one of dozens filed against companies like Meta, Google, OpenAI, and Midjourney over the legality of training AI on copyrighted works.

But writers aren’t getting this settlement because their work was fed to an AI. It’s a costly slap on the wrist for Anthropic, a company that just raised another $13 billion, for illegally downloading books instead of buying them.

In June, federal judge William Alsup sided with Anthropic and ruled that it is, indeed, legal to train AI on copyrighted material. The judge argued that this use is “transformative” enough to be protected by the fair use doctrine, a carve-out of copyright law that hasn’t been updated since 1976.

“Like any reader aspiring to be a writer, Anthropic’s LLMs trained upon works not to race ahead and replicate or supplant them — but to turn a hard corner and create something different,” the judge said.

It was the piracy — not the AI training — that moved Judge Alsup to send the case to trial, but with Anthropic’s settlement, a trial is no longer necessary.

“Today’s settlement, if approved, will resolve the plaintiffs’ remaining legacy claims,” said Aparna Sridhar, deputy general counsel at Anthropic, in a statement. “We remain committed to developing safe AI systems that help people and organizations extend their capabilities, advance scientific discovery, and solve complex problems.”

As dozens more cases over the relationship between AI and copyrighted works go to court, judges now have Bartz v. Anthropic to reference as a precedent. But given the ramifications of these decisions, maybe another judge will arrive at a different conclusion.


