[-] nulluser@lemmy.world 221 points 1 month ago

threatens to "financially ruin" the entire AI industry

No. Just the LLM industry and AI slop image and video generation industries. All of the legitimate uses of AI (drug discovery, finding solar panel improvements, self driving vehicles, etc) are all completely immune from this lawsuit, because they're not dependent on stealing other people's work.

But it would also mean that the Internet Archive is illegal. They don't profit, but if scraping the internet is a copyright violation, then they are as guilty as Anthropic.

[-] magikmw@piefed.social 26 points 1 month ago

The IA doesn't make any money off the content. Not that the LLM companies do either, but that's what they're aiming for.

[-] axmo@lemmy.ca 16 points 1 month ago

Profit (or even revenue) is not required for it to be considered an infringement, in the current legal framework.

[-] halcyoncmdr@lemmy.world 135 points 1 month ago

As Anthropic argued, it now "faces hundreds of billions of dollars in potential damages liability at trial in four months" based on a class certification rushed at "warp speed" that involves "up to seven million potential claimants, whose works span a century of publishing history," each possibly triggering a $150,000 fine.

So you knew what stealing the copyrighted works could result in, and your defense is that you stole too much? That's not how that works.
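The "hundreds of billions" figure in the quote above checks out as back-of-the-envelope math. A minimal sketch, assuming the article's "up to seven million potential claimants" and the US statutory damages range for copyright infringement (17 U.S.C. § 504(c): $750 minimum per work, up to $150,000 for willful infringement):

```python
# Rough bounds on statutory damages for the class action described above.
# Figures: claimant count from the article, per-work range from 17 U.S.C. § 504(c).
claimants = 7_000_000          # "up to seven million potential claimants"

minimum = claimants * 750      # statutory floor per infringed work
maximum = claimants * 150_000  # statutory cap for willful infringement

print(f"${minimum:,} to ${maximum:,}")
# $5,250,000,000 to $1,050,000,000,000
```

Even the statutory floor lands in the billions, and the willful-infringement cap crosses a trillion dollars, which is why Anthropic frames the class certification as an existential threat.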

[-] zlatko@programming.dev 39 points 1 month ago

Actually that usually is how it works. Unfortunately.

"Too big to fail" was probably made up by the big ones.

If scraping is illegal, so is the Internet Archive, and that would be an immense loss for the world.

[-] Signtist@bookwyr.me 8 points 1 month ago

This is the real concern. Copyright abuse has been rampant for a long time, and the only reason things like the Internet Archive are allowed to exist is because the copyright holders don't want to pick a fight they could potentially lose and lessen their hold on the IPs they're hoarding. The AI case is the perfect thing for them, because it's a very clear violation with a good amount of public support on their side, and winning will allow them to crack down even harder on all the things like the Internet Archive that should be fair use. AI is bad, but this fight won't benefit the public either way.

[-] chaosCruiser@futurology.today 64 points 1 month ago

Oh no! Building a product with stolen data was a rotten idea after all. Well, at least the AI companies can use their fabulously genius PhD level LLMs to weasel their way out of all these lawsuits. Right?

[-] Rooskie91@discuss.online 45 points 1 month ago

I propose that anyone defending themselves in court over AI stealing data must be represented exclusively by AI.

[-] thesohoriots@lemmy.world 9 points 1 month ago

PhD level LLM = paying MAs $21/hr to write summaries of paragraphs for them to improve off of. Google Gemini outsourced their work like this, so I assume everyone else did too.

[-] PushButton@lemmy.world 48 points 1 month ago

Let's go, baby! The law is the law, and it applies to everybody.

If the "genie doesn't go back in the bottle", make him pay for what he's stealing.

[-] Zetta@mander.xyz 32 points 1 month ago

The law absolutely does not apply to everybody, and you are well aware of that.

[-] AstralPath@lemmy.ca 5 points 1 month ago

Shouldn't it?

[-] SugarCatDestroyer@lemmy.world 5 points 1 month ago

I just remembered a movie where a real genie was released from his bottle and turned the world into chaos by freeing his own kind; if it weren't for the power of the plot, I'm afraid the people there would have become slaves or died out.

Although here you'd already have to file a lawsuit for theft of the soul, in the literal sense of the word.


This would mean the copyright holders like Disney are now the AI companies, because they have the content to train them. That's even worse, man.

[-] BussyCat@lemmy.world 7 points 1 month ago

It's not, because they would only train on things they own, which is a tiny fraction of everything that everyone owns. It's like complaining that a rich person gets to enjoy their lavish estate when the alternative is that they get to use everybody's home in the world.

[-] a_wild_mimic_appears@lemmy.dbzer0.com 5 points 1 month ago* (last edited 1 month ago)

Do you know how much content Disney has? Go scrolling: https://en.wikipedia.org/wiki/List_of_assets_owned_by_the_Walt_Disney_Company

Edit: and that's just the tip of the iceberg, because if they band together with others from the MPAA & RIAA, they can suffocate the entire movie, book, and music world with it.

[-] BussyCat@lemmy.world 5 points 1 month ago

They have about 0.2T in assets; the world has around 660T in assets, which, as I said before, is a tiny fraction. Obviously both figures include a lot of assets that aren't worthwhile for AI training, such as theme parks, but when you consider that a single movie worth millions or billions has the same benefit for AI training as another movie worth thousands, the amount of assets Disney owns is not nearly as relevant as you're making it out to be.
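The "tiny fraction" claim is easy to sanity-check. A quick sketch using the commenter's own (unverified) figures:

```python
# Disney's assets as a share of estimated global assets.
# Both figures are the commenter's, in trillions of USD, not independently verified.
disney_assets_T = 0.2
world_assets_T = 660.0

fraction = disney_assets_T / world_assets_T
print(f"{fraction:.4%}")
# 0.0303%
```

By that measure Disney holds roughly three hundredths of one percent of global assets, though as the reply notes, the dollar value of an asset is a poor proxy for its value as training data.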

[-] panda_abyss@lemmy.ca 36 points 1 month ago

Well, maybe they shouldn't have committed one of the largest violations of copyright and intellectual property ever.

Probably the largest single instance ever.

[-] ivanafterall@lemmy.world 9 points 1 month ago

I feel like it can't even be close. What would even compete? I know I've gone a little overboard with my external hard drive, but I don't think even I'm to that level.

[-] westingham@sh.itjust.works 34 points 1 month ago

I was reading the article and thinking "suck a dick, AI companies" but then it mentions the EFF and ALA filed against the class action. I have found those organizations to be generally reputable and on the right side of history, so now I'm wondering what the problem is.

[-] kibiz0r@midwest.social 39 points 1 month ago

They don’t want copyright power to expand further. And I agree with them, despite hating AI vendors with a passion.

For an understanding of the collateral damage, check out How To Think About Scraping by Cory Doctorow.

[-] Jason2357@lemmy.ca 13 points 1 month ago

Take scraping. Companies like Clearview will tell you that scraping is legal under copyright law. They’ll tell you that training a model with scraped data is also not a copyright infringement. They’re right.

I love Cory's writing, but while he does a masterful job of defending scraping, and makes a good argument that in most cases, it's laws other than Copyright that should be the battleground, he does, kinda, trip over the main point.

That is, training models on creative works and then selling access to the derivative "creative" works that those models output very much falls within the domain of copyright - on either side of a grey line we usually call "fair use" that hasn't really been tested in the courts.

Let's take two absurd extremes to make the point. Say I train an LLM directly on Marvel movies, and then sell movies (or maybe movie scripts) that are almost identical to existing Marvel movies (maybe with a few key names and features altered). I don't think anyone would argue that is not a derivative work, or that it falls under "fair use." However, if I used literature to train my LLM to be able to read, and used that to read street signs for my self-driving car, well, yeah, that might be something you could argue is "fair use" to sell. It's not producing copy-cat literature.

I agree with Cory that scraping, per se, is absolutely fine, and even that re-distributing the results can serve the public interest or fall under "fair use", but it's hard to justify the slop machines as not being a copyright problem.

In the end, yeah, fuck both sides anyway. Copyright was extended too far and used for far too much, and the AI companies are absolute thieves. I have no illusions this type of court case will do anything more than shift wealth from one robber baron to another, and it won't help artists and authors.

[-] peoplebeproblems@midwest.social 8 points 1 month ago

I disagree with the EFF and ALA on this one.

These were entire bodies of writing, consumed and reworked into training data without respecting the licenses attached to them.

Honestly, I wouldn't be surprised if copyright weren't the only problem here, but intellectual property as well. In that case, the EFF probably has an interest in that instead. Regardless, I really think it needs to be brought through court.

LLMs are harmful, full stop. Most other machine learning systems are trained on licensed data. In the case of software as a medical device, such as image-analysis AI, that data is protected by HIPAA, and special attention is already paid to how it can be used.

[-] pelya@lemmy.world 6 points 1 month ago

AI coding tools are using the exact same backends as AI fiction writing tools, so it would hurt the fledgling vibe coder profession (which according to proper software developers should not be allowed to exist at all).

[-] keyhoh@piefed.social 29 points 1 month ago* (last edited 1 month ago)

I thought it was hilarious how there was a quote in the article that said

immense harm not only to a single AI company, but to the entire fledgling AI industry and to America’s global technological competitiveness

It will only do this because all these idiotic American companies fired their employees to replace them with AI. Hire them back and the edge won't dull. But we all know they won't do this; they'll just cry and point fingers, wondering how they ever lost a technology race.

Edited because it's my first time using quotes and I don't know how to use them properly haha

[-] Deflated0ne@lemmy.world 28 points 1 month ago

Good. Burn it down. Bankrupt them.

If it's so "critical to national security" then nationalize it.

The "burn it down" variant would only lead to the scenario where the copyright holders become the AI companies, since they have the content to train on. AI will not go away; it might just change ownership to someone worse, though.

Nationalizing sounds better; better still would be to put it under UNESCO stewardship.

[-] Deflated0ne@lemmy.world 5 points 1 month ago

Hard to imagine worse than the insane techno-feudalists who currently own it.

Believe me, Disney is fucking ruthless compared to Anthropic.

[-] 9point6@lemmy.world 27 points 1 month ago

Probably would have been cheaper to license everything you stole, eh, Anthropic?

[-] SugarCatDestroyer@lemmy.world 25 points 1 month ago* (last edited 1 month ago)

Unfortunately, this will probably lead to nothing: in our world, only the poor seem to be punished for stealing. Corporations always get away with everything, while we sit on the couch and shout "YES!!!" because they're trying to console us with this.

[-] Modern_medicine_isnt@lemmy.world 5 points 1 month ago

This issue is not so cut and dried. The AI companies are stealing from other companies more than from individual people. Publishing companies are owned by some very rich people, and they want their cut.

This case may have started with authors, but it's been suggested it could turn into publishing companies vs. AI companies.

[-] Treczoks@lemmy.world 24 points 1 month ago

Well, theft has never been the best foundation for a business, has it?

While I completely agree that copyright terms are completely overblown, they are valid law that other people suffer under, so it is 100% fair to make them suffer the same. Or worse, as they all broke the law for commercial gain.

[-] Lexam@lemmy.world 18 points 1 month ago

No it won't. Just their companies. Which are the ones making slop. If your AI does something actually useful it will survive.

[-] arararagi@ani.social 15 points 1 month ago

Meanwhile, some Italian YouTuber got raided because some portable consoles came with ROMs preloaded in their memory. They only go after individuals.

[-] Plurrbear@lemmy.world 14 points 1 month ago

Fucking good!! Let the AI industry BURN!

[-] WereCat@lemmy.world 13 points 1 month ago

We just need to show that ChatGPT and the like can generate Nintendo-based content, then let them fight it out between themselves.

[-] crystalmerchant@lemmy.world 12 points 1 month ago

Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.

And yet, despite 20 years of experience, the only side Ashley presents is the technologists' side.

[-] a_wild_mimic_appears@lemmy.dbzer0.com 12 points 1 month ago* (last edited 1 month ago)

So, the US now has a choice: rescue AI and fix their fucked up copyright system, or rescue the fucked up copyright system and fuck up AI companies. I'm interested in the decision.

I'd personally say that the copyright system needs to be fixed anyway, because it's currently just a club for the RIAA & MPAA to wield against everyone (remember the lawsuits against individuals, with alleged damages in the millions, for downloading a few songs? Or the current attempts to fuck over the Internet Archive?). Should the copyright side win, then we can say goodbye to things like the Internet Archive and open-source AI; the copyright holders will then become the AI companies, since they have the content.

[-] a_person@piefed.social 11 points 1 month ago

Good, fuck those fuckers.

[-] RagingRobot@lemmy.world 8 points 1 month ago

Is this how Disney becomes the owner of all of the AI companies too? Lol

[-] LucidLyes@lemmy.world 8 points 1 month ago

I hope LLMs and generative AI crash and burn.

[-] jsomae@lemmy.ml 6 points 1 month ago

Would really love to see IP law get taken down a notch out of this.

[-] timuchan@lemmy.wtf 6 points 1 month ago

The bill comes due 🤷🏻

[-] phonics@lemmy.world 5 points 1 month ago

With the amount of money pouring in, you'd think they'd just pay for it.

[-] ZILtoid1991@lemmy.world 5 points 1 month ago

Now they're in the "finding out" phase of "fucking around and finding out".

this post was submitted on 09 Aug 2025
831 points (100.0% liked)

Technology
