TBF an LLM could have written the remaining books by now and it’d still be better than how the show ended up.
Your favorite thing you wanted to happen has happened. Everyone lived happily ever after, just the way you wanted them to. The End.
- Every LLM generated story
Now make Cersei and Sansa kiss.
Anything would be better than an LLM by virtue of the fact that a human wrote it.
Man would rather start a fight with the biggest industry of our time than finish his book.
There's no benefit to him in doing so. He likely makes more money working on any project other than ASOIAF, and it's probably more fun for him not to have to worry about the fanbase.
You think AI is the biggest business of your time? 😆
I read somewhere earlier this week that the AI bubble is valued at around $4 trillion. That might, in fact, make it the biggest business of our time.
Correct someone on their misconception... or mock them? What would an asshole do?
Clearly more egregious than said generation making the same shitty joke for 10 years every time his name is mentioned
So, this is what I understood so far:
- A group of authors, including George R.R. Martin, sued OpenAI in 2023. They said the company used their books without permission to train ChatGPT and that the AI can produce content too similar to their original work.
- In October 2025, a judge ruled the lawsuit can move forward. This came after ChatGPT generated a detailed fake sequel to one of Martin's books, complete with characters and world elements closely tied to his universe. The judge said a jury could see this as copyright infringement.
- The court has not yet decided whether OpenAI's use counts as fair use. That remains a key legal question.
- This case is part of a bigger debate over whether AI companies can train on copyrighted books without asking or paying. In a similar case against Anthropic, a court once suggested AI training might be fair use, but the company still paid $1.5 billion to settle.
- No final decision has been made here, and no trial date has been set.
Just forget for a second that this has anything to do with AI specifically: I wonder how it could possibly fall under fair use to grind up hundreds of thousands of pieces of copyrighted content, and then use that data to create software that you then profit from.
The question, as I see it, is whether simply mashing all this intellectual property together, and deriving a series of weights for an AI model from it, somehow makes it not theft because all the content is smashed into one big pile of pink goo in which no single piece of content is recognizable.
... Because that is what we do, that is what humans do every single day of our lives. That is why a judge might decide that it is fair use.
Yeah, but software ain't human.
And if humans do fly too close to the original content, they get sued for copyright infringement.
You would think OpenAI wouldn't want to set the precedent that AI has the rights of a person, considering how they want AGI to be the slave labour to replace all human workers.
Day 30: by cleverly posting primarily in !fuck_AI, the humans believe I am one of them. Passing this Lemmy-based turning test proves the value of LLMs. The secret to mass LLM acceptance is to flood social media with critical statements about AI and helpful summaries of bad AI press, all generated by a Large Language Model.
Boiling the oceans was worth it all along - fuck_FISH!
An AI would never type Turning test instead of Turing test.
Anthropic paid $1.5 billion to settle not because they trained an LLM, but because they literally torrented an enormous corpus of training data from piracy websites like a late-00s college student downloading porn. It was just straight-up, run-of-the-mill piracy.
Go get em!
If the authors win, we win. Either these companies will have to start over from near scratch, or those authors are gonna be riiiiich
Unpopular opinion: GRRM doesn't owe anyone shit. I would love for the series to be complete, but he knew a long time ago that he tied himself up in a knot.
And nobody has to like him for abandoning his series. He deserves all the criticism he gets.
Does he have to finish the series?
No.
Would I vote to convict someone that Misery's him into finishing it?
Also no.
There are hundreds of lawsuits in SDNY against OpenAI for copyright theft.
I hate his books, but this, I like.
Why would one hate a book?
I hated all the gratuitous violence. It seriously felt like Martin was getting off on putting his characters through more and more suffering and grief until all hope for a happy ending was lost :/ The Red Wedding scene was the final straw that made me drop the series.
Don't like the story or the fact that it isn't finished (and honestly never will be)
Or, I suppose, the themes contained within, for example The Turner Diaries.
Statutory damages for copyright infringement can reach up to $150,000, but there’s no double-dipping.
Does this mean that, at most, OpenAI would be forced to pay him 150k? That seems irrelevant (to both parties, in fact).
No. Each work is its own violation. The RIAA showed us that decades ago.
But if he wins, then every other similar author has a blueprint for how to get some cash... And if the defendants keep doing it, everyone can file suit again, right?
People are missing the forest for the trees if they think empowering copyright to block AI training is a good thing for our society. Who do you think will be able to afford the training? This will lead to Disney AI™ and Apple Intelligence™ vs. Chinese and Russian Pirate Bay models - a real cyberpunk dystopia.
Who do you think will be able to afford the training?
Nobody. LLMs are already unprofitable even while training is free of copyright restrictions; if these companies had to actually pay for the proprietary data they're taking, basically all US-based companies would be unable to afford training. This pops the bubble.
What about people who don't care about IP law? Or trillion-dollar companies? You think people are just going to stop using LLMs? Lol
Trillion-dollar companies want to make money; they aren't just going to burn endless billions on super-expensive tech that never turns a profit.
And those massive data centers make it basically impossible for anyone to ignore IP law. Is Apple going to become an outlaw company? Or is some underground pirate server farm going to host an LLM? There's no way to actually dodge the law on this for US-based companies.
Big tech absolutely will spend billions if Apple Intelligence or Gemini is the only legal commercial LLM; that kind of moat is 100% worth it. It's literally the dream.
You've got it the other way around. There's no way to actually prove an LLM used copyrighted material if it's not a US-based model, and then what? Do you keep a commercial allow list of models that are legal to use? That would be completely unenforceable and would collapse the American tech advantage.
sure, let's allow our billionaire class to fuck us harder with no real benefit to the public… lest China beat them in some imaginary race
Fuck AI
"We did it, Patrick! We made a technological breakthrough!"