796
submitted 1 month ago by tonytins@pawb.social to c/games@lemmy.world

A user asked on the official Lutris GitHub two weeks ago "is lutris slop now" and noted an increasing amount of "LLM generated commits". To which the Lutris creator replied:

It's only slop if you don't know what you're doing and/or are using low quality tools. But I have over 30 years of programming experience and use the best tool currently available. It was tremendously helpful in helping me catch up with everything I wasn't able to do last year because of health issues / depression.

There are massive issues with AI tech, but those are caused by our current capitalist culture, not the tools themselves. In many ways, it couldn't have been implemented in a worse way. But it was not AI that bought all the RAM, it was OpenAI. It was not AI that stole copyrighted content, it was Facebook. It wasn't AI that laid off thousands of employees, it's deluded executives who don't understand that this tool is an augmentation, not a replacement for humans.

I'm not a big fan of having to pay a monthly sub to Anthropic, I don't like depending on cloud services. But a few months ago (and I was pretty much at my lowest back then, barely able to do anything), I realized that this stuff was starting to do a competent job and was very valuable. And at least I'm not paying Google, Facebook, OpenAI or some company that cooperates with the US army.

Anyway, I was suspecting that this "issue" might come up so I've removed the Claude co-authorship from the commits a few days ago. So good luck figuring out what's generated and what is not. Whether or not I use Claude is not going to change society, this requires changes at a deeper level, and we all know that nothing is going to improve with the current US administration.

[-] adeoxymus@lemmy.world 134 points 1 month ago

Tbh I agree, if the code is appropriate why care if it’s generated by an LLM

[-] deadcade@lemmy.deadca.de 89 points 1 month ago

It's still made by the slop machine, the same one that could only be created by stealing every human made artwork that's ever been published. (And this is not "just one company", every LLM has this issue.)

Not only that, the companies building massive datacenters are taking valuable resources from people just trying to live.

If the developer isn't able to keep up, they should look for (co-)maintainers. Not turn to the greedy megacorps.

[-] Ganbat@lemmy.dbzer0.com 18 points 1 month ago* (last edited 1 month ago)

If the developer isn't able to keep up, they should look for (co-)maintainers.

Same energy as "Just go on Twitter and ask for free voice actors," a la Vivziepop. A lot of people think this kind of shit is super easy, but realistically, it's nearly impossible to get people to dedicate that kind of effort to something that can never be more than a money/time sink.

[-] prole 6 points 1 month ago

I was under the impression that FOSS developers do it for the love of the game and not for monetary compensation. They're literally putting the software out for free even though they don't need to. They are going to be making this shit regardless.

[-] Ganbat@lemmy.dbzer0.com 6 points 1 month ago

My point was "Help me with my passion project for nothing" is a much harder sell. "Just find some help," is advice along the lines of "Just get in a plane and fly it."

[-] tempest@lemmy.ca 4 points 1 month ago

That is technically what they're doing, but they don't always consider the consequences, and they often react poorly when they realize that an Amazon (or whatever) comes along, contributes nothing, and monetizes their work while dumping the support and maintenance on them.

That is the name of the game though if you use an MIT license.

[-] p03locke@lemmy.dbzer0.com 2 points 1 month ago

At this point, teachers do it "for the love of the game", but they still want to get paid more than minimum wage.

[-] deadcade@lemmy.deadca.de 3 points 1 month ago

Absolutely true, but there's one clear and obvious way out: drop support for the project yourself.

If a FOSS project is archived/unmaintained, for a large enough project, someone else will pick up where the original left off.

FOSS maintainers don't owe anyone anything. What some developers do is amazing and I want them to keep developing and maintaining their projects, but I don't fault them for quitting if they do.

[-] p03locke@lemmy.dbzer0.com 6 points 1 month ago* (last edited 1 month ago)

XKCD, of course

If a FOSS project is archived/unmaintained, for a large enough project, someone else will pick up where the original left off.

No, they won't. This line of thinking is how we got the above.

Their line of work is thankless, and nobody wants to do a fucking thankless job, especially when the last maintainer was given a bunch of shit for it.

[-] Vlyn@lemmy.zip 2 points 1 month ago

Hey, if your project is important enough you might get your own Jia Tan (:

[-] silver_wings_of_morning@feddit.dk 10 points 1 month ago

Speaking only on the programming part of the slop machine, programmers typically copy code anyways. It's not an ethical issue for a programmer using a tool that has been trained on other people's "stolen" code.

[-] lostbit@feddit.nl 2 points 1 month ago

rofl don’t quit your day job

[-] Goretantath@lemmy.world 3 points 1 month ago

Just like how every other human artist learned how to draw by looking at examples their art teacher gave them, aka "stealing it" in your words.

[-] prole 18 points 1 month ago

LLMs are not sentient and they're not learning.

[-] Dettweiler42@lemmy.dbzer0.com 40 points 1 month ago

It's all about curation and review. If they use AI to make the whole project, it's going to be bloated slop. If they use it to write sections that they then review, edit, and validate, then it's all good.

I'm fairly anti-AI for most current applications, but I'm not against purpose-built tools for improving workflow. I use some of Photoshop's generative tools for editing parts of images I'm using for training material. Sometimes it does fine, sometimes I have to clean it up, and sometimes it's so bad it's not worth it. I'm being very selective, and if the details are wrong it's no good. In the end, it's still a photo I took, and it has some necessary touchups.

[-] drolex@sopuli.xyz 21 points 1 month ago
  • Ethical issue: products of the mind are what makes us humans. If we delegate art, intellectual works, creative labour, what's left of us?
  • Socio-economic issue: if we lose labour to AI, surely the value produced automatically will be redistributed to the ones who need it most? (Yeah we know the answer to this one)
  • Cultural issue: AIs are appropriating intellectual works and virtually transferring their usufruct to bloody billionaires
[-] criss_cross@lemmy.world 15 points 1 month ago

If a human is reviewing the code they submit and owning the changes I don’t care if they use an LLM or not. It’s when you just throw shit at the wall and hope it sticks that’s the problem.

I’m more concerned with the admitted OpenClaw usage. That’s a hydrogen bomb heading straight for a fireworks factory.

[-] pivot_root@lemmy.world 11 points 1 month ago

It's the same for me.

I don't care if somebody uses Claude or Copilot if they take ownership and responsibility over the code it generates. If they ask AI to add a feature and it creates code that doesn't fit within the project guidelines, that's fine as long as they actually clean it up.

I’m more concerned with the admitted OpenClaw usage. That’s a hydrogen bomb heading straight for a fireworks factory.

This is the problem I have with it too. Using something that vulnerable to prompt injection to not only write code but commit it as well shows a complete lack of care for bare minimum security practices.

[-] obelisk_complex@piefed.ca 2 points 1 month ago

Yikes. Hadn't heard about the openclaw use. That stack scares the bejeezus out of me.

[-] Iheartcheese@lemmy.world 14 points 1 month ago

Yeah. Call me if he starts using AI artwork.

[-] wholookshere@piefed.blahaj.zone 43 points 1 month ago

So you draw the line at stealing artists' work, but not programmers' work?

[-] Dremor@lemmy.world 37 points 1 month ago* (last edited 1 month ago)

Being a developer, I don't care if someone else uses my code. Code is like a brick: by itself it has little value; the real value lies in how it is used.
If I find an optimal way to do something, my only wish is to make it available to as many people as possible. For those who come after.

[-] wholookshere@piefed.blahaj.zone 5 points 1 month ago

Sure, but that's just your view.

And also not how LLMs work.

They gobble up everything and cause unreadable code. Not learning.

[-] Dremor@lemmy.world 2 points 1 month ago

That's not how LLMs work either.

An LLM has no knowledge; it has the statistical probability of one token following another, and given an overall context it creates the statistically most likely text.
To calculate those probabilities as accurately as possible, you need as many examples as possible, to determine how often word A follows word B. Hence the immense datasets required.
Luckily for us programmers, computer programs are inherently statistically similar to one another, which makes LLMs quite good at them.
Now, the programs it creates aren't perfect, but it lets you write long, boring code fast, and even explains it if you ask. This way I've learned a lot of new things that I wouldn't have unless I had the time and energy to screw around with my programs (which I wish I had, but don't), or dug through open source programs' source code, which would take an average human years.
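The "how often word A follows word B" counting described above can be sketched as a toy bigram model. This is a drastic simplification of real transformer LLMs, shown purely to illustrate the statistics:

```python
from collections import Counter, defaultdict

# Toy corpus: count how often each word follows another.
corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def next_word_probs(word):
    """Statistical probability of each token following `word`."""
    counts = follows[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# "the" is followed by "cat" twice, "mat" once, "fish" once:
print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

Generating text then means repeatedly sampling the most likely next word; a real LLM does the same thing, but conditioned on the whole context window rather than one preceding word.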

Now there is the problem of the ethical use of AI, which is a whole other aspect. I use only local models, which I run on my own hardware (usually with Ollama, but I'm looking into NPU-enabled alternatives).
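For context, running a local model with Ollama boils down to one HTTP call; a minimal sketch against Ollama's documented `/api/generate` endpoint on its default local port (the model name `llama3` is just an example of whatever you have pulled):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local port

def build_payload(prompt, model="llama3"):
    """Request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt, model="llama3"):
    """Send a prompt to a locally running `ollama serve` and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (needs `ollama serve` running with the model pulled):
# print(generate("Summarize what Lutris does in one sentence."))
```

Nothing leaves the machine: the prompt, the weights, and the output all stay on localhost, which is the whole appeal over a cloud subscription.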

[-] Miaou@jlai.lu 1 points 1 month ago

Elon, Jeff, and Mark thank you for your service

[-] Dremor@lemmy.world 1 points 1 month ago

I can live with helping some assholes if my contributions help others. At least I don't make them richer, since I only use local AIs.

[-] adeoxymus@lemmy.world 25 points 1 month ago

Tbh all programmers have been copy pasting from each other forever. The middle step of searching stack overflow or GitHub for the code you want is simply removed

[-] galaxy_nova@lemmy.world 7 points 1 month ago

Exactly. If someone has already come up with an optimal solution why the hell would I reimplement it. My real problems are not with LLMs themselves but rather the sourcing of the training data and the power usage. If I could use an “ethically sourced” llm locally I’d be mostly happy. Ultimately LLMs are also only good for code specifically. Architecture or things that require a lot of thought like data pipelines I’ve found AI to be pretty garbage at when experimenting

[-] wholookshere@piefed.blahaj.zone 4 points 1 month ago

That's not what an LLM is doing, is it?

[-] smeg@feddit.uk 8 points 1 month ago

[-] wholookshere@piefed.blahaj.zone 26 points 1 month ago

LLMs have stolen works from more than just artists.

At a minimum, ALL public repositories have been used as training data, regardless of license, including licenses that require all derivative work to be under the same license.

So there's more than just Lutris being stolen.

[-] lung@lemmy.world 4 points 1 month ago

So he's a badass Robinhood pirate that steals code from corporations and gives it to the people?

[-] wholookshere@piefed.blahaj.zone 9 points 1 month ago

The fuck are you talking about?

Using a tool with billions of dollars behind it is Robin Hood?

How is stealing open source projects' code, regardless of license, stealing from corporations?

[-] lung@lemmy.world 1 points 1 month ago* (last edited 1 month ago)
  • He's not Anthropic, and doesn't have billions of dollars
  • Stealing from open source is not stealing; that's the point of open source
  • The argument above is that these models are allegedly trained "regardless of license", i.e. implying they are trained on non-OSS code
[-] wholookshere@piefed.blahaj.zone 2 points 1 month ago
  1. He's using a tool that took billions in funding.

  2. That's not how open source licensing works.

  3. No, I'm saying some licenses restrict LLM usage in the form of "derivative work must also be licensed under the same license". Using that work as a starting point requires you to also open that portion of code.

side note:

https://www.securityweek.com/github-copilot-chat-flaw-leaked-data-from-private-repositories/

why does AI have access to private code?

[-] prole 4 points 1 month ago

No, the LLM was trained on other code (possibly including Lutris, but also probably like billions of lines from other things)

[-] RightHandOfIkaros@lemmy.world 14 points 1 month ago

Personally, I have never seen LLM generated code that works without needing to be edited, but I imagine for routine blocks of code and very common things it probably does fine. I don't see why a programmer needs to rewrite the same code blocks over and over again for different projects when an LLM can do that part, leaving more time for the programmer to write the more specialized parts. The programmer will still have to edit and verify the generated code, but programming is more mechanical than something like art.

However, for more specialized code, I would be concerned. It would likely not function at all without editing, and if it did function it probably wouldn't be optimized or secure. However, this programmer claims to have 30 years of experience, and if that's the case then he likely knows this and probably edits the LLM output code himself.

As I have said before, generative AI is a tool, like Photoshop. I don't see why people should reject a tool if it can make their job easier. It won't be able to completely replace people effectively. Businesses will try, but quality will drop off because it's not being used by people that understand what the end result needs to be, and businesses will inevitably lose money.

[-] p03locke@lemmy.dbzer0.com 2 points 1 month ago

However, for more specialized code, I would be concerned. It would likely not function at all without editing, and if it did function it probably wouldn’t be optimized or secure.

That's not completely true. Claude and some of the Chinese coding models have gotten a lot better at creating a good first pass.

That's also why I like tests. Just force the model to prove that it works.

Oh, you built the thing and think it's finished? Prove it. Go run it. Did it work? No? Then go fix the bugs. Does it compile now? Cool, run the unit test platform. Got more bugs? Fix them. Now, go write more unit tests to match the bugs you found. You keep running into the same coding issue? Go write some rules for me that tell yourself not to do that shit.

I mean, I've been doing this programming shit for many decades, and even I've been caught by my overconfidence of trying to write some big project and thinking it's just going to work the first time. No reason to think even a high-powered Claude thinking model is going to magically just write the whole thing bug-free.
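That "prove it" loop is just ordinary regression testing: every bug the model hits becomes a test so it can't be reintroduced. A minimal sketch (the function and the bugs are hypothetical examples, not from Lutris):

```python
def parse_version(tag):
    """Turn a git-style tag like 'v1.2.3' into a comparable tuple."""
    return tuple(int(part) for part in tag.lstrip("v").split("."))

# Regression tests written *after* each bug was found, so the fix stays fixed:
def test_plain_tag():
    assert parse_version("1.2.3") == (1, 2, 3)

def test_v_prefix():
    # Bug #1: the leading 'v' used to reach int() and crash.
    assert parse_version("v1.10.0") == (1, 10, 0)

def test_ordering():
    # Bug #2: string comparison used to rank '1.9' above '1.10'.
    assert parse_version("v1.10.0") > parse_version("v1.9.9")

if __name__ == "__main__":
    test_plain_tag(); test_v_prefix(); test_ordering()
    print("all tests pass")
```

With the suite in place, "go fix the bugs" becomes "make these pass", which is something a coding model can actually verify rather than assert.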

[-] XLE@piefed.social 6 points 1 month ago* (last edited 1 month ago)

"If" doing all the lifting here.

If we ignore the mountain of evidence saying the opposite...

[-] Kowowow@lemmy.ca 4 points 1 month ago

I want to make a game one day, and there is no way I'm not prototyping it with LLM code. I'd want things finalized by a real coder if I ever got the game finished, but I've never made real progress on learning to code, even in school.

[-] The_Blinding_Eyes@lemmy.world 1 points 1 month ago

I know there is more nuance than this, but why should I spend any of my time on something when you spent no time creating it? I know that applies more to the slop, but that's where I am with most LLM-generated stuff.

this post was submitted on 12 Mar 2026
796 points (100.0% liked)
