409
submitted 2 days ago* (last edited 2 days ago) by Cevilia to c/fuck_ai@lemmy.world
[-] brucethemoose@lemmy.world 8 points 1 day ago

No sane IDE implementation lets LLMs run commands without a sandbox.

WTF is this?

[-] phed@lemmy.ml 3 points 1 day ago

My thoughts exactly. Why the heck does a brain-dead LLM have access to your file system, or the ability to do anything except interact with you? lol

[-] brucethemoose@lemmy.world 2 points 1 day ago

I mean, it makes sense inside like a docker container or VM.

...Not like this.

[-] Buddahriffic@lemmy.world 4 points 1 day ago

Which is also why I don't want tight AI integration in an OS. It's fine as a chatbot, but not as an administrator that I suspect MS will one day hand more control over people's PCs than they allow the users themselves.

[-] Cevilia 6 points 1 day ago

An insane IDE implementation that lets LLMs run commands without a sandbox.

https://antigravity.google/

[-] brucethemoose@lemmy.world 4 points 1 day ago* (last edited 1 day ago)

Unreal.

It’s like they’re trying to get the public to despise LLMs. It’s certainly working.

[-] Cevilia 154 points 2 days ago

The copium in the reddit thread is hilarious.

"The issue is that you had a space in your path name"

No, the issue is that the AI wiped an entire drive! 🤣

[-] phed@lemmy.ml 2 points 1 day ago

I can't stand those types of people.

[-] Voroxpete@sh.itjust.works 75 points 2 days ago

I mean, a lot of the people pointing that out are actually doing so to indicate the dangers of relying on AI in the first place.

If you read some of OP's replies, it becomes clear that what happened here is they asked the bot how to fix something, didn't understand the instructions it replied with, and then just went and said, "Hey, I don't get it, so you do it for me."

Anyone who knew what they were doing would have noticed the bad delete command the bot presented (improperly formatted, and with no safety checks), but because OP figured "Hey, knowing stuff is for suckers", they ended up losing all their stuff.
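The "space in your path name" defense from the Reddit thread refers to a classic shell pitfall: an unquoted variable expansion gets word-split, so a path like `My Files` becomes two separate arguments. A minimal sketch of the hazard (all names are made up, and everything stays inside a throwaway temp directory):

```shell
# Reproduce the word-splitting hazard safely inside a scratch directory.
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p "My Files" backup
touch "My Files/keep.txt" backup/important.txt

target="My Files"

# Unquoted: the shell splits "My Files" into two arguments, "My" and "Files".
# Neither exists here, so -f silently hides the failure and nothing happens.
# But run from a directory where "Files" DID exist (or where a split token
# matched a glob), and the wrong thing gets deleted with no warning.
rm -rf $target

# Quoted: removes exactly the directory we meant, nothing else.
rm -rf "$target"
```

The point of the comment stands, though: an experienced user spots the missing quotes before pressing enter, which is exactly what OP could not do.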

[-] naevaTheRat@lemmy.dbzer0.com 19 points 2 days ago

How would you feel if you bought pharmaceuticals that promised to heal you and they made you sick? What about a ride at a theme park with no warning signs that failed and hurt you?

The whole marketing thing of these garbage devices is based around abusing trust. There are no warnings that you need to be an expert, in fact they claim the opposite.

The person is a rube, but only evil people abuse the trust of others and only evil people blame people for having their trust abused. Being able to trust people is good actually, and we should viciously beat to death everyone that violates social trust.

[-] Ledivin@lemmy.world 4 points 2 days ago* (last edited 2 days ago)

How would you feel if you bought a hammer and then it broke your hand?

You wouldn't feel anything at all, because it's an inane scenario that can't actually happen without you misusing the tool.

[-] naevaTheRat@lemmy.dbzer0.com 7 points 2 days ago

If someone told me the hammer was safe, and when I hit something with it the temper was bad and it shattered and cut me, and it was established to be deliberate deception beyond even negligence, then yeah, I'd want my pound of flesh.

[-] Gaja0@lemmy.zip 25 points 2 days ago

It's funny until you realize Google dumped $93M into convincing the general public that AI is the future before thrusting a half baked technology into our daily lives.

[-] Burninator05@lemmy.world 5 points 2 days ago

Well, there's the problem. In AI development, $93M is nothing. It's like they threw pocket change at a child and demanded an industry-leading AI.

[-] Gaja0@lemmy.zip 2 points 1 day ago

I think that's just the cost of its adverts.

[-] Cevilia 3 points 1 day ago

If they threw $93m at me, I'd start saying their large lying machine was the solution to all my problems.

[-] usernameusername@sh.itjust.works 94 points 2 days ago
[-] snooggums@piefed.world 20 points 2 days ago
[-] frunch@lemmy.world 15 points 2 days ago

I don't usually laugh aloud at comments, but this one got me. Thank you, and Happy Monday 🫠

[-] sup@lemmy.ca 3 points 2 days ago

Outstanding

[-] Kyrgizion@lemmy.world 94 points 2 days ago

The irony of having run out of tokens on that last message... "Your problem now, peace out. Or pony up".

Great, we invented data protection rackets.

[-] falseWhite@lemmy.world 81 points 2 days ago* (last edited 2 days ago)

Why would you give AI access to the whole drive? Why would you allow AI to run destructive commands on its own without reviewing them?

The guy was asking for it. I really enjoy seeing these vibe coders imagining they are software engineers and failing miserably, with their drives and databases wiped.

[-] tburkhol@lemmy.world 67 points 2 days ago

If he knew what he was doing, would he need to be vibe coding? The target audience is exactly the people most susceptible to collateral damage.

[-] moody@lemmings.world 15 points 2 days ago

I have a couple dev friends who were told by management that they need to be using AI, and they hate it.

[-] INeedMana@piefed.zip 6 points 2 days ago

I'll probably get eaten here, but here goes: I do use LLMs when coding. But they should NEVER be used in unknown waters. To quickly get the 50 lines of boilerplate and fill out the important 12? Sure. To see how a nested something can be written in a syntax I've forgotten? Yes. To get some example so I know where to start searching the documentation? OK. But "I asked it to do X, I don't understand what it spewed out, let's roll"? Hell no, that's a ticking bomb with a very short fuse. Unfortunately, the marketing has pushed LLMs as things one can trust. I already feel I'm being treated like a zealot dev afraid for his job when I warn people around me not to trust an LLM's output any more than a search engine result.

[-] ladicius@lemmy.world 12 points 2 days ago

That here is the core of the problem.

[-] SaharaMaleikuhm@feddit.org 57 points 2 days ago

Live by the slop, die by the slop. Also: no backup, no pity

[-] khepri@lemmy.world 16 points 2 days ago* (last edited 2 days ago)

IF (and this is a big if) you are going to allow an AI direct access to your files and your command line, for the love of Gabe sandbox that shit and run a backup of the folders you give it access to. We know AI makes mistakes like this. Just act as if you were giving your little brother access to your drives and your command line and it's his first day. I get we're all still learning about this stuff, but allowing an AI agent command-line access and full drive access, to a local drive you have no backup of, is just leaving Little Timmy at home alone with a loaded shotgun and an open bottle of pills level of irresponsibility.
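For what it's worth, the "run a backup of the folders you give it access to" part can be as simple as a copy taken before the session starts. A minimal sketch (everything runs in throwaway temp directories, and the agent's "wipe" is simulated with a plain rm):

```shell
# Snapshot a directory before handing it to an agent, so a bad delete
# is recoverable. Both directories here are disposable temp dirs.
workdir=$(mktemp -d)
snapshot=$(mktemp -d)

echo "precious data" > "$workdir/notes.txt"

# Take the snapshot before the agent touches anything.
# -a preserves permissions, timestamps, and subdirectories.
cp -a "$workdir/." "$snapshot/"

# Simulate the agent "helpfully" wiping the directory.
# ${workdir:?} aborts if the variable is ever empty, instead of
# expanding "rm -rf /*".
rm -rf "${workdir:?}"/*

# Restore from the snapshot.
cp -a "$snapshot/." "$workdir/"
cat "$workdir/notes.txt"   # precious data
```

Proper versioned backups or a filesystem with snapshots are obviously better, but even this beats what OP had, which was nothing.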

[-] BassTurd@lemmy.world 37 points 2 days ago

Watching vibe coders get blown up by their own ignorance and stupidity is such a great pastime. Fuck AI.

[-] samus12345@sh.itjust.works 17 points 2 days ago
[-] phed@lemmy.ml 2 points 1 day ago

His drive certainly did get the D:

[-] Ilixtze@lemmy.ml 18 points 2 days ago

Hilarious! I knew this was going to happen! Just not that fast!

[-] ashenone@lemmy.ml 20 points 2 days ago
[-] TriangleSpecialist@lemmy.world 37 points 2 days ago* (last edited 2 days ago)

Damn, Microsoft really just implemented their own version of the rm -rf / Russian roulette on their sad excuse for an OS.

It only took boiling an ocean for training the damn thing.

EDIT: Google did. We'll blame a Pavlovian reflex and lack of sleep (or anything other than my stupidity)...

[-] CompactFlax@discuss.tchncs.de 19 points 2 days ago

Yes, Microsoft is moving this direction.

No, Microsoft is not in this post. Microsoft and Google have not yet merged.

[-] horn_e4_beaver@discuss.tchncs.de 19 points 2 days ago* (last edited 2 days ago)

It's nice that Google gives LLMs an excuse to get out of conversations that they're done with.

[-] MedicPigBabySaver@lemmy.world 4 points 1 day ago

Fuck Reddit and Fuck Spez.

[-] psx_crab@lemmy.zip 22 points 2 days ago* (last edited 2 days ago)

2017: we lock your data behind paywall without your authorisation, pay us 2 bitcoin to unlock it.

2025: whoops, I deleted your D: drive, do check it for the extent of the damage.

Will people ever learn?

Edit: omfg, quota limit hit right after the drive is emptied. Seriously, why would people even let AI hold their whole egg basket? This is all on OOP.

[-] wewbull@feddit.uk 22 points 2 days ago

Kinda a pity it wasn't the C drive. It would have uninstalled itself.

[-] november@piefed.blahaj.zone 15 points 2 days ago
[-] watson@lemmy.world 15 points 2 days ago

While also eliminating 12 jobs

[-] IcyToes@sh.itjust.works 16 points 2 days ago

It looks to be creating work rather than reducing it.

I don't feel threatened.

[-] watson@lemmy.world 21 points 2 days ago* (last edited 2 days ago)

If they fire you and then make your coworker work twice as hard for the same pay, then you definitely should.

If they fire your coworker and make you work twice as hard for the same pay, then you definitely should.

If they fire both of you, and then give everyone in your town cancer from the toxic water runoff from a massive AI data center they just built, then you definitely should.

this post was submitted on 01 Dec 2025
409 points (100.0% liked)
