My company is strongly pushing AI. There are a lot of experiments, demos, and effort from decently smart people around integrating it into our workflows. There have been some impressive victories, with AI tooling producing some things fast. I am not in denial about this. And the SE department is tracking improved productivity (as measured by # of tickets being done, I guess?)

The problem is I hate AI. I hate every fucking thing about it. Its primary purpose, regardless of whatever utility is gained, is spam. I think it's obvious how Google search results are spam, how spam songs and videos are being produced, etc. But even bad results from AI that have to be discarded are, IMO, spam.

And that isn't even getting into the massive amounts of theft that went into the training data, or the immense amounts of electricity it takes to train, run inference on, and operate all this crap. Nor the psychosis being inflicted on people who place their trust in these systems. Nor the fact that these tools are being used to empower authoritarian regimes to track vulnerable populations, both here (in the USA) and abroad. And all this AI shit serves to enrich the worst tech moguls and to displace people like artists, and people like myself, a programmer.

I'm literally being told at my job that I should view myself basically as an AI babysitter, and that AI has been unambiguously proven in the industry, so the time for wondering about it, experimenting with it, or opposing it is over. The only fault and flaw is my (i.e. any given SE's) unwillingness to adapt and onboard.

Looking for advice from people who have had to navigate similar crap. Because I feel like I'm at a point where I must adapt or eventually get fired.

[-] Brokkr@lemmy.world 30 points 3 days ago

AI is a tool, just like a hammer. You could use a rock, but that doesn't give you the leverage that a hammer does.

AI is also a machine: it can get you to your destination faster, like a car or train.

Evil people have used hammers, cars, and trains to do evil and horrible things. These things can also be used for useless stupid things, like advertising.

But they can also be used for good, like an ambulance or to transport food. They also make us more efficient and can be used to save resources and effort. It depends on who uses it and how they use it.

You can't control how other people may misuse these things, but you can control how much you know, how you use it, and what you use it for.

[-] gwl 1 points 1 day ago

AI is a tool more like a gun: specifically designed for a terrible purpose, used for a terrible purpose, and with no way to use it for good except by deconstructing it and finding a new use for it.

[-] thatsTheCatch@lemmy.nz 14 points 3 days ago

One aspect the analogy doesn't work for is that hammers and cars weren't built on the mass theft of intellectual property, they aren't being leveraged to put people out of jobs, and they aren't the driving force behind building insane numbers of data centres that increase power bills for locals and ravage their water supply.

It's not necessarily the pure usage of AI that I don't like, as much as what has been and is being used to create it.

Cars have their own problems, of course, though they cause more issues through their direct use than through what went into building them.

I read a different comment where someone said something like "even if human meat were the healthiest, least environmentally damaging, and cheapest food, they still wouldn't eat it." In this case AI doesn't really offer those benefits anyway.

[-] Cowbee@lemmy.ml 10 points 3 days ago

IP itself needs to be abolished, so that part isn't as important. Further, cars did put the people who used to drive and maintain horse carriages out of work. The original commenter's analysis is correct.

[-] thericofactor@sh.itjust.works 4 points 3 days ago

If IP is abolished, that would, to me, imply that use of AI should be free for everyone, as it's based on everyone's collective knowledge.

[-] Cowbee@lemmy.ml 7 points 3 days ago

Sure, I don't see a problem with that.

[-] helix@feddit.org 4 points 2 days ago

Combustion engine cars cause cancer, AI doesn't. Checkmate, Atheist.

[-] Brokkr@lemmy.world 2 points 3 days ago

Regarding the idea of IP theft, I think it's more complicated.

Let's say you buy a book, learn from it, and use that knowledge to your benefit. We don't think of this as stealing, even though you didn't buy the knowledge; you only bought the text.

If you borrow a book, likewise the knowledge is yours, not just while you've borrowed the book. Even if you pirate or steal the book (please don't), the knowledge is still yours regardless.

So I don't think there should be an issue with an AI learning freely from content.

However, I would agree that when it reproduces a work without attribution, it becomes a problem. The problem we have as a society, even when AI is not involved, is defining where that line is, because there are cases where some kinds of reproduction are okay (parody, homage, etc.). We do not have clear rules for these things because those rules are hard to define. That's our problem, not a problem with AI. Using AI just makes it too easy to cross that line, and therefore I find it risky to use it for producing that kind of material.

But I do not think it's an issue for me to ask the AI how to do some unusual thing in the terminal, or to refactor a part of my code to work a little differently. There is no harm in asking it to create a GUI version of the CLI program that I made.

As for putting people out of jobs, we may be at an inflection point in our productivity curve. Historically, these have caused short-term job loss but ultimately led to improvements once we've had time to adjust and people have learned how to leverage these tools effectively. Likely, it will create more jobs in new areas.

Humans will always use more resources, especially energy. Until the last few decades, the source of that energy wasn't a concern. Now it is, and we need to find more ways to produce energy cleanly. Arguing for less energy use is never going to work. We will always use more energy.

[-] group_hug@sh.itjust.works 1 points 1 day ago

Tech bros are mostly transhumanists. When interviewed by the New York Times, Peter Thiel couldn't bring himself to say that the end of the human race would be a bad thing.

The only thing these dudes care about is total power: living forever and going down in history as gods to the machines that replace us. The rest of the world can burn.

In Thiel's own words from the NYT interview, trans is only bad when you half-ass it, like transgender; that's only partly trans, so it's bad. When you go fully trans, like transhuman, that's actually good.

These dudes have no humanity. That's why they can be so cruel. And they are the ones that control these tools.

[-] helix@feddit.org 1 points 2 days ago

Likely, it will create more jobs in new areas.

Yeah, actually fixing the mess the AIs created. I don't look forward to it.
