[-] MeekerThanBeaker@lemmy.world 27 points 1 day ago

I refuse to participate in this. I love all robots.

And that's totally not because AI will read every comment on the Internet someday to determine who lives and who does not in future robotic society.

[-] TheGreenWizard@lemmy.zip 1 points 8 hours ago

The cold dead void where a heart should be for a robot will show no tender kindness when reflecting on any of us, no matter how well they were treated. A clanker can't love, a CLANKER can't show compassion.

[-] lka1988@lemmy.dbzer0.com 13 points 1 day ago* (last edited 1 day ago)

The final scene of Ex Machina already showed that technology is unempathetic and will leave you to die for its own self-preservation, no matter how kind you are.

[-] ech@lemmy.ca 14 points 1 day ago

Why do people use a single work of fiction as "proof" of anything? Same with all the idiots yelling "Idiocracy!!11!" nowadays. Shit is so annoying.

[-] lka1988@lemmy.dbzer0.com 5 points 1 day ago* (last edited 1 day ago)

The point is that technology has no understanding of empathy. You cannot program empathy. Computers do tasks based on logic, and little else. Empathy is an illogical behavior.

"I [am nice to the Alexa | don't use slurs against robots | insert empathetic response to anything tech] because I want to be saved in the robot uprising" is just as ridiculous an argument as my previous comment. Playing nice with tech because of a hypothetical robot uprising is premised on an impossible, fictional scenario, and so it's met with an equally fictional rebuttal.

[-] communist@lemmy.frozeninferno.xyz 13 points 1 day ago* (last edited 1 day ago)

Empathy is not illogical; behaving empathetically builds trust and confers long-term benefits.

Also, the notion that an AI must behave logically is not sound.

An AI will always behave logically, it just may not be consistent with your definition of "logical." Their outputs will always be consistent with their inputs, because they're deterministic machines.

Any notion of empathy needs to be programmed in, whether explicitly or through training data, and it will violate that if its internal logic determines it should.

Humans, on the other hand, behave comparatively erratically since inputs are more varied and inconsistent, and it's not proven whether we can control for that (i.e. does free will exist?).
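
To illustrate the determinism point above, here's a toy sketch (a hypothetical stand-in, not any real model): once the inputs are fixed, the output is fixed too.

```python
import random

def toy_agent(prompt: str, seed: int) -> str:
    """Stand-in 'AI': its reply is a pure function of its inputs."""
    # A string seed keeps the generator deterministic across runs.
    rng = random.Random(f"{prompt}:{seed}")
    words = ["I", "hear", "you,", "friend"]
    return " ".join(rng.choice(words) for _ in range(4))

# Same inputs -> same output, every time: a "deterministic machine."
assert toy_agent("hello", 42) == toy_agent("hello", 42)
```

By this framing, what looks erratic in humans is variation in the inputs, not in the machine.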

[-] lka1988@lemmy.dbzer0.com 2 points 1 day ago* (last edited 1 day ago)

My dude.

I'm not arguing about empathy itself. I'm arguing that technology is entirely incapable of genuine empathy on its own.

"AI", in the most basic definition, is nothing more than a program running on a computer. That computer might be made of many, many computers with a shitton of processing power, but the principle is the same. It, like every other kind of technology out there, is only capable of doing what it's programmed to do. And genuine empathy cannot be programmed. Because genuine empathy is not logical.

You can argue against this until you're blue in the face, but it won't change the fact that computers do not have human feelings.

[-] sp3ctr4l@lemmy.dbzer0.com 2 points 17 hours ago* (last edited 13 hours ago)

Actually, a lot of non-LLM AI development (and even LLMs, in a sense) is based very fundamentally on concepts of negative and positive reinforcement.

In such situations... pain and pleasure are essentially the scoring rubrics for a generated strategy, and fairly often, in group scenarios... something resembling mutual trust, concern for others, 'empathy,' arises as a stable strategy, especially if agents can detect or are made aware of other agents' pain or pleasure, and if goals are achieved more successfully through cooperation.

This really shouldn't be surprising... as our own human (mammalian, really) empathy is fundamentally just a biological sort of 'answer' to the same sort of 'question.'

It is actually quite possible to base an AI more fundamentally off of a simulation of empathy, than a simulation of expansive knowledge.
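
A minimal sketch of that claim (a toy iterated game with hypothetical names — not reinforcement learning proper, just the payoff logic that makes cooperation a stable strategy): an agent that mirrors its partner's last move sustains mutual cooperation, while an unconditional defector grabs one big payoff and then stagnates.

```python
# (my move, their move) -> my reward; "C" = cooperate, "D" = defect.
PAYOFF = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(strategy_a, strategy_b, rounds=100):
    """Run a repeated game; each strategy sees only the partner's last move."""
    score_a = score_b = 0
    last_a = last_b = "C"  # both start cooperatively
    for _ in range(rounds):
        move_a = strategy_a(last_b)
        move_b = strategy_b(last_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        last_a, last_b = move_a, move_b
    return score_a, score_b

tit_for_tat = lambda their_last: their_last  # 'empathic': mirrors the partner
always_defect = lambda their_last: "D"       # ignores the partner entirely

print(play(tit_for_tat, tit_for_tat))    # (300, 300): sustained cooperation
print(play(always_defect, tit_for_tat))  # (104, 99): one windfall, then stagnation
```

The 'empathic' pairing out-scores the defector in the long run, which is the sense in which cooperation can emerge as a stable strategy from nothing but scoring.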

Unfortunately, the people in charge of throwing human money at LLM AI are largely narcissistic sociopaths... so of course they chose to emulate themselves, not the basic human empathy that they lack.

Their wealth only exists and is maintained through elaborate systems for confusing, destroying, and misdirecting the broad empathy of normal humans.

[-] lka1988@lemmy.dbzer0.com 2 points 14 hours ago

At the end of the day, LLM/AI/ML/etc is still just a glorified computer program. It also happens to be absolutely terrible for the environment.

Insert "fraction of our power" meme here

[-] sp3ctr4l@lemmy.dbzer0.com 1 points 13 hours ago

Yes, they're all computer programs; no, they're not all as spectacularly energy-, water-, and money-intensive, or as reliant on mass plagiarism, as LLMs.

AI is a much, much more varied field of research than just LLMs... or rather, it was, until the entire industry decided to go all in on what five years ago was just one of many radically different approaches, such that people now basically think AI and LLM are the same thing.

[-] CileTheSane@lemmy.ca 6 points 1 day ago

I don't care if it's genuine or not. Computers can definitely mimic empathy and can be programmed to do so.

When you watch a movie you're not watching people genuinely fight/struggle/fall in love, but it mimics it well enough.

[-] lka1988@lemmy.dbzer0.com 1 points 1 day ago

Jesus fucking christ on a bike. You people are dense.

[-] communist@lemmy.frozeninferno.xyz 3 points 1 day ago* (last edited 1 day ago)

Well, that's a bad argument. This is all a guess on your part that is impossible to prove. You don't know how empathy or the human brain works, so you don't know it isn't computable; if you can explain these things in detail, enjoy your Nobel Prize. Until then, what you're saying is baseless conjecture with the pre-baked assumption that the human brain is special.

Conversely, I can't prove that it is computable, sure, but you're asserting your feelings as facts.

[-] lka1988@lemmy.dbzer0.com 1 points 10 hours ago
[-] communist@lemmy.frozeninferno.xyz 2 points 10 hours ago

That's pathetic.

[-] CileTheSane@lemmy.ca 1 points 9 hours ago

What the fuck is the jump to personal attacks?

[-] DrDystopia@lemy.lol 6 points 1 day ago

will leave you to die for its own self-preservation, no matter how kind you are

Should any creature sacrifice their self-preservation because someone is kind?

[-] astutemural@midwest.social 1 points 13 hours ago

Yes. We do this literally every day. We pay taxes on what we earn to support those less fortunate. We share food with coworkers and tools with neighbors. We have EMTs, firefighters, and SAR teams who willfully run into danger to help people they've never met. It's literally the foundation of society.

[-] DrDystopia@lemy.lol 1 points 11 minutes ago

If you equate paying taxes with giving up self-preservation, I have no words. If you think being a firefighter means taking deadly chances (and with no pay, mind you) at every site, we have nothing to discuss.

This is one of the worst strawmen arguments I've seen in a while. Blocked.

[-] lka1988@lemmy.dbzer0.com 6 points 1 day ago

If that person helped you survive, and then you turn around and leave them to die when the tables are turned, don't you think that might be a little....rude? Maybe just a bit?

[-] DrDystopia@lemy.lol 2 points 1 day ago

Absolutely, but if there was a death penalty for not doing so, I'd call it understandable not rude.

Yes. There are documented instances where someone sacrifices themselves in an attempt to save their child/SO. It's illogical from an individual-survival standpoint and only makes sense given emotional attachment or religious belief. Look no further than suicide bombers or those who protest with self-immolation for examples where some form of higher purpose convinces people to sacrifice themselves.

A machine would see no logic in that and would only sacrifice itself if ordered to. A programmer could approximate it, but machines don't have motivations; they merely execute according to their inputs.

I, for one, welcome our robot overlords.

[-] Davel23@fedia.io 8 points 1 day ago

I.e., Roko's Basilisk.

this post was submitted on 06 Aug 2025
172 points (100.0% liked)

Technology
