this post was submitted on 06 Oct 2025
Technology
I think the problem with anthropomorphizing LLMs this way is that they don't have intent, so they can't bear responsibility. If this piece of software had been given the tools to actually kill someone, I think we all understand it wouldn't be appropriate to put the LLM on trial. Instead, we need to look at the people who are trying to give more power to these systems while dodging responsibility for their failures. If this LLM had caused someone's death, then the person who tied critical systems into a poorly understood black box that is not fit for the purpose is the one who should be on trial. That's my problem with anthropomorphizing LLMs: it shifts blame and responsibility away from the people who are responsible for trying to use them for their own gain, at the expense of others.