this post was submitted on 06 Apr 2025
21 points (100.0% liked)
TechTakes
Utterly rancid LinkedIn post:
Text inside image:
Why can planes "fly" but AI cannot "think"? An airplane does not flap its wings. And an autopilot is not the same as a pilot. Still, everybody is ok with saying that a plane "flies" and an autopilot "pilots" a plane.
This is the difference between the same system and a system that performs the same function.
When it comes to flight, we focus on function, not mechanism. A plane achieves the same outcome as birds (staying airborne) through entirely different means, yet we comfortably use the word "fly" for both.
With Generative AI, something strange happens. We insist that only biological brains can "think" or "understand" language. In contrast to planes, we focus on the system, not the function. When AI strings together words (which it does, among other things), we try to create new terms to avoid admitting similarity of function.
When we use a verb to describe an AI function that resembles human cognition, we are immediately accused of "anthropomorphizing." In some way, popular opinion dictates that no system other than the human brain can think.
I wonder: why?
Yes, the "two Rs in strawberry" machine thinks. In the same way that an airplane flies. /s
E: it gets even worse, as half the AI field claims the airplanes fly in the same way birds do. That is why the anthropomorphization is bad: the machine neither thinks in function nor thinks in mechanism, and anthropomorphizing it makes it look like it can do both.