[-] mawhrin@awful.systems 19 points 1 year ago

i guess it comes down to a philosophical question

no, it doesn't, and it's not a philosophical question.

the software simply has no cognitive capabilities.

[-] EatATaco@lemm.ee 4 points 1 year ago

I'm not sure I agree, but then that brings me to my second question:

What's the effective difference?

[-] mawhrin@awful.systems 16 points 1 year ago

(…) perception, attention, thought, imagination, intelligence, comprehension, the formation of knowledge, memory and working memory, judgment and evaluation, reasoning and computation, problem-solving and decision-making (…)

[-] braxy29@lemmy.world 2 points 1 year ago

don't know why you got downvoted; an LLM is essentially a Chinese room, and whether such a room "knows" is still the question.

[-] mawhrin@awful.systems 17 points 1 year ago
[-] petrol_sniff_king 10 points 1 year ago

Thanks for that read.

[-] self@awful.systems 11 points 1 year ago

don’t know why you got banned

[-] froztbyte@awful.systems 11 points 1 year ago

Good god it’s a hydra

[-] techMayhem@lemmy.world 9 points 1 year ago

Someone in the Chinese room would not know anything about their input or output. Sure, you have memorized that a certain set of symbols means your output should contain another set of symbols, but what do you actually "know" about those symbols?

But you have no idea what it's about. Is it a greeting? A recipe for some pasta? Instructions to build a bomb? Could be anything.
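To make the symbol-manipulation point concrete, here is a toy sketch in Python (the rulebook entries are just illustrative phrases, not anyone's actual example): the "room" maps input strings to output strings by lookup alone, and nothing in the program represents what any symbol means.

    # A toy "Chinese room": a rulebook mapping input symbol strings to
    # output symbol strings. The English glosses in the comments exist
    # only outside the room; the lookup itself never touches meaning.
    RULEBOOK = {
        "你好吗?": "我很好。",   # roughly "how are you?" -> "I'm fine."
        "谢谢。": "不客气。",    # roughly "thank you." -> "you're welcome."
    }

    def room(symbols: str) -> str:
        """Return whatever the rulebook dictates, with no access to meaning."""
        return RULEBOOK.get(symbols, "")  # unknown input: the room stays silent

    print(room("你好吗?"))  # prints 我很好。 -- a correct reply to a greeting the room never understood

The reply is indistinguishable from a fluent one, which is exactly the setup of the thought experiment: correct output is no evidence of understanding inside the room.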
