[-] Poteryashka@lemmy.ml 1 points 1 year ago

Wow. I apologize for the confusion. I absolutely do not believe these two situations are comparable. My bad.

[-] Poteryashka@lemmy.ml 1 points 1 year ago

I'm really not exactly sure what qualifies, but some kind of emergent system has to be there. Does fungal communication give rise to a system that can build some kind of memory and refer back to it to develop more complex behavior? If not, then it lacks the level of complexity to be considered consciousness. (But that's just where I personally draw the line.)

Eusociality has its own context. It's possible for a hive to show complex, organized behavior, but so would an infinite paperclip machine if it were to consist of a swarm of collector drones. A myriad of units following a set of pre-determined instructions can produce complex organization, which still wouldn't qualify as consciousness.

Now, the brain scenario would definitely count, since it has the necessary "hardware" to start generating its own abstract contextual model of its experiences.

[-] Poteryashka@lemmy.ml 0 points 1 year ago

The point of emergent systems is that they tend to be more than just the sum of their parts:

https://www.merriam-webster.com/dictionary/gestalt

[-] Poteryashka@lemmy.ml 1 points 1 year ago

Another aspect of this conversation is what's posited by the Sapir-Whorf hypothesis. Experiential differences in the perception of color can also be attributed to differences in culture and upbringing that influence how one processes the stimuli themselves. I tend to oversimplify it with a firmware analogy: you get raw input, and different languages provide different libraries for contextualizing that input.

