submitted 4 months ago* (last edited 4 months ago) by FlyingSquid@lemmy.world to c/technology@lemmy.world

Thanks to @General_Effort@lemmy.world for the links!

Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior

Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0

Here’s a link to a preprint: https://arxiv.org/abs/2408.10234

[-] Australis13@fedia.io 13 points 4 months ago

Some parts of the paper are available here: https://www.sciencedirect.com/science/article/abs/pii/S0896627324008080?via%3Dihub

It doesn't look like these "bits" are binary bits, but rather "pieces of information" (which I find a bit misleading):

“Quick, think of a thing… Now I’ll guess that thing by asking you yes/no questions.” The game “Twenty Questions” has been popular for centuries as a thinking challenge. If the questions are properly designed, each will reveal 1 bit of information about the mystery thing. If the guesser wins routinely, this suggests that the thinker can access about 2^20 ≈ 1 million possible items in the few seconds allotted. Therefore, the speed of thinking—with no constraints imposed—corresponds to 20 bits of information over a few seconds: a rate of 10 bits/s or less.
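The arithmetic in that quoted passage can be checked in a couple of lines (the 2-second window below is an assumption standing in for the paper's "few seconds"):

```python
import math

# Twenty well-designed yes/no questions, each worth 1 bit,
# distinguish 2**20 equally likely items.
items = 2 ** 20
bits = math.log2(items)   # information needed to pin down one item
seconds = 2               # assumed length of the "few seconds" window
rate = bits / seconds

print(bits)   # 20.0
print(rate)   # 10.0 bits/s
```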

The authors do draw a distinction between the sensory processing and cognition/decision-making, at least:

To reiterate: human behaviors, including motor function, perception, and cognition, operate at a speed limit of 10 bit/s. At the same time, single neurons can transmit information at that same rate or faster. Furthermore, some portions of our brain, such as the peripheral sensory regions, clearly process information dramatically faster.

[-] FlyingSquid@lemmy.world 5 points 4 months ago

But our brains are not digital, so they cannot be measured in binary bits.

[-] conciselyverbose@sh.itjust.works 17 points 4 months ago

There is no other definition of bit that is valid in a scientific context. Bit literally means "binary digit".

Information theory, using bits, is applied to the workings of the brain all the time.

[-] FlyingSquid@lemmy.world 3 points 4 months ago

How do you know there is no other definition of bit that is valid in a scientific context? Are you saying a word can't have a different meaning in a different field of science?

[-] conciselyverbose@sh.itjust.works 8 points 4 months ago

Because actual neuroscientists understand and use information theory.

[-] FlyingSquid@lemmy.world 4 points 4 months ago

Actual neuroscientists define their terms in their papers. Like the one you refuse to read because you've already decided it's wrong.

[-] conciselyverbose@sh.itjust.works 8 points 4 months ago

Actual neuroscientists do not create false definitions for well defined terms. And they absolutely do not need to define basic, unambiguous terminology to be able to use it.

[-] FlyingSquid@lemmy.world 6 points 4 months ago

Please define 'bit' in neuroscientific terms.

[-] conciselyverbose@sh.itjust.works 12 points 4 months ago* (last edited 4 months ago)

Binary digit, or the minimum additional information needed to distinguish between two different equally likely states/messages/etc.

It's the same usage as in information theory, because information theory applies to, and is directly used by, virtually every relevant field of science that touches information in any way.
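That definition can be illustrated with Shannon entropy (a generic sketch of the information-theoretic usage, not anything taken from the paper):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equally likely states carry exactly 1 bit...
print(entropy_bits([0.5, 0.5]))    # 1.0
# ...and four equally likely states carry 2 bits.
print(entropy_bits([0.25] * 4))    # 2.0
```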

[-] FlyingSquid@lemmy.world 4 points 4 months ago

Binary digit

Brains are not binary. I asked you to define it in neuroscientific terms.

[-] conciselyverbose@sh.itjust.works 11 points 4 months ago

Information is information. Everything can be described in binary terms.

Binary digit is how actual brain scientists understand bit, because that's what it means.

But "brains aren't binary" is also flawed. At any given point, a neuron is either firing or not firing. That's based on a buildup of potentials from the input of other neurons, but it ultimately either fires or it doesn't, and that fire/don't-fire dichotomy is critical to a bunch of processes. Information may be encoded other ways, e.g. firing rate, but if you dive down to the core levels, the threshold of whether a neuron hits the action potential is what defines the activity of the brain.
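The fire/don't-fire dichotomy described above can be sketched as a toy threshold model (the weights and threshold here are illustrative inventions, not parameters from the paper or from any real neuron model):

```python
# Toy threshold unit: at each step it either fires (1) or doesn't (0),
# depending on whether the summed, weighted input crosses a threshold.
def fires(inputs, weights, threshold=1.0):
    potential = sum(i * w for i, w in zip(inputs, weights))
    return 1 if potential >= threshold else 0

# Three input patterns produce a binary spike train.
spike_train = [fires(x, [0.6, 0.5]) for x in ([1, 1], [1, 0], [0, 0])]
print(spike_train)   # [1, 0, 0]
```

Real neurons are of course far richer than this, but the output at each moment is still the binary event the comment describes.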

[-] FlyingSquid@lemmy.world 4 points 4 months ago

And yet you were already shown by someone else that the paper that you refuse to read is using its terms correctly.

[-] jerkface@lemmy.ca 3 points 4 months ago* (last edited 4 months ago)

I think what you really mean is brains are not numeric. It's the "digit" part that is objectionable, not the "binary" part, which as an adjective for "digit" just means a way of encoding a portion of a number.

But in the end it's a semantic argument that really doesn't have a lot to do with the thesis.

[-] Australis13@fedia.io 5 points 4 months ago

Indeed not. So using language specific to binary systems - e.g. bits per second - is not appropriate in this context.

[-] Tramort@programming.dev 5 points 4 months ago

All information can be stored in a digital form, and all information can be measured in base 2 units (of bits).

[-] FlyingSquid@lemmy.world 3 points 4 months ago

But it isn't stored that way and it isn't processed that way. The preprint appears to give an equation (beyond my ability to understand) which explains how they came up with it.

[-] Tramort@programming.dev 11 points 4 months ago

Your initial claim was that they couldn't be measured that way. You're right that they aren't stored as bits, but it's irrelevant to whether you can measure them using bits as the unit of information size.

Think of it like this: in the 1980s there were breathless articles about CD ROM technology, and how, in the future, "the entire encyclopedia Britannica could be stored on one disc". How was that possible to know? Encyclopedias were not digitally stored! You can't measure them in bits!

It's possible because you could define a hypothetical analog to digital encoder, and then quantify how many bits coming off that encoder would be needed to store the entire corpus.

This is the same thing. You can ADC anything, and the spec of your ADC defines the bitrate you need to store the stream coming off it... in bits (per second).
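The CD example works out the same way; using the standard CD audio parameters (which are common knowledge about the format, not figures from this thread):

```python
# Standard CD audio: 44,100 samples/s, 16 bits/sample, 2 channels.
sample_rate = 44_100
bits_per_sample = 16
channels = 2

# The ADC spec fixes the bitrate needed to store the stream.
bitrate = sample_rate * bits_per_sample * channels
print(bitrate)   # 1411200 bits per second
```

Analog sound was never "stored in bits" either, yet that number is exactly how engineers sized CDs.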

[-] FlyingSquid@lemmy.world 3 points 4 months ago

As has been shown elsewhere in this thread by Aatube a couple of times, they are not defining 'bit' the way you are defining it, but still in a valid way.

[-] scarabic@lemmy.world 2 points 4 months ago* (last edited 4 months ago)

So ten concepts per second? Ten ideas per second? This sounds a little more reasonable. I guess you have to read the word “bit” like you’re British, and it just means “part.” Of course this is still miserably badly defined.

this post was submitted on 26 Dec 2024