Parallel Processing and Specialized Regions
The human brain processes information by utilizing multiple regions simultaneously, much like interconnected computers. Each region specializes in different functions, akin to Brodmann's areas, which can be thought of as individual processors that activate based on specific tasks [1:1] [1:2]. This specialization allows complex cognitive functions to occur concurrently, contributing to the illusion of a unified consciousness even though different parts of the brain process information independently.
Neural Communication and Brain Waves
Thoughts are generated through the rapid firing of neurons, which communicate via electrical impulses and chemical signals across synapses [2]. The brain also utilizes oscillating electric fields, or brain rhythms, to synchronize neural activity, which is crucial for integrating sensory input and maintaining cognitive functions [2:1] [2:2]. These dynamic waveforms allow the brain to process and react to stimuli, suggesting that consciousness might not only ride these waves but also influence them over time [2:1].
Information Processing Speed
Recent studies have measured the speed of human thought, indicating that the brain processes information at about 10 bits per second [3:1] [5:1]. This rate is significantly slower than the billion bits per second gathered by our sensory systems, implying that the brain filters vast amounts of data to focus on essential information [5]. This filtering process could explain why humans tend to focus on one thought at a time, as the brain prioritizes important data for conscious awareness [5:1].
Color Perception and Cognitive Integration
The process of perceiving color involves both biological and cognitive components. Retinal cones detect wavelengths of light, while the brain's visual cortex integrates these signals with contextual information to achieve color constancy [4:1]. This higher-level processing ensures consistent perception of colors despite changes in lighting conditions, demonstrating the brain's ability to integrate sensory data with environmental cues.
Philosophical Considerations
Beyond the physiological aspects, there are philosophical discussions about the nature of consciousness and its relationship with the universe. Some perspectives suggest that consciousness is not just a passive observer but actively shapes the brain's resonance patterns [2:1] [2:5]. This view posits that humans are part of the universe interpreting itself, highlighting the interconnectedness between individual consciousness and the broader cosmos [2:7].
How Does The Human Brain Process Multiple Things At Once?
A Computer Uses: Storage, CPUs, RAM, And Virtual Memory...
What Does A Brain Do? And How Does It Do It?
To keep the idea of equating the brain to a computer, it is best to think of the brain as multiple interconnected computers. We have a lot of different brain regions and each of them has been shown to favor specific functions. These are called Brodmann's areas and they can be thought of as individual computers. They are somewhat controversial within neuroscience, but each area can be thought of as a computer itself that "activates" to greater degrees based on the task at hand. For instance, it is thought that we have a primary area to control motor action that is supplemented by inputs from several other areas (supplementary motor areas, conceptually different computers). We also have another area that is thought to primarily handle somatosensation, with its own supplementary areas of the brain. These areas commonly integrate with each other, and even with the cerebellum and brainstem, to better regulate movement and how sensation is processed.
The brain has a ridiculous number of tracts that communicate with each other in all directions, and it is nearly impossible for us to currently give a truly precise answer to a question like this. These things aren't 100% established and exceptions exist, but it is safe to use them as a very general rule.
If you want to think of the brain this way, imo you have to think more in terms of a quantum computer than a regular one.
Or many regular computers that are interconnected.
Just trying to help your thoughts a little since there’s no answer yet. Sorry I can’t be more specific but I lack the language and terms to explain it well and correctly at the same time.
I'd like to remind everybody that there is a difference between physiology and anatomy....
Multiple things are processed simultaneously by separate computational sites.
I think a lot of people make the mistake of assuming our thoughts are the result of just one thinking whole.
Not the case. Many parts of the brain can process information differently. This is actually an important ability.
We have the illusion of a unitary consciousness as we observe the results after the thinking is already done, but your brain is more of a congress of cognitive functions. Some neural networks may be stronger than others, but they are never alone.
We often experience thoughts as flashes of emotions, ideas, or inner voices — but what is a thought actually made of?
According to MIT’s Engineering department, thoughts arise from the rapid firing of around 100 billion neurons interconnected by trillions of synapses. Each neuron communicates through a combination of electrical impulses and chemical signals, forming vast and dynamic networks.
But it doesn’t stop there. Newer research (MIT News on brain rhythms) suggests that brain rhythms — oscillating electric fields — are critical to synchronizing these networks. Thoughts aren’t static. They are waves of coordinated energy patterns, moving across different regions of the brain like weather systems.
Interestingly, while our neurons can fire extremely fast, the conscious processing of thoughts happens shockingly slowly compared to computers — about 10 bits per second. Some researchers believe this slowness is a feature, not a flaw: allowing deliberate thought instead of impulsive reaction.
⸻
Key ideas (based on research and reflection):
• Thoughts are physical — built from atomic and electrical activity.
• Consciousness may emerge from synchronized patterns, not individual neurons.
• Our subjective experiences (“thoughts”) are shaped both by internal chemistry and external randomness at the atomic level.
⸻
Curious to hear from others:
• If thoughts are physical, yet our experiences feel so personal, where does “you” really begin?
• Can understanding the physics of thought deepen our understanding of consciousness itself?
Always walking, always reflecting. — u/WalknReflect
One thing that struck me while reading through the research — thoughts happen fast, but awareness seems slower, more spacious.
Maybe that gap between the signal and the noticing… that’s where consciousness lives?
if I had to guess if you haven't considered the brain has multiple electric waves overlapping each other resonating or interfering or constructing with their interference perhaps through electrical waveforms propagating throughout the brain and then those waveforms that take up the whole brain are modified by the electrical impulses sent through our vision or hearing for example when we read words that subtly poke or prod the wave function of the whole brain and then the consciousness can detect emotions or opportunities to change actions based on the entirety of the waveform of the brain...
and then when we see patterns or waveforms that are not standard that might create cognitive dissonance like if a monster jumped out of your closet that pattern would send electrical impulses to your waveform of your brain and then create a flight or fight reaction which might be a big dissonance wave or something like that in the waveform of the brain causing the consciousness to seek to realign the way function to be resonant with itself and stable and calm by taking actions to get away from the monster and to seek more peaceful neuron impulses.
But then the weird thing is if the brain was put into a box with nothing to do then all of that electrical stimulation from the senses might get repetitive and dull and might start training that waveform to be less useful causing dysregulation again in the brain signaling the consciousness to find meaningful activities to do to enhance the brain instead of allowing it to dysregulate due to repetitive stimulation from a repetitive environment.
You wrote three paragraphs as a response and each paragraph is made of one single sentence.
It's one thing to interpret there's a baseline related to this, it's another to propose that there is an ontology of non-consciousness that takes part. None of the concepts can be measured outside of consciousness. Even weight, height, length, charge, velocity - these things are all abstractions of the mind. Zooming in on the brain isn't much different from simply thinking deeply, nor would traveling through outer space between galaxies be much different than doing it in a sensory deprivation tank, in the sense that the distances and the speeds are not categorically different, just bigger or smaller and of different (but really the same) "matter" organization.
Oh, this is a fun thought. I've always had a hard time with perceiving myself alone, so your question - "where does 'you' begin?" - is one I've never chewed on before.
Electricity exists everywhere, doesn't it? That's how stuff like static electricity zaps you and whatnot? What separates "my" electricity from the air immediately around my head? Do I start at all, or is there merely a threshold for what electricity is interpretable? Could I read the nonsensical patterns in the air, in an old TV, in a broken massage chair, if I had the right apparatus? Could I interpret the thoughts of the sun?
What would we even do if, despite the many differences, the "readings" came out much like our own - what if the universe thinks and is simply illiterate to interpreting itself? What does that mean for us? Nothing. I did not make my own blood cells, they just happen to do something I can't do on my own. But I do love them.
This is one of the most honest and wonder-filled responses I’ve read in a long time.
The line between “my” electricity and what’s around us might not even be a line — just a shift in pattern, or attention.
I love the idea that the universe might think, but just doesn’t know how to read itself yet.
Maybe that’s what we are — not interpreters, not creators, just brief flashes of recognition inside something much larger.
This right here is why I give out "I love you"s like candy. It's hard to explain but [will smith hands meme] i love everyone
Interesting point. Makes me think that without embodiment in its original state, the essence of our being might still be too volatile and uncontrolled and this could be the reason for reincarnation. Meditation heals. Reincarnation is real.
That’s a powerful thought. It’s possible that embodiment isn’t a punishment but a refinement, a vessel that helps stabilize what would otherwise be unbound awareness. Each life, each body, offering structure for growth. And in that sense, meditation becomes a remembering, a return to what’s always been there, just beneath the surface.
This is a fascinating line of thought — the idea of consciousness riding dynamic waveforms that span the whole brain resonates deeply. The way you describe sensory input nudging or disrupting that baseline waveform feels like a model that blends both science and lived experience.
I’ve also been wondering if consciousness isn’t just riding those waves, but quietly shaping the ocean they move through — not just reacting to the stimuli, but informing the resonance pattern itself over time.
Makes me think… maybe awareness is what calibrates the system when the waves get too noisy, too dull, or too chaotic. And maybe meaning is just resonance, rebalanced.
ChatGPT?
10 bps doesn't pass the smell test. The visual cortex alone directly ingests & filters vastly more data than 10 bps
How much cognitive function do I spend when I walk through a doorway and then proceed to attempt to remember my existence?
So in mph what is that?
Scientists have measured the speed of human thought and found that it's surprisingly slow:
Caltech Press Release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
Research paper: https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0
I’ve been wondering about the science behind how we see color, like how does the brain actually take incoming wavelengths of light and turn them into the colors we perceive?
I tried to dig into this topic and even put together a short video explaining the process in simple terms. It’s for the Society for Neuroscience’s Brain Awareness Video Contest.
I’d love to hear if my explanation matches the scientific understanding, any thoughts or suggestions are welcome!
Color perception is such a wild interplay of biology and cognition! Beyond the retinal cones detecting wavelengths (across the visible range of roughly 400-700 nm, sampled by short-, medium-, and long-wavelength cones), the brain’s visual cortex (V1 and V4 areas) integrates these signals with context to achieve color constancy. For example, the lateral geniculate nucleus relays raw data, but higher-level processing in V4 adjusts for lighting based on surrounding cues, like shadows or adjacent colors. This is why a red apple looks “red” at dusk or noon—your brain’s compensating like a built-in photo editor.
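The "built-in photo editor" analogy can be sketched computationally. Gray-world white balance is a classic image-processing stand-in for color constancy (my choice of illustration here, not a model of what V4 actually does): assume the average scene color should be neutral gray, and rescale each channel accordingly.

```python
# Gray-world white balance: a crude computational analogue of color
# constancy. Assumes the average scene color should be neutral gray and
# rescales each channel to make it so. Pixel values are invented.
def gray_world(pixels):
    # pixels: list of (r, g, b) tuples captured under an unknown illuminant
    n = len(pixels)
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]  # per-channel mean
    gray = sum(avg) / 3                                      # target neutral level
    return [tuple(p[c] * gray / avg[c] for c in range(3)) for p in pixels]

# A reddish illuminant has boosted the red channel of every pixel...
scene = [(200, 100, 100), (180, 90, 90)]
balanced = gray_world(scene)
# ...after correction, each channel averages to the same neutral gray.
```

The real visual system uses far richer cues (shadows, adjacent surfaces, memory colors), but the structure is the same: discount the illuminant, keep the surface.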
Scientists Have Officially Measured the Speed Limit of Human Thought:
The findings reveal that our brains process information at a rate of just 10 bits per second, which is slower than the rate at which our senses gather information.
Our sensory systems gather information about the world around us at a rate of a billion bits per second, which is 100 million times faster than our conscious thought processes. This raises questions about why our brains filter so much data and why we seem capable of focusing on only one thought at a time.
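The "100 million times faster" figure follows directly from the two rates quoted above:

```python
# Sensory input rate vs. conscious processing rate, per the figures above.
sensory_bits_per_s = 1_000_000_000  # ~1 billion bits/s gathered by the senses
conscious_bits_per_s = 10           # ~10 bits/s of conscious thought

ratio = sensory_bits_per_s / conscious_bits_per_s
print(ratio)  # 100,000,000 — the "100 million times faster" figure
```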
Researchers suggest that this "slowness" of thought may be rooted in our evolutionary history. Early creatures with simple nervous systems primarily used their brains for navigation, guiding them towards food and away from danger. This focus on single paths may have shaped the way our brains evolved, leading to the constraint of processing one thought at a time.
In essence, our thinking can be seen as navigating through a complex landscape of abstract concepts, following one pathway at a time. This inherent limitation may explain why we struggle to multitask effectively when it comes to complex tasks, and why we can only explore one possible sequence of thoughts at a time. Rather than processing multiple tasks simultaneously, our brains rapidly switch between them, incurring a cost in terms of time and efficiency.
The study also challenges futuristic ideas about brain-computer interfaces that aim to accelerate human communication, as our thought processes may be inherently constrained by this speed limit. Future research will explore how, and if, this limitation affects our cognitive abilities.
I've been saying conscious intellectual imagination has a tick rate of 10hz for years, there's a bunch of comments about it in this profile from a while back.
The reason "you seem to only be able to focus on one thought at a time" is because the computational mechanism which seems responsible for the production of consciousness only needs one conscious executive "track" to yield the intellectual benefits of the system. More than one would be a waste of resources, and would arguably still just be one track if they were procedurally integrated in memory anyway. So either you just add one more layer of integration to the single track of consciousness, or you have a second consciousness living in your head which is unable to control the body to the degree of being able to communicate its presence to you, in which case you still just "experience one thought at a time".
If you add more conscious information processing capacity with a BCI, it will just add layers and breadth of complexity to the "one thought at a time", it won't feel like there's more thoughts, just more complex ones that result in faster intelligence, not because the tick rate is increased, but because it would take fewer steps to achieve confidence and decisions. You already have multiple analyses being processed into a single memory image every tenth of a second, it's just that there's only one memory image made of the conglomerate analysis.
If you wanted a faster conscious intellectual imaginative tick rate for some kinds of processing, we should probably start by adding an artificial processing module set to process faster, but then tokenizing 10hz sections of the stream and adding those tokens as an extra dimension of information to the natural, 10hz processing system.
There is in fact some tokenization with logographic languages, where visual tokens (hieroglyphs) stand for entire words, but these languages are poor in the syntax/morphology that make the "word-concept units" context-dependent and harder to reason about, negating the speed of processing benefits, and also highly restricting the token-space because it's not phonetic and malleable. So, there is a way to speed this up, but the current schemes, emoji/hieroglyphs/logograms, are very limited in their scope to represent the fluidity of thought; they are a downgrade from the phonetic approximations of current systems (which prove to be more adaptable in general, due to verbal memory being more efficient vs visual memory).
I'm borrowing the term tokenization from LLM architecture, where tokens are a unit of information which get integrated/analysed against each other for the purpose of determining some conclusion about that batch of information, for the production of some output. Tokenization refers to the creation of one of these information units from some batch of "raw" data (raw is a relative term here, simply referring to the information the token aims to represent).
When I suggest tokenization of a faster information stream, I mean that an information stream being represented and translated on some off-brain hardware and having a frequency much higher than 10hz could be broken into tenth-of-a-second chunks of some single complex definition of the information to be sensed.
This is basically what happens with vision, where high frequency sensors feed into the visual cortex stages, where various kinds of processing translate the information patterns into much lower resolution information that captures the relevant conclusions about what a section of a high frequency information stream means. Conscious sensation of visual qualia is faster than 10hz (seems variable depending on a few factors) but still slower than the sensors can detect changes, and intellectualization of concepts dependent on the visual qualia can still only be generated at 10hz.
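The tokenization idea above can be sketched in a few lines: a fast stream is cut into tenth-of-a-second windows, and each window is reduced to one summary token. The mean as summary function, and the 1000 Hz sample rate, are arbitrary stand-ins for real feature extraction.

```python
# Chunk a high-frequency stream into 10 Hz "tokens", one per 0.1 s window.
# The summary statistic (mean) is a placeholder for real feature extraction.
def tokenize(stream, sample_rate=1000, tick_hz=10):
    window = sample_rate // tick_hz  # samples per tenth-of-a-second chunk
    return [sum(stream[i:i + window]) / window
            for i in range(0, len(stream), window)]

one_second = list(range(1000))  # 1 s of fake high-frequency samples
tokens = tokenize(one_second)
print(len(tokens))  # 10 tokens per second, matching the proposed 10 Hz tick rate
```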
So schizophrenia is the result of multiprocessing in the brain? Multiple voices all talking with each other at the same time, and yet all partially conscious?
I don't have as much first hand experience with schizophrenia as I would like to to be commenting on that, but it's my intuition that the experience of hearing voices is more about how associative memory organizes and defines relationships between different elements of conscious experience, as opposed to processing whole different conscious experiences in the same brain. My hypothesis is that you would need a totally separate associative memory system to produce a separate, second consciousness, so if I'm right about that, we should be seeing much more substantial brain differences than we do between schizophrenic and non-schizophrenic people.
I’ve been big on this for decades myself: only for me, it likely means LLMs are the end of civilization. People need to understand what this means in a technical sense: that human decision making is not only possible to hack, it is becoming impossible to not hack, given the astronomical processing differences. This is something Spike Jonze captures very well in Her I think.
We’re dodo birds inventing cats.
I totally agree. There's actually a huge variety of possible near and medium term outcomes, but when you extrapolate the trends of human activity, it is inevitable that we either replace ourselves or go extinct.
I want to try to prevent near and medium term oppression of normal people by trying to compete for AGI market share and making a competitive system as responsibly as possible. I'm also trying to just talk about what I consider to be safety issues and what might be done about them. I've got plans for a research effort aimed at producing a model of utility optimization for humans, which will allow me to inform a system meant to predict and advise about more physical/economical outcomes. This kind of system seems less likely to become spontaneously more agentic like some generally self-learning systems seem more likely to. It's still just as likely for humanity to be economically dependent on this system, which is a huge problem if it stops working well, but I'd try to control for those kinds of problems by understanding as thoroughly as possible how the trends of humanity aided by one specific economic calculation/automation system or another would compare to each other.
I'm hoping to post some videos on YouTube soon describing a low resolution model of human intelligence I'm trying to conceptualize and describe, which is formed by describing a wide variety of the most detailed brief cognitive sequences, and how they relate to their respective environments/behaviors/experiences, and then making observations of grander patterns formed by those descriptions of brief, nuanced cognitive sequences, and then describing types of information processing systems which must be responsible. A lot of what I find is well supported by studies done in the systems neuroscience field, like this idea about the tick rate of conscious intellectualization. Since I've been paying such close attention to exactly how conscious experiences change from one moment to the next for so long, I've had a lot of experiences of not being able to detect differences in conscious states faster than 10 times per second. In my experience, motorcycle crashes are the easiest situations in which to sense the full temporal resolution of consciousness. Don't try it, try speed cubing instead. There's a fairly recent study finding evidence of this intellectualization speed limit that way too.
There was a neuroscientific study that came out years ago which observed that a 10hz neural cycle would repeat continuously during consciousness, and would pass through brain areas which are correlated with consciousness or specific changes to it, and it would also activate unique parts of the network on every cycle in a way that seemed congruent with what I thought the neural signal for intellectual consciousness should be generally shaped like. Since then, I've felt that my experiential data and those kinds of studies make me pretty confident that conscious intellectualization has a 10hz tick rate.
I don't have any relevant credentials, but some doctors of psychology and a consciousness researcher I've talked to seem to really like my ideas. I've been trying to grow my businesses into a more passive income so I can fund my own focus on these things and find a coder to help me start a software company while I learn more about coding. I want to start with some simple business management software for my construction business and try to sell that, then lead generation software that I will start with in the home remodeling and repair market, and then try to get more general from there, and then try to adapt that platform to more general economic advising and automation software. I also have some ideas about what might make LLMs more generally intelligent.
Hard to believe that figure - it amounts to approx one ‘word’ per second - try telling that to a bunch of kids, they run at multiple times that rate.
Plus of course humans do so much processing in parallel.
I think they essentially mean 10Hz (10 information units per second) instead of actual 10 Baud.
This is a confusing way of quantifying the concept if you’re an engineer or scientist.
In unrelated news, Speed of trump's thought measured as 1/C.
Actually I wouldn't be surprised if he's faster. When you aren't burdened by making any checks that what you are thinking is factually true or follows any sort of logic, you can probably get more thoughts out.
Faster than 1/C? Sure, maybe 1/(C-1), in that case. And that's being very generous with the definition of "thoughts".
What's the per-neuron or per-synapse data / memory storage capacity of the human brain (on average)?
I was reading the Wikipedia article on animals by number of neurons. It lists humans as having 86 billion neurons and 150 trillion synapses.
If you can store 1 bit per synapse, that's only 150 terabits, or 18.75 terabytes. That's not a lot.
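The question's back-of-envelope arithmetic checks out:

```python
# Naive synapse storage estimate using the figures from the question.
synapses = 150_000_000_000_000  # 150 trillion synapses
bits_per_synapse = 1            # assume 1 bit stored per synapse

total_bits = synapses * bits_per_synapse
terabits = total_bits / 1e12         # 150 terabits
terabytes = total_bits / 8 / 1e12    # 18.75 terabytes
```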
I also was reading about Hyperthymesia, a condition where people can remember massive amounts of information. Then, there's individuals with developmental disability like Kim Peek who can read a book, and remember everything he read.
How is this possible? Even with an extremely efficient data compression algorithm, there's a limit to how much you can compress data. How much data is really stored per synapse (or per neuron)?
The brain is a computer analogy is nice sometimes, but it doesn't work in many cases. Information isn't stored in a neuron or at synapses per se, and we're not certain exactly how information is stored in the brain at this point.
Best we can tell information recall happens as a product of simultaneous firing of neuron ensembles. So, for example, if 1000 neurons all fire at the same time we might get horse, if another 1000 neurons fire we might get eagle. Some number of neurons might overlap between the two animals, but not all. Things that are more similar have more overlap (the percent of the same group of neurons that fire for horse and eagle might be higher than horse and tree because horse and eagle are both animals).
With this type of setup, the end result is much more powerful than the sum of parts.
Edit: I did not have time to answer a lot of good comments last night, so I am attempting to give some answers to common ones here.
Exactly. In addition, there are many more cellular processes that affect neuronal signalling than just synapse location and strength.
The entire milieu of the metabolome of a given neuron at any given instant will be constantly changing, and will impact the response that neuron generates.
This means that it is more accurate to think of each individual neuron as an individual computer that is itself capable of synthesizing and processing environmental stimuli, and producing different outputs based on the "computations" it does. Each individual computer then interacts with other computers via synapses.
Based on the various possible states the metabolome of an individual neuron could be in, an individual neuron can likely encode billions of bits of information.
(Given the tens of thousands of individual proteins/enzymes, enzyme substrates, lipids, etc that are constantly in a state of flux within a cell, I would feel safe wagering that the true number of "bits" of information that a neuron can store based on changes in the overall state of this complex system would be multiple orders of magnitude larger than billions.)
So my brain is a really inefficient super computer?
>The brain is a computer analogy is nice sometimes, but it doesn't work in many cases.
It can still work in this case if you squint. The brain as an analog computer, rather than a digital one, is somewhat applicable here. Bits doesn't make sense in this context precisely because information is believed to be partially encoded by the relative rates of neural firing. In fMRI, activation of brain regions is tied to oxygenated blood flow, which directly correlates with neural firing rates. When you see a face, for instance, the rate of firing in the fusiform gyrus increases, and when there is damage to this area, an inability to recognize faces can occur. Therefore, this rate of firing change is likely encoding much of the information about the faces you are seeing, and rates of activity are not binary, but analog.
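A minimal sketch of rate coding as described above, assuming a made-up linear mapping from stimulus intensity to firing rate (the constants are illustrative, not physiological):

```python
# Toy rate code: stimulus intensity maps to spikes per second rather than
# to a binary value, so the signal is effectively analog. The baseline,
# gain, and ceiling are invented for illustration.
def firing_rate(intensity, base_hz=5, gain=40, max_hz=100):
    return min(base_hz + gain * intensity, max_hz)

# A stronger stimulus is encoded as a higher (graded) firing rate,
# saturating at the neuron's maximum rate.
print(firing_rate(0.2) < firing_rate(0.8))  # True
```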
EDIT: I found this article wherein the authors predicted neuronal synapses contain, on average, 4.7 bits of information. I haven't read it in detail, but it seems they based this off synaptic plasticity - the ability for a synapse to change its size, strength, etc. - specifically the breadth of synaptic phenotypes. The introduction is brief and gives a good overview of the subject. Also, here's the abstract (emphasis mine):
> Information in a computer is quantified by the number of bits that can be stored and recovered. An important question about the brain is how much information can be stored at a synapse through synaptic plasticity, which depends on the history of probabilistic synaptic activity. The strong correlation between size and efficacy of a synapse allowed us to estimate the variability of synaptic plasticity. In an EM reconstruction of hippocampal neuropil we found single axons making two or more synaptic contacts onto the same dendrites, having shared histories of presynaptic and postsynaptic activity. The spine heads and neck diameters, but not neck lengths, of these pairs were nearly identical in size. We found that there is a minimum of 26 distinguishable synaptic strengths, corresponding to storing 4.7 bits of information at each synapse. Because of stochastic variability of synaptic activation the observed precision requires averaging activity over several minutes.
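The 4.7-bit figure follows directly from the 26 distinguishable strengths, since n distinguishable states carry log2(n) bits:

```python
import math

# 26 distinguishable synaptic strengths -> log2(26) bits per synapse
distinguishable_strengths = 26
bits = math.log2(distinguishable_strengths)
print(round(bits, 2))  # ~4.7 bits, matching the abstract's figure
```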
Easy answer: We don't know for certain, and it depends on a lot of factors and the "type" of information.
Long-winded ramble that mostly stays on-topic: Basically, it depends on how you define "information". In the broadest sense, information is data about a system that can be used to predict other states of the system. If I know that I dropped a ball from 10 meters on Earth, I have two pieces of information - height and Earth's acceleration - and can thus predict that the ball will hit the ground in just over a second. If I just say, "I drop a ball", then there's less information since you can no longer reliably predict when it will hit the ground.
To get a bit more grounded, each cell contains millions of bits in the nucleus alone, thanks to DNA. Ordered cellular systems - e.g. cytoskeleton, proteins, electrochemical gradient, etc. - can also be said to contain information; e.g. proteins are coded by RNA which is coded by DNA. But I think you're driving at the systemic information content of the brain; i.e. not just ordered systems, but computational capacity, in which case it's more appropriate to treat neurons as indivisible units, the fundamental building blocks of our brain computer.
A single neuron can have thousands of synapses, both dendritic (receive synaptic signals) and axonal (send synaptic signals). However, a neuron typically is an "all or nothing" system that is either firing or not firing; this is analogous to a bit of information, which is either a 0 or a 1. Knowing this we could conjecture that each neuron is one bit, but then we have to account for time. In other words, some neurons can fire dozens of times per second, while at other times they may fire once in several seconds. This is important because rapid firing can have different effects than slow firing; e.g. if the sending neuron is excitatory, then its sending rapid action potentials to another neuron will make that neuron more likely to fire its own action potentials. However, if the sending excitatory neuron only fires a handful of times per second (i.e. relatively slow), the receiving neuron won't receive enough stimulation to fire its own action potential. So the speed of action potentials also carries information. Then we get into different types of synapses; broadly, we can categorize neuron-to-neuron synapses as excitatory or inhibitory: excitatory makes the receiving neuron more likely to fire, and inhibitory makes them less likely to fire.
To recap so far: we have to consider the number of neurons, the number of synapses on each neuron, the rate at which they're firing action potentials through those synapses, and what type of synapse each is. But wait, there's more!
We've only talked about pairs of neurons, but most neurons receive and/or project synapses to dozens, even hundreds or thousands, of neurons. Let's consider a typical pyramidal neuron found in the cortex. For simplicity, we'll say it receives 10 action potentials over a short period of time: 3 from inhibitory interneurons and the other 7 from excitatory neurons. Excitatory action potentials make it easier for the neuron to fire, and since it received more excitatory action potentials it will likely fire. In other words, there is computation going on inside the neuron as its biochemistry reacts to the incoming action potentials, computation that determines whether the excitatory input exceeds the action potential threshold and whether the inhibitory input is strong enough to negate it.
So now we have to consider the ratio of synaptic inputs and their firing rates. Then you have to factor in all kinds of other variables, such as the size of the neuron, its resting membrane potential, the types of synaptic receptors, whether it sends excitatory or inhibitory neurotransmitters, and so on. All of this computation just to decide whether the neuron is a 0 or a 1.
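The summation described over the last few paragraphs can be caricatured in a few lines. This is a minimal sketch; the weights, rates, and threshold are invented for illustration, not physiological values:

```python
def neuron_fires(inputs, threshold=3.0):
    """Decide a neuron's binary output by summing its synaptic inputs.
    Each input is (weight, rate): weight > 0 is excitatory, weight < 0
    inhibitory; rate is how rapidly that synapse is firing. The neuron
    only becomes a 0 or a 1 after this analog computation runs."""
    drive = sum(weight * rate for weight, rate in inputs)
    return drive >= threshold

# 7 excitatory and 3 inhibitory inputs, as in the pyramidal-neuron example:
inputs = [(+1.0, 1.0)] * 7 + [(-1.0, 1.0)] * 3    # net drive = +4
print(neuron_fires(inputs, threshold=3.0))  # True: excitation outweighs inhibition
print(neuron_fires(inputs, threshold=5.0))  # False: a higher threshold silences it
```

Doubling the firing rate of the inhibitory inputs flips the outcome, which is the sense in which rate itself carries information.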
The last thing I'll put forward is that our brain is exceptionally good at compressing information. We receive ~11 million bits of information per second, but cognitively we can only process about 50 bits/second. Think about all of the different sensations you could focus on: touch, temperature, hunger, those socks you're wearing, tiredness, vision, hearing, thought, etc. We focus on one at a time because that's all we can handle (our brains barely have enough energy to light a refrigerator light bulb, so they have to be very economical with processing); our brain's job is to filter out all of the superfluous information so we can focus on the important stuff. For example, the visual system receives tons of information every second from our high-resolution retinas; these signals are conveyed to our visual cortex and broken down into base components (e.g. a house would be decomposed into straight lines, angles, light/dark areas, etc.), then integrated into more complex objects (e.g. a square), which are then integrated with context and knowledge (e.g. a large square in this place is likely a house), and finally awareness (e.g. "I see a house"). Instead of having to think through all of that computation and logic consciously, our visual and integration cortices handle it "under the hood" and just give "us" (i.e. our conscious selves) the final output.
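The decompose-then-integrate pipeline can be sketched as a chain of stages, each emitting a far smaller description than it received. Every stage here is a made-up stand-in label, not real vision code:

```python
def decompose(raw_pixels):
    # Stage 1: raw input -> base components (stand-in labels, not real vision)
    return ["straight line", "straight line", "right angle", "light/dark edge"]

def integrate(features):
    # Stage 2: base components -> simple objects
    return "square" if "right angle" in features else "unknown shape"

def contextualize(shape, context):
    # Stage 3: objects + context/knowledge -> recognition
    return "house" if shape == "square" and "street" in context else shape

raw = [0] * 1_000_000                       # ~a megapixel of "retinal" input
percept = contextualize(integrate(decompose(raw)), "suburban street")
print(f"I see a {percept}")                 # the conscious self gets only this
```

A million input values collapse into one label by the time "we" see anything, which is the compression the comment is describing.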
Remarkably, we can somehow store far more than 50 bits. We don't know "where" memories are stored, but we do know that certain neuronal firing patterns are associated with different information. For example, neuronal circuits are often firing at a specific frequency that changes based on your thoughts, behavior, environment, and where you are in the brain; e.g. your brain "defaults" to a 40Hz (40 times per second) frequency of firing when you zone out and stare off into space; alpha rhythms (~10Hz) appear when you close your eyes; etc. These may be byproducts of other computations, or they may be computations in and of themselves; to oversimplify, a 20Hz frequency in a certain circuit may encode "dog", but a 30Hz may encode "cat" (possibly by activating neuronal circuits sensitive to the individual frequencies).
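To oversimplify further, the frequency-as-code idea above can be written as a toy decoder. The code book and all the numbers are invented for illustration; nothing here reflects real neural coding:

```python
def firing_frequency(spike_times):
    """Estimate a circuit's firing frequency in Hz from spike timestamps (s)."""
    if len(spike_times) < 2:
        return 0.0
    return (len(spike_times) - 1) / (spike_times[-1] - spike_times[0])

# Hypothetical code book, as in the comment: a frequency "encodes" a concept.
CODE_BOOK = {20: "dog", 30: "cat", 40: "idle default"}

def decode(spike_times):
    freq = firing_frequency(spike_times)
    return CODE_BOOK[min(CODE_BOOK, key=lambda f: abs(f - freq))]

spikes = [i * 0.05 for i in range(21)]   # one spike every 50 ms for 1 s = 20 Hz
print(decode(spikes))                    # → dog
```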
There's so much more I could talk about on this, but I have to move on, so let's put it all together.
Neurons can either fire or not fire, which intrinsically encodes information. The rate at which they fire also encodes information, as well as the type of neuron, the number of synapses, the number of currently active synapses, the signal polarity (i.e. inhibitory or excitatory), and many other factors. Computationally, neurons generally try to streamline information to reduce processing burden. Depending on where in a brain circuit you are, the neurons may be conveying very granular information or very consolidated information. Regardless, the information content of a given synapse or neuron is so context-dependent - on the neuron, the synapse, the circuit it's a part of, etc. - that you'd need to be very precise in defining the system before you could begin crunching numbers to make predictions.
This is a great answer.
Given the enormous complexity of the brain and the unique role that experience plays in shaping it via its plasticity, can you say something about what strategies to take in order to figure out how it works? Even modelling individual neurons sounds dauntingly complex.
Thanks for saying so! Unfortunately, I'm no computational neuroscientist, so most of this post will be basic and partially conjecture. But here are my thoughts.
These sorts of discussions tend to start with machine learning - using iterative algorithms to repeatedly analyze a data set to find optimal patterns. The basic strategy is to take some complex system - say, the electrochemical gradient across a pyramidal neuron's cell membrane - and measure lots of data about it: the number of ionic channels, the relative voltage in different solutions, its electrophysiological signatures, the amount of intracellular calcium, whatever your variable of interest is. You do this for a bunch of different neurons, then run the data through an algorithm that tries to predict something about the system, such as the membrane potential. It "learns" by making predictions based off the measured data (it'll be -60 mV), comparing those predictions to the actual truth (it was actually -80 mV), and adjusting its prediction algorithm depending on how wrong it was.
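That predict-compare-adjust loop can be shown in miniature. This is a toy sketch: one made-up feature, a fake data set generated from a hidden linear rule, and a plain gradient-style update, not any particular neuroscience model:

```python
# Toy learn-by-error loop: predict a neuron's membrane potential from one
# measured feature, compare against ground truth, and nudge the model.
def train(data, lr=0.1, epochs=500):
    w, b = 0.0, 0.0                     # the "prediction algorithm": w * x + b
    for _ in range(epochs):
        for x, true_mv in data:
            pred = w * x + b            # "it'll be -60 mV"
            error = pred - true_mv      # "it was actually -80 mV"
            w -= lr * error * x         # adjust depending on how wrong it was
            b -= lr * error
    return w, b

# Fake measurements: (channel-count feature, membrane potential in mV),
# generated from the hidden rule mV = -80 + 5 * x
data = [(1.0, -75.0), (2.0, -70.0), (3.0, -65.0)]
w, b = train(data)
print(round(w, 2), round(b, 2))         # → 5.0 -80.0 (the rule was recovered)
```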
This is a crude mimic of the human brain's plasticity, but the broad strokes are the same: our neurons are developed in such a way as to be able to react to external stimuli and to each other. Someone else in the thread mentioned the familiar maxim "Neurons that fire together wire together" - in other words, important circuits reinforce themselves over time, becoming "easier to access" or "more influential" (using colloquialisms because I'm not sure what the proper terminology is). One mechanism that gets mentioned a lot is Long Term Potentiation - basically, active synapses tend to get bigger / more potent over time, while inactive ones tend to be pruned. Or, even cruder, "If you don't use it, you lose it".
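A Hebbian-style update captures both halves of that maxim in a couple of lines. The learning rate and decay constants are illustrative, not physiological, and the decay term is only a crude stand-in for pruning:

```python
def hebbian_step(w, pre_active, post_active, lr=0.1, decay=0.02):
    """One toy plasticity step for a single synaptic weight w."""
    if pre_active and post_active:
        w += lr                # co-activation potentiates the synapse (LTP-like)
    return w * (1 - decay)     # unused strength slowly fades ("use it or lose it")

w = 1.0
for _ in range(50):            # a synapse whose two neurons keep co-firing
    w = hebbian_step(w, True, True)
print(round(w, 2))             # → 3.48, climbing toward its ceiling of ~4.9

w = 1.0
for _ in range(50):            # a synapse that never sees co-activation
    w = hebbian_step(w, True, False)
print(round(w, 2))             # → 0.36, decaying toward elimination
```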
Side note: our newborn brains have far more synapses than any adult brain. I like to think of it as a giant garden hedge: a big baby-brain shaped bush that needs to be trimmed into a Michelangelo-esque garden sculpture. As we grow older, our little-used synapses get "pruned" to leave more room (both physically and computationally) for the important synapses to do their thing. Our brains declutter themselves, and I'd conjecture that this is largely based on sensory input and the resulting neurological post-processing. Extrapolation tells us that this is why it's easier for children to learn a new language than adults: it's easier to turn a raw, wild bush into a square than it is to blockify a bush already trimmed into a circle. We continue to lose synapses throughout our lifetime - our brains literally shrink as a part of normal aging, a pattern accelerated by neurodegenerative diseases like Alzheimer's and Parkinson's.
Back to your question though. I'd start at machine learning for simulating the computational aspect of neurodevelopment. I know there are much more precise and complex models specifically based on raw data, like 3D reconstructions of chunks of brain derived from serial-section electron microscopy, the electrochemical properties of in vitro neuronal behavior, or real-time observation of neuronal activity in vivo during behavioral tasks via "brain-window" microscopy. One model I've seen a few times is mouse barrel cortex. Whiskers are to mice what noses are to bears and eyes are to humans; they're one of their primary ways of sensing the world, and so mice have a large chunk of brain dedicated to processing whisker vibrations. IIRC, individual whiskers have dedicated "columns" of sensory cortex - literal cylindrical columns of neurons that fire together when that whisker is stimulated. This is an ideal model for computational neuroscience because it's a relatively self-contained system that is relatively easy to replicate. Whisker stimuli undergo stereotyped processing that translates the raw stimuli into information communicated to other parts of the brain - much like how our visual cortex processes visual information from our eyes in a stereotyped manner before sending it on to motor, association, and other cortices. Basically, it's as close to a "hardware processor" as one can find in the brain: a largely fixed processing unit with specific inputs and outputs. That makes it (relatively) easy to map and characterize these primary sensory cortices - barrel cortices are particularly popular because scientists have a lot more options for neuroanatomical and neurophysiological research in mice than in humans.
An experienced and equipped lab can rear, sacrifice, section, image, digitize, and model hundreds of mouse brains from a variety of genotypes at every developmental stage, and all of this data can then be plugged into machine learning or more specialized models. I worked briefly on a project like this, where we would take serial sections of neurons, trace them one layer at a time, then stack all of the layers to get a 3D reconstruction for volumetric analysis.
That's a bit of a ramble but I hope it answered your question!
Neurons don't work like individual bits of data in a hard drive. They basically build all of their memory from association. It's based on the concept of "neurons that fire together, wire together" and vice versa. It's best explained with an example. I'll use "horse" since another comment mentioned it. When you hear the word "horse" you probably have dozens of neurons all firing in recognition. They are each in different locations in your brain, related to different aspects of memory. For example, let's say when you were a child you went to a petting zoo and saw a horse for the first time.
In the speech center of your brain, a cluster of neurons associated with the sound of the word "horse" lights up.
Somewhat nearby, other auditory neurons are hearing a horse whinny for the first time and they are all firing as they process the sound.
In your visual memory center, neurons associated with learning the basic image/shape of a horse will fire.
In the sensory part of your brain, neurons that are tasked with remembering the smell of that horse stable will light up.
And so on. When you first encounter a horse, neurons in each of those parts of your brain (touch, sound, shape, etc.) will all be firing. And since "neurons that fire together, wire together", a link gets formed between each group of neurons. Then in the future, whenever any one individual neuron in that link gets activated, the entire chain fires up because, again, "neurons that wire together, fire together". So when you are walking by a farm and hear a distant horse whinny, or catch the faintest smell of the stable, your entire related nerve cluster of horse name-look-smell-sound immediately fires and you know there's a horse over there.
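That one-cue-recalls-everything behavior is essentially pattern completion over a wired-together assembly, and it sketches naturally in code. The feature labels are invented stand-ins for the neuron groups described above:

```python
# A stored "assembly" of co-wired feature groups; labels are illustrative.
HORSE = {"word 'horse'", "whinny sound", "horse shape", "stable smell"}
ASSEMBLIES = [HORSE]           # groups linked by "fire together, wire together"

def recall(cue):
    """Activating any member of an assembly reactivates the whole assembly."""
    active = set(cue)
    for assembly in ASSEMBLIES:
        if active & assembly:  # one neuron in the link fires...
            active |= assembly # ...so the entire wired chain fires
    return active

print(recall({"stable smell"}) == HORSE)  # → True: the faint smell recalls it all
```

Note that a shared feature (e.g. "whinny sound" also belonging to a "generic animal sounds" assembly) would reactivate both assemblies at once, which is exactly the counting problem raised below about a single connection serving many memories.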
It's a fairly effective and robust system of memory, but it doesn't translate well to bits on a hard drive. How many bits would your horse memory be? Is it just the number of neural connections between various memory neurons? Even that's not a good representation, because some neurons have hundreds of connections and are triggered for various different memories. (For example, the sound of a horse whinny might be triggered by neuronal clusters for memories about "horse" but also be used for recalling knowledge about "generic animal sounds".)
Trying to quantify exactly how much knowledge a brain holds is a nearly impossible task, because some extremely simple "memories" actually require tens of thousands of neural connections, while other single neural connections might account for a dozen different "memories".
It would be like working with a hard drive where some bits are actually several megabytes of data, and other groups of millions of bits form only one kilobyte.
TLDR Brains store vast sums of experience in a fairly simplistic form that is effective, but it's a form of memory "storage" that is wildly inconsistent in regards to trying to quantify just how much actual data it contains.
Any attempt at trying to compare a brain to a computer hard drive just breaks down because they are working with utterly different concepts of how data is stored. To use one last analogy, it would be like asking "how many descriptive words does a painting hold?". The answer is impossible to define.
It's a great explanation, but I can't help but think that it's just describing the association between data, not the data itself. You say when we hear the sound of a horse our brain connects that to the image/word/smell etc. But does it not still have to have those properties "recorded" in some way in order to associate them in the first place? So how exactly does the brain store the sound of the horse, or an image of it?
This is also hypothesized to be the reason memory is boosted when associated with multiple stimuli... like catchy song tunes to remember the presidents, or smells with locations and emotions. You're creating multiple paths to the information, which reinforces the memory.
Thanks for this - made me wonder: Do we have any idea what mechanism it is that causes those groups of neurons to fire when you think about a horse? Like, how do those neurons know that it's them that needs to fire? Is there another part of the brain that is in charge of triggering those neurons and if so, how does that part of the brain know what neurons to fire etc?
>How much data is really stored per synapse (or per neuron)?
A synapse or a neuron is not a bit. Instead, the data appears to be stored in neural pathways consisting of neurons connected by synapses. If you play with simple polygons, connecting their vertices in all the possible ways, you'll quickly see that the number of possible connections grows faster than the number of vertices.
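That growth is easy to check: the number of possible pairwise connections among n vertices is C(n, 2) = n(n - 1)/2, which quickly outpaces n itself:

```python
from math import comb

# Possible pairwise connections among n vertices: C(n, 2) = n * (n - 1) / 2
for n in (3, 4, 10, 100):
    print(n, "vertices ->", comb(n, 2), "possible connections")
# 3 -> 3, 4 -> 6, 10 -> 45, 100 -> 4950: connections grow ~quadratically
```

And since a "pathway" is a sequence of such connections, the number of distinct pathways grows faster still, which is the intuition behind storing data in pathways rather than in single units.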
But I don't think we have a good, 'mechanistic' or 'instrumental' grasp on how memories are stored in, or by, the brain.
Not once was the concept of parallel processing mentioned.
Have you tried counting backwards from 100 by sevens and walking at the same time, without needing to stop and think? :)
A deeper dive [Feb 2014]
The optic nerves would need to be similarly enhanced, but otherwise yes, they would.
Or so goes the conjecture. We have no way to predict such a subjective experience. It would require experimental confirmation to be sure. But from what we know of the brain and nervous system the answer is yes.
If the optic nerves were not faster as well, would the world just be seen in less detail?
In theory it would be seen in greater detail. Information would be arriving to the brain at one rate and the brain would be processing that information at a faster rate. This would give the brain time to process and reprocess the data. Vision in particular is subject to modification by the brain. For example, each eye has a blind spot, but the brain takes care of that by taking the data from the opposite eye and filling in the blanks. So with extra time to process the input the brain has more time to go over the image, more time to fill in the details, more time to notice inconsistencies, etc.
You might have a problem with video though. Video presents at a certain number of frames per second creating the illusion of motion. That illusion would not be so seamless under these conditions. On the up side, a subliminal message of a couple frames injected into the video sequence would probably stand out and be quite noticeable.
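The blind-spot fill-in mentioned above can be sketched as merging two streams. The list-of-labels "images" and the None-for-blind-spot encoding are assumptions for illustration, not how retinal data is actually represented:

```python
# Each eye's image has a gap (None) at its blind spot; the merge fills each
# gap from the opposite eye, as the brain does when compositing vision.
def fill_blind_spots(left_eye, right_eye):
    return [l if l is not None else r for l, r in zip(left_eye, right_eye)]

left  = ["tree", None,  "house", "cloud"]   # left eye misses position 1
right = ["tree", "dog", "house", None]      # right eye misses position 3
print(fill_blind_spots(left, right))        # → ['tree', 'dog', 'house', 'cloud']
```

With extra processing time, the brain could in principle run more passes like this over the same input, which is the "reprocess the data" idea above.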
We don't really know, the origins of human time perception are still a pretty knotty affair, with lots of analogies taken from computer visuals, talking about the frame rate of consciousness vs the visual cortex. But generally speaking, what we already know suggests that the visual cortex seems to be clocked faster than our conscious perceptions of moments, meaning that the brain is already batching together a large number of rapid perceptions into the time we experience.
So it's also possible that they would see things at the same subjective speed, but with 10x more detail, I say this because when we look at people's experience of "time slowing down" in extreme events, often the slowed perception of time is accompanied by tunnel vision and loss of colour. This suggests a hypothesis that the brain isn't actually "running faster" in extreme situations, it's still perceiving at the same rate on a low level, but is throwing away more information about the environment that is less likely to be helpful.
So just as people whose perception is tuned in the right way can perceive a piece of art in great depth, seeing lots of its patterns and details at once, we might expect that some people would just throw away fewer elements of any given scene, but perceive time at more or less the same rate, just in larger "chunks".
Assuming everything needed for receiving and processing became fast like the spinal cord and other nerve lines, then yes. I forget how I calculated this, but apparently our brains would fill up with information, useful or unnecessary, in less than a minute if we stored every piece of information our body takes in. So they would pretty much die on the spot, but if they just forgot all that stimuli quickly, then they would live.
It is like watching golf while playing baseball
No. You said specifically receive and process. Let me put it this way: if a factory has a lane that is enough for one truck of raw material to come in, and it can turn that raw material into something in a day, you have your base. If that factory gets 9 more lanes and 9 more machines to process the material, it still makes all of it in a day; it just makes more.
Now, if you gave it 9 more machines but no more raw material, it would still make the same amount it did before, but in a tenth of the time.
Translate that into brain stuff and you get your answer. If we receive and process things 10 times faster, it would all be the same; we would not experience any time dilation, but we would be able to process all that new information. Don't know where that information comes from, though.
Now, if we are able to process it faster but receive the same amount, or if there is a limit on what there is to receive, then yeah, that would be a "slo-mo" experience.
Conversely, if we had, for example, our optic nerves enhanced but not our capability to process more information, we would probably go blind as fuck, as we wouldn't be able to process it all and there would be a "backlog" of image data, all colliding into itself into a perpetual electric signal.
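The factory analogy reduces to a bottleneck rule: output is capped by the slower of the two stages. A tiny sketch, with rates in arbitrary units per day:

```python
# Output per day is capped by the slower stage: input lanes (sensory nerves)
# vs. processing machines (brain). Rates are arbitrary units, not real data.
def daily_output(input_rate, processing_rate):
    return min(input_rate, processing_rate)

print(daily_output(1, 1))    # → 1  : the base factory
print(daily_output(10, 10))  # → 10 : 10x lanes and 10x machines, 10x output
print(daily_output(1, 10))   # → 1  : faster processing alone, same total output
print(daily_output(10, 1))   # → 1  : faster input alone, plus a growing backlog
```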
I process information slower, or it doesn't compute at all…
Yeah, it should say "we process information differently, and react at different speeds"
I process some information very quickly, but information such as "stuff someone's telling me" or "people's reactions to what I tell them" is super slow for some reason
My mental counter space has room for one or two things at most.
I have in effect an asshole cat in my head that shoves everything that doesn’t fit on the floor.
Don’t assume I don’t care, I just can’t multithread.
Ooof if this is true, this would explain the times I’m still haunted by the times I didn’t know what to say or do in response to bad news
Would you also say that you stutter because your brain works faster than the words that form in your mouth, or vice versa?
🙋🏻♀️🙋🏻♀️ I definitely do!! And I also agree with being haunted by my poor reactions to certain news.
That may not be related to ADHD; humans in general are not good at dealing with bad news, and we don't really know how to react.
I'm a nervous smiler and it's fuckin not great in a pinch
So this could explain the numb feeling?
Of course being hyperfixated on something can invoke emotions.
Yep, everyone always assumes I'm emotionless, and I've even been called hard to read many times, but I honestly feel a lot more than people realize.
Legit, my step-sibling's mom is going through a divorce, and when I was told this info I was just like, in a tone of minimal sadness and just not caring, "Oh neat". Then I realized what they said and I was like, in the same tone, "Damn, that sucks". Then I proceeded to ask what it would be like if we made hotdogs.
No. As a neuroscientist, I would assert that a hypothetical person with 10-fold faster perceptions and mental processing would still measure the passage of time (that is, measure the time delay between external events) at the same rate as others: An apple dropped to the ground would not be seen to fall in “slow” motion, and a presentation of Avengers: Infinity War would not be seen as an interminable Powerpoint slideshow. Nor would the person’s rate of sensory-organ perception of any other external time-dependent events change. (So children’s voices won’t seem like slow, deep rumblings; nor would the tinny ticks of a clock’s second hand sound like an ominous slow knock at the door.)
However, there likely WOULD be BIG differences in the person’s internal and output behaviors, in response to their normal-rate sensory inputs:
I. Physical Their reaction time would be 10x faster. They’d never miss catching a baseball if any is tossed their way. This would appear to others as being amazingly quick to react.
II. Cogitative Their rumination speed would also be 10x faster. Not only would they never miss catching a baseball tossed their way, they’d likewise never miss catching a gesture, or an idea, tossed their way: Regarding the “dropping apple,” although they would measure its drop as happening no slower than others would measure it, they’d be able to think 10x more about the apple (or daydream 10x more) while it’s dropping. Normal folks seeing an apple fall might think, “Wha?” But our person could internally commemorate the apple with a haiku before it hits the ground — “Apples fall from trees. A flash of red’s dropping by. Now it’s underfoot.” This assertion, however, does make a neurobiological assumption — that our person would indeed have the ability to internally “talk to themselves” (i.e., to “verbally cogitate”) 10x faster than they could either externally speak those same words or mentally replay others’ spoken words. Memories of others’ visual acts and/or spoken words, because they neurobiologically reactivate one’s original act of visual and/or auditory perception, have to be temporally-encoded in one’s brain over the same time intervals as one’s brain initially perceived those external stimuli — and thus have to be replayed at the same rate it took those others to execute their visual acts and to speak their auditory syllables. Hence even a 10x faster brain would likely still have to replay memories of others’ spoken words at the speed at which they were initially spoken. However, our person’s 10x faster mind may indeed be able to a) read, b) internally talk to itself, and possibly even c) cognitively re-quote remembered words, faster than they or others can speak such words. So, at least for “textual memory, self-dialogue, and secondary reprocessed verbal memory,” our person’s 10x faster mind may allow them to accelerate their replay speed to be much faster than spoken-word memory.
III. Verbal They’d never make a “thoughtless” statement. Assuming they would have made it a habit from infancy neither to interrupt other people’s in-progress comments to finish their sentences, nor to jar others by verbally responding to their comments without any apparent pause for thought, then for every statement they choose to utter, they’ll have had time to think of 10 different statements and discard the worst 9 of them. This would appear to others as being amazingly thoughtful.
IV. Emotive They’d never communicate emotionally. Similarly, such a person, when socially habituated to delay their otherwise lightning-fast verbal or physical social responses, will have engaged in 10-fold more cognitive processing of their emotional perceptions before allowing themselves to express a timely reaction. This would manifest to others as always expressing amazingly deliberative responses, even in response to emotive, ad hominem insults. Thus, a 10x faster-processing person would appear to others to be preternaturally quick on the draw, perceptive, understanding, thoughtful, and equable.
Given that the brain does a lot of parallel processing, I think it is likely that a 10x faster brain would do 10x more stuff at the same speed as a normal brain.
They'd see the apple fall at normal speed but also be thinking about tonight's dinner plans and many other things.
They'd be able to easily hold multiple full speed conversations about difficult topics.
And so on.
I'm not convinced. It would mean the attention mechanism would operate differently. With purely a speedup, you'd still miss stuff if there are two deep conversations taking place, or two complex audiovisual streams to follow, or two demanding cognitive tasks to attend to. Faster processing won't let you achieve two separate flow states. You'd need a new and better brain architecture for that.
> purely a speedup
What is "speedup", though? Flop/s? Thoughts per minute? Novels written per year?
By assuming a 10x speedup, I think we're already assuming a new and better brain architecture, and I think increased multitasking is less revolutionary than a 10x increase in "single-threaded" performance.
Seems like they would be closer to living in the true present. I tend to view the present as the inflection point between the future and the past, with our minds perceiving reality as close to the edge of the future as possible, but always doomed to live slightly in the past. A person with 10x faster thought would be living 10x closer to true reality.
How high is this poster right now?
It's hard to reconcile them ruminating 10x faster with not perceiving time as subjectively slower.
>and thus have to be replayed at the same rate it took those others to execute their visual acts and to speak their auditory syllables.
Quicksilver would have no flicker fusion and might have individual favorite frames of Infinity War.
>They’d never communicate emotionally.
There are times where returning rudeness is the socially correct response. Both your friends and your enemies will be very confused if you do not, and trust me, you don't want to forget to be visibly angry sometimes because that's just part of the human networking protocol.
There is also a case for the idea that someone with unlimited time to consider their response might still choose to say mean things. Look at all the spicy Reddit comments.
Them perceiving time slower would be the best interpretation. The only way for us to imagine making 10 seconds worth of decisions and movements before the apple hits the ground is that it was falling slower. Obviously this would feel like the "normal" passage of time for them.
How does the human brain process information
Key Considerations on How the Human Brain Processes Information:
Sensory Input: The brain receives information through the senses (sight, sound, touch, taste, and smell). Sensory receptors convert stimuli into neural signals.
Neural Pathways: Information travels through neural pathways via neurons. These pathways consist of dendrites (receiving signals), axons (sending signals), and synapses (connections between neurons).
Processing Centers: Different areas of the brain specialize in processing specific types of information (e.g., the occipital lobe for vision, the temporal lobe for hearing and language, the frontal lobe for planning and decision-making).
Integration and Interpretation: The brain integrates information from various sources, allowing for interpretation and understanding. This involves both conscious and unconscious processes.
Memory Formation: Information is encoded into short-term memory and, through processes like rehearsal and consolidation, can be transferred to long-term memory for later retrieval.
Feedback Mechanisms: The brain uses feedback to adjust responses and refine processing. This is crucial for learning and adapting to new information.
Takeaway: The human brain is a complex network that processes information through a combination of sensory input, neural communication, specialized areas, and memory systems. Understanding this process can enhance learning strategies and cognitive function.