September 17, 2024

Human Brain May Store 10 Times More Information Than Previously Thought

Illustration of a synapse. Credit: NIH.

Our memories and thoughts stem from complex patterns of electrical and chemical activity in the brain. Central to this activity are synapses, the junctions where the branches of neurons meet, much like connections between electrical wires: an axon from one neuron links to a dendrite of another. At these junctions, neurotransmitters carry signals across the gap, prompting the receiving neuron to pass an electrical signal along to others. Remarkably, each neuron can form thousands of these connections.

Much like a computer’s, the brain’s memory storage can be measured in “bits,” and the number of bits it can hold rests on the number and strength of its synapses. It is worth noting, though, that brains are analog: they process everything in parallel, in continuous time rather than in discrete intervals, so the “bit,” the smallest unit of digital data, is not a perfect analogy for how information is stored in a biological system. Nevertheless, it serves as a familiar yardstick for estimating information capacity.
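
To make the yardstick concrete, here is a minimal sketch in Python (nothing brain-specific about it): n bits can label 2^n distinct states, so the more distinguishable states a storage element has, the more bits it represents.

```python
# A minimal sketch of the "bit" yardstick: n bits can label 2**n distinct
# states, so more distinguishable states means more stored information.
for n_bits in (1, 2, 4):
    print(f"{n_bits} bit(s) -> {2 ** n_bits} distinguishable states")
```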

Previously, it was believed that human synapses came in only a limited range of sizes and strengths. Recent research suggests otherwise, revealing that synapses can take on far more configurations than once thought.

This means that the human brain’s capacity to store information may be almost ten times greater than previously estimated.

New Insights into Synaptic Strength and Memory Storage

A team of researchers from the University of California, San Diego and the Salk Institute developed a highly precise method to assess the strength of synapses in the rat brain. Synapses, the sites where brain cells communicate and store information, play a critical role in learning and memory. By measuring how much synapses strengthen and weaken, the scientists were able to quantify how much information these connections can store.

The human brain contains over 100 trillion synapses between its neurons. These synapses carry information across the brain by releasing chemical messengers. As we learn, specific synapses strengthen, enabling us to retain new information, a process known as synaptic plasticity.

When neurons talk, they do so at different volumes: some neurons whisper to each other, while others shout. This ‘volume’ setting of a synapse, its synaptic strength, is not static; it changes over both the short term and the long term. Synaptic plasticity refers to these changes in synaptic strength.
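
To make that idea concrete, here is a toy, Hebbian-style sketch of a synaptic weight being nudged up and down (purely a computational analogy; the update rule, learning rate, and decay values below are invented for illustration, not taken from the study):

```python
import random

# Toy model of synaptic plasticity: the synapse's "volume" is a weight
# that strengthens when the connected neurons fire together and slowly
# decays otherwise. An illustration only, not the study's method.
def update_weight(weight, pre_fired, post_fired, lr=0.05, decay=0.01):
    if pre_fired and post_fired:
        return weight + lr * (1.0 - weight)  # potentiation, capped near 1.0
    return weight * (1.0 - decay)            # gradual weakening

w = 0.2  # initial synaptic strength (arbitrary units in [0, 1])
for _ in range(200):
    pre, post = random.random() < 0.5, random.random() < 0.5
    w = update_weight(w, pre, post)
print(f"synaptic weight after 200 steps: {w:.2f}")
```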

However, aging and neurological diseases like Alzheimer’s can weaken synapses, reducing our cognitive abilities. This is why studies like this one are important: they can help scientists unlock new therapies that may stave off, or even cure, neurodegenerative diseases that currently plague millions of patients across the world. Yet measuring the strength of synapses has traditionally been challenging.

The team analyzed pairs of synapses from the rat hippocampus, a brain region associated with learning and memory. The new study’s method leverages information theory, which allows scientists to quantify how much information synapses can store and transmit; in other words, a framework typically reserved for computers was applied to the brain to estimate the number of bits synapses can hold. Because the synapses in each pair had responded to the same type and amount of brain signals, comparing how each one adjusted its strength revealed how precisely synapses store information.
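
For a flavor of how information theory turns such measurements into bits, here is a minimal sketch with made-up numbers (not the study’s data): the Shannon entropy of a synapse’s distribution of strength states gives the average number of bits a single synapse conveys.

```python
import math

def shannon_entropy(probs):
    """Average information per observation, in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical synapse that occupies one of four strength states.
# Equally likely states carry the most information: log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A synapse that mostly sits in one state conveys less information.
print(shannon_entropy([0.70, 0.10, 0.10, 0.10]))  # ~1.36
```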

Current and Future Potential

Ultimately, this analysis indicated that hippocampal synapses store between 4.1 and 4.6 bits of information each. Previously, scientists thought each synapse could hold just one bit. Overall, this means the human brain could store ten times more information than previously thought, or at least a petabyte; for scale, a petabyte is roughly as much data as 200,000 DVDs.
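
As a rough sanity check on those headline numbers, the back-of-envelope arithmetic looks like this (a sketch only: the synapse counts are round-number estimates, not figures from the study, and treating each synapse as an independent store of bits is a simplification):

```python
def capacity_petabytes(bits_per_synapse, n_synapses):
    """Total storage in petabytes (1 PB = 1e15 bytes), naively treating
    each synapse as an independent store of bits."""
    return bits_per_synapse * n_synapses / 8 / 1e15

# With the ~100 trillion (1e14) synapses cited above:
print(capacity_petabytes(4.6, 1e14))  # ~0.06 PB, i.e. about 58 terabytes

# Estimates of total synapse count run as high as ~1 quadrillion (1e15),
# the range in which the "about a petabyte" figure lands:
print(capacity_petabytes(4.6, 1e15))  # ~0.58 PB
```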

While the study focuses on a small part of the rat brain, future research could explore how information storage capacity varies across different brain areas and species. The method could also be used to compare healthy and diseased brains, offering insights into conditions that affect cognitive function.

In 2016, the Salk researchers reached the same conclusion, and the new findings confirm that initial assessment. Back then, the team also made another important discovery: there are at least 26 distinct size categories of synapses, rather than just a few.
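
Those two results tell a consistent story: distinguishing N states takes log2(N) bits, so 26 size categories corresponds to roughly 4.7 bits, right around the 4.1 to 4.6 bits per synapse measured in the new study.

```python
import math

# Consistency check: telling apart 26 discrete size categories requires
# log2(26) bits, matching the measured 4.1-4.6 bits per synapse.
print(f"log2(26) = {math.log2(26):.2f} bits")  # 4.70
```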

These findings also provide insight into the brain’s remarkable efficiency. Despite its complexity, the waking adult brain uses only about 20 watts of continuous power, comparable to a very dim light bulb. These discoveries could inspire computer scientists to design highly precise yet energy-efficient computers. This could particularly benefit “deep learning” and artificial neural networks, enhancing their capabilities in areas like speech recognition, object identification, and language translation.

The findings appeared in the journal Neural Computation.
