A novel neural model sheds light on how the brain stores and manages information.
MOUNTAIN VIEW, Calif. - Californer -- Within neural networks, diversity is key to handling complex tasks. A 2017 study (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC56...) by Dr. Gabriele Scheler revealed that neurons develop significant variability through the learning process.[1] As networks learn, neuronal properties change—they fire at different rates, form stronger or weaker connections, and vary how easily they can be activated. Dr. Scheler showed that this heterogeneity follows a predictable pattern across brain regions and neuronal subtypes: while most neurons function at average levels, a select few are highly active. Does neuronal variability enable networks to process information more efficiently? A new study offers some answers.
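The heterogeneity described above, where most neurons operate near an average level while a select few are highly active, is the signature of a heavy-tailed distribution of neuronal properties. The sketch below is a minimal illustration of that idea using a lognormal firing-rate distribution with arbitrary parameters; the distribution choice and numbers are assumptions for illustration, not data from the study:

```python
import numpy as np

# Illustrative sketch (assumed lognormal distribution, arbitrary parameters):
# draw per-neuron firing rates from a heavy-tailed distribution, so that most
# neurons fire near the typical rate while a small minority is highly active.
rng = np.random.default_rng(seed=42)
rates = rng.lognormal(mean=1.0, sigma=1.0, size=10_000)  # spikes/s, arbitrary units

median_rate = np.median(rates)
highly_active = rates > 10 * median_rate  # the "select few" far above typical

print(f"median rate: {median_rate:.2f}")
print(f"fraction of highly active neurons: {highly_active.mean():.3%}")
```

Only on the order of one percent of the simulated neurons exceed ten times the typical rate, yet those few dominate the network's overall activity.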
In a June 30th preprint on bioRxiv (https://www.biorxiv.org/content/10.1101/658153v...), Dr. Scheler and Dr. Johann Schumann introduced a neuronal network model that mimics the brain's memory storage and recall functions.[2] Central to this model are high "mutual-information" (MI) neurons, the high-functioning neurons identified in the 2017 study. They found that high MI neurons carry the most crucial information within a memory or pattern representation. Remarkably, stimulating only high MI neurons can trigger the recall of entire patterns, although in a compressed form.
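The "mutual information" idea can be sketched as ranking neurons by how much their activity reveals about which stored pattern is present. The toy activity matrix and the discrete MI computation below are illustrative assumptions, not the authors' actual network model:

```python
import numpy as np

def mutual_information(activity, labels):
    """MI (in bits) between one neuron's 0/1 activity and pattern labels."""
    mi = 0.0
    for a in (0, 1):
        for lab in np.unique(labels):
            p_joint = np.mean((activity == a) & (labels == lab))
            p_a, p_lab = np.mean(activity == a), np.mean(labels == lab)
            if p_joint > 0:
                mi += p_joint * np.log2(p_joint / (p_a * p_lab))
    return mi

# Toy data: 3 stored patterns over 5 neurons; rows = repeated presentations.
labels = np.array([0, 0, 1, 1, 2, 2])
activity = np.array([
    [1, 1, 0, 1, 1],   # pattern 0
    [1, 0, 0, 1, 1],
    [0, 1, 1, 1, 1],   # pattern 1
    [0, 0, 1, 1, 1],
    [0, 1, 0, 0, 1],   # pattern 2
    [0, 0, 0, 0, 1],
])

mi_per_neuron = np.array([mutual_information(activity[:, j], labels)
                          for j in range(activity.shape[1])])
print("MI per neuron (bits):", np.round(mi_per_neuron, 3))
```

Neurons whose activity tracks pattern identity (columns 0 and 3) score high, while a neuron that fires randomly (column 1) or constantly (column 4) scores zero; in the model, the high-scoring subset alone suffices to cue recall of a compressed pattern.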
The finding supports a "hub-and-spoke" model of neural networks, where a few "hub" neurons represent broad concepts and "spoke" neurons represent the specific details connected to those concepts. Activating just the central hub neurons triggers the connected spoke neurons downstream, recreating the original pattern. These mini teams of neurons, or neural ensembles, could be key to recording and recalling complex memories in the brain. "We believe that such structures imply greater advantages for recall," the authors concluded.
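The hub-and-spoke recall described above can be sketched as a tiny directed graph: stimulate only the hubs, let activity propagate one step downstream, and the full pattern reappears. The wiring and neuron names below are hypothetical, chosen purely for illustration:

```python
# Hypothetical hub-and-spoke wiring: a few "hub" neurons represent broad
# concepts; each connects to "spoke" neurons carrying specific details.
connections = {
    "hub_animal": ["spoke_fur", "spoke_four_legs"],
    "hub_pet":    ["spoke_collar", "spoke_name_tag"],
}

def recall(stimulated_hubs):
    """Return all active neurons after one downstream propagation step."""
    active = set(stimulated_hubs)
    for hub in stimulated_hubs:
        active.update(connections.get(hub, []))
    return active

# Stimulating the two hubs reinstates the full six-neuron pattern.
print(sorted(recall(["hub_animal", "hub_pet"])))
```

The hubs alone act as a compressed code for the pattern: two stimulated neurons are enough to reactivate all six.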
This model not only provides insights into human cognition but could have major implications for building better AI systems. Unlike typical AI models, which rely on vast amounts of data to learn, a neural ensemble-based model could potentially adapt and learn from fewer examples. A traditional AI model, for instance, would need many labeled images to learn to identify shapes. This model, by contrast, could learn the basic identifying properties of each shape (e.g., a square has four equal sides) from a handful of examples, enabling it to recognize those shapes in various contexts without extensive training data.
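The shape example above can be made concrete: classify from a few defining properties rather than from thousands of images. The function below is a hand-written sketch of that property-based idea, not anything from the preprint:

```python
# Illustrative property-based classifier: identify a shape from its defining
# properties (side count, equal sides, right angles) instead of raw pixels.
def classify_polygon(side_lengths, right_angles):
    sides = len(side_lengths)
    equal = max(side_lengths) - min(side_lengths) < 1e-9
    if sides == 3:
        return "triangle"
    if sides == 4 and equal and right_angles:
        return "square"  # four equal sides plus right angles
    if sides == 4 and right_angles:
        return "rectangle"
    return "other polygon"

print(classify_polygon([2, 2, 2, 2], right_angles=True))   # square
print(classify_polygon([2, 3, 2, 3], right_angles=True))   # rectangle
```

Because the rule is stated over properties, it generalizes to squares of any size or orientation without further training examples.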
Next, Dr. Scheler, Dr. Schumann, and collaborator Prof. Röhrbein are focused on deploying models such as this one to help other researchers build better AI systems. They are in the initial stages of launching a startup that offers a platform for developers to build network models rooted in biology. "By providing these ready-made components, users can create models that are more cognitively oriented and computationally efficient than traditional statistical machine learning models," says Dr. Scheler.
Source: Carl Correns Foundation for Mathematical Biology
Filed Under: Science