
Bioinspired Neural Network Model Can Store Significantly More Memories

The researchers found that a network incorporating both pairwise and set-wise connections performed best and retained the greatest number of memories.
Scientists have developed a new model, inspired by recent biological discoveries, that shows improved memory performance. They achieved this by modifying a classical neural network.
Computer models play a crucial role in investigating how the brain makes and retains memories and other complex information. But building such models is a delicate task. The intricate interplay of electrical and biochemical signals, together with the web of connections between neurons and other cell types, creates the infrastructure by which memories are formed. Even so, encoding the complex biology of the brain into a computer model for further study has proven very difficult, in part because so much of the underlying biology remains poorly understood.
Scientists at the Okinawa Institute of Science and Technology (OIST) have now improved a widely used computer model of memory, known as a Hopfield network, by incorporating insights from biology. The modification yields a network that not only better mirrors how neurons and other cells are wired in the brain, but can also store significantly more memories.

The complexity added to the network is what makes it more realistic, says Thomas Burns, a Ph.D. student in the group of Professor Tomoki Fukai, who heads OIST’s Neural Coding and Brain Computing Unit.
“Why would biology have all this complexity? Memory capacity might be one reason,” Mr. Burns says.
In the classical Hopfield network (left), each neuron (i, j, k, l) is connected to the others in a pairwise manner. In the modified network made by Mr. Burns and Professor Fukai, sets of three or more neurons can connect simultaneously. Credit: Thomas Burns (OIST)
Hopfield networks store memories as patterns of weighted connections between different neurons in the system. The network is “trained” to encode these patterns; researchers can then test its memory by presenting a series of blurry or incomplete patterns and seeing whether the network recognizes each one as a pattern it already knows. In classical Hopfield networks, however, neurons in the model connect reciprocally to other neurons, forming a series of so-called “pairwise” connections.
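To make this concrete, here is a minimal sketch of a classical pairwise Hopfield network in Python: Hebbian training stores a few random patterns in a weight matrix, and a corrupted pattern then probes recall. The sizes, names, and parameter choices are illustrative assumptions, not the study’s code.

```python
# Minimal sketch of a classical (pairwise) Hopfield network.
# Illustrative only; not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

def train(patterns):
    """Store patterns as a symmetric matrix of pairwise weights (Hebbian rule)."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)       # strengthen connections between co-active neurons
    np.fill_diagonal(W, 0.0)      # no self-connections
    return W / len(patterns)

def recall(W, state, steps=5):
    """Asynchronously update neurons until the state settles on a stored pattern."""
    s = state.copy()
    for _ in range(steps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

n_neurons, n_patterns = 100, 5
patterns = rng.choice([-1, 1], size=(n_patterns, n_neurons))
W = train(patterns)

# Probe memory with a blurred cue: flip 20% of the first pattern's entries
# and check whether the network restores the original.
probe = patterns[0].copy()
probe[rng.choice(n_neurons, 20, replace=False)] *= -1
restored = recall(W, probe)
print("overlap with stored pattern:", (restored == patterns[0]).mean())
```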
Pairwise connections represent how two neurons connect at a synapse, a junction between two neurons in the brain. In reality, though, neurons have intricate branched structures called dendrites that provide multiple points of connection, so the brain relies on a much more complex arrangement of synapses to carry out its cognitive tasks. In addition, connections between neurons are modulated by other cell types called astrocytes.
“It’s simply not realistic that only pairwise connections between neurons exist in the brain,” explains Mr. Burns. He created a modified Hopfield network in which not only pairs of neurons, but sets of three, four, or more neurons, can link up too, as might happen in the brain through astrocytes and dendritic trees.
Although the new network allowed these so-called “set-wise” connections, it contained the same total number of connections as before. The researchers found that a network containing a mix of both pairwise and set-wise connections performed best and retained the greatest number of memories. They estimate it works more than twice as well as a traditional Hopfield network. “It turns out you actually need a combination of features in some balance,” says Mr. Burns. “You should have individual synapses, but you should also have some dendritic trees and some astrocytes.”
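The sketch below extends the previous example to set-wise connections: a fixed budget of connections is split between pairs and triplets of neurons, and each neuron’s update draws on every pair and triplet that contains it. This is a loose simplification of the simplicial idea; the budget, the random sampling of connections, and the weight rule are assumptions, not the authors’ implementation.

```python
# Loose sketch of mixed pairwise and set-wise (triplet) connections,
# simplifying the idea behind simplicial Hopfield networks.
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_patterns = 60, 5
patterns = rng.choice([-1, 1], size=(n_patterns, n_neurons))

budget = 800  # fixed total number of connections, split evenly
pairs = [tuple(rng.choice(n_neurons, 2, replace=False)) for _ in range(budget // 2)]
triples = [tuple(rng.choice(n_neurons, 3, replace=False)) for _ in range(budget // 2)]

# Hebbian-style weight per connection: sum over patterns of the product
# of the member neurons' states (generalizing the pairwise outer product).
w_pair = {c: sum(p[c[0]] * p[c[1]] for p in patterns) for c in pairs}
w_tri = {c: sum(p[c[0]] * p[c[1]] * p[c[2]] for p in patterns) for c in triples}

def field(s, i):
    """Total input to neuron i from every pair and triplet containing it."""
    h = 0.0
    for (a, b), w in w_pair.items():
        if i == a:
            h += w * s[b]
        elif i == b:
            h += w * s[a]
    for (a, b, c), w in w_tri.items():
        if i == a:
            h += w * s[b] * s[c]
        elif i == b:
            h += w * s[a] * s[c]
        elif i == c:
            h += w * s[a] * s[b]
    return h

def recall(state, steps=5):
    """Asynchronously update neurons until the state settles."""
    s = state.copy()
    for _ in range(steps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if field(s, i) >= 0 else -1
    return s

# Probe with a corrupted copy of the first stored pattern.
probe = patterns[0].copy()
probe[rng.choice(n_neurons, 12, replace=False)] *= -1
print("overlap with stored pattern:", (recall(probe) == patterns[0]).mean())
```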
Hopfield networks are important for modeling brain processes, but they have powerful other uses too. Very similar kinds of networks, called Transformers, underlie AI-based language tools such as ChatGPT, so the improvements Mr. Burns and Professor Fukai have identified may also make such tools more robust.
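The link to Transformers can be glimpsed in a few lines. In modern continuous Hopfield networks, one pattern-retrieval step takes the same softmax form as the attention read-out in Transformers, a correspondence established in separate prior work rather than in the OIST paper. The toy numbers below are illustrative assumptions.

```python
# Sketch of the known correspondence between modern continuous Hopfield
# retrieval and Transformer-style softmax attention (not part of the OIST paper).
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(2)
stored = rng.normal(size=(10, 64))              # stored patterns / "keys" and "values"
query = stored[3] + 0.3 * rng.normal(size=64)   # noisy cue / "query"

beta = 4.0  # inverse temperature, playing the role of attention scaling
# One Hopfield retrieval step has the same form as one attention read-out:
retrieved = softmax(beta * stored @ query) @ stored

print("closest stored pattern:", np.argmax(stored @ retrieved))  # typically 3, the cued pattern
```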
In the brain, the strengths of connections between neurons are not usually the same in both directions, so Mr. Burns wonders whether this feature of asymmetry might also boost the network’s performance. He would also like to explore ways of making the network’s memories interact with each other, the way they do in the human brain.
Reference: “Simplicial Hopfield networks” by Thomas F. Burns and Tomoki Fukai, 1 February 2023, International Conference on Learning Representations.

By Okinawa Institute of Science and Technology (OIST) Graduate University
March 9, 2023