In the brain, timekeeping is done with neurons that relax at different rates after receiving a signal; now memristors—hardware analogs of neurons—can do that too.
Artificial neural networks may soon be able to process time-dependent information, such as audio and video data, more efficiently. The first memristor with a ‘relaxation time’ that can be tuned is reported today in Nature Electronics, in a study led by the University of Michigan.
Energy Efficiency and AI
Memristors, electrical components that store information in their electrical resistance, could reduce AI’s energy needs by about a factor of 90 compared with today’s graphics processing units. AI is already projected to account for about half a percent of the world’s total electricity consumption by 2027, and that share could balloon as more companies sell and use AI tools.
“Right now, there’s a lot of interest in AI, but to process bigger and more interesting data, the approach is to increase the network size. That’s not very efficient,” said Wei Lu, the James R. Mellor Professor of Engineering at U-M and co-corresponding author of the study with John Heron, U-M associate professor of materials science and engineering.
The Problem With GPUs
The problem is that GPUs operate very differently from the artificial neural networks that run the AI algorithms: the whole network and all its interactions must be sequentially loaded from external memory, which consumes both time and energy. In contrast, memristors offer energy savings because they mimic key aspects of the way that both artificial and biological neural networks function without external memory. To an extent, the memristor network can embody the artificial neural network.
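To make that contrast concrete, the sketch below illustrates the general compute-in-memory idea behind memristor crossbars in a few lines of Python. It is only an illustration with made-up array sizes and conductance values, not the device model from the study: the network weights live in the array as conductances, so a matrix-vector product emerges from Ohm’s and Kirchhoff’s laws instead of from repeatedly fetching weights out of external memory.

```python
import numpy as np

# Minimal sketch of the compute-in-memory idea behind memristor crossbars
# (illustrative only; not the device model from the Nature Electronics paper).
# Each network weight is stored as a conductance G[i, j] in the array, so a
# matrix-vector product happens "in place": the current on each output line is
# the sum of voltage * conductance contributions from every input line.

rng = np.random.default_rng(0)

n_inputs, n_outputs = 4, 3
G = rng.uniform(1e-6, 1e-4, size=(n_outputs, n_inputs))  # conductances (siemens)
v = rng.uniform(0.0, 0.2, size=n_inputs)                 # input voltages (volts)

# Output currents: I = G @ v, computed by the physics of the array itself,
# so the weights never have to be shuttled in from external memory.
I = G @ v
print(I)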
Innovations in Memristor Materials
“We anticipate that our brand-new material system could improve the energy efficiency of AI chips six times over the state-of-the-art material without varying time constants,” said Sieun Chae, a recent U-M Ph.D. graduate in materials science and engineering and co-first author of the study with Sangmin Yoo, a recent U-M Ph.D. graduate in electrical and computer engineering.
In a biological neural network, timekeeping is achieved through relaxation. Each neuron receives electrical signals and sends them on, but there is no guarantee that a signal will move forward. Some threshold of incoming signals must be reached before the neuron will send its own, and it has to be met within a certain amount of time. If too much time passes, the neuron is said to relax as the electrical energy seeps out of it. Having neurons with different relaxation times in our neural networks helps us understand sequences of events.
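A standard textbook leaky integrate-and-fire model captures this behavior; a toy version is sketched below. The time constant, threshold, and weight are arbitrary values chosen for illustration rather than anything from the paper. The point is simply that the same three inputs trigger a spike when they arrive close together in time but not when they are spread out, because the stored charge relaxes away in between.

```python
import numpy as np

# Toy leaky integrate-and-fire neuron (a standard textbook model, used here
# only to illustrate "relaxation"; parameters are hypothetical, not from the
# paper). Incoming signals charge a membrane potential; between signals the
# potential leaks away with time constant tau, so inputs only add up if they
# arrive close together in time.

tau = 20e-3          # relaxation (leak) time constant, seconds
threshold = 1.0      # potential needed to fire
dt = 1e-3            # simulation step, seconds

def simulate(input_spikes, weight=0.4):
    v = 0.0
    fired_at = []
    for t, spike in enumerate(input_spikes):
        v *= np.exp(-dt / tau)       # relaxation: charge seeps away over time
        v += weight * spike          # each incoming signal adds to the potential
        if v >= threshold:
            fired_at.append(t * dt)
            v = 0.0                  # reset after firing
    return fired_at

# Three inputs arriving within a few milliseconds cross the threshold...
fast = np.zeros(100); fast[[5, 7, 9]] = 1
# ...but the same three inputs spread over 60 ms do not, because of the leak.
slow = np.zeros(100); slow[[5, 35, 65]] = 1

print("clustered inputs fire at:", simulate(fast))
print("spread-out inputs fire at:", simulate(slow))
```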
How Memristors Work
Memristors operate a little differently. Rather than the total presence or absence of a signal, what changes is how much of the electrical signal gets through. Exposure to a signal reduces the resistance of the memristor, allowing more of the next signal to pass. In memristors, relaxation means that the resistance rises again over time.
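The sketch below illustrates that behavior with a minimal, made-up device model (not the equations from the Nature Electronics paper): each pulse bumps the conductance up, and between pulses the conductance relaxes back toward its resting value with a time constant tau. A device with a long time constant still “remembers” the first pulse when the second arrives, while a device with a short one has largely forgotten it.

```python
import numpy as np

# Illustrative memristor model (a sketch with made-up parameters, not the
# device equations from the study): each voltage pulse lowers the resistance,
# i.e. raises the conductance, and between pulses the conductance relaxes back
# toward its resting value with time constant tau.

def memristor_response(pulse_times, tau, boost=0.3, g_rest=0.1, t_end=1.0, dt=1e-3):
    """Return the conductance trace for a train of pulses at pulse_times (s)."""
    times = np.arange(0.0, t_end, dt)
    g = g_rest
    trace = []
    pulses = set(np.round(np.asarray(pulse_times) / dt).astype(int))
    for i, _ in enumerate(times):
        g = g_rest + (g - g_rest) * np.exp(-dt / tau)  # relaxation toward rest
        if i in pulses:
            g += boost                                 # each pulse lets more signal through
        trace.append(g)
    return times, np.array(trace)

# Two pulses 50 ms apart: with a long time constant the second pulse lands on a
# still-elevated conductance; with a short one the device has already "forgotten".
for tau in (0.02, 0.2):
    _, trace = memristor_response([0.10, 0.15], tau=tau)
    print(f"tau = {tau:.2f} s -> conductance just after second pulse: {trace[151]:.3f}")
```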
While Lu’s group had explored building relaxation time into memristors in the past, it was not something that could be systematically controlled. But now, Lu and Heron’s team has shown that variations on a base material can provide different relaxation times, enabling memristor networks to mimic this timekeeping mechanism.
Material Composition and Testing
The team built the materials on the superconductor YBCO, made of yttrium, barium, copper and oxygen. It has no electrical resistance at temperatures below -292 degrees Fahrenheit, but they wanted it for its crystal structure. It guided the organization of the magnesium, cobalt, nickel, copper and zinc oxides in the memristor material.
Heron calls this type of oxide, an entropy-stabilized oxide, the “kitchen sink of the atomic world”: the more elements they add, the more stable it becomes. By changing the ratios of these oxides, the team achieved time constants ranging from 159 to 278 nanoseconds, or billionths of a second. The simple memristor network they built learned to recognize the sounds of the numbers zero to nine. Once trained, it could identify each number before the audio input was complete.
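A rough way to see why a spread of time constants helps with temporal data: run the same kind of relaxation model for a bank of devices with different time constants and sample their conductances after an input pattern. Pulse trains with different rhythms leave different fingerprints across the bank, which a trained readout can then separate. The sketch below uses arbitrary units and parameters and is only meant to illustrate the principle, not the spoken-digit experiment in the paper.

```python
import numpy as np

# Illustration in arbitrary units (not the paper's setup): a bank of relaxing
# devices, one per time constant, responding to the same pulse train. Two
# patterns with the same number of pulses but different timing produce clearly
# different conductance "fingerprints", which is the information a trained
# readout layer can use to classify temporal signals.

def fingerprint(pulse_steps, taus, boost=0.3, g_rest=0.1, n_steps=200, dt=1.0):
    """Conductance of each device (one per tau) sampled at the final step."""
    out = []
    for tau in taus:
        g = g_rest
        for i in range(n_steps):
            g = g_rest + (g - g_rest) * np.exp(-dt / tau)  # relaxation
            if i in pulse_steps:
                g += boost                                 # pulse arrives
        out.append(g)
    return np.array(out)

taus = [20.0, 60.0, 180.0]        # three devices with different relaxation times
burst = {180, 185, 190}           # three pulses bunched near the end
spread = {20, 100, 180}           # three pulses spread across the window

print("burst  ->", np.round(fingerprint(burst, taus), 3))
print("spread ->", np.round(fingerprint(spread, taus), 3))
```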
Future Prospects
These memristors were made through an energy-intensive process because the team needed perfect crystals to precisely measure their properties, but they anticipate that a simpler process would work for mass manufacturing.
“So far, it’s a vision, but I think there are pathways to making these materials scalable and affordable,” Heron said. “These materials are earth-abundant, nontoxic, cheap and you can almost spray them on.”
Reference: “Efficient data processing using tunable entropy-stabilized oxide memristors” by Sangmin Yoo, Sieun Chae, Tony Chiang, Matthew Webb, Tao Ma, Hanjong Paik, Yongmo Park, Logan Williams, Kazuki Nomoto, Huili G. Xing, Susan Trolier-McKinstry, Emmanouil Kioupakis, John T. Heron and Wei D. Lu, 20 May 2024, Nature Electronics.
DOI: 10.1038/s41928-024-01169-1
The research was funded by the National Science Foundation. It was done in partnership with researchers at the University of Oklahoma, Cornell University and Pennsylvania State University.
The device was built in the Lurie Nanofabrication Facility and studied at the Michigan Center for Materials Characterization.
Lu is also a professor of electrical and computer engineering and materials science and engineering. Chae is now an assistant professor of electrical engineering and computer science at Oregon State University.