Press "Enter" to skip to content

Neural Networks Go Nano: Brain-Inspired Learning Takes Flight

Researchers from the University of Sydney and UCLA have developed a physical neural network that can learn and remember in real-time, much like the brain’s neurons. This breakthrough utilizes nanowire networks that mirror neural networks in the brain. The study has significant implications for the future of efficient, low-energy machine intelligence, particularly in online learning settings.

A critical step has been passed toward developing agile, low-energy machine intelligence.

For the first time, a physical neural network has successfully been shown to learn and remember ‘on the fly’, in a way inspired by and similar to how the brain’s neurons work.

The result opens a pathway for developing efficient and low-energy machine intelligence for more complex, real-world learning and memory tasks.

Published today (November 1) in Nature Communications, the research is a collaboration between scientists at the University of Sydney and the University of California, Los Angeles (UCLA).

Nanowire Neural Network

Electron microscope image of the nanowire neural network that arranges itself like ‘Pick Up Sticks’. The junctions where the nanowires overlap act in a way similar to how our brain’s synapses operate, responding to electric current. Credit: The University of Sydney

Lead author Ruomin Zhu, a PhD student from the University of Sydney Nano Institute and School of Physics, said: “The findings demonstrate how brain-inspired learning and memory functions using nanowire networks can be harnessed to process dynamic, streaming data.”

Nanowire Networks

Nanowire networks are made up of tiny wires that are just billionths of a meter in diameter. The wires arrange themselves into patterns reminiscent of the children’s game ‘Pick Up Sticks’, mimicking neural networks, like those in our brains. These networks can be used to perform specific information processing tasks.

Nanowire Neural Network Close Up

Detail of larger image above: nanowire neural network. Credit: The University of Sydney

Memory and learning tasks are achieved using simple algorithms that respond to changes in electronic resistance at junctions where the nanowires overlap. Known as ‘resistive memory switching’, this function arises when electrical inputs change the conductivity at those junctions, similar to the way synapses in our brain strengthen and weaken.
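For intuition, the sketch below shows one way such a junction could be modeled in software: a single conductance value that grows when the applied voltage crosses a threshold and relaxes back otherwise. The class name, parameters, and update rule are illustrative assumptions for this article, not the model used in the study.

```python
import numpy as np

class MemristiveJunction:
    """Toy model of a single nanowire-nanowire junction.

    Conductance grows when the applied voltage exceeds a threshold
    (akin to filament formation) and slowly decays back toward its
    resting value otherwise (a volatile 'forgetting'). All parameter
    values are illustrative placeholders, not values from the study.
    """

    def __init__(self, g_min=1e-6, g_max=1e-3, v_threshold=0.5,
                 growth_rate=0.2, decay_rate=0.05):
        self.g = g_min                      # current conductance (siemens)
        self.g_min, self.g_max = g_min, g_max
        self.v_threshold = v_threshold
        self.growth_rate = growth_rate
        self.decay_rate = decay_rate

    def step(self, voltage, dt=1.0):
        """Advance the junction state by one time step under `voltage`."""
        if abs(voltage) > self.v_threshold:
            # potentiation: conductance moves toward g_max
            self.g += self.growth_rate * (self.g_max - self.g) * dt
        else:
            # relaxation: conductance decays toward g_min
            self.g -= self.decay_rate * (self.g - self.g_min) * dt
        return self.g * voltage             # current through the junction (A)


# Drive the junction with a short pulse train and watch its 'memory' build up,
# then fade once the input stops.
junction = MemristiveJunction()
pulses = [0.8] * 5 + [0.0] * 5              # five write pulses, then rest
currents = [junction.step(v) for v in pulses]
print(np.round(currents, 6))
```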

Research Findings and Implications

In this study, researchers used the network to recognize and remember sequences of electrical pulses corresponding to images, inspired by the way the human brain processes information.

Electrode Interaction With Nanowire Network

Electron microscope image of electrode interaction with the nanowire neural network. Credit: The University of Sydney

Supervising researcher Professor Zdenka Kuncic said the memory task was similar to remembering a phone number. The network was also used to perform a benchmark image recognition task, accessing images in the MNIST database of handwritten digits, a collection of 70,000 small greyscale images used in machine learning.

“Our previous research established the ability of nanowire networks to remember simple tasks. This work has extended these findings by showing tasks can be performed using dynamic data accessed online,” she said.

Ruomin Zhu

Lead author Ruomin Zhu from the University of Sydney holding the chip designed to manage the nanowire neural network. Credit: The University of Sydney

“This is a significant step forward as achieving an online learning capability is challenging when dealing with large amounts of data that can be continuously changing. A standard approach would be to store data in memory and then train a machine learning model using that stored information. But this would chew up too much energy for widespread application.

Professor Zdenka Kuncic

Supervising researcher and co-author Professor Zdenka Kuncic from the University of Sydney Nano Institute and School of Physics. Credit: The University of Sydney

“Our novel approach allows the nanowire neural network to learn and remember ‘on the fly’, sample by sample, extracting data online, thus avoiding heavy memory and energy usage.”

Mr. Zhu said there were other advantages when processing information online.

“If the data is being streamed continuously, such as it would be from a sensor for instance, machine learning that relied on artificial neural networks would need to have the ability to adapt in real-time, which they are currently not optimized for,” he said.

In this study, the nanowire neural network displayed a benchmark machine learning capability, scoring 93.4 percent in correctly identifying test images. The memory task involved recalling sequences of up to eight digits. For both tasks, data was streamed into the network to demonstrate its capacity for online learning and to show how memory enhances that learning.
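As a rough software analogue of the ‘sample by sample’ idea described above (not the physical nanowire network itself, and not a way to reproduce the reported 93.4 percent figure), the sketch below updates a simple linear classifier one streamed sample at a time and discards each sample after the update. The function names, the toy data stream, and all numbers are assumptions made for illustration.

```python
import numpy as np

def stream_online_learning(stream, n_features, n_classes, lr=0.01):
    """Train a linear classifier one sample at a time, discarding each
    sample after its update. Illustrates the 'on the fly' idea only;
    this is conventional online learning in software, not the nanowire
    network reported in the paper.
    """
    W = np.zeros((n_classes, n_features))
    b = np.zeros(n_classes)
    for x, y in stream:                       # each sample is seen exactly once
        logits = W @ x + b
        p = np.exp(logits - logits.max())
        p /= p.sum()                          # softmax probabilities
        p[y] -= 1.0                           # gradient of cross-entropy loss
        W -= lr * np.outer(p, x)              # immediate weight update
        b -= lr * p
        # x and y are not stored anywhere, so memory use stays constant
    return W, b


# Tiny synthetic 'stream' standing in for pixel-encoded pulse sequences.
rng = np.random.default_rng(0)

def toy_stream(n=1000, d=16, k=4):
    centers = rng.normal(size=(k, d))
    for _ in range(n):
        y = rng.integers(k)
        yield centers[y] + 0.3 * rng.normal(size=d), y

W, b = stream_online_learning(toy_stream(), n_features=16, n_classes=4)
```

The point of the contrast is the memory footprint: a batch approach would have to hold the whole dataset before training, whereas the streaming loop above keeps only the current sample and the model state.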

Reference: “Online dynamical learning and sequence memory with neuromorphic nanowire networks” by Zhu, Lilak, Loeffler, et al., 1 November 2023, Nature Communications.
DOI: 10.1038/s41467-023-42470-5

Source: SciTechDaily