
New Brain-like Transistor Can Process and Store Information Just Like the Human Brain

Discover the groundbreaking AI development as Northwestern, MIT, and Boston College collaborate on a synaptic transistor that mirrors human intelligence. This room temperature transistor promises energy-efficient, cognitive learning in real-world applications.


Human-like intelligence in transistors? Taking inspiration from the human brain, researchers have created a new synaptic transistor capable of higher-level reasoning.

This device, which was created by researchers at Northwestern University, Boston College, and the Massachusetts Institute of Technology (MIT), analyzes and retains information in the same way as the human brain does.

In subsequent experiments, the researchers showed that the transistor can perform associative learning as well as basic machine-learning tasks such as classifying data.

Although similar technologies have been used in past experiments to construct brain-like computer systems, those transistors could not operate outside cryogenic temperatures. The new device, by contrast, is stable at room temperature.

It also functions at high speeds, uses extremely little energy, and preserves recorded data even when power is turned off, making it suitable for real-world applications.

The findings were published today in the journal Nature.

“The brain has a fundamentally different architecture than a digital computer,” explains co-author Mark C. Hersam.

“In a digital computer, data move back and forth between a microprocessor and memory, which consumes a lot of energy and creates a bottleneck when attempting to perform multiple tasks at the same time.

“On the other hand, in the brain, memory and information processing are co-located and fully integrated, resulting in orders of magnitude higher energy efficiency. Our synaptic transistor similarly achieves concurrent memory and information processing functionality to more faithfully mimic the brain.”

At Northwestern’s McCormick School of Engineering, Hersam is the Walter P. Murphy Professor of Materials Science and Engineering. In addition, he is the head of the materials science and engineering department, the director of the Materials Research Science and Engineering Center, and a member of the International Institute for Nanotechnology. Hersam collaborated on the study alongside Boston College’s Qiong Ma and MIT’s Pablo Jarillo-Herrero.

Recent breakthroughs in artificial intelligence (AI) have inspired researchers to create computers that function more like the human brain. Since traditional, digital computing systems include distinct processing and storage units, data-intensive operations use a significant amount of energy.

Currently, the memory resistor, sometimes known as a “memristor,” is the most advanced device capable of performing both processing and memory functions. However, memristors continue to suffer from energy-intensive switching.

“For several decades, the paradigm in electronics has been to build everything out of transistors and use the same silicon architecture,” Hersam adds. “Significant progress has been made by simply packing more and more transistors into integrated circuits. You cannot deny the success of that strategy, but it comes at the cost of high power consumption, especially in the current era of big data where digital computing is on track to overwhelm the grid. We have to rethink computing hardware, especially for AI and machine-learning tasks.”

Hersam and his colleagues rethought this paradigm by drawing on recent breakthroughs in the physics of moiré patterns, a type of geometric pattern that emerges when two periodic patterns are overlaid. When two-dimensional materials are stacked, new characteristics develop that are not present in a single layer. When those layers are twisted to generate a moiré pattern, the electronic properties can be tuned to a remarkable degree.

The researchers created the new device by combining two kinds of atomically thin materials: bilayer graphene and hexagonal boron nitride. When the materials were stacked and intentionally twisted, they formed a moiré pattern.

By rotating one layer relative to the other, the researchers were able to produce distinct electronic properties in each graphene layer, even though the layers are separated by only atomic-scale distances. With the right twist, they harnessed moiré physics to achieve neuromorphic functionality at room temperature.

“With twist as a new design parameter, the number of permutations is vast,” points out Hersam. “Graphene and hexagonal boron nitride are very similar structurally but just different enough that you get exceptionally strong moiré effects.”

To put the transistor to the test, Hersam and his colleagues trained it to distinguish similar — but not identical — patterns. Hersam had previously unveiled a nanoelectronic device capable of interpreting and classifying data in an energy-efficient way, but the new synaptic transistor pushes machine learning and artificial intelligence a step further.

“If AI is meant to mimic human thought, one of the lowest-level tasks would be to classify data, which is simply sorting into bins,” Hersam adds. “Our goal is to advance AI technology in the direction of higher-level thinking. Real-world conditions are often more complicated than current AI algorithms can handle, so we tested our new devices under more complicated conditions to verify their advanced capabilities.”

The researchers first showed the device a single pattern: 000 (three zeros in a row). The device was then asked to recognize similar patterns, such as 111 or 101.

“If we trained it to detect 000 and then gave it 111 and 101, it knows 111 is more similar to 000 than 101,” Hersam notes. “000 and 111 are not exactly the same, but both are three digits in a row. Recognizing that similarity is a higher-level form of cognition known as associative learning.”

In tests, the novel synaptic transistor identified similar patterns, demonstrating its associative memory. Even when the researchers threw curveballs at it, such as giving it incomplete patterns, it still displayed associative learning.
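The key point of the demonstration above is that the device associates patterns by a higher-level feature (all digits the same) rather than by digit-for-digit matching: bit by bit, 101 actually overlaps more with 000 than 111 does, yet 111 is judged more similar. The following is a minimal software analogy of that behavior, not the device's actual mechanism; the function names and the choice of "uniformity" as the shared feature are illustrative assumptions.

```python
def is_uniform(pattern: str) -> bool:
    """True if every digit in the pattern is the same (e.g. '000', '111')."""
    return len(set(pattern)) == 1

def most_similar(candidates: list[str], reference: str) -> str:
    """Pick the candidate that shares the higher-level feature (uniformity)
    with the reference, rather than the best digit-for-digit match."""
    return max(candidates, key=lambda p: is_uniform(p) == is_uniform(reference))

# Trained on 000, then shown 111 and 101: the feature-based comparison
# associates 111 with 000, mirroring the associative learning described above.
print(most_similar(["101", "111"], "000"))  # -> 111
```

A plain Hamming-distance comparison would instead pick 101, which is exactly why the article frames the device's behavior as a higher-level form of cognition.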

“Current AI can be easy to confuse, which can cause major problems in certain contexts,” Hersam adds. “Imagine if you are using a self-driving vehicle, and the weather conditions deteriorate. The vehicle might not be able to interpret the more complicated sensor data as well as a human driver could. But even when we gave our transistor imperfect input, it could still identify the correct response.”

Source: 10.1038/s41586-023-06791-1

Image Credit: iStock
