The only physical similarity between artificial intelligence and the human brain is that both run on electricity. Beyond that, the two diverge: AI produces its outputs on circuits of silicon and metal, while human cognition arises from a mass of living tissue. The way the two systems operate is also fundamentally different. A conventional computer keeps memory and computation in separate hardware components, constantly shuttling data between memory and the processor. The human brain, by contrast, integrates memory with processing, which makes it far more efficient.
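The contrast above can be illustrated with a toy sketch, not a hardware simulation: in a conventional (von Neumann) design every operand must be fetched from memory before the processor can use it, while a compute-in-memory cell stores a value and uses it locally. All names here are illustrative.

```python
def dot_von_neumann(a, b):
    """Dot product with an explicit count of memory-to-processor transfers."""
    transfers = 0
    acc = 0
    for x, y in zip(a, b):
        transfers += 2      # fetch one operand from each array
        acc += x * y
    return acc, transfers

class InMemoryCell:
    """Sketch of compute-in-memory: the cell stores a weight and
    multiplies its input locally, so no operand is shuttled to a CPU."""
    def __init__(self, weight):
        self.weight = weight

    def apply(self, x):
        return self.weight * x

def dot_in_memory(cells, inputs):
    return sum(c.apply(x) for c, x in zip(cells, inputs))

result, moves = dot_von_neumann([1, 2, 3], [4, 5, 6])
# Six fetches just for a three-element dot product; the in-memory
# version computes the same result without any counted transfers.
cells = [InMemoryCell(w) for w in [4, 5, 6]]
assert dot_in_memory(cells, [1, 2, 3]) == result
```

The point of the sketch is only that data movement in the first version grows with the size of the computation, which is where much of a real chip's energy goes.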
This inefficiency makes AI models particularly energy-hungry. Data centers, the facilities that house these servers, already account for 1 to 1.5 percent of global electricity consumption. By 2027, new AI servers alone could consume at least 85.4 terawatt-hours per year, more than the annual consumption of many small countries.
The human brain is significantly more efficient, so researchers have spent years trying to develop devices and materials that better mimic how it works. Such systems are known as neuromorphic computing systems.
A US research team has now taken an important first step by making a transistor, one of the basic building blocks of electronic circuits, behave more like a neuron. Transistors are tiny switch-like devices that transmit and amplify electrical signals, playing a role in a computer chip comparable to that of neurons in the brain, and they are found in almost all modern electronics. The new device, a synaptic transistor, combines memory with processing so that it consumes less power.
Mesmerizing moiré patterns
The new transistor makes brain-like, neuromorphic circuits possible and increases the energy efficiency of AI systems. It also allows such systems to go beyond simple pattern recognition, bringing them closer to the brain's way of making decisions.
To integrate memory directly into the transistor's operation, the research team turned to atomically thin 2D materials. When layers of these materials are stacked with a slight twist relative to one another, they form shifting interference patterns known as moiré superlattices. These structures can precisely regulate the current flowing through the device, allowing data to be stored without a continuous power supply.
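The idea of a device that both stores a state and uses it to regulate current can be sketched as a hypothetical toy model; the class and method names below are illustrative, not the researchers' actual device interface.

```python
class SynapticTransistor:
    """Toy model: one device holds a conductance state (its 'memory')
    and uses that same state to scale passing signals (its 'processing')."""

    def __init__(self, conductance=0.0):
        self.conductance = conductance  # stored state; persists without refresh

    def program(self, new_conductance):
        """Write a new state. In the real hardware this would correspond to
        reconfiguring the device, not refreshing a volatile memory cell."""
        self.conductance = new_conductance

    def transmit(self, input_signal):
        """Memory and processing in one step: the stored state
        directly modulates the output current."""
        return self.conductance * input_signal

t = SynapticTransistor()
t.program(0.5)
print(t.transmit(2.0))  # -> 1.0: output scaled by the remembered state
```

Unlike conventional RAM, which loses its contents without power, the stored conductance here stands in for a non-volatile state that is consulted on every operation at no extra data-transfer cost.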
Moiré transistors have been built before, but only ones that operate at very low temperatures. The new type is a breakthrough because it works at room temperature and uses twenty times less energy than other synaptic devices. Systems built on it are also reported to be faster than conventional computers.
The transistor could also make AI systems smarter at a deeper level. Because memory is embedded in the hardware itself, circuits built from it could make AI models more brain-like. According to the researchers, the transistor can “learn” from the data it processes. It does so through associative learning: the system links different inputs, recognizes patterns, and forms connections between them, much as the human brain builds memories and associations between concepts.
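The associative learning described above can be sketched in a few lines of Hebbian-style pseudocode: inputs that occur together repeatedly strengthen their connection, so one input later evokes the other. The function names and values are illustrative, not the researchers' actual training scheme.

```python
def hebbian_update(weights, a, b, rate=0.1):
    """Strengthen the link between inputs a and b when they co-occur."""
    key = tuple(sorted((a, b)))
    weights[key] = weights.get(key, 0.0) + rate
    return weights

def association(weights, a, b):
    """Current strength of the learned link between a and b."""
    return weights.get(tuple(sorted((a, b))), 0.0)

weights = {}
# "bell" and "food" are presented together repeatedly...
for _ in range(5):
    hebbian_update(weights, "bell", "food")
# ...while "bell" and "light" co-occur only once.
hebbian_update(weights, "bell", "light")

print(association(weights, "bell", "food"))   # -> 0.5 (strong link)
print(association(weights, "bell", "light"))  # -> 0.1 (weak link)
```

The frequently co-occurring pair ends up with the stronger connection, which is the essence of forming an association: the hardware version would store such strengths directly in device states rather than in a software table.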
However, the fabrication methods for the new transistors are not yet scalable, so further research is needed to realize the circuit's full potential.