Intel unveiled its Loihi 2 chip on Thursday, the second generation of a processor family that marries conventional electronics with the architecture of human brains to try to inject some new progress into the computing industry. On top of that, the chip also helps Intel advance its own manufacturing technology.
Loihi 2, an example of a technology called neuromorphic computing, is about 10 times faster than its predecessor, according to Intel. The speed improvement stems from an eightfold increase in the number of digital neurons, chip equivalents of human brain cells that mimic the way brains handle information. The chip is also more programmable, letting researchers tackle a broader range of computing tasks.
The chip is built with a preproduction version of the Intel 4 manufacturing process, too, an advanced method Intel plans to use to build mainstream Intel chips arriving in 2023. The Intel 4 process can etch electronics more densely on a chip, a crucial advantage for Intel's need to pack a million digital neurons on a chip measuring 30 square millimeters.
Loihi chips are particularly good at rapidly spotting sensory input like gestures, sounds and even smells, says Mike Davies, leader of the Intel Labs group that developed Loihi. Some experiments have focused on artificial skin that could give robots a better sense of touch. "We can detect slippage if a robot hand is picking up a cup," Davies said.
Neuromorphic computing differs from mainstream artificial intelligence, a revolutionary computer technology based more loosely on how brains learn and respond, in that it hews more closely to the physical characteristics of human gray matter.
It differs from conventional chips in profound ways. For example, Loihi 2 stores data in tiny amounts spread across its mesh of neurons, not in a big bank of traditional computer memory, and it doesn't have a central clock ticking to synchronize computing steps on the chip.
You won't see Loihi 2 in your phone or laptop. Instead, it's aimed at researchers at automakers, national labs and universities. Germany's Deutsche Bahn railway network is testing how well it can optimize train schedules. The processor is geared for tasks such as processing sound or detecting hand gestures, but with vastly lower power consumption than conventional chips, Davies said.
Low power use is a characteristic of biological gray matter, too. Human brains are made of about 80 billion cells called neurons, connected into elaborate electrical signaling networks. When enough input signals reach an individual neuron, it fires its own signal to other neurons. The topology of the connections and flow of signals lets us do everything from recognizing Abraham Lincoln to riding a bicycle. Learning is the process of establishing and reinforcing those connections.
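The firing behavior described above can be sketched as a simple "integrate-and-fire" model, a standard textbook abstraction of a spiking neuron. This is an illustrative toy, not Intel's actual circuit; the weight, threshold and leak values are invented for the example.

```python
# Toy integrate-and-fire neuron: incoming spikes accumulate on a membrane
# potential; when it crosses a threshold, the neuron fires and resets.
# Parameter values here are arbitrary, chosen only for illustration.

def step(potential, incoming_spikes, weight=0.3, threshold=1.0, leak=0.9):
    """Advance one timestep; return (new_potential, fired)."""
    potential = potential * leak + weight * incoming_spikes
    if potential >= threshold:
        return 0.0, True   # fire and reset
    return potential, False

# Feed the neuron a burst of input spikes and record when it fires.
p, fired_at = 0.0, []
for t, spikes in enumerate([1, 1, 1, 1, 0, 0]):
    p, fired = step(p, spikes)
    if fired:
        fired_at.append(t)
```

With these numbers the neuron stays silent until enough input has accumulated, then fires once at the fourth timestep and falls quiet again, which is the "enough input signals" behavior the paragraph describes.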
Intel isn't the only one pursuing the idea. The Human Brain Project in Europe includes neuromorphic computing in its work. The way blood courses through the brain inspired IBM to power and cool chips with liquids in a flow battery. Samsung used IBM's neuromorphic TrueNorth chip to re-create vision.
Intel's chip is made of a million digital neurons that can be connected in any number of ways, a digital tabula rasa. Getting it to work requires configuring the proper connections between neurons. Actual processing occurs when input data reaches the chip, triggering a spike of activity that flows through the interconnected neurons and eventually produces an output. Each neuron is connected to 100 others on average, though some may reach as many as 10,000.
This flowlike design means the chip requires very little power when idle and can process data very quickly on demand, Davies said.
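That event-driven, clockless flow can be sketched in a few lines: computation happens only when a spike event arrives, which is why such a design can idle at near-zero power. The network topology, weights and neuron names below are made up for illustration and do not reflect Loihi 2's real wiring.

```python
# Hypothetical event-driven spike propagation. There is no global clock;
# the only "work" done is processing spike events as they occur.
from collections import defaultdict, deque

connections = {                    # neuron -> list of (target, weight)
    "in":  [("a", 1.0), ("b", 1.0)],
    "a":   [("out", 0.6)],
    "b":   [("out", 0.6)],
    "out": [],
}
THRESHOLD = 1.0
potential = defaultdict(float)     # membrane potential per neuron

def run(input_neuron):
    """Inject one spike and propagate events until the network is quiet."""
    events, fired = deque([input_neuron]), []
    while events:                  # idle (zero work) once the queue drains
        src = events.popleft()
        fired.append(src)          # the injected neuron fires unconditionally
        for dst, w in connections[src]:
            potential[dst] += w
            if potential[dst] >= THRESHOLD:
                potential[dst] = 0.0   # fire and reset
                events.append(dst)
    return fired

spikes = run("in")
```

Note that "out" fires only because both "a" and "b" spike into it; either one alone falls below the threshold, so the output neuron acts as a small coincidence detector.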
Programming neuromorphic chips is a big challenge, Davies said. To try to make the process easier for researchers, Intel also released an open-source software framework called Lava.
Fewer but smarter neurons
A million neurons in one chip is far from the billions in a human brain, but Intel is effectively trying to make each neuron smarter than a biological brain cell. For example, in biological brains, electrical signals are either fully on or fully off. In Loihi chips, Intel can assign a different strength to each signal, increasing processing sophistication, Davies said.
The chip can be connected to others, too, for greater scale. One improvement over the first Loihi is better networking that shortens the communication pathways that link neurons.
"The brain achieves accuracy and reliability through tremendous redundancy," Davies said. "The hope is indeed we can solve some of the same problems in a more economical way."
Intel 4 manufacturing
The Intel 4 process is a major step for Intel, its first move to a chipmaking technology called extreme ultraviolet, or EUV. Chip circuitry is etched onto silicon wafers using patterns of light, and EUV enables finer patterns for smaller structures, crucial to miniaturization and other improvements. Problems in recent years meant Intel lost its manufacturing leadership to Samsung and Taiwan Semiconductor Manufacturing Co. (TSMC), and Intel 4 is a key part of Intel's effort to reclaim that leadership by 2025.
Mainstream manufacturing with Intel 4 won't begin until the second half of 2022, with the resulting chips arriving in 2023, Intel has said. But Loihi 2 gives the company a chance to debut the technology while it's still in a preproduction phase.
"We have small quantities in the lab now," Davies said.
Intel made thousands of first-generation Loihi chips, the company said, and Loihi 2 likely will see a similarly small production run. Manufacturing at Intel's full scale of operations, with millions of processors, brings other challenges.