Fujitsu supercomputer simulates 1 second of brain activity
Harnessing more than 82,000 processors on the world's fourth-ranked supercomputer, scientists run an experiment that represents 1 percent of human brain activity.
Is it really possible to simulate the human brain on a computer? AI researchers have been investigating that question for decades, but Japanese and German scientists have run what they say is the largest-ever simulation of brain activity using a machine.
The simulation involved 1.73 billion virtual nerve cells connected by 10.4 trillion synapses and was run on Japan's K computer, which was ranked the fastest in the world in 2011.
It took the Fujitsu-built K computer about 40 minutes to simulate 1 second of neuronal network activity in real, biological time, according to Japanese research institute RIKEN, which runs the machine.
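A quick back-of-the-envelope calculation (not stated in the article, but implied by its figures) shows how far the run was from real time:

```python
# Ratio of wall-clock compute time to simulated biological time
# for the K computer run described in the article.
wall_clock_s = 40 * 60   # ~40 minutes of compute time, in seconds
simulated_s = 1          # 1 second of neuronal network activity
slowdown = wall_clock_s / simulated_s
print(slowdown)          # 2400.0 -- the simulation ran ~2,400x slower than real time
```

In other words, even on one of the world's fastest machines, one biological second of 1 percent of the brain took roughly 2,400 seconds of computing.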
The simulation harnessed the power of 82,944 processors on the K computer, which now ranks fourth on the biannual international Top500 supercomputer list (a Chinese machine currently holds the top spot).
Each synapse between excitatory neurons was given 24 bytes of memory for greater accuracy. The simulation ran on the open-source NEST software and used about 1 petabyte of main memory, roughly the combined memory of 250,000 PCs.
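The article's memory figures can be sanity-checked with simple arithmetic (a sketch; the ~4 GB-per-PC figure is an assumption used to reproduce the article's 250,000-PC comparison, not something the text states):

```python
# Memory implied by the article's synapse count and per-synapse storage.
synapses = 10.4e12          # 10.4 trillion synapses
bytes_per_synapse = 24      # 24 bytes per excitatory-excitatory synapse
synapse_memory_pb = synapses * bytes_per_synapse / 1e15
print(round(synapse_memory_pb, 2))           # 0.25 -- ~0.25 PB for synapse state alone

# The 250,000-PC comparison works out if each PC has ~4 GB of RAM (assumption).
total_memory_bytes = 1e15   # ~1 petabyte of main memory
pc_memory_bytes = 4e9       # assumed ~4 GB per PC
print(total_memory_bytes / pc_memory_bytes)  # 250000.0
```

So a quarter of the machine's petabyte of memory goes to synapse state alone, before neuron state, connectivity, and software overhead.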
The synapses were randomly connected and the process was meant only to "test the limits of the simulation technology developed in the project and the capabilities of K," RIKEN said in a release.
The K computer is housed at RIKEN's Advanced Institute for Computational Science in Kobe, Japan, and has a rated performance of 10.51 petaflops using 705,024 SPARC64 processing cores.
"If petascale computers like the K computer are capable of representing 1 percent of the network of a human brain today, then we know that simulating the whole brain at the level of the individual nerve cell and its synapses will be possible with exascale computers hopefully available within the next decade," Markus Diesmann of the Institute of Neuroscience and Medicine at Germany's Forschungszentrum Jülich said in the release.
An exascale computer is a machine capable of a quintillion (10^18) floating-point operations per second, a thousandfold increase over petascale speeds.
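The scale jump is easy to put in concrete terms (a sketch based on the figures in this article):

```python
# Petascale vs. exascale, in floating-point operations per second.
petaflop = 1e15                 # 10^15 flop/s
exaflop = 1e18                  # 10^18 flop/s: a quintillion
k_computer_flops = 10.51e15     # K computer's rated 10.51 petaflops

print(exaflop / petaflop)                    # 1000.0 -- a thousandfold jump
print(round(exaflop / k_computer_flops, 1))  # 95.1 -- an exaflop machine vs. K
```

So an exascale machine would be roughly 95 times faster than the K computer, which lines up with Diesmann's extrapolation from 1 percent of the brain to the whole organ.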
Some researchers have speculated that exascale computing may be achieved by 2020, but others disagree.
An expert group within Japan's science ministry is already planning an exascale machine that would have 100 times the processing capacity of the K computer. It's apparently going to be used for earthquake simulation and prediction and is slated for completion around 2020.
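The planned machine's "100 times the K computer" target does indeed clear the exascale bar, as a quick check shows (a sketch using the article's own numbers):

```python
# Does 100x the K computer's rated performance reach exascale?
k_flops = 10.51e15              # K computer: 10.51 petaflops
planned_flops = 100 * k_flops   # planned machine: 100x K
print(planned_flops >= 1e18)    # True -- 1.051e18 flop/s, just past exascale
```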