Researchers have ‘turbocharged’ commercially available PCs to run the same elaborate brain simulations as supercomputers that cost millions of pounds thanks to a pioneering new computing method.
PCs with the latest Graphics Processing Units (GPUs), the processor in charge of rendering the graphics on a machine's screen, successfully replicated a model of a macaque monkey's visual cortex, a task previously limited to high-performance computing systems.
The team from the University of Sussex hope the method will make research significantly more accessible, slashing the cost of studying how mammalian brains work and investigating neurological disorders.
Brain simulations are key tools in helping researchers understand brain function and aiding research into Alzheimer’s and Parkinson’s.
However, replicating the neurons and synapses in the brains of even small mammals such as mice requires several terabytes of data, far outside the capabilities of the majority of non-supercomputers, according to the study published in the journal Nature Computational Science.
Pioneering new technique
The researchers developed an alternative simulation method called 'procedural connectivity', which allows the machine to generate the brain model's connectivity data on the fly as it is needed, instead of storing it in memory and retrieving it.
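The idea can be sketched in a few lines. In this hypothetical example (not the team's actual implementation, which runs on the GPU), each neuron's outgoing connections are regenerated deterministically from a per-neuron seed every time they are needed, so no connectivity matrix ever has to be held in memory:

```python
import numpy as np

def procedural_targets(pre_neuron, n_post, prob, base_seed=1234):
    """Regenerate one neuron's outgoing connections on demand.

    Instead of looking up a stored adjacency list, we reseed a random
    number generator with a value derived from the neuron's index, so
    the same connections are reproduced identically on every call.
    """
    rng = np.random.default_rng(base_seed + pre_neuron)
    # Each of the n_post potential targets is connected with probability `prob`.
    return np.flatnonzero(rng.random(n_post) < prob)

# The same call always yields the same connectivity, yet nothing is stored:
first = procedural_targets(42, 1000, 0.1)
second = procedural_targets(42, 1000, 0.1)
assert np.array_equal(first, second)
```

The trade-off is recomputation: connectivity is rebuilt every time a neuron fires, exchanging memory for arithmetic, which suits GPUs because they have relatively little memory but enormous parallel compute throughput.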
While a larger-than-average power supply would be required to drive the GPU, any relatively recent computer could be upgraded with a GPU with a minimum of 16 gigabytes (GB) of memory to run similar simulations.
The team ran the brain circuit on a PC with an NVIDIA Titan RTX graphics card with 24GB of memory, which costs £2,452, demonstrating that although such machines are still powerful, they cost a fraction of the traditional multi-million-pound supercomputers.
Breaking down research barriers
“This research is a game-changer for computational neuroscience and AI researchers who can now simulate brain circuits on their local workstations, but it also allows people outside academia to turn their gaming PC into a supercomputer and run large neural networks,” said Professor Thomas Nowotny, professor of informatics at the University of Sussex.
The study builds upon the work of US researcher Eugene Izhikevich, who pioneered a similar method for large-scale brain simulation in 2006.
Dr James Knight, research fellow in Computer Science, said academics typically had to apply for access to supercomputers designed to fulfil a specific scientific purpose, calling this "quite a high barrier for entry which is potentially holding back a lot of significant research".
“Our hope for our own research now is to apply these techniques to brain-inspired machine learning so that we can help solve problems that biological brains excel at but which are currently beyond simulations.”