A pair of researchers from the U.S. Department of Energy's Argonne National Laboratory wants to map all the neurons and connections in the human brain. To make this happen, they're enlisting the help of an advanced supercomputer.
A Daunting Task
Humanity has so much to gain from a map of the human brain’s neural connections, but while producing that map sounds simple enough in theory, it’s extremely difficult in practice. Our brains have some 100 billion neurons making 100 trillion connections, and mapping this massive amount of information requires that we pinpoint the exact location of every single neuron before we can even hope to understand how they communicate with one another.
Taking a team approach to this task can make it slightly less daunting, and that’s exactly what two researchers from the U.S. Department of Energy’s (DOE) Argonne National Laboratory are doing with their new brain connectome project.
Doga Gursoy is an assistant computational scientist at Argonne’s Advanced Photon Source (APS), and Bobby Kasthuri is an Argonne neuroscientist. The goal of their joint project is to fully map the human brain, but first, they’re starting relatively small by mapping the 75 million neurons in the brain of an adult shrewmouse via a combination of data mining and machine learning.
Enabling this process is Argonne’s new 9.65 petaflops Intel-Cray supercomputer, Theta. Housed in the Argonne Leadership Computing Facility (ALCF), Theta is equipped with an advanced and flexible software platform designed to work with big data problems.
Image Credit: ALCF
“The basic goal is simple — we would like to be able to image all of the neurons in the brain — but the datasets from X-rays and electron microscopes are extremely large. They are at the tera- and petabyte scales,” Gursoy said in an ALCF press release. “So we would like to use Theta to build the software and codebase infrastructure in order to analyze that data.”
To map the deepest levels of neural connectivity and activity in the tiny mouse brain, Gursoy and Kasthuri first need to produce the data that Theta will later analyze. This will be accomplished using Argonne’s electron microscopes and a high-resolution 3D imaging technique called X-ray microtomography.
Once the high-resolution images are taken, they can be fed to Theta. The analysis produced by Theta’s supercomputing power will provide an image of the mouse brain at a level of detail never before achieved.
“Machine learning will go through these datasets and help come up with predictive models. For this project, it can help with segmentation or reconstruction of the brain and help classify or identify features of interest,” explained Venkat Vishwanath, ALCF Data Sciences Group lead.
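As a loose illustration of what “segmentation” means in this context, the toy sketch below groups the voxels of a synthetic 3D volume into labeled regions by intensity. This is only a stand-in for the trained machine-learning models a real connectome pipeline would use; the `segment_voxels` function and the tiny test volume are invented here for illustration.

```python
import numpy as np

def segment_voxels(volume, n_labels=2, n_iter=10):
    """Toy intensity-based segmentation: a simple k-means over voxel
    intensities. Real connectome pipelines use trained models on
    tera/petabyte datasets; this only illustrates the idea of
    assigning every voxel to a labeled structure."""
    flat = volume.ravel().astype(float)
    # initialize cluster centers evenly across the intensity range
    centers = np.linspace(flat.min(), flat.max(), n_labels)
    for _ in range(n_iter):
        # assign each voxel to its nearest center
        labels = np.argmin(np.abs(flat[:, None] - centers[None, :]), axis=1)
        # move each center to the mean intensity of its voxels
        for k in range(n_labels):
            if np.any(labels == k):
                centers[k] = flat[labels == k].mean()
    return labels.reshape(volume.shape)

# a tiny synthetic "volume": dark background with one bright blob
vol = np.zeros((8, 8, 8))
vol[2:5, 2:5, 2:5] = 1.0
seg = segment_voxels(vol)  # blob voxels get one label, background another
```

The same assign-then-update loop, scaled up to learned features instead of raw intensity, is the kind of workload the article describes Theta running across entire microscope datasets.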
The Big Picture
After they complete their analysis of the adult shrewmouse’s brain, Gursoy and Kasthuri plan to move on to the brain of a larger mouse, and then perhaps a non-human primate. While the technique won’t be ready for the complexity of a human brain for some time, once it is, the implications could be tremendous.
A map of every connection in the brain could help us identify the small changes that can signal the development of neurological disorders, such as Alzheimer’s or autism. Such knowledge of the brain’s wiring could also lead to the development of better treatments for such disorders.
Furthermore, this level of brain mapping could assist us in more futuristic endeavors, such as the engineering of neurons themselves through synthetic biology or the creation of advanced brain-computer interfaces. It could also lead to the development of better artificial neural networks, which are designed to mimic how the human brain works.
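The artificial neural networks mentioned here are, at their core, layers of simple units that sum weighted inputs and “fire” through a nonlinearity, loosely echoing neurons and synapses. A minimal sketch (not the project’s code; all weights below are hand-picked for illustration) shows a two-neuron hidden layer computing the classic XOR function:

```python
import numpy as np

def relu(x):
    # the nonlinearity: a unit "fires" only when its weighted input is positive
    return np.maximum(0.0, x)

def forward(x, w1, b1, w2, b2):
    # layer 1: each hidden "neuron" sums weighted inputs plus a bias, then fires
    hidden = relu(x @ w1 + b1)
    # layer 2: the output unit sums the hidden neurons' weighted activity
    return hidden @ w2 + b2

# hand-picked weights that make this tiny network compute XOR
w1 = np.array([[1.0, 1.0], [1.0, 1.0]])
b1 = np.array([0.0, -1.0])
w2 = np.array([[1.0], [-2.0]])
b2 = np.array([0.0])

inputs = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
outputs = forward(inputs, w1, b1, w2, b2)  # → [[0.], [1.], [1.], [0.]]
```

In practice the weights are learned from data rather than set by hand, which is exactly where detailed maps of real neural wiring could inform better network designs.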
Ultimately, as Vishwanath points out, the ALCF team’s research could inspire other projects that take advantage of supercomputing for seemingly impossible endeavors:
“By developing and demonstrating rapid analysis techniques, such as data mining, graph analytics, and machine learning, together with workflows that will facilitate productive usage on our systems for applications, we will pave the way for more and more science communities to use supercomputers for their big data challenges in the future.”