Features

doi:10.1038/nindia.2012.52 Published online 16 April 2012

Virtual Cell: computing human brain

Building an all-inclusive computational model of the human brain is one of the final frontiers of science. Pawan Dhar explains how it might help scientists unravel mysteries of brain functions, molecule-by-molecule.

Creating a virtual brain will be one of the final frontiers of science.
© Getty

Going by the rapid progress in science and technology, it would be no big surprise to see a virtual three-dimensional replica of the human brain living inside the most powerful supercomputers within a few years.

Consider the Blue Brain Project, a collaboration between the Brain Mind Institute and IBM. It aims to build a complete functional model of the human brain, integrating every aspect of brain anatomy and function from the molecular to the tissue level. Project leader Henry Markram says they already have a primitive prototype in place.

The human brain weighs approximately 1.5 kilograms and uses only 20 watts of power. By dissolving the brain's semi-solid tissue into a soupy solution and counting the cell nuclei, researchers estimate the human brain's inventory at 85 billion cells.

Interestingly, the body allocates 20-25% of its total energy budget to maintaining brain activity. During development, the brain adds hundreds of thousands of cells every minute, reaching its peak weight around age 20. The brain is made of sensory neurons (which respond to light, sound, and mechanical and chemical stimuli), motor neurons (which control the activity of muscles), interneurons (which mediate simple reflexes) and glial cells (which support neurons and contribute to the development of the brain).

Crunching data to make a human brain

To recreate a 1,400 gram human brain in silico, one must manage enormous amounts of textual, audio, visual, temporal and spatial data. Is it possible to build a molecular and cellular model of the brain that performs all brain functions with reliable accuracy? Nobody has an answer yet.

In 1936, Alan Turing, the father of computer science, imagined computability in terms of the states of the human mind. He was convinced that a sufficiently large computer could demonstrate all the mental functions of the human brain; in fact, he once told his assistant that he was building a brain. In 1946 he compiled a report containing detailed circuit diagrams for building an Automatic Computing Engine (ACE), based on his view of how the human brain functions.

Turing also suggested how computers could modify their own programmes. The Turing machine, a logical computational engine built on simple rules, can in principle run forever and still serves as a model of mathematical computation.
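The idea is simple enough to sketch in a few lines of code. Below is a minimal, illustrative Turing machine written as a rule table; the states, symbols and the particular machine (one that inverts a binary string) are invented for this example, not drawn from Turing's report:

```python
# A tiny Turing machine: (state, symbol) -> (new symbol, head move, new state).
# This illustrative machine inverts every bit on the tape, then halts.
rules = {
    ("scan", "0"): ("1", 1, "scan"),
    ("scan", "1"): ("0", 1, "scan"),
    ("scan", "_"): ("_", 0, "halt"),  # blank cell: stop
}

def run(tape_str):
    tape = list(tape_str) + ["_"]   # tape with a trailing blank
    head, state = 0, "scan"
    while state != "halt":
        symbol, move, state = rules[(state, tape[head])]
        tape[head] = symbol
        head += move
    return "".join(tape).rstrip("_")

print(run("1011"))  # each bit is inverted
```

The entire "programme" lives in the `rules` dictionary, which is why a machine of this kind can, as Turing suggested, rewrite its own rules: they are just data.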

New age brains

Fast forward to the 21st century and we have petaflop computers that offer at least a thousand trillion (10^15) operations per second. Given that the brain contains approximately 90 billion cells with an average of 10,000 synapses per neuron, operating at a presumed maximal rate of 100 Hz, even the fastest supercomputer still cannot simulate whole-brain activity. What's missing?
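The arithmetic behind that comparison can be sketched as a back-of-envelope calculation, using only the round figures quoted above (purely illustrative, not a serious estimate of brain computation):

```python
# Figures from the text: ~90 billion cells, ~10,000 synapses per neuron,
# a presumed maximal firing rate of 100 Hz.
neurons = 90e9
synapses_per_neuron = 1e4
max_rate_hz = 100.0

# Rough upper bound on synaptic events per second in the brain.
brain_events_per_s = neurons * synapses_per_neuron * max_rate_hz

# A petaflop machine performs ~1e15 operations per second.
petaflop_ops_per_s = 1e15

print(f"Brain synaptic events/s: {brain_events_per_s:.1e}")
print(f"Petaflop machine ops/s:  {petaflop_ops_per_s:.1e}")
print(f"Shortfall factor:        {brain_events_per_s / petaflop_ops_per_s:.0f}x")
```

Even if one operation sufficed per synaptic event (a generous assumption), a petaflop machine falls short of the brain's raw event rate by well over an order of magnitude.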

We do not know how the brain stores, processes and retrieves data that comes in the form of text, sound, images, touch, taste and smell, tagged with space and emotion. There are probably more data formats and tags still unknown to science.

We do not know whether the brain compresses incoming data and, if so, how that data is uncompressed on demand. We do not know whether the brain periodically 'defrags' its data to make processing more efficient. However, things are beginning to change.

In the summer of 2005, the Brain Mind Institute in Switzerland and IBM in the US launched the Blue Brain Project, a bold new initiative to compute brain memory, processing and intelligence using the chassis of the Blue Gene supercomputer. Proponents praise it as the most ambitious project of the century. Opponents have not accepted the view that the brain can be represented in the form of conditional statements involving '0's and '1's.

Early theories

The classic cable theory that originated in the 1850s has found applications in neuroscience. It helps us understand how electrical signals from synapses integrate and propagate through branching dendritic tubes of varying diameter. Unfortunately, owing to the extreme difficulty of collecting data from the distal parts of a neuron, the mechanism of signal integration remains unclear.

In 1959, Wilfred Rall, one of the pioneers of computational neuroscience, proposed a framework that helps describe the transmission of signals through synapses, spatio-temporal integration and so on1. The famous Hodgkin-Huxley equations further help describe how neurons generate and propagate electrical impulses. A rapid increase in computational power during the last two decades has enabled the construction of models of hippocampal circuits2, 3.
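To give a flavour of what such single-neuron models look like in practice, here is a minimal sketch of a leaky integrate-and-fire neuron, a far simpler cousin of the Rall and Hodgkin-Huxley models; all parameter values are invented for illustration, not fitted to any real cell:

```python
# Leaky integrate-and-fire neuron, integrated with a simple Euler step.
# All parameters are illustrative, not physiological measurements.
dt = 0.1          # time step (ms)
tau = 10.0        # membrane time constant (ms)
v_rest = -65.0    # resting potential (mV)
v_thresh = -50.0  # spike threshold (mV)
v_reset = -65.0   # reset potential after a spike (mV)
r_m = 10.0        # membrane resistance (megaohm)
i_ext = 2.0       # constant injected current (nA)

v = v_rest
spikes = 0
for step in range(int(500 / dt)):           # simulate 500 ms
    dv = (-(v - v_rest) + r_m * i_ext) / tau
    v += dv * dt                            # membrane voltage drifts up
    if v >= v_thresh:                       # threshold crossed: spike
        spikes += 1
        v = v_reset                         # and reset
print(f"Spikes in 500 ms: {spikes}")
```

Whole-brain simulation would mean integrating tens of billions of such equations (and far richer ones) simultaneously, coupled through trillions of synapses.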

To scale modelling up to the whole-brain level, one would need to consider tens of billions of neurons, each receiving inputs from thousands of other neurons and branching at the dendritic end into several thin appendages. This calls for the right modelling approach, customised to the brain's physiologically unique processes, a high-performance computing platform, and the ability to perform experiments that regularly validate observations and improve the model.
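A rough sense of the scale comes from the storage problem alone. Assuming (purely for illustration) ten bytes of state per synapse, the brain's connection table by itself runs to petabytes:

```python
# Hypothetical storage estimate for a whole-brain synapse table.
# The ten-bytes-per-synapse figure is an assumption for illustration only.
neurons = 90e9              # tens of billions of neurons
synapses_per_neuron = 1e4   # thousands of inputs each
bytes_per_synapse = 10      # assumed: target index + weight + state

total_bytes = neurons * synapses_per_neuron * bytes_per_synapse
print(f"Synapse table alone: ~{total_bytes / 1e15:.0f} PB")
```

And that is before any dynamics are computed: every one of those entries must also be read and updated as the simulation runs.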

The Blue Brain project

This is where the Blue Brain Project comes in. Henry Markram, Principal Investigator of the Blue Brain Project, has pitched for one billion euros in funding to create an in silico model of the human brain with support from IBM. The approach is to mine all published data and databases, build a statistical model, look for interesting patterns, identify knowledge gaps, make predictions and follow up with specific experiments.

Once the first version is complete, the whole-brain model will have to be trained with diverse inputs to make the right decisions. The scale of computing required to accomplish the goal will be frighteningly enormous. Markram says that even on an exascale supercomputer (1,000 times more powerful than petascale machines, expected to be introduced by 2020), the virtual brain would run, at best, in slow motion.

The Blue Brain model will enable people to look into the brain at all levels, from molecules to neuronal microcircuits to brain regions to the whole brain. The entire brain inventory will participate in the model and be used to understand the approximately 560 brain diseases that affect humans. "It will be a Hubble telescope for the brain," Markram says.

Parallel to this project, the US, Europe and China have for many years been investigating the possibility of embedding computer chips into the human brain. An expected joint outcome of the two projects is that specific cell- and tissue-level brain models will be fabricated into computer chips. Each chip would serve as a brain for a physical robot, leading to a new generation of highly intelligent electronics.

On a critical note — how does one find the right modelling approach to build a functional model? Can consciousness be bottled in a computer? Will the human mind be uploaded into a computer for better decision making? Will a custom-designed mind be available on subscription? Will the ultra-intelligent computer be a boon or a threat?

Nobody has an answer yet. It is a bold new science with far reaching consequences, both exciting and frightening.

This article is the seventh and final in a series entitled 'Virtual Cell'.


References

  1. Rall, W. Branching dendritic trees and motoneuron membrane resistivity. Exp. Neurol. 1, 491-527 (1959)
  2. Traub, R. D. et al. Cellular mechanisms of neuronal synchronization in epilepsy. Science 216, 745-747 (1982)
  3. Traub, R. D. et al. Computer simulation of carbachol-driven rhythmic population oscillations in the CA3 region of the in vitro rat hippocampus. J. Physiol. 451, 653-672 (1992)