Mimicking the Brain: Using Computers to Investigate Neurological Disorders
Lisa Seachrist, Science News
Deep within the brain a single neuron fires. That electrical signal triggers a biochemical chain reaction that courses from neuron to neuron, ultimately forming a set of connections that brings alive a scenic vista, a child's touch, or the memory of a long-ago event. Arresting any part of that signal devastates the cognitive activities that appear to make us human.
While the speed and precision of the human brain lead some people to refer to it as the ultimate computer, the brain maintains a distinct advantage over the computer--resilience. When crucial interactions between neurons falter, the brain reroutes signals in an attempt to maintain the ability to think, remember, and perceive. "When you damage just one small part of the computer, the whole thing will collapse," says neurologist and computer scientist James Reggia of the University of Maryland in College Park. "The brain is very different. It is able to adjust its own circuitry."
Despite this resilience, the brain has its limitations. Neurological diseases such as Alzheimer's and Parkinson's cause progressive losses of vital cognitive functions that no degree of brain-initiated rewiring can repair.
Scientists do not know why some conditions spur the brain to large-scale reorganization of the synapses, or junctions between neurons, whereas others result in permanent damage. The problem lies in a basic dichotomy in neuroscience: Remarkable gains in elucidating the way neurons communicate with each other on the molecular level simply haven't explained the biology of how we think, sense, and feel.
For the past decade, researchers have employed a controversial tool to decipher this puzzle: computer systems known as neural networks. These networks simulate elementary, but poorly understood, brain functions such as reading and language (SN: 11/26/88, p. 344). Scientists exploring artificial intelligence have also made extensive use of neural networks. Now, some researchers are using them to model disorders of the brain, with an eye to discovering better therapeutic strategies.
Reggia organized a workshop at the University of Maryland in June to explore ways in which computational models of brain disorders ranging from phantom limb pain to stroke to Alzheimer's would enable scientists to test theories of how and why the brain responds to disease and trauma.
"The complexity of the brain makes it necessary that we use computational models to understand how disease affects the brain," says Reggia. Otherwise, "it's almost like trying to understand the climate without using computer models."
Psychiatrist-turned-computer-modeler Manfred Spitzer of Heidelberg University in Germany used neural modeling to tackle the enigma of phantom limb pain.
For over a hundred years, physicians have reported that amputees not only continue to feel their amputated limbs, they often suffer cramping, burning, and shooting pains in specific regions of those limbs. Researchers have traced the origin of such pains to reorganization of the brain area that formerly processed sensations in the absent limb. As neurons in that area adapt to some new purpose, their activity manifests itself as phantom pain (SN: 6/10/95, p. 357).
Spitzer, however, questioned just how such a reorganization would occur. Paraplegics, like amputees, suffer loss of stimulation from large sections of their bodies, and presumably their brains contain areas that cease activity for want of stimulation and become ripe for reorganization. Yet the paralyzed don't suffer phantom limb pain.
The cortex of the brain creates specific areas that both receive neural impulses from various parts of the body and issue instructions to them. Spitzer and his colleagues developed a neural network that mimics this mapping electronically. When the team deprived the network of a specific input, as might happen after amputation of a limb or loss of stimulation as a result of paralysis, the areas of the network responsible for that input didn't undertake any new functions, says Spitzer. …
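The article does not describe the internal workings of Spitzer's model, but cortical body maps of this kind are commonly simulated with Kohonen-style self-organizing maps. The sketch below is an illustrative assumption, not the team's actual code: a grid of "cortical" units learns to cover a two-dimensional body surface, and then inputs from half of that surface are silenced, mimicking deafferentation after amputation or paralysis. All parameters (grid size, learning rate, neighborhood width) are invented for the example.

```python
# Illustrative sketch only: a generic Kohonen self-organizing map standing in
# for a cortical body map. Spitzer's actual model details are not given in the
# article, so grid size, learning rates, and neighborhood width are made up.
import numpy as np

rng = np.random.default_rng(0)

GRID = 10  # a 10x10 sheet of "cortical" units
# Each unit's weight vector is the point on a 2-D "body surface" it responds to.
weights = rng.random((GRID, GRID, 2))

def train(weights, samples, lr=0.2, sigma=1.5):
    """One Kohonen pass: for each input, find the best-matching unit and
    pull it and its grid neighbors toward that input."""
    ys, xs = np.indices((GRID, GRID))
    for s in samples:
        d = ((weights - s) ** 2).sum(axis=2)          # distance of every unit to the input
        wy, wx = np.unravel_index(np.argmin(d), d.shape)  # winning unit
        h = np.exp(-((ys - wy) ** 2 + (xs - wx) ** 2) / (2 * sigma ** 2))
        weights += lr * h[:, :, None] * (s - weights)     # neighborhood update
    return weights

# Normal development: stimulation arrives from the whole body surface.
body = rng.random((2000, 2))
weights = train(weights, body)

# "Deafferentation": inputs from one region (x > 0.5) stop arriving, as after
# amputation or paralysis; only the intact surface continues to drive learning.
intact = body[body[:, 0] <= 0.5]
weights = train(weights, intact, lr=0.05)

# Counting units now tuned to the intact surface shows how much of the map
# the remaining inputs have recruited after deprivation.
recruited = (weights[:, :, 0] <= 0.5).mean()
print(f"fraction of map tuned to intact surface: {recruited:.2f}")
```

Whether the deprived units retune in such a model depends on details like lateral connectivity and spontaneous activity, which is exactly the kind of question a simulation of this sort lets researchers probe.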