Contact:
Michael Patrick Rutter
617.496.3815
Harvard Scientists Investigate How the Brain May Encode Visual Information of Natural Scenes
Cambridge, Mass. - December 13, 2004 - By mining direct recordings of neuronal activity in live animals as they viewed natural scenes, researchers in Harvard University's Division of Engineering and Applied Sciences have developed a more realistic model of how the brain encodes real-world visual information.
The work, published in the November 24 issue of The Journal of Neuroscience, could help move scientists beyond the artificial visual stimuli typically used in experiments, such as spots, bars, or sine waves, to a better understanding of how the brain processes dynamic objects such as trees swaying, cars speeding by, or joggers stretching.
"By using data that represents what animals see through their own eyes, we have a better sense of how part of the brain's visual system might encode visual information," explains Garrett Stanley, Associate Professor of Biomedical Engineering. "What's intriguing is that the same circuits may be responsible for two very different types of tasks: detection, knowing that an object is there, and transmission, getting information about the object"
The scientists used snippets from movies of common scenes to pinpoint the pattern and sequence of neuronal firing in the lateral geniculate nucleus (LGN), a layered structure in the brain's thalamus whose cells respond to form and motion. Much like a football quarterback deciding when and to whom to throw the ball, the LGN acts as a gateway between the visual world and higher cortical structures, directing the flow of information.
Stanley and co-author Nicholas A. Lesica were able to precisely correlate LGN activity (which cells were active or silent, and when) with specific sequences in the movies, such as a person walking or an object moving into and out of view. Thalamic bursts, rapid successions of spikes fired after a long period of silence, were originally thought to occur only during sleep or in periods of low arousal. Recent studies, however, have shown that bursts can also be triggered by external events such as visual stimuli during more active periods. By using natural scene data, the researchers were able to take these studies one step further and zero in on firing patterns that could not be detected with traditional artificial visual stimuli.
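For readers who want a concrete sense of what a "burst" means in spike-train data, the short Python sketch below flags burst spikes using a silence-then-rapid-firing rule. It is an illustrative toy, not the authors' analysis code; the thresholds (at least 100 milliseconds of preceding silence, inter-spike intervals of 4 milliseconds or less) are values commonly used in the thalamic literature and are assumed here only for the example.

    import numpy as np

    def detect_bursts(spike_times_ms, silence_ms=100.0, max_isi_ms=4.0):
        # A burst starts with a spike preceded by at least `silence_ms` of
        # silence and continues while successive inter-spike intervals stay
        # at or below `max_isi_ms`. Returns a boolean mask over the spikes.
        spikes = np.asarray(spike_times_ms, dtype=float)
        is_burst = np.zeros(spikes.size, dtype=bool)
        i = 0
        while i < spikes.size:
            preceding_gap = spikes[i] - spikes[i - 1] if i > 0 else np.inf
            if preceding_gap >= silence_ms:
                # Candidate burst: gather the run of closely spaced spikes.
                j = i + 1
                while j < spikes.size and spikes[j] - spikes[j - 1] <= max_isi_ms:
                    j += 1
                if j - i >= 2:  # require at least two spikes to call it a burst
                    is_burst[i:j] = True
                i = j
            else:
                i += 1
        return is_burst

    # Three isolated "tonic" spikes, then a burst after a long quiet period.
    print(detect_bursts([10.0, 60.0, 120.0, 400.0, 402.5, 405.0, 520.0]))
    # -> [False False False  True  True  True False]

Spikes labeled this way could then be aligned with the movie frames shown at the corresponding times, which is the spirit of the correlation described above.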
"These bursts may serve as a wake-up call, alerting the visual cortex to the presence of something in the receptive field and signal the start of subsequent neuronal activity," says Stanley. "Such patterns may be an important part of the complete neural code of the LGN, providing critical details, such as how an object is moving, changing contrast, and so forth, to the higher centers in the brain responsible for perception."
Ultimately, the researchers hope their work will inspire other neuroscientists to develop new ways to obtain and analyze visual and other data that capture brain activity under real-world conditions. In the future, with a better understanding of how the brain encodes everyday scenes, engineers might be able to artificially trigger a visual response or experience by sending such data from a computer through a device that interfaces directly with the brain.
Stanley's co-author was graduate student Nicholas A. Lesica, also in the Harvard Division of Engineering and Applied Sciences. Their paper builds on earlier work by Stanley with Yang Dan, Associate Professor of Neurobiology at UC Berkeley, and Fei-Fei Li, a graduate student at the California Institute of Technology.
###


