From rxpgnews.com
Mapping attention, memory and language links in human brain
By University of Arizona
May 20, 2007 - 3:59:37 AM
A University of Arizona scientist who has specialized in studying how fireflies and other creatures communicate has won a million-dollar grant to conduct a pioneering 5-year study on the roles that attention and memory play when the human brain hears and processes spoken language.
"This is the chance to study the ultimate form of animal communication -- language," said Thomas A. Christensen of UA's department of speech, language and hearing sciences (SLHS). "Humans have evolved a very sophisticated symbolic form of communication. Language affects how we think, what we believe, how we interact with each other. I'd even go so far as to say that our future as a species depends on understanding how we communicate. But very little is known about what's going on in the brain when we're having a simple conversation."
Until recently, Christensen was a research scientist with the Arizona Research Laboratories' Division of Neurobiology, studying olfactory communication (the sense of smell) in insects. His research is grounded in the areas of learning and memory, systems physiology and animal communication. Encouraged by Elena Plante, head of the SLHS department, he applied for a $1 million career development award from the National Institute on Deafness and Other Communication Disorders. The grant was awarded in April.
The grant will take his career -- and biomedical science -- in new directions. Christensen will use UA's state-of-the-art magnetic resonance imaging (MRI) facilities to map the areas and networks within the brain linked to language, attention and memory. The UA's advanced MRI is a non-invasive imaging tool that is sensitive enough to show exactly what parts of the brain are involved when a person listens to another human voice.
"What you read in the textbooks is that if you're right-handed, then language is localized to the left hemisphere of your brain," Christensen said. "I found out right away -- that's just not true. Analyzing a human voice also involves the right hemisphere and even parts of the cerebellum." The cerebellum is a large part of the brain that coordinates voluntary movement, posture and balance in humans.
"These MRI images destroy the myth that you're only using about 10 percent of your brain for any particular task," Christensen said. "The crux of this grant is to learn more about the language, attention and memory centers in the brain, and also about the complex interactions between them."
Inside the scanner, volunteer subjects don headphones and perform simple language discrimination tasks in Christensen's experiments. They're asked to respond by pressing a button when they hear words that fall into a certain semantic category -- the name of an animal, for example. Then, to make the task a bit harder, subjects are asked to respond only when they hear a woman's voice speak a word in the chosen category. The task taxes attention even more when subjects are asked to respond to a woman's voice speaking a 'target' word in one ear at the same time a man's voice is speaking words in the other.
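The logic of these tasks is easy to state in code. The sketch below, in Python, shows how a target decision of this kind might be scored; the word list, category and voice labels are hypothetical stand-ins for illustration, not details taken from Christensen's experiments.

```python
# Illustrative sketch only -- not the study's actual code. It shows how a
# target/non-target decision might be scored in a task like the one described.
# The word set, category and voice labels are hypothetical placeholders.

ANIMAL_WORDS = {"dog", "horse", "sparrow"}  # assumed semantic category

def is_target(word, voice, condition):
    """Return True if the spoken word should trigger a button press."""
    if condition == "semantic":           # respond to any animal name
        return word in ANIMAL_WORDS
    if condition == "semantic+voice":     # respond only when a woman speaks it
        return word in ANIMAL_WORDS and voice == "female"
    raise ValueError(f"unknown condition: {condition}")

# Example trial stream: (word, speaker) pairs presented over headphones
trials = [("dog", "female"), ("table", "male"), ("horse", "male")]
print([is_target(w, v, "semantic+voice") for w, v in trials])  # [True, False, False]
```

Each added rule narrows what counts as a target, which is what makes the harder conditions tax attention more heavily.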
The MRI scanner records activity throughout the 45-minute sessions, revealing multiple regions and networks, some deep within the brain, that scientists didn't suspect were involved when the brain listens.
"We're getting a snapshot of what that activity is across the population," Christensen said. "What's so striking is how clearly we see that certain areas of the brain are strongly engaged in attentional control while other areas are not. As we scan more volunteers, we're definitely beginning to see a pattern here."
Christensen's research on the brain-governing system we call attention -- how the brain selects only some information from its environment and focuses awareness on objects and events relevant to immediate goals -- is profoundly relevant to disorders such as schizophrenia, ADHD and many other impairments that affect language abilities.
"ADHD (Attention Deficit Hyperactivity Disorder) is probably one of the most over-diagnosed disorders of our time," Christensen said. "The reason for that, I think, is that we really don't know very much about the biological basis of this syndrome. There's a lot of research on it, but there's still a lot of disagreement about what the root cause is, about whether drugs like Ritalin that are being prescribed to children as young as 2 years old are doing any good, and about whether we have any business exposing our children to drugs at such an early age," he added.
As Christensen collects more MRI data showing the connections among brain areas that are strongly engaged in language tasks, he plans to collaborate with computer modeling experts. "We could develop a mathematical model that would allow us to generate hypotheses about what we expect if we deliver a certain type of stimulus. We'd see what effect it would produce in our model."
"Simulating brain activity in the mathematical model would take the whole question of language processing beyond 'blobology' -- where you're just looking at blobs of activation in the brain. That's what I hope to do," Christensen said.
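The article gives no details of the planned model, but the general idea -- deliver a stimulus to a small network of connected regions and watch how activity spreads -- can be sketched in a few lines. The regions, connection weights and update rule below are purely illustrative assumptions, not Christensen's model.

```python
import numpy as np

# Purely illustrative toy network: a few labeled regions whose activity is
# driven by an input stimulus and by weighted connections to one another.
# Region names, weights and dynamics are assumptions made for this sketch.
regions = ["auditory", "attention", "memory", "language"]
W = np.array([            # hypothetical connection weights (row drives column)
    [0.0, 0.3, 0.1, 0.4],
    [0.1, 0.0, 0.2, 0.2],
    [0.0, 0.1, 0.0, 0.1],
    [0.1, 0.2, 0.2, 0.0],
])

def simulate(stimulus, steps=10, decay=0.5):
    """Propagate activation from a stimulus vector through the network."""
    a = np.zeros(len(regions))
    for _ in range(steps):
        # leaky accumulation of the input plus recurrent influence,
        # squashed by tanh so the activity stays bounded
        a = np.tanh(decay * a + stimulus + a @ W)
    return a

# "Deliver a certain type of stimulus": here, input only to the auditory region
final = simulate(np.array([1.0, 0.0, 0.0, 0.0]))
print(dict(zip(regions, final.round(2))))
```

A model of this kind lets a researcher change one connection or one input and predict which regions should respond more strongly -- exactly the sort of hypothesis that subsequent MRI scans could then test.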
All rights reserved by RxPG Medical Solutions Private Limited ( www.rxpgnews.com )