
New light on language

DANIEL McCABE | Attention all editors of psychology and neuroanatomy textbooks: You may have some serious revising to do.

A new study by a McGill team led by psychology professor Laura Ann Petitto and Robert Zatorre, a neuropsychologist from the Montreal Neurological Institute, upsets the applecart in a big way when it comes to what we know about the organization of language in the brain.

According to the study's findings, presented last week at a meeting of the Society for Neuroscience in New Orleans, the same parts of the brain that interpret words and parts of words in speech also process information about words conveyed through signed language.

These findings challenge current notions about which parts of the brain do what. "This raises many questions about the function of brain tissue we thought we understood very well," says Petitto.

They also bolster Petitto's hypothesis, generated from previous studies, that the same neural mechanism is responsible for overseeing the development of both spoken and signed language.

The new study used positron emission tomography (PET) to probe the brain activity of two distinct groups of research subjects. The first consisted of 11 profoundly deaf individuals. The other group was made up of 10 hearing people. Petitto's team carefully selected adults who were firmly entrenched in a world of either spoken or signed communication.

"Our deaf subjects were born deaf and grew up using signed language. Our hearing subjects had no knowledge of signed languages."

The study, which also involved research assistants Kristine Gauna, Deanna Dostie and student Jim Nikelski, aimed to see inside the brains of both signers and speakers to compare the neural activity connected to each kind of language.

The prevailing view regarding the organization of language in the brain is that speech and sound are critical to language processing and language acquisition. The fact that the brain tissue known to process certain aspects of language (information at the phonetic level, for instance) is located in the left temporal lobe near the ear is seen by many scientists as proof positive that this prevailing view is true.

Petitto, who is also a member of the MNI's McDonnell-Pew Centre for Cognitive Neuroscience, has been studying language acquisition for 20 years. She suspected the story behind language development might not be so cut and dried.

In 1991, she published an eye-opening paper in Science, describing how profoundly deaf babies exposed to signed languages from birth babble with their hands in the same manner that hearing babies babble verbally. Both hearing and deaf babies experimented with and learned to use language in very similar ways, noted Petitto.

She discovered that profoundly deaf babies reached the same milestones at the same time as hearing children did in terms of their evolving mastery of language.

Both hearing and deaf babies entered into a "syllabic babbling stage" at about 7 to 10 months of age. At 11 to 14 months, signing and speaking children both reached the "first word stage."

Says Petitto, "The question this raised was: How is it possible? If language acquisition is so dependent on speech, how could the profoundly deaf children be matching the hearing children milestone for milestone? The only way it could be explained was if there was some common mechanism at work for both spoken and signed language."

This hypothesis led Petitto to suspect that in fully formed adult brains, "the functional neuroanatomy of sign and speech should be the same."

To take her studies a step further, Petitto joined forces with Zatorre, a brain imaging expert and one of the world's top authorities on the portions of the brain linked to processing music and speech.

Using PET, they focused on the parts of the brain that have been firmly established as being connected to speaking and hearing.

"Other researchers had noted that the left hemisphere was responsible for language. We wanted to understand why," says Petitto.

"We used the existence of signed languages as an 'experiment in nature.' They are natural languages, but they have evolved in the absence of sound. Where would these languages be lateralized? If they were organized in the same places as spoken languages, this would challenge the notion that this particular brain tissue is exclusively dedicated to sound processing, and it would suggest that these sites may be doing something else."

During the course of their study, Petitto and Zatorre did in fact observe increased blood flow in these parts of the brain during signed language. A surge in brain activity during signed language was even detected in an area of the brain called the superior temporal gyrus, an area connected to the auditory nerves.

This part of the brain plays a critical role in making sense of the information we hear: it processes speech. The discovery that it also has something to do with processing signed language came out of left field.

"This was really stunning evidence. A part of the brain that is responsible for sound was being activated in people who never heard sound," says Petitto. "According to all the textbooks, that part of the brain processes speech and sound. Period."

Petitto says there are two possible explanations. Either this area of the brain is far more complex than scientists originally believed, containing both visual and sound cells and playing a role in deciphering patterns in all forms of language. Or it may be able to adjust in cases where people can't communicate verbally, recruiting visually oriented cells to deal with signed language.

In general, the findings support Petitto's hypothesis that people aren't born to speak so much as they're born to acquire and use language.

She says she owes the deaf community a debt of thanks for their willingness to take part in her studies over the years. "If they didn't support our work, we wouldn't have a finding." One of the research assistants for the most recent study, Kristine Gauna, is profoundly deaf herself.

The funding support of the Natural Sciences and Engineering Research Council, the Medical Research Council and the McDonnell-Pew Centre for Cognitive Neuroscience was also instrumental.

Petitto is excited about taking her most recent findings to the next stage and probing the function of the superior temporal gyrus further. "On the plane ride back from New Orleans, Robert and I tried to think about other things, but we just couldn't stop ourselves. We kept imagining new experiments we could design."

The way Petitto sees it, there is no time to lose. "This is the decade of the brain. There is a ferocious attempt to chart its territory. We're all trying to crack the code for the brain."