Main

The auditory areas consist of the primary auditory cortex and the auditory association area (the supratemporal gyrus). The neural network that projects from the inner ear to the primary auditory cortex forms without any auditory input, whereas the post-processing neurons develop by learning from appropriate neural input. The learning period for the mother tongue is thought to end at five to six years of age1. Reduced auditory input during this critical language-learning period can severely limit a child's potential for developing an effective communication system2. ‘Pre-lingual deaf’ patients, who were deafened before acquiring language, communicate using sign language.

In an attempt to understand how these auditory areas function in the congenitally deaf, we used positron emission tomography (PET) to measure cortical activation during a sign-language task. The main experiment sought to localize the ‘sign-language’ areas; a secondary experiment was designed to localize both the previously dormant auditory areas and the visual areas.

In the main experiment, the subject viewed a video of sign-language words being signed by a native signer; in the control task, he viewed a still frame of the video. PET images were analysed using statistical parametric mapping software3, and the resulting maps were superimposed onto magnetic resonance images of the subject's brain for spatial localization. We found that sign language activated the supratemporal gyri bilaterally (left, z=4.52; P=0.005, corrected; Fig. 1).
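To make the reported statistics concrete, the sketch below shows one simple way a peak z score can be related to a corrected P value. It is an illustration only, not the authors' pipeline: statistical parametric mapping derives corrected P values from Gaussian random-field theory rather than the Bonferroni-style correction used here, and the number of independent tests (`n_resels`) is a hypothetical value chosen for the example.

```python
# Minimal sketch: peak z score -> P values (illustrative, not the SPM method).
from scipy.stats import norm

z = 4.52                    # peak z score reported for the left supratemporal gyrus
p_uncorr = norm.sf(z)       # one-tailed uncorrected P, ~3.1e-6

n_resels = 1500             # hypothetical number of independent tests (assumption)
p_corr = min(1.0, p_uncorr * n_resels)   # Bonferroni-style correction

print(f"uncorrected P = {p_uncorr:.1e}")  # ~3.1e-06
print(f"corrected P   = {p_corr:.1e}")    # ~4.6e-03, same order as the reported P=0.005
```

With a z score this large, even a conservative correction over thousands of voxel-level tests leaves the activation well below conventional significance thresholds.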

Figure 1: Activation of areas of the brain.

The activated areas were superimposed onto the three horizontal sections (10 mm below, and 4 mm and 8 mm above, the intercommissural plane) of the subject's magnetic resonance image. Yellow areas were activated by sign language in the main experiment; green areas were activated by audition, and blue areas by vision, in the secondary experiment.

The subject was scheduled to receive a cochlear implant in his left ear. The implant is an artificial prosthesis, inserted into the inner ear, that electrically stimulates the cochlear nerve and enables the profoundly deaf to hear sounds. To distinguish the supratemporal gyri from the visual and dormant auditory areas, a secondary experiment, consisting of an auditory task, a visual task and rest, was performed after the operation. In the visual task, the subject watched a video of someone moving both hands up and down in a meaningless manner. In the auditory task, spoken words from a tape recording were delivered through the cochlear implant. The visual stimulation activated the visual cortex in the occipital lobe (P=0.001, corrected), and the auditory stimulation activated the right primary auditory cortex, contralateral to the auditory input (P=0.002, uncorrected) (Fig. 1).

Pre-lingual deaf people can hear sounds when a cochlear implant is switched on, but this does not allow them to understand words. Language stimulation through the implant activates only the primary auditory cortex in the pre-lingual deaf, whereas in the post-lingual deaf it activates both the primary and the secondary auditory areas4. The result of our secondary experiment is compatible with these findings. A study of native signers and of those who learnt sign language later showed that the nature and timing of sensory and linguistic experience significantly affect the development of the language systems of the brain5.

In bilingual subjects (those with both signed and spoken language), sign language activates the visual areas6, whereas our study showed activation of the auditory area during the sign-language task. Because our subject had never received auditory input while the neural network was being formed, it seems that the supratemporal gyrus was engaged in processing sign language. Using sign language elicits considerable activation of Broca's and Wernicke's areas in the left hemisphere, as well as of the right hemisphere7, whereas our results indicated only limited activation of Wernicke's area by sign-language words.

This cross-modal plasticity is also seen in the visual areas. Braille-reading blind subjects show activation of the primary and secondary visual cortical areas when they perform tactile tasks8, although congenitally blind Braille readers show activation of visual reading areas but not of the primary visual cortex9. Our results indicate that the primary auditory cortex of deaf people is reserved for hearing sounds, whereas the secondary areas are used for processing sign language. This cross-modal non-plasticity of the primary auditory cortex is supported by functional magnetic resonance imaging of a congenitally deaf subject10, suggesting that the primary projection areas might be rigidly organized.

We observed that sign language activates the ‘language’ areas but not the primary auditory cortex. The finding that, after a cochlear implant is in place, spoken words activate the primary auditory cortex but not the adjacent language areas indicates that the primary auditory cortex still functions as an auditory area in this patient. We also identified the ‘sign-language’ area as the supratemporal gyri, which usually serve as the auditory association area.