Abstract
The upper regions of the brain's temporal lobe are important both for hearing and for comprehending spoken language. We have discovered that these regions can be activated by sign language in congenitally deaf subjects, even though the temporal lobe normally functions as an auditory area. This finding indicates that, in deaf people, the brain region usually reserved for hearing may be activated by other sensory modalities, providing striking evidence of neural plasticity.
Main
The auditory areas consist of the primary auditory cortex and the auditory association area (the supratemporal gyrus). The neural network that projects from the inner ear to the primary auditory cortex forms without any auditory input, whereas the post-processing neurons develop through learning from appropriate neural input. The learning period for the mother tongue is thought to end at five to six years of age [1]. Reduced auditory input during this critical language-learning period can severely limit a child's potential for developing an effective communication system [2]. ‘Pre-lingual deaf’ patients, who were deafened before acquiring language, communicate using sign language.
In an attempt to understand how these auditory areas function in the congenitally deaf, we used positron emission tomography (PET) to measure cortical activation during a sign-language task. The main experiment sought to localize the ‘sign-language’ areas; a secondary experiment was designed to localize both the dormant auditory areas and the visual areas.
In the main experiment, the subject viewed a video of sign-language words produced by a native signer; in the control task, he viewed a still frame of the same video. PET images were analysed using statistical parametric mapping software [3], and the resulting activation maps were superimposed onto magnetic resonance images of the subject's brain for spatial localization. We found that sign language activated the supratemporal gyri bilaterally (left, z = 4.52; P = 0.005, corrected; Fig. 1).
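As an aside on the statistics reported above: the z score produced by statistical parametric mapping corresponds to an uncorrected one-tailed p-value via the standard-normal survival function, while the corrected P value (here P = 0.005) additionally accounts for the multiple comparisons made over the search volume, a correction this sketch does not attempt. A minimal illustration in Python (the function name `z_to_p` is our own, not part of any SPM software):

```python
import math

def z_to_p(z: float) -> float:
    """One-tailed p-value for a standard-normal z score (survival function)."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# z = 4.52, as reported for the left supratemporal gyrus in the main experiment
p_uncorrected = z_to_p(4.52)
print(f"uncorrected one-tailed p ~ {p_uncorrected:.1e}")
```

The uncorrected p for z = 4.52 is on the order of 10^-6; the much larger corrected value of 0.005 reflects how stringent whole-brain correction is.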
The subject was scheduled to receive a cochlear implant in his left ear. The implant is an artificial prosthesis, inserted into the inner ear, that electrically stimulates the cochlear nerve and enables the profoundly deaf to hear sounds. To distinguish the sign-language activation of the supratemporal gyri from activation of the visual and dormant auditory areas, a secondary experiment consisting of an auditory task, a visual task and rest was performed after the operation. In the visual task, the subject watched a video of someone moving both hands up and down in a meaningless manner. In the auditory task, taped words were delivered through the cochlear implant. The visual stimulation activated the visual cortex in the occipital lobe (P = 0.001, corrected), and the auditory stimulation activated the right primary auditory cortex, contralateral to the auditory input (P = 0.002, uncorrected) (Fig. 1).
Pre-lingual deaf people can hear when a cochlear implant is switched on, but hearing alone does not allow them to understand words. Language stimulation through the implant activates only the primary auditory cortex in the pre-lingual deaf, whereas in the post-lingual deaf it activates both the primary and the secondary auditory areas [4]. The result of our secondary experiment is compatible with these findings. A study of native signers and of those who learnt sign language later showed that the nature and timing of sensory and linguistic experience significantly affect the development of the brain's language systems [5].
In bilingual subjects (those with both signed and spoken language), sign language activates the visual areas [6], whereas our study showed activation of the auditory area during the sign-language task. Because our subject had never received auditory input while his neural network was being formed, it seems that the supratemporal gyri were engaged in processing sign language. Using sign language elicits considerable activation of Broca's and Wernicke's areas in the left hemisphere, as well as of the right hemisphere [7], whereas our results indicated limited activation of Wernicke's area by sign-language words.
This cross-modal plasticity is also seen in the visual areas. Braille-reading blind subjects show activation of the primary and secondary visual cortical areas when they perform tactile tasks [8], although congenitally blind Braille readers show activation of visual reading areas but not of the primary visual cortex [9]. Our results indicate that the primary auditory cortex of deaf people remains reserved for hearing sounds, whereas the secondary areas are used for processing sign language. This cross-modal non-plasticity of the primary auditory cortex is supported by functional magnetic resonance imaging of a congenitally deaf subject [10], which suggests that the primary projection areas may be rigidly organized.
We observed that sign language activates the ‘language’ areas but not the primary auditory cortex. The finding that, once a cochlear implant is in place, spoken words activate the primary auditory cortex but not the adjacent language areas indicates that the primary auditory cortex still functions as an auditory area in this patient. We also identified the ‘sign-language’ area as the supratemporal gyri, which are normally auditory areas.
References
1. Osberger, M. J. et al. Ann. Otol. Rhinol. Laryngol. 100, 883–888 (1991).
2. Fitch, J. L., Williams, T. F. & Etienne, J. E. J. Speech Hear. Disord. 47, 373–375 (1982).
3. SPM96 (Wellcome Department of Cognitive Neurology, London, 1996).
4. Naito, Y. et al. Acta Otolaryngol. 117, 490–496 (1997).
5. Neville, H. J. et al. Brain Lang. 57, 285–308 (1997).
6. Soderfeldt, B. et al. Neurology 49, 82–87 (1997).
7. Neville, H. J. et al. Proc. Natl Acad. Sci. USA 95, 922–929 (1998).
8. Sadato, N. et al. Nature 380, 526–528 (1996).
9. Büchel, C., Price, C., Frackowiak, R. S. J. & Friston, K. Brain 121, 409–419 (1998).
10. Hickok, G. et al. Hum. Brain Mapping 5, 437–444 (1997).
Nishimura, H., Hashikawa, K., Doi, K. et al. Sign language ‘heard’ in the auditory cortex. Nature 397, 116 (1999). https://doi.org/10.1038/16376