Cross modal plasticity

Cross modal plasticity can reorganize connections among the brain's four main lobes in response to sensory loss.

Cross modal plasticity is the adaptive reorganization of neurons to integrate the functions of two or more sensory systems. It is a form of neuroplasticity that often occurs after sensory deprivation due to disease or brain damage. The reorganization of the neural network is greatest following long-term sensory deprivation, such as congenital blindness or pre-lingual deafness. In these cases, cross modal plasticity can strengthen the remaining sensory systems to compensate for the lack of vision or hearing. This strengthening arises from new connections formed to brain cortices that no longer receive their normal sensory input.

Plasticity in the blind

Even though the blind can no longer see, the visual cortex remains active, although it processes information other than visual input. Studies have found that the volume of white matter (myelinated nerve connections) is reduced in the optic tract, but not in the primary visual cortex itself, whereas grey matter volume in the primary visual cortex is reduced by up to 25%. The atrophy of grey matter, which consists of neuronal cell bodies, is likely due to its association with the optic tract.[1] Because the eyes no longer relay visual information, disuse of the connected optic tract leads to a loss of grey matter volume in the primary visual cortex. White matter is thought to atrophy in the same way, although the primary visual cortex is less affected.

Blind individuals also show enhanced perceptual and attentional sensitivity in identifying different auditory stimuli, including speech sounds. The spatial detection of sound can be disrupted in the early blind by inducing a virtual lesion in the visual cortex with transcranial magnetic stimulation, indicating that the reorganized visual cortex contributes to auditory spatial processing.[2]

The somatosensory cortex is also able to recruit the visual cortex to assist with tactile sensation. Cross modal plasticity reworks the network structure of the brain, increasing the connections between the somatosensory and visual cortices. Furthermore, the somatosensory cortex acts as a hub of nerve connections in the brains of the early blind but not of the sighted.[3] With this cross-modal networking, the early blind can react to tactile stimuli with greater speed and accuracy, as they have more neural pathways available. One element of the visual system that the somatosensory cortex is able to recruit is the dorsal visual stream. The dorsal stream is used by the sighted to identify spatial information visually, whereas the early blind use it during tactile exploration of 3D objects.[4] Both sighted and blind participants used the dorsal stream to process spatial information, suggesting that cross modal plasticity in the blind re-routes the dorsal stream to work with the sense of touch rather than changing the stream's overall function.

Experience dependence

There is evidence that the degree of cross modal plasticity between the somatosensory and visual cortices is experience-dependent. In a study using tactile tongue devices to transmit spatial information, early blind individuals showed visual cortex activation after one week of training with the device.[5] Although there were no cross modal connections at the start, the early blind developed connections between the somatosensory and visual cortices, whereas sighted controls did not. Early or congenitally blind individuals also show stronger cross modal connections the earlier they began learning Braille.[6] An earlier start allows stronger connections to form, as early blind children grow up using their sense of touch, rather than sight, to read. Perhaps because of these cross modal connections, sensory testing studies have shown that tactile spatial acuity is enhanced in blindness[7][8] and that this enhancement is experience-dependent.[9]

Plasticity in the deaf

Cross modal plasticity can also occur in pre-lingually deaf individuals. A functional magnetic resonance imaging (fMRI) study found that deaf participants use the primary auditory cortex as well as the visual cortex when they observe sign language.[10] Although the auditory cortex no longer receives input from the ears, the deaf can still use specific regions of this cortex to process visual stimuli.[11] Primary sensory abilities such as brightness discrimination, visual contrast sensitivity, temporal discrimination thresholds, temporal resolution, and discrimination thresholds for motion direction do not appear to change with the loss of a modality like hearing. Higher-level processing tasks, however, may undergo compensatory changes. In the case of auditory deprivation, some of these compensations appear to affect processing of the visual periphery and detection of movement in peripheral vision.[12]

Because deaf individuals lack auditory input, the auditory cortex is instead used to assist with visual and language processing. These auditory cortex activations also appear to be attention-dependent: stronger activations occur during visual observation when deaf individuals pay attention to a visual cue, and the activations are weaker when the cue is outside the direct line of sight.[14] The process of visual attention itself in the deaf, however, is not significantly different from that of hearing subjects.[13] One study found that deaf participants process peripheral visual stimuli more quickly than hearing subjects.[15] Deafness appears to heighten spatial attention to the peripheral visual field, but not to the central one.[16] The brain thus seems to compensate for the auditory loss within its visual system by enhancing attentional resources for the peripheral field; central visual resources, however, may suffer.[17]

These improvements tend to be limited to brain areas that respond to both auditory and visual stimuli, rather than reflecting a wholesale rewiring of auditory-dedicated areas into visual ones. The visual enhancements appear to be concentrated in areas of the brain that normally integrate visual and auditory input. This is seen specifically in studies showing changes in the posterior parietal cortex of deaf individuals, a region that is both one of the main centers for visual attention and an area known for integrating information from multiple senses.[18]

Recent research indicates that in attention-based tasks such as object tracking and enumeration, deaf subjects perform no better than hearing subjects.[19] Improvement in visual processing is nonetheless observed even when a deaf subject is not attending directly to the stimulus.[20] A study published in 2011 found that congenitally deaf subjects had significantly larger neuroretinal rim areas than hearing subjects, suggesting that deaf subjects may have a greater concentration of retinal ganglion cells.[21]

Sign language

Deaf individuals often use sign language as their mode of communication. However, sign language alone does not appear to significantly change brain organization. Neuroimaging and electrophysiological studies of functional changes in visual pathways, as well as animal studies of sensory deprivation, have shown that the enhanced attention to the visual periphery found in deaf individuals is not found in hearing signers.[22]

The peripheral visual changes are seen in deaf individuals regardless of communication mode, whether they are signers or oral communicators.[23] Comparative fMRI of hearing speakers and hearing early signers, on the other hand, shows comparable peripheral activation, and the enhanced peripheral visual attention found in deaf individuals has not been found in hearing signers. It is therefore unlikely that signing itself causes the neurological differences in visual attention.[24]

Cochlear implants

Cross modal plasticity in the deaf can also be observed in the outcomes of cochlear implantation. For those who became deaf pre-lingually, cross modal plasticity interferes with their ability to process language through a cochlear implant: because the auditory cortex has been reshaped to deal with visual information, it cannot handle the new sensory input from the implant as well. For the post-lingually deaf, however, experience with visual cues such as lip reading can help them understand speech with the assistance of a cochlear implant. The post-lingually deaf show less visual recruitment of the auditory cortex than the early deaf, so they perform better with cochlear implants.[25] It was also found that the visual cortex was activated only when the sounds received had potential meaning; for instance, the visual cortex activated for words but not for vowels.[26] This activation is further evidence that cross modal plasticity is attention-dependent.

Plasticity after olfactory deficit or whisker trimming

Cross modal plasticity can also be mutually induced between two sensory modalities. For instance, deprivation of olfactory function upregulates whisker tactile sensation, and, conversely, trimming of the whiskers upregulates olfactory function. At the cellular level, coordinated plasticity between cortical excitatory and inhibitory neurons is associated with these upregulations of sensory behavior.[27][28][29]

References

  1. Ptito, M; Schneider, FCG; Paulson, OB; Kupers, R. (2008). "Alterations of the visual pathways in congenital blindness". Exp. Brain Res. 187 (1): 41–49. doi:10.1007/s00221-008-1273-4. PMID 18224306.
  2. Collignon, O; Davare, M; Olivier, E; De Volder, AG. (2009). "Reorganization of the right occipito-parietal stream for auditory spatial processing in early blind humans. A transcranial magnetic stimulation study". Brain Topogr. 21 (3–4): 232–240. doi:10.1007/s10548-009-0075-8. PMID 19199020.
  3. Shu, N; Liu, Y; Li, J; Yu, C; Jiang, T. (2009). "Altered anatomical network in early blindness revealed by diffusion tensor tractography". PLoS ONE. 4 (9): e7228. doi:10.1371/journal.pone.0007228.
  4. Bonino, D; Ricciardi, E; Sani, L; Gentili, C; Vanello, N; Guazzelli, M; Vecchi, T; Pietrini, P. (2008). "Tactile spatial working memory activates the dorsal extrastriate cortical pathway in congenitally blind individuals". Arch. Ital. Biol. 146 (3–4): 133–146. PMID 19378878.
  5. Ptito, M; Matteau, I; Gjedde, A; Kupers, R. (2009). "Recruitment of the middle temporal area by tactile motion in congenital blindness". NeuroReport. 20 (6): 543–47. doi:10.1097/wnr.0b013e3283279909.
  6. Liu, Y; Yu, C; Liang, M; Tian, L; Zhou, Y; Qin, W; Li, K; Jiang, T. (2007). "Whole brain functional connectivity in the early blind". Brain. 130 (8): 2085–96. doi:10.1093/brain/awm121.
  7. Goldreich, D; Kanics, IM (2003). "Tactile acuity is enhanced in blindness". Journal of Neuroscience. 23 (8): 3439–45. PMID 12716952.
  8. Goldreich, D; Kanics, IM (2006). "Performance of blind and sighted humans on a tactile grating detection task". Perception & Psychophysics. 68 (8): 1363–71. doi:10.3758/bf03193735. PMID 17378422.
  9. Wong, M; Gnanakumaran, V; Goldreich, D (2011). "Tactile spatial acuity enhancement in blindness: evidence for experience-dependent mechanisms". Journal of Neuroscience. 31 (19): 7028–37. doi:10.1523/jneurosci.6461-10.2011. PMID 21562264.
  10. Lambertz, N; Gizewski, ER; de Greiff, A; Forsting, M. (2005). "Cross-modal plasticity in deaf subjects dependent on extent of hearing loss". Cognit Brain Res. 25 (3): 884–90. doi:10.1016/j.cogbrainres.2005.09.010.
  11. Lomber, SG; Meredith, M. A.; Kral, A. (2010). "Cross-modal plasticity in specific auditory cortices underlies visual compensations in the deaf". Nature Neuroscience. 13 (11): 1421–1427. doi:10.1038/nn.2653. PMID 20935644.
  12. Corina, D.; Singleton, J. (2009). "Developmental Social Cognitive Neuroscience: Insights From Deafness". Child Development. 80 (4): 952–967. doi:10.1111/j.1467-8624.2009.01310.x. PMID 19630887.
  13. Dye, MWG; Baril, D. E.; Bavelier, D (2007). "Which aspects of visual attention are changed by deafness? The case of the Attention Network Test". Neuropsychologia. 45 (8): 1801–1811. doi:10.1016/j.neuropsychologia.2006.12.019. PMID 17291549.
  14. Fine, I; Finney, EM; Boynton, GM; Dobkins, K. (2005). "Comparing effects of auditory deprivation and sign language within the auditory and visual cortex". J Cogn Neurosci. 17 (10): 1621–37. doi:10.1162/089892905774597173. PMID 16269101.
  15. Bottari, D; Caclin, A; Giard, M-H; Pavani, F (2011). "Changes in Early Cortical Visual Processing Predict Enhanced Reactivity in Deaf Individuals". PLoS ONE. 6 (9): e25607. doi:10.1371/journal.pone.0025607.
  16. Bavelier, D.; Dye, M. W.; Hauser, P. C. (2006). "Do deaf individuals see better?". Trends in Cognitive Science. 10 (11): 512–18. doi:10.1016/j.tics.2006.09.006.
  17. Proksch, J.; Bavelier, D. (2002). "Changes in the spatial distribution of visual attention after early deafness". Journal of Cognitive Neuroscience. 14 (5): 687–701. doi:10.1162/08989290260138591. PMID 12167254.
  18. Bavelier, D.; Dye, M. W.; Hauser, P. C. (2006). "Do deaf individuals see better?". Trends in Cognitive Science. 10 (11): 512–18. doi:10.1016/j.tics.2006.09.006.
  19. Hauser, PC; Dye, M. W. G.; Boutla, M.; Green, C. S.; Bavelier, D (2007). "Deafness and visual enumeration: not all aspects of attention are modified by deafness". Brain Research. 1153: 178–187. doi:10.1016/j.brainres.2007.03.065. PMC 1934506. PMID 17467671.
  20. Armstrong, BA; Neville, H. J.; Hillyard, S. A.; Mitchell, T. V. (2002). "Auditory deprivation affects processing of motion, but not color". Cognitive Brain Research. 14 (3): 422–434. doi:10.1016/s0926-6410(02)00211-2. PMID 12421665.
  21. Codina, C; Pascalis, O.; Mody, C.; Rose, J.; Gummer, L.; Buckley, P.; Toomey, P. (2011). "Visual Advantage in Deaf Adults Linked to Retinal Changes". PLoS ONE. 6 (6): e20417. doi:10.1371/journal.pone.0020417.
  22. Corina, D.; Singleton, J. (2009). "Developmental Social Cognitive Neuroscience: Insights From Deafness". Child Development. 80 (4): 952–967. doi:10.1111/j.1467-8624.2009.01310.x. PMID 19630887.
  23. Proksch, J.; Bavelier, D. (2002). "Changes in the spatial distribution of visual attention after early deafness". Journal of Cognitive Neuroscience. 14 (5): 687–701. doi:10.1162/08989290260138591. PMID 12167254.
  24. Bavelier, D.; Dye, M. W.; Hauser, P. C. (2006). "Do deaf individuals see better?". Trends in Cognitive Science. 10 (11): 512–18. doi:10.1016/j.tics.2006.09.006.
  25. Doucet, ME; Bergeron, F; Lassonde, M; Ferron, P; Lepore, F. (2006). "Cross-modal reorganization and speech perception in cochlear implant users". Brain. 129 (12): 3376–83. doi:10.1093/brain/awl264.
  26. Giraud, A; Price, CJ; Graham, JM; Truy, E; Frackowiak, RSJ (2001). "Cross-modal plasticity underpins language recovery after cochlear implantation". Neuron. 30 (3): 657–63. doi:10.1016/s0896-6273(01)00318-x. PMID 11430800.
  27. Zhang, Guanjun; Gao, Zhilong; Guan, Sudong; Zhu, Yan; Wang, Jin-Hui (2013). Molecular Brain. 6 (2): 1–11.
  28. Ye, Bing; Huang, Li; Gao, Zilong; Chen, Ping; Ni, Hong; Guan, Sudong; Zhu, Yan; Wang, Jin-Hui (2012). PLoS ONE. 7 (8): e41986. doi:10.1371/journal.pone.0041986.
  29. Ni, Hong; Huang, Li; Chen, Na; Liu, Dongbo; Zhang, Fengyu; Ge, Ming; Guan, Sudong; Zhu, Yan; Wang, Jin-Hui (2010). PLoS ONE. 5 (10): e13736. doi:10.1371/journal.pone.0013736.