The Neural Processing of Vocal Emotion After Hearing Reconstruction in Prelingual Deaf Children: A Functional Near-Infrared Spectroscopy Brain Imaging Study

Yuyang Wang, Lili Liu, Ying Zhang, Chaogang Wei, Tianyu Xin, Qiang He, Xinlin Hou, Yuhe Liu

Abstract

Prior research has shown that children with hearing loss have impaired vocal emotion recognition compared with their normal-hearing peers. Cochlear implants (CIs) have achieved significant success in restoring hearing and speech abilities to people with severe-to-profound sensorineural hearing loss. However, owing to the limitations of current neuroimaging tools, existing research has been unable to detail the neural processing underlying the perception and recognition of vocal emotions during early-stage CI use in infant and toddler CI users (ITCIs). In the present study, functional near-infrared spectroscopy (fNIRS) imaging was employed in preoperative and postoperative tests to describe the early neural processing of the perception and recognition of four vocal emotions (fear, anger, happiness, and neutral) in prelingually deaf ITCIs. The results revealed that the cortical responses elicited by vocal emotional stimulation in the left pre-motor and supplementary motor area (pre-SMA), right middle temporal gyrus (MTG), and right superior temporal gyrus (STG) differed significantly between the preoperative and postoperative tests, indicating a change in the neural processing of vocal emotional stimulation after implantation. Further results revealed that recognition of vocal emotional stimuli emerged in the right supramarginal gyrus (SMG) after CI implantation, where the response elicited by fear was significantly greater than that elicited by anger, indicating a negativity bias. These findings suggest that the development of emotional bias and of emotional perception and recognition capabilities in ITCIs occurs on a different timeline, and involves different neural processing, from that in normal-hearing peers. To assess speech perception and production abilities, the Infant-Toddler Meaningful Auditory Integration Scale (IT-MAIS) and the Speech Intelligibility Rating (SIR) were used; the scores did not differ significantly between the preoperative and postoperative tests. Finally, the correlates of the neurobehavioral results were investigated: the preoperative response of the right SMG to anger stimuli was significantly and positively correlated with postoperative behavioral outcomes, whereas the postoperative response of the right SMG to anger stimuli was significantly and negatively correlated with postoperative behavioral outcomes.

Keywords: cochlear implant; functional near-infrared spectroscopy; infant and toddler; prelingual deaf; vocal emotion.

Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Copyright © 2021 Wang, Liu, Zhang, Wei, Xin, He, Hou and Liu.

Figures

FIGURE 1
fNIRS test procedures and environment. (A) A schematic representation of the block design used for the vocal emotional stimulation experimental procedure. (B) An example of the positioning of the participants during the fNIRS test. The test was conducted in a darkened room with the lights switched off.
FIGURE 2
The arrangement of the light sources and detectors and the positions of the detection channels. (A) The position of the light sources and detectors, arranged according to the 10–20 system, with red indicating a light source and blue indicating a detector. (B) A schematic representation of the positions of the detection channels (indicated by the green line) formed by the light sources and the detectors. (C,D) Schematic representations of the channel positions in the left (C) and right (D) hemispheres during fNIRS. Each gray dot represents the center of the detection channel.
FIGURE 3
Behavioral outcomes. The vertical axis is the score, and the horizontal axis is the test type. Red represents the preoperative test score, and blue represents the postoperative test score. The scores did not differ significantly between the preoperative and postoperative tests.
FIGURE 4
Changes in cortical oxyhemoglobin responses to the four vocal emotional stimuli (white grid pattern, fear; gray grid pattern, anger; white diagonal pattern, happiness; gray diagonal pattern, neutral) across the two test periods (red, preoperative test; blue, postoperative test). The six subplots show channels 21 and 22 (located in the left pre-motor and supplementary motor cortex), 30 (located in the right supramarginal gyrus of Wernicke’s area), 40 and 42 (located in the right middle temporal gyrus), and 47 (located in the right superior temporal gyrus). (A,B) In channels 21 and 22, cortical activation elicited by vocal emotional stimulation differed significantly between the preoperative and postoperative tests. (C,D) The same preoperative–postoperative difference was observed in channels 40 and 42. (E) The same preoperative–postoperative difference was observed in channel 47. (F) In channel 30, cortical activation elicited by fear differed significantly from that elicited by anger. ∗p < 0.05.
FIGURE 5
Correlation between the changes in the cortical oxyhemoglobin response to angry vocal emotions in channel 30 (located in the right supramarginal gyrus of Wernicke’s area) and the changes in the IT-MAIS/MAIS score. (A) A positive correlation was observed between preoperative neural responses and postoperative behavioral outcomes (Pearson’s correlation coefficient, r = 0.455, p = 0.033, n = 22). (B) A negative correlation was observed between postoperative neural responses and postoperative behavioral outcomes (Pearson’s correlation coefficient, r = −0.576, p = 0.005, n = 22).
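The neurobehavioral correlation reported above is a standard Pearson product-moment correlation between per-subject neural responses and behavioral scores. As a minimal sketch (using made-up numbers, not the study's actual channel-30 responses or IT-MAIS/MAIS scores, and a hypothetical helper name `pearson_r`), the coefficient can be computed as follows:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Covariance numerator and the two standard-deviation terms
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: a perfectly linear relation yields r = 1.0
assert abs(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]) - 1.0) < 1e-9
```

In practice, a library routine such as scipy.stats.pearsonr would also return the associated p-value, as reported in the figure legend.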


Source: PubMed
