Sensori-Motor Learning with Movement Sonification: Perspectives from Recent Interdisciplinary Studies

Frédéric Bevilacqua, Eric O Boyer, Jules Françoise, Olivier Houix, Patrick Susini, Agnès Roby-Brami, Sylvain Hanneton

Abstract

This article reports on an interdisciplinary research project on movement sonification for sensori-motor learning. First, we describe the different research fields that have contributed to movement sonification, from music technology (including gesture-controlled sound synthesis) and sonic interaction design to research on sensori-motor learning with auditory feedback. In particular, we propose to distinguish between sound-oriented tasks and movement-oriented tasks in experiments involving interactive sound feedback. We then describe several research questions and recently published results on movement control, learning and perception. Specifically, we studied the effect of auditory feedback on movement in several cases: from experiments on pointing and visuo-motor tracking to more complex tasks in which interactive sound feedback can guide movements, as well as cases of sensory substitution in which auditory feedback can convey information about object shapes. We also developed specific methodologies and technologies for designing sonic feedback and movement sonification. We conclude with a discussion of key future research challenges in sensori-motor learning with movement sonification, and we point toward promising applications such as rehabilitation, sports training and product design.

Keywords: interactive systems; learning; movement; sensori-motor; sonification; sound design.

Figures

Figure 1
This figure summarizes the interdisciplinary research conducted in the Legos project, spanning fundamental research, methods and tools, and applications.


Source: PubMed
