Influences of rhythm- and timbre-related musical features on characteristics of music-induced movement

Birgitta Burger, Marc R Thompson, Geoff Luck, Suvi Saarikallio, Petri Toiviainen

Abstract

Music makes us move. Several factors can affect the characteristics of such movements, including individual factors and musical features. In this study, we investigated the effect of rhythm- and timbre-related musical features, as well as tempo, on movement characteristics. Sixty participants were presented with 30 musical stimuli representing different styles of popular music and were instructed to move along with the music. Optical motion capture was used to record the participants' movements. Eight movement features and four rhythm- and timbre-related musical features were then computationally extracted from the data, while the tempo of each stimulus was assessed in a perceptual experiment. A correlational analysis revealed that, for instance, clear pulses seemed to be embodied with the whole body, that is, through various movement types of different body parts, whereas spectral flux and percussiveness were more distinctly related to particular body parts, such as head and hand movement. A series of ANOVAs, in which the stimuli were divided into three tempo-based groups of five stimuli each, revealed no significant differences between the groups, suggesting that the tempo of our stimulus set had no effect on the movement features. In general, the results can be linked to the framework of embodied music cognition, as they show that body movements are used to reflect, imitate, and predict musical characteristics.
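As an illustration of the analysis described above, the following minimal Python sketch computes feature-by-feature Pearson correlations and a tempo-group one-way ANOVA. It is a sketch under assumed data layouts (random placeholder matrices, hypothetical variable names); the study itself relied on MATLAB-based tools, not this code.

```python
# Minimal sketch of the correlational analysis and tempo-group ANOVA.
# Placeholder random data; the study's actual features were extracted
# with MATLAB toolboxes, so all names and values here are illustrative.
import numpy as np
from scipy.stats import pearsonr, f_oneway

rng = np.random.default_rng(0)
n_stimuli = 30
movement = rng.normal(size=(n_stimuli, 8))  # 8 movement features per stimulus
musical = rng.normal(size=(n_stimuli, 4))   # 4 rhythm/timbre features

# Correlate every movement feature with every musical feature.
for i in range(movement.shape[1]):
    for j in range(musical.shape[1]):
        r, p = pearsonr(movement[:, i], musical[:, j])
        if p < 0.05:
            print(f"movement {i} vs musical {j}: r = {r:.2f}, p = {p:.3f}")

# One-way ANOVA on one movement feature across three tempo groups
# (slow / medium / fast), with a hypothetical group assignment.
tempo_group = np.repeat([0, 1, 2], 5)       # three groups of five stimuli
slow, medium, fast = (movement[:15][tempo_group == g, 0] for g in range(3))
F, p = f_oneway(slow, medium, fast)
print(f"tempo ANOVA: F = {F:.2f}, p = {p:.3f}")
```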

Keywords: dance; motion capture; music-induced movement; musical feature extraction; pulse clarity; spectral flux.

Figures

Figure 1
Marker and joint locations. (A) Anterior and posterior view of the marker placement on the participants' bodies; (B) Anterior view of the marker locations as a stick-figure illustration; (C) Anterior view of the locations of the secondary markers/joints used in the analysis.
Figure 2
Fluctuation spectra of two stimuli used in the study. (A) Peaks at regular distances of 2.28 Hz, with the highest peak at 4.56 Hz and other clear peaks at 2.29, 6.85, and 9.13 Hz, suggesting clear pulses and periodicity (stimulus 1, see Appendix). (B) Markedly lower magnitude values, a less periodic pattern of peaks, and more noise, suggesting low pulse clarity (stimulus 21, see Appendix).
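A fluctuation spectrum of the kind shown in Figure 2 can be approximated by taking the magnitude spectrum of an onset-strength envelope. The sketch below uses librosa and numpy as a stand-in; the figure itself was presumably produced with a MATLAB feature-extraction toolbox, so peak locations will not match exactly, and the file name is hypothetical.

```python
# Approximate fluctuation spectrum: magnitude spectrum of an onset-
# strength envelope, restricted to the 0-10 Hz range shown in Figure 2.
# A sketch only; "stimulus.wav" and all parameters are hypothetical.
import numpy as np
import librosa
from scipy.signal import find_peaks

y, sr = librosa.load("stimulus.wav")
env = librosa.onset.onset_strength(y=y, sr=sr)  # onset envelope
fps = sr / 512                                  # default hop length = 512

spec = np.abs(np.fft.rfft(env - env.mean()))
freqs = np.fft.rfftfreq(len(env), d=1.0 / fps)

mask = freqs <= 10.0
peaks, _ = find_peaks(spec[mask], prominence=0.2 * spec[mask].max())
print("periodicity peaks (Hz):", np.round(freqs[mask][peaks], 2))
```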
Figure 3
Spectrograms (seconds 10–20) of sub-band no. 2 (50–100 Hz) of two stimuli used in the study. (A) A high amount of temporal change (red represents high energy at the respective time and frequency, whereas blue represents low energy; see color bar), resulting in a high Sub-Band Flux value (stimulus 26, see Appendix). (B) A low amount of temporal change, resulting in a low Sub-Band Flux value (stimulus 5, see Appendix).
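Sub-band flux as depicted in Figure 3, that is, frame-to-frame spectral change within a restricted frequency band, can be sketched as follows. The band edges (50–100 Hz) come from the caption; everything else (FFT size, hop length, Euclidean distance measure, file name) is an illustrative assumption rather than the study's exact implementation.

```python
# Illustrative sub-band flux: Euclidean distance between successive
# spectrogram frames within the 50-100 Hz band named in the caption.
# A sketch of the general technique, not the study's implementation.
import numpy as np
import librosa

y, sr = librosa.load("stimulus.wav")    # hypothetical file name
n_fft = 4096                            # fine resolution at low frequencies
S = np.abs(librosa.stft(y, n_fft=n_fft, hop_length=n_fft // 4))

freqs = librosa.fft_frequencies(sr=sr, n_fft=n_fft)
band = (freqs >= 50) & (freqs < 100)    # sub-band no. 2 per the caption

diff = np.diff(S[band, :], axis=1)      # frame-to-frame change
flux = np.sqrt((diff ** 2).sum(axis=0)) # flux per frame transition
print("mean sub-band flux:", float(flux.mean()))
```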


Source: PubMed
