Rhythmic engagement with music in infancy

Marcel Zentner, Tuomas Eerola

Abstract

Humans have a unique ability to coordinate their motor movements to an external auditory stimulus, as in music-induced foot tapping or dancing. This behavior currently engages the attention of scholars across a number of disciplines. However, very little is known about its earliest manifestations. The aim of the current research was to examine whether preverbal infants engage in rhythmic behavior to music. To this end, we carried out two experiments in which we tested 120 infants (aged 5-24 months). Infants were exposed to various excerpts of musical and rhythmic stimuli, including isochronous drumbeats. Control stimuli consisted of adult- and infant-directed speech. Infants' rhythmic movements were assessed by multiple methods involving manual coding from video excerpts and innovative 3D motion-capture technology. The results show that (i) infants engage in significantly more rhythmic movement to music and other rhythmically regular sounds than to speech; (ii) infants exhibit tempo flexibility to some extent (e.g., faster auditory tempo is associated with faster movement tempo); and (iii) the degree of rhythmic coordination with music is positively related to displays of positive affect. The findings are suggestive of a predisposition for rhythmic movement in response to music and other metrically regular sounds.

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Fig. 1.
Duration of rhythmic movement during all musical stimuli vs. control stimuli across both experiments. (A) Music was much more effective in generating rhythmic movements than was speech (Audio S1, S3, S7, and S10). (B) Rhythm was just as effective as music, and again more effective than speech, in eliciting rhythmic movement (Audio S2, S4, S8, S9, and S10). The MUSIC condition refers to the mean of Saint-Saëns, Mozart, and children’s music versions. Letters after the stimulus name refer to rhythm (R) and fluctuating (F) variants of the musical stimuli. The Beat label denotes the average of beat 1 and 2 conditions. Error bars indicate SEs.
Fig. 2.
Data extraction and periodicity calculation from a 30-s motion-capture excerpt of an infant. (A) Vertical movement trajectory of the right foot, exhibiting short bouts of periodic, oscillating movements (see Movie S4 for an animation). (B) Autocorrelation results for lags up to 1 s during the excerpt, highlighting the periodic segments and their prevalent period (darker color). (C-E) Mean autocorrelation curves within each 10-s segment, demonstrating that the period remains constant throughout the excerpt (between 440 and 480 ms) while the amplitude differs widely across segments.
Fig. 3.
Movement time follows musical time. Stimulus period means (x axis) are plotted against movement period means (y axis), yielding a correlation of r = 0.61 (computed on experimental stimuli only, colored blue). Movement period extraction followed the procedures illustrated in Fig. 2. Stimulus period extraction was obtained by a tempo-finding algorithm that uses autocorrelation of the spectral energy flux and that applies a resonance function emphasizing perceptually salient beat regions.
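The correlation reported in the caption is a standard Pearson r between condition-level period means. A minimal sketch, using hypothetical period values in milliseconds (not the study's data) purely to illustrate the computation:

```python
import numpy as np

def pearson_r(x, y):
    """Plain Pearson product-moment correlation between two
    equal-length sequences."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return float(np.dot(xm, ym) / np.sqrt(np.dot(xm, xm) * np.dot(ym, ym)))

# Hypothetical per-stimulus period means in ms (illustrative only)
stimulus_period_ms = [400, 450, 500, 550, 600]
movement_period_ms = [430, 445, 510, 530, 615]
r = pearson_r(stimulus_period_ms, movement_period_ms)
```

Stimulus periods themselves would come from a tempo-finding algorithm such as the one the caption describes (autocorrelation of spectral energy flux with a perceptual resonance weighting), which is beyond this sketch.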

Source: PubMed
