
CHANGES IN MUSIC TEMPO ENTRAIN MOVEMENT RELATED BRAIN ACTIVITY
Ian Daly1, James Hallowell1, Faustina Hwang1, Alexis Kirke2, Asad Malik1, Etienne Roesch1, James Weaver1, Duncan Williams2, Eduardo Miranda2, and Slawomir J. Nasuto1
1Brain Embodiment Lab, School of Systems Engineering, University of Reading, UK
2Interdisciplinary Centre for Computer Music Research, University of Plymouth, Plymouth, UK
“We listen to music with our muscles”
(Friedrich Nietzsche, 1844-1900)
Introduction
Music is known to entrain movement: it may make you want to tap your feet or otherwise move rhythmically [1]. There is a documented relationship between music tempo and our desire to move [1].

Music tempo describes the number of beats per minute in a piece of music.

Event-related (de)synchronisation (ERD/S) is a reduction or increase in alpha/beta band power in the EEG during, and after, movement (see the sketch below). During both movement and the imagination of movement, ERD/S may be observed in the electroencephalogram (EEG) [2].

We explore whether ERD/S is observable while listening to music with particular tempos.
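As an illustration, a minimal Python sketch of the classic ERD/S measure is given below: the percentage change in band power relative to a pre-stimulus reference interval [2]. The band, baseline window, smoothing length, and variable names are assumptions for illustration and are not the exact pipeline used in this study.

# Minimal sketch (Python): ERD/S as percentage band-power change relative to a
# pre-stimulus reference interval, following the classic definition [2].
# The band, baseline window, smoothing length, and variable names are
# illustrative assumptions, not the exact pipeline used in this study.
import numpy as np
from scipy.signal import butter, filtfilt

def erd_percent(eeg, fs, band=(8.0, 12.0), baseline=(0.0, 2.0)):
    """Return the ERD/S time course (%) for one channel of one trial."""
    # Band-pass filter to the band of interest (e.g. alpha, 8-12 Hz)
    b, a = butter(4, [band[0] / (fs / 2.0), band[1] / (fs / 2.0)], btype="band")
    filtered = filtfilt(b, a, eeg)
    # Instantaneous band power, smoothed with a 250 ms moving average
    power = filtered ** 2
    win = int(0.25 * fs)
    power = np.convolve(power, np.ones(win) / win, mode="same")
    # Reference power taken from the baseline interval (seconds from trial start)
    ref = power[int(baseline[0] * fs):int(baseline[1] * fs)].mean()
    # Negative values indicate desynchronisation (ERD), positive values synchronisation (ERS)
    return 100.0 * (power - ref) / ref

# Example with simulated data at the 1,000 Hz sampling rate used here
fs = 1000
erd = erd_percent(np.random.randn(10 * fs), fs)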
Results

Artefacts were detected in 31.03% of trials (360 trials), leaving 804 artefact-free trials.

Significant ERD/S was observed during music listening (Fig. 2).

Figure 2. Event-related (de)synchronisations during music listening: time-frequency maps (10-30 Hz, -3 to 11 s) at channels C3, Cz, and C4.
Is there a relationship between music tempo and ERD/S?
The correlation between ERD strength and tempo variance is significant and peaks approximately 4 s after the start of the music (Fig. 4).

Figure 4. Correlation between ERD strength and tempo standard deviation over time.
The left hemisphere motor cortex shows the greatest correlation between ERD strength and music tempo (Fig. 3).

Figure 3. Scalp map of maximum correlation between ERD strength and tempo standard deviation.

Methods
Participants
• Twenty-nine right-handed individuals
• Age: median = 35, SD = 14.6
• 13 male, 16 female
Measurements
• 19 channels, international 10/20 system
• 1,000 Hz sample rate
• All impedances < 15 kΩ
• BrainAmp EEG amplifier (Brain Products, Germany)

Music
• 110 excerpts from film scores (length 10 s) [3]
• Music spans a range of styles and contains a range of tempos
• Tempo estimated from each clip via a dynamic programming approach [4] (see the sketch below)
• Mean and standard deviation of tempo extracted for each clip
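The tempo features could be extracted along the following lines using librosa, whose beat tracker implements the dynamic programming approach of [4]. The file name and the per-beat tempo definition are illustrative assumptions rather than the study's exact procedure.

# Minimal sketch (Python): mean and standard deviation of tempo for one clip,
# using librosa's dynamic-programming beat tracker [4]. The file path and the
# per-beat tempo definition are illustrative assumptions.
import numpy as np
import librosa

def tempo_features(path):
    y, sr = librosa.load(path)                                # audio samples and sample rate
    _, beat_times = librosa.beat.beat_track(y=y, sr=sr, units="time")
    inst_tempo = 60.0 / np.diff(beat_times)                   # beats per minute, per beat
    return inst_tempo.mean(), inst_tempo.std()

mean_bpm, sd_bpm = tempo_features("excerpt_001.wav")          # hypothetical 10 s excerpt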
Paradigm
• Four runs of music listening, ten trials per run
• Each trial involved listening to a random piece of music while EEG was recorded
• Participants were instructed to sit still throughout
• Each trial was followed by a questionnaire to evaluate the music-induced emotion (analysed elsewhere [5])
Figure 1. Paradigm timing.
Analysis
• Artefacts removed via visual inspection
• ERD/S identified via time-frequency spectrograms
• 10×10 cross-validation used to identify the time-frequency points and channel locations with maximum correlation between ERD and the tempo features (mean and standard deviation); see the sketch below
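As a rough illustration of the final step, the sketch below correlates per-trial ERD strength at a single channel/time/frequency point with the tempo standard deviation of the excerpt heard on that trial. The arrays are random placeholders, and the study's 10×10 cross-validation for selecting the peak point is not reproduced.

# Minimal sketch (Python): across-trial correlation between ERD strength and
# tempo standard deviation. The data arrays are random placeholders; the
# study's 10x10 cross-validation for selecting the peak channel/time/frequency
# point is not reproduced here.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_trials = 804                                   # artefact-free trials reported above

erd_strength = rng.normal(size=n_trials)         # e.g. ERD (%) at C3, ~4 s, alpha band
tempo_sd = rng.normal(5.0, 2.0, size=n_trials)   # tempo SD (BPM) of each excerpt

r, p = pearsonr(erd_strength, tempo_sd)
print(f"r = {r:.3f}, p = {p:.3f}")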
Discussion
The observed relationship is not an auditory steady-state response (ASSR): ASSRs occur centrally, at the stimulation frequency [6], whereas the ERD is lateralised and spans the alpha and beta bands [2]. Additionally, an ASSR is driven by a steady stimulation frequency, whereas our observations relate to tempo variance.

Interestingly, no significant relationship with mean tempo is found. Thus, ERD is modulated by the variation (dynamics) of music tempo.

This may be related to findings elsewhere that music tonality can produce significant effects in the motor cortex [7].

The greater the tempo variation in the music, the greater the “surprise” in the music. ERD has been reported to relate to
“surprise” in music [8]. Additionally, ERD has also been reported
to relate to anticipation of an auditory stimulus [9].
The results suggest that music is entraining activity in the motor cortex. Although entrainment is a non-linear phenomenon, it may account for the linear changes in ERD, which have been demonstrated to arise from both linear and non-linear coupled oscillators [10].
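To give a flavour of how coupling between oscillators produces entrainment, the toy simulation below uses two Kuramoto-style phase oscillators. It is not the model of [10], and the frequencies and coupling strengths are arbitrary illustrative values.

# Toy illustration (Python): two coupled phase oscillators entrain (their phase
# difference stops drifting) once the coupling exceeds their frequency detuning.
# This is not the model of [10]; all parameter values are arbitrary.
import numpy as np

def simulate(coupling, w1=2.0, w2=2.4, dt=0.001, t_max=20.0):
    steps = int(t_max / dt)
    theta = np.zeros((steps, 2))
    for t in range(1, steps):
        p1, p2 = theta[t - 1]
        theta[t, 0] = p1 + (w1 + coupling * np.sin(p2 - p1)) * dt
        theta[t, 1] = p2 + (w2 + coupling * np.sin(p1 - p2)) * dt
    return theta

for k in (0.0, 0.5):
    theta = simulate(k)
    # Wrapped phase difference over the last 5 s: small and stable when entrained
    diff = np.angle(np.exp(1j * (theta[-5000:, 0] - theta[-5000:, 1])))
    print(f"coupling={k}: mean |phase difference| over last 5 s = {np.abs(diff).mean():.2f} rad")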
Our future work will seek to further explore neural processes
involved in music listening.
Additionally, our results are informative for the construction of brain-computer music interfaces (BCMIs). For example, ERD/S is a common mechanism used for BCI control.
References
[1] A. Schachner, T. F. Brady, I. M. Pepperberg, and M. D. Hauser, “Spontaneous motor entrainment to music in multiple vocal mimicking species,” Current Biol., 19, pp. 831-836, 2009.
[2] G. Pfurtscheller and F. Lopes da Silva, “Event-related EEG/MEG synchronization and desynchronization: basic principles,” Clin. Neurophys., 110, pp. 1842-1857, 1999.
[3] T. Eerola and J. K. Vuoskoski, “A comparison of the discrete and dimensional models of emotion in music,” Psychology of Music, 39, pp. 18-49, 2010.
[4] D. P. W. Ellis, “Beat Tracking by Dynamic Programming,” J. of New Music Research, 36, pp. 51-60, 2007.
[5] I. Daly, A. Malik, F. Hwang, E. Roesch, J. Weaver, A. Kirke, D. Williams, E. Miranda, and S. J. Nasuto, “Neural correlates of emotional responses to music: an EEG study,” Neuroscience Letters, 573, pp. 52-57, 2014.
[6] S. Nozaradan, I. Peretz, and A. Mouraux, “Selective neuronal entrainment to the beat and meter embedded in a musical rhythm,” The J. of Neurosci., 32, pp. 17572-17581, 2012.
[7] S. Durrant, E. R. Miranda, D. R. Hardoon, J. Shawe-Taylor, A. Brechmann, and H. Scheich, “Neural Correlates of Tonality in Music,” in Proc. of Music, Brain & Cognition Workshop - NIPS Conference, Whistler (Canada), 2007.
[8] R. Khosrowabadi, A. Wahab, K. K. Ang, and M. H. Baniasad, “Affective computation on EEG correlates of emotion from musical and vocal stimuli,” in 2009 Int. Joint Conf. on Neural Networks, pp. 1590-1594, IEEE, 2009.
[9] M. C. Bastiaansen, K. B. Böcker, P. J. Cluitmans, and C. H. Brunia, “Event-related desynchronization related to the anticipation of a stimulus providing knowledge of results,” Clin. Neurophys., 110, pp. 250-260, 1999.
[10] J. H. Sheeba, V. K. Chandrasekar, and M. Lakshmanan, “General coupled-nonlinear-oscillator model for event-related (de)synchronization,” Phys. Rev. E, 84, p. 036210, 2011.

Contact information
Brain Embodiment Lab, School of Systems Engineering, University of Reading, Reading RG6 6AY, UK
Email: [email protected]
Website: http://bel.reading.ac.uk
Acknowledgments
This work was supported by EPSRC grants EP/J003077/1 and EP/J002135/1. The authors thank Sajeel Ahmed, Isil Poyraz Bilgin, Katherine Heseltine Flynn, and Maryam Karimidomal for help with EEG recording.