Neuroimaging and psychological evidence suggest that language and music are closely coupled, such that capacity in one domain can transfer to the other (Lagrois, Palmer, & Peretz, 2019; Schön et al., 2010). Unlike in music, rhythm in language has been a subject of controversy (Cummins, 2012), and there is no consensus to date on which unit constitutes the rhythmic centre of speech and language (Villing, Repp, Ward, & Timoney, 2011). The present study investigates this question by adopting a synchronised movement paradigm (Repp, 2005) and compares several rhythmically relevant events in their ability to attract synchronised movement. These events include manually identified vowel onsets and acoustically derived landmarks describing properties of the speech amplitude envelope. Thirty-two non-musicians listened to repetitions of 20 natural English sentences and were instructed to tap in synchrony with what they perceived to be the sentence beat. The time course of each sentence was tagged for five rhythmically relevant events: vowel onsets, the fastest energy increase (maxD), a combination of high local pitch and periodic energy (PPP), and the largest amplitudes at intersyllabic and interstress timescales (IMF1 and IMF2). The results showed that both vowel onsets and maxD consistently describe synchronisation targets in speech, whereas the other landmarks performed significantly worse. Moreover, participants with more musical experience showed higher levels of synchrony with these anchors. The present findings suggest that local energy changes shape sensorimotor synchronisation with speech, and that such synchronisation is influenced by individual musical experience, supporting tight links between language and music.