Unsolved Mystery

Unsolved Mysteries discuss a topic of biological importance that is poorly understood and in need of research attention.


The Evolutionary Biology of Musical Rhythm: Was Darwin Wrong?

  • Aniruddh D. Patel

    a.patel@tufts.edu

    Affiliation: Department of Psychology, Tufts University, Medford, Massachusetts, United States of America

  • Published: March 25, 2014
  • DOI: 10.1371/journal.pbio.1001821
Corrections

7 May 2014: The PLOS Biology Staff (2014) Correction: The Evolutionary Biology of Musical Rhythm: Was Darwin Wrong? PLoS Biol 12(5): e1001873. doi: 10.1371/journal.pbio.1001873 | View correction

Abstract

In The Descent of Man, Darwin speculated that our capacity for musical rhythm reflects basic aspects of brain function broadly shared among animals. Although this remains an appealing idea, it is being challenged by modern cross-species research. This research hints that our capacity to synchronize to a beat, i.e., to move in time with a perceived pulse in a manner that is predictive and flexible across a broad range of tempi, may be shared by only a few other species. Is this really the case? If so, it would have important implications for our understanding of the evolution of human musicality.

Background

Music is a human universal with an ancient history: delicately carved bone flutes made by ice-age hunter-gatherers predate the oldest known cave paintings by several thousand years [1],[2]. While musical forms and meanings vary widely across cultures [3], certain features of human music are widespread [4]. For example, every culture has some form of music with a beat, a perceived periodic pulse that dancers use to guide their movements and performers use to coordinate their actions [5]. Darwin, intrigued by the ubiquity and power of music in human life, felt that our sense of melody and rhythm tapped into ancient and fundamental aspects of brain function, arguing that “The perception, if not the enjoyment, of musical cadences [i.e., melodies] and of rhythm is probably common to all animals, and no doubt depends on the common physiological nature of their nervous systems” [6]. Darwin's intuition seems plausible. Focusing on rhythm, the prevalence of periodic (or near-periodic) rhythms in animal biology (e.g., in heartbeat, gait, and brain activity [7]) makes it reasonable to suspect that beat-based rhythmic processing has ancient evolutionary roots.

Darwin's view suggests that key features of musical beat processing should be similar in humans and other species. For humans, one of the most salient features of musical beat processing is that it links perception and action in an intimate way. We often express our perception of the beat by moving rhythmically (tapping a foot, nodding our head) in time with the beat [8]. That is, humans entrain rhythmic movements to the beat of music, and in social settings (e.g., dancing or marching), this can lead to synchronized rhythmic actions within groups of people [9]. In support of Darwin's view, the ability to entrain actions to a periodic pulse is not uniquely human: several species of frogs and insects are known to call or flash periodically and in synchrony with conspecifics [10]. Indeed, it has been suggested that rhythmic entrainment emerges quite easily in biological systems [11]. A view of rhythmic synchronization as very basic to biological systems informs some current models of musical beat-based processing. For example, in “neural resonance” theory [12],[13], beat perception arises when nonlinear oscillations in the nervous system entrain to (oscillate in synchrony with) external rhythmic stimuli. This theory is in line with Darwin's views because it holds that nonlinear oscillations are ubiquitous in brain dynamics and that the neural entrainment of such oscillations by auditory rhythms is “intrinsic to the physics of the neural systems involved in perceiving, attending, and responding to auditory stimuli” [12].

Such a view is appealing for its generality; yet it faces what biologist Tecumseh Fitch has called “the paradox of rhythm.” As Fitch notes, “Periodicity and entrainment seem to be among the most basic features of living things, yet the human ability (and proclivity) to entrain our motor output to auditory stimuli appears to be very rare” [14, p. 78]. Stating the paradox more colloquially, Fitch asks “Why don't dogs dance?” Dogs have lived with humans (and our music) for thousands of years, and their brain structure is much more akin to ours than to that of frogs and insects. Yet they show no spontaneous tendency to synchronize their movements with a musical beat. Indeed, even when humans try to train dogs to dance to music (as in the sport “canine freestyling”), dogs show no evidence of sensing a beat or moving in synchrony with it, unlike their human partners who dance directly beside them [15].

Challenges to Darwin's View

Informal observations of dogs aside, more serious challenges to the view that beat-based processing is widespread come from laboratory studies of nonhuman primates [16],[17]. To understand the significance of these studies, it is important to review some key characteristics of how humans synchronize movements to a beat. While humans typically synchronize to the beat of complex auditory stimuli (i.e., real music), basic features of human synchronization to a beat can be studied by having people tap along with a metronome. This is a trivially easy task for most adults, even those with no musical training. Synchronization to a metronome has driven much productive research on sensorimotor processing [18]. Three key features of human synchronization to a metronome are 1) prediction, 2) tempo flexibility, and 3) cross-modality. In terms of the first feature, when humans tap with a metronome they spontaneously align their taps with the beat: taps fall very close to the onset of metronome clicks, typically within a few tens of ms (Figure 1).


Figure 1. Illustration of how a human adult taps to an auditory metronome.

In Figure 1a, the upper gray bars represent the times of five metronome events (brief tones with interonset interval = 600 ms). The lower black bars show tap times, which fall very close to tone onsets. Figure 1b shows summary data for a trial of 40 tones. The relative phase (RP) of each tap is represented by a thin black vector on a unit circle: 0 indicates perfect temporal alignment between taps and tones, negative RP values indicate taps preceding tones, positive RP values indicate taps following tones, and 0.5 indicates taps midway between tones. The white arrow indicates mean relative phase, which is slightly negative in this case (i.e., on average, taps slightly precede tone onsets in time).

doi:10.1371/journal.pbio.1001821.g001
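The relative-phase measure described in this caption is straightforward to compute: each tap time is wrapped into a signed fraction of the intertone interval, and the mean is taken circularly, as on the unit circle of Figure 1b. The following is a minimal illustrative sketch (the tap times are invented for the example; this is not code from the study):

```python
import math

def relative_phase(tap_time, ioi):
    """Signed relative phase in (-0.5, 0.5) for typical data:
    0 = tap aligned with a tone onset, negative = tap precedes the
    nearest tone, positive = tap follows it. Assumes tones occur at
    integer multiples of the interonset interval (ioi)."""
    cycles = tap_time / ioi
    return cycles - round(cycles)

def mean_relative_phase(rps):
    """Circular mean: place each RP at angle 2*pi*RP on the unit
    circle, sum the unit vectors, and take the angle of the resultant
    (the white arrow in Figure 1b)."""
    x = sum(math.cos(2 * math.pi * rp) for rp in rps)
    y = sum(math.sin(2 * math.pi * rp) for rp in rps)
    return math.atan2(y, x) / (2 * math.pi)

# Invented trial: tones every 600 ms, taps a little early on average
taps = [-12, 588, 1195, 1782, 2390]
rps = [relative_phase(t, 600) for t in taps]
print(mean_relative_phase(rps))  # slightly negative: taps precede tones
```

The circular averaging matters: simply averaging raw phases would misbehave for taps straddling a tone onset (e.g., phases of +0.49 and −0.49 are nearly identical on the circle but average to 0 arithmetically).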

This shows that tapping is guided by an accurate prediction of when the next beat will occur. In other words, actions are guided by a mental model of time, rather than simply being a reaction to each stimulus (if taps were reactive rather than predictive, they would follow clicks by a few hundred ms). In terms of the second feature, synchronization to a metronome (and to music) in adult humans is very flexible: as long as the interval between beats is between about 300 and 900 ms (i.e., about 67–200 beats per minute or BPM), humans can achieve synchronization quickly and accurately [19]. This criterion distinguishes human synchronization to a beat from other examples of rhythmic entrainment in nature. Fireflies, for example, can only synchronize to other fireflies in a narrow tempo range around their spontaneous emission rate [20]. In terms of the third feature, humans can synchronize to a beat in a cross-modal fashion; that is, we can easily synchronize by moving silently (e.g., head bobbing), rather than by making sound ourselves (e.g., clapping or vocalizing) [8]. All other species exhibiting synchronous rhythmic behavior do so in the same modality (e.g., frogs calling together, or fireflies flashing together) [21]. While cross-modal synchronization is easy for humans, there is one respect in which our synchronization abilities are modality-biased. When humans synchronize to an auditory metronome, their tapping is much more accurate than when synchronizing with a visual metronome of identical temporal characteristics, a finding that has been replicated in the laboratory for over a century [22]–[24].
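As a quick check on the tempo figures quoted above, the interval-to-tempo conversion is just 60,000 ms per minute divided by the interbeat interval:

```python
def ioi_to_bpm(ioi_ms):
    """Convert an interbeat interval in milliseconds to beats per minute."""
    return 60_000 / ioi_ms

print(ioi_to_bpm(300))  # 200.0 BPM (fast end of the human range)
print(ioi_to_bpm(900))  # ~66.7 BPM (slow end of the human range)
```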

Turning to nonhuman primates, if the mechanisms underlying human beat-based processing are widespread in animal brains, one would expect nonhuman primates, when trained to tap to a beat, to show the characteristic features of human synchronization. In the first study to train monkeys (or for that matter, any animal) to tap with a metronome, Hugo Merchant and colleagues obtained surprising results [16]. While the rhesus monkeys (Macaca mulatta) could successfully listen to two metronome clicks and then reproduce the same interval by tapping twice on a key, they had great difficulty learning to tap in synchrony with a metronome sequence of several beats. Specifically, each monkey took over a year of training to learn the metronome task, and when tested, their taps were always a few hundred ms after each metronome click rather than aligned with it. This suggests that their behavior was dominated by reaction rather than anticipation (although they did react more quickly to metronome events than to randomly timed events, thus showing some modest anticipation abilities). The monkeys learned to tap with metronomes at several different tempi, but spontaneous tempo flexibility was not tested (i.e., training at one tempo and testing at another tempo). Finally, unlike humans, the monkeys showed similar tapping variability for auditory and visual metronomes.

Thus it seems that human-like beat-based processing may not come easily to monkeys. Surprisingly, these differences may extend from synchronization to pure perception of a beat (in the absence of movement). This is suggested by subsequent research in the Merchant lab, which used neural measures to examine beat perception in monkeys who were sitting still. In this work, modeled on previous work with humans [25], monkeys were presented with a repeating auditory rhythmic pattern (which was more complex than a metronome but which had an underlying beat) while EEG data were collected. Unlike humans tested with these stimuli, the monkeys did not show a neural correlate of beat perception [26].

Naturally one wonders if similar results would have been obtained with great apes, who are much more closely related to humans, and who are known to drum in the wild [27]. While no studies have examined neural responses to a beat in apes, the first study of synchronization to an auditory metronome in great apes was recently published. In this study, three chimpanzees (Pan troglodytes) were trained to tap rhythmically on a keyboard and were tested for spontaneous synchronization to a metronome at three different tempi [17]. One chimp synchronized her taps to the metronome at one tempo but not at the other two tempi, while the other chimps did not synchronize at any tempi. Thus while chimps may have the capacity for anticipatory synchronization (not yet evident in monkeys), so far they show no evidence of significant tempo flexibility.

What are we to make of these challenges to Darwin's view of animal rhythmic processing? One possibility is that different training and testing methods would produce different results, and that human-like synchronization to a metronome is possible in monkeys and apes. Indeed, given how few studies have examined synchronization to an auditory beat in nonhuman primates, this possibility deserves to be explored. For example, future studies with monkeys could use reaching tasks (at which monkeys are known to be adept [28]) and a touch screen. Specifically, two illuminated circles could appear periodically and in alternation at fixed positions on the left and right side of the screen, and the monkey could be trained to use one hand to touch each circle before it disappears. This would require anticipatory (rather than reactive) reaching movements in order to touch each circle on the screen while it was illuminated. A tone could be played at the same time as each circle is displayed, and once the task was learned, the visual stimulus could be faded out to make the stimulus auditory only. Once the task was mastered at one tempo, generalization to other tempi could be tested.

Could Beat-Based Processing Be Species-Restricted?

Might it be that only a few species have the capacity to synchronize rhythmic movements to a beat in a manner similar to humans? In theoretical writings that predated the recent work on synchronization in nonhuman primates, I suggested that this might be the case [29]. Specifically, I proposed the “vocal learning and rhythmic synchronization hypothesis” (henceforth, “vocal learning hypothesis”), which suggests that the capacity to synchronize with a musical beat resulted from changes in brain structure driven by the evolution of complex vocal learning. Complex vocal learning is the ability to learn to produce complex vocal signals based on auditory experience and sensory feedback. This is a rare trait in nature: most animals (including all nonhuman primates) have a small set of instinctive vocalizations which they can modify in only modest ways in terms of their acoustic patterning. Vocal learning occurs in just three groups of birds (songbirds, hummingbirds, and parrots) and a few groups of mammals, including humans, elephants, and some cetaceans, seals, and bats [30]–[35]. The neurobiology of vocal learning has been best studied in birds, where the brain structure of vocal learners has been compared in great detail to that of vocal nonlearners (such as chickens or pigeons). This work has revealed that vocal learning is associated with specialized neural circuitry, including specializations in forebrain premotor areas, the basal ganglia, and their connections [36]. One motivation for the vocal learning hypothesis was that human neuroimaging revealed that premotor and basal ganglia regions are important for beat-based processing. Indeed, neuroimaging reveals that pure beat perception (even in the absence of overt movement) engages mid-to-dorsal premotor regions and basal ganglia regions (e.g., the putamen) [37],[38], which become functionally coupled to auditory regions [39].
It has been theorized that this functional coupling plays a role in our ability to predict the timing of beats [40],[41], a key feature of beat-based processing. More generally, moving in synchronization with a beat requires tight auditory-motor coupling in the service of an auditory model (a mental model of a temporal interval), just as vocal learning requires tight auditory-motor coupling in the service of an auditory model (the sound an animal is trying to imitate).

Of course, even if vocal learning and synchronization to a beat both engage premotor-basal ganglia-auditory networks, it may seem puzzling to claim that the two abilities are related, since they use different parts of the motor system (the vocal tract vs. the limbs, trunk, head, etc.). Thus the vocal learning hypothesis entails the idea that the evolution of vocal learning led to more general integration of auditory and motor regions of the brain than just the circuits connecting auditory and vocal motor control centers [42].

Of particular interest in this regard are connections in the human brain between auditory superior temporal cortical regions and dorsal premotor regions of the frontal cortex, via the parietal cortex [43] (Figure 2, orange line connecting posterior superior temporal gyrus/middle temporal gyrus [pSTG/MTG] with angular gyrus [AG], and light blue line connecting angular gyrus with dorsal premotor cortex [dPMC]). As shown in Figure 2, these connections correspond to two branches of a large neural fiber pathway known as the superior longitudinal fasciculus (SLF): specifically, the temporo-parietal part (SLF-tp) and branch 2 of the SLF [SLF II].


Figure 2. Recent summary diagram of long-distance fiber tracts in the human dorsal auditory stream (adapted, with permission, from [43]).

Of particular interest here are connections between auditory regions in the posterior superior temporal gyrus/middle temporal gyrus (pSTG/MTG) and the angular gyrus (AG) of the parietal cortex, and connections between the angular gyrus and the dorsal premotor cortex (dPMC). These connections correspond to two branches of the superior longitudinal fasciculus (SLF): the SLF temporo-parietal branch (SLF-tp) and the 2nd branch (SLF II). Interestingly, both tracts appear to play a role in the human ability to repeat what is heard [43], a key part of vocal learning. PTL: posterior temporal lobe; SMG: supramarginal gyrus; vPMC: ventral premotor cortex; 44: Brodmann area 44 (part of Broca's area).

doi:10.1371/journal.pbio.1001821.g002

These connections are part of a “dorsal auditory stream” linking auditory and premotor regions, which is thought to play a role in sensorimotor transformations in speech and other domains [44]. Importantly, this pathway (especially the part connecting the auditory and superior parietal cortex, i.e., SLF-tp) may be much more developed in humans than in nonhuman primates [45],[46], which could account for differences between humans and other primates in the ability to synchronize to a beat (see [41] for an extended treatment). This issue merits further comparative study.

One virtue of the vocal learning hypothesis is that it makes testable predictions about what kinds of animals can vs. cannot synchronize to a beat in a human-like way. Specifically, it posits that vocal nonlearners lack this capacity, a prediction that has so far been borne out in primate research (though more work is needed, as noted above). In contrast, it predicts that vocal learners may have this capacity. (The qualification of “may” is important, because the hypothesis says that neural circuitry related to vocal learning is necessary for human-like synchronization, but does not claim that it is sufficient [15],[47],[48].) Support for the hypothesis comes from studies showing that several species of parrots can synchronize to the beat of music in a manner that is predictive, tempo-flexible, and cross-modal [49]–[51]. In two of these studies [49],[50], the parrots (who were human pets) appear to have developed this behavior without any formalized training, perhaps by observing humans (though they can now synchronize to music without a human model). It should be noted, though, that parrots do not synchronize to a beat as well as adult humans, and show transient “bouts” of synchronization to a beat, perhaps akin to human children [52]. Thus further work is needed to directly compare the synchronization abilities of parrots and nonhuman primates. In doing this work, it will be important to document whether synchronization to a beat emerges spontaneously, as it does in humans (i.e., via exposure to beat-based rhythms and to visual models of others synchronizing), or if it requires explicit reinforcement training. This is important because these two different ways of acquiring synchronization abilities may reflect differences in the underlying mechanisms.

On a related note, an important question for future work is whether the behavioral similarities in synchronization to a beat in parrots and humans are due to similar underlying neural mechanisms, or if these similarities are superficial and rely on rather different neural circuits (Box 1). The vocal learning hypothesis takes the former view. Since parrots are not known to synchronize to a beat as part of their natural behavior in the wild, the hypothesis implies that this capacity emerges as a serendipitous byproduct of brain circuitry that evolved for other reasons, i.e., for vocal learning.

Box 1. Parrot and human synchronization to a beat: similar or distinct brain mechanisms?

Research with parrots has provided the first experimental evidence that nonhuman species can synchronize movements to a beat in a human-like fashion [49]–[51] (for video examples, see http://www.youtube.com/watch?v=ERpIWTh18cY). Is this behavioral similarity simply a superficial resemblance, resting on rather different brain mechanisms in parrots and humans? Similarity of behavior is no guarantee of similar underlying mechanisms. For example, a parrot can say “Polly want a cracker,” but this emulation of speech is produced by very different articulatory mechanisms than those used in human speech [79]. On the other hand, similar behavior in distantly related species can be supported by similar mechanisms. Vocal learning, for example, appears to have arisen independently in three distantly related groups of birds (parrots, songbirds, and hummingbirds). Yet a broadly similar set of brain nuclei appears to be involved in each case, pointing to “deep homology,” i.e., the convergent evolution of a trait based on similar biological mechanisms, possibly due to underlying genetic constraints on how those traits can be assembled [80],[81]. Could vocal learning in birds and mammals also be a case of deep homology, involving broadly similar premotor-basal ganglia-thalamic neural circuits? The biologist Tecumseh Fitch has argued for this view [81]. One fact that makes this argument interesting is that a gene important for motor control of human speech, FoxP2, is also expressed in avian brain regions important for motor control of learned song [81]–[83]. This is consistent with the idea that vocal learning in birds and humans has a similar underlying biology [84]. If this is the case, and if the capacity for synchronization to a beat is related to vocal learning circuitry, this would support a deep homology between the brain mechanisms used in synchronization to a beat in vocal learning birds and humans.

Possible Support for Darwin's View

Thus far I have discussed two very different views of beat-based processing: either as reflecting ancient and widespread aspects of brain function or as the result of specialized brain networks that exist in a small subset of animal species. If the former view is correct, then many animal species, if given the right training, should exhibit the capacity for beat-based processing.

Recently the capacity to synchronize to a musical beat has been demonstrated in a California sea lion (Zalophus californianus) [53]. Like parrots, sea lions are not known to synchronize movements to rhythmic sounds in the wild. Yet the sea lion learned to synchronize silent head bobs with an auditory beat (although this required structured reinforcement training, unlike with parrots). Crucially, the sea lion showed tempo flexibility: after training to synchronize at one tempo, she could generalize this behavior to novel tempi. This is potentially strong evidence in favor of Darwin's view and against the vocal learning hypothesis, since this species is not known to be a vocal learner. However, sea lions (family Otariidae) are related to true seals (family Phocidae) and to walruses (family Odobenidae), which are known vocal learners [30],[34],[54]. Hence the absence of evidence for vocal learning in sea lions is not strong evidence of absence of this capacity or its underlying neural mechanisms. To test the prevailing view that sea lions are much less vocally flexible than seals, behavioral training studies of vocal flexibility in sea lions are needed, particularly since the most recent experimental studies of sea lion vocal flexibility date from the 1960s and 1970s [54]. Neural studies would also be of interest, e.g., structural neuroimaging of sea lions vs. seal brains using diffusion tensor imaging (DTI), a type of magnetic resonance imaging (MRI) that can visualize white matter pathways in living brains. DTI could be used to search sea lion brains for neural connections associated with vocal learning and for other connections potentially relevant for beat processing (e.g., the temporo-parietal branch and 2nd branch of the superior longitudinal fasciculus, Figure 2). 
It may be, for example, that sea lions retain auditory-motor circuits inherited from a vocal-learning common ancestor of seals, sea lions, and walruses [55], even though they do not show obvious signs of vocal learning in captivity. (Flexible auditory-motor integration may be useful to sea lions because of their amphibious lifestyle: they produce and perceive a diverse set of vocalizations in two very different environments, i.e., above and underwater [56]–[58].) If future work shows that sea lions have very limited vocal flexibility and lack the neural circuitry associated with vocal learning, this would seriously challenge the vocal learning hypothesis. It would however leave open the broader question of whether the ability to synchronize to an auditory beat in a human-like way is species-restricted, and if so, why only certain animals have this capacity.

Where Do We Go from Here?

The range of species capable of human-like synchronization to a beat is currently an unsolved mystery. Apart from further research on parrots and nonhuman primates, which other animals should be tested for this ability? In terms of vocal learners, further work is needed to find out whether the capacity to synchronize to a beat is latent in all vocal learners (e.g., including bats), or only in a subset of vocal learners who also have other key traits. Parrots, for example, can imitate nonvocal gestures and are also deeply social creatures who may have a propensity for coordinated movement with social partners [59]. It may be that these other traits are necessary, in addition to vocal learning, to create the capacity for human-like synchronization to a beat [15],[47],[48]. If this is the case, then only vocal learners with these other traits, such as dolphins [60], may be able to synchronize to a beat in a human-like fashion.

In terms of vocal nonlearners, one animal of particular interest is the domestic horse (Equus ferus caballus), a vocal nonlearning animal that (unlike sea lions) has no close vocal-learning relatives. In favor of Darwin's views on musical rhythm, there are anecdotal accounts of horses spontaneously synchronizing their gait to the beat of music, even when they have no rider (who could unintentionally give them cues to the beat). This makes them an ideal test case for Darwin's view, since the vocal learning hypothesis predicts that they lack human-like capacities for synchronizing to a musical beat. Using new methods for testing synchronization to music in horses ([61]–[63], Movie S1), this prediction can now be tested.

Stepping back to a larger view, studies of beat-based processing in other animals are part of a small but growing body of cross-species research on music processing (e.g., [16],[17],[26],[49],[50],[64]–[78]). Such research is in its infancy, but is worth pursuing because it provides an empirical approach to studying the evolutionary history of human musicality. Specifically, it can help identify which aspects of our nonlinguistic auditory processing are broadly shared with other species, which aspects are shared with just a few other species, and which are uniquely human. It is important to note that such work is essentially Darwinian in its approach. That is, even if Darwin was wrong about the widespread nature of musical rhythm processing, the cross-species approach to evolutionary studies that he championed will undoubtedly lead us to a deeper understanding of the biological roots of human music.

Supporting Information

Movie S1.

Illustration of a new method for testing if horses synchronize their gait to the beat of music, from [61]. In this “circular trotting to music” method, a horse trots in circles around a trainer while ambient music with a clear beat is played in the arena. The trainer wears closed-ear headphones and listens to masking music with no beat (e.g., meditation music), in order to avoid giving the horse inadvertent cues to the musical beat. Using frame-by-frame video analysis and quantitative statistical methods, the timing of the horse's footfalls is compared to the timing of musical beats to test for synchronization. The test is repeated at several different tempi to examine tempo flexibility, as in [49].

doi:10.1371/journal.pbio.1001821.s001

(MP4)
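Quantitative tests of synchronization of this kind typically rely on circular statistics: each footfall is expressed as a phase of the beat cycle, and a test such as the Rayleigh test asks whether those phases cluster rather than spread uniformly around the circle. The sketch below is illustrative only; the footfall times are invented, and the use of the Rayleigh test here is an assumption rather than a description of the published analysis.

```python
import math

def rayleigh_test(phases):
    """Rayleigh test for nonuniformity of circular data.
    phases: events as fractions of the beat cycle in [0, 1).
    Returns (R, p): mean resultant length and approximate p-value
    (Zar's standard approximation). A small p means the phases cluster
    around some angle, a necessary (though not sufficient) sign of
    synchronization to the beat."""
    n = len(phases)
    x = sum(math.cos(2 * math.pi * ph) for ph in phases)
    y = sum(math.sin(2 * math.pi * ph) for ph in phases)
    rn = math.hypot(x, y)   # resultant length (unnormalized)
    r = rn / n              # mean resultant length, in [0, 1]
    p = math.exp(math.sqrt(1 + 4 * n + 4 * (n * n - rn * rn)) - (1 + 2 * n))
    return r, min(p, 1.0)

def beat_phases(event_times, beat_period):
    """Phase of each event within an assumed fixed beat period."""
    return [(t / beat_period) % 1.0 for t in event_times]

# Invented footfall times (ms) clustered near a 500-ms beat
footfalls = [498, 1003, 1495, 2006, 2498, 3002, 3497, 4004]
r, p = rayleigh_test(beat_phases(footfalls, 500))
print(r, p)  # r near 1, p far below 0.05
```

Note that a significant Rayleigh result shows phase clustering at a fixed assumed tempo, not prediction; establishing human-like synchronization would additionally require evidence of anticipation and of generalization across tempi, as discussed earlier in the text.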

Acknowledgments

I am grateful to Adena Schachner and L. Robert Slevc for ideas relating to conserved auditory-motor cortical circuitry in sea lions and how it might relate to their capacity to synchronize movements to an auditory beat. I am also grateful to Andrew Schwartz for his suggestions about using reaching tasks in future studies of synchronization to a metronome in monkeys.

References

  1. 1. Conard NJ, Malina M, Münzel SC (2009) New flutes document the earliest musical tradition in southwestern Germany. Nature 460: 737–740. doi: 10.1038/nature08169
  2. 2. Morley I (2013) The prehistory of music. Oxford: Oxford University Press.
  3. 3. Nettl B, Stone R (1998) The Garland encyclopedia of world music (10 vols). New York: Garland Publications.
  4. 4. Brown S, Jordania J (2013) Universals in the world's musics. Psychol Music 41: 229–248. doi: 10.1177/0305735611425896
  5. 5. Nettl B (2000) An ethnomusicologist contemplates universals in musical sound and musical culture. In: Wallin NL, Merker B, Brown S, editors. The origins of music. Cambridge, MA: MIT Press. pp. 463–472.
  6. 6. Darwin C (1871) The descent of man and selection in relation to sex. London: John Murray.
  7. 7. Buzáki G (2006) Rhythms of the brain. New York: Oxford University Press.
  8. 8. Toiviainen T, Luck G, Thompson R (2010) Embodied meter: hierarchical eigenmodes in music-induced movement. Music Percept 28: 59–70. doi: 10.1525/mp.2010.28.1.59
  9. McNeill W (1995) Keeping together in time: dance and drill in human history. Cambridge, MA: Harvard University Press.
  10. Greenfield MD (2005) Mechanisms and evolution of communal sexual displays in arthropods and anurans. Adv Study Behav 35: 1–62. doi: 10.1016/s0065-3454(05)35001-7
  11. Mirollo RE, Strogatz SH (1990) Synchronization of pulse-coupled biological oscillators. SIAM J Appl Math 50: 1645–1662. doi: 10.1137/0150098
  12. Large EW, Snyder JS (2009) Pulse and meter as neural resonance. Ann N Y Acad Sci 1169: 46–57. doi: 10.1111/j.1749-6632.2009.04550.x
  13. Large EW (2010) Neurodynamics of music. In: Jones M, Fay RR, Popper AN, editors. Springer handbook of auditory research, Vol. 36: music perception. New York: Springer. pp. 201–231.
  14. Fitch WT (2012) The biology and evolution of rhythm: unraveling a paradox. In: Rebuschat P, Rohmeier M, Hawkins JA, Cross I, editors. Language and music as cognitive systems. Oxford: Oxford University Press. pp. 73–95.
  15. Schachner A (2010) Auditory-motor entrainment in vocal mimicking species: additional ontogenetic and phylogenetic factors. Commun Integr Biol 3: 290–293. doi: 10.4161/cib.3.3.11708
  16. Zarco W, Merchant H, Prado L, Mendez JC (2009) Subsecond timing in primates: comparison of interval production between human subjects and rhesus monkeys. J Neurophysiol 102: 3191–3202. doi: 10.1152/jn.00066.2009
  17. Hattori Y, Tomonaga M, Matsuzawa T (2013) Spontaneous synchronized tapping to an auditory rhythm in a chimpanzee. Sci Rep 3: 1566. doi: 10.1038/srep01566
  18. Repp BH, Su Y-H (2013) Sensorimotor synchronization: a review of recent research (2006–2012). Psychon Bull Rev 20: 403–452. doi: 10.3758/s13423-012-0371-2
  19. Repp BH (2005) Sensorimotor synchronization: a review of the tapping literature. Psychon Bull Rev 12: 969–992. doi: 10.3758/bf03206433
  20. Hanson FE, Case JF, Buck E, Buck J (1971) Synchrony and flash entrainment in a New Guinea firefly. Science 174: 161–164. doi: 10.1126/science.174.4005.161
  21. Gerhardt HC, Huber F (2002) Acoustic communication in insects and anurans. Chicago: University of Chicago Press.
  22. Dunlap K (1910) Reactions to rhythmic stimuli, with attempt to synchronize. Psychol Rev 17: 399–416. doi: 10.1037/h0074736
  23. Patel AD, Iversen JR, Chen Y, Repp BH (2005) The influence of metricality and modality on synchronization with a beat. Exp Brain Res 163: 226–238. doi: 10.1007/s00221-004-2159-8
  24. Hove MJ, Fairhurst MT, Kotz SA, Keller PE (2013) Synchronizing with auditory and visual rhythms: an fMRI assessment of modality differences and modality appropriateness. Neuroimage 67: 313–321. doi: 10.1016/j.neuroimage.2012.11.032
  25. Winkler I, Háden G, Ladinig O, Sziller I, Honing H (2009) Newborn infants detect the beat in music. Proc Natl Acad Sci U S A 106: 2468–2471. doi: 10.1073/pnas.0809035106
  26. Honing H, Merchant H, Háden GP, Prado L, Bartolo R (2012) Rhesus monkeys (Macaca mulatta) detect rhythmic groups in music, but not the beat. PLoS ONE 7: e51369. doi: 10.1371/journal.pone.0051369
  27. Fitch WT (2006) The biology and evolution of music: a comparative perspective. Cognition 100: 173–215. doi: 10.1016/j.cognition.2005.11.009
  28. Moran DW, Schwartz AB (1999) Motor cortical representation of speed and direction during reaching. J Neurophysiol 82: 2676–2692.
  29. Patel AD (2006) Musical rhythm, linguistic rhythm, and human evolution. Music Percept 24: 99–104. doi: 10.1525/mp.2006.24.1.99
  30. Janik VM, Slater PJB (1997) Vocal learning in mammals. Adv Study Behav 26: 59–99. doi: 10.1016/s0065-3454(08)60377-0
  31. Knörnschild M, Nagy M, Metz M, Mayer F, von Helversen O (2010) Complex vocal imitation during ontogeny in a bat. Biol Lett 6: 156–159. doi: 10.1098/rsbl.2009.0685
  32. Stoeger AS, Mietchen D, Oh S, de Silva S, Herbst CT, et al. (2012) An Asian elephant imitates human speech. Curr Biol 22: 2144–2148. doi: 10.1016/j.cub.2012.09.022
  33. Jarvis ED (2004) Learned birdsong and the neurobiology of human language. Ann N Y Acad Sci 1016: 749–777. doi: 10.1196/annals.1298.038
  34. Schusterman RJ, Reichmuth CJ (2008) Novel sound production via contingency learning in the Pacific walrus (Odobenus rosmarus divergens). Anim Cogn 11: 319–327. doi: 10.1007/s10071-007-0120-5
  35. Ridgway S, Carder D, Jeffries M, Todd M (2012) Spontaneous human speech mimicry by a cetacean. Curr Biol 22: R860–R861. doi: 10.1016/j.cub.2012.08.044
  36. Petkov CI, Jarvis ED (2012) Birds, primates, and spoken language origins: behavioral phenotypes and neurobiological substrates. Front Evol Neurosci 4: 12. doi: 10.3389/fnevo.2012.00012
  37. Grahn JA, Rowe JB (2009) Feeling the beat: premotor and striatal interactions in musicians and nonmusicians during beat perception. J Neurosci 29: 7540–7548. doi: 10.1523/jneurosci.2018-08.2009
  38. Chen JL, Penhune VB, Zatorre RJ (2008) Listening to musical rhythms recruits motor regions of the brain. Cereb Cortex 18: 2844–2854. doi: 10.1093/cercor/bhn042
  39. Kung S-J, Chen JL, Zatorre RJ, Penhune VB (2013) Interacting cortical and basal ganglia networks underlying finding and tapping to the musical beat. J Cogn Neurosci 25: 401–420. doi: 10.1162/jocn_a_00325
  40. Zatorre RJ, Chen JL, Penhune VB (2007) When the brain plays music: auditory-motor interactions in music perception and production. Nat Rev Neurosci 8: 547–558. doi: 10.1038/nrn2152
  41. Patel AD, Iversen JR (2014) The evolutionary neuroscience of musical beat perception: the action simulation for auditory prediction (ASAP) hypothesis. Front Syst Neurosci. In press. doi: 10.3389/fnsys.2014.00057
  42. Feenders G, Liedvogel M, Rivas M, Zapka M, Horita H, et al. (2008) Molecular mapping of movement-associated areas in the avian brain: a motor theory for vocal learning origin. PLoS ONE 3: e1768. doi: 10.1371/journal.pone.0001768
  43. Gierhan SME (2013) Connections for auditory language in the human brain. Brain Lang 127: 205–221. doi: 10.1016/j.bandl.2012.11.002
  44. Rauschecker JP (2011) An expanded role for the dorsal auditory pathway in sensorimotor control and integration. Hear Res 271: 16–25. doi: 10.1016/j.heares.2010.09.001
  45. Lewis JW, van Essen DC (2000) Corticocortical connections of visual, sensorimotor, and multimodal processing areas in the parietal lobe of the macaque monkey. J Comp Neurol 428: 112–137. doi: 10.1002/1096-9861(20001204)428:1<112::aid-cne8>3.0.co;2-9
  46. de Schotten MT, Dell'Acqua F, Valabregue R, Catani M (2012) Monkey to human comparative anatomy of the frontal lobe association tracts. Cortex 48: 82–96. doi: 10.1016/j.cortex.2011.10.001
  47. Fitch WT (2009) Biology of music: another one bites the dust. Curr Biol 19: R403–R404. doi: 10.1016/j.cub.2009.04.004
  48. Patel AD, Iversen JR, Bregman MR, Schulz I (2009) Studying synchronization to a musical beat in nonhuman animals. Ann N Y Acad Sci 1169: 459–469. doi: 10.1111/j.1749-6632.2009.04581.x
  49. Patel AD, Iversen JR, Bregman MR, Schulz I (2009) Experimental evidence for synchronization to a musical beat in a nonhuman animal. Curr Biol 19: 827–830. doi: 10.1016/j.cub.2009.03.038
  50. Schachner A, Brady TF, Pepperberg IM, Hauser MD (2009) Spontaneous motor entrainment to music in multiple vocal mimicking species. Curr Biol 19: 831–836. doi: 10.1016/j.cub.2009.03.061
  51. Hasegawa A, Okanoya K, Hasegawa T, Seki Y (2011) Rhythmic synchronization tapping to an audio-visual metronome in budgerigars. Sci Rep 1: 120. doi: 10.1038/srep00120
  52. Eerola T, Luck G, Toiviainen P (2006) An investigation of pre-schoolers' corporeal synchronization with music. In: Baroni M, Addessi AR, Caterina R, Costa M, editors. Proceedings of the 9th International Conference on Music Perception & Cognition (ICMPC9). Bologna, Italy: ICMPC and ESCOM. pp. 472–476.
  53. Cook P, Rouse A, Wilson M, Reichmuth C (2013) A California sea lion (Zalophus californianus) can keep the beat: motor entrainment to rhythmic auditory stimuli in a non vocal mimic. J Comp Psychol 127: 412–427. doi: 10.1037/a0032345
  54. Schusterman RJ (2008) Vocal learning in mammals with special emphasis on pinnipeds. In: Oller DK, Griebel U, editors. The evolution of communicative flexibility: complexity, creativity, and adaptability in human and animal communication. Cambridge, MA: MIT Press. pp. 41–70.
  55. Arnason U, Gullberg A, Janke A, et al. (2006) Pinniped phylogeny and a new hypothesis for their origin and dispersal. Mol Phylogenet Evol 41: 345–354. doi: 10.1016/j.ympev.2006.05.022
  56. Schusterman RJ, Gentry R, Schmook J (1967) Underwater sound production by captive California sea lions, Zalophus californianus. Zoologica 52: 21–24.
  57. Schusterman RJ (1978) Vocal communication in pinnipeds. In: Markowitz H, Stevens VJ, editors. Behavior of captive wild animals. Chicago: Nelson Hall. pp. 247–308.
  58. Reichmuth C, Holt MM, Mulsow J, Sills JM, Southall BL (2013) Comparative assessment of amphibious hearing in pinnipeds. J Comp Physiol A 199: 491–507. doi: 10.1007/s00359-013-0813-y
  59. Moore BR (1992) Avian movement imitation and a new form of mimicry: tracing the evolution of a complex form of learning. Behaviour 122: 231–263. doi: 10.1163/156853992x00525
  60. Connor RC, Smolker R, Bejder L (2006) Synchrony, social behaviour and alliance affiliation in Indian Ocean bottlenose dolphins, Tursiops aduncus. Anim Behav 72: 1371–1378. doi: 10.1016/j.anbehav.2006.03.014
  61. Bregman MR, Iversen JR, Lichman D, Reinhart M, Patel AD (2012) A method for testing synchronization to a musical beat in domestic horses (Equus ferus caballus). Empir Musicol Rev 7: 144–156.
  62. Venneman SS (2012) A commentary on Micah Bregman et al.: a method for testing synchronization to a musical beat in domestic horses (Equus ferus caballus). Empir Musicol Rev 7: 160–163.
  63. Schachner A (2012) If horses entrain, don't entirely reject vocal learning: an experience-based vocal learning hypothesis. Empir Musicol Rev 7: 157–159.
  64. Hulse SH, Page SC (1988) Toward a comparative psychology of music perception. Music Percept 5: 427–452. doi: 10.2307/40285409
  65. Page SC, Hulse SH, Cynx J (1989) Relative pitch perception in the European starling (Sturnus vulgaris): further evidence for an elusive phenomenon. J Exp Psychol Anim Behav 15: 137–146. doi: 10.1037/0097-7403.15.2.137
  66. Wright AA, Rivera JJ, Hulse SH, Shyan M, Neiworth JJ (2000) Music perception and octave generalization in rhesus monkeys. J Exp Psychol Gen 129: 291–307.
  67. McDermott JH, Hauser MD (2004) Are consonant intervals music to their ears? Spontaneous acoustic preferences in a nonhuman primate. Cognition 94: B11–B21. doi: 10.1016/j.cognition.2004.04.004
  68. McDermott JH, Hauser MD (2007) Nonhuman primates prefer slow tempos but dislike music overall. Cognition 104: 654–668. doi: 10.1016/j.cognition.2006.07.011
  69. Brooks DI, Cook RG (2010) Chord discrimination by pigeons. Music Percept 27: 183–196. doi: 10.1525/mp.2010.27.3.183
  70. Hagmann CE, Cook RG (2010) Testing meter, rhythm, and tempo discriminations in pigeons. Behav Processes 85: 99–110. doi: 10.1016/j.beproc.2010.06.015
  71. Snowdon CT, Teie D (2010) Affective responses in tamarins elicited by species-specific music. Biol Lett 6: 30–32. doi: 10.1098/rsbl.2009.0593
  72. Yin P, Fritz JB, Shamma SA (2010) Do ferrets perceive relative pitch? J Acoust Soc Am 127: 1673–1680. doi: 10.1121/1.3290988
  73. Tierney AT, Russo FA, Patel AD (2011) The motor origins of human and avian song structure. Proc Natl Acad Sci U S A 108: 15510–15515. doi: 10.1073/pnas.1103882108
  74. Bregman MR, Patel AD, Gentner TQ (2012) Stimulus-dependent flexibility in non-human auditory pitch processing. Cognition 122: 51–60. doi: 10.1016/j.cognition.2011.08.008
  75. Selezneva E, Deike S, Knyazeva S, Scheich H, Brechmann A, et al. (2013) Rhythm sensitivity in macaque monkeys. Front Syst Neurosci 7: 49. doi: 10.3389/fnsys.2013.00049
  76. Fitch W (2013) Rhythmic cognition in humans and animals: distinguishing meter and pulse perception. Front Syst Neurosci 7: 68. doi: 10.3389/fnsys.2013.00068
  77. Merchant H, Honing H (2013) Are non-human primates capable of rhythmic entrainment? Evidence for the gradual audiomotor evolution hypothesis. Front Neurosci 7: 274. doi: 10.3389/fnins.2013.00274
  78. Patel AD, Demorest S (2013) Comparative music cognition: cross-species and cross-cultural studies. In: Deutsch D, editor. The psychology of music, 3rd ed. London: Academic Press/Elsevier. pp. 647–681.
  79. Nottebohm F (1976) Phonation in the orange-winged Amazon parrot, Amazona amazonica. J Comp Physiol A 108: 157–170. doi: 10.1007/bf02169046
  80. Jarvis ED (2013) Evolution of brain pathways for vocal learning in birds and humans. In: Bolhuis JJ, Everaert M, editors. Birdsong, speech, and language: exploring the evolution of mind and brain. Cambridge, MA: MIT Press. pp. 63–107.
  81. Fitch WT, Mietchen D (2013) Convergence and deep homology in the evolution of spoken language. In: Bolhuis JJ, Everaert M, editors. Birdsong, speech, and language: exploring the evolution of mind and brain. Cambridge, MA: MIT Press. pp. 45–62.
  82. Chen Q, Heston JB, Burkett ZD, White SA (2013) Expression analysis of the speech-related genes FoxP1 and FoxP2 and their relation to singing behavior in two songbird species. J Exp Biol 216: 3682–3692. doi: 10.1242/jeb.085886
  83. Patel AD (2008) Music, language, and the brain. New York: Oxford University Press.
  84. Lewandowski B, Vyssotski A, Hahnloser RHR, Schmidt M (2013) At the interface of the auditory and vocal motor systems: NIf and its role in vocal processing, production and learning. J Physiol Paris 107: 178–192. doi: 10.1016/j.jphysparis.2013.04.001