Music from all over the world contains different types of meter, the underlying pulse of a song. If you are tapping your foot to the “beat” of a song, you are moving to that song’s meter. Music cognition research has long theorized about how meter might be represented in the brain, and several theories have been proposed. Previous research relied on behavioral studies involving “tapping tasks,” which require participants to tap along to a superimposed beat. With growing interest in methods that measure brain activity directly, researchers have begun to look at how the brain responds to mental representations of meter.
An all-star cast of John R. Iversen, Bruno H. Repp, and Aniruddh D. Patel designed an experiment to investigate brain responses to perceived metrical structure, one that enabled the experimenters to change the perceived meter without changing the musical structure of the auditory stimulus. As the researchers noted, previous studies had measured brain responses related to meter perception, but those studies relied on omitting or dropping a beat and measuring brain activity in response to the missing sound. As described in their methods section, Iversen, Repp, and Patel took a different route and created a stimulus that can be perceived in multiple ways, making the perception of meter itself an experimental variable.
The stimulus consisted of two 45-ms tones followed by 45 ms of silence (TTO). In experiment 1, the endogenous condition, participants first listened to a familiarization phase in which an accented beat was placed on one of the two tone positions. A test phase followed, during which the accent was removed from the TTO patterns. Participants were instructed to place the “beat” on whichever of the two locations matched the familiarization phase, and they could push a button whenever they felt they had lost the meter. In experiment 2, the exogenous condition, the same methodology was used with one difference: the accent from the familiarization phase remained during the test phase. Additionally, the participants were not told about the meter or the TTO pattern, but were instructed simply to listen to a stream of tone events. The experimenters hoped that participants would not form a mental representation of meter during experiment 2. During each trial, brain activity was measured with magnetoencephalography (MEG).
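To make the timing of the stimulus concrete, here is a minimal sketch of the repeating TTO pattern: two 45-ms tones followed by a 45-ms silent slot, so one full pattern spans 135 ms. The paper’s summary above specifies only the durations, so tone frequency and amplitude are left out and only onset times are modeled; the function name `tto_onsets` is my own, not from the study.

```python
UNIT_MS = 45  # duration of each tone and of the silent slot, per the stimulus description

def tto_onsets(n_patterns):
    """Return (onset_ms, slot) pairs for n_patterns repetitions of the TTO pattern.

    slot is 'T1' or 'T2'; the third 45-ms slot is silence, so no event
    is emitted for it.
    """
    events = []
    for i in range(n_patterns):
        start = i * 3 * UNIT_MS              # each full pattern spans 135 ms
        events.append((start, "T1"))         # first tone
        events.append((start + UNIT_MS, "T2"))  # second tone
    return events

# First two patterns: tone onsets at 0, 45, 135, and 180 ms
print(tto_onsets(2))
```

This regularity is what lets the perceived beat be an experimental variable: the sound stream never changes, only where the listener mentally places the accent.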
Three main findings emerged: 1) Brain responses to the auditory stimulus were evoked in the beta frequency range (20-30 Hz); additionally, in the endogenous condition, beta responses increased by 35% when the beat was placed on a tone. 2) The endogenous and exogenous conditions (imagined versus accented beats) showed a similar increase in beta-band activity. 3) Unique to the exogenous condition, the accented stimulus evoked responses in the evoked response field (ERF) and in the gamma-band range. So what might this all mean? The data suggest that beta-range brain activity plays a role in both top-down (endogenous condition) and bottom-up (exogenous condition) processes. However, the physical accents in the exogenous condition evoked specific responses (gamma band and ERF) unique to that condition.
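To illustrate what “responses in the 20-30 Hz beta band” means in practice, here is a minimal sketch of extracting band-limited power from a signal via the FFT. This is not the authors’ MEG analysis pipeline (which used dedicated MEG software); the sampling rate and the toy 25 Hz signal are assumptions for illustration only.

```python
import numpy as np

def band_power(signal, fs, lo=20.0, hi=30.0):
    """Fraction of total spectral power falling in [lo, hi] Hz,
    defaulting to the 20-30 Hz beta band discussed in the findings."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2      # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)  # frequency of each bin
    in_band = (freqs >= lo) & (freqs <= hi)
    return spectrum[in_band].sum() / spectrum.sum()

fs = 600                              # Hz, a plausible MEG sampling rate (assumption)
t = np.arange(0, 1.0, 1.0 / fs)       # one second of samples
sig = np.sin(2 * np.pi * 25 * t)      # a pure 25 Hz oscillation sits inside the beta band
print(band_power(sig, fs))            # nearly all power falls in 20-30 Hz
```

A comparison like the reported 35% beta increase would come from computing this kind of band power separately for trials where the imagined beat fell on each tone position, then contrasting the two.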
Iversen, J. R., Repp, B. H., & Patel, A. D. (2009). Top-down control of rhythm perception modulates early auditory responses. Annals of the New York Academy of Sciences, 1169, 58-73.