Rhythm and Movement Lab
This laboratory is dedicated to studying the evolutionary origins of rhythmic abilities and synchronous coordination in humans by using a comparative, interdisciplinary approach. Methods used in our lab range from non-invasive behavioral experiments with humans and non-human primates to agent-based computational modeling. We apply both established experimental techniques (eye tracking) and self-developed hardware and software systems (motion tracking, primate drum kit).
Projects
Synchrony and temporal regularity in human vocalization
In this project we investigated whether inter-individual synchronization of speech, a behavior that is typically irregular in time, could lead to evenly paced or "isochronous" temporal patterns. Participants read nonsense phrases aloud with and without a partner, and we found that synchronous reading produced more regular durational intervals between words. Comparison of same-gender pairings showed that males and females synchronized their temporal speech patterns with equal skill (PI: Dr. Daniel Bowling).
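As an illustration only (not the project's actual analysis code), temporal regularity of durational intervals between word onsets can be summarized with two standard indices, where lower values indicate more evenly paced (isochronous) speech:

# Illustrative sketch, not the project's analysis pipeline: two common indices of
# temporal regularity for a sequence of durational intervals (in seconds).
# Lower values indicate more evenly paced ("isochronous") speech.

def coefficient_of_variation(intervals):
    """Standard deviation of the intervals divided by their mean."""
    mean = sum(intervals) / len(intervals)
    variance = sum((d - mean) ** 2 for d in intervals) / len(intervals)
    return variance ** 0.5 / mean

def npvi(intervals):
    """Normalized pairwise variability index over successive intervals."""
    pairs = list(zip(intervals[:-1], intervals[1:]))
    return 100 / len(pairs) * sum(abs(a - b) / ((a + b) / 2) for a, b in pairs)

# Hypothetical inter-word intervals from a solo and a synchronous reading
solo = [0.42, 0.61, 0.35, 0.58, 0.44]
together = [0.48, 0.51, 0.47, 0.50, 0.49]
print(coefficient_of_variation(solo), coefficient_of_variation(together))
print(npvi(solo), npvi(together))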
Movement synchrony in humans
In this project we investigate the prerequisites for and mechanisms of interpersonal synchrony in humans. We assess synchrony in a wide range of situations: from arbitrary stationary movements or movements along trajectories to complex dance moves (PI: Dr. Lisa Horn).
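As one illustration of how such movement synchrony can be quantified (a sketch under simplifying assumptions, not the lab's analysis code), two movement traces can be scored by their peak normalized cross-correlation, allowing a small lead or lag between partners:

# Illustrative sketch, not lab code: score interpersonal synchrony as the peak
# (approximately) normalized cross-correlation between two movement traces,
# allowing a small lead/lag between the partners.
import numpy as np

def max_crosscorr(x, y, max_lag=30):
    """Largest mean product of the z-scored traces over lags of +/- max_lag samples."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    best = -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag]
        else:
            a, b = x[:len(x) + lag], y[-lag:]
        best = max(best, float(np.mean(a * b)))
    return best

# Example: two noisy sinusoidal movements, the second slightly delayed
t = np.linspace(0, 10, 500)
p1 = np.sin(2 * np.pi * 0.8 * t) + 0.1 * np.random.randn(t.size)
p2 = np.sin(2 * np.pi * 0.8 * (t - 0.1)) + 0.1 * np.random.randn(t.size)
print(max_crosscorr(p1, p2))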
Comparative Drumming Studies
This project investigates the evolution of the sense of rhythm, namely the cognitive capabilities to perceive, produce, and synchronize with quasi-periodic temporal patterns. We developed two prototypes for testing rhythmic abilities in chimpanzees (Pan troglodytes), which allow us to arbitrarily associate sounds with physical object movements (PI: Andrea Ravignani).
Methods & Technology
Eye Tracking
Our lab is equipped with the SR EyeLink 1000 infrared eye tracking system (SR Research, www.sr-research.com/index.html), which can be used either with a chin rest or without head support. The eye tracker supports gaze tracking as well as measurement of pupil diameter (pupillometry), and it comes with its own graphical programming environment (Experiment Builder). Gaze tracking is used to assess a participant's attentional focus; pupillometry has been shown to be a reliable, noninvasive measure of cognitive load and affective processing.
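By way of illustration, the snippet below sketches a standard pupillometry analysis step (baseline-correcting each trial's pupil trace) under assumed sampling and trial parameters; it is not our lab's actual processing pipeline.

# Illustrative sketch, not the lab's pipeline: stimulus-evoked pupil dilation is
# typically computed relative to a pre-stimulus baseline before comparing conditions.
import numpy as np

def evoked_dilation(trace, sample_rate=1000, baseline_s=0.5):
    """Mean post-stimulus pupil diameter minus the mean of the pre-stimulus baseline.

    `trace` holds one trial's pupil-diameter samples; the first `baseline_s`
    seconds are treated as the baseline window. NaNs (e.g. blinks) are ignored.
    """
    n_baseline = int(baseline_s * sample_rate)
    baseline = np.nanmean(trace[:n_baseline])
    return np.nanmean(trace[n_baseline:]) - baseline

# Example: a synthetic 3-second trial sampled at 1000 Hz with a slow dilation
t = np.linspace(0, 3, 3000)
trial = 4.0 + 0.3 * np.clip(t - 0.5, 0, None) / 2.5 + 0.01 * np.random.randn(t.size)
print(evoked_dilation(trial))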
Motion Tracking
We developed a motion tracking system that combines marker-based video motion capture with hand-held acceleration sensors. Video motion capture is used for experiments in which participants move in the horizontal plane: video is recorded with a high-speed camera mounted on the ceiling directly above the participants, and a computer program tracks the locations of the markers over time. Acceleration data are collected with small 3-axis accelerometers (PhidgetSpatial 0/0/3 High Resolution, www.phidgets.com/products.php) in experiments in which participants move in the horizontal and/or vertical plane.
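As a rough illustration of the video part of this pipeline (assuming bright markers on a darker background and a hypothetical recording "session.mp4"; this is not our actual tracking software), marker positions can be extracted frame by frame with OpenCV by thresholding and taking contour centroids:

# Illustrative sketch, not the lab's tracking program: locate bright markers in each
# frame of an overhead recording via thresholding and contour centroids.
import cv2

cap = cv2.VideoCapture("session.mp4")    # hypothetical file name
trajectory = []                          # (frame_index, x, y) for each detected marker

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:                 # skip degenerate contours
            trajectory.append((frame_idx, m["m10"] / m["m00"], m["m01"] / m["m00"]))
    frame_idx += 1

cap.release()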
Melodic Memory/Beat Alignment Perception Task (Gold-MSI)
We developed a PsychoPy implementation of the Melodic Memory Task and the Beat Alignment Perception Task included in the Goldsmiths Musical Sophistication Index (Gold-MSI). The Beat Alignment Perception Task tests participants' ability to perceptually track a rhythmic pulse or beat by asking them to indicate whether a click track played alongside a music excerpt is on or off the beat of the music. (Dr. Bruno Gingras, Dr. Estela Puig-Waldmüller)
The PsychoPy implementations of the Melodic Memory and Beat Alignment Perception tasks are available for download in German and English versions.
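For illustration only, the snippet below sketches the logic of a single Beat Alignment Perception trial in PsychoPy; the stimulus file name and response keys are assumptions, and this is not the downloadable implementation.

# Minimal sketch of one Beat Alignment Perception trial, not the downloadable task.
# The stimulus file name and response keys are placeholders.
from psychopy import visual, sound, event, core

win = visual.Window(fullscr=False, color="black")
prompt = visual.TextStim(win, text="Was the click track ON the beat? (y / n)")

# A music excerpt mixed with a click track that is either aligned with or shifted
# off the musical beat.
stimulus = sound.Sound("excerpt_with_clicks.wav")   # hypothetical stimulus file
stimulus.play()
core.wait(stimulus.getDuration())

prompt.draw()
win.flip()
keys = event.waitKeys(keyList=["y", "n"])
judged_on_beat = keys[0] == "y"
print("Participant judged the clicks to be on the beat:", judged_on_beat)

win.close()
core.quit()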
Primate Drum Kit
We developed two prototypes for testing rhythmic abilities in chimpanzees (Pan troglodytes), which allow us to arbitrarily associate sounds with physical object movements. One prototype uses four piezoelectric elements embedded between layers of Plexiglas and foam; strain data are sent to a computer running Python via an Arduino board. The second prototype consists of a modified Wii Remote embedded in a rubber toy; acceleration data are sent via Bluetooth to a computer running Max/MSP. We successfully pilot-tested the first device with a group of chimpanzees and foresee using both devices in a range of cognitive experiments.
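To illustrate the data path of the first prototype (a sketch only, with an assumed serial port, baud rate, and comma-separated message format; the published system's code may differ), strain readings streamed from the Arduino can be logged in Python with pyserial:

# Illustrative sketch, not the published code: log piezo strain values streamed from
# the Arduino over USB serial. Port name, baud rate, and message format are assumptions.
import time
import serial

ser = serial.Serial("/dev/ttyACM0", 115200, timeout=1)   # hypothetical port and baud rate

with open("drum_log.csv", "w") as log:
    log.write("t,piezo1,piezo2,piezo3,piezo4\n")
    end = time.time() + 60                    # record for one minute
    while time.time() < end:
        line = ser.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        values = line.split(",")              # e.g. "512,498,601,490", one value per element
        if len(values) == 4:
            log.write(f"{time.time()},{','.join(values)}\n")

ser.close()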
(These pictures are taken from the publication Ravignani, A.; Olivera, V.M.; Gingras, B.; Hofer, R.; Hernandez, C.R.; Sonnweber, R.-S.; Fitch, W.T. (2013) Primate Drum Kit: A System for Studying Acoustic Pattern Production by Non-Human Primates Using Acceleration and Strain Sensors. Sensors, 13. doi:10.3390/s130809790)