


Emotive Motion:

Analysis of Expressive Timing and Body Movement in the Performance of an Expert Violinist

Sheena Chandran, John Stoecker, Matt Wright

Orthopedic Surgery 222 – Anatomy of Movement

March 14, 2006

Introduction

The vast majority of the music heard in our culture is considered a form of art or expression, but a scientist can use musical elements to sonify data, revealing trends or patterns in a way analogous to using graphical elements to visualize the data. Sonification assigns sound to data using musical elements such as pitch, volume, and multiple dimensions of timbre to color each individual sound, as well as the infinitely many ways of combining sounds to control rhythm, density, harmony, and many other perceptions, all potentially varying as continuous functions of time. Using sonification, the scientist controls the aural expression of the data and makes a conscious effort to map data parameters to sound parameters. Therefore, much like an artist, the scientist determines whether or not the sonification will elegantly reveal the structure of the original data. Sonification thus has qualities reflecting both art and scientific rigor. Our project explored this relationship by distilling aspects of motion data from a violin performance into a recognizable form through sonification.

In this study, we used sonification to show how the timing of the motion of a professional concert violinist corresponds to the temporal evolution of aspects of the music while playing an excerpt of J.S. Bach's Chaconne (the fifth and final movement of Bach's Partita No. 2 in D minor, BWV 1004). Our goal was to combine sonification and visual display of motion and force plate data from the Lucile Packard Children's Hospital Motion & Gait Analysis Lab to show how rhythmic expression manifests in the musician's body during performance.

Based on our observation of the highly expressive motion of the musician's body during performance, and on our personal musical experience, we hypothesized that:

1) position path data from the force plate would demonstrate differences between performances given under different emotional cues, and

2) certain body movements would correlate with the tempo of the performance.

Background

It is well known that an important element of musical expression is the manipulation of timing 3-5,12, both in terms of continuous changes of tempo, such as speeding up and slowing down over time, and in terms of expressive microtiming of individual notes. In this paper, expressive microtiming is defined as musical events coming before or after the appropriate "clock tick" implied by the current tempo 5. One thread of computer music research aims to analyze expressive timing directly from the audio signal 3,11,12. In this study, we apply a combination of these methods, as well as manual analysis of the recorded music, to identify the ways in which the musician's performances utilize expressive timing.

Many models of expressive musical timing, particularly continuous modulation of tempo in the performance of western classical music, are inspired by the motion of physical objects, such as the parabola that results from throwing an object up into the air 10,14,16. These tempo curves typically follow the music's phrase structure 2; we expect to see the music's phrase structure also manifesting in the correlated motion of certain parts of the musician's body.

Much previous work on the analysis of motion capture data from musical performance focuses on quantifying the kinematics of expert musicianship. Examples include the effect of increased tempo on the timing of pianists' upward motion in preparation for playing each note 8, the determination that movement amplitude contributes more than movement anticipation toward the tendency for pianists to play louder when they play faster 8, and the study of differences between novice and expert 'cello performance in hopes of finding support for a model of musical skill as dynamic constraint satisfaction 15.

Methods

Data collection consisted of using an eight-camera three-dimensional motion-capture system (Motion Analysis Corp., Santa Rosa, CA) to record the motion of the musician's head, trunk, limbs, violin, and bow with respect to time. A force plate (Bertec Corp., Columbus, OH) embedded in the floor was used to record the ground reaction forces. Data acquisition was conducted in the Lucile Packard Children's Hospital Motion & Gait Lab at Stanford University on January 30, 2006.

[pic]

Figure 1. The subject performing while marked for three-dimensional motion capture. The indicated directional labels will be used throughout the paper.

The force plate data consisted of the three-dimensional force, the position of the center of pressure of the musician on the plate, and the moment about the vertical axis, all sampled at a rate of 2400 Hz. The motion data captured by the digital cameras consisted of three-dimensional marker positions sampled at a rate of 240 Hz.

The subject was instructed to perform an approximately 60-second excerpt of the Chaconne under eight different emotional cues: Normal Performance, Angry, Playful, Mournful, Trepidation, No Emotion, Least Motion, and Searching. In addition to collecting motion capture and force plate data, a 3-chip miniDV camera was used to capture video and a "shotgun" microphone was used to capture sound at 48 kHz (better than CD quality). A clapper was marked with motion-capture reflectors on either edge of its clapping surface, and in all of our investigations the noise of the clapper in the audio recording was synchronized to the closing of the clapping apparatus in the motion-capture data.

Data was compared to find trends, which were then sonified to show their relation to the piece; previous work has already laid a framework for the sonification of movement 7.

Using Matlab (The MathWorks, Inc., Natick, MA), a matrix manipulation program, we plotted the mediolateral and anteroposterior displacements from the force plate data against time. In Figure 2, these plots are matched to the bar markings of the piece. Bar markings represent the onset and end of each musical measure, a delineation of notes used in western music to give structure to written music.
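As an illustration, a minimal Matlab sketch of this plotting step might look as follows (ml, ap, and bar_times are hypothetical variable names for the mediolateral position, the anteroposterior position, and the measured bar onset times):

fs = 2400;                          % force plate sampling rate, Hz
t  = (0:length(ml)-1) / fs;         % time axis, in seconds

subplot(2,1,1); plot(t, ml); hold on;
for k = 1:length(bar_times)
    plot([bar_times(k) bar_times(k)], ylim, 'k:');   % overlay each bar marking as a dotted line
end
xlabel('Time (s)'); ylabel('Mediolateral position');

subplot(2,1,2); plot(t, ap);
xlabel('Time (s)'); ylabel('Anteroposterior position');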

[pic]

Figure 2. Mediolateral position vs. time and anteroposterior position vs. time plot of “normal” performance

Using the force plate data and the subset of the motion-capture data that we had, we created tempo curves for each of the eight trials. The musical “tempo” is the rate at which beats occur, generally in units of beats per minute. Modulation of tempo, such as speeding up and slowing down, is an essential element of the musician’s expressiveness. We marked the time at which each of the 17 measures and 171 notes began in each of the 8 performances.

From these note onset times it is simple to compute what might be called the “per-note instantaneous tempo”:

1) take each inter-onset interval (time between successive onsets)

2) take the reciprocal

3) scale by each note's duration in the score (to account for the fact that you expect eighth notes to be twice as long as sixteenth notes)

4) scale by 60 (for inter-onset intervals measured in seconds) so that the units are beats per minute.

We used this data to create tempo curves that we correlated against the motion-capture and force plate data.
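As a concrete illustration, steps 1-4 could be carried out in Matlab roughly as follows (a sketch only; onset_times and score_beats are hypothetical variable names for the measured note onset times, in seconds, and each note's notated duration, in beats):

ioi = diff(onset_times);                       % 1) inter-onset intervals, in seconds
tempo_bpm = score_beats(1:end-1) ./ ioi * 60;  % 2-4) reciprocal, scaled by score duration and by 60
plot(onset_times(1:end-1), tempo_bpm);         % per-note instantaneous tempo curve
xlabel('Time (s)'); ylabel('Tempo (beats per minute)');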

A final step in this study was the sonification of interesting trends in the data. Using Audacity, an open-source program for recording and editing sound, we rendered the change of various variables as varying frequencies and timbres in order to create what is in effect an auditory graph of our data.

Results

As discussed above, our analysis focused primarily on the force plate data. Figure 3 shows the position path of the center of pressure for each of the eight performances. The center of pressure can be thought of as roughly the position of the subject's center of gravity relative to the force plate in the mediolateral-anteroposterior plane, so each plot is most easily read as the trajectory traced out by the subject's center of gravity over the course of the performance.

[pic]

Figure 3. Plot of the position paths of the center of pressure on the force plate for all 8 performances.

Seeing these 8 graphs side by side clearly demonstrates that the overall amount of movement is much smaller in the "no emotion" and "least motion" performances than in the others. A closer look at the data showed that the transitions, where the position path changes direction, were indicative of the emotional state under which each piece was performed. Closer inspection of these "transition regions," i.e., the extreme left/right edges of the position path (Figure 4), further demonstrates this finding through a comparison of the left transition regions of the "angry" and "playful" performances.

[pic]

Figure 4. Plot of the left "transition region" of the position paths of the center of pressure for the "angry" and "playful" trials. Note the difference in smoothness of the transitions in the "playful" trial versus the "angry" trial. In this graph the X position represents the mediolateral position and the Y position represents the anteroposterior position.

Figure 4 clearly demonstrates the jaggedness of the "angry" performance's transition region versus the rounded appearance of the "playful" performance's. This finding led us to consider ways of quantifying this easily visualized qualitative difference. To quantify the jaggedness of the transition region, we took the first derivative of the mediolateral and anteroposterior center-of-pressure positions with respect to time, converted the resulting velocity components to polar coordinates to obtain the angle of the velocity vector of the position path, and then took the first derivative of that angle with respect to time to quantify the rate at which the direction of travel was changing.
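A minimal Matlab sketch of this jaggedness measure follows (ml and ap are hypothetical variable names, and the call to unwrap, which removes artificial jumps of 2*pi in the angle, is our own addition; the original analysis may have handled this differently):

fs = 2400;                          % force plate sampling rate, Hz
vml = diff(ml) * fs;                % mediolateral velocity
vap = diff(ap) * fs;                % anteroposterior velocity
theta = unwrap(atan2(vap, vml));    % direction (angle) of the velocity vector, in radians
jaggedness = diff(theta) * fs;      % rate of change of that direction, in rad/s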

Figure 5 plots this jaggedness measure for the eight performances. While there are what appear to be strong peaks in the data, we were unable to derive conclusive meaning from these graphs.

[pic]

Figure 5. Plot of the first derivative of the angle of the velocity vector of the position path data with respect to time for each of the 8 trials.

Despite the lack of quantitative evidence for the change in jaggedness of the "transition regions," we were able to quantify the difference in range of motion among the performances. Figure 6 shows the mean, standard deviation, and range of motion in both the mediolateral and anteroposterior directions for each of the eight trials.
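For a single trial these statistics reduce to a few lines of Matlab (a sketch, with ml and ap again standing in for the two center-of-pressure coordinates):

rom_ml = max(ml) - min(ml);                 % range of motion, mediolateral
rom_ap = max(ap) - min(ap);                 % range of motion, anteroposterior
stats  = [mean(ml) std(ml) rom_ml;          % mean, standard deviation, range (ML)
          mean(ap) std(ap) rom_ap];         % mean, standard deviation, range (AP)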

[pic]

Figure 6. Plot of the mean, range and standard deviation of the range of motion of the center of pressure of all 8 trials. Note the variety in the range of motion amongst the performances.

Table 1 gives a quantitative breakdown of the findings from Figure 6. Note the large difference in range of motion between the "no emotion" and "least motion" performances and the other six trials. There is a clear difference between the performances given with an emotional cue and those given without one.

[pic]

Table 1. Results of the statistical analysis of the range of motion of the position path of the center of pressure across the 8 trials. Note the small range of motion of the "no emotion" and "least motion" trials in comparison to that of the "angry" or "searching" performance.

Additional analysis of the force plate data revealed a consistent correspondence between sway along the mediolateral axis and the bar markings across the eight performances. Figure 7 shows the roughly two-bar interval over which most performances exhibit a shift of the subject's center of pressure.

Figure 7. Plot of time vs. the X (mediolateral) position of the center of pressure during each of the 8 performances, with the time of the downbeat of each measure overlaid as a vertical line (meant to be visually analogous to a bar line in Western notation).

In Figure 7, the widely differing scales of the X (mediolateral) axis show the variability of the range of motion across the eight performances. The "least motion" plot shows that the subject shifted their weight once and then barely moved for the rest of the performance. In general the subject's motion along the mediolateral axis is roughly cyclic, and changes in the direction of movement occur near the bar markings. This trend holds even for the single shift during the "least motion" performance. On occasion there are small "wobbles" in the data associated with the bar line even when the main shift of the weight is not evident; this is seen in the "normal" performance on bars 5 and 7, the "trepidation" performance on bars 8 and 9, the "searching" performance on bars 7 and 11, and the "mournful" performance on bar 8. The extrema are flatter in some performances (e.g., the "mournful" performance), indicating that the subject shifted their weight and planted their center of pressure for some period of time, whereas in others, such as the "playful" and "searching" performances, the extrema are more rounded, implying continual motion.

Our next area of investigation addressed our second hypothesis: that the periodicity of the motion of various body parts would relate to the periodicity of different levels of the music, such as note, beat, measure, phrase, and section. We calculated the "per-bar instantaneous tempo" of each of the eight trials to produce Figure 8.

[pic]

Figure 8. Comparison of per-bar tempo for all 8 performances. Note that higher-tempo performances last for a shorter amount of time. (The integral of each tempo function is 48 beats, the duration of the excerpt.)

The per-note (or per-beat) tempo curve is too jagged and does not match the actual perception of tempo. For example, if one note's duration is smaller than those of the nearby notes, one does not naturally perceive that the tempo sped up just for that one note and then slowed down again, but rather that the subsequent note slightly anticipated the beat.

Operating under this assumption, we smoothed the tempo curve with a triangular kernel. To avoid accumulating errors in the note onset times, we smoothed the integral of the tempo curve, sometimes known as the "time map," which gives the fractional number of beats elapsed as a function of time in each performance. The "smoothed tempo curve" is therefore the derivative of the smoothed time map. For example, a length-three triangular kernel would be [1/2, 1, 1/2], normalized by its sum so that timing is preserved, and smoothing with it would mean

output(n) = ( 0.5*input(n-1) + 1.0*input(n) + 0.5*input(n+1) ) / 2

for each position n.
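One plausible Matlab sketch of this smoothing step is shown below; here the measured onset times are smoothed as a function of score position, which is equivalent to smoothing the inverse of the time map described above. The kernel half-width and the variable names (onset_times, score_beats_elapsed) are hypothetical:

halfwidth = 3;                                   % kernel half-width, in notes (hypothetical value)
k = [1:halfwidth, halfwidth+1, halfwidth:-1:1];  % triangular kernel
k = k / sum(k);                                  % normalize so that overall timing is preserved
smoothed_onsets = conv(onset_times, k, 'same');  % smoothed onset times (edge effects ignored for brevity)
smoothed_tempo = 60 * diff(score_beats_elapsed) ./ diff(smoothed_onsets);   % smoothed tempo, in BPM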

The tempo curve predicts the time at which each note should occur. The more one smooths the tempo curve, the more these predictions will differ from the true event onset times. We refer to these differences as the per-note deviations; our convention is that a positive deviation indicates a note that comes after the time predicted by the tempo curve, whereas a negative deviation indicates a note that comes earlier than expected. Figure 9 shows the result of successively smoothing the tempo curve as well as the resulting deviations.
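Continuing the sketch above, the per-note deviations under this sign convention would simply be:

% positive = note arrives later than the smoothed tempo curve predicts,
% negative = note arrives earlier than predicted
deviations = onset_times - smoothed_onsets;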

[pic]

Figure 9. The left column shows tempo curves of the "angry" trial; each stem plot in the right column shows the per-note deviations computed from the corresponding tempo curve. Each subsequent row shows the data smoothed with a wider triangular kernel.

We tested the hypothesis that elements of expressive timing would be manifested in the physical motion of the musician's body by methodically correlating all motion data, as functions of time, with the tempo curves and per-note deviations of the corresponding performances. Table 2 tabulates the results with the highest correlation coefficients (rho). In all cases, since the sampling rate is 240 Hz and the duration of each performance is at least 30 seconds, there was an abundance of data points, so the p-values were infinitesimally small (i.e., 10^-30 or less).
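A sketch of one such correlation in Matlab is shown below; it assumes (hypothetically) that the motion signal and the tempo curve have already been resampled onto a common 240 Hz time base, for example by interpolating the tempo curve between note onsets:

% marker_sig : one coordinate (or its derivative) of one marker, at 240 Hz
% tempo_sig  : tempo curve evaluated on the same 240 Hz time base
[R, P] = corrcoef(marker_sig, tempo_sig);
rho = R(1, 2);    % correlation coefficient of the kind reported in Table 2
p   = P(1, 2);    % corresponding p-value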

[pic]

Table 2. Correlation of various markers and force plate data with tempo. In this chart ML symbolizes movement in the mediolateral direction, AP symbolizes movement in the anteroposterior direction, and dAP/dt symbolizes the velocity in the AP direction. Rho values indicate the degree of correlation of each data set with the tempo curve. A rho value of 1 would indicate perfect correlation (i.e., that the given motion parameter is a linear function of the tempo curve), while a rho value of 0 indicates no linear correlation between the motion parameter and the tempo curve.

As Table 2 demonstrates, it was interesting to find that the highest correlations of speed with tempo were in the knee and shank, whereas high correlations of position with tempo occurred literally from head to toe. The high correlations of per-dimension velocity with tempo all involved the scapula and clavicle, indicating that superior/inferior motion of the shoulder correlates strongly with the tempo in this performance. The 3D acceleration of every marker had a rho value of at most 0.07 with any of the timing parameters, indicating that acceleration has almost no linear relation to tempo.

Due to our overambitious application of markers during data collection, it proved prohibitively time consuming to clean the motion capture data, so all of the above correlative analysis describes only the "normal" performance of the piece. However, further correlative investigation using the force plate data for the remaining seven performances showed widely differing degrees of correlation between the force plate data and the timing parameters, ranging from the "angry" performance, whose highest correlation was a rho value of 0.07, to the "no emotion" performance, whose highest correlation was a rho value of -0.44. In certain performances the subject physically manifested the tempo in the shifting of his body weight, but in others he did not. This finding is doubly remarkable because in the "no emotion" case the video corroborates that the subject was not visibly shifting his weight back and forth, and, as Figure 7 demonstrates, the amplitude of the sway was low and there was no evident correspondence between his movements and the incidence of the bar lines.

Sonification, or data-controlled sound, aided in the presentation of our results. We sonified two aspects of our results: first, the mediolateral position and measure onsets together with the moment about the vertical axis and note onsets, and second, the results of the rho correlation analysis. We chose to sonify the periodicity of the mediolateral position and the moment about the vertical axis because their correspondence to musical timing is easily conveyed through sonification, and we chose to sonify the rho correlations because the contrary and similar motion they exhibit is a central concept in music theory 1.

We chose to modulate timbre for the sonification of the mediolateral position and the moment about the vertical axis. Although no single quantitative measure of timbre exists 2, many models attribute harmonic content as the main determinant of timbre 5,9. Harmonic content refers to the frequencies and amplitudes of all partials in a sound other than the perceived fundamental. To modulate timbre, we used a band-pass filter, which emphasizes certain frequencies of the harmonic content; by using the data to change which frequencies are emphasized, we mapped the mediolateral position and the moment about the vertical axis to timbre. We imported the force plate data into PureData (Miller Puckette, La Jolla, CA), a musical programming environment, and swept a band-pass filter over a tambura drone. We chose the tambura, an Indian stringed instrument, for its rich harmonic sound, which gives the timbre sweep a distinct effect. When the audio recording of the violin performance is overlaid on the sonification, the timbre driven by the mediolateral position modulates approximately once every two measures, and the timbre driven by the moment about the vertical axis modulates approximately once every two notes. This is especially apparent in the "no emotion" sonification. These results are graphically represented in Figures 7 and 10.
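The project used Pure Data for this sonification; purely as an illustration of the mapping, a Matlab sketch of a data-driven band-pass sweep over a drone recording might look as follows (it assumes the Signal Processing Toolbox, and drone and ctl are hypothetical variables holding the drone audio and a control signal normalized to the range 0-1, one value per audio block):

fs_audio = 48000;                          % audio sampling rate, Hz
blk = 1024;                                % block size, in samples
drone = drone(:);                          % force column orientation
nblocks = min(floor(length(drone)/blk), length(ctl));
out = zeros(nblocks*blk, 1);
for n = 1:nblocks
    fc = 200 + ctl(n)*1800;                % map the control data to a center frequency, 200-2000 Hz
    [b, a] = butter(2, [0.8*fc 1.2*fc]/(fs_audio/2), 'bandpass');
    seg = drone((n-1)*blk+1 : n*blk);
    out((n-1)*blk+1 : n*blk) = filter(b, a, seg);   % block-wise filtering; clicks at block edges ignored here
end
soundsc(out, fs_audio);                    % play the result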

[pic]

Figure 10. Plot of time vs. the moment about the Z (vertical) axis of the center of pressure across the 8 performances, with the time of the beginning of each note overlaid as a vertical line. These data were used primarily for sonification. Note how closely the data track the note onsets in the "no emotion" trial.

The ubiquity of contrary and similar motion in music theory motivated our sonification of the rho correlation analysis. Because the ear is attuned to hearing melodies and harmonies as changing frequencies, we chose to modulate frequency for this sonification. For pairs of high and low rho values, we used PureData to map each value to the frequency of a sine tone, so that each sonification consisted of two sine tones with modulating frequencies. The motion of one tone relative to the other conveys the interplay of the rho correlations and gives a general sense of how much correlation a given rho value signifies.
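Again the project used Pure Data; as an illustration only, the two-sine-tone mapping could be sketched in Matlab as follows (rho_a and rho_b are hypothetical vectors of paired correlation values, each held for half a second):

fs = 48000;                                 % audio sampling rate, Hz
dur = 0.5;                                  % seconds per rho value
t = (0:round(dur*fs)-1)' / fs;              % time axis for one segment
out = [];
for n = 1:length(rho_a)
    fa = 440 * 2^rho_a(n);                  % map rho in [-1, 1] to one octave around 440 Hz
    fb = 440 * 2^rho_b(n);
    seg = 0.5*sin(2*pi*fa*t) + 0.5*sin(2*pi*fb*t);
    out = [out; seg];                       % concatenate segments (kept simple for the sketch)
end
soundsc(out, fs);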

Audio examples of sonification will be on the Ortho 222 blog website.

Future Work

Due to technical delays that resulted from an overuse of reflective markers during motion capture, we were unable to analyze much of the motion-capture data. In future studies we would conduct a large-scale correlative test of all of the motion-capture data against tempo curves and across the different performances. Our FFT investigations 13 were confined to analysis of the force plate data and were limited by time constraints; in future studies we would apply the FFT to the motion-capture data as well. While the FFT revealed few remarkable trends in the force plate data, our analysis was confined to the data we had, so we cannot exclude the possibility that more interesting findings would have resulted from analysis of the motion-capture data.

Future analysis of the motion-capture data would also include the use of modeling software such as the upper-extremity package called UEtrack 1.2.3 (Motion Analysis Corporation, Santa Rosa, CA). This program serves to interpret the data collected from the three-dimensional motion-capture system to find the joint kinematics of the musician’s performance. The joint kinematics data would allow us to plot and correlate joint torques, joint reaction forces and joint motion over time to tempo curves. This information may also yield insight when matched with note and bar markers.

One rich avenue of further study would be to use motion-capture markers or EMG to monitor the use of the muscles of the face during performance. The subject's performance was highly emotive in an aural sense, and every emotional comment within the music manifested in the expression on the subject's face. Future work could examine the activation of different facial muscles during performance of the Chaconne under different emotional cues.

A better understanding of the movement of the body in musical performance will potentially lay a foundation for understanding why a live performance is so much more powerful and aesthetically pleasing than listening to a recording.

References

1. Bartolotta, W. S.: "Music 221." Old Dominion.

2. Berger, J.: Lecture, 3/08/05, Anatomy of Movement, Orthopedic Surgery 222, Stanford University.

3. Bilmes, J.: Timing is of the Essence: Perceptual and Computational Techniques for Representing, Learning, and Reproducing Timing in Percussive Rhythm. Media Lab, Massachusetts Institute of Technology, Cambridge, MA, 1993.

4. Clarke, E. F.: Rhythm and Timing in Music. In The Psychology of Music, edited by D. Deutsch, pp. 473-500. San Diego: Academic Press, 1999.

5. Georgia State University: "HyperPhysics."

6. Iyer, V.; Bilmes, J.; Wessel, D.; and Wright, M.: A Novel Representation for Rhythmic Structure. In Proceedings of the International Computer Music Conference, pp. 97-100. Thessaloniki, Hellas: International Computer Music Association, 1997.

7. Kapur, A.; Tzanetakis, G.; Virji-Babul, N.; Wang, G.; and Cook, P. R.: A Framework for Sonification of Vicon Motion Capture Data. In Proceedings of the 8th International Conference on Digital Audio Effects (DAFX-05), Madrid, Spain, 2005.

8. Palmer, C.: Time Course of Retrieval and Movement Preparation in Music Performance. Annals of the New York Academy of Sciences, 1060: 360-367, 2005.

9. Pierce, J.: The Science of Musical Sound. New York: W. H. Freeman and Company, 1983.

10. Repp, B.: A constraint on the expressive timing of a melodic gesture: Evidence from performance and aesthetic judgement. Music Perception, 10: 221-243, 1992.

11. Scheirer, E. D.: Extracting Expressive Performance Information from Recorded Music. Program in Media Arts and Sciences, School of Architecture and Planning, Massachusetts Institute of Technology, Cambridge, MA, 1995.

12. Schloss, W. A.: On the Automatic Transcription of Percussive Music: From Acoustic Signal to High-Level Analysis. Program in Hearing and Speech Sciences, Stanford University, Stanford, CA, 1985.

13. Smith, J. O., III: Mathematics of the Discrete Fourier Transform. W3K Publishing, 2003.

14. Todd, N. P. M.: The kinematics of musical expression. Journal of the Acoustical Society of America, 97(3): 1940-1949, 1995.

15. Ueno, K.; Furukawa, K.; and Bain, M.: Motor Skill as Dynamic Constraint Satisfaction. Linköping Electronic Articles in Computer and Information Science, Linköping University Electronic Press, Linköping, Sweden, 2000.

16. Widmer, G., and Goebl, W.: Computational Models of Expressive Music Performance: The State of the Art. Journal of New Music Research, 33(3): 203-216, 2004.
