Recording-based performance analysis: Feature extraction in Chopin mazurkas
Craig Sapp (Royal Holloway, Univ. of London)
Andrew Earis (Royal College of Music)
UK Musical Acoustics Network Conference
Royal College of Music / London Metropolitan University, September 2006

Extraction Process
1. Estimate note locations in the audio based on the musical score: tap beats while listening to the audio and use them as a click-track for the score.
2. Automatically adjust the estimates based on observation of the audio: search for actual event onsets in the neighborhood of each estimated time (see the sketch below).
3. Manually correct the automatic output: listen to the results and fix any errors in the extracted data.
4. Automatically extract the refined information: individual note onsets and loudnesses.
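The automatic adjustment in step 2 can be pictured with a minimal Matlab sketch, assuming estimated beat times and audio-detected onset times are already available as vectors of seconds; the function name, variables, and the 100 ms default search window are illustrative, not the actual system.

```matlab
% Sketch: snap score-based onset estimates to nearby detected audio onsets.
% estimated: vector of estimated onset times (seconds, from the tapped beats)
% detected:  vector of onset times found by audio analysis (seconds)
% window:    search neighborhood around each estimate (seconds)
function adjusted = adjust_onsets(estimated, detected, window)
  if nargin < 3, window = 0.1; end          % default: +/- 100 ms
  adjusted = estimated;
  for i = 1:length(estimated)
    [gap, idx] = min(abs(detected - estimated(i)));
    if gap <= window                        % accept only onsets close enough
      adjusted(i) = detected(idx);
    end                                     % otherwise keep the estimate
  end
end
```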

Input to Andrew’s System
1. Scan the score.
2. Convert it to symbolic data with SharpEye.
3. Tap to the beats in Sonic Visualiser.
4. Convert to the Humdrum data format.
5. Create an approximate performance score.
6. Simplify for processing in Matlab.

Input Data Example
[Figure: an example Humdrum performance score alongside the corresponding Matlab input data. The Humdrum score has three spines: **time (the tapped timings) plus two **kern spines (left-hand and right-hand notes), with staff, clef (bass and treble), and 3/4 meter interpretations. The Matlab input has one note per row, with columns for on-time, duration, MIDI key, metric level, beat number, and measure.]

Reverse Conducting
Orange = individual taps (from multiple sessions), which create bands of time about 100 ms wide.
Red = the average time of the individual taps for a particular beat.
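A small sketch of how the red averages relate to the orange tap bands, assuming the taps from the multiple sessions are stored as a Matlab matrix with one row per session and one column per beat (variable names are illustrative):

```matlab
% taps: nSessions x nBeats matrix of tap times in seconds
avgTaps   = mean(taps, 1);     % red: mean tap time for each beat
tapSpread = std(taps, 0, 1);   % spread of the orange band at each beat
```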

Refinement of tapped data
[Figure: tapped times vs. corrected tap times.]
The standard deviation of the tap accuracy is about … ms. The automatic adjustments are 3-5 times more accurate than tapping.

Performance Data
What to do with the data?
- Mostly examining tempo thus far
- Starting to work with dynamics
- Need to examine individual note onsets (LH/RH)
Currently extracting: note/chord onsets and note/chord loudnesses.
Currently ignoring: note offsets, which would be useful for articulations (staccato, legato) and pedaling.
Long-term goals: quantify and examine the performance layer of music; characterize pianists and schools of performance; automatic performance generation.

MIDI Performance Reconstructions
MIDI file imported as a note layer in Sonic Visualiser:
- superimposed on the spectrogram
- easy to distinguish pitch/harmonics
- legato; LH/RH time offsets
A "straight" performance vs. matching the performer's tempo beat-by-beat: straight tempo = average of the performance (pause at the beginning).
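A sketch of how the beat times of such a "straight" rendition might be derived, assuming beatTimes is a vector of the performed beat times in seconds; the constant-tempo version spans the same overall duration, so the opening pause is preserved simply by starting from the first performed beat:

```matlab
% Sketch: beat times for a "straight" rendition at the average tempo.
nBeats    = length(beatTimes);
avgPeriod = (beatTimes(end) - beatTimes(1)) / (nBeats - 1);
avgTempo  = 60 / avgPeriod;                              % in BPM
straightTimes = linspace(beatTimes(1), beatTimes(end), nBeats);
```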

Average tempo over time
Performances of the mazurkas are slowing down over time (for example Friedman 1930, Rubinstein 1961, Indjic 2001), at about 3 BPM per decade.
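The trend could be checked with a short Matlab sketch, assuming one average tempo per recording together with its year (the variables below are hypothetical placeholders, not measured data):

```matlab
% years:     vector of recording years, one per performance
% avgTempos: corresponding average tempos in BPM
p = polyfit(years, avgTempos, 1);   % least-squares line: tempo ~ p(1)*year + p(2)
bpmPerDecade = 10 * p(1);           % slope expressed per decade of recordings
```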

Tempo Graphs
[Figure: beat-by-beat tempo graphs for the Mazurka in F major, Op. 68, No. 3, comparing Rubinstein 1938, Rubinstein 1961, Smith 1975, Luisada 1991, Chiu 1999, and Indjic 2001; the "Poco più vivo" section is marked, along with the vivo average, non-vivo average, and overall average tempo.]

Timescapes
Examine the internal tempo structure of a performance by plotting average tempos over various time-spans in the piece.
Example of a piece with 6 beats at tempos A, B, C, D, E, and F: the diagram shows the plot of the individual tempos, the average tempo of adjacent neighbors, the 3-, 4-, and 5-neighbor averages, and the average tempo for the entire piece (plotted on the previous slide).
Where is the tempo faster or slower? (See the sketch below.)
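A minimal sketch of that computation, assuming tempos is a vector of beat-by-beat tempos; scape(len, start) holds the average tempo of the window of len consecutive beats beginning at beat start:

```matlab
% tempos: beat-by-beat tempos (e.g. the six values A..F in the example)
n     = length(tempos);
scape = nan(n, n);                  % rows: window length, columns: start beat
for len = 1:n
  for start = 1:(n - len + 1)
    scape(len, start) = mean(tempos(start : start + len - 1));
  end
end
% scape(1,:) is the beat-by-beat plot; scape(n,1) is the whole-piece average.
```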

Average tempo
[Figure: average-tempo timescape of the Mazurka in F major, Op. 68, No. 3, performed by Frederic Chiu (1999); the average tempo of the performance is normalized to green, with regions slower and faster than the average and the phrase structure visible.]

Average tempo over time

Tempo Correlation
Pearson correlation between two tempo sequences x and y:
r = Σ_i (x_i − x̄)(y_i − ȳ) / √( Σ_i (x_i − x̄)² · Σ_i (y_i − ȳ)² )
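In Matlab the same quantity can be obtained with corrcoef, assuming temposA and temposB are equal-length vectors of beat-by-beat tempos for the same piece (names illustrative):

```matlab
R = corrcoef(temposA, temposB);   % 2x2 correlation matrix
r = R(1, 2);                      % Pearson r: 1 = identical tempo shape
```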

Dynamics all at once:

For Further Information