jSymbolic
Cedar Wingate
MUMT 621, Professor Ichiro Fujinaga
22 October 2009

Types of Features
- Low level
- High level
- Cultural

What Are High-Level Features?
- Musical abstractions that are meaningful to musically trained individuals
- Examples include the instruments present, melodic contour, chord frequencies, and rhythmic density
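
Two of the named examples can be made concrete. The sketch below (illustrative only, not jSymbolic's actual code; the function names and the `(onset_seconds, midi_pitch)` event format are my assumptions) computes a melodic contour and a rhythmic density from a small list of note events.

```python
# Illustrative sketch of two high-level features. Input is a hypothetical
# list of (onset_seconds, midi_pitch) note events; not jSymbolic's own API.

def melodic_contour(notes):
    """Direction of each melodic step: +1 up, -1 down, 0 repeated pitch."""
    pitches = [p for _, p in notes]
    return [(b > a) - (b < a) for a, b in zip(pitches, pitches[1:])]

def rhythmic_density(notes):
    """Average number of note onsets per second over the note span."""
    onsets = [t for t, _ in notes]
    span = max(onsets) - min(onsets)
    return len(notes) / span if span > 0 else float(len(notes))

notes = [(0.0, 60), (0.5, 62), (1.0, 64), (1.5, 64), (2.0, 60)]
print(melodic_contour(notes))   # [1, 1, 0, -1]
print(rhythmic_density(notes))  # 2.5
```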

Why High-Level Features?
- Musicological and music-theoretical value
- A great deal of music is already encoded in MIDI or Humdrum's **kern format
- Optical music recognition can provide even more symbolic musical data

The jSymbolic Application
- An application for extracting high-level features from MIDI files
- Many high-level features cannot be extracted from audio recordings
- Open source
- Designed so that new features can be added easily with basic Java and MIDI skills

Building a Feature Set
- Goals
  - A single software system that can be applied to classical music, jazz, and a wide variety of popular and traditional musics
  - Usable without manual adjustments or adaptations for different types of music
- Issues to consider
  - The "curse of dimensionality"
  - Many systems of analysis exist, many of which rely on intuitive, subjective judgment
- jSymbolic's solution
  - A large catalogue of general features; the user can select which ones to include or exclude
  - Concentrate on features that can be represented by relatively simple statistics
  - Intermediate representations: histograms
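
The "large catalogue, user selects" idea can be sketched as a registry of feature functions keyed by name, with the user supplying an include list. This is a toy illustration under my own naming, not jSymbolic's actual mechanism (which uses Java classes per feature).

```python
# Hypothetical feature registry: each entry maps a feature name to a
# function over a list of MIDI pitch numbers. Contents are illustrative.
FEATURES = {
    "mean_pitch":  lambda ps: sum(ps) / len(ps),
    "pitch_range": lambda ps: max(ps) - min(ps),
    "note_count":  lambda ps: len(ps),
}

def extract(pitches, selected):
    """Run only the features the user opted into, skipping the rest."""
    return {name: FEATURES[name](pitches) for name in selected}

print(extract([60, 64, 67], ["pitch_range", "note_count"]))
# {'pitch_range': 7, 'note_count': 3}
```

Excluding a feature is then just a matter of leaving its name out of the selection, which is one simple way to keep the dimensionality of the final feature vector under the user's control.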

Feature Characteristics
- Features that can be represented by simple numbers or small vectors
- One-dimensional features: means, standard deviations, true/false values
- Multi-dimensional features: histograms
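
The two feature shapes can be illustrated side by side: single numbers (a mean and a standard deviation) versus a small vector (a 12-bin pitch-class histogram). A sketch, assuming a plain list of MIDI pitch numbers as input; the function names are mine, not jSymbolic's.

```python
from statistics import mean, pstdev

def one_dimensional_features(pitches):
    """One-dimensional features: each is a single number."""
    return {"mean_pitch": mean(pitches), "pitch_std": pstdev(pitches)}

def pitch_class_histogram(pitches):
    """Multi-dimensional feature: normalized 12-bin pitch-class histogram."""
    bins = [0.0] * 12
    for p in pitches:
        bins[p % 12] += 1       # fold octaves onto pitch classes
    total = sum(bins)
    return [b / total for b in bins]

pitches = [60, 62, 64, 65, 67, 60, 64]
print(one_dimensional_features(pitches))
print(pitch_class_histogram(pitches))
```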

Example: Beat Histogram (McKay and Fujinaga 2007)
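
One common way to build a beat histogram is to autocorrelate an onset grid over a range of candidate lags, so that peaks mark the periodicities (candidate beat intervals) in the rhythm. The sketch below shows that idea in miniature; it is a simplified stand-in, not jSymbolic's actual beat-histogram implementation.

```python
def beat_histogram(onset_grid, max_lag):
    """Autocorrelation of a binary onset grid for lags 1..max_lag.
    Peaks in the result indicate strong rhythmic periodicities."""
    n = len(onset_grid)
    hist = []
    for lag in range(1, max_lag + 1):
        acc = sum(onset_grid[i] * onset_grid[i + lag] for i in range(n - lag))
        hist.append(acc)
    return hist

# Onsets every 4 ticks over 32 ticks: the histogram peaks at lags 4 and 8.
grid = [1 if i % 4 == 0 else 0 for i in range(32)]
print(beat_histogram(grid, 10))  # [0, 0, 0, 7, 0, 0, 0, 6, 0, 0]
```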

The Features
- Drawn from musical research: music theory (Julie Cumming), ethnomusicology (Alan Lomax, Bruno Nettl), music cognition (Bret Aarden, David Huron), and popular musicology (Philip Tagg)
- 160 total features (111 implemented)
- Instrumentation
  - Pitched/unpitched
  - Note and time prevalence, and variability of note prevalence
  - Fraction
- Texture
  - Independent voices
  - Voice equality
  - Range of voices
- Rhythm
  - Strength
  - Looseness
  - Polyrhythms
  - Density
  - Tempo, meter

The Features (continued)
- Dynamics
  - Range
  - Variation
- Pitch statistics
  - Common pitches
  - Variety
  - Range
  - Glissando
  - Vibrato
- Melody
  - Intervals
  - Arpeggiation
  - Repetition
  - Chromaticism
  - Melodic arc
- Chords (not implemented)
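
A couple of the melody features above (interval distribution, note repetition) reduce to simple statistics over consecutive pitches. A sketch under my own naming, not jSymbolic's:

```python
from collections import Counter

def melodic_intervals(pitches):
    """Absolute semitone intervals between consecutive notes."""
    return [abs(b - a) for a, b in zip(pitches, pitches[1:])]

def repetition_fraction(pitches):
    """Fraction of melodic moves that repeat the previous pitch."""
    ivs = melodic_intervals(pitches)
    return ivs.count(0) / len(ivs) if ivs else 0.0

def most_common_interval(pitches):
    """Most frequent absolute melodic interval, in semitones."""
    return Counter(melodic_intervals(pitches)).most_common(1)[0][0]

line = [60, 62, 62, 64, 67, 64, 62, 60]
print(most_common_interval(line))  # 2 (major second is most common)
print(repetition_fraction(line))
```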

More Examples
Twenty sample features extracted from the first two measures of Fryderyk Chopin's Nocturne in B, Op. 32, No. 1 (McKay and Fujinaga 2007).

More Examples
Twenty sample features extracted from measures 10 and 11 of the first movement of Felix Mendelssohn's Piano Trio No. 2 in C minor, Op. 66 (McKay and Fujinaga 2007).

Application: Automatic Genre Classification
- "Automatic music classification and the importance of instrument identification" (McKay and Fujinaga 2005)
- Correctly classified MIDI recordings among 9 categories 90% of the time, and among 38 categories 57% of the time
- Root genre identified correctly 90% of the time for 9 categories, and 80% of the time for 38 categories
- Better than audio-based classification systems (below 80% among 5 categories)
- Found that features relating to instrumentation performed significantly better than other features in automatic genre classification
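
Once feature vectors have been extracted per recording, classification itself can be done by any standard learner. The toy sketch below uses a nearest-centroid rule over made-up two-dimensional feature vectors; it only illustrates the pipeline shape, not the actual (more sophisticated) classifiers McKay and Fujinaga used.

```python
import math

def centroid(vectors):
    """Per-dimension mean of a list of equal-length feature vectors."""
    return [sum(xs) / len(xs) for xs in zip(*vectors)]

def classify(features, centroids):
    """Assign the genre whose centroid is closest in Euclidean distance."""
    return min(centroids, key=lambda g: math.dist(features, centroids[g]))

# Hypothetical training data: genre -> list of feature vectors.
training = {
    "baroque": [[0.9, 0.1], [0.8, 0.2]],
    "jazz":    [[0.2, 0.9], [0.3, 0.8]],
}
centroids = {g: centroid(vs) for g, vs in training.items()}
print(classify([0.85, 0.15], centroids))  # baroque
print(classify([0.25, 0.85], centroids))  # jazz
```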

Application: Automatic Genre Classification (continued) (McKay and Fujinaga 2005)

Application: Automatic Genre Classification (continued) (McKay and Fujinaga 2005)

Bibliography

McKay, C. 2004a. Automatic genre classification of MIDI recordings. M.A. thesis, McGill University, Canada.

McKay, C. 2004b. Automatic genre classification as a study of the viability of high-level features for music classification. Proceedings of the International Computer Music Conference.

McKay, C., and I. Fujinaga. 2005. Automatic music classification and the importance of instrument identification. Proceedings of the Conference on Interdisciplinary Musicology.

McKay, C., and I. Fujinaga. 2006. jSymbolic: A feature extractor for MIDI files. Proceedings of the International Computer Music Conference.

McKay, C., and I. Fujinaga. 2007. Style-independent computer-assisted exploratory analysis of large music collections. Journal of Interdisciplinary Music Studies 1 (1).