THE 2006 MUSIC INFORMATION RETRIEVAL EVALUATION EXCHANGE (MIREX 2006) RESULTS OVERVIEW. The IMIRSEL Group, led by J. Stephen Downie, Graduate School of Library and Information Science, University of Illinois at Urbana-Champaign.

Similar presentations
Johns Hopkins University School of Education: Evaluation Overview.

LYRIC-BASED ARTIST NETWORK Derek Gossi CS 765 Fall 2014.
Review Mining for Music Digital Libraries: Phase II J. Stephen Downie, Xiao Hu The International Music Information Retrieval Systems Evaluation Lab (IMIRSEL)
Outline Introduction Music Information Retrieval Classification Process Steps Pitch Histograms Multiple Pitch Detection Algorithm Musical Genre Classification.
Beat tracking. McGill University :: Music Technology :: MUMT 611.
WITCHCRAFT: Towards Integration of MIR and Folk Song Research. Peter van Kranenburg, Jörg Garbers, Anja Volk, Frans Wiering, Louis P. Grijp, Remco C. Veltkamp.
Data Mining and Text Analytics in Music Audi Sugianto and Nicholas Tawonezvi.
Onset Detection in Audio Music. J.-S. Roger Jang (張智星), MIR Lab, CSIE Dept., National Taiwan University.
Evaluation of the Audio Beat Tracking System BeatRoot By Simon Dixon (JNMR 2007) Presentation by Yading Song Centre for Digital Music
Retrieval Evaluation: Precision and Recall. Introduction Evaluation of implementations in computer science often is in terms of time and space complexity.
1 CS 430 / INFO 430 Information Retrieval Lecture 24 Usability 2.
1 Content-based Music Information Retrieval Student: DENG Jie. Supervisor: Prof. LEUNG, Clement Department of Computer Science Hong Kong Baptist University.
1 My Beats Final Presentation Mike Smith, Karen Tipping, Dylan Barrett.
GCT731 Fall 2014 Topics in Music Technology - Music Information Retrieval Introduction to MIR Course Overview 1.
POTENTIAL RELATIONSHIP DISCOVERY IN TAG-AWARE MUSIC STYLE CLUSTERING AND ARTIST SOCIAL NETWORKS Music style analysis such as music classification and clustering.
The SEASR project and its Meandre infrastructure are sponsored by The Andrew W. Mellon Foundation SEASR Overview Loretta Auvil and Bernie Acs National.
Field Project Planning, Operations and Data Services Jim Moore, EOL Field Project Services (FPS) Mike Daniels, EOL Computing, Data and Software (CDS) Facility.
NM7613: Music Signal Analysis and Retrieval 音樂訊號分析與檢索 Jyh-Shing Roger Jang ( 張智星 ) CSIE Dept, National Taiwan University.
A Time Based Approach to Musical Pattern Discovery in Polyphonic Music Tamar Berman Graduate School of Library and Information Science University of Illinois.
Introduction to ISMIR/MIREX. J.-S. Roger Jang (張智星), Multimedia Information Retrieval (MIR) Lab, CSIE Dept, National Taiwan Univ.
Survey Of Music Information Needs, Uses, and Seeking Behaviors Jin Ha Lee J. Stephen Downie Graduate School of Library and Information Science University.
EDU, December 1. Methods: 8 Likert five-point questions were entered into SPSS 14.0. Analysis was conducted in two parts: Analysis.
Per Møldrup-Dalum State and University Library SCAPE Information Day State and University Library, Denmark, SCAPE Scalable Preservation Environments.
Managing Globally for Acting Locally Organizational Support for Liaison Librarians Tracy Gabridge, MIT Libraries.
Creating and Operating a Digital Library for Information and Learning– the GROW Project Muniram Budhu Department of Civil Engineering & Engineering Mechanics.
Testing and Improving Interoperability The Z39.50 Interoperability Testbed William E. Moen School of Library and Information Sciences Texas Center for.
Music Information Retrieval -or- how to search for (and maybe find) music and do away with incipits Michael Fingerhut Multimedia Library and Engineering.
Developing a Concept Extraction Technique with Ensemble Pathway Prat Tanapaisankit (NJIT), Min Song (NJIT), and Edward A. Fox (Virginia Tech) Abstract.
Autism Team Training in South Dakota Presented by Brittany Schmidt, MA-CCC/SLP Center for Disabilities Autism Spectrum Disorders Program
Aspects of Music Information Retrieval Will Meurer School of Information University of Texas.
SEASR Applications and Future Work University of Illinois at Urbana-Champaign.
1 A Collaborative Music Information Retrieval / Music Digital Library Research Model: A Briefing J. Stephen Downie, Ph.D. Graduate School of Library &
Fundamentals of Music Processing
Zhiyao Duan (1,2), Lie Lu (1), and Changshui Zhang (2). 1. Microsoft Research Asia (MSRA), Beijing, China. 2.
LSST: Preparing for the Data Avalanche through Partitioning, Parallelization, and Provenance Kirk Borne (Perot Systems Corporation / NASA GSFC and George.
Exploiting Recommended Usage Metadata: Exploratory Analyses Xiao Hu, J. Stephen Downie, Andreas Ehmann THE ANDREW W. MELLON FOUNDATION The International.
Demos for QBSH J.-S. Roger Jang ( 張智星 ) CSIE Dept, National Taiwan University.
Structure Discovery of Pop Music Using HHMM E6820 Project Jessie Hsu 03/09/05.
Greedy is not Enough: An Efficient Batch Mode Active Learning Algorithm Chen, Yi-wen( 陳憶文 ) Graduate Institute of Computer Science & Information Engineering.
Polyphonic Transcription. Bruno Angeles, McGill University - Schulich School of Music, MUMT-621, Fall.
THE 2007 MUSIC INFORMATION RETRIEVAL EVALUATION EXCHANGE (MIREX 2007) RESULTS OVERVIEW The IMIRSEL Group led by J. Stephen Downie Graduate School of Library.
Introduction to Onset Detection Functions HAO-HSUN LI 1/30.
Penn State Electronic Theses and Dissertations Project Graduate Council Library Computing Services Graduate School University Libraries Center for Academic.
Enabling Access to Sound Archives through Integration, Enrichment and Retrieval Annual Review Meeting - Introduction.
United Nations Economic Commission for Europe Statistical Division Data Initiatives: The UNECE Gender Database and Website Victoria Velkoff On behalf of.
FACTORS AFFECTING RESPONSE RATES FOR REAL-LIFE MIR QUERIES. Jin Ha Lee, M. Cameron Jones, J. Stephen Downie. Graduate School of Library and Information Science.
THE 2005 MUSIC INFORMATION RETRIEVAL EVALUATION EXCHANGE (MIREX 2005) RESULTS OVERVIEW. Graduate School of Library and Information Science. J. Stephen Downie, Kris.
Managing Media Services: Theory and Practice William D. Schmidt Donald Arthur Rieck Chapter 5 Managing Media Materials Services ETEC 579 Dr. Jason Lee.
QBSH Corpus The QBSH corpus provided by Roger Jang [1] consists of recordings of children’s songs from students taking the course “Audio Signal Processing.
Ask a Librarian: The Role of Librarians in the Music Information Retrieval Community Jenn Riley, Indiana University Constance A. Mayer, University of Maryland.
Intern: Sofien Lazreg MSc in Informatics: Information Management.
Audio Train/Test: Mood Classification - MIREX08 Dataset. SubID JJ1: Ming-Ju Wu, Jyh-Shing Roger Jang (accuracy 0.68); SubID PP1: Renato Panda, Rui Pedro Paiva (accuracy 0.68).
Evaluating Associative Browsing for Personal Information. Jinyoung Kim / W. Bruce Croft / David Smith.
ECE 477 Final Presentation Team 13  Spring 2012 Martin Pendergast, Stephen Edwards, Nick Kwolek, David Duelmler.
Nick Kwolek Martin Pendergast Stephen Edwards David Duemler.
Discussions on Audio Melody Extraction (AME) J.-S. Roger Jang ( 張智星 ) MIR Lab, CSIE Dept. National Taiwan University.
Research Progress Report – Cover Songs Identification. Ken.
School Library Management Sunil MV SDM Institute for Management Development
1 Tempo Induction and Beat Tracking for Audio Signals MUMT 611, February 2005 Assignment 3 Paul Kolesnik.
Introduction to ISMIR/MIREX
Onset Detection, Tempo Estimation, and Beat Tracking
Large-Scale Music Audio Analyses Using High Performance Computing Technologies: Creating New Tools, Posing New Questions J.
David Sears MUMT November 2009
Musical Similarity: More perspectives and compound techniques
MIR Lab: R&D Foci and Demos ( MIR實驗室:研發重點及展示)
MATCH A Music Alignment Tool Chest
Introduction to Music Information Retrieval (MIR)
Aspects of Music Information Retrieval
Mixed Up Multiplication Challenge
Jack G. Conrad, Thomson R&D
Presentation transcript:

THE 2006 MUSIC INFORMATION RETRIEVAL EVALUATION EXCHANGE (MIREX 2006) RESULTS OVERVIEW
The IMIRSEL Group, led by J. Stephen Downie
Graduate School of Library and Information Science, University of Illinois at Urbana-Champaign

Poster sections: MIREX 2006 Results; MIREX 2006 Challenges; Key Issues for Future MIREX

MIREX 2006 Results

Audio Beat Tracking (Avg. P-Score)
  S. Dixon: 0.407
  D. P. W. Ellis: 0.401
  A. Klapuri: 0.395
  Davies & Plumbley: 0.394
  P. Brossier: 0.391

Audio Melody Extraction, ADC 2004 Dataset (Overall Accuracy)
  K. Dressler: 82.50%
  Ryynänen & Klapuri: 77.30%
  Poliner & Ellis: 71.90%
  Sutton, Vincent, Plumbley & Bello: 58.20%
  P. Brossier: 49.60%

Audio Melody Extraction, MIREX 2005 Dataset (Overall Accuracy)
  K. Dressler: 73.20%
  Ryynänen & Klapuri: 67.90%
  Poliner & Ellis: 63.00%
  Sutton, Vincent, Plumbley & Bello: 53.70%
  P. Brossier: 31.90%

Audio Music Similarity and Retrieval (Avg. Fine Score)
  E. Pampalk: 0.430
  T. Pohle: 0.423
  V. Soares: 0.404
  Lidy & Rauber: 0.393
  K. West (trans): 0.372
  K. West (likely): 0.339

Query-by-Singing/Humming, Task 1 (MRR; a computation sketch follows the tables)
  Wu & Li (1): 0.926
  Wu & Li (2): 0.900
  Jang & Lee: 0.883
  López & Rocamora: 0.800
  Lemström, Mikkilä, Mäkinen & Ukkonen:
  C. Sailer (ear): 0.568
  Typke, Wiering & Veltkamp (2): 0.390
  C. Sailer (warp): 0.348
  A. L. Uitdenbogerd (2): 0.288
  C. Sailer (midi): 0.283
  Ferraro & Hanna: 0.218
  A. L. Uitdenbogerd (1): 0.205
  Typke, Wiering & Veltkamp (1): 0.196

Query-by-Singing/Humming, Task 2 (Mean Precision)
  Jang & Lee: 0.926
  Lemström, Mikkilä, Mäkinen & Ukkonen:
  C. Sailer (midi): 0.649
  C. Sailer (ear): 0.587
  Typke, Wiering & Veltkamp (1): 0.468
  C. Sailer (warp): 0.415
  Typke, Wiering & Veltkamp (2): 0.401
  Ferraro & Hanna: 0.309
  A. L. Uitdenbogerd (2): 0.238
  A. L. Uitdenbogerd (1): 0.163

Score Following (Total Precision)
  Cont & Schwarz: 82.90%
  M. Puckette: 29.75%

Symbolic Melodic Similarity, RISM Collection (ADR / Binary*)
  Typke, Wiering & Veltkamp
  Ferraro & Hanna
  Frieler & Müllensiefen (2)
  A. L. Uitdenbogerd
  Frieler & Müllensiefen (3)
  Lemström, Mikkilä, Mäkinen & Ukkonen (2)
  Lemström, Mikkilä, Mäkinen & Ukkonen (1)
  Frieler & Müllensiefen (1)

Symbolic Melodic Similarity, Karaoke Collection (ADR / Binary*)
  Typke, Wiering & Veltkamp
  A. L. Uitdenbogerd
  Ferraro & Hanna
  Lemström, Mikkilä, Mäkinen & Ukkonen (2)
  Lemström, Mikkilä, Mäkinen & Ukkonen (1)

Symbolic Melodic Similarity, Mixed Polyphonic Collection (ADR / Binary*)
  Typke, Wiering & Veltkamp
  A. L. Uitdenbogerd
  Ferraro & Hanna
  Lemström, Mikkilä, Mäkinen & Ukkonen (1)
  Lemström, Mikkilä, Mäkinen & Ukkonen (2)

Binary*: Binary Relevance Judgment (Not Similar = 0, Somewhat Similar = 1, Very Similar = 1)

Audio Onset Detection (Avg. F-Measure)
  A. Röbel (3): 0.788
  A. Röbel (2): 0.780
  A. Röbel (1): 0.777
  Du, Li & Liu: 0.762
  P. Brossier (hfc): 0.734
  S. Dixon (sf): 0.726
  P. Brossier (dual): 0.724
  P. Brossier (complex): 0.721
  S. Dixon (rcd): 0.716
  S. Dixon (cd): 0.710
  P. Brossier (specdiff): 0.707
  S. Dixon (wpd): 0.685
  S. Dixon (nwpd): 0.620

Audio Tempo Extraction (P-Score)
  A. Klapuri: 0.806
  Davies & Plumbley: 0.776
  Alonso, David & Richard (2): 0.724
  Alonso, David & Richard (1): 0.693
  D. P. W. Ellis: 0.673
  Antonopoulos, Pikrakis & Theodoridis:
  P. Brossier: 0.628

Audio Cover Song Identification (Avg. Performance)
  D. P. W. Ellis: 2.306
  K. Lee (1): 1.106
  K. Lee (2): 0.951
  Sailer & Dressler: 0.639
  Lidy & Rauber: 0.451
  K. West (likely): 0.354
  T. Pohle: 0.351
  K. West (trans): 0.309
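Query-by-Singing/Humming Task 1 above is scored by mean reciprocal rank (MRR): for each sung or hummed query, take the reciprocal of the rank at which the correct target song first appears in the system's returned list, then average over all queries. Below is a minimal sketch of that computation; the `results` and `ground_truth` dictionaries are an assumed, illustrative input format, not the actual MIREX submission or evaluation format.

```python
# Minimal MRR sketch for a QBSH-style evaluation.
# Assumption (illustrative only): `results` maps each query ID to a ranked list of
# candidate song IDs, and `ground_truth` maps each query ID to the correct song ID.

def mean_reciprocal_rank(results, ground_truth):
    reciprocal_ranks = []
    for query_id, ranked_candidates in results.items():
        target = ground_truth[query_id]
        try:
            rank = ranked_candidates.index(target) + 1  # ranks are 1-based
            reciprocal_ranks.append(1.0 / rank)
        except ValueError:
            reciprocal_ranks.append(0.0)  # target never returned for this query
    return sum(reciprocal_ranks) / len(reciprocal_ranks)

# Toy usage: correct targets found at ranks 1 and 2 -> MRR = (1.0 + 0.5) / 2 = 0.75
results = {"q1": ["songA", "songB"], "q2": ["songC", "songA"]}
ground_truth = {"q1": "songA", "q2": "songA"}
print(mean_reciprocal_rank(results, ground_truth))  # 0.75
```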
Special Thanks to: The Andrew W. Mellon Foundation; the National Science Foundation (Grant No. NSF IIS ); the content providers and the MIR community; the Automated Learning Group (ALG) at the National Center for Supercomputing Applications at UIUC; Paul Lamere of Sun Labs; all of the IMIRSEL group (Mert Bay, Andreas Ehmann, Joe Futrelle, Anatoliy Gruzd, Xiao Hu, M. Cameron Jones, Jin Ha (Gina) Lee, Martin McCrory, David Tcheng, Qin Wei, Kris West, and Byounggil Yoo); all the volunteers who evaluated the MIREX results; the GSLIS technology services team; and the ISMIR 2006 organizing committee.

MIREX 2006 Challenges
• Data quality issues with regard to the selection of test sets and the identification of problematic formats and non-representative content
• Basic content-management issues with regard to multiple formats and ever-growing result-set sizes
• Difficulties performing sanity-check verifications of result sets, as some individual returns now approach 0.5 gigabytes
• Scaling issues between the “at home” testing of submissions and the formal “in lab” running of the task
• Balancing the need for depth and breadth in evaluation data against the burden placed on the voluntary human evaluators

Key Issues for Future MIREX
• Continue to explore statistical significance testing procedures beyond Friedman’s ANOVA test (a sketch of that baseline test follows this list)
• Continue refinement of the Evalutron 6000 technology and procedures
• Continue to establish a more formal organizational structure for future MIREX contests
• Continue to develop the evaluator software and establish an open-source evaluation API
• Make useful evaluation data publicly available year round
• Establish a webservices-based IMIRSEL/M2K online system prototype
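For reference on the first key issue, the Friedman test compares several systems evaluated on the same set of queries or test clips, treating each query as a block and each system as a treatment. The sketch below uses SciPy only as a stand-in for whatever tooling MIREX actually uses, and the score matrix is fabricated for illustration; it is not MIREX data.

```python
# Hedged sketch: Friedman test across systems, with queries as repeated measures.
# Each list holds one (hypothetical) system's per-query scores on the same queries.
from scipy.stats import friedmanchisquare

system_a = [0.41, 0.38, 0.45, 0.40, 0.39]
system_b = [0.40, 0.37, 0.44, 0.41, 0.38]
system_c = [0.35, 0.33, 0.40, 0.36, 0.34]

statistic, p_value = friedmanchisquare(system_a, system_b, system_c)
print(f"Friedman chi-square = {statistic:.3f}, p = {p_value:.3f}")
# A small p-value suggests at least one system's scores differ systematically;
# a post-hoc procedure (e.g., pairwise tests with correction) would identify which.
```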