Presentation transcript:

Alex Stabile

Research Questions
- Could a computer learn to distinguish between different composers?
- Why does music by different composers even sound different?

Possible Answers
- Backer et al., "On Musical Stylometry—a Pattern Recognition Approach"
- Analyzed low-level musical characteristics: note entropy, intervals, rhythms
- Used this information as input for a statistical model
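
As an illustration of the kind of low-level feature such an approach relies on, the sketch below computes the entropy of a piece's pitch-class distribution. The exact feature definition and all names here are illustrative, not taken from Backer et al.

```python
# One possible low-level feature of the kind described above: the Shannon
# entropy of a piece's pitch-class distribution. Illustrative only.
import math
from collections import Counter

def pitch_class_entropy(midi_notes):
    """Shannon entropy (in bits) of the pitch-class distribution."""
    counts = Counter(note % 12 for note in midi_notes)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Example: a C major scale has 7 equally likely pitch classes -> log2(7) bits.
print(pitch_class_entropy([60, 62, 64, 65, 67, 69, 71]))  # ~2.807
```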

Project Design
- Chords/harmonies all have their own character, so:
  - Analyze the harmonies found in the music
  - Use machine learning techniques to find a relationship between types of harmonies and musical style
- Used Python and analyzed MIDI files (see the parsing sketch below)
- Compared works by Mozart to works by Rachmaninoff
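
The slides say only that Python was used on MIDI files. The sketch below shows one way that step could look, assuming the third-party mido library and a placeholder filename; both are my assumptions, not details from the presentation.

```python
# Sketch of reading note events from a MIDI file, assuming the third-party
# "mido" library. The library choice and the filename are placeholders.
import mido

def read_notes(path):
    """Return a list of (beat_position, midi_note) pairs for note-on events."""
    mid = mido.MidiFile(path)
    notes = []
    for track in mid.tracks:
        ticks = 0
        for msg in track:
            ticks += msg.time  # delta time in ticks since the previous message
            if msg.type == 'note_on' and msg.velocity > 0:
                notes.append((ticks / mid.ticks_per_beat, msg.note))
    return notes

notes = read_notes('mozart_sonata.mid')  # placeholder filename
print(notes[:5])
```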

Example File

Organization/Parsing the File
- A Beat class stores the notes that fall on the beat and the notes that fall off the beat, together with the beat number (e.g., beat number = 8), as in the sketch below
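
A hypothetical reconstruction of the Beat class the slide describes; the field and function names are my own, not taken from the project's code.

```python
# Hypothetical reconstruction of the Beat class described on the slide: each
# beat stores the notes that fall exactly on it and those that fall off it.
class Beat:
    def __init__(self, number):
        self.number = number        # e.g., beat number = 8
        self.notes_on_beat = []     # MIDI notes sounding exactly on the beat
        self.notes_off_beat = []    # MIDI notes sounding between beats

def group_by_beat(notes):
    """Group (beat_position, midi_note) pairs into Beat objects."""
    beats = {}
    for position, note in notes:
        number = int(position)
        beat = beats.setdefault(number, Beat(number))
        if position == number:      # lands exactly on the beat
            beat.notes_on_beat.append(note)
        else:
            beat.notes_off_beat.append(note)
    return [beats[n] for n in sorted(beats)]

# Example: two notes on beat 8 and one halfway through it
for b in group_by_beat([(8.0, 60), (8.0, 64), (8.5, 62)]):
    print(b.number, b.notes_on_beat, b.notes_off_beat)
```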

Chord Identification
- Notes: C, E, G. What kind of chord? Look at the intervals above each candidate root:
  - E: m3, m6 - no match
  - G: P4, M6 - no match
  - C: M3, P5 - these intervals form a C major chord, root position
- A sketch of this interval-matching idea follows
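
A minimal sketch of the interval-matching idea above: try each note as the root and check whether the intervals above it match a known pattern. Only major and minor triads are shown; the chord vocabulary and names are illustrative, and a full implementation would cover many more chord types.

```python
# Try each note as the root and match the intervals (in semitones) above it
# against known triad patterns. Illustrative: only two chord types shown.
PATTERNS = {
    frozenset([4, 7]): 'major triad',   # M3 + P5 above the root
    frozenset([3, 7]): 'minor triad',   # m3 + P5 above the root
}

NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

def identify_chord(midi_notes):
    pitch_classes = sorted({n % 12 for n in midi_notes})
    bass = min(midi_notes) % 12         # lowest sounding note
    for root in pitch_classes:
        intervals = frozenset((pc - root) % 12 for pc in pitch_classes if pc != root)
        if intervals in PATTERNS:
            position = 'root position' if root == bass else 'inverted'
            return f'{NAMES[root]} {PATTERNS[intervals]}, {position}'
    return 'no match'

print(identify_chord([60, 64, 67]))  # C, E, G -> "C major triad, root position"
```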

Analyzing Data: A Machine Learning Approach
- Neural networks: each node has a value, and each connection has an associated weight
- The top layer receives the input
- Values are propagated through the network, creating values for the other nodes (see the sketch below)
[Diagram: a simple neural network]
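
A minimal sketch of the forward propagation described above: a small fully connected network with sigmoid units. The layer sizes and random weights are arbitrary illustrations, not the network used in the project.

```python
# Forward propagation through a small fully connected sigmoid network.
# Layer sizes and weights are arbitrary examples.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))   # weights: 3 inputs -> 4 hidden nodes
W2 = rng.normal(size=(4, 1))   # weights: 4 hidden nodes -> 1 output node

x = np.array([0.2, 0.5, 0.3])          # the top layer receives the input
hidden = sigmoid(x @ W1)               # values propagate to the hidden layer
output = sigmoid(hidden @ W2)          # ...and on to the output layer
print(output)
```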

Learning Algorithm
- The network is given a set of training data whose outputs are known
- Inputs are "fed" through the network
- The calculated output is compared with the desired output to obtain an error (see the snippet below)
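
A tiny snippet of the comparison step described above; the numbers are made up.

```python
# Compare the network's calculated output with the desired (known) output.
desired_output = 1.0        # known label for this training example
calculated_output = 0.73    # what the network currently produces

error = desired_output - calculated_output
squared_error = 0.5 * error ** 2   # a common error measure for training
print(error, squared_error)        # 0.27 0.03645
```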

Learning Algorithm
- Back-propagation: the error is propagated backward through the network, and a corresponding error is calculated for each node
- The weights are adjusted based on these errors so that a more desirable output will be obtained (see the sketch below)
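
A minimal sketch of one back-propagation step for a single-hidden-layer sigmoid network with squared-error loss. The layer sizes, weights, and learning rate are illustrative, not the settings used in the project.

```python
# One back-propagation step for a single-hidden-layer sigmoid network.
# Sizes, weights, and learning rate are illustrative.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))           # input -> hidden weights
W2 = rng.normal(size=(4, 1))           # hidden -> output weights
learning_rate = 0.5

x = np.array([0.2, 0.5, 0.3])          # one training input
target = np.array([1.0])               # its desired output

# Forward pass
hidden = sigmoid(x @ W1)
output = sigmoid(hidden @ W2)

# Backward pass: propagate the error and compute a delta for each node
error = target - output
delta_output = error * output * (1 - output)               # sigmoid derivative
delta_hidden = (W2 @ delta_output) * hidden * (1 - hidden)

# Adjust the weights so the output moves toward the target
W2 += learning_rate * np.outer(hidden, delta_output)
W1 += learning_rate * np.outer(x, delta_hidden)

print('error before update:', error.item())
```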

Learning Algorithm: This Project
- Inputs to the network are the frequencies of different kinds of chords
- Two composers analyzed: Mozart and Rachmaninoff
- Expected output for Mozart: 0; expected output for Rachmaninoff: 1
- A sketch of how such training examples could be assembled follows
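
A sketch of how the training examples described above might be assembled: the input vector is the normalized frequency of each chord type in a piece, and the target is 0 for Mozart or 1 for Rachmaninoff. The chord categories and counts below are made-up placeholders, not results from the project.

```python
# Build (chord-frequency vector, target) training pairs. The chord categories
# and per-piece counts are made-up placeholders.
CHORD_TYPES = ['major', 'minor', 'diminished', 'augmented',
               'dominant seventh', 'other']

def chord_frequencies(chord_labels):
    """Turn a list of chord labels from one piece into a frequency vector."""
    total = len(chord_labels)
    return [chord_labels.count(t) / total for t in CHORD_TYPES]

mozart_piece = ['major'] * 40 + ['minor'] * 10 + ['dominant seventh'] * 8 + ['other'] * 2
rach_piece = ['major'] * 15 + ['minor'] * 20 + ['diminished'] * 6 + ['other'] * 19

training_data = [
    (chord_frequencies(mozart_piece), 0.0),   # expected output for Mozart: 0
    (chord_frequencies(rach_piece), 1.0),     # expected output for Rachmaninoff: 1
]
for inputs, target in training_data:
    print([round(v, 2) for v in inputs], target)
```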

Results
[Result plots after 4,000, 10,000, 14,000, and 20,000 training iterations]

Interpretation of Results
- There is a relationship between harmonic content and a composer's style
- Humans may learn to analyze this subconsciously, but a computer can be trained to do so as well

Future Research
- Analyze more musical factors
- Analyze more composers
- Analyze composers who are more similar (e.g., Mozart and Haydn)