MACHINE LEARNING TECHNIQUES IN IMAGE PROCESSING


CSCI 8810 Course Project
By Kaan Tariman, M.S. in Computer Science

Outline
- Introduction to Machine Learning
- The example application
- Machine Learning Methods
  - Decision Trees
  - Artificial Neural Networks
  - Instance-Based Learning

What is Machine Learning?
- Machine Learning (ML) is the construction of computer programs that develop solutions and improve with experience.
- It solves problems that cannot be solved by enumerative methods or calculus-based techniques.
- The intuition is to model the human way of solving problems that require experience.
- When the relationships between all system variables are completely understood, ML is not needed.

A Generic System
(Diagram: a system box with input variables, hidden variables, and output variables.)

Learning Task
- Face recognition problem: whose face is in the picture?
- It is hard to build an explicit model describing a face and its components.
- Humans recognize faces through experience: the more we see, the faster we perceive.

The example
- A vision module for Sony Aibo robots that we developed for the Legged Robot Tournament at RoboCup 2002.
- The output of the module is the distance and orientation of the target objects:
  - the ball
  - the players
  - the goals
  - the beacons (used for localization of the robot)

Aibo’s View

Main ML Methods
- Decision Trees
- Artificial Neural Networks (ANN)
- Instance-Based Learning
- Bayesian Methods
- Reinforcement Learning
- Inductive Logic Programming (ILP)
- Genetic Algorithms (GA)
- Support Vector Machines (SVM)

Decision Trees
- Approximate discrete-valued functions with a tree structure.
- Internal nodes test attributes; leaves hold the values of the discrete function.
- Example: a decision tree for "play tennis".

Algorithm to derive a tree
Until each leaf node is populated by as homogeneous a sample set as possible:
- Select a leaf node with an inhomogeneous sample set.
- Replace that leaf node by a test node that divides the inhomogeneous sample set into minimally inhomogeneous subsets, according to an entropy calculation.
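As a rough illustration of the entropy calculation behind this splitting step, here is a minimal Python sketch. The sample format (a list of dictionaries with a "label" key), the function names, and the toy "play tennis" attributes are assumptions for illustration, not part of the original project.

```python
import math
from collections import Counter


def entropy(labels):
    """Shannon entropy of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())


def information_gain(samples, attribute, label_key="label"):
    """Drop in entropy obtained by splitting the samples on one attribute."""
    parent = entropy([s[label_key] for s in samples])
    groups = {}
    for s in samples:
        groups.setdefault(s[attribute], []).append(s[label_key])
    children = sum(len(g) / len(samples) * entropy(g) for g in groups.values())
    return parent - children


# Pick the attribute whose test yields the least inhomogeneous subsets.
samples = [{"outlook": "sunny", "wind": "weak", "label": "no"},
           {"outlook": "rain", "wind": "weak", "label": "yes"},
           {"outlook": "sunny", "wind": "strong", "label": "no"},
           {"outlook": "rain", "wind": "strong", "label": "no"}]
best = max(["outlook", "wind"], key=lambda a: information_gain(samples, a))
print(best)
```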

Color Classification
- The data set consists of pixel values manually labeled with different colors.
- The tree assigns a pixel to a color class according to its Y, U, V values.
- The classifier is adaptable to different conditions.

How do we construct the data set? 1) Open an image taken by the robot

How do we construct the data set? 2) Label the pixels with colors. A [Y,U,V,color] entry is created for each labeled pixel.
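A hypothetical sketch of how such [Y, U, V, color] entries could be collected from a labeled image; the array layout and the names yuv_image and label_mask are assumptions, since the slides do not show the actual labeling tool.

```python
import numpy as np


def build_dataset(yuv_image, label_mask):
    """Collect one [Y, U, V, color] row per manually labeled pixel."""
    rows = []
    height, width, _ = yuv_image.shape
    for r in range(height):
        for c in range(width):
            color = label_mask[r][c]
            if color is not None:                  # keep only labeled pixels
                Y, U, V = yuv_image[r, c]
                rows.append([int(Y), int(U), int(V), color])
    return rows


# Tiny stand-in image: 2 x 2 pixels, one of them left unlabeled.
image = np.array([[[200, 100, 90], [60, 120, 130]],
                  [[210, 95, 85], [55, 125, 128]]])
labels = [["orange", "green"], ["orange", None]]
print(build_dataset(image, labels))
```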

How do we construct the data set? 3) Use the ML method and display results

The decision tree output
- The data set is divided into a training set and a validation set.
- After training, the tree is evaluated on the validation set.
- Training should be done carefully to avoid bias.
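A minimal sketch of this train/validate step, assuming scikit-learn is available; the tiny inline data set stands in for the [Y, U, V, color] rows described above, and the depth limit is an illustrative choice rather than the value used in the project.

```python
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Stand-in [Y, U, V, color] rows; in practice these come from the labeling step.
rows = [[200, 100, 90, "orange"], [60, 120, 130, "green"],
        [210, 95, 85, "orange"], [55, 125, 128, "green"],
        [205, 105, 92, "orange"], [58, 118, 132, "green"],
        [220, 110, 95, "orange"], [52, 122, 129, "green"]]

X = [row[:3] for row in rows]                    # Y, U, V features
y = [row[3] for row in rows]                     # color labels

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25,
                                                  random_state=0)
tree = DecisionTreeClassifier(max_depth=8, random_state=0)
tree.fit(X_train, y_train)
print("validation accuracy:", tree.score(X_val, y_val))
```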

Artificial Neural Networks (ANN)
- Made up of interconnected processing elements that respond in parallel to the input signals given to each of them.

ANN Algorithm
- The training algorithm adjusts the weights to reduce the error between the known output values and the network's actual outputs.
- At first, the outputs are arbitrary; as cases are presented repeatedly, the ANN gives more correct answers.
- A test set is used to decide when to stop training.
- The ANN is then validated with unseen data (the validation set).
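As a hedged sketch of this loop, scikit-learn's MLPClassifier can stand in for the network: with early_stopping it holds out part of the training data and stops once that held-out score stops improving, mirroring the "test set stops training" idea. The synthetic pixel data and the hidden-layer size below are made up for illustration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.integers(0, 256, size=(600, 3)).astype(float)   # fake Y, U, V values
y = np.where(X[:, 0] > 128, "orange", "green")           # fake color labels

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)
net = MLPClassifier(hidden_layer_sizes=(16,),
                    early_stopping=True,      # internal test set stops training
                    max_iter=2000, random_state=0)
net.fit(X_train, y_train)                     # weights adjusted to reduce error
print("validation accuracy:", net.score(X_val, y_val))
```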

ANN output for our example

Face Recognition with ANN
- Problem: determine the orientation of a face.
- Input nodes are the pixel values of a 32 x 32 image.
- The output layer has 4 nodes (right, left, up, straight).
- 6 hidden nodes.
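The network shape on this slide (32 x 32 = 1024 inputs, 6 hidden nodes, 4 orientation outputs) could look roughly like the sketch below; the random images and labels are placeholders, since the original face data is not part of the slides.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
images = rng.random((200, 32 * 32))            # 200 flattened 32 x 32 images
orientations = rng.choice(["left", "right", "up", "straight"], size=200)

face_net = MLPClassifier(hidden_layer_sizes=(6,),   # 6 hidden nodes
                         max_iter=300, random_state=0)
face_net.fit(images, orientations)
print(face_net.predict(images[:3]))            # one of the 4 orientation classes
```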

Face Recognition with ANN
- Hidden nodes normally do not reveal anything interpretable; in this case we can observe some meaningful behavior.

Instance-Based Learning
- A learn-by-memorizing method: k-Nearest Neighbor (k-NN).
- Given a data set {Xi, Yi}, it estimates values of Y for X's other than those in the sample.
- The process is to choose the k values of Xi nearest to the query X and average their Y values; k is a parameter of the estimator.
- The average can be weighted, e.g. with the closest neighbor having the most impact on the estimate.
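A minimal NumPy sketch of this estimator, assuming Euclidean distance and inverse-distance weights; the function name and the toy data are illustrative only.

```python
import numpy as np


def knn_estimate(X, Y, x, k=3, eps=1e-9):
    """Distance-weighted average of the Y values of the k nearest Xi."""
    dists = np.linalg.norm(X - x, axis=1)      # distance from x to every Xi
    nearest = np.argsort(dists)[:k]            # indices of the k closest samples
    weights = 1.0 / (dists[nearest] + eps)     # closer neighbors weigh more
    return np.average(Y[nearest], weights=weights)


X = np.array([[1.0], [2.0], [3.0], [10.0]])
Y = np.array([1.0, 2.0, 3.0, 10.0])
print(knn_estimate(X, Y, np.array([2.5]), k=2))   # close to 2.5
```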

KNN facts
- A database of known instances is required, which adds memory complexity.
- It is "lazy learning": no explicit model of the hypothesis is built.
- Example: color classification, where a voting method over the neighbors outputs a color class for the pixel.
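For the voting variant used in color classification, scikit-learn's KNeighborsClassifier is one possible stand-in; the synthetic pixel data and the choice of k = 5 are assumptions for illustration.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
pixels = rng.integers(0, 256, size=(500, 3)).astype(float)   # fake Y, U, V
colors = np.where(pixels[:, 0] > 128, "orange", "green")      # fake labels

knn = KNeighborsClassifier(n_neighbors=5)        # the 5 nearest neighbors vote
knn.fit(pixels, colors)
print(knn.predict([[180.0, 110.0, 100.0]]))      # majority color among neighbors
```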

Summary
- Machine Learning is an interdisciplinary field involving programs that improve with experience.
- ML is well suited to image processing problems such as pattern recognition, object extraction, and color classification.
- Three methods were considered:
  - Decision Trees
  - Artificial Neural Networks
  - Instance-Based Learning

Thank you!