Modular Neural Networks for Pattern Classification Using LabVIEW®


CS 539 Neural Networks and Fuzzy Systems
Rey Hernandez, CS539 Final Project
9/16/2018, University of Wisconsin - Madison

Outline:
- Goals
- Description
- Competing Models
- LabVIEW®
- Implementation Differences From Standard Approaches
- Current Status

Goals
- Goal: to develop a pattern classifier in LabVIEW®
- Pattern classifiers are normally developed in C++ or MATLAB
- Equations used in the modular neural network (presented on the slide)

Description
- There is strong interest in developing robust pattern classifiers that can handle an assortment of data
- Modular neural networks provide that robustness by specializing in different regions of the data without sacrificing generalization capability

Competing Models
- Multilayer perceptrons (MLPs)
  - The preferred choice for pattern recognition
  - MLPs can learn from data that is not linearly separable
- Weaknesses of MNNs
  - In LabVIEW®, a weakness of the MNN approach is its large memory usage
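The point that MLPs can learn data that is not linearly separable can be illustrated with a small sketch. This is my own minimal Python/NumPy example, not code from the project: a tiny MLP trained by backpropagation learns XOR, the textbook non-linearly-separable data set that a single perceptron provably cannot classify.

```python
# Minimal illustration (not from the project): a 2-8-1 sigmoid MLP
# trained with full-batch backpropagation learns the XOR function.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # input -> hidden layer
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden -> output layer
lr = 0.5

for _ in range(20000):
    h = sigmoid(X @ W1 + b1)          # hidden activations
    out = sigmoid(h @ W2 + b2)        # network output
    # Backpropagation: chain rule through the sigmoid at each layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

preds = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(preds.ravel())
```

After training, the thresholded outputs should match the XOR targets, which no single linear unit can reproduce.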

Competing Models (cont.)
- Strengths of modular neural networks (MNNs)
  - MNNs can also learn from data that is not linearly separable
  - MNNs can outperform MLPs because each module can be devoted to a separate piece of the data
  - MLPs may have difficulty with data sets that are discontinuous
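The "one module per piece of the data" idea can be sketched as follows. The gate, the module boundaries, and the data set here are my own illustration, not the project's design: a hard gate routes each input to one of two single-layer perceptron modules, and each module only ever sees its own region of an XOR-style discontinuous data set, so each region becomes linearly separable for the module that owns it.

```python
# Hedged sketch of the modular idea (gate and modules are illustrative):
# labels follow an XOR-like rule that no single linear unit can learn,
# but within each gated region the problem is linearly separable.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(400, 2))
y = ((X[:, 0] > 0.5) ^ (X[:, 1] > 0.5)).astype(int)  # discontinuous labels

def gate(x):
    """Hard gate: which module handles this input."""
    return 0 if x[0] <= 0.5 else 1

# One perceptron (weights + bias) per module, trained only on routed data
modules = [np.zeros(3), np.zeros(3)]
lr = 0.1
for _ in range(50):
    for xi, t in zip(X, y):
        m = gate(xi)
        z = np.append(xi, 1.0)                   # augment with bias input
        pred = 1 if modules[m] @ z > 0 else 0
        modules[m] += lr * (t - pred) * z        # perceptron update rule

preds = np.array([1 if modules[gate(xi)] @ np.append(xi, 1.0) > 0 else 0
                  for xi in X])
print((preds == y).mean())                       # training accuracy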

LabVIEW®
- A popular data acquisition and analysis tool
- Uses a completely modular approach to programming
- Rather than written code, LabVIEW® uses icons and 'wires' to control the flow of data and computation
- LabVIEW® makes very efficient use of arrays, which is an asset when calculating the network weights
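The array-centric style the slide credits to LabVIEW® corresponds to vectorized weight math in textual languages. As an illustrative sketch (not LabVIEW code), one matrix product computes all weighted sums for a batch of samples, replacing an explicit loop over samples and neurons:

```python
# Illustration of array-oriented weight calculation: a single matrix
# product replaces a double loop over (sample, neuron) weighted sums.
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 8))     # 100 samples, 8 inputs each
W = rng.normal(size=(8, 3))       # weights feeding 3 neurons

# Looped version: one weighted sum per (sample, neuron) pair
looped = np.zeros((100, 3))
for i in range(100):
    for j in range(3):
        looped[i, j] = sum(X[i, k] * W[k, j] for k in range(8))

vectorized = X @ W                # one array operation does the same work
print(np.allclose(looped, vectorized))  # True
```

The two computations agree; the array form is both shorter and far faster, which is the efficiency the slide is pointing at.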

LabVIEW® (cont.)
- Modularity is inherent to both LabVIEW® and modular neural networks
- MNNs benefit from the ease of designing separate modules for simple tasks
- In an MNN, some modules are single-layer perceptrons
- In LabVIEW®, a module can be made to behave like a perceptron network
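For reference, the single-layer perceptron module the slide mentions can be sketched in a few lines. The training rule below is the classic perceptron learning rule; the AND data set is my own choice of a linearly separable example, not taken from the project:

```python
# Minimal single-layer perceptron module: the classic perceptron
# learning rule trained on the linearly separable AND function.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])           # logical AND targets

w = np.zeros(2)
b = 0.0
lr = 0.1

for _ in range(20):                  # a few epochs suffice for AND
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        w += lr * (target - pred) * xi   # perceptron update rule
        b += lr * (target - pred)

preds = [1 if xi @ w + b > 0 else 0 for xi in X]
print(preds)  # [0, 0, 0, 1]
```

Each such module handles one simple, linearly separable sub-task, which is exactly the granularity at which a LabVIEW® sub-VI would encapsulate it.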

Implementation Differences From Standard Approaches
- Again, the implementation is not written code but graphical in nature
- Equations cannot be handled in the usual way
- Just as MNNs are often described pictorially, LabVIEW® provides a pictorial view of the MNN

Current Status
- The program is in its initial stages
- The working version classifies well
- More can be added toward building a larger hierarchical structure
- Features to add and goals to accomplish:
  - The program can classify new data but cannot yet save it
  - No statistical information about the network is reported yet
  - Support for a larger hierarchy would be desirable