Neural Networks: Advantages and Criticism


Neural Networks: Advantages and Criticism

Advantages
- Prediction accuracy is generally high.
- Robust: works even when training examples contain errors.
- Output may be discrete, real-valued, or a vector of several discrete or real-valued attributes.
- Fast evaluation of the learned target function.

Criticism
- The learned function (a set of weights) is difficult to interpret.
- Domain knowledge is not easy to incorporate.
- Training time is long.

These notes contain NDSU confidential & proprietary material. Patents pending on bSQ, Ptree technology.

A Neuron

[Figure: an n-dimensional input vector x = (x0, x1, …, xn) enters the unit through a weight vector w = (w0, w1, …, wn); the weighted sum, offset by a bias μk, passes through an activation function f to produce the output y.]

The n-dimensional input vector x is mapped into the variable y by means of the scalar product with the weight vector and a nonlinear function mapping:

    y = f( Σi wi·xi − μk )
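The formula above can be sketched directly in code. This is a minimal illustration, not from the slides: it assumes a sigmoid for the activation function f (one common choice) and hypothetical example weights.

```python
import math

def neuron_output(x, w, mu_k):
    """Single-neuron forward pass: weighted sum of inputs minus the
    bias mu_k, passed through a sigmoid activation (one choice of f)."""
    net = sum(wi * xi for wi, xi in zip(w, x)) - mu_k
    return 1.0 / (1.0 + math.exp(-net))  # sigmoid activation

# Illustrative example: three inputs, equal weights, zero bias
y = neuron_output([1.0, 0.0, 1.0], [0.5, 0.5, 0.5], 0.0)  # net = 1.0
```

The sigmoid squashes the net input into (0, 1), which is what lets the output be interpreted as a graded (rather than hard 0/1) response.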

Network Training

The ultimate objective of training is to obtain a set of weights that classifies almost all of the tuples in the training data correctly.

Steps:
1. Initialize the weights with random values.
2. Feed the input tuples into the network one by one.
3. For each unit:
   - Compute the net input to the unit as a linear combination of all the inputs to the unit.
   - Compute the output value using the activation function.
   - Compute the error.
   - Update the weights and the bias.
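The steps above can be sketched for a single sigmoid unit trained with gradient-descent (delta-rule) updates. The learning rate, epoch count, and the logical-OR toy dataset are illustrative assumptions, not taken from the slides.

```python
import math
import random

def train_unit(data, n_inputs, lr=0.5, epochs=2000):
    """Train one sigmoid unit following the slide's steps:
    random weight init, then per-tuple error-driven updates.
    data: list of (input_vector, target) pairs, targets in {0, 1}."""
    random.seed(0)
    w = [random.uniform(-0.5, 0.5) for _ in range(n_inputs)]
    bias = random.uniform(-0.5, 0.5)
    for _ in range(epochs):
        for x, t in data:
            net = sum(wi * xi for wi, xi in zip(w, x)) + bias  # net input
            y = 1.0 / (1.0 + math.exp(-net))       # activation function
            delta = (t - y) * y * (1.0 - y)        # error x sigmoid slope
            w = [wi + lr * delta * xi for wi, xi in zip(w, x)]
            bias += lr * delta                     # update weights and bias
    return w, bias

# Illustrative example: learn logical OR (linearly separable)
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b = train_unit(data, 2)
```

After training, the unit's decision boundary w·x + b = 0 separates the single negative example from the three positive ones.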

Multi-Layer Perceptron

[Figure: a feed-forward network. The input vector xi enters the input nodes; weighted connections wij lead to a layer of hidden nodes, which feed the output nodes producing the output vector.]
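The layered structure in the figure can be sketched as a forward pass through one hidden layer. The layer sizes, example weights, and the convention of storing each node's bias as the last entry of its weight row are illustrative assumptions for this sketch.

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def mlp_forward(x, w_hidden, w_output):
    """Forward pass through a one-hidden-layer perceptron.
    w_hidden[j] holds hidden node j's input weights plus a trailing bias;
    w_output[k] holds output node k's hidden-layer weights plus a bias."""
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(row, x)) + row[-1])
              for row in w_hidden]   # input nodes -> hidden nodes (wij)
    return [sigmoid(sum(wi * hi for wi, hi in zip(row, hidden)) + row[-1])
            for row in w_output]     # hidden nodes -> output vector

# Illustrative network: 2 inputs -> 2 hidden nodes -> 1 output
w_h = [[1.0, 1.0, -0.5],    # hidden node 1: weights for x1, x2, then bias
       [-1.0, -1.0, 1.5]]   # hidden node 2
w_o = [[1.0, 1.0, -1.0]]    # output node
out = mlp_forward([1.0, 0.0], w_h, w_o)
```

Each layer repeats the single-neuron computation; the hidden layer's outputs simply become the next layer's inputs, which is what lets the network represent functions a single unit cannot.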