Neural Networks: The Backpropagation Algorithm
Data Mining and Semantic Web
University of Belgrade, School of Electrical Engineering, Chair of Computer Engineering and Information Theory
Miroslav Tišma

What is this? You see a car, but the camera sees only a grid of raw pixel intensity values. (Figure: photo of a car beside the matrix of pixel values the camera actually records.)

Computer Vision: car detection. The learning algorithm is trained on images labeled "Cars" and "Not a car"; at testing time it is asked, for a new image, "What is this?" (Figure: labeled example images from both classes.)

Why do we need something beyond linear models? A raw 50 x 50 pixel image already has 2,500 pixels (7,500 if RGB), so the feature vector $x$ contains 2,500 entries (pixel 1 intensity, pixel 2 intensity, ..., pixel 2500 intensity). Including all quadratic features $x_i x_j$ gives roughly $2500^2 / 2 \approx 3$ million features, far too many for such a simple feature-expansion approach. (Figure: cars and non-cars plotted against two pixel intensities, feeding into a learning algorithm.)

Neural Networks. Origins: algorithms that try to mimic the brain. They were very widely used in the 80s and early 90s; their popularity diminished in the late 90s. Recent resurgence: a state-of-the-art technique for many applications.

Neurons in the brain: each neuron receives input signals through its dendrites and sends its output along the axon. (Figure: diagram of a biological neuron with dendrites as "input wires" and the axon as the "output wire".)

Neuron model: logistic unit. The inputs $x_1, \ldots, x_n$ arrive on the "input wires", each scaled by a parameter ("weight") $\theta_j$; the unit's "output" is $h_\theta(x) = g(\theta^T x) = \frac{1}{1 + e^{-\theta^T x}}$, where $g$ is the sigmoid (logistic) activation function.
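A minimal sketch of such a unit in Python (NumPy assumed; the input values and weights below are illustrative, not from the slides):

    import numpy as np

    def sigmoid(z):
        # Logistic activation g(z) = 1 / (1 + e^(-z))
        return 1.0 / (1.0 + np.exp(-z))

    def logistic_unit(x, theta):
        # One neuron: weighted sum of inputs, then sigmoid.
        # x includes the bias input x_0 = 1 as its first element.
        return sigmoid(theta @ x)

    x = np.array([1.0, 0.5, -1.2])      # [bias, x1, x2]
    theta = np.array([0.1, 2.0, -0.7])  # weights, incl. bias weight
    print(logistic_unit(x, theta))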

Neural network: logistic units wired together in layers. Layer 1 is the "input layer", Layer 2 is the "hidden layer", and Layer 3 is the "output layer"; each non-output layer also has a "bias unit" fixed at +1. (Figure: three-layer feedforward network diagram.)

Neural network notation: $a_i^{(j)}$ is the "activation" of unit $i$ in layer $j$, and $\Theta^{(j)}$ is the matrix of weights controlling the function mapping from layer $j$ to layer $j+1$. If the network has $s_j$ units in layer $j$ and $s_{j+1}$ units in layer $j+1$, then $\Theta^{(j)}$ will be of dimension $s_{j+1} \times (s_j + 1)$ (the $+1$ accounts for the bias unit).
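A quick sanity check of these dimensions (the layer sizes here are made up for illustration):

    import numpy as np

    # Hypothetical layer sizes: 3 inputs, 5 hidden units, 4 outputs.
    s = [3, 5, 4]

    # Theta[j] maps layer j to layer j+1: shape (s[j+1], s[j] + 1).
    rng = np.random.default_rng(0)
    Theta = [rng.standard_normal((s[j + 1], s[j] + 1)) for j in range(len(s) - 1)]

    for j, T in enumerate(Theta, start=1):
        print(f"Theta({j}) shape: {T.shape}")  # (5, 4) then (4, 6)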

Simple example: the AND function. With binary inputs $x_1, x_2 \in \{0, 1\}$, a single logistic unit can compute $y = x_1 \text{ AND } x_2$: with sufficiently large weights, for instance $h_\Theta(x) = g(-30 + 20x_1 + 20x_2)$, the output is $\approx 0$ unless both inputs are 1 (see the sketch after the OR slide).

Example: the OR function. The same unit with a weaker bias, for instance $h_\Theta(x) = g(-10 + 20x_1 + 20x_2)$, outputs $\approx 1$ whenever at least one input is 1.
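A sketch of both gates, using the illustrative weights above (any sufficiently large weights with the same signs implement the same logic):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    AND_THETA = np.array([-30.0, 20.0, 20.0])  # [bias, w1, w2]
    OR_THETA = np.array([-10.0, 20.0, 20.0])

    for x1 in (0, 1):
        for x2 in (0, 1):
            x = np.array([1.0, x1, x2])  # prepend bias input
            print(x1, x2,
                  round(sigmoid(AND_THETA @ x)),  # x1 AND x2
                  round(sigmoid(OR_THETA @ x)))   # x1 OR x2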

Multiple output units: one-vs-all. For four classes (pedestrian, car, motorcycle, truck) the network has four output units, and we want $h_\Theta(x) \approx [1,0,0,0]^T$ when the image is a pedestrian, $[0,1,0,0]^T$ when it is a car, $[0,0,1,0]^T$ when it is a motorcycle, etc. The training labels $y^{(i)}$ are encoded as these one-hot vectors rather than as class indices.
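A minimal sketch of the one-hot target encoding (class names taken from the slide; the helper is mine):

    import numpy as np

    CLASSES = ["pedestrian", "car", "motorcycle", "truck"]

    def one_hot(label):
        # Encode a class label as the target vector y for the network.
        y = np.zeros(len(CLASSES))
        y[CLASSES.index(label)] = 1.0
        return y

    print(one_hot("car"))  # [0. 1. 0. 0.]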

Neural network (classification). Given a training set $\{(x^{(1)}, y^{(1)}), \ldots, (x^{(m)}, y^{(m)})\}$, let $L$ = total number of layers in the network and $s_l$ = number of units (not counting the bias unit) in layer $l$; e.g. the pictured network with Layers 1 through 4 has $L = 4$. Binary classification uses 1 output unit ($y \in \{0, 1\}$); multi-class classification with $K$ classes (e.g. pedestrian, car, motorcycle, truck) uses $K$ output units ($y \in \mathbb{R}^K$, one-hot encoded).

Cost function. Logistic regression:

$J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + (1 - y^{(i)}) \log\left(1 - h_\theta(x^{(i)})\right) \right] + \frac{\lambda}{2m} \sum_{j=1}^{n} \theta_j^2$

Neural network (the same cost, summed over all $K$ output units, regularizing every weight except those multiplying bias units):

$J(\Theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K} \left[ y_k^{(i)} \log\left(h_\Theta(x^{(i)})\right)_k + (1 - y_k^{(i)}) \log\left(1 - \left(h_\Theta(x^{(i)})\right)_k\right) \right] + \frac{\lambda}{2m} \sum_{l=1}^{L-1} \sum_{i=1}^{s_l} \sum_{j=1}^{s_{l+1}} \left(\Theta_{ji}^{(l)}\right)^2$
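A sketch of this cost in Python, assuming the outputs for all m examples have already been computed by forward propagation (variable names are mine):

    import numpy as np

    def nn_cost(h, Y, Thetas, lam):
        # Regularized cross-entropy cost for a network.
        # h: (m, K) matrix of network outputs, Y: (m, K) one-hot targets,
        # Thetas: list of weight matrices, lam: regularization strength.
        m = Y.shape[0]
        data_term = -np.sum(Y * np.log(h) + (1 - Y) * np.log(1 - h)) / m
        # Skip the first column of each Theta: bias weights are not regularized.
        reg_term = lam / (2 * m) * sum(np.sum(T[:, 1:] ** 2) for T in Thetas)
        return data_term + reg_term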

Gradient computation. Our goal is to minimize the cost function: $\min_\Theta J(\Theta)$. For that we need code to compute $J(\Theta)$ and the partial derivatives $\frac{\partial}{\partial \Theta_{ij}^{(l)}} J(\Theta)$.
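One way to sanity-check any gradient code is a finite-difference approximation (not on the slides; added as an illustration, with J any function of a parameter array):

    import numpy as np

    def numerical_gradient(J, theta, eps=1e-4):
        # Approximate dJ/dtheta_i by central differences; useful for
        # checking a backpropagation implementation on a small network.
        grad = np.zeros_like(theta)
        for i in range(theta.size):
            e = np.zeros_like(theta)
            e.flat[i] = eps
            grad.flat[i] = (J(theta + e) - J(theta - e)) / (2 * eps)
        return grad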

Backpropagation algorithm. Given one training example $(x, y)$, forward propagation first computes the activations layer by layer (Layer 1 → Layer 2 → Layer 3 → Layer 4):

$a^{(1)} = x$
$z^{(2)} = \Theta^{(1)} a^{(1)}$, $\quad a^{(2)} = g(z^{(2)})$ (add $a_0^{(2)} = 1$)
$z^{(3)} = \Theta^{(2)} a^{(2)}$, $\quad a^{(3)} = g(z^{(3)})$ (add $a_0^{(3)} = 1$)
$z^{(4)} = \Theta^{(3)} a^{(3)}$, $\quad a^{(4)} = h_\Theta(x) = g(z^{(4)})$
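A minimal sketch of forward propagation under these definitions (sigmoid activations throughout, bias units prepended; function names are mine):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(x, Thetas):
        # Forward propagation for one example; returns the activations
        # of every layer (with bias units prepended on hidden layers).
        a = np.concatenate(([1.0], x))  # a(1) with bias unit
        activations = [a]
        for l, Theta in enumerate(Thetas):
            z = Theta @ a
            a = sigmoid(z)
            if l < len(Thetas) - 1:     # no bias unit on the output layer
                a = np.concatenate(([1.0], a))
            activations.append(a)
        return activations              # last entry is h_Theta(x)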

Intuition: $\delta_j^{(l)}$ is the "error" of node $j$ in layer $l$. For each output unit (layer $L = 4$): $\delta^{(4)} = a^{(4)} - y$. The errors are then propagated backwards:

$\delta^{(3)} = (\Theta^{(3)})^T \delta^{(4)} \mathbin{.\!*} g'(z^{(3)})$
$\delta^{(2)} = (\Theta^{(2)})^T \delta^{(3)} \mathbin{.\!*} g'(z^{(2)})$

where $.*$ is the element-wise multiplication operator and $g'(z^{(l)}) = a^{(l)} \mathbin{.\!*} (1 - a^{(l)})$ for the sigmoid. There is no $\delta^{(1)}$: the inputs have no error.

Backpropagation algorithm. Given a training set $\{(x^{(1)}, y^{(1)}), \ldots, (x^{(m)}, y^{(m)})\}$:

Set $\Delta_{ij}^{(l)} = 0$ (for all $l, i, j$).
For $i = 1$ to $m$:
  Set $a^{(1)} = x^{(i)}$
  Perform forward propagation to compute $a^{(l)}$ for $l = 2, 3, \ldots, L$
  Using $y^{(i)}$, compute $\delta^{(L)} = a^{(L)} - y^{(i)}$
  Compute $\delta^{(L-1)}, \delta^{(L-2)}, \ldots, \delta^{(2)}$
  $\Delta_{ij}^{(l)} := \Delta_{ij}^{(l)} + a_j^{(l)} \delta_i^{(l+1)}$

The accumulated terms give the gradient: $\frac{\partial}{\partial \Theta_{ij}^{(l)}} J(\Theta) = D_{ij}^{(l)}$, where $D_{ij}^{(l)} = \frac{1}{m} \Delta_{ij}^{(l)} + \lambda \Theta_{ij}^{(l)}$ if $j \neq 0$, and $D_{ij}^{(l)} = \frac{1}{m} \Delta_{ij}^{(l)}$ if $j = 0$.
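Putting the pieces together, a sketch of the full loop for a sigmoid network (a straightforward reading of the pseudocode above; names and structure are mine):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def backprop(X, Y, Thetas, lam):
        # One pass of the algorithm above: accumulate Delta over all m
        # examples, then scale (and regularize) to get the gradients D.
        m = X.shape[0]
        Deltas = [np.zeros_like(T) for T in Thetas]
        for x, y in zip(X, Y):
            # Forward propagation, keeping every layer's activations.
            a = np.concatenate(([1.0], x))
            activations = [a]
            for l, T in enumerate(Thetas):
                a = sigmoid(T @ a)
                if l < len(Thetas) - 1:
                    a = np.concatenate(([1.0], a))
                activations.append(a)
            # Output-layer error, then propagate it backwards.
            delta = activations[-1] - y
            for l in range(len(Thetas) - 1, -1, -1):
                Deltas[l] += np.outer(delta, activations[l])
                if l > 0:
                    g_prime = activations[l] * (1 - activations[l])
                    delta = (Thetas[l].T @ delta) * g_prime
                    delta = delta[1:]  # drop the bias unit's "error"
        D = []
        for T, Delta in zip(Thetas, Deltas):
            grad = Delta / m
            grad[:, 1:] += (lam / m) * T[:, 1:]  # don't regularize bias weights
            D.append(grad)
        return D

The returned matrices D can be compared against the finite-difference check from the gradient computation slide, then fed to gradient descent or any other optimizer.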

Advantages:
- Relatively simple to implement
- A standard method that generally works well
- Many practical applications: handwriting recognition, autonomous driving, ...
Disadvantages:
- Slow and inefficient to train
- Can get stuck in local minima, resulting in sub-optimal solutions


Thank you for your attention!