1 Computation in neural networks
M. Meeter

2 Perceptron learning problem
Input patterns                          Desired output
[+1, +1, -1, -1]                        [+1, -1, +1]
[-1, -1, +1, +1]                        [+1, +1, -1]
[-1, -1, -1, -1], [-1, -1, +1, -1]      [-1, -1, -1]
[-1, +1, +1, -1]                        [-1, +1, +1]
[+1, -1, +1, -1]                        ?
Calculating a function.

3 Types of networks & functions
- Attractor: completion, autoassociative memory
- Feedforward Hebbian:
  - associative (Hebbian): association, associative memory
  - competitive: clustering
- Feedforward error-correcting:
  - perceptron: categorization, generalization
  - backprop: the same, but nonlinear

4 Types of networks
- Attractor: completion, autoassociative memory
- Feedforward Hebbian:
  - associative (Hebbian): association, associative memory
  - competitive: clustering
- Feedforward error-correcting:
  - perceptron: categorization, generalization
  - backprop: the same, but nonlinear

5 Classification
[Figure: examples from two categories; a new item A must be assigned to one of them]

6 Generalization
[Figure: a new input whose output, marked '?', must be generalized from the training data]

7 Univariate linear regression
Prediction of values. Regression = generalization.

8 Clustering

9 Types of networks
- Attractor: completion, autoassociative memory
- Feedforward Hebbian:
  - associative (Hebbian): association, associative memory
  - competitive: clustering
- Feedforward error-correcting:
  - perceptron: categorization, generalization
  - backprop: the same, but nonlinear

10 Perceptron learning problem
Prototypical input patterns             Desired output
[+1, +1, -1, -1]                        [+1, -1, +1]
[-1, -1, +1, +1]                        [+1, +1, -1]
[-1, -1, -1, -1], [-1, -1, +1, -1]      [-1, -1, -1]
[-1, +1, +1, -1]                        [-1, +1, +1]
[+1, -1, +1, -1]                        ?
Classification: discrete.


12 Classification in the perceptron
[Figure: output node j with inputs x_1, x_2, ..., x_n, weights w_ji, and a threshold on the summed input]
Output: +1 if Σ_i x_i · w_ji > threshold, otherwise -1.
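To make this concrete, here is a minimal sketch of the classic perceptron learning rule applied to the patterns of slide 10, assuming the input-output pairing reconstructed there, ±1 activations, a trainable bias in place of a fixed threshold (see slide 13), and an arbitrary learning rate. This is an illustration, not code from the lecture.

import numpy as np

# Training pairs from slide 10; the '?' pattern is held out.
X = np.array([[+1, +1, -1, -1],
              [-1, -1, +1, +1],
              [-1, -1, -1, -1],
              [-1, -1, +1, -1],
              [-1, +1, +1, -1]])
T = np.array([[+1, -1, +1],
              [+1, +1, -1],
              [-1, -1, -1],
              [-1, -1, -1],
              [-1, +1, +1]])

W = np.zeros((3, 4))   # one row of weights w_ji per output node j
b = np.zeros(3)        # bias: shifts the threshold (slide 13)
eta = 0.1              # learning rate (assumed value)

def classify(x):
    # Hard threshold: +1 if the net input exceeds 0, else -1.
    return np.where(W @ x + b > 0, 1, -1)

for epoch in range(100):
    mistakes = 0
    for x, t in zip(X, T):
        y = classify(x)
        W += eta * np.outer(t - y, x)   # weights move only where y != t
        b += eta * (t - y)
        mistakes += int(np.any(y != t))
    if mistakes == 0:                   # all five patterns correct
        break

print(classify(np.array([+1, -1, +1, -1])))  # generalize to the '?' pattern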

13 A quick aside
- In the perceptron etc.: if a node's net input exceeds 0, the node becomes active; otherwise its activation is 0.
- That is not always what you want: therefore, in the continuous forms of the perceptron / backprop, a node has a 'bias', an activation that is always added to its input.
- Effect: it shifts the threshold.

14 Classification in 2 dimensions
[Figure: two input dimensions with '+' and '-' points; the line where the summed input equals the threshold separates the two regions, with a mixture of points near the boundary]
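Written out, the boundary in this figure is a straight line; a one-line derivation (the symbols w_1, w_2, θ are my notation, not the slide's):

\[
w_1 x_1 + w_2 x_2 = \theta
\quad\Longrightarrow\quad
x_2 = \frac{\theta - w_1 x_1}{w_2} \qquad (w_2 \neq 0)
\]

Points whose net input exceeds θ fall on the '+' side of this line, the rest on the '-' side.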

15 Discriminant analysis
Produces exactly the same result: find the centre of each of the two categories, draw the line connecting the centres, and take the perpendicular through its midpoint as the discrimination line.
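A sketch of that geometric recipe in code, with invented data; note this is the nearest-centroid special case of discriminant analysis, not Fisher's full procedure.

import numpy as np

def centroid_discriminant(A, B):
    # Centre of each category, then a boundary through the midpoint,
    # perpendicular to the line connecting the two centres.
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    w = cb - ca                  # normal vector of the discrimination line
    theta = w @ (ca + cb) / 2    # net input exactly at the midpoint
    return w, theta              # classify as B wherever w @ x > theta

rng = np.random.default_rng(0)
A = rng.normal(loc=[-1.0, -1.0], scale=0.5, size=(20, 2))
B = rng.normal(loc=[+1.0, +1.0], scale=0.5, size=(20, 2))
w, theta = centroid_discriminant(A, B)
print((A @ w > theta).mean(), (B @ w > theta).mean())  # ~0.0 and ~1.0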

16 Univariate linear regression
Prediction of values. Generalization = regression.

17 Perceptron with linear activation rule
[Figure: node j with inputs x_1, x_2, ..., x_n, weights w_ji, a bias, activation function φ(·), and output y]
Net input: v = Σ_i x_i · w_ji
Linear activation: φ(v) = a·v + b
Change the weights with the δ rule, minimizing Σ e².
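A minimal sketch of that δ (delta) rule for a single linear node, with invented data and an assumed learning rate; the weights are changed after each pattern, in the direction that shrinks Σ e².

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))             # three inputs x_i per pattern
t = X @ np.array([0.5, -2.0, 1.0]) + 0.3  # linear target with an offset

w = np.zeros(3)
bias = 0.0
eta = 0.05                                # learning rate (assumed)

for epoch in range(200):
    for x, target in zip(X, t):
        v = x @ w + bias   # net input: v = sum_i x_i * w_i
        e = target - v     # error on this one pattern
        w += eta * e * x   # delta rule: dw_i = eta * e * x_i
        bias += eta * e    # bias trained like a weight on a fixed input of 1

print(w.round(2), round(bias, 2))  # recovers [0.5, -2.0, 1.0] and 0.3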

18 Multivariate multiple linear regression
[Figure: network with input vector X = [x_1, x_2, ..., x_i, ..., x_n] feeding two output nodes y_1 and y_2]
Multivariate = multiple independent variables: X = multiple inputs.
Multiple = multiple dependent variables: Y = multiple outputs.
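In matrix form this is Y = XW plus a bias, and the statistical version can be solved in one step. A sketch with invented data, using least squares so that every output column gets its own weight vector:

import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 4))                    # multivariate: 4 inputs
W_true = rng.normal(size=(4, 2))                 # one weight column per output
Y = X @ W_true + rng.normal(0.0, 0.1, (200, 2))  # multiple: 2 noisy outputs

X1 = np.hstack([X, np.ones((200, 1))])           # extra column of 1s: the bias
W_hat, *_ = np.linalg.lstsq(X1, Y, rcond=None)   # minimizes the sum of e^2
print(np.abs(W_hat[:4] - W_true).max())          # small: weights recovered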

19 Linear vs. nonlinear regression
[Figure: two x-y scatter plots, one with a linear fit, one with a nonlinear fit]
Here: quadratic. In general: "wrinkle-fitting".

20 Multi-layer perceptron
[Figure: network with input vector X = [x_1, x_2, .., x_i, .., x_n], a hidden layer, and outputs y_1 and y_2; each node computes v = Σ_i x_i · w_ji]
Can fit any function: "universal approximators".
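To illustrate the "universal approximator" claim, a small sketch: a one-hidden-layer net trained with backprop to fit y = x². The architecture, the tanh nonlinearity, and the learning rate are my assumptions, not the lecture's.

import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(-1, 1, 64)[:, None]   # inputs
y = x ** 2                            # a nonlinear target function

H = 10                                # hidden nodes
W1 = rng.normal(0.0, 1.0, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 1.0, (H, 1)); b2 = np.zeros(1)
eta = 0.1

for step in range(5000):
    h = np.tanh(x @ W1 + b1)          # hidden layer: phi = tanh
    out = h @ W2 + b2                 # linear output node
    e = out - y                       # errors; we minimize their squares
    dW2 = h.T @ e / len(x); db2 = e.mean(axis=0)
    dh = (e @ W2.T) * (1 - h ** 2)    # backpropagate: tanh' = 1 - tanh^2
    dW1 = x.T @ dh / len(x); db1 = dh.mean(axis=0)
    W2 -= eta * dW2; b2 -= eta * db2
    W1 -= eta * dW1; b1 -= eta * db1

print(float(np.mean((out - y) ** 2)))  # near zero: the net fits the curve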

21 Overfitting
[Figure: two x-y plots; a too-simple model fits the data badly, a too-complex model generalizes extremely badly]
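The same failure modes can be shown with polynomial fits; a sketch with invented data (the degrees are chosen for illustration):

import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0, 1, 12)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, 12)  # 12 noisy samples
x_new = np.linspace(0, 1, 100)                        # unseen inputs
y_new = np.sin(2 * np.pi * x_new)

for degree in (1, 3, 11):   # too simple, about right, too complex
    coef = np.polyfit(x, y, degree)
    err = np.mean((np.polyval(coef, x_new) - y_new) ** 2)
    print(degree, round(float(err), 3))  # degree 11 is extremely bad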

22 Clustering
Competitive learning: next week; ART.

23 Conclusions
- Neural networks are similar to statistical analyses:
  - perceptron → categorization / generalization
  - backprop → the same, but nonlinear
  - competitive learning → clustering
- But… statistics typically fits the whole data set at once, whereas these networks learn one pattern at a time.

24 Feature reduction with PCA

25 Feature extraction with PCA
Unsupervised learning; Hebbian learning.
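A minimal sketch of PCA itself, via the eigenvectors of the covariance matrix (data invented). The link to unsupervised Hebbian learning is that a linear node trained with a normalized Hebbian update, Oja's rule, converges to the same first principal component; that rule is not shown here.

import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0],
                                          [1.0, 0.5]])  # correlated features

Xc = X - X.mean(axis=0)                 # centre the data first
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                    # direction of largest variance
reduced = Xc @ pc1                      # 2 features -> 1 extracted feature
print(eigvals.round(2), round(float(reduced.var(ddof=1)), 2))  # variance of
# the reduced feature matches the largest eigenvalue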