1 Conceptual Foundations (MATH21001)
Lecture Week 12: Review and Applications
Reading: Textbook, Chapters 1-14
© 2008 Pearson Education Australia. Lecture slides for this course are based on teaching materials provided/referred by: (1) Statistics for Managers Using Excel by Levine; (2) Computer Algorithms: Introduction to Design & Analysis by Baase and Van Gelder (slides by Ben Choi to accompany Sara Baase's text); (3) Discrete Mathematics by Richard Johnsonbaugh.
2 Learning Objectives
In this lecture, you will learn to apply basic concepts from weeks 1-11 in computational-intelligence-based learning algorithms:
Matrices in implementing associative memories
The Gaussian distribution and matrices in RBF neural networks
Statistics in various learning algorithms
Automata in real-world applications such as speech recognition and handwriting recognition
3 Artificial Intelligence (AI) / Computational Intelligence (CI)
The techniques that make computers learn and behave like humans are called artificial/computational intelligence based techniques. The term AI was first used in 1956 by John McCarthy. The term Computational Intelligence (CI) was first used in 1994, mainly to cover areas such as neural networks, evolutionary algorithms and fuzzy logic. In this lecture we will focus only on neural-network-based algorithms because of time constraints.
4 Artificial Neuron – Mathematical Model
5 Activation Functions
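As a minimal sketch of the artificial neuron model, y = f(Σi wi xi), together with two common activation functions (step and sigmoid); the example weights and inputs below are illustrative assumptions, not values from the slides.

    import math

    def step(net):
        # Threshold (step) activation: fires when the net input is non-negative
        return 1.0 if net >= 0 else 0.0

    def sigmoid(net):
        # Logistic sigmoid activation: smooth output in (0, 1)
        return 1.0 / (1.0 + math.exp(-net))

    def neuron_output(weights, inputs, activation=step):
        # Weighted sum of the inputs followed by the activation: y = f(sum_i w_i * x_i)
        net = sum(w * x for w, x in zip(weights, inputs))
        return activation(net)

    # Illustrative example (assumed values)
    print(neuron_output([0.5, -0.3, 0.8], [1.0, 0.0, 1.0], sigmoid))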
6 Implementation of Boolean AND and OR
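One way a single threshold unit can implement Boolean AND and OR; the weights and thresholds below are a common choice and are assumptions, since the original figure is not reproduced here.

    def threshold_unit(weights, threshold, inputs):
        # Outputs 1 when the weighted sum of the inputs reaches the threshold
        net = sum(w * x for w, x in zip(weights, inputs))
        return 1 if net >= threshold else 0

    def boolean_and(x1, x2):
        return threshold_unit([1, 1], 2, [x1, x2])   # both inputs must be 1

    def boolean_or(x1, x2):
        return threshold_unit([1, 1], 1, [x1, x2])   # at least one input must be 1

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "AND:", boolean_and(a, b), "OR:", boolean_or(a, b))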
8 Hebb (1949) Learning/Training Algorithm
Step 1: Initialise weights to small random values.
Step 2: Present input x1, ..., xn.
Step 3: Calculate the actual output
    y = f( Σi=1..n wi xi )
Step 4: Adjust the weights
    wij(t+1) = wij(t) + η · xi · yj,   where the learning rate η is between 0 and 1
Step 5: Repeat by going to Step 2.
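A minimal single-output sketch of the training loop above; the learning rate, activation and input pattern are illustrative assumptions.

    import random

    def hebbian_step(weights, inputs, eta=0.1, f=lambda net: 1 if net >= 0 else -1):
        # Step 3: actual output y = f(sum_i w_i * x_i)
        y = f(sum(w * x for w, x in zip(weights, inputs)))
        # Step 4: adjust weights, w_i(t+1) = w_i(t) + eta * x_i * y
        new_weights = [w + eta * x * y for w, x in zip(weights, inputs)]
        return new_weights, y

    # Step 1: initialise weights to small random values
    weights = [random.uniform(-0.1, 0.1) for _ in range(3)]
    # Steps 2-5: repeatedly present an input pattern and adjust the weights
    for _ in range(5):
        weights, y = hebbian_step(weights, [1, -1, 1])
    print(weights, y)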
10 Application - matrices Bi-directional Associative Memory (BAM) The BAM is a nearest neighbour, pattern matching neural network that encodes binary or bipolar pattern pairs. It thus associates patterns from a set A to patterns from a set B, and vice versa.
11 BAM Training and Recall
The weight matrix W is calculated from the bipolar training pairs (Ai, Bi) as the sum of their outer products:
    W = Σi AiT Bi,   where i indexes the ith training pair.
Recall: Bi = f(Ai W), where f thresholds each component of the product (each sum runs over the j inputs):
    sum > 0  →  the component becomes +1
    sum < 0  →  the component becomes -1
    sum = 0  →  the component keeps its previous value (+1 if the previous value > 0, -1 otherwise)
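A minimal numpy sketch of BAM encoding and recall under the rules above; the two bipolar training pairs are illustrative assumptions.

    import numpy as np

    def bam_train(pairs):
        # W is the sum over the training pairs of the outer products A_i^T B_i
        return sum(np.outer(a, b) for a, b in pairs)

    def bam_recall(a, W, previous):
        # Threshold each component: > 0 -> +1, < 0 -> -1, = 0 -> keep the previous value
        net = a @ W
        return np.where(net > 0, 1, np.where(net < 0, -1, previous))

    # Illustrative bipolar pattern pairs (assumed, not from the slides)
    pairs = [(np.array([1, -1, 1, -1]), np.array([1, 1, -1])),
             (np.array([-1, -1, 1, 1]), np.array([-1, 1, 1]))]
    W = bam_train(pairs)
    # Recalling with A_1 recovers its paired pattern B_1 = [1, 1, -1]
    print(bam_recall(pairs[0][0], W, previous=np.ones(3, dtype=int)))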
12-13 Examples (BAM training and recall, worked on the slides)
14 Application of the Gaussian function
The structure of a radial basis function based neural network (classifier) with n inputs and one output is shown in the figure below.
15 Gaussian/radial basis function
The general form of a Gaussian/radial basis function is
    φ(x) = exp( -‖x - c‖² / (2σ²) ),
where σ is a parameter specifying the width of the basis function, often called the smoothing factor or the receptive field, and c is the centre. The shape of the function for three different sizes of σ is shown below.
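A minimal sketch of the Gaussian basis function and of an RBF network output y = Σj wj φj(x); the centres, widths and weights below are illustrative assumptions.

    import math

    def gaussian_rbf(x, centre, sigma):
        # phi(x) = exp(-(x - c)^2 / (2 * sigma^2)); sigma controls the width (receptive field)
        return math.exp(-((x - centre) ** 2) / (2.0 * sigma ** 2))

    def rbf_network_output(x, centres, sigmas, weights):
        # Output layer: weighted sum of the hidden-layer basis function responses
        return sum(w * gaussian_rbf(x, c, s) for w, c, s in zip(weights, centres, sigmas))

    # Illustrative parameters (assumed): two hidden units
    print(rbf_network_output(0.5, centres=[0.0, 1.0], sigmas=[0.5, 0.5], weights=[1.0, -1.0]))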
16 Finite State Automata (FSA) and different variants of Hidden Markov Models (HMMs)
Finite State Automata (FSA) and different variants of Hidden Markov Models (HMMs) have been used quite successfully to address several complex pattern recognition problems, such as speech recognition, cursive handwritten text recognition, time series prediction and biological sequence analysis.
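As a reminder of the underlying machinery, a minimal deterministic finite automaton sketch; the alphabet, states and transition table are illustrative assumptions, not an example from the course.

    def dfa_accepts(transitions, start, accepting, string):
        # Run a deterministic finite automaton over the input string
        state = start
        for symbol in string:
            state = transitions[(state, symbol)]
        return state in accepting

    # Illustrative DFA over {0, 1} that accepts strings containing an even number of 1s
    transitions = {("even", "0"): "even", ("even", "1"): "odd",
                   ("odd", "0"): "odd",   ("odd", "1"): "even"}
    print(dfa_accepts(transitions, "even", {"even"}, "10110"))  # False: three 1s
    print(dfa_accepts(transitions, "even", {"even"}, "1001"))   # True: two 1s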
17 Websites
For "Introduction to Neural Networks" you may refer to the following websites:
rnal/vol4/cs11/report.html
whatisNN.html
des.html
18 Exam Review
Date and time: as advised by the university.
Open book (3 hours): Conceptual Foundations, Term 1, 2009 is an open book examination.
19 You will see the following instructions on your exam paper Instructions to the student Answer all questions in the examination booklet provided.
20 Things to do before the exam … A specimen exam paper is available on the course website. You should do it before the exam and if you have a problem, please see your local tutor. You should review the course material (week 1 to week 12), including your assignments.
21 Some typical questions & topics…
Some typical questions & topics are given below to help you in your exam preparation; you should study accordingly.
You may be asked to state whether a statement is true or false, for example: “Gaussian distribution is square shaped”; “A problem Q is NP-complete if it has no solution”.
You may be given some distribution and asked to calculate various things such as the standard deviation, variance, mean, etc.
You may be asked to calculate regression coefficients based on provided samples (a small sketch follows this list).
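A minimal sketch of these calculations for an illustrative sample (the data values below are assumptions, not exam data).

    import math

    def mean(xs):
        return sum(xs) / len(xs)

    def sample_variance(xs):
        # Sample variance with n - 1 in the denominator
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    def sample_std_dev(xs):
        return math.sqrt(sample_variance(xs))

    def regression_coefficients(xs, ys):
        # Least-squares line y = b0 + b1 * x
        mx, my = mean(xs), mean(ys)
        b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
        b0 = my - b1 * mx
        return b0, b1

    xs, ys = [1, 2, 3, 4, 5], [2.1, 3.9, 6.2, 7.8, 10.1]   # illustrative sample
    print(mean(xs), sample_variance(xs), sample_std_dev(xs))
    print(regression_coefficients(xs, ys))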
22 Some typical questions & topics…
You may be asked to calculate a sample size based on given parameters such as Z, etc.
You may be asked to draw a transition diagram for a finite state automaton.
Do you know the various sorting algorithms? If not, try to learn how to compare sorting algorithms. You may be asked to give an example.
Do you know the various searching algorithms? You might be asked to apply a searching algorithm (e.g. binary search; a small sketch follows this list).
You may be asked to compare two algorithms. Do you know how to calculate the “number of comparisons”?
You may be asked to combine AND, OR, NAND gates, etc.
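A minimal binary search sketch that also counts its comparisons; the sorted list and targets are illustrative assumptions.

    def binary_search(sorted_items, target):
        # Returns (index or None, number of probes/three-way comparisons made)
        low, high, comparisons = 0, len(sorted_items) - 1, 0
        while low <= high:
            mid = (low + high) // 2
            comparisons += 1
            if sorted_items[mid] == target:
                return mid, comparisons
            elif sorted_items[mid] < target:
                low = mid + 1
            else:
                high = mid - 1
        return None, comparisons

    data = [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
    print(binary_search(data, 23))   # (5, 3): found at index 5 after 3 comparisons
    print(binary_search(data, 7))    # (None, 3): not found after 3 comparisons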
23 Some typical questions & topics…
You may be asked to write a Boolean expression to describe a combinatorial circuit.
You may be asked to write the logic table.
Do you understand weighted graphs and various strategies such as nearest neighbor? You may be given a weighted graph and asked to do the following (a small sketch follows below):
- What is the tour (a cycle through all the vertices) found by the nearest neighbor algorithm?
- What is the total weight for the tour found by the nearest neighbor algorithm?
- Did the algorithm find the optimum solution (minimum tour)?
Good luck!
Course Coordinator of MATH21001
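A minimal sketch of the nearest neighbor heuristic on a small weighted graph; the weight matrix below is an illustrative assumption.

    def nearest_neighbor_tour(weights, start=0):
        # weights[i][j] is the edge weight between vertices i and j (complete graph assumed)
        n = len(weights)
        tour, visited = [start], {start}
        while len(tour) < n:
            current = tour[-1]
            # Greedily move to the closest unvisited vertex
            nxt = min((v for v in range(n) if v not in visited), key=lambda v: weights[current][v])
            tour.append(nxt)
            visited.add(nxt)
        tour.append(start)   # return to the start to close the cycle
        total_weight = sum(weights[a][b] for a, b in zip(tour, tour[1:]))
        return tour, total_weight

    # Illustrative symmetric weight matrix for 4 vertices (assumed values)
    W = [[0, 2, 9, 10],
         [2, 0, 6, 4],
         [9, 6, 0, 3],
         [10, 4, 3, 0]]
    print(nearest_neighbor_tour(W))   # ([0, 1, 3, 2, 0], 18)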
24 Summary
In this lecture, we have:
Introduced simple AI/CI algorithms
Discussed applications
Reviewed sample exam questions