Conceptual Foundations © 2008 Pearson Education Australia. Lecture slides for this course are based on teaching materials provided or referred to by: (1) Statistics for Managers Using Excel by Levine, (2) Computer Algorithms: Introduction to Design & Analysis by Baase and Van Gelder (slides by Ben Choi to accompany Sara Baase's text), (3) Discrete Mathematics by Richard Johnsonbaugh. 1 Conceptual Foundations (MATH21001) Lecture Week 12: Review and Applications. Reading: Textbook, Chapters 1-14

2 Learning Objectives In this lecture, you will learn:  To apply some basic concepts from weeks 1-11 in computational intelligence-based learning algorithms  Matrices in implementing associative memories  The Gaussian distribution and matrices in RBF neural networks  Statistics in various learning algorithms  Automata in real-world applications such as speech recognition, handwriting recognition, etc.

3 Artificial Intelligence (AI)/ Computational Intelligence (CI)  The techniques that make computers learn and behave like humans are called artificial/computational intelligence based techniques.  The term AI was first used in 1956 by John McCarthy. The term Computational Intelligence (CI) was first used in 1994, mainly covering areas such as neural networks, evolutionary algorithms and fuzzy logic.  In this lecture we will focus only on neural network-based algorithms due to time constraints.

4 Artificial Neuron – Mathematical Model
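The slide's figure is not reproduced in this transcript. As a minimal sketch of the usual mathematical model (the weights, bias and step activation below are illustrative assumptions, not values from the slide), a neuron forms the weighted sum of its inputs and passes it through an activation function f, i.e. y = f(w_1*x_1 + ... + w_n*x_n + b):

def neuron_output(x, w, b=0.0):
    # Net input: weighted sum of the inputs plus a bias term.
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    # Step activation: fire (1) if the net input is positive, otherwise 0.
    return 1.0 if s > 0 else 0.0

# Example with illustrative weights: fires only when both inputs are on.
print(neuron_output([1, 1], [0.5, 0.5], b=-0.6))   # 1.0
print(neuron_output([1, 0], [0.5, 0.5], b=-0.6))   # 0.0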

5 Activation Functions
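The activation functions on this slide are not shown in the transcript; three commonly used choices (step, logistic sigmoid, and tanh) are sketched below as an assumption about what the slide covers:

import math

def step(s):
    # Binary threshold: 1 for non-negative net input, 0 otherwise.
    return 1.0 if s >= 0 else 0.0

def sigmoid(s):
    # Logistic function: smooth, output in (0, 1).
    return 1.0 / (1.0 + math.exp(-s))

def tanh_act(s):
    # Hyperbolic tangent: smooth, output in (-1, 1).
    return math.tanh(s)

for s in (-2.0, 0.0, 2.0):
    print(s, step(s), round(sigmoid(s), 3), round(tanh_act(s), 3))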

6 Implementation of Boolean AND and OR
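The worked example on this slide is not in the transcript. One common way to realise AND and OR with a single threshold neuron is to put a weight of 1 on both inputs and change only the threshold: 1.5 for AND, 0.5 for OR (these particular values are illustrative, not taken from the slide):

def threshold_neuron(x1, x2, theta):
    # Fires (outputs 1) when the weighted sum of the inputs reaches the threshold theta.
    s = 1.0 * x1 + 1.0 * x2   # both weights set to 1
    return 1 if s >= theta else 0

AND = lambda a, b: threshold_neuron(a, b, 1.5)   # only 1 + 1 reaches 1.5
OR = lambda a, b: threshold_neuron(a, b, 0.5)    # any single 1 is enough

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b))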

7

8 Hebb: 1949 Learning/Training Algorithm
Step 1: Initialise weights to small random values.
Step 2: Present inputs x_1, ..., x_n.
Step 3: Calculate the actual output y = f( Σ_{i=1..n} w_i x_i ).
Step 4: Adjust the weights: w_ij(t+1) = w_ij(t) + η * x_i * y_j, where the learning rate η is between 0 and 1.
Step 5: Repeat by going to Step 2.
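A minimal Python sketch of these steps for a single output neuron (the step activation, learning rate of 0.1, number of epochs and the two training patterns are illustrative assumptions):

import random

def hebb_train(samples, n_inputs, eta=0.1, epochs=10):
    # Step 1: initialise weights to small random values.
    w = [random.uniform(-0.1, 0.1) for _ in range(n_inputs)]
    for _ in range(epochs):
        for x in samples:                                       # Step 2: present an input pattern
            net = sum(wi * xi for wi, xi in zip(w, x))          # Step 3: weighted sum ...
            y = 1.0 if net > 0 else 0.0                         # ... through a step activation
            w = [wi + eta * xi * y for wi, xi in zip(w, x)]     # Step 4: Hebbian update w_i += eta * x_i * y
    return w                                                    # Step 5 is the repeat loop above

print(hebb_train([[1, 1], [1, 0]], n_inputs=2))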

9

10 Application - matrices Bi-directional Associative Memory (BAM)  The BAM is a nearest neighbour, pattern matching neural network that encodes binary or bipolar pattern pairs. It thus associates patterns from a set A to patterns from a set B, and vice versa.

11 BAM Training The weight matrix W is built from the training pairs as W = Σ_i A_i^T B_i, where i indexes the ith training pair. Recall: B_i = A_i W followed by thresholding, where the sum for each output unit is taken over the j inputs: if the sum is greater than 0 the output value becomes +1, if it is less than 0 it becomes -1, and if the sum equals 0 the output is set from its previous value (values > 0 become +1, values <= 0 become -1).
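A minimal sketch of this training and recall procedure (the two bipolar training pairs below are made-up examples, not the ones on the slides):

def bam_weights(pairs):
    # W = sum over training pairs of the outer product A_i^T B_i (bipolar vectors of +1/-1).
    n, m = len(pairs[0][0]), len(pairs[0][1])
    W = [[0] * m for _ in range(n)]
    for A, B in pairs:
        for r in range(n):
            for c in range(m):
                W[r][c] += A[r] * B[c]
    return W

def bam_recall(A, W, prev):
    # B = threshold(A W); on a zero sum, fall back to thresholding the previous output value.
    B = []
    for c in range(len(W[0])):
        s = sum(A[r] * W[r][c] for r in range(len(A)))
        if s > 0:
            B.append(1)
        elif s < 0:
            B.append(-1)
        else:
            B.append(1 if prev[c] > 0 else -1)
    return B

pairs = [([1, -1, 1], [1, 1]), ([-1, 1, -1], [-1, -1])]   # illustrative bipolar pairs
W = bam_weights(pairs)
print(bam_recall([1, -1, 1], W, prev=[1, 1]))              # recovers [1, 1]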

12 Example

13 Example

14 Application of Gaussian function  The structure of a radial basis function (RBF) based neural network (classifier) with n inputs and one output is shown in the figure on the original slide (the figure is not reproduced in this transcript).
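In the standard formulation of such a classifier (stated here as an assumption, since the slide only shows the figure), each hidden unit j computes a Gaussian basis function φ_j(x) of the input vector, and the output is the weighted sum y(x) = w_1 φ_1(x) + ... + w_m φ_m(x), where m is the number of hidden units and the weights w_j are learned from the training data.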

15 Gaussian/radial basis function  The general form of a Gaussian/radial basis function is φ(x) = exp( -||x - c||^2 / (2σ^2) ), where c is the centre and σ is a parameter specifying the width of the basis function, often called the smoothing factor or the receptive field. The slide plots the shape of the function for three different values of σ (a larger σ gives a flatter, broader bump).
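A minimal sketch of this function and of how σ controls its width (the centre, input point and σ values are illustrative):

import math

def gaussian_rbf(x, c, sigma):
    # phi(x) = exp(-||x - c||^2 / (2 * sigma^2)): response decays with distance from the centre c.
    dist_sq = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
    return math.exp(-dist_sq / (2.0 * sigma ** 2))

x, c = [1.0, 1.0], [0.0, 0.0]
for sigma in (0.5, 1.0, 2.0):   # three different widths, as plotted on the slide
    print(sigma, round(gaussian_rbf(x, c, sigma), 4))   # larger sigma -> response closer to 1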

16 Finite State Automata (FSA) and different variants of Hidden Markov Models (HMMs)  Finite State Automata (FSA) and different variants of Hidden Markov Models (HMMs), have been used quite successfully to address several complex pattern recognition problems, such as speech recognition, cursive handwritten text recognition, time series prediction, biological sequence analysis, etc.
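As a toy illustration of the finite state automaton side of this (the alphabet, states and transitions below are made up, not taken from the course material), an FSA is just a set of states, a transition function and a set of accepting states:

# Toy FSA that accepts binary strings containing an even number of 1s.
transitions = {('even', '0'): 'even', ('even', '1'): 'odd',
               ('odd', '0'): 'odd', ('odd', '1'): 'even'}

def accepts(s, start='even', accepting=('even',)):
    # Run the string through the transition table and check whether the final state is accepting.
    state = start
    for symbol in s:
        state = transitions[(state, symbol)]
    return state in accepting

print(accepts('1001'))   # True: two 1s
print(accepts('1011'))   # False: three 1s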

17 Websites  For "Introduction to Neural Networks" you may refer to the following websites.  rnal/vol4/cs11/report.html  whatisNN.html  des.html

18 Exam Review  Date and time: as advised by the university.  Open book (3 hours): Conceptual Foundations, TERM 1, 2009 is an open book examination.

19 You will see the following instructions on your exam paper  Instructions to the student Answer all questions in the examination booklet provided.

20 Things to do before the exam …  A specimen exam paper is available on the course website. You should do it before the exam and if you have a problem, please see your local tutor.  You should review the course material (week 1 to week 12), including your assignments.

21 Some typical questions & topics…  Some typical questions & topics are given below to help you prepare for the exam; you should study accordingly.  You may be asked to state whether a statement is true or false. For example, “Gaussian distribution is square shaped” or “A problem Q is NP-complete if it has no solution”.  You may be given some distribution and asked to calculate various things such as the standard deviation, variance, mean, etc.  You may be asked to calculate regression coefficients based on provided samples.

22 Some typical questions & topics…  You may be asked to calculate a sample size based on given parameters such as Z, etc.  You may be asked to draw a transition diagram for a finite state automaton.  Do you know various sorting algorithms? If not, learn how to compare sorting algorithms; you may be asked to give an example.  Do you know various searching algorithms? You might be asked to apply a searching algorithm (e.g. binary search).  You may be asked to compare two algorithms. Do you know how to calculate the “number of comparisons”?  You may be asked to combine AND, OR, NAND gates, etc.
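As a small refresher for the searching and comparison-counting items above (a standard binary search sketch; counting one comparison per probed element is an illustrative convention, not the exact definition used in the exam):

def binary_search(items, target):
    # Return (index or -1, number of probed elements) on a sorted list.
    lo, hi, comparisons = 0, len(items) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1                 # one probe of the middle element
        if items[mid] == target:
            return mid, comparisons
        if items[mid] < target:
            lo = mid + 1                 # discard the lower half
        else:
            hi = mid - 1                 # discard the upper half
    return -1, comparisons

print(binary_search([2, 5, 8, 12, 16, 23, 38], 23))   # (5, 2)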

23 Some typical questions & topics…  You may be asked to write a boolean expression to describe a combinatorial circuit.  You may be asked to write the logic table.  Do you understand weighted graphs and various strategies such as nearest neighbour?  You may be given a weighted graph and asked to do the following.  What is the tour (a cycle through all the vertices) found by the nearest neighbour algorithm?  What is the total weight of the tour found by the nearest neighbour algorithm?  Did the algorithm find the optimum solution (minimum tour)? Good luck! Course Coordinator of MATH21001
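A minimal sketch of the nearest-neighbour tour calculation on a small made-up weighted graph (the 4-vertex distance matrix is illustrative only):

def nearest_neighbour_tour(dist, start=0):
    # Greedy tour: from the current vertex, always move to the closest unvisited vertex,
    # then close the cycle back to the start. Returns the tour and its total weight.
    n = len(dist)
    tour, visited, total = [start], {start}, 0
    while len(tour) < n:
        current = tour[-1]
        nxt = min((v for v in range(n) if v not in visited), key=lambda v: dist[current][v])
        total += dist[current][nxt]
        tour.append(nxt)
        visited.add(nxt)
    total += dist[tour[-1]][start]   # edge back to the start vertex
    return tour + [start], total

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 8],
        [10, 4, 8, 0]]               # symmetric, made-up weights
print(nearest_neighbour_tour(dist))  # ([0, 1, 3, 2, 0], 23)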

24 Summary In this lecture, we have:  Introduced simple AI/CI algorithms  Discussed applications  Reviewed sample exam questions