1
CS 621 Artificial Intelligence, Lecture 20 (03/10/05), Prof. Pushpak Bhattacharyya: Neural Networks
2
Knowledge Representation
Basic Diagram of AI
[Figure: Search, Reasoning, Knowledge Representation, and Learning at the core, connected to Robotics, Natural Language Processing, Expert Systems, Planning, and Computer Vision.]
IIT Bombay
3
Foundational Concepts in ML
Theory Driven: Explanation Based, Analogy Based.
Data Driven: Learning from Examples (symbolic), Learning from Examples (connectionist).
4
In Learning (Data Driven)
Examples are of two kinds:
i) +ve: belong to the concept.
ii) -ve: do not belong to the concept.
5
Example of Concept
Concept of even numbers. +ve: 2, 4, 6, …
Concept of Table. +ve: a table; -ve: a spoon.
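The +ve/-ve example setup on the slide can be sketched in code. This is an illustrative sketch, assuming the even-numbers concept from the slide; the particular example values are made up.

```python
# Learning from examples (data driven): +ve examples belong to the
# target concept, -ve examples do not. Concept here: even numbers.

def in_concept(n):
    """Target concept from the slide: even numbers."""
    return n % 2 == 0

positives = [2, 4, 6]   # +ve examples: belong to the concept
negatives = [1, 3, 7]   # -ve examples: do not belong (illustrative values)

# Every +ve example satisfies the concept; every -ve example fails it.
assert all(in_concept(x) for x in positives)
assert not any(in_concept(x) for x in negatives)
```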
6
Near Miss Examples
Arch (concept): +ve examples, -ve examples, and a near miss. [Figures omitted.]
27.09.05
7
Only +ve Examples: leads to overgeneralization.
E.g., Arch: "an ARCH is a structure on top of two vertical stands."
8
Only -ve Examples
No hypothesis for the concept is formed.
9
C: Concept, H: Hypothesis.
[Figure: Venn diagram of overlapping sets C and H. Region A = C \ H: false negatives; region B = H \ C: false positives.]
10
Precision = |C ∩ H| / |H|
Recall = |C ∩ H| / |C|
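Treating the concept C and the hypothesis H as sets of examples, precision and recall can be computed directly. A minimal sketch; the two sets below are made-up illustrative data.

```python
# Precision and recall of a hypothesis H against a concept C,
# both represented as sets of examples (illustrative values).

C = {1, 2, 3, 4, 5}    # examples truly in the concept
H = {3, 4, 5, 6, 7}    # examples the hypothesis labels positive

tp = C & H             # true positives: C ∩ H

precision = len(tp) / len(H)   # fraction of H's positives that are correct
recall    = len(tp) / len(C)   # fraction of C that H recovers

print(precision, recall)  # 0.6 0.6
```

Note that regions A and B of the Venn diagram correspond to C - H (false negatives, hurting recall) and H - C (false positives, hurting precision).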
11
Training phase: data is presented and a hypothesis is formed; 100% accuracy on the training data is desired.
Testing phase: performance on unseen examples.
12
Generalization: performance on unseen data, the "leap to generalization."
13
Occam's Razor: out of all hypotheses consistent with the data, the simplest one generalizes best.
14
Complexity
a) Sample complexity: the number of examples needed to learn should not be too large.
b) Time complexity: the time taken to learn should be small.
15
Neural Computer: a learning machine, motivated by the brain.
Aims to correct limitations of the von Neumann style of computing.
16
von Neumann Bottleneck
[Figure: memory and processor connected through a narrow channel, the bottleneck.]
17
Brain: slow processing elements (neurons).
Very large number of neurons (of the order of 10^12).
Very large number of connections (of the order of 10^24).
Parallel and Distributed processing (the same computation divided across many elements).
18
Distributed: different computations at different parts.
PDP (Parallel and Distributed Processing): a characteristic feature of the brain.
19
Key Features of Brain Computation
Processing: PDP.
Storage: content addressable, not address driven.
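Content-addressable storage means items are retrieved by (partial) content rather than by address. A toy sketch of the idea, assuming made-up bit-string patterns; real associative memories (e.g., Hopfield networks) do this with neuron dynamics, not a lookup loop.

```python
# Content-addressable retrieval: given a noisy or partial cue,
# return the stored pattern with the greatest overlap (toy model).

memory = ["10110", "01001", "11100"]   # stored patterns (illustrative)

def retrieve(cue):
    """Return the stored pattern that best matches the cue bit-by-bit."""
    overlap = lambda p: sum(a == b for a, b in zip(p, cue))
    return max(memory, key=overlap)

print(retrieve("10111"))  # noisy cue recalls "10110"
```

By contrast, address-driven (von Neumann) storage would need the exact location of "10110" to fetch it.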
20
Basic Brain Cell (Neuron)
[Figure: cell body with dendrites, an axon, and synapses.]
21
Within the Cell
[Figure: cell membrane separating K+ ions (inside) from Na+ ions (outside).]
22
Mechanism of Neuron Firing
When the potential difference across the membrane is large, the cell emits a signal: the neuron is "high" or "on."
The signal is transmitted over the axon-dendrite combination.
23
Modeling: the Perceptron
[Figure: inputs X1, …, Xn with weights W1, …, Wn feeding a threshold unit θ, producing output y.]
24
Computing Law
y = 1 if Σ wi·xi > θ (summing i = 1 to n); otherwise y = 0.
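The computing law above translates directly into code. A minimal sketch; the AND-gate weights and threshold in the usage example are illustrative choices, not from the slide.

```python
# Perceptron computing law: y = 1 if the weighted input sum exceeds
# the threshold theta, else y = 0.

def perceptron(x, w, theta):
    """Step-threshold unit: y = 1 if sum(w_i * x_i) > theta, else 0."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if s > theta else 0

# Example: a 2-input AND gate with w = (1, 1), theta = 1.5.
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, perceptron(x, (1, 1), 1.5))
# Only (1, 1) pushes the sum above 1.5, so only it outputs 1.
```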
25
Threshold Function
[Figure: step function; y jumps from 0 to 1 at θ.]
The threshold element's characteristic function is a step function.
26
In the connectionist paradigm, the basic computing element is the perceptron.
In the symbolic paradigm, the basic computing element is the Turing machine.
27
McCulloch and Pitts (1943)
An assembly of perceptrons with appropriate characteristic functions and connections is equivalent to a Turing machine.
28
Difference Between Symbolic Computation and Connectionist Computation
In symbolic computation, the machine reads and writes symbols and moves a head (through state changes).
In connectionist computation, the machine sets up "classifying surfaces."
29
Classifying Surface
[Figure: a surface separating + points from - points in the plane.]
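The classifying surface a single perceptron sets up is a line (in general, a hyperplane) w·x = θ separating + points from - points. A sketch of the classic perceptron learning rule on a small linearly separable set; the data points, learning rate, and epoch count are illustrative assumptions, not from the slide.

```python
# Perceptron learning rule: nudge weights toward misclassified + points
# and away from misclassified - points until the line w.x = theta
# separates the two classes.

def train_perceptron(data, epochs=20, lr=0.1):
    w, theta = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, label in data:            # label is 1 (+) or 0 (-)
            y = 1 if w[0] * x[0] + w[1] * x[1] > theta else 0
            err = label - y              # 0 if correct, +1/-1 if wrong
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            theta -= lr * err            # threshold moves opposite to w
    return w, theta

# A small linearly separable set: + points up-right, - points down-left.
data = [((2, 2), 1), ((3, 3), 1), ((0, 0), 0), ((1, 0), 0)]
w, theta = train_perceptron(data)
preds = [1 if w[0] * x[0] + w[1] * x[1] > theta else 0 for x, _ in data]
print(preds)  # matches the labels: [1, 1, 0, 0]
```

For linearly separable data this rule is guaranteed to converge; for the overlapping +/- pattern sketched on the slide, no single line suffices and a multi-layer network is needed.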