CS 621 Artificial Intelligence
Lecture 20 - 03/10/05
Prof. Pushpak Bhattacharyya
Neural Networks
Basic Diagram of AI
(Diagram: a core of Search, Reasoning, Knowledge Representation and Learning, connected to the application areas Robotics, Natural Language Processing, Expert Systems, Planning and Computer Vision.)
27.09.05 IIT Bombay
Foundational Concepts in ML
- Theory Driven: Explanation Based, Analogy Based
- Data Driven: Learning from Examples (symbolic), Learning from Examples (connectionist)
In Learning (Data Driven)
Examples are of two kinds:
i) +ve: belongs to the concept
ii) -ve: does not belong to the concept
Example of Concept
- Concept of even numbers: +ve: 2, 4, 6, ...
- Concept of "table": +ve: (picture of a table); -ve: a table spoon
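The even-numbers concept above can be sketched in code: the concept is a predicate, and each example is labelled +ve or -ve by testing membership. This is a minimal illustration; the example numbers are chosen here, not taken from the slide.

```python
# A concept as a membership predicate: the "even numbers" concept.
def is_even(n: int) -> bool:
    return n % 2 == 0

# Label each example as +ve (belongs to the concept) or -ve (does not).
examples = [2, 3, 4, 7, 10]
labels = ["+ve" if is_even(n) else "-ve" for n in examples]
print(labels)  # ['+ve', '-ve', '+ve', '-ve', '+ve']
```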
Near Miss Examples
Arch (concept): +ve and -ve examples shown as pictures; a -ve example that differs from a +ve one in only a small way is a "near miss".
Only +ve Examples
With only +ve examples we overgeneralize: "An ARCH is a structure on top of two vertical stands."
Only -ve Examples
No hypothesis for the concept is formed.
Concept vs. Hypothesis
(Venn diagram of two overlapping sets C and H.)
- C: Concept
- H: Hypothesis
- A (region of C outside H): false negatives
- B (region of H outside C): false positives
Precision = |C ∩ H| / |H|
Recall = |C ∩ H| / |C|
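The two measures can be computed directly with Python sets, where C is the true concept and H the learner's hypothesis. The example sets below are illustrative, not from the lecture.

```python
def precision_recall(C: set, H: set):
    """Return (precision, recall) for hypothesis H against concept C."""
    true_positives = len(C & H)          # examples correctly covered by H
    precision = true_positives / len(H)  # fraction of H that is correct
    recall = true_positives / len(C)     # fraction of C that is covered
    return precision, recall

C = {2, 4, 6, 8, 10}  # true concept
H = {2, 4, 6, 9}      # hypothesis: covers 3 members of C, plus one false positive

p, r = precision_recall(C, H)
print(p, r)  # 0.75 0.6
```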
Training phase: data is presented and a hypothesis is formed; 100% accuracy on the training data is desired.
Testing phase: performance on unseen examples.
Generalization
Performance on unseen data: the "leap to generalization".
Occam's Razor
Out of all hypotheses, the simplest one generalizes the best.
Complexity
a) Sample complexity: the number of examples needed to learn should not be too large.
b) Time complexity: the time taken to learn should be small.
Neural Computer
A learning machine motivated by the brain. It aims to overcome the limitations of the von Neumann style of computing.
Von Neumann Bottleneck
(Diagram: processor and memory linked by a single channel, the bottleneck.)
Brain
- Slow processing elements (neurons).
- Very large number of neurons (on the order of 10^12).
- Very large number of connections (on the order of 10^24).
- Parallel and Distributed Processing (parallel: the same computation divided across elements).
Distributed: different computations at different parts.
PDP is a characteristic feature of the brain.
Key Features of Brain Computation
- Processing: PDP.
- Storage: content addressable, not address driven.
Basic Brain Cell (Neuron)
(Diagram: cell body with dendrites, an axon, and synapses where the axon meets the dendrites of other cells.)
Within the Cell
(Diagram: K+ ions inside and Na+ ions outside the cell, separated by the membrane.)
Mechanism of Neuron Firing
When the potential difference between the two sides of the membrane is large, the cell emits a signal: the NEURON is "high" or "on". The signal is transmitted over the axon-dendrite combination.
Modeling: Perceptron
(Diagram: inputs x1, ..., xn-1, xn with weights w1, ..., wn-1, wn feeding a threshold element with threshold θ and output y.)
Computing Law
y = 1 if Σ_{i=1}^{n} w_i x_i > θ
y = 0 otherwise
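The computing law is a few lines of code: fire when the weighted sum of the inputs exceeds the threshold. The weights and threshold below are illustrative values chosen here, not from the lecture.

```python
def perceptron(weights, inputs, theta):
    """Threshold element: returns 1 when sum(w_i * x_i) > theta, else 0."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s > theta else 0

# A 2-input unit with weights (1, 1) and threshold 1.5 computes logical AND.
print(perceptron([1, 1], [1, 1], 1.5))  # 1
print(perceptron([1, 1], [1, 0], 1.5))  # 0
```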
Threshold Function
(Plot: y jumps from 0 to 1 when the input crosses θ.)
The threshold element's characteristic function is a step function.
In the connectionist paradigm, the basic computing element is the perceptron. In the symbolic paradigm, the basic computing element is the Turing machine.
McCulloch and Pitts (1943)
An assembly of perceptrons with appropriate characteristic functions and connections is equivalent to a Turing machine.
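One way to see why assemblies of threshold units are computationally powerful: with suitable weights and thresholds they realize Boolean gates, and gates compose into arbitrary circuits. The weight/threshold assignments below are one standard choice, not taken from the slide.

```python
def unit(weights, theta):
    """Build a threshold unit: fires when the weighted input sum exceeds theta."""
    return lambda *x: 1 if sum(w * xi for w, xi in zip(weights, x)) > theta else 0

AND = unit([1, 1], 1.5)    # fires only when both inputs are 1
OR  = unit([1, 1], 0.5)    # fires when at least one input is 1
NOT = unit([-1], -0.5)     # fires only when the input is 0

# XOR is not computable by any single threshold unit, but an assembly works:
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))

print([XOR(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```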
Difference Between Symbolic and Connectionist Computation
- Symbolic computation: reads and writes symbols and moves a head (through state changes), as in a Turing machine.
- Connectionist computation: sets up "classifying surfaces".
Classifying Surface
(Diagram: a scatter of + and - points in the plane, separated by a classifying surface.)
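How a perceptron arrives at such a surface can be sketched with the classical perceptron learning rule, which nudges the weights and threshold after each misclassified point until a line separates the +ve from the -ve examples. The toy data and learning-rate choice below are illustrative (and linearly separable), not taken from the slide.

```python
def train_perceptron(data, epochs=20, lr=1.0):
    """data: list of ((x1, x2), label) pairs with label in {0, 1}.
    Returns weights w and threshold theta defining the classifying line."""
    w = [0.0, 0.0]
    theta = 0.0
    for _ in range(epochs):
        for (x1, x2), label in data:
            y = 1 if w[0] * x1 + w[1] * x2 > theta else 0
            err = label - y          # +1, 0, or -1
            w[0] += lr * err * x1    # move the surface toward misclassified points
            w[1] += lr * err * x2
            theta -= lr * err        # threshold moves opposite to the weights
    return w, theta

data = [((2, 2), 1), ((3, 1), 1), ((-1, -2), 0), ((-2, -1), 0)]
w, theta = train_perceptron(data)
preds = [1 if w[0] * x1 + w[1] * x2 > theta else 0 for (x1, x2), _ in data]
print(preds)  # [1, 1, 0, 0]
```

For linearly separable data like this, the perceptron convergence theorem guarantees the rule finds a separating surface in finitely many updates.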