
2 CS 4700: Foundations of Artificial Intelligence
Prof. Carla P. Gomes (gomes@cs.cornell.edu)
Module: Intro to Neural Networks (Reading: Chapter 20.5)

3 Neural Networks
Rich history, starting in the early forties with McCulloch and Pitts's model of artificial neurons (McCulloch and Pitts 1943).
Two views:
– Modeling the brain
– "Just" a representation of complex functions (continuous; contrast decision trees)
Much progress on both fronts. Has drawn interest from neuroscience, cognitive science, AI, physics, statistics, and CS/EE.

4 Computer vs. Brain (circa 1997)
[Chart: computer processor speed (MIPS) vs. information storage (megabytes), comparing computers with the brain.]

5 Increasing Compute Power: Moore's Law
In 1965, Intel co-founder Gordon Moore predicted that the number of transistors on a chip would double about every two years (popularly known as Moore's Law). Intel has kept that pace for nearly 40 years.
The world's first 2-billion-transistor microprocessor was delivered in the next-generation Intel® Itanium® processor codenamed Tukwila, targeted for production toward the end of 2008.

6 Computer Power / Cost
[Chart: computer processor speed (MIPS) per unit cost, circa 1997.]

7 Neural Networks
Computational model inspired by the brain, based on the interaction of multiple connected processing elements (connectionism, parallel distributed processing, neural computation).
The brain's information-processing power emerges from a highly interconnected network of neurons.
[Diagram: a neuron with inputs, outputs, and connections between cells that are excitatory or inhibitory and may change over time. Around 10^11 neurons and 10^14 synapses; cycle time of 1–10 ms. When inputs reach some threshold, an action potential (electric pulse) is sent along the axon to the outputs.]

8 Biological Neurons
The brain is made up of neurons, which have:
– A cell body (soma)
– Dendrites (inputs)
– An axon (outputs)
– Synapses (connections between cells)
Synapses can be excitatory or inhibitory and may change over time.
When the inputs reach some threshold, an action potential (electric pulse) is sent along the axon to the outputs.
There are around 10^11 neurons and 10^14 synapses, with a cycle time of 1 ms–10 ms.
Signals are noisy "spike trains" of electrical potential.
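The slides do not give code, but the threshold behavior described above is what the McCulloch-Pitts abstraction captures: weighted inputs are summed and compared against a threshold. A minimal sketch follows; the weights, threshold, and inputs are illustrative values, not anything from the lecture.

```python
# Minimal sketch of a McCulloch-Pitts-style threshold unit:
# a weighted sum of inputs compared against a threshold.
# Weights, threshold, and inputs are illustrative values only.

def threshold_unit(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of inputs reaches the threshold."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Example: two excitatory connections (+0.6) and one inhibitory connection (-1.0).
weights = [0.6, 0.6, -1.0]
print(threshold_unit([1, 1, 0], weights, threshold=1.0))  # 1: the unit fires
print(threshold_unit([1, 1, 1], weights, threshold=1.0))  # 0: inhibition prevents firing
```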

9 Issue: The Hardware
The brain:
– A neuron, or nerve cell, is the basic information-processing unit (~10^11 neurons)
– Many more synapses (~10^14) connect the neurons
– Cycle time: 10^-3 seconds (1 millisecond)
How complex can we make computers?
– 10^8 or more transistors per CPU
– Supercomputer: hundreds of CPUs, 10^10 bits of RAM
– Cycle times: on the order of 10^-9 seconds (1 nanosecond)

10 Compute Power vs. Brain Power
In the near future we may have computers with as many processing elements as our brain, but with:
– far fewer interconnections (wires) than synapses
– much faster updates (1 nanosecond, 10^-9 s, vs. the brain's 1 millisecond, 10^-3 s)
Fundamentally different hardware may require fundamentally different algorithms!
– Very much an open question.

11 Why Neural Nets?
Motivation: solving problems under constraints similar to those of the brain may lead to solutions to AI problems that would otherwise be overlooked.
– Individual neurons operate very slowly → massively parallel algorithms
– Neurons are failure-prone devices → distributed and redundant representations
– Neurons promote approximate matching → less brittle systems

12 Connectionist Models of Learning
Characterized by:
– A large number of very simple, neuron-like processing elements
– A large number of weighted connections between the elements
– Highly parallel, distributed control
– An emphasis on learning internal representations automatically
But of course the interconnectivity is not really at brain scale…

13 ALVINN: Autonomous Land Vehicle In a Neural Network
ALVINN learns to drive an autonomous vehicle at normal speeds on public highways (Pomerleau et al., 1993).
ALVINN is a perception system that learns to control the NAVLAB vehicles by watching a person drive.

14 ALVINN drives 70 mph on highways
Input: a 30 x 32 grid of pixel intensities from a camera.
Each output unit corresponds to a particular steering direction; the most highly activated one gives the direction to steer.
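To make the input-to-steering mapping concrete, here is a rough sketch of the architecture the slide describes: a flattened 30 x 32 image feeding a small feedforward network whose output units each stand for a steering direction, with the most activated unit winning. This is not ALVINN's actual code; the hidden-layer size, the number of steering outputs, and the random weights are assumptions made for illustration.

```python
# Illustrative ALVINN-style sketch: 30 x 32 camera image -> hidden layer ->
# one output unit per steering direction; steer toward the most activated unit.
# Hidden size (4), output count (30), and random weights are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 30 * 32        # camera image flattened to 960 pixel intensities
n_hidden = 4              # assumed hidden-layer size
n_outputs = 30            # assumed number of discrete steering directions

W1 = rng.normal(scale=0.1, size=(n_hidden, n_inputs))
W2 = rng.normal(scale=0.1, size=(n_outputs, n_hidden))

def steer(image):
    """Return the index of the most highly activated steering unit."""
    x = image.reshape(-1)        # flatten the 30 x 32 image
    h = np.tanh(W1 @ x)          # hidden-layer activations
    y = np.tanh(W2 @ h)          # one activation per steering direction
    return int(np.argmax(y))     # most highly activated unit wins

camera_frame = rng.random((30, 32))   # stand-in for a camera image
print(steer(camera_frame))
```

In the real system the weights would be learned from recordings of a human driver, rather than drawn at random as they are here.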

15 What kinds of problems are suitable for neural networks?
– Sufficient training data is available
– Long training times are acceptable
– It is not necessary for humans to understand the learned target function or hypothesis → neural networks are "magic" black boxes

16 Tasks
– Function approximation, or regression analysis, including time series prediction and modeling
– Classification, including pattern and sequence recognition, novelty detection, and sequential decision making
– Data processing, including filtering, clustering, blind signal separation, and compression

17 Example Application Areas
Application areas include:
– System identification and control (vehicle control, process control)
– Game playing and decision making (backgammon, chess, racing)
– Pattern recognition (radar systems, face identification, object recognition, etc.)
– Sequence recognition (gesture, speech, handwritten text recognition)
– Medical diagnosis
– Financial applications
– Data mining (knowledge discovery in databases, "KDD")
– Visualization
– E-mail spam filtering

