Models of the brain hardware

Slides from: Doug Gray, David Poole

Models of the brain hardware Meet the Neurons

A Real Neuron (figure: “The Neuron”)

McCulloch-Pitts (1940s) y = f(net) = f( Σ wi xi + b ) The wi are weights, the xi are inputs, b is a bias term, and f() is the activation function: f = 1 if net >= 0, else -1
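A minimal sketch of this unit in Python (the function name and the AND-style example are illustrative, not from the slides):

```python
def mcp_neuron(weights, inputs, bias):
    """McCulloch-Pitts unit: threshold activation on a weighted sum."""
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if net >= 0 else -1

# An AND-like gate: fires (+1) only when both inputs are on
print(mcp_neuron([1, 1], [1, 1], -1.5))   # net = 0.5  -> 1
print(mcp_neuron([1, 1], [1, -1], -1.5))  # net = -1.5 -> -1
```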

Activation Functions Modified McCulloch-Pitts The function shown before is a threshold function; common alternatives are the tanh function and the logistic function
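The three activation functions named above can be sketched as follows (function names are mine):

```python
import math

def threshold(net):
    """The original McCulloch-Pitts step: -1/+1 on the sign of net."""
    return 1 if net >= 0 else -1

def logistic(net):
    """Smooth squashing function with outputs in (0, 1)."""
    return 1 / (1 + math.exp(-net))

def tanh_act(net):
    """Smooth squashing function with outputs in (-1, +1)."""
    return math.tanh(net)

for f in (threshold, logistic, tanh_act):
    print(f.__name__, f(0.0), f(2.0))
```

The smooth variants matter later: back propagation needs a differentiable activation, which the hard threshold is not.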

Human Brain Neuron speed - 10^-3 seconds per operation The brain weighs about 3 pounds and at rest consumes 20% of the body's oxygen. Estimates place the neuron count at 10^12 to 10^14 Connectivity can be 10,000 connections per neuron

What is the capacity of the brain? Estimate the MIPS of a brain Estimate the MIPS needed by a computer to simulate the brain
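A back-of-envelope sketch using the figures from the previous slide (the low-end neuron count, 10,000 connections per neuron, one operation per millisecond; counting each connection update as one "instruction" is an assumption):

```python
# Figures from the slides
neurons = 1e12                 # low end of the 10^12..10^14 estimate
connections_per_neuron = 1e4   # fan-out per neuron
ops_per_second = 1e3           # 10^-3 s per operation

# Assumption: every connection update counts as one instruction
total_ops = neurons * connections_per_neuron * ops_per_second
mips = total_ops / 1e6
print(f"~{total_ops:.0e} ops/s, i.e. ~{mips:.0e} MIPS")
```

Even the low-end estimate lands around 10^13 MIPS, far beyond a conventional serial machine.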

Structure The cortex is estimated to have 6 layers The brain does recognition-type computations in 100-200 milliseconds The brain clearly uses some specialized structures.

Survey of Artificial Neural Networks

The Perceptron Rosenblatt - 1950s Linear classifiers built from McCulloch-Pitts neurons (diagram: inputs x1..xn feed a single layer of m neurons with outputs y1..ym)
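A minimal sketch of Rosenblatt's perceptron learning rule on a linearly separable example (the data set and hyperparameters are illustrative):

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Perceptron rule on (inputs, target) pairs with targets in {-1, +1}."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, t in samples:
            net = sum(wi * xi for wi, xi in zip(w, x)) + b
            y = 1 if net >= 0 else -1
            if y != t:  # update only on mistakes
                w = [wi + lr * t * xi for wi, xi in zip(w, x)]
                b += lr * t
    return w, b

# A linearly separable OR-style data set
data = [([0, 0], -1), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b = train_perceptron(data)
```

Because the classes are linearly separable, the rule converges to weights that classify every sample correctly.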

Multi-layer Feed Forward (diagram: inputs x1..xn feed successive fully connected layers of neurons)

Training vs. Learning Learning is “self-directed” Training is externally controlled by a set of pairs of inputs and desired outputs

Learning vs. Training Learning: Hebbian learning - strengthen connections that fire at the same time Training: back propagation, Hopfield networks, Boltzmann machines

Back Propagation Present an input, then measure desired vs. actual outputs Correct the weights by back propagating the error through the net Hidden layers are corrected in proportion to the weight they provide to the output stage A constant (the learning rate) is used to prevent overly rapid training

Back Propagation The Math Number the levels: 0 is the input, N is the output Every neuron in level i is connected to each neuron in level i+1 Weights from level i to i+1 form a matrix Wi The input is a vector x and the training data is a vector t The vector yi is the input to level i: y0 = x yi+1 = fi( Wi yi + bi )

The Math (cont.) The error at the output is δN = t − yN Back propagate the error: δi = ( Wᵀi+1 δi+1 ) ⊙ fi′(neti), where Wᵀ is the transpose of W Update the W matrices and bias terms: ΔWi = α δi yᵀi−1 Δbi = α δi
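The forward pass and update rules above can be sketched in pure Python for a small fully connected network (the tanh activation with derivative 1 − y², the network shape, and the learning rate are my choices; the slides leave f unspecified):

```python
import math

def backprop_step(x, t, W, B, alpha=0.2):
    """One forward/backward pass for a fully connected tanh network.
    W[i] is the weight matrix (list of rows) from level i to i+1; B[i] its biases."""
    # Forward pass: y0 = x, y_{i+1} = f(W_i y_i + b_i)
    ys = [x]
    for Wi, bi in zip(W, B):
        net = [sum(w * y for w, y in zip(row, ys[-1])) + b
               for row, b in zip(Wi, bi)]
        ys.append([math.tanh(n) for n in net])
    # Output error, scaled by f'(net) = 1 - y^2 for tanh
    delta = [(ti - yi) * (1 - yi * yi) for ti, yi in zip(t, ys[-1])]
    for i in reversed(range(len(W))):
        # Back propagate with the pre-update weights: (W^T delta) * f'
        if i > 0:
            lower = [sum(W[i][j][k] * delta[j] for j in range(len(delta)))
                     * (1 - ys[i][k] ** 2) for k in range(len(ys[i]))]
        # Updates: Delta W = alpha * delta * y^T, Delta b = alpha * delta
        for j in range(len(W[i])):
            for k in range(len(W[i][j])):
                W[i][j][k] += alpha * delta[j] * ys[i][k]
            B[i][j] += alpha * delta[j]
        if i > 0:
            delta = lower
    return ys[-1]

# Tiny 2-2-1 demo; starting weights are arbitrary illustrative values
x, t = [1.0, -1.0], [0.5]
W = [[[0.2, -0.1], [0.4, 0.3]], [[0.1, -0.2]]]
B = [[0.0, 0.0], [0.0]]
for _ in range(200):
    y = backprop_step(x, t, W, B)
```

Repeated steps drive the output toward the target, illustrating the error-correction loop the slide describes.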

Hopfield Networks Single layer Each neuron receives an input Each neuron has connections to all the others Training by “clamping” inputs and adjusting weights
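A minimal Hebbian-trained Hopfield sketch, storing one pattern and recalling it from a noisy probe (the synchronous update schedule and the sign convention at zero are assumptions):

```python
def hopfield_train(patterns):
    """Hebbian weights for a Hopfield net; patterns are lists of -1/+1."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:  # no self-connections
                    W[i][j] += p[i] * p[j] / len(patterns)
    return W

def hopfield_recall(W, state, steps=5):
    """Synchronous threshold updates until the state settles."""
    for _ in range(steps):
        state = [1 if sum(w * s for w, s in zip(row, state)) >= 0 else -1
                 for row in W]
    return state

pattern = [1, -1, 1, -1, 1, -1]
W = hopfield_train([pattern])
noisy = [1, 1, 1, -1, 1, -1]   # one bit flipped
print(hopfield_recall(W, noisy))
```

With a single stored pattern, one update step is enough to flip the corrupted bit back, showing the net acting as a content-addressable memory.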

Boltzmann Machines Change f() in McCulloch-Pitts to be probabilistic The energy gap between the 0 and 1 outputs is related to a “temperature”: pk = 1 / ( 1 + e^t ), where t = −ΔEk / T Learning: hold the inputs and outputs according to the training data, then anneal the temperature while adjusting weights
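The firing probability above is a one-liner; note how annealing the temperature T sharpens the decision:

```python
import math

def fire_probability(delta_E, T):
    """p_k = 1 / (1 + e^t) with t = -dE_k / T: probability that unit k fires."""
    t = -delta_E / T
    return 1 / (1 + math.exp(t))

# Annealing: high T -> near-random firing, low T -> near-deterministic
for T in (10.0, 1.0, 0.1):
    print(T, round(fire_probability(2.0, T), 3))
```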

PDP - a turning point Parallel Distributed Processing (1985) Properties: Learning similar to observed human behavior Knowledge is distributed! Robust

Connectionism Superposition principle Distributed “knowledge representation” Separation of process and “knowledge representation” Knowledge representation is not symbolic in the same sense as symbolic AI!

Example 1 Face recognition – Gary Cottrell Input: a 64x64 grid Hidden layer: 80 neurons Output layer: 8 neurons – 1 for face yes/no, 2 bits for gender, and 5 bits for name Results: Recognized the input set – 100% Face / no-face on new faces – 100% Male / female determination on new faces – 81%

Example 2 SRI worked on a net to spot tanks Used pictures of tanks and non-tanks The pictures included both exposed and hidden vehicles! When it worked, they exposed it to new pictures It failed! Why?

Example 3 The nature of the sub-symbolic Categorize words by lexical type based on word order (Elman 1991)

Elman’s Network (diagram: input and context units feed a hidden layer, which feeds the output)

Training Set Built from sample sentences: 29 words, 10,000 two- and three-word sentences Each training sample is an input word and the following word No unique answer – the output is a set of words Analysis of the trained network: no symbol in the hidden layer corresponds to words or word pairs!

Connectionism's challenge Fodor’s Language of Thought Folk psychology: mind states and our tags for them How does the brain get to these? Marr’s type 1 theory – competence without explanation See Associative Engines by Andy Clark for more!

Pulsed Systems Real neurons pulse! Pulsed neurons have more computing power than level-output neurons

Spike Response Model A variable ui describes the internal state of neuron i The firing times are Fi = { ti(f) ; 1 <= f <= n } = { t | ui(t) = threshold } After a spike, the state variable's value is lowered Inputs: ui = Σ wij εij( t − tj(f) )
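A sketch of the membrane-state sum above, assuming a simple exponential-decay kernel for ε (the slides do not specify the kernel's form):

```python
import math

def epsilon(s, tau=5.0):
    """Postsynaptic kernel: exponential decay after an input spike.
    (An assumed form; any causal decaying kernel would do.)"""
    return math.exp(-s / tau) if s >= 0 else 0.0

def membrane_state(t, weights, spike_trains, tau=5.0):
    """u_i(t) = sum over inputs j and firing times t_j^(f) of w_ij * eps(t - t_j^(f))."""
    return sum(w * epsilon(t - tf, tau)
               for w, train in zip(weights, spike_trains)
               for tf in train)

# Two presynaptic neurons; the second fired twice
u = membrane_state(10.0, [0.5, 1.0], [[10.0], [6.0, 9.0]])
print(u)
```

A spike is then emitted whenever this u crosses the threshold, after which the state is lowered.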

Models Full models: simulate continuous functions; can integrate other factors – currents other than dendritic inputs, chemical states Spike models: simplify and treat the output as an impulse

Computational Power of Spiked Neurons All of these neuron models are Turing computable Spiking neurons can do some things more cheaply in neuron count

Encoding Problem How does the human brain use spike trains? Rate coding: spike density; rate over a population Pulse coding: time to first spike; phase; correlation
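Two of the coding schemes above can be sketched directly (the window and onset conventions are mine):

```python
def firing_rate(spike_times, window):
    """Rate coding: spike density over a (start, end) time window."""
    start, end = window
    return sum(start <= t < end for t in spike_times) / (end - start)

def time_to_first_spike(spike_times, onset):
    """Pulse coding: latency of the first spike at or after stimulus onset."""
    later = [t for t in spike_times if t >= onset]
    return min(later) - onset if later else None

train = [1.0, 2.0, 3.0, 15.0]
print(firing_rate(train, (0.0, 10.0)))   # 3 spikes over 10 time units
print(time_to_first_spike(train, 1.5))
```

The open question on this slide is which of these readouts (if any) the brain actually uses.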

Where Next? Build a brain model! Analyze the operation of real brains

Interesting Work The computational cockroach of Randall D. Beer The systems of Igor Aleksander – imagination and consciousness The cat brain of Hugo de Garis