Frank Rosenblatt, 1928-1971

Presentation transcript:

Frank Rosenblatt
Dr. Frank Rosenblatt, 1928-1971
PhD in Experimental Psychology, Cornell, 1956
Developed neural networks called perceptrons: a probabilistic model for information storage and organization in the brain
Key properties: association (learning), generalization to new patterns, distributed memory, biologically plausible brain model
Cornell Aeronautical Lab ( ), Cornell ( )

Frank Rosenblatt
The Wikipedia portrayal is exaggerated: "Rosenblatt was a colorful character at Cornell in the early 1960s. A handsome bachelor, he drove a classic MGA sports car and was often seen with his cat named Tobermory."
Those who knew him considered him a rather shy genius and more of a Renaissance man, because he excelled in a wide variety of subjects, including psychology (his original field), computing, mathematics, neurophysiology, astronomy, and music.

Frank Rosenblatt
Agenda
The Mark I Perceptron – Visual System Model
The Tobermory Perceptron – Auditory System Model
Perceptron Computer Simulations
Rosenblatt's Book
Rosenblatt-Minsky Debates and the Minsky-Papert Book
Rat Brain Experiments
Hobbies – Astronomy, Climbing, Music, Sailing
Untimely Death

Frank Rosenblatt
The Mark I Perceptron
Visual system model and pattern classifier
Typical three-layer perceptron: fixed S → A and variable A → R connections
[Photo: examining an A-unit of the Mark I]

Frank Rosenblatt
The Mark I Perceptron
Visual system model and pattern classifier
Input (sensory) layer of 400 photosensitive units in a 20x20 grid, modeling a small retina
Connections from the input to the association layer could be altered through plug-board wiring, but once wired they were fixed for the duration of an experiment
Association layer of 512 units (stepping motors), each of which could take several excitatory and inhibitory inputs
Connections from the association to the output layer were variable weights (motor-driven potentiometers) adjusted through an error-correction training procedure
Output (response) layer of 8 units
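To make the signal flow concrete, here is a minimal NumPy sketch of a Mark I-style perceptron using the dimensions above (400 S-units on a 20x20 retina, 512 A-units, 8 R-units). The sparse ±1 random S → A wiring, the thresholds, the one-hot target in the usage line, and the helper names are illustrative assumptions, not a reconstruction of the actual plug-board wiring or hardware.

```python
import numpy as np

# Minimal software sketch of a Mark I-style perceptron (illustrative only, not the original hardware).
# Dimensions follow the slide: 400 S-units (20x20 retina), 512 A-units, 8 R-units.
rng = np.random.default_rng(0)
N_S, N_A, N_R = 400, 512, 8

# Fixed S -> A connections: each A-unit samples a few retina points with excitatory (+1)
# or inhibitory (-1) wiring; mostly zero, and frozen once "wired" (assumed sparsity pattern).
s_to_a = rng.choice([-1, 0, 0, 0, +1], size=(N_S, N_A)).astype(float)

# Variable A -> R weights: the only part that learns (the motor-driven potentiometers in hardware).
a_to_r = np.zeros((N_A, N_R))

def associate(retina):
    """Fixed association layer: threshold the wired sums of the 20x20 binary retina."""
    return (retina.reshape(-1) @ s_to_a > 0).astype(float)

def respond(a_units):
    """Response layer: 8 threshold units driven by the adaptive weights."""
    return (a_units @ a_to_r > 0).astype(float)

def train_step(retina, target):
    """One error-correction step: adjust weights only for R-units that responded incorrectly."""
    a = associate(retina)
    r = respond(a)
    a_to_r[:] += np.outer(a, target - r)  # +a for misses, -a for false alarms, 0 when already correct

# Hypothetical usage: one random 20x20 binary image, target pattern turning on R-unit 0 only.
img = rng.integers(0, 2, size=(20, 20))
train_step(img, target=np.eye(N_R)[0])
```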

Frank Rosenblatt
The Tobermory Perceptron
Auditory system model and pattern classifier
Named after Tobermory, the talking cat in a story by H.H. Munro (aka Saki)
A large machine:
S-units: 45 band-pass filters and 80 difference detectors
A-units: 1600 A1-units (20 time samples per detector) and 1000 A2-units
R-units: 12, with 12,000 adaptive weights on the A2 → R connections (1000 A2-units × 12 R-units)

Frank Rosenblatt
Perceptron Computer Simulations
Hardware implementations made good demonstrations, but software simulations were far more flexible
In the early 1960s these simulations required machine-language coding for speed and memory efficiency
A simulation software package let the user specify the number of layers, the number of units per layer, the type of connections between layers, etc.
Computer time at Cornell and NYU

Frank Rosenblatt
Rosenblatt's Book
Principles of Neurodynamics, 1962
Part I: historical review of brain modeling approaches, physiological and psychological considerations, and basic definitions and concepts of the perceptron approach
Part II: three-layer, series-coupled perceptrons – mathematical underpinnings and experimental results
Part III: multi-layer and cross-coupled perceptrons
Part IV: back-coupled perceptrons
The book was used to teach an interdisciplinary course, "Theory of Brain Mechanisms", that drew students from Cornell's Engineering and Liberal Arts colleges

Frank Rosenblatt
Series-Coupled Perceptrons
A perceptron is a network of sensory (S), association (A), and response (R) signal-generating units
A series-coupled perceptron is feed-forward: S → A → R
An elementary perceptron is a series-coupled perceptron with one R-unit connected to every A-unit and fixed S → A connections
Convergence Theorem: given an elementary perceptron, a stimulus world W, and any classification C(W) for which a solution exists, if all stimuli in W re-occur in finite time, then the error-correction procedure will always find a solution
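Stated in modern notation (an assumption of this write-up, not Rosenblatt's own symbols), the error-correction procedure the theorem refers to adjusts the adaptive A → R weights only on mistakes:

```latex
\[
r = \operatorname{sign}\!\left(\mathbf{w}^{\top}\mathbf{a}\right), \qquad
\mathbf{w} \leftarrow
\begin{cases}
\mathbf{w} + \eta\, t\, \mathbf{a} & \text{if } r \neq t \quad \text{(wrong response: correct it)}\\[2pt]
\mathbf{w} & \text{if } r = t \quad \text{(correct response: leave the weights alone)}
\end{cases}
\]
```

Here a is the A-unit activity for the current stimulus, t ∈ {−1, +1} is the desired classification, and η > 0 is a step size; the theorem guarantees only finitely many corrections occur whenever some weight vector realizes C(W).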

Frank Rosenblatt
Series-Coupled Perceptrons
The Mark I was a typical S → A → R perceptron
Connections:
S → A: fixed, usually local
A → R: adjustable with training

Frank Rosenblatt
Series-Coupled Perceptrons
A-units are usually local, biologically plausible detectors

Frank Rosenblatt
Series-Coupled Perceptrons
[Figure: dotted lines indicate variable connections]
Rosenblatt studied three- and four-layer series-coupled perceptrons with two sets of variable weights, but was unable to find a suitable training procedure such as back-propagation

Frank Rosenblatt
Cross-Coupled Perceptrons
A cross-coupled perceptron is a system in which some connections join units of the same type (S, A, and/or R)

Frank Rosenblatt
Back-Coupled Perceptrons
A back-coupled perceptron is a system with feedback paths from units located near the output end of the system to units closer to the sensory end

Frank Rosenblatt
Rosenblatt-Minsky Debates and the Minsky-Papert Book
Rosenblatt and Marvin Minsky (MIT) debated at conferences the value of biologically inspired computation, Rosenblatt arguing that his neural networks could do almost anything and Minsky countering that they could do little
Minsky, wanting to decide the matter once and for all, collaborated with Seymour Papert and published a book in 1969, Perceptrons: An Introduction to Computational Geometry, where they asserted about perceptron research (page 4), "Most of this writing... is without scientific value..."
Minsky, although well aware that powerful perceptrons have multiple layers and that Rosenblatt's basic feed-forward perceptrons have three layers, defined a perceptron as a two-layer machine that can handle only linearly separable problems and, for example, cannot solve the exclusive-OR problem
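A quick way to see the exclusive-OR point: no single threshold unit on the two raw inputs can reproduce XOR, while adding one fixed association-layer feature (in the spirit of Rosenblatt's three-layer perceptrons) makes the problem linearly separable again. The brute-force check below is only an illustration; the AND feature and the small integer weight range are assumptions, not anything from the book.

```python
from itertools import product

# XOR truth table over two binary inputs (label 1 where exactly one input is on).
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 0]

def separable(points, labels):
    """Brute-force search over small integer weights and bias for one threshold unit matching the labels."""
    n = len(points[0])
    for coeffs in product(range(-3, 4), repeat=n + 1):
        *w, b = coeffs
        if all((sum(wi * xi for wi, xi in zip(w, p)) + b > 0) == bool(t)
               for p, t in zip(points, labels)):
            return True
    return False

print(separable(X, y))      # False: XOR is not linearly separable in the raw inputs

# Add one fixed "A-unit" feature, AND(x1, x2); a single threshold unit now suffices,
# e.g. x1 + x2 - 2*(x1 AND x2) > 0 reproduces XOR.
X_aug = [(x1, x2, x1 & x2) for x1, x2 in X]
print(separable(X_aug, y))  # True
```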

Frank Rosenblatt
Minsky-Papert Book: H.D. Block's response paper
The authors address three classes of readers:
1. Computer scientists specializing in pattern recognition, learning machines, and threshold logic
2. Abstract mathematicians interested in the debut of Computational Geometry
3. Those interested in the general theory of computation leading to decisions based on the weight of partial evidence, e.g. psychologists and biologists
H.D. Block concludes:
1. Computer scientists "will find the book of little value"
2. The abstract mathematicians consulted "were not captivated"
3. "For psychologists and biologists, the level of mathematical maturity demanded will, I believe, make the book somewhat difficult to read."

Frank Rosenblatt
Rat Brain Experiments
In the late 1960s Rosenblatt began experiments in the Cornell Department of Entomology on the transfer of learned behavior via rat brain extracts
Rats were taught discrimination tasks such as the Y-maze and the two-lever Skinner box; their brain extracts were then injected into untrained rats, which were tested on the same discrimination tasks to determine whether behavior transferred from the trained to the untrained rats
Rosenblatt spent his last several years on this problem and showed convincingly that the initial reports of large effects were wrong and that any memory transfer was at most very small

Frank Rosenblatt
Astronomy
Rosenblatt built a modest observatory on a hilltop behind his house, 6 miles east of Ithaca
Work began in the summer of 1961, around a Fecker 12" Cassegrain telescope
He was interested in SETI (the Search for Extraterrestrial Intelligence), wrote a proposal touting a "Stellar Coherometer" he designed, and was awarded $75K for the project
The observatory, completed about 1966, was a circular cinderblock structure with a dome housing the telescope
[Current photo: the observatory, with the house in the background]

Frank Rosenblatt
Music
Rosenblatt was an accomplished pianist and had a grand piano at his house in Brooktondale
He played the well-known classical pieces of Mozart, Beethoven, etc.
He also composed music and had a penchant for improvising endlessly on "Three Blind Mice"

Frank Rosenblatt
Practical Joker
As a graduate student Frank was a psychology major, and Prof. James Gibson was a well-known faculty member and Frank's dissertation advisor
As the story goes, Frank and some other graduate students drove to the town of Gibson one night and stole the town's "Gibson" signs, which they then mounted at the door of Professor Gibson's office
When the department chair saw the signs, he remarked to the department secretary, "Don't you think Gibby's getting a little ostentatious?"
Traveling to a conference, Frank remarked, "Do you think McCulloch sleeps with his beard under or over the covers?"

Frank Rosenblatt
Rosenblatt – Renaissance Man
Excelled in a wide variety of subjects – psychology (his original field), computing, mathematics, neurophysiology, astronomy, and music
He had two research reputations: in neural networks, with his perceptron work, and in neurophysiology, with the rat brain experiments
When learning a new subject (to paraphrase Rodman Miller): in a few weeks he knew a little; in a few months he knew a great deal; soon thereafter he was discussing topics with experts in the field

Frank Rosenblatt
Untimely Death
Died in a sailboat accident on his 43rd birthday
He "was a most gifted human being... had made his entire life a contribution to mankind" – Congressional Record