Artificial Neural Networks - Session 2: The History of Artificial Neural Networks


Artificial Neural Networks - Session 2: The History of Artificial Neural Networks (Seyed Mojtaba Rouhani)

History:
- Early period (1943-1956)
- Beginning period (1956-1968)
- Dark or dormant period (1969-1982)
- Rebirth period (1982-1986)
- Flourishing period (1986 to the present)

William James (1890): Association. "Mental facts cannot properly be studied apart from the physical environment of which they take cognizance." Principle of association: "When two brain processes are active together or in immediate succession, one of them, on reoccurring, tends to propagate its excitement into the other."

Early period: 1943, McCulloch & Pitts
Warren McCulloch (a psychiatrist and neuroanatomist) and Walter Pitts (a mathematician) proposed a simple model of the neuron, capable of implementing the logic gates And and Or.
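To make this concrete, here is a minimal Python sketch (an illustration, not from the slides) of a McCulloch-Pitts threshold unit: inputs are binary, all weights are fixed at 1, and the only difference between the And gate and the Or gate is the firing threshold.

```python
# A McCulloch-Pitts threshold unit: fires (outputs 1) when the sum of its
# binary inputs reaches the threshold.
def mcculloch_pitts(inputs, threshold):
    return 1 if sum(inputs) >= threshold else 0

def and_gate(x1, x2):
    return mcculloch_pitts([x1, x2], threshold=2)  # fires only when both inputs are 1

def or_gate(x1, x2):
    return mcculloch_pitts([x1, x2], threshold=1)  # fires when at least one input is 1

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "And:", and_gate(a, b), "Or:", or_gate(a, b))
```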

Early period: 1949, Hebb
Hebb's learning rule: information is stored in the synapses; learning consists of changing the synapses, i.e. the strength of the synaptic connections (the weights). When a group of neurons joined by weak synapses is activated together, the strength of those synapses increases.
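A minimal sketch of this idea (the learning rate and activity vector are assumptions for the example): weights between co-active units grow, while the weight on the inactive input stays put. Note that plain Hebbian growth is unbounded; later models add normalization or decay.

```python
import numpy as np

eta = 0.5                        # learning rate (assumed for the example)
x = np.array([1.0, 0.0, 1.0])    # presynaptic activities: units 1 and 3 active
w = np.full(3, 0.1)              # initially weak synaptic weights

for step in range(5):
    y = w @ x                    # postsynaptic activity
    w = w + eta * y * x          # Hebb: co-active pre/post pairs strengthen
    print(step, np.round(w, 3))  # weights 1 and 3 grow; weight 2 stays at 0.1
```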

Early period: 1954, Farley and Clark
Farley and Clark, in 1954, on an early digital computer at MIT, modeled a network of randomly connected neurons employing a modified Hebb learning rule, which was able to learn to discriminate between two input patterns.

Beginning period: 1957, Rosenblatt
The first network model, the Perceptron (perception neuron), together with the perceptron learning rule.
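The perceptron learning rule updates the weights only on misclassified examples: w ← w + η (d − y) x. Below is a minimal sketch (the data, learning rate, and epoch limit are illustrative assumptions) that learns the linearly separable Or function.

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([0, 1, 1, 1])                 # targets for the Or function
w = np.zeros(2)                            # weights
b = 0.0                                    # bias
eta = 0.2                                  # learning rate (assumed)

for epoch in range(20):
    errors = 0
    for x, t in zip(X, d):
        y = 1 if w @ x + b > 0 else 0      # hard-threshold output
        w += eta * (t - y) * x             # update only when y != t
        b += eta * (t - y)
        errors += int(y != t)
    if errors == 0:                        # converged: all patterns correct
        break

print("weights:", w, "bias:", b)
```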

Perceptron - history
Proposed by Rosenblatt et al. (1958-1962) as a large class of neural models that incorporate learning. The "Mark I" perceptron was built with a retina of 20×20 receptors and learned to recognize letters. It created excitement, and hype. (For comparison, ENIAC had been built in 1945.)

Beginning period: 1960, Widrow
Another neuron model, the Adaline (Adaptive Linear Element), with the Widrow-Hoff learning rule, and its generalization to a network of neurons, the Madaline (Many Adalines). The main weakness of the Madaline was its training method.

Adaline - history
"An adaptive pattern classification machine (called Adaline, for adaptive linear)..." proposed by Widrow and Hoff: "During a training phase, crude geometric patterns are fed to the machine by setting the toggle switches in the 4×4 input switch array. Setting another toggle switch (the reference switch) tells the machine whether the desired output for the particular input pattern is +1 or -1. The system learns something from each pattern and accordingly experiences a design change." (Widrow and Hoff; 1960)

Adaline - Widrow-Hoff learning
The learning idea is as follows: define an error function that measures the performance of the network in terms of the weights, inputs, outputs, and desired outputs. Take the derivative of this function with respect to the weights, and modify the weights so that the error decreases. The rule is also known as the Least Mean Square (LMS) error algorithm, the Widrow-Hoff rule, or the Delta rule.
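Concretely, for a single example with squared error E = (d − w·x)², the gradient is ∂E/∂w = −2(d − w·x)x, so stepping against the gradient gives the update w ← w + η(d − w·x)x. A minimal sketch with illustrative synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # illustrative inputs
w_true = np.array([1.0, -2.0, 0.5])    # "unknown" linear mapping to recover
d = X @ w_true                         # desired outputs

w = np.zeros(3)
eta = 0.05                             # learning rate (assumed)
for x, t in zip(X, d):
    e = t - w @ x                      # error on this example
    w += eta * e * x                   # LMS / delta-rule gradient step

print(np.round(w, 3))                  # close to w_true after one pass
```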

Widrow - Hoff
"Within a half hour of the time that the algorithm was written on the blackboard, Hoff had it working in hardware. [There was] a large analog computer in the building, right across the hallway. There was nobody in the computer room. We just went in, picked up a plugboard, and Hoff wired it together... This happened on a Friday afternoon in the autumn of 1959."

Dormant period: 1969, Minsky & Papert
In their book they analyzed the perceptron and showed that it cannot solve the XOR problem. They mistakenly concluded that several neurons in succession (a multi-layer network) would likewise be unable to solve XOR.

[Figure: Or is linearly separable - a single line in the (X1, X2) plane separates (0,0) from (0,1), (1,0), and (1,1).]

[Figure: And is linearly separable - a single line separates (1,1) from the other three points.]

[Figure: XOR is not linearly separable - no single line can separate (0,1) and (1,0) from (0,0) and (1,1).]
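A quick brute-force illustration of this point (an assumption of this writeup, not from the slides): scanning a grid of candidate decision lines w1·x1 + w2·x2 > b finds a separator for Or but none for XOR. A finite grid is only a demonstration, not a proof, though the result does hold for all real weights.

```python
import itertools
import numpy as np

points = [(0, 0), (0, 1), (1, 0), (1, 1)]

def separable(targets, grid=np.linspace(-2, 2, 41)):
    # True if some line w1*x1 + w2*x2 > b reproduces the target outputs
    return any(
        all(int(w1 * x1 + w2 * x2 > b) == t
            for (x1, x2), t in zip(points, targets))
        for w1, w2, b in itertools.product(grid, repeat=3)
    )

print("Or separable: ", separable([0, 1, 1, 1]))   # True
print("XOR separable:", separable([0, 1, 1, 0]))   # False
```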

Dark Age
Marvin Minsky and Seymour Papert, in 1969, published Perceptrons, in which they mathematically proved that single-layer perceptrons were only able to distinguish linearly separable classes of patterns. While true, they also (mistakenly) speculated that an extension to multiple layers would lose the "virtue" of the perceptron's simplicity and be otherwise "sterile".
In the 1950s and 1960s, symbolic AI and sub-symbolic connectionism competed for prestige and funding. AI investigated higher-order cognitive problems: logic, rational thought, problem solving. Connectionism investigated neural models like the perceptron and struggled to find a learning algorithm for multi-layer perceptrons.
As a result of Minsky & Papert's Perceptrons, research in neural networks was effectively abandoned in the 1970s and early 1980s.

Dark Age (continued)
Shun-ichi Amari, in 1967, and Christoph von der Malsburg, in 1973, published ANN models of self-organizing maps, but the work was largely ignored. Paul Werbos, in his 1974 PhD thesis, first demonstrated a method for training multi-layer perceptrons, essentially identical to backpropagation, but this work too was largely ignored. Stephen Grossberg and Gail Carpenter, in 1980, established a new principle of self-organization called Adaptive Resonance Theory (ART), also largely ignored at the time.

Rebirth period: 1982, Hopfield
The Hopfield network and the auto-associative memory models. In 1985, Hopfield and Tank proposed another network to solve the Traveling Salesman Problem. After this research, ANN models were again treated as a promising research area.

Rebirth period: 1986
David Rumelhart, James (Jay) McClelland, and the PDP Research Group (psychologists at UC San Diego) introduced error back-propagation in "Parallel Distributed Processing", in which the generalized delta rule is explained. In addition, they explained how a back-propagation network (BPN) can solve the XOR problem. By 1990, the BPN had become one of the most popular and most heavily used ANN models.
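As a rough illustration of the generalized delta rule on XOR, here is a minimal sketch (the 2-4-1 sigmoid architecture, seed, learning rate, and epoch count are all assumptions for the example, not from the book): full-batch gradient descent on squared error, with errors propagated back through the hidden layer.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0.0], [1.0], [1.0], [0.0]])   # XOR targets

W1 = rng.normal(size=(2, 4))                 # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))                 # hidden -> output weights
b2 = np.zeros(1)
eta = 0.5                                    # learning rate (assumed)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(20000):
    h = sigmoid(X @ W1 + b1)                 # forward pass, hidden layer
    y = sigmoid(h @ W2 + b2)                 # forward pass, output layer
    delta2 = (y - t) * y * (1 - y)           # output deltas (squared error)
    delta1 = (delta2 @ W2.T) * h * (1 - h)   # hidden deltas, propagated back
    W2 -= eta * h.T @ delta2
    b2 -= eta * delta2.sum(axis=0)
    W1 -= eta * X.T @ delta1
    b1 -= eta * delta1.sum(axis=0)

print(np.round(y.ravel(), 2))                # should approach [0, 1, 1, 0]
```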

Flourishing period
- Adaptive Resonance Theory (Grossberg; 1980)
- Hopfield model (Hopfield; 1982, 1984)
- Self-organizing maps (Kohonen; 1982)
- Reinforcement learning (Sutton and Barto; 1983)
- Simulated annealing (Kirkpatrick et al.; 1983)
- Boltzmann machines (Ackley, Hinton, Sejnowski; 1985)
- Backpropagation (Rumelhart, Hinton, Williams; 1986)
- ART networks (Carpenter, Grossberg; 1992)
- Support Vector Machines (1990s)
- ...

Mature Age
ANN models have by now been widely studied, and many models have been proposed. Conferences and journals have been created for ANN studies, such as the ICNN (International Conference on Neural Networks) and the IJCNN (International Joint Conference on Neural Networks, held by the IEEE and INNS). In addition, many tools and software packages, such as SNNS and MATLAB, have been developed to make applying neural networks easier.