1
Artificial Neural Networks: Session 2. The History of Artificial Neural Networks
Seyed Mojtaba Rouhani
2
History
Early period (1943-1956)
Beginning period (1956-1968)
Dark (dormant) period (1969-1982)
Rebirth period (1982-1986)
Flourishing period (1986-present)
3
William James
William James (1890): Association
“Mental facts cannot properly be studied apart from the physical environment of which they take cognizance.” Principle of association: “When two brain processes are active together or in immediate succession, one of them, on reoccurring, tends to propagate its excitement into the other.”
4
Early period: 1943, McCulloch & Pitts: a simple neuron model
Warren McCulloch (a psychiatrist and neuroanatomist) and Walter Pitts (a mathematician) proposed a simple model of the neuron, capable of implementing the AND and OR logic gates.
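A minimal sketch (not from the original slides) of a McCulloch-Pitts threshold unit: binary inputs, fixed weights, and a hard threshold. The particular weight and threshold values are illustrative choices that happen to realize AND and OR.

```python
def mcp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fires (returns 1) iff the weighted input sum reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# AND fires only when both inputs are active: unit weights, threshold 2.
AND = lambda x1, x2: mcp_neuron([x1, x2], [1, 1], threshold=2)
# OR fires when at least one input is active: unit weights, threshold 1.
OR = lambda x1, x2: mcp_neuron([x1, x2], [1, 1], threshold=1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", "AND:", AND(a, b), "OR:", OR(a, b))
```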
5
Early period: 1949, Hebb. Hebb's learning rule:
Information is stored in the synapses. Learning amounts to changing the synapses, i.e. the synaptic connection strengths (the weights): when a group of neurons joined by weak synapses fire together, the strength of those synapses increases.
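A minimal sketch of the Hebbian update, assuming the common modern formulation Δw = η·x·y (Hebb's original statement was qualitative; the learning rate η and the data here are illustrative assumptions).

```python
import numpy as np

def hebb_update(w, x, y, eta=0.1):
    """Hebbian learning: strengthen w_i when input x_i and output y are active together."""
    return w + eta * x * y

w = np.zeros(2)
x = np.array([1.0, 1.0])   # two co-active presynaptic neurons
y = 1.0                    # active postsynaptic neuron
for _ in range(3):
    w = hebb_update(w, x, y)
print(w)                   # weights grow with each co-activation: approximately [0.3 0.3]
```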
6
Early period: 1954
Farley and Clark, in 1954, on an early digital computer at MIT, modeled a network of randomly connected neurons employing a modified Hebb learning rule that was able to learn to discriminate between two input patterns.
7
Beginning period: 1957, Rosenblatt
The first network model, the Perceptron (“perception neuron”), together with the perceptron learning rule.
8
Perceptron - history
Proposed by Rosenblatt et al. (1958-1962).
A large class of neural models that incorporate learning. The “Mark I” perceptron was built with a retina of 20×20 receptors and learned to recognize letters. It created excitement, and hype. (ENIAC was built in 1945.)
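A minimal sketch of the perceptron learning rule named above (the learning rate, epoch count, and the OR task are illustrative assumptions, not from the slides):

```python
import numpy as np

def train_perceptron(X, y, epochs=20, eta=1.0):
    """Rosenblatt's rule: on a mistake, move w toward (or away from) the input."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append a constant bias input of 1
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, target in zip(Xb, y):
            pred = 1 if xi @ w >= 0 else 0
            w += eta * (target - pred) * xi     # zero update when the prediction is correct
    return w

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_or = np.array([0, 1, 1, 1])                   # OR is linearly separable
w = train_perceptron(X, y_or)
print([1 if np.r_[x, 1] @ w >= 0 else 0 for x in X])  # [0, 1, 1, 1]
```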
9
Beginning period: 1960, Widrow
Another neuron model, the Adaline (Adaptive Linear Element), with the Widrow-Hoff learning rule. Its generalization to a network of neurons is the Madaline (Many Adalines); the main problem with the Madaline was its training method.
10
Adaline - history
“An adaptive pattern classification machine (called Adaline, for adaptive linear) . . .” Proposed by Widrow and Hoff. “During a training phase, crude geometric patterns are fed to the machine by setting the toggle switches in the 4×4 input switch array. Setting another toggle switch (the reference switch) tells the machine whether the desired output for the particular input pattern is +1 or -1. The system learns something from each pattern and accordingly experiences a design change.” (Widrow and Hoff; 1960)
11
Adaline – Widrow-Hoff Learning
The learning idea is as follows: define an error function that measures the performance of the unit in terms of the weights, input, output, and desired output; take the derivative of this function with respect to the weights, and modify the weights so that the error decreases. Also known as the Least Mean Square (LMS) error algorithm, the Widrow-Hoff rule, or the Delta rule.
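A minimal sketch of the LMS/Delta rule under the usual squared-error formulation E = ½(d − w·x)²; differentiating E with respect to w gives the update Δw = η(d − w·x)x. The learning rate, target weights, and data are illustrative assumptions.

```python
import numpy as np

def lms_step(w, x, d, eta=0.05):
    """One Widrow-Hoff (LMS) step: gradient descent on the squared error 0.5*(d - w.x)^2."""
    error = d - w @ x          # the gradient of E w.r.t. w is -(d - w.x) * x
    return w + eta * error * x

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])   # hypothetical teacher weights to be recovered
w = np.zeros(2)
for _ in range(500):
    x = rng.normal(size=2)
    d = w_true @ x               # desired (teacher) output for this input
    w = lms_step(w, x, d)
print(w)                         # converges toward w_true: approximately [ 2. -1.]
```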
12
Widrow - Hoff
Within a half hour of the time that the algorithm was written on the blackboard, Hoff had it working in hardware. [There was] a large analog computer in the building, right across the hallway. There was nobody in the computer room. We just went in, picked up a plugboard and Hoff wired it together. ... This happened on a Friday afternoon in the autumn of 1959.
13
Dormant period: 1969, Minsky & Papert
In their book they analyzed the perceptron and showed that it cannot solve the XOR problem. They mistakenly concluded that several neurons in sequence (multiple layers) would also be unable to solve XOR.
14
[Figure: Linearly separable - OR. In the X1-X2 plane, only (0,0) is negative; a single line separates it from the positive points (0,1), (1,0), and (1,1).]
15
[Figure: Linearly separable - AND. In the X1-X2 plane, only (1,1) is positive; a single line separates it from the negative points (0,0), (0,1), and (1,0).]
16
[Figure: Not linearly separable - XOR. In the X1-X2 plane, the positive points (0,1) and (1,0) lie on one diagonal and the negative points (0,0) and (1,1) on the other; no single line can separate the two classes.]
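A short worked derivation (not on the original slides) of why XOR defeats a single threshold unit: suppose the unit outputs 1 exactly when w1·x1 + w2·x2 ≥ θ. The patterns (0,1) and (1,0) demand w2 ≥ θ and w1 ≥ θ, hence w1 + w2 ≥ 2θ; but the pattern (1,1) demands w1 + w2 < θ, so 2θ ≤ w1 + w2 < θ, which forces θ < 0, while the pattern (0,0) demands 0 < θ. The requirements contradict each other, so no such weights and threshold exist.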
17
Dark Age
Marvin Minsky and Seymour Papert, in 1969, published Perceptrons, in which they mathematically proved that single-layer perceptrons are only able to distinguish linearly separable classes of patterns. While true, they also (mistakenly) speculated that an extension to multiple layers would lose the “virtue” of the perceptron's simplicity and be otherwise “sterile”. In the 1950s and 1960s, symbolic AI and sub-symbolic connectionism competed for prestige and funding: AI investigated higher-order cognitive problems (logic, rational thought, problem solving), while connectionism investigated neural models like the perceptron and struggled to find a learning algorithm for multi-layer perceptrons. As a result of Minsky & Papert's Perceptrons, research in neural networks was effectively abandoned in the 1970s and early 1980s.
18
Dark Age
Shun-ichi Amari, in 1967, and Christoph von der Malsburg, in 1973, published ANN models of self-organizing maps, but the work was largely ignored. Paul Werbos, in his 1974 PhD thesis, first demonstrated a method for training multi-layer perceptrons, essentially identical to backprop, but this work too was largely ignored. Stephen Grossberg and Gail Carpenter, in 1980, established a new principle of self-organization called Adaptive Resonance Theory (ART), also largely ignored at the time.
19
Rebirth period: 1982, Hopfield
The Hopfield network and the auto-associative memory models. In 1985, Hopfield proposed another network, the Hopfield & Tank network, to solve the Traveling Salesman Problem. After this work, ANN models were again treated as a promising research area.
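A minimal sketch of a binary Hopfield auto-associative memory, assuming the standard outer-product (Hebbian) storage rule and synchronous sign updates; the stored pattern and iteration count are illustrative assumptions.

```python
import numpy as np

def store(patterns):
    """Outer-product rule: W = (sum over patterns of p p^T) / n, with a zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=5):
    """Iterate the sign update until the state settles on a stored pattern."""
    for _ in range(steps):
        x = np.where(W @ x >= 0, 1.0, -1.0)
    return x

p = np.array([[1, -1, 1, -1, 1, -1]], dtype=float)   # one stored +/-1 pattern
W = store(p)
noisy = p[0].copy()
noisy[0] = -noisy[0]                                 # corrupt one bit
print(recall(W, noisy))                              # recovers the stored pattern
```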
20
Rebirth period: 1986
David Rumelhart, James (Jay) McClelland, and the PDP Research Group (psychologists at UC San Diego): error back-propagation. Their book “Parallel Distributed Processing” explains the generalized delta rule and, in addition, shows how a back-propagation network (BPN) can solve the XOR problem. By 1990, BPN had become one of the most popular and most heavily used ANN models.
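A minimal sketch of the generalized delta rule on XOR: one hidden layer of two sigmoid units trained by gradient descent on squared error. The architecture size, learning rate, seed, and epoch count are illustrative choices, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# 2 inputs -> 2 hidden sigmoid units -> 1 sigmoid output
W1, b1 = rng.normal(size=(2, 2)), np.zeros(2)
W2, b2 = rng.normal(size=(2, 1)), np.zeros(1)

eta = 0.5
for _ in range(10000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: delta = error * derivative of the sigmoid
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient-descent updates
    W2 -= eta * h.T @ d_out
    b2 -= eta * d_out.sum(axis=0)
    W1 -= eta * X.T @ d_h
    b1 -= eta * d_h.sum(axis=0)

# typically approaches [0, 1, 1, 0]; plain gradient descent can occasionally stall in a local minimum
print(np.round(out.ravel(), 2))
```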
21
Flourishing period
Adaptive Resonance Theory (Grossberg; 1980)
Hopfield model (Hopfield; 1982, 1984)
Self-organizing maps (Kohonen; 1982)
Reinforcement learning (Sutton and Barto; 1983)
Simulated annealing (Kirkpatrick et al.; 1983)
Boltzmann machines (Ackley, Hinton, Sejnowski; 1985)
Backpropagation (Rumelhart, Hinton, Williams; 1986)
ART networks (Carpenter, Grossberg; 1992)
Support Vector Machines (1990)
...
22
Mature Age
Up to now, ANN models have been widely studied and many models have been proposed. Conferences and journals have been created for ANN studies, such as ICNN (International Conference on Neural Networks) and IJCNN (International Joint Conference on Neural Networks, held by IEEE & INNS). Besides, many tools and software packages, such as SNNS and MATLAB, have been developed to make applying NNs easier.