Synaptic Dynamics: Unsupervised Learning


Synaptic Dynamics: Unsupervised Learning Part Ⅰ Xiao Bing

[Diagram: a processing unit mapping Input to Output.]

Outline
Learning
Supervised Learning and Unsupervised Learning
Supervised Learning and Unsupervised Learning in Neural Networks
Four Unsupervised Learning Laws


Learning
Encoding: A system learns a pattern if the system encodes the pattern in its structure.
Change: A system learns, adapts, or "self-organizes" when sample data changes system parameters.
Quantization: A system learns only a small proportion of all patterns in the sampled pattern environment, so quantization is necessary.


Encoding
A system has learned a stimulus-response pair (x_i, y_i) if (x_i, y_i) is a sample from a function f: X → Y. A system has learned the function f if it responds with y_i = f(x_i) for all x_i in X.

Encoding
A system S has only partially learned or approximated the function f if, for an input x' close to x, the response S(x') is close to f(x).

Learning
Encoding: A system learns a pattern if the system encodes the pattern in its structure.
Change: A system learns, adapts, or "self-organizes" when sample data changes system parameters.
Quantization: A system learns only a small proportion of all patterns in the sampled pattern environment.

Change
We have learned calculus if our calculus-exam behavior has changed from failing to passing. A system learns when pattern stimulation changes a memory medium and leaves it changed for some comparatively long stretch of time.

Change
Note: we identify learning with change in a synapse, not in a neuron.

Learning
Encoding: A system learns a pattern if the system encodes the pattern in its structure.
Change: A system learns, adapts, or "self-organizes" when sample data changes system parameters.
Quantization: A system learns only a small proportion of all patterns in the sampled pattern environment.

Quantization
Pattern space → (sampling) → sampled pattern space → (quantizing) → quantized pattern space.
A uniform sampling probability provides an information-theoretic criterion for an optimal quantization.

Quantization
1. Learning replaces old stored patterns with new patterns and forms "internal representations," or prototypes, of sampled patterns.
2. Learned prototypes define quantized patterns.

Quantization
In neural network models, prototype patterns are represented as vectors of real numbers, so learning acts as "adaptive vector quantization" (AVQ).

Quantization
Process of learning: learning quantizes the pattern space into regions of quantization, or decision classes. Learned prototype vectors define synaptic points in the pattern space. The system learns if and only if some synaptic point moves in the pattern space (see also Figure 4.1, page 113).
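The AVQ idea above can be sketched numerically. This is a minimal plain-Python sketch, not the chapter's own algorithm: the function names, the Euclidean winner rule, and the learning rate are illustrative assumptions. The prototype list plays the role of the synaptic points, and the system "learns" exactly when a prototype moves.

```python
def nearest_prototype(x, prototypes):
    # Winning decision class: index of the synaptic point closest to x
    # (squared Euclidean distance).
    def dist2(m):
        return sum((xi - mi) ** 2 for xi, mi in zip(x, m))
    return min(range(len(prototypes)), key=lambda j: dist2(prototypes[j]))

def avq_step(x, prototypes, rate=0.1):
    # Move only the winning synaptic point toward the sample x;
    # all other prototypes stay fixed, so the pattern space is
    # quantized into one decision class per prototype.
    j = nearest_prototype(x, prototypes)
    prototypes[j] = [mi + rate * (xi - mi) for xi, mi in zip(x, prototypes[j])]
    return j
```

Repeated `avq_step` calls drive each prototype toward the center of the samples falling in its decision class.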

Outline
Learning
Supervised Learning and Unsupervised Learning
Supervised Learning and Unsupervised Learning in Neural Networks
Four Unsupervised Learning Laws

Supervised Learning and Unsupervised Learning
Criterion: whether the learning algorithm uses pattern-class information.

Supervised learning: depends on the class membership of each training sample; class information allows the algorithm to detect pattern misclassifications and reinforce the learning process; more computational complexity; more accuracy.
Unsupervised learning: uses unlabelled pattern samples; less computational complexity; less accuracy; practical in many high-speed real-time environments.

Outline
Learning
Supervised Learning and Unsupervised Learning
Supervised Learning and Unsupervised Learning in Neural Networks
Four Unsupervised Learning Laws

Supervised Learning and Unsupervised Learning in Neural Networks
Besides the differences presented above, supervised and unsupervised learning differ further in neural networks.

Supervised learning: refers to estimated gradient descent in the space of all possible synaptic-value combinations; uses class-membership information to define a numerical error signal or error vector that guides the estimated gradient descent.
Unsupervised learning: refers to how biological synapses modify their parameters with physically local information about neuronal signals; the synapses do not use the class membership of training samples.

Unsupervised Learning in Neural Networks
Local information is the information physically available to the synapse. Unsupervised learning laws are differential equations that describe how synapses evolve with local information.

Unsupervised Learning in Neural Networks
Local information includes:
synaptic properties or neuronal signal properties
information about structural and chemical alterations in neurons and synapses
……
A synapse has access to this information only briefly.

Unsupervised Learning in Neural Networks
Functions of local information:
allowing asynchronous synapses to learn in real time
shrinking the function space of feasible unsupervised learning laws

Outline
Learning
Supervised Learning and Unsupervised Learning
Supervised Learning and Unsupervised Learning in Neural Networks
Four Unsupervised Learning Laws

Four Unsupervised Learning Laws Signal Hebbian Competitive Differential Hebbian Differential competitive

Four Unsupervised Learning Laws
[Diagram: presynaptic neuron i in the input neuron field sends its axon to a synapse on a dendrite of postsynaptic neuron j in the output neuron field.]

Signal Hebbian
Correlates local neuronal signals: if neuron i and neuron j are activated synchronously, the synapse between them is strengthened; otherwise it is weakened.
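A continuous-time form commonly given for the signal Hebbian law is dm_ij/dt = −m_ij + S_i(x_i)·S_j(y_j). The one-step discrete sketch below (the function name and step size are illustrative assumptions, not from the slides) shows the strengthen/weaken behavior: coincident pre- and postsynaptic signals push the synapse up, while in their absence it passively decays.

```python
def signal_hebbian_step(m, s_i, s_j, rate=0.05):
    # Discrete sketch of dm/dt = -m + S_i * S_j: the synapse decays
    # toward zero but is reinforced whenever the presynaptic signal
    # s_i and postsynaptic signal s_j are active together.
    return m + rate * (-m + s_i * s_j)
```

With bounded signals in [0, 1], repeated coincident firing drives m toward the product of the signals, and silence drives it back toward zero.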

Competitive
Modulates the signal-synaptic difference with a zero-one competitive signal (the signal of postsynaptic neuron j). A synapse learns only if its postsynaptic neuron wins; the postsynaptic neurons come to code for presynaptic signal patterns.
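The competitive law is commonly written dm_ij/dt = S_j·(S_i − m_ij), with S_j the zero-one win signal. A one-line discrete sketch (names and step size are illustrative assumptions):

```python
def competitive_step(m, s_i, win, rate=0.1):
    # Discrete sketch of dm/dt = S_j * (S_i - m) with a zero-one win
    # signal: only a winning postsynaptic neuron (win = 1) moves its
    # synapse toward the presynaptic signal; losers leave it unchanged.
    return m + rate * win * (s_i - m)
```

Because losing neurons apply a zero factor, each synapse comes to encode the presynaptic patterns for which its postsynaptic neuron wins.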

Differential Hebbian
Correlates signal velocities as well as the neuronal signals themselves. The signal velocity is obtained by differentiating the neuronal signal with respect to time.
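The differential Hebbian law is commonly written dm_ij/dt = −m_ij + (dS_i/dt)·(dS_j/dt). In the discrete sketch below (names, step size, and the one-step finite-difference estimate of signal velocity are illustrative assumptions), the synapse grows only when both signals are changing in the same direction:

```python
def differential_hebbian_step(m, s_i_prev, s_i, s_j_prev, s_j, rate=0.05):
    # Discrete sketch of dm/dt = -m + dS_i/dt * dS_j/dt, with the
    # signal velocities estimated by one-step finite differences.
    ds_i = s_i - s_i_prev
    ds_j = s_j - s_j_prev
    return m + rate * (-m + ds_i * ds_j)
```

If either signal is constant, its velocity is zero, the product term vanishes, and the synapse simply decays, however strongly the neurons are firing.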

Differential Competitive
Combines competitive and differential Hebbian learning: a synapse learns only if the postsynaptic signal changes.
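The differential competitive law is commonly written dm_ij/dt = (dS_j/dt)·(S_i − m_ij). A discrete sketch (names, step size, and the finite-difference velocity are illustrative assumptions) makes the "learn only if change" behavior explicit:

```python
def differential_competitive_step(m, s_i, s_j_prev, s_j, rate=0.1):
    # Discrete sketch of dm/dt = dS_j/dt * (S_i - m): the synapse
    # moves toward the presynaptic signal only while the postsynaptic
    # signal is changing ("learn only if change").
    ds_j = s_j - s_j_prev
    return m + rate * ds_j * (s_i - m)
```

A steady postsynaptic signal, win or lose, produces zero velocity and hence no learning; the sign of the change even determines whether the synapse moves toward or away from the presynaptic signal.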

See also Simple competitive learning applet of neuronal networks http://www.psychology.mcmaster.ca/4i03/demos/competitive1-demo.html

See also Kohonen SOM applet http://www.psychology.mcmaster.ca/4i03/demos/competitive-demo.html

We welcome Wang Xiumei and Wang Ying to introduce the four unsupervised learning laws in detail.