What is a neural network? A collection of interconnected neurons that compute and generate impulses. The components of a neural network include neurons, synapses, and activation functions. Neural networks modeled mathematically or in software are referred to as artificial neural networks (ANNs). Neural networks achieve their functionality through learning (training).

[Diagram: a single perceptron. Inputs x0, x1, …, xn are multiplied by weights w0, w1, …, wn and summed (Σ); a threshold unit outputs 1 if y > 0 and -1 otherwise.]
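A minimal sketch of the unit in the diagram above (the function name and example values are illustrative, not taken from the slides):

```python
import numpy as np

def perceptron_output(x, w, bias=0.0):
    """Weighted sum of the inputs followed by a hard threshold: 1 if y > 0, else -1."""
    y = np.dot(w, x) + bias          # y = w1*x1 + ... + wn*xn + bias (w0*x0)
    return 1 if y > 0 else -1

# Example: two inputs with hand-picked weights (illustrative values).
print(perceptron_output(np.array([1.0, 0.5]), np.array([0.4, -0.2]), bias=0.1))
```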

» The key to neural networks is the weights. Weights determine the importance of signals and essentially determine the network's output. » Training cycles adjust the weights to improve the performance of the network; the performance of an ANN depends critically on how well it is trained. » An untrained network is essentially useless. » Different training algorithms suit certain network topologies better than others, and some also lend themselves well to parallelization.
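To make "training cycles adjust the weights" concrete, here is a sketch of the classic perceptron learning rule; this is one simple choice for illustration, the slides do not commit to a particular training algorithm:

```python
import numpy as np

def perceptron_train(X, targets, lr=0.1, epochs=20):
    """Adjust weights over repeated training cycles so the output matches the targets."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):                       # one training cycle per epoch
        for x, t in zip(X, targets):              # targets are +1 / -1
            y = 1 if np.dot(w, x) + b > 0 else -1
            if y != t:                            # only misclassified examples change the weights
                w += lr * t * x
                b += lr * t
    return w, b

# Toy data: learn a logical OR of two binary inputs (illustrative).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([-1, 1, 1, 1])
print(perceptron_train(X, t))
```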

» Network topologies describe how neurons are connected to one another. » Many artificial neural networks (ANNs) use network topologies that are highly parallelizable.

[Diagram: a multilayer network. Inputs x1, x2, …, xn feed a layer of hidden units, which feed the outputs.]
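A minimal sketch of a forward pass through such a network (the layer sizes and the sigmoid activation are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W_hidden, W_output):
    """Inputs -> hidden units -> outputs, each layer feeding only the next one."""
    hidden = sigmoid(W_hidden @ x)     # hidden-unit activations
    return sigmoid(W_output @ hidden)  # network outputs

# 3 inputs, 4 hidden units, 2 outputs; random weights just to show the shapes.
rng = np.random.default_rng(0)
print(forward(rng.normal(size=3), rng.normal(size=(4, 3)), rng.normal(size=(2, 4))))
```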

» Training and evaluating each node in a large network can take an incredibly long time. » However, neural networks are "embarrassingly parallel" ˃ Computations for each node are generally independent of those for all other nodes.
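To illustrate the per-node independence, here is a sketch that evaluates the neurons of one layer in parallel worker processes; the use of Python's multiprocessing and the layer sizes are illustrative choices, not something the slides prescribe:

```python
import numpy as np
from multiprocessing import Pool

def neuron_output(args):
    """One neuron's computation: weighted sum of the layer inputs, then a hard threshold."""
    weights, inputs = args
    return 1 if np.dot(weights, inputs) > 0 else -1

if __name__ == "__main__":
    inputs = np.random.default_rng(1).normal(size=8)
    layer_weights = np.random.default_rng(2).normal(size=(16, 8))  # 16 neurons, 8 inputs each
    # Each neuron's computation is independent, so the layer can be evaluated concurrently.
    with Pool() as pool:
        outputs = pool.map(neuron_output, [(w, inputs) for w in layer_weights])
    print(outputs)
```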

Typical nested-loop structure of ANN training:
» For each training session
 ˃ For each training example in the session
  + For each layer (forward and backward)
   – For each neuron in the layer
    » For all weights of the neuron
     > For all bits of the weight value
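A skeletal rendering of these nested loops in code; the toy data structures and the per-weight update are placeholders invented for illustration (the slides do not specify an implementation), and the innermost per-bit level is left as a comment:

```python
# Toy network: a list of layers, each holding neurons with weight lists (illustrative).
network = [
    {"neurons": [{"weights": [0.1, -0.2]}, {"weights": [0.3, 0.0]}]},   # layer 1
    {"neurons": [{"weights": [0.5, 0.5]}]},                             # layer 2
]
sessions = [[[1.0, 0.0], [0.0, 1.0]]]   # one training session with two training examples

for session in sessions:                         # for each training session
    for example in session:                      # for each training example in the session
        for layer in network:                    # for each layer (forward and backward)
            for neuron in layer["neurons"]:      # for each neuron in the layer
                for i, w in enumerate(neuron["weights"]):   # for all weights of the neuron
                    # Placeholder per-weight update; per-bit work would nest one level deeper.
                    neuron["weights"][i] = w + 0.0
```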

● Multilayer feedforward networks: layered networks in which each layer of neurons outputs only to the next layer.
● Feedback networks: a single layer of neurons whose outputs loop back to the inputs.
● Self-organizing maps (SOM): form mappings from a high-dimensional space to a lower-dimensional space (one layer of neurons).
● Sparse distributed memory (SDM): a special form of two-layer feedforward network that works as an associative memory.

 A SOM produces a low-dimensional map of the training samples.  It uses a neighborhood function to preserve the topological properties of the input space.  SOMs are useful for visualizing high-dimensional data.
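A minimal sketch of one SOM training step, showing the neighborhood function at work; the grid size, learning rate, and Gaussian neighborhood are illustrative assumptions rather than details from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
grid = rng.random((10, 10, 3))          # 10x10 map of neurons, each with a 3-D weight vector
lr, sigma = 0.5, 2.0                    # learning rate and neighborhood radius (illustrative)

def som_step(x):
    """Move the best-matching unit and its grid neighbors toward the input sample x."""
    dists = np.linalg.norm(grid - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)        # best-matching unit
    rows, cols = np.indices(dists.shape)
    grid_dist2 = (rows - bmu[0]) ** 2 + (cols - bmu[1]) ** 2
    h = np.exp(-grid_dist2 / (2 * sigma ** 2))                   # Gaussian neighborhood function
    grid[:] += lr * h[:, :, None] * (x - grid)                   # pull neighbors toward x

som_step(rng.random(3))   # one update with a random 3-D sample
```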
