Neural Network Implementations on Parallel Architectures

Index
- Neural Networks
- Learning in NN
- Parallelism
- Characteristics
- Mapping Schemes & Architectures

Artificial Neural Networks
- inspired by the human brain
- a parallel, distributed computing model
- consists of a large number of simple, neuron-like processing elements called units
- weighted, directed connections between pairs of units

Artificial Neural Networks-2
- weights may be positive or negative
- each unit computes a simple function of its inputs, which are the weighted outputs of other units

Artificial Neural Networks-3
- each neuron uses a threshold value to determine the activation of its output
- learning in an NN: finding the weights and threshold values from a training set
- multi-layer, feedforward networks: input layer, hidden layer(s), output layer
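As a concrete illustration of a single unit, here is a minimal sketch (Python/NumPy, not from the slides) of a threshold neuron that fires when the weighted sum of its inputs exceeds its threshold; the weights and threshold values are illustrative assumptions:

```python
import numpy as np

def threshold_unit(inputs, weights, threshold):
    """Output 1 if the weighted sum of the inputs exceeds the threshold, else 0."""
    weighted_sum = np.dot(weights, inputs)
    return 1.0 if weighted_sum > threshold else 0.0

# Example: with these (assumed) weights the unit acts roughly like a logical AND.
x = np.array([1.0, 1.0])
w = np.array([0.6, 0.6])
print(threshold_unit(x, w, threshold=1.0))  # -> 1.0
```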

Learning in ANN
- initialize all weights
- apply an input vector to the network
- propagate the vector forward and obtain the unit outputs
- compare the output layer response with the desired outputs
- compute and propagate an error measure backward, correcting the weights layer by layer
- iterate until a "good" mapping is achieved
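A minimal sketch of this training loop for a small feedforward network, written in Python/NumPy purely for illustration (the XOR task, network size, sigmoid activation, and learning rate are assumptions, not taken from the slides):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed toy task: learn XOR with a 2-4-1 feedforward network.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # input vectors
T = np.array([[0], [1], [1], [0]], dtype=float)              # desired outputs

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden weights (initialized randomly)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output weights
lr = 1.0                                        # assumed learning rate

for _ in range(5000):
    # apply the input vectors and propagate forward to obtain the unit outputs
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)

    # compare with the desired outputs and propagate the error measure backward
    d_out = (Y - T) * Y * (1 - Y)
    d_hid = (d_out @ W2.T) * H * (1 - H)

    # correct the weights layer by layer
    W2 -= lr * H.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid
    b1 -= lr * d_hid.sum(axis=0)

print(np.round(Y.ravel(), 2))  # should approach the XOR targets 0, 1, 1, 0
```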

Learning in ANN-2

Parallelism
- goal: further speed-up of training
- neural networks exhibit a high degree of parallelism (a distributed set of units operating simultaneously)
- the parallelization process: what type of machine? how to parallelize?

Parallelism-2
- there are different neural network models; the parallelization is highly dependent on the model used
- SIMD (small computation & a lot of data exchange): one neuron per processor
- MIMD (distributed memory & message passing): bad performance with frequent communication

Characteristics
- theoretical analysis of the inherent algorithm
- portability
- ease of use
- access to the ANN model description

Historical Data Integration
- task: prediction of the sensor output
- two parallelism methods:
  - parallel calculation of the weighted sum: time increases with the number of processors
  - parallel training of each separate NN: time decreases with the number of processors
- test platform: 8 RISC processors, 4 MB cache memory, 512 MB RAM
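The second method, training several separate NNs in parallel (one per processor), can be sketched as below. The process pool, the toy data, and the train_one_network helper are illustrative assumptions, not the authors' implementation:

```python
from multiprocessing import Pool

import numpy as np

def train_one_network(seed):
    """Train one small network on its own (toy) slice of the historical data.

    Each worker trains an independent network, so the processors never need to
    exchange intermediate results during training.
    """
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(256, 4))        # assumed sensor-history inputs
    t = X.sum(axis=1, keepdims=True)     # assumed prediction target
    w = np.zeros((4, 1))
    for _ in range(500):                 # simple gradient descent on a linear unit
        grad = X.T @ (X @ w - t) / len(X)
        w -= 0.1 * grad
    return w

if __name__ == "__main__":
    with Pool(processes=8) as pool:      # one worker per (assumed) processor
        weights = pool.map(train_one_network, range(8))
    print(len(weights), "networks trained in parallel")
```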

Method-1

Method-2

Distributed Training Data

A library on MIMD machines
- distributed shared memory

A library on MIMD machines-2
- several communication and synchronization schemes: message passing or shared memory
- thread programming with shared memory has the best performance
- all data is shared, but each piece is handled by only one processor
- training a Kohonen map of 100*100 neurons for 100000 iterations on 8 processors is 7 times faster than the sequential execution
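The "data is shared but handled by only one processor" scheme can be sketched for a Kohonen map whose neurons are split into slices, one owner per slice. Python threads are used here only to illustrate the ownership and reduction pattern; the input dimension, learning rate, and neighbourhood radius are assumptions, and this toy does not reproduce the library's actual speedups:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

GRID = 100                   # 100*100 Kohonen map, as in the slide
DIM = 3                      # assumed input dimension
N_WORKERS = 8                # one slice of the map per "processor"

rng = np.random.default_rng(0)
weights = rng.random((GRID * GRID, DIM))                      # shared weight matrix
coords = np.indices((GRID, GRID)).reshape(2, -1).T            # grid position of each neuron
slices = np.array_split(np.arange(GRID * GRID), N_WORKERS)    # each worker "owns" one slice

def local_best(sl, x):
    """Each worker searches only the neurons it owns for the closest one."""
    d = np.linalg.norm(weights[sl] - x, axis=1)
    i = int(np.argmin(d))
    return sl[i], d[i]

def local_update(sl, x, bmu, lr=0.1, radius=5.0):
    """Each worker updates only the weights it owns (shared data, single owner)."""
    dist2 = ((coords[sl] - coords[bmu]) ** 2).sum(axis=1)
    h = np.exp(-dist2 / (2 * radius ** 2))[:, None]
    weights[sl] += lr * h * (x - weights[sl])

with ThreadPoolExecutor(max_workers=N_WORKERS) as pool:
    for _ in range(100):                                      # a few toy iterations
        x = rng.random(DIM)
        # reduction: combine the local winners into the global best-matching unit
        bmu = min(pool.map(lambda sl: local_best(sl, x), slices), key=lambda t: t[1])[0]
        list(pool.map(lambda sl: local_update(sl, x, bmu), slices))
```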

A library on MIMD machines-3

AP1000 Architecture

AP1000 Architecture-2
- MIMD computer with distributed memory
- vertical slicing of the network
- 3 methods for communication:
  - one-to-one communication
  - rotated messages in horizontal and vertical rings
  - parallel routed messages
- different neural network implementations
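Vertical slicing means each processor owns the weight columns of a subset of neurons in every layer, computes its slice of the layer output, and the slices are then exchanged so that every processor holds the full activation vector. A sketch with simulated processors (the layer sizes and processor count are assumptions, not the AP1000 setup):

```python
import numpy as np

rng = np.random.default_rng(0)

N_PROC = 4                    # assumed number of processors (cells)
N_IN, N_HID = 8, 12           # assumed layer sizes

x = rng.random(N_IN)                      # full input vector, known to every processor
W = rng.normal(size=(N_IN, N_HID))        # full hidden-layer weight matrix

# Vertical slicing: processor p owns the weight columns (neurons) of its slice.
col_slices = np.array_split(np.arange(N_HID), N_PROC)
local_W = [W[:, cols] for cols in col_slices]

# Each processor computes only the activations of its own neurons...
local_out = [1.0 / (1.0 + np.exp(-(x @ Wp))) for Wp in local_W]

# ...and the partial results are exchanged (e.g. rotated around a ring)
# so that every processor ends up with the complete activation vector.
full_out = np.concatenate(local_out)

# Sanity check against the unsliced computation.
assert np.allclose(full_out, 1.0 / (1.0 + np.exp(-(x @ W))))
print(full_out.shape)  # (12,)
```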

AP1000 Architecture-3
- different mappings according to the network and the training data
- heuristic on training time
- combine multiple degrees of parallelism:
  - training set parallelism
  - node parallelism
  - pipelining parallelism

References
- V. Turchenko, C. Triki, A. Sachenko, "Approach to Parallel Training of Integration Historical Data Neural Networks"
- Y. Boniface, F. Alexandre, S. Vialle, "A Library to Implement Neural Networks on MIMD Machines"
- Y. Boniface, F. Alexandre, S. Vialle, "A Bridge between Two Paradigms for Parallelism: Neural Networks and General Purpose MIMD Computers"
- M. Misra, "Parallel Environments for Implementing Neural Networks"