Functional Brain Signal Processing: EEG & fMRI Lesson 8 Kaushik Majumdar Indian Statistical Institute Bangalore Center M.Tech. (CS), Semester III, Course B50

Artificial Neural Network (ANN) What does a single node in an ANN do? [Figure: inputs x1, …, x5, weighted by w12, …, w52, feed a single node that emits y2.] Each node forms the weighted sum of its inputs and passes it through an activation function: y2 = f(Σ_i w_i2 x_i).
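To make the node's operation concrete, here is a minimal NumPy sketch of a single node: it forms the weighted sum of its inputs and passes it through an activation function. The numeric inputs, weights, and the tanh activation are illustrative assumptions, not values from the lecture.

```python
import numpy as np

def node_output(x, w, f=np.tanh):
    """A single ANN node: weighted sum of inputs, passed through activation f."""
    return f(np.dot(w, x))

x = np.array([0.5, -1.2, 0.3, 0.8, -0.4])    # inputs x1..x5 (illustrative)
w = np.array([0.1, 0.4, -0.2, 0.7, 0.05])    # weights w12..w52 (illustrative)
print(node_output(x, w))                      # the node's output y2
```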

More Nodes [Figure: a network with an input layer (x1, …, x6), a hidden layer, and an output layer (y1, …, y4); the final output is 1 if the input lies inside the closed region and 0 if it lies outside.]
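The inside/outside behaviour can be sketched explicitly: with threshold activations, each hidden node tests one half-plane, and the output node fires only when all of them agree, so the network outputs 1 exactly inside the region. The triangular region, weights, and thresholds below are hypothetical choices for illustration.

```python
import numpy as np

step = lambda a: (a >= 0).astype(float)   # threshold (step) activation

# Hidden layer: each node tests one half-plane; together they bound a
# triangular region (x >= 0, y >= 0, x + y <= 1) -- a hypothetical choice.
W_h = np.array([[ 1.0,  0.0],
                [ 0.0,  1.0],
                [-1.0, -1.0]])
b_h = np.array([0.0, 0.0, 1.0])

# Output node fires only if all three hidden nodes fire (a logical AND)
w_o, b_o = np.array([1.0, 1.0, 1.0]), -2.5

def inside(p):
    h = step(W_h @ p + b_h)       # hidden-layer activations
    return step(w_o @ h + b_o)    # 1 if p is inside the region, else 0

print(inside(np.array([0.2, 0.3])))   # 1.0 -> inside
print(inside(np.array([0.9, 0.9])))   # 0.0 -> outside
```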

Number of Hidden Layers Two hidden layers are needed to identify a region such as an annulus. A neural network is basically a function approximator, which can approximate continuous functions by piecewise linear functions (interpolation). Neural networks are therefore also known as universal approximators.

Separation or Classification A separation or classification task is nothing but approximating the surface that separates the (mixed) data. In other words, the network approximates a continuous function generating the separating surface. [Figure: a separating curve between two classes; a classifier has to approximate the function whose graph is this curve.]

Classification by ANN Most classification tasks can be accomplished by separating the data with a single curve. Therefore, for most classification tasks an ANN with a single hidden layer is sufficient. However, the number of nodes in the hidden layer has to be determined by trial and error for optimal classification.

Universal Approximation For any continuous mapping f : [0, 1]^n → R^m there must exist a three-layer neural network (having an input or ‘fanout’ layer with n processing elements, a hidden layer with 2n + 1 processing elements, and an output layer with m processing elements) that implements f exactly. Hecht-Nielsen, 1988.

Backpropagation Neural Network By far the most widely used type of neural network. It is a simple yet powerful network, even for complex models having hundreds of thousands of parameters. Its conceptual simplicity and high success rate make it a mainstay of adaptive pattern recognition. It offers a means to calculate the input-to-hidden-layer weights. Duda et al., Chapter 6, pp. 283 & 289.

Regularization Regularization is a deep issue concerning the complexity of the network. The number of input and output nodes is fixed, but the number of hidden nodes and connection weights is not: these are free parameters. If there are too few of them, the training set cannot be adequately learned. If there are too many of them, generalization of the network will be poor

Regularization (cont.) (apart from the added computational complexity). That is, performance on the test data set will drop, while performance on the training data set may remain very high. [Figure: training seizure pattern vs. testing seizure pattern.]

Backpropagation Architecture [Figure: the general backpropagation architecture and a three-layer instance, with inputs x1, …, x4 and outputs y1, y2.] Hecht-Nielsen, 1988.

Backpropagation Architecture (cont.) Hecht-Nielsen, 1988

Backpropagation Algorithm The criterion function J(w) = (1/2) Σ_{k=1}^{c} (t_k − z_k)² = (1/2)‖t − z‖² has to be minimized, where t and z are target and network output vectors respectively and c is the number of output nodes. The weights are updated by gradient descent, w(m + 1) = w(m) − η ∂J/∂w, where η is the learning rate and m stands for the m’th iteration.
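A compact NumPy sketch of this update rule on a toy problem follows; the XOR data, sigmoid activations, network size, and learning rate are illustrative assumptions, not the lecture's.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))

# Toy data (illustrative): XOR inputs with a bias column appended
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], float)
T = np.array([[0], [1], [1], [0]], float)

W1 = rng.normal(0, 1, (3, 3))   # input (2 + bias) -> 3 hidden nodes
W2 = rng.normal(0, 1, (4, 1))   # hidden (3 + bias) -> 1 output node
eta = 0.5                       # learning rate

for m in range(20000):
    # Forward pass
    H = np.hstack([sigmoid(X @ W1), np.ones((4, 1))])   # hidden + bias
    Z = sigmoid(H @ W2)                                 # network output z
    # Backward pass: gradient of J(w) = 1/2 * ||t - z||^2
    dZ = (Z - T) * Z * (1 - Z)                          # output-layer delta
    dH = (dZ @ W2[:3].T) * H[:, :3] * (1 - H[:, :3])    # hidden-layer delta
    # Gradient-descent update: w(m + 1) = w(m) - eta * dJ/dw
    W2 -= eta * H.T @ dZ
    W1 -= eta * X.T @ dH

print(np.round(Z.ravel(), 2))   # typically approaches the targets 0, 1, 1, 0
```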

Epileptic EEG Signal [Figure: an epileptic EEG recording.] Subasi and Ercelebi, Comp. Meth. Progr. Biomed., 78: 87–99, 2005.

DB4 Wavelet DB wavelets do not have a closed-form representation (they cannot be expressed by an elegant mathematical formula, unlike, for example, the Morlet wavelet).

DB4 Wavelet Generation: Cascade Algorithm The scaling function φ(t) and the wavelet Ψ(t) are obtained by iterating the refinement equations φ(t) = √2 Σ_n h(n) φ(2t − n) and Ψ(t) = √2 Σ_n g(n) φ(2t − n), where h(n) and g(n) are the impulse response functions of the low-pass and high-pass filters respectively, and Ψ(t) is the wavelet. DB4 contains only 4 taps or coefficients, h(0), …, h(3).
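A rough sketch of the cascade iteration for the four-tap DB4 filter follows: starting from a unit impulse, each pass upsamples the current approximation and convolves it with the filter, giving samples of φ(t) on an ever finer dyadic grid. The upsample-and-convolve discretization and the number of iterations are one standard implementation choice, assumed here rather than taken from the lecture.

```python
import numpy as np

# The four DB4 scaling coefficients h(n); g(n) is the quadrature-mirror
# partner g(n) = (-1)^n h(3 - n).
h = np.array([1 + np.sqrt(3), 3 + np.sqrt(3),
              3 - np.sqrt(3), 1 - np.sqrt(3)]) / (4 * np.sqrt(2))
g = np.array([h[3], -h[2], h[1], -h[0]])

def refine(filt, phi):
    """One cascade step: phi <- sqrt(2) * sum_n filt(n) phi(2t - n)."""
    up = np.zeros(2 * len(phi) - 1)
    up[::2] = phi                       # upsample by 2 (samples on finer grid)
    return np.sqrt(2) * np.convolve(filt, up)

phi = np.array([1.0])                   # start from a unit impulse
for _ in range(8):                      # 8 iterations: dyadic grid of step 2**-8
    phi = refine(h, phi)                # approximates the scaling function
psi = refine(g, phi)                    # one step with g gives the wavelet
```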

EEG Data Electrode placement was according to the 10–20 system. Four bipolar signals were selected: F7–C3, F8–C4, T5–O1 and T6–O2. Sampling frequency was 200 Hz. The signals were band-pass filtered in the 1–70 Hz range upon acquisition. The EEG was segmented into 1000-time-point windows (5 s), as in the sketch below.
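Segmenting such a recording into 1000-sample (5 s) windows is essentially one reshape; the array below is placeholder data standing in for the four bipolar channels.

```python
import numpy as np

fs, window = 200, 1000          # 200 Hz sampling; 1000 samples = 5 s

# Placeholder data standing in for the four bipolar channels
# (F7-C3, F8-C4, T5-O1, T6-O2); real recordings would be loaded instead.
eeg = np.random.randn(4, 12000)

n_seg = eeg.shape[1] // window
segments = eeg[:, :n_seg * window].reshape(4, n_seg, window)
print(segments.shape)           # (4, 12, 1000): twelve 5-second epochs
```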

Feature Extraction by DB4 Wavelets EEG signals are decomposed by high-pass (yielding the ‘detail’ signal) and low-pass (yielding the ‘approximation’) FIR filtering.
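One way to perform this decomposition in practice is PyWavelets' multilevel DWT. The 5-level depth, the summary statistics used as features, and the random segment below are assumptions for illustration; note also that PyWavelets' 'db4' denotes the Daubechies family member with 4 vanishing moments (8 taps), a naming convention that differs from the 4-tap usage above.

```python
import numpy as np
import pywt

segment = np.random.randn(1000)    # one hypothetical 5 s epoch (placeholder)

# Multilevel DWT: each level splits the signal into low-pass approximation
# and high-pass detail coefficients; 5 levels is an assumed choice.
coeffs = pywt.wavedec(segment, 'db4', level=5)
cA5, cD5, cD4, cD3, cD2, cD1 = coeffs

# One common feature vector: summary statistics of every sub-band
features = np.array([f(c) for c in coeffs for f in (np.mean, np.std)])
print(features.shape)              # (12,)
```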

Assignment Preprocess depth EEG signals (to be given) by wavelet transforms (the DB4 wavelet is seen to be more efficient than other wavelets; see Subasi & Ercelebi, 2005 and Vardhan & Majumdar, 2011). This will extract features from the signals. Use a three-layer (that is, with only one hidden layer) perceptron neural network to

Assignment (cont.) classify the features so as to separate the seizure portion from the non-seizure portion of the signals. One possible starting point is sketched below.
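As one possible starting point (not the prescribed solution), a scikit-learn MLPClassifier with a single hidden layer matches the three-layer requirement; the feature matrix and labels below are random placeholders for the depth-EEG features.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Placeholders: one row of wavelet features per epoch, with labels
# 1 = seizure, 0 = non-seizure (the real features come from the DWT step).
X = np.random.randn(200, 12)
y = np.random.randint(0, 2, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# A single hidden layer gives the required three-layer perceptron; the
# hidden-layer size is tuned by trial and error, as noted earlier.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```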

References A. Subasi and E. Ercelebi, Classification of EEG signals using neural networks and logistic regression, Comp. Meth. Progr. Biomed., 78: 87–99, 2005. I. Kaplan, Daubechies D4 wavelet transform, elets/daubechies/index.html

References (cont.) R. Hecht-Nielsen, Theory of the backpropagation neural network, INNS 1988, pp. I-593 – I-605. Freely available at s/Research%20Paper%20Library/backPropTheory.pdf I. Daubechies, Ten lectures on wavelets, SIAM, pp. 115, 132, 194, 242.

THANK YOU This lecture is available at