Introduction to Deep Learning for neuronal data analyses


Introduction to Deep Learning for neuronal data analyses Artur Luczak, Ph.D. Canadian Centre for Behavioural Neuroscience University of Lethbridge, AB, Canada http://lethbridgebraindynamics.com/artur_luczak

Deep Learning is providing breakthrough results in speech recognition, image classification, etc. Example: the Google Inception network.

Examples from the test set (with the network’s guesses)

Video analyses and decision making

Speech recognition

Neuroscience data is similar to other types of data (Ryait et al. & Luczak): EEG / LFP, spiking data.

What is an Artificial Neural Network?

Artificial neural network

Training an Artificial Neural Network
Initialise with random weights.

Training data:
Fields           class
1.4 2.7 1.9      0
3.8 3.4 3.2      0
6.4 2.8 1.7      1
4.1 0.1 0.2      0
etc …

For each training pattern:
Present a training pattern (e.g. 1.4 2.7 1.9).
Feed it through to get an output (here 0.8).
Compare with the target output (target 0, so error 0.8).
Adjust the weights based on the error.
And so on … (e.g. pattern 6.4 2.8 1.7 gives output 0.9; target 1, so error -0.1).

Repeat this thousands, maybe millions of times, each time taking a random training instance and making slight weight adjustments. Algorithms for weight adjustment are designed to make changes that will reduce the error.
https://www.macs.hw.ac.uk/~dwcorne/Teaching/introdl.ppt

Weight-learning algorithms for NNs work by making thousands and thousands of tiny adjustments, each making the network do better on the most recent pattern, but perhaps a little worse on many others. Eventually this tends to be good enough to learn effective classifiers for many real applications.
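The training loop above can be sketched in a few lines of Python. This is a minimal single-neuron sketch on the toy data from the slides; the sigmoid unit, learning rate, and number of steps are illustrative choices, not values from the presentation:

```python
import math
import random

# Toy training set from the slides: three input fields, one binary class.
data = [([1.4, 2.7, 1.9], 0),
        ([3.8, 3.4, 3.2], 0),
        ([6.4, 2.8, 1.7], 1),
        ([4.1, 0.1, 0.2], 0)]

random.seed(0)
w = [random.uniform(-0.5, 0.5) for _ in range(3)]  # initialise with random weights
b = 0.0
lr = 0.1  # learning rate (an assumed value)

def forward(x):
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 / (1 + math.exp(-s))  # sigmoid output in (0, 1)

for _ in range(5000):                # repeat thousands of times ...
    x, target = random.choice(data)  # ... each time a random training instance
    error = forward(x) - target      # compare output with target
    for i in range(3):               # slight adjustments that reduce the error
        w[i] -= lr * error * x[i]
    b -= lr * error
```

After training, `forward()` outputs land on the correct side of 0.5 for each of the four patterns.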

What is deep learning? A network with 1 hidden layer can, in theory, learn any classification problem perfectly: a set of weights exists that can produce the targets from the inputs. The problem is finding them.
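As a concrete illustration of the "a set of weights exists" claim, here are hand-picked (not learned) weights for a one-hidden-layer network that computes XOR, a classically non-linearly-separable problem. The specific weight values are just one of many valid choices:

```python
# One hidden layer of two threshold units solves XOR.
def step(s):
    return 1 if s > 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)        # fires if at least one input is 1
    h2 = step(x1 + x2 - 1.5)        # fires only if both inputs are 1
    return step(h1 - 2 * h2 - 0.5)  # output: h1 AND NOT h2

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))  # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```

No single-layer (no-hidden-unit) network can do this, which is what makes the hidden layer essential.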

Hierarchical models Riesenhuber & Poggio. Nature Neurosci 1999

Deep Learning = Learning Hierarchical Representations

Convolutional Networks (ConvNet or CNN) (currently the dominant approach for neural networks). Use many different copies of the same feature detector at different positions. Replication greatly reduces the number of free parameters to be learned. Use several different feature types, each with its own map of replicated detectors. This allows each patch of the image to be represented in several ways.
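To see how much replication saves, compare parameter counts for a fully connected layer versus a small convolutional layer on one image. The sizes below (28x28 image, eight 3x3 filters) are illustrative assumptions, not figures from the slides:

```python
# Weight sharing: a conv layer's parameter count is independent of image size.
H = W = 28
fully_connected = (H * W) * (H * W)  # one weight per input-output pair
n_filters, k = 8, 3                  # 8 feature maps, 3x3 kernels
conv = n_filters * (k * k + 1)       # each filter: 9 shared weights + 1 bias
print(fully_connected, conv)         # 614656 80
```

The same eight detectors are slid over every position, so the conv layer needs 80 parameters where the dense layer needs over 600,000.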

CNN Architecture: Pooling Layer. Pooling partitions the input image into a set of non-overlapping rectangles and, for each such sub-region, outputs the maximum value of the features in that region. Intuition: progressively reduce the spatial size of the representation to reduce the number of parameters and amount of computation in the network, and hence also control overfitting.
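A minimal sketch of 2x2 max pooling in plain Python (the 4x4 input values are made up for illustration):

```python
# Max pooling with 2x2 non-overlapping windows.
def max_pool_2x2(img):
    return [[max(img[i][j], img[i][j + 1], img[i + 1][j], img[i + 1][j + 1])
             for j in range(0, len(img[0]), 2)]
            for i in range(0, len(img), 2)]

img = [[1, 3, 2, 0],
       [4, 9, 1, 1],
       [5, 2, 8, 6],
       [0, 1, 3, 7]]
print(max_pool_2x2(img))  # [[9, 2], [5, 8]]
```

Each 2x2 block collapses to its maximum, halving the representation in both dimensions.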

Full CNN (with repeated pooling stages)

Recurrent Neural Networks and LSTM neurons. Potjans and Diesmann (2014). Note: no top-down feedback connections from top layers.

Autoencoder: train the neural network to reproduce its input vector as its output. This forces it to compress as much information as possible into a few numbers in the central bottleneck. These few (here 30) numbers are then a good way to represent the data.
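A minimal sketch of the idea with a linear autoencoder in NumPy, assuming made-up 8-dimensional data and a 2-number bottleneck (the slide's network uses 30). All sizes, the learning rate, and the step count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Made-up 8-D data that secretly lives on a 2-D subspace.
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 8))

W_enc = rng.normal(scale=0.1, size=(8, 2))  # encoder: 8 -> 2 bottleneck
W_dec = rng.normal(scale=0.1, size=(2, 8))  # decoder: 2 -> 8
lr = 0.01
for _ in range(3000):
    Z = X @ W_enc                        # compress into two numbers per sample
    err = Z @ W_dec - X                  # reconstruction minus input
    grad_dec = Z.T @ err / len(X)        # gradient of mean squared error
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

mse = float(np.mean(err ** 2))  # reconstruction error shrinks toward 0
```

Because the network must squeeze 8 numbers through 2, minimising reconstruction error forces the bottleneck to capture the data's underlying structure.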

Autoencoder

Convolutional Autoencoder Turchenko & Luczak, IEEE IDAACS 2017

Deep Neuronal Networks Le et al. (2013) ICASSP, IEEE International Conference http://theanalyticsstore.ie/deep-learning/

Visualizing MRI scans using autoencoder Plis et al. Front. Neurosci. 2014

Why are Neural Networks generally better than other methods? 1) Non-linearity; 2) self-learned features. (Compare: linear method vs. non-linear method.)

Applying Conv nets for electrophysiological signals EEG / LFP Spiking data

Generating LFP-like data to test a ConvNet (ConvNeurNet_example.m): 19 Hz vs 21 Hz sine + noise.
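The slide's MATLAB demo (ConvNeurNet_example.m) is not reproduced here, but the same toy data can be sketched in Python. Sampling rate, trial length, trial count, and noise level below are illustrative assumptions:

```python
import numpy as np

# Two-class toy "LFP": class A = 19 Hz sine + noise, class B = 21 Hz sine + noise.
fs = 1000                      # sampling rate in Hz (assumed)
t = np.arange(0, 1, 1 / fs)    # 1 s of signal per trial
rng = np.random.default_rng(0)

def make_trial(freq):
    return np.sin(2 * np.pi * freq * t) + 0.5 * rng.normal(size=t.size)

class_a = np.stack([make_trial(19) for _ in range(100)])
class_b = np.stack([make_trial(21) for _ in range(100)])
print(class_a.shape, class_b.shape)  # (100, 1000) (100, 1000)
```

A 2 Hz frequency difference buried in noise is a good minimal test of whether a network can pick up spectral structure.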

Taking segments of data for network training

Combining data from both groups into one array and randomly taking 80% of samples for training and 20% for testing
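A sketch of this combine-and-split step in NumPy (the random stand-in data and the array shapes are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-ins for the two groups of signal segments (100 trials x 1000 samples).
class_a = rng.normal(size=(100, 1000))
class_b = rng.normal(size=(100, 1000))

# Combine both groups into one array with matching labels.
X = np.concatenate([class_a, class_b])
y = np.concatenate([np.zeros(100), np.ones(100)])

# Shuffle, then take 80% of samples for training and 20% for testing.
idx = rng.permutation(len(X))
n_train = int(0.8 * len(X))
X_train, X_test = X[idx[:n_train]], X[idx[n_train:]]
y_train, y_test = y[idx[:n_train]], y[idx[n_train:]]
print(X_train.shape, X_test.shape)  # (160, 1000) (40, 1000)
```

Shuffling before splitting ensures both classes appear in both sets, and the held-out 20% gives an unbiased estimate of test accuracy.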

Our Conv Net architecture

Training and testing our Conv net

Optional fine tuning

Two LFP-like signals with the same frequency but different phase locking + noise

More advanced DL frameworks

TensorFlow is an open-source software library for numerical computation using data-flow graphs. It was developed by the Google Brain Team to support machine learning and deep learning research. The framework is written in C++ and Python, and may remain the most widely used DL framework for the next few years.

Keras was developed as an easy-to-use interface for building neural networks quickly. It is written in Python and can run on top of TensorFlow and Theano. It is more user-friendly and easier to use than TensorFlow, and Google may include Keras in future TensorFlow releases.

Caffe was developed by Berkeley Artificial Intelligence Research; its main application is modelling Convolutional Neural Networks (CNNs). Following Caffe's popularity, Facebook introduced Caffe2 in 2017; the Caffe2 framework lets users build demo applications from pre-trained models.
https://www.digitaldoughnut.com/articles/2018/february/a-comparison-of-deep-learning-frameworks

Python more popular than MATLAB. Python + NumPy + SciPy + Matplotlib is just as good as MATLAB. Because Python is open and free, it is very easy for other parties to design packages or other software tools that extend it. MATLAB's expensive, proprietary nature makes it difficult or impossible for third parties to extend its functionality, and Mathworks puts restrictions on code portability. MATLAB's standard library does not contain as much generic programming functionality, but it does include matrix algebra and an extensive library for data processing and plotting. If you want to experiment with some of the newest Machine Learning or Neural Network models, just use scikit-learn and Keras + TensorFlow. Python as a programming language is becoming more popular than MATLAB.

Thank you Discovery Accelerator Supplement