Deep Learning Workshop

Presentation transcript:

Deep Learning Workshop
with Daniel L. Silver, Ph.D. and Christian Frey, BBA
April 11-12, 2017
09/11/2018 Deep Learning Workshop

Your Lecturers and Guides: Daniel L. Silver
- Grew up in the Annapolis Valley, NS
- BSc (Acadia), CIM (SMU), Ph.D. (Western)
- Software developer / technical architect from 1979 to 1992
- Machine learning and data mining since 1993
- Professor/Director, Jodrey School of Computer Science, Acadia University, since 2000
- Director of the Acadia Institute for Data Analytics since 2014
- Research area: inductive transfer and lifelong machine learning

Your Lecturers and Guides: Christian Frey
- Grew up in Vermont, USA
- BBA from Acadia in 2016, with a major in Computer Science
- Business/Data Analyst at AIDA for the past 10 months
- Website administrator for casenet.ca

Deep Learning Workshop Agenda - Day 1
10:00 - Regression and backpropagation (BP) ANNs (Python OR/XOR example)
10:30 - Generalization and the bias/variance tradeoff (Python BP example with train/validation graphs)
11:00 - Design, training, and evaluation of BP neural networks (letter-classification example)
11:30 - Workshop: basic BP code
12:00 - Limitations of traditional deep network architectures (example software)
<LUNCH>
01:30 - Unsupervised learning and BP autoencoders (autoencoder reconstructs letters)
02:00 - RBMs (Python RBM code reconstructs letters)
02:30 - Workshop: autoencoders and RBMs
03:00 - Intro to TensorFlow BP and the MNIST dataset (example code)
03:30 - Workshop: work through the TF and MNIST BP tutorial; review homework
Homework: work through the Core and Contrib TF tutorials
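As a rough preview of the morning BP material, the sketch below trains a small sigmoid network on XOR in plain Python. It is not the workshop's own code: the 2-2-1 network size, learning rate, and epoch count are illustrative assumptions, chosen only to show the forward pass and the backpropagation updates.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# XOR is not linearly separable, so one hidden layer is required.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
Y = [0, 1, 1, 0]

# 2-2-1 network: w1[j][i] connects input i to hidden unit j.
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
w2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0

lr = 0.5
for epoch in range(20000):
    for x, y in zip(X, Y):
        # Forward pass.
        h = [sigmoid(sum(w1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(2)]
        o = sigmoid(sum(w2[j] * h[j] for j in range(2)) + b2)
        # Backward pass: squared-error loss, sigmoid derivative s*(1-s).
        d_o = (o - y) * o * (1 - o)
        d_h = [d_o * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Gradient-descent updates.
        for j in range(2):
            w2[j] -= lr * d_o * h[j]
            for i in range(2):
                w1[j][i] -= lr * d_h[j] * x[i]
            b1[j] -= lr * d_h[j]
        b2 -= lr * d_o

def predict(x):
    h = [sigmoid(sum(w1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(2)]
    return sigmoid(sum(w2[j] * h[j] for j in range(2)) + b2)

preds = [round(predict(x)) for x in X]
print(preds)
```

With enough epochs this typically converges to [0, 1, 1, 0], though a two-unit hidden layer can occasionally get stuck in a local minimum depending on the random initialization.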

Deep Learning Workshop Agenda - Day 2
09:30 - Head start for those students who need it
10:00 - Deep Belief Networks (example TF DBN with MNIST data)
10:30 - Workshop: DBNs
11:00 - Intro to convolutional neural networks
11:30 - Intro to Keras and Conv2D
12:00 - Developing models for MNIST using TF Conv2D (example code provided)
12:30 - Workshop: CNNs
<LUNCH>
01:30 - Deep learning frameworks and parallel processing
02:00 - Intro to recurrent NNs and LSTM networks (example code provided)
02:30 - Workshop: RNNs
03:00 - Introduction to the assigned problem (TBA)
03:30 - Workshop: assignment
Homework: comparison of DBN and CNN
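The Day 2 CNN sessions build on the convolution operation that Keras Conv2D applies. As a minimal sketch of the idea (illustrative only; the image and kernel values are made up), a "valid" 2D convolution slides a kernel over an image and sums the elementwise products, which is how a learned filter detects a local pattern such as an edge:

```python
def conv2d_valid(image, kernel):
    """Valid-mode 2D convolution (no padding, stride 1) on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0] * ow for _ in range(oh)]
    for r in range(oh):
        for c in range(ow):
            out[r][c] = sum(image[r + i][c + j] * kernel[i][j]
                            for i in range(kh) for j in range(kw))
    return out

# A 4x4 image with a vertical boundary between bright (1) and dark (0) pixels,
# and a 1x2 horizontal-difference kernel that responds at that boundary.
image = [[1, 1, 0, 0],
         [1, 1, 0, 0],
         [1, 1, 0, 0],
         [1, 1, 0, 0]]
kernel = [[1, -1]]
print(conv2d_valid(image, kernel))  # → [[0, 1, 0], [0, 1, 0], [0, 1, 0], [0, 1, 0]]
```

In a CNN the kernel values are not hand-chosen as here; they are weights learned by backpropagation, with the same kernel shared across all positions.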

Deep Learning Overview
with Daniel L. Silver, Ph.D. and Christian Frey, BBA
April 11-12, 2017

Deep Learning Overview
- Developing artificial neural networks with many layers, versus shallow nets with just a couple of layers
- Multiple layers work together to build an improved feature space
- The lowest hidden layer learns 1st-order features (e.g. edges)
- Upper hidden layers learn higher-order features (combinations of lower-level features)
- Some models learn hidden node representations in an unsupervised manner and discover general features of the input space (DBNs)
- Some models learn features in a supervised manner, based on architected networks (CNNs)
- The final layer of transformed features flows into the final supervised layer(s), and the whole is trained as one large network
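The unsupervised case above can be sketched with the classic 4-2-4 autoencoder: the network is trained only to reconstruct its input, so the 2-unit bottleneck is forced to discover a compact code for the four one-hot inputs with no labels involved. This is an illustrative stdlib-only sketch, not the workshop's code; the layer sizes, learning rate, and epoch count are assumptions.

```python
import math
import random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Four one-hot patterns; the target output is the input itself.
X = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

# 4-2-4 autoencoder: encoder W1 (4 -> 2), decoder W2 (2 -> 4).
W1 = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]
b1 = [0.0] * 2
W2 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(4)]
b2 = [0.0] * 4

lr = 1.0
for _ in range(10000):
    for x in X:
        # Encode to the bottleneck, then decode back to 4 units.
        h = [sigmoid(sum(W1[j][i] * x[i] for i in range(4)) + b1[j]) for j in range(2)]
        o = [sigmoid(sum(W2[k][j] * h[j] for j in range(2)) + b2[k]) for k in range(4)]
        # Backpropagate the reconstruction error (squared-error loss).
        d_o = [(o[k] - x[k]) * o[k] * (1 - o[k]) for k in range(4)]
        d_h = [sum(d_o[k] * W2[k][j] for k in range(4)) * h[j] * (1 - h[j]) for j in range(2)]
        for k in range(4):
            for j in range(2):
                W2[k][j] -= lr * d_o[k] * h[j]
            b2[k] -= lr * d_o[k]
        for j in range(2):
            for i in range(4):
                W1[j][i] -= lr * d_h[j] * x[i]
            b1[j] -= lr * d_h[j]

# The learned hidden codes, thresholded to binary for readability.
codes = []
for x in X:
    h = [round(sigmoid(sum(W1[j][i] * x[i] for i in range(4)) + b1[j])) for j in range(2)]
    codes.append(tuple(h))
print(codes)
```

When training succeeds, the four inputs map to four distinct 2-bit codes: the bottleneck has learned a binary encoding of the input space without ever seeing a label.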

Deep Learning Architectures
- A deep learning network can learn multiple levels of features from unlabeled data
- Simulating a human sensory modality
(Unsupervised Feature Learning and Deep Learning - Andrew Ng, Stanford University)

Why Deep Learning
- Biological plausibility, e.g. the visual cortex
- Håstad's proof: problems that can be represented with a polynomial number of nodes in k layers may require an exponential number of nodes with k-1 layers (e.g. parity)
- Highly varying functions can be represented efficiently with deep architectures, with fewer weights to update than a less efficient shallow representation
- Sub-features created in a deep architecture can potentially be shared between multiple tasks: transfer and lifelong machine learning

History of Deep Learning Networks
1980 - Fukushima's Neocognitron
1986 - Rumelhart et al.: backpropagation networks
1989 - LeCun: convolutional neural nets for images, speech, etc.
1990s - Interest subsides as other models are introduced (SVMs, graphical models, etc.), each taking their turn
2006 - Deep Belief Networks (Hinton) and stacked autoencoders (Bengio): unsupervised pre-training followed by supervised learning
2012 - Initial successes with supervised approaches that overcome the vanishing gradient and are more generally applicable
2013 - Deep recurrent neural networks (LSTMs, introduced by Hochreiter and Schmidhuber in 1997; later GRUs) achieve state-of-the-art sequence results

Deep Learning Workshop References
http://deeplearning.net/
http://www.cs.toronto.edu/~hinton/
http://www.andrewng.org/
http://people.idsia.ch/~juergen/
https://www.coursera.org/courses?languages=en&query=machine+learning+andrew+ng
https://deeplearning4j.org/index.html
http://www.deeplearningbook.org/
https://arxiv.org/abs/1206.5538
http://www.iro.umontreal.ca/~bengioy/yoshua_en/Highlights.html