Basics of Deep Learning: No Math Required

Presentation transcript:

Basics of Deep Learning: No Math Required. Roland Meertens, machine learning engineer, Autonomous Intelligent Driving.

What we will learn

Inspired by the brain: neurons signal to other neurons. When a neuron receives enough input activation, it becomes activated itself.
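The idea above can be sketched in a few lines of code. This is a minimal, illustrative model of a single artificial neuron (the function name, weights, and threshold are assumptions, not from the talk): it sums its weighted inputs and activates when the total crosses a threshold.

```python
def neuron(inputs, weights, threshold=1.0):
    """Return 1 if the weighted input sum exceeds the threshold, else 0."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# Enough input activation -> the neuron becomes activated itself.
print(neuron([1.0, 1.0], [0.8, 0.8]))  # strong input: activates (1)
print(neuron([1.0, 0.0], [0.8, 0.8]))  # weak input: stays off (0)
```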

Predicting the price of a house. Inputs: area of the house, age of the house, distance to the train station. Higher activation means a higher predicted price. The weights express the influence of each input neuron on the output neuron.
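A sketch of this slide in code: each input feature feeds the output neuron through a weight that expresses its influence on the price. All the numbers here are made up for illustration.

```python
# Input features of one house and the (illustrative) weight of each
# feature on the output neuron.
features = {"area_m2": 120.0, "age_years": 30.0, "km_to_station": 2.0}
weights = {"area_m2": 2000.0, "age_years": -500.0, "km_to_station": -10000.0}

# The output neuron's activation is the weighted sum of its inputs:
# higher activation -> higher predicted price.
price = sum(features[name] * weights[name] for name in features)
print(price)  # 205000.0
```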

Predicting the price of a house with a hidden layer. Inputs: area of the house, age of the house, distance to the train station. A hidden neuron can combine inputs, for example "close to the station AND small". If the prediction is too high or too low, adjust the weights!
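The "too high or too low? adjust the weights!" step can be sketched as a tiny hand-rolled training loop: nudge each weight in the direction that shrinks the error (a bare-bones version of gradient descent on a single example; the numbers and learning rate are illustrative assumptions).

```python
features = [120.0, 30.0, 2.0]  # area, age, distance to station
weights = [0.0, 0.0, 0.0]      # start knowing nothing
target = 205000.0              # the price we should have predicted
lr = 0.00001                   # small learning rate

for step in range(1000):
    prediction = sum(f * w for f, w in zip(features, weights))
    error = prediction - target  # positive -> too high, negative -> too low
    # Adjust each weight a little, in proportion to its input.
    weights = [w - lr * error * f for f, w in zip(features, weights)]

print(round(prediction))  # converges to the target price
```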

Activation function
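Two common activation functions, sketched for illustration: they decide how strongly a neuron passes its weighted input sum onward.

```python
import math

def relu(x):
    """Pass positive activation through, block negative activation."""
    return max(0.0, x)

def sigmoid(x):
    """Squash any activation into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

print(relu(-2.0), relu(3.0))   # 0.0 3.0
print(round(sigmoid(0.0), 2))  # 0.5
```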

Representations for characters. Flatten: we could take each 28x28 image and turn it into a list of 784 input neurons. Output: an activation per class, i.e. 10 output neurons. We probably want even more hidden layers to learn combinations of combinations of pixels.
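The flatten step can be sketched like this: a 28x28 image becomes a flat list of 784 input neurons, feeding 10 output neurons, one per digit class (the zero-filled image and weights are placeholders just to show the shapes involved).

```python
image = [[0.0] * 28 for _ in range(28)]  # a dummy 28x28 grayscale image
flat = [pixel for row in image for pixel in row]
print(len(flat))  # 784 input neurons

# 10 output neurons: each is a weighted sum over all 784 inputs.
weights = [[0.0] * 784 for _ in range(10)]
outputs = [sum(p * w for p, w in zip(flat, row)) for row in weights]
print(len(outputs))  # an activation per class
```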

Problems with this approach

Create a "feature extractor": small filters that detect features such as a line or an arc. The network then has the chance to learn the same feature at multiple locations.
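A feature extractor can be sketched as a convolution: slide one small 3x3 filter over the image, so the same feature (here, a vertical line) is detected at every location. This is a pure-Python illustration; the filter values are assumptions, not from the talk.

```python
def convolve(image, kernel):
    """Valid 2D convolution of a 2D list with a 3x3 kernel."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(h - 2):
        row = []
        for x in range(w - 2):
            s = sum(image[y + i][x + j] * kernel[i][j]
                    for i in range(3) for j in range(3))
            row.append(s)
        out.append(row)
    return out

vertical_line = [[-1, 2, -1]] * 3  # responds strongly to vertical lines
image = [[0, 1, 0, 0],
         [0, 1, 0, 0],
         [0, 1, 0, 0],
         [0, 1, 0, 0]]
# High responses where the line is, negative responses next to it.
print(convolve(image, vertical_line))  # [[6, -3], [6, -3]]
```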

Finishing our convolutional network: a "normal" feedforward (dense) layer produces the final prediction, with an activation per class. Training uses the same approach as before: "this neuron that predicted this feature should have been more active".
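The final step can be sketched as: flatten the feature maps coming out of the convolutional layers and feed them to a dense layer that produces one activation per class. All shapes and weight values here are illustrative placeholders.

```python
# 8 feature maps of 6x6, as a convolutional layer might produce.
feature_maps = [[[0.5] * 6 for _ in range(6)] for _ in range(8)]
flat = [v for fmap in feature_maps for row in fmap for v in row]
print(len(flat))  # 288 flattened features

# A dense layer: one weighted sum over all features per class.
num_classes = 10
dense_weights = [[0.01] * len(flat) for _ in range(num_classes)]
scores = [sum(f * w for f, w in zip(flat, row)) for row in dense_weights]

prediction = scores.index(max(scores))  # the most active output neuron wins
print(len(scores))  # an activation per class
```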

What we learned