Overview of deep learning

Gregery T. Buzzard

Varieties of AI/ML
AI is the most general category. There are many types of ML algorithms: nearest neighbor, naive Bayes, decision trees, linear regression, support vector machines (SVM), and neural networks. Deep learning is associated with neural networks.

Machine learning
Machine learning changes the paradigm from programming for answers (hand-written rules applied to data) to programming to discover rules (learning how to transform new data into new answers).
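
To make the contrast concrete, here is a minimal, hypothetical sketch (assuming NumPy and scikit-learn are available): classical programming hard-codes the rule, while machine learning recovers the rule from data and answers.

```python
# Classical programming: we write the rule ourselves.
def fahrenheit(celsius):
    return 1.8 * celsius + 32.0

# Machine learning: we supply data and answers; the rule is learned.
import numpy as np
from sklearn.linear_model import LinearRegression

celsius = np.array([[-40.0], [0.0], [20.0], [37.0], [100.0]])  # data
answers = np.array([-40.0, 32.0, 68.0, 98.6, 212.0])           # answers

model = LinearRegression().fit(celsius, answers)
print(model.coef_, model.intercept_)   # ~[1.8], ~32.0: the rule, recovered
print(model.predict([[25.0]]))         # answer for new data: ~77.0
```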

Shallow vs deep
Shallow learning uses only 1 or 2 layers, giving limited ability to combine simple features into more sophisticated ones (e.g., the concept of an eye). Deep learning allows a model to learn all layers of a representation jointly rather than sequentially, which works better than stacking shallow models. Two key points: increasingly complex representations are developed layer by layer, and these representations are learned jointly.
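
As a sketch of what "learned jointly" means in code (a hypothetical example, assuming TensorFlow/Keras is available): all layers below are trained against a single loss with a single optimizer, so gradients flow through the whole stack at once.

```python
import tensorflow as tf

# A small deep stack: each Dense layer builds on the features of the
# previous one, and all weights are updated jointly during training.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
# One loss, one optimizer: no layer is trained in isolation.
model.compile(optimizer="sgd", loss="sparse_categorical_crossentropy")
```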

NN depth
Deep learning uses multiple layers of processing. This promotes reuse of rules across multiple inputs – i.e., identifying common features rather than global properties – and promotes increasing levels of abstraction with depth.

NN depth
Example activations in a simple NN: layers farther from the input are more abstract.

NN overview
Simple layers apply a linear transformation (Ax + b) followed by an activation function such as ReLU. The weights (parameters) A and b determine the layer's behavior, so we need to learn them.
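
A minimal NumPy sketch of one such layer (the shapes and values here are illustrative assumptions, not part of the slides):

```python
import numpy as np

def relu(z):
    # ReLU activation: elementwise max(0, z)
    return np.maximum(0.0, z)

def dense_layer(x, A, b):
    # One layer: linear transformation Ax + b, then the activation
    return relu(A @ x + b)

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 3))   # weights: 3 inputs -> 4 outputs
b = np.zeros(4)               # biases
x = rng.normal(size=3)        # one input vector
print(dense_layer(x, A, b))
```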

NN overview
Compare predictions with training targets using a loss function (or error function), often cross-entropy (CE) for classification or mean squared error (MSE) for regression.
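
Both losses are short formulas; here is a hypothetical NumPy sketch of each (the example targets and predictions are made up for illustration):

```python
import numpy as np

def mse(y_pred, y_true):
    # Mean squared error: average squared difference
    return np.mean((y_pred - y_true) ** 2)

def cross_entropy(p_pred, y_true, eps=1e-12):
    # Cross-entropy for one-hot targets y_true and predicted
    # class probabilities p_pred (each row sums to 1)
    return -np.mean(np.sum(y_true * np.log(p_pred + eps), axis=1))

y_true = np.array([[1.0, 0.0], [0.0, 1.0]])
p_pred = np.array([[0.9, 0.1], [0.2, 0.8]])
print(mse(p_pred, y_true), cross_entropy(p_pred, y_true))
```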

NN overview
Use the loss function with an optimizer to update the weights, e.g., gradient descent.
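
A minimal sketch of the gradient-descent update on a one-parameter least-squares problem (the data and learning rate are illustrative assumptions):

```python
import numpy as np

# loss(w) = mean((w*x - y)^2), so d(loss)/dw = mean(2 * x * (w*x - y))
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x                      # the true weight is 2.0

w, lr = 0.0, 0.05                # initial weight, learning rate
for step in range(100):
    grad = np.mean(2.0 * x * (w * x - y))
    w -= lr * grad               # the gradient-descent update
print(w)                         # converges to ~2.0
```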

Driving forces
Three technical forces are driving advances in machine learning: hardware; datasets and benchmarks; and algorithmic advances (and software platforms). A fourth, less tangible force: the belief that it works.

Neural networks – hype is not new!
The perceptron was developed by Frank Rosenblatt in 1957 at Cornell under a grant from the Office of Naval Research. The New York Times reported the perceptron to be “the embryo of an electronic computer that [the Navy] expects will be able to walk, talk, see, write, reproduce itself and be conscious of its existence.”
https://en.wikipedia.org/wiki/Perceptron
https://www.dexlabanalytics.com/wp-content/uploads/2017/07/The-Timeline-of-Artificial-Intelligence-and-Robotics.png

Neural networks – roadblocks
In 1969, Minsky and Papert of MIT claimed that Rosenblatt’s predictions had been grossly exaggerated. Several booms and busts later, lots of data, faster computers, and good programming infrastructure led to many of the early predictions coming true. What next?
https://qph.fs.quoracdn.net/main-qimg-6c76e47604e22987a0349e2d4c892646

Example code on Google Colaboratory
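
The Colab notebook itself is not reproduced in this transcript. As a stand-in, here is a minimal end-to-end sketch of the kind of example such a notebook might contain, tying together the layers, loss, and optimizer from the slides above (assuming TensorFlow/Keras; the architecture and hyperparameters are illustrative assumptions):

```python
import tensorflow as tf

# Load MNIST and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Flatten the images, then stack dense layers as described above.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Cross-entropy loss with a gradient-based optimizer, as in the slides.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, validation_split=0.1)
print(model.evaluate(x_test, y_test))
```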