Introduction to Deep Learning
Xintao Wu, University of Arkansas
Outline
- Introduction
  - Differential privacy: definition and mechanisms
  - Regression and deep learning
  - Functional mechanism for DP-preserving learning
- Our Work
  - Regression under DP and the model inversion attack
  - DP-preserving auto-encoder
- Conclusion
Regression
Linear Regression
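The regression slides themselves are image-only in this transcript. As a grounding point, here is a minimal NumPy sketch of ordinary least-squares linear regression; the data and variable names are our own toy example, not the deck's:

```python
import numpy as np

# Toy data: y = 2*x + 1 plus noise (illustrative values only).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=50)

# Design matrix with a bias column; solve the least-squares problem.
X = np.column_stack([x, np.ones_like(x)])
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print("slope, intercept:", w)  # close to (2.0, 1.0)
```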
Logistic Regression
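Likewise a hedged sketch of binary logistic regression, trained by gradient descent on the negative log-likelihood (again our own toy setup):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_regression(X, y, lr=0.1, epochs=2000):
    """Binary logistic regression trained with full-batch gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)               # predicted probabilities
        grad = X.T @ (p - y) / len(y)    # gradient of the negative log-likelihood
        w -= lr * grad
    return w

# Tiny separable example: one feature plus a bias column.
X = np.array([[0.0, 1], [1.0, 1], [2.0, 1], [3.0, 1]])
y = np.array([0, 0, 1, 1])
w = logistic_regression(X, y)
print(sigmoid(X @ w))  # probabilities increase monotonically with x
```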
Deep Learning
Machine learning algorithms based on multiple levels of representation/abstraction: good features or representations are learned automatically, rather than relying on human-designed representations or input features. (Neural networks)
Deep Learning
Pixels → 1st layer "edges" → 3rd layer "objects" [Andrew Ng]
Deep Learning Basics
From Quoc V. Le's tutorial on deep learning:
Part I: Nonlinear classifiers and the backpropagation algorithm
Part II: Autoencoders, Convolutional Neural Networks, and Recurrent Neural Networks (skipped)
Illustrative example
Decision function
Stochastic gradient descent algorithm
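The algorithm itself is shown as an image; a minimal sketch of the update loop, here applied to squared-error linear regression (a hypothetical toy problem of our own):

```python
import numpy as np

def sgd(X, y, lr=0.01, epochs=1000, seed=0):
    """Stochastic gradient descent: update the weights after every single
    example instead of averaging the gradient over the whole dataset."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(y)):   # visit examples in random order
            err = X[i] @ w - y[i]           # error on one example
            w -= lr * err * X[i]            # gradient of (1/2)*err^2 w.r.t. w
    return w

X = np.column_stack([np.linspace(0, 1, 20), np.ones(20)])
w = sgd(X, 3.0 * X[:, 0] - 1.0)
print(w)  # approaches [3, -1]
```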
Graphical illustration
Limitation of a linear decision function
Divide and Conquer
Deep neural network
Backpropagation
Multilayer neural network [LeCun, Bengio & Hinton]
Backpropagation [LeCun, Bengio & Hinton]
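To make the chain-rule mechanics concrete, a self-contained sketch of backpropagation for a one-hidden-layer sigmoid network on XOR; this is our own minimal example (squared-error loss, hand-derived gradients), not the deck's figure:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)
sig = lambda z: 1 / (1 + np.exp(-z))

for _ in range(10000):
    # Forward pass.
    h = sig(X @ W1 + b1)
    out = sig(h @ W2 + b2)
    # Backward pass: apply the chain rule layer by layer.
    d_out = (out - y) * out * (1 - out)     # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)      # propagated back through W2
    # Gradient step.
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(0)
    W1 -= 0.5 * X.T @ d_h;  b1 -= 0.5 * d_h.sum(0)

print(out.round(2).ravel())  # should approach [0, 1, 1, 0]
```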
Rectified linear units
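The slide is image-only; for reference, the unit and its (sub)gradient, which is 1 wherever the unit is active and so does not saturate the way a sigmoid does:

```python
import numpy as np

def relu(z):
    """Rectified linear unit: zero for negative inputs, identity otherwise."""
    return np.maximum(0.0, z)

def relu_grad(z):
    """Subgradient: 1 where the unit is active, 0 where it is off."""
    return (z > 0).astype(float)

print(relu(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]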
Misc
Further discussion:
- Deep vs. shallow networks
- Deep networks vs. kernel methods
- History of deep learning
- Why ReLU works better
- Dropout to avoid overfitting
See Part II of Quoc V. Le's tutorial and the NIPS 2015 tutorial by Geoff Hinton, Yoshua Bengio, and Yann LeCun.
[LeCun & Ranzato]
Autoencoders
- Use deep belief networks to pretrain deep networks
- Random initialization vs. unsupervised learning for initial weights
- Restricted Boltzmann machines and autoencoders
Data compression via autoencoders
Idea
Objective function
Network architecture
A linear function maps the 4-D input down to a 2-D code, followed by a nonlinearity.
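Tying the last few slides together, a minimal sketch of this architecture, assuming squared reconstruction error as the objective (the training data and hyperparameters are our own placeholders): a nonlinear encoder maps 4-D inputs to a 2-D code, and a linear decoder maps the code back to 4-D.

```python
import numpy as np

rng = np.random.default_rng(0)
sig = lambda z: 1 / (1 + np.exp(-z))

# Encoder: 4-D -> 2-D code with a sigmoid; decoder: 2-D -> 4-D, linear.
W_enc = rng.normal(0, 0.1, (4, 2)); b_enc = np.zeros(2)
W_dec = rng.normal(0, 0.1, (2, 4)); b_dec = np.zeros(4)

X = rng.uniform(0, 1, (100, 4))            # toy data

for _ in range(5000):
    code = sig(X @ W_enc + b_enc)          # 2-D representation
    recon = code @ W_dec + b_dec           # linear reconstruction
    err = (recon - X) / len(X)             # d(mean squared loss)/d(recon)
    d_code = (err @ W_dec.T) * code * (1 - code)
    W_dec -= 0.5 * code.T @ err; b_dec -= 0.5 * err.sum(0)
    W_enc -= 0.5 * X.T @ d_code; b_enc -= 0.5 * d_code.sum(0)

print(np.mean((recon - X) ** 2))           # much smaller than the initial error
```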
Auto-Encoder
Deep Auto-Encoders for Supervised Learning
[Figure: auto-encoders stacked into a deep auto-encoder for data reconstruction, with a softmax layer on top]
Pretraining: one layer at a time
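A sketch of the greedy layer-wise idea, under our own simplifications (shallow sigmoid autoencoders, squared error, made-up layer sizes): each layer is trained to reconstruct the codes produced by the layer below, then its encoder weights are kept as the initialization of the deep network.

```python
import numpy as np

def train_autoencoder(X, code_dim, lr=0.5, epochs=500, seed=0):
    """Train one shallow autoencoder; return (encoder weights, encoded data)."""
    rng = np.random.default_rng(seed)
    sig = lambda z: 1 / (1 + np.exp(-z))
    W = rng.normal(0, 0.1, (X.shape[1], code_dim))
    V = rng.normal(0, 0.1, (code_dim, X.shape[1]))
    for _ in range(epochs):
        h = sig(X @ W)
        err = (h @ V - X) / len(X)         # mean squared reconstruction error
        d_h = (err @ V.T) * h * (1 - h)
        V -= lr * h.T @ err
        W -= lr * X.T @ d_h
    return W, sig(X @ W)

# Greedy pretraining: train, freeze, encode, repeat on the codes.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (200, 16))
weights, layer_input = [], X
for dim in [8, 4, 2]:
    W, layer_input = train_autoencoder(layer_input, dim, seed=dim)
    weights.append(W)   # stack these as initial weights of the deep network
```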
Convolutional neural network
Very successful in object recognition on ImageNet. Neurons only look at adjacent pixels in the image.
Weight sharing and convolution
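A minimal sketch of what weight sharing means in practice: one small kernel slides over the whole image, so every output position reuses the same few weights (toy image and kernel are ours):

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D convolution as a sliding dot product.
    (Deep-learning frameworks usually skip the kernel flip of
    textbook convolution, as we do here.)"""
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

edge_kernel = np.array([[1.0, -1.0]])      # responds to horizontal intensity changes
image = np.tile([0.0, 0, 1, 1], (4, 1))    # dark left half, bright right half
print(conv2d(image, edge_kernel))          # nonzero only at the boundary column
```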
Max-pooling (subsampling)
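And the corresponding subsampling step, sketched for non-overlapping windows (a common but not the only choice):

```python
import numpy as np

def max_pool(x, size=2):
    """Non-overlapping max-pooling: keep the strongest activation in each
    size x size block, subsampling the feature map."""
    H, W = x.shape
    x = x[:H - H % size, :W - W % size]               # trim to a multiple of size
    blocks = x.reshape(H // size, size, W // size, size)
    return blocks.max(axis=(1, 3))

x = np.arange(16.0).reshape(4, 4)
print(max_pool(x))   # [[ 5.  7.] [13. 15.]]
```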
Invariant to shifts
Brightness invariance: Local Contrast Normalization
- Operates on the outputs of the max-pooling layer
- Subtracts the mean and divides by the standard deviation of the incoming neurons (sketched below)
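A simplified sketch of that normalization over square neighborhoods; real implementations often use Gaussian-weighted windows, and the window size here is an arbitrary placeholder:

```python
import numpy as np

def local_contrast_normalization(x, size=3, eps=1e-5):
    """Subtract the local mean and divide by the local standard deviation
    around each position (square neighborhoods, edges clipped)."""
    H, W = x.shape
    out = np.zeros_like(x)
    r = size // 2
    for i in range(H):
        for j in range(W):
            patch = x[max(0, i - r):i + r + 1, max(0, j - r):j + r + 1]
            out[i, j] = (x[i, j] - patch.mean()) / (patch.std() + eps)
    return out

bright = np.full((4, 4), 10.0); dim = np.full((4, 4), 1.0)
# After normalization, a uniformly bright patch and a dim one look the same.
print(np.allclose(local_contrast_normalization(bright),
                  local_contrast_normalization(dim)))  # True
```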
Backpropagation
Multiple channels
Multiple maps
Recurrent neural networks for sequence prediction
- Handle variable-sized inputs
- The stock price today is likely to be more influenced by yesterday's price than by the price 10 years ago
Recurrent Neural Network
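A minimal sketch of the recurrence (dimensions and initialization are our own placeholders): the same weights are applied at every time step, and the hidden state carries information forward, which is what lets the network accept variable-length input:

```python
import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 5
W_xh = rng.normal(0, 0.1, (input_dim, hidden_dim))
W_hh = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

def rnn_forward(xs):
    """Run h_t = tanh(x_t W_xh + h_{t-1} W_hh + b) over a sequence."""
    h = np.zeros(hidden_dim)
    for x_t in xs:                    # works for any sequence length
        h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
    return h                          # final state summarizes the sequence

sequence = rng.normal(size=(7, input_dim))   # 7 time steps of 3-D inputs
print(rnn_forward(sequence).shape)           # (5,)
```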
Language modeling
Word embedding
Word Embeddings
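The mechanics reduce to a lookup table: each vocabulary word maps to a row of a trainable matrix. A sketch with a made-up vocabulary; the values here are random placeholders, whereas in practice they are learned with the rest of the model:

```python
import numpy as np

vocab = {"the": 0, "stock": 1, "price": 2, "rose": 3}
embed_dim = 4
E = np.random.default_rng(0).normal(0, 0.1, (len(vocab), embed_dim))

def embed(sentence):
    """Turn a list of words into a (length, embed_dim) matrix of vectors."""
    return E[[vocab[w] for w in sentence]]

print(embed(["the", "stock", "price", "rose"]).shape)  # (4, 4)
```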
Long Short-Term Memory Networks
Sigmoidal activation function vs. ReLU
LSTM architecture
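A sketch of one LSTM step in the standard formulation (the packed-weight layout and toy dimensions are our own choices): forget, input, and output gates control a cell state that can carry information across many time steps.

```python
import numpy as np

sig = lambda z: 1 / (1 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM step with all four gate pre-activations computed at once."""
    z = np.concatenate([x, h]) @ W + b
    n = h.size
    f = sig(z[:n])                  # forget gate
    i = sig(z[n:2 * n])             # input gate
    o = sig(z[2 * n:3 * n])         # output gate
    g = np.tanh(z[3 * n:])          # candidate cell update
    c = f * c + i * g               # new cell state
    h = o * np.tanh(c)              # new hidden state
    return h, c

rng = np.random.default_rng(0)
x_dim, h_dim = 3, 4
W = rng.normal(0, 0.1, (x_dim + h_dim, 4 * h_dim))
b = np.zeros(4 * h_dim)
h, c = np.zeros(h_dim), np.zeros(h_dim)
for x in rng.normal(size=(6, x_dim)):    # run over a 6-step sequence
    h, c = lstm_step(x, h, c, W, b)
print(h.shape, c.shape)                  # (4,) (4,)
```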
Sequence output prediction
A dynamic classifier predicts variable-length outputs, in contrast to:
- a scalar for classification and regression
- a fixed-length vector for autoencoders
Sequence output prediction
- Greedy search
- Full search
- Beam search: keep a list of the k best sequences, sorted by joint probability (see the sketch below)
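A toy sketch of beam search over a fixed table of per-step log-probabilities; a real decoder would condition each step's distribution on the prefix generated so far, which this stand-in deliberately omits:

```python
import numpy as np

def beam_search(step_log_probs, k=2):
    """Keep only the k best partial sequences, ranked by joint
    (summed) log-probability, pruning after every step."""
    beams = [((), 0.0)]                              # (sequence, log-prob)
    for log_p in step_log_probs:                     # one distribution per step
        candidates = [(seq + (tok,), score + log_p[tok])
                      for seq, score in beams
                      for tok in range(len(log_p))]
        beams = sorted(candidates, key=lambda b: -b[1])[:k]  # prune to k best
    return beams

rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(4), size=3)            # 3 steps, 4-token vocabulary
print(beam_search(np.log(probs), k=2))               # two best sequences + scores
```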
Attention model
AlphaGo [Silver et al.]