Basics of Deep Learning: No Math Required
Roland Meertens
Machine learning engineer, Autonomous Intelligent Driving
What we will learn
Inspired by the brain
- Neurons signal to other neurons
- When a neuron receives enough input activation, it becomes activated itself
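A rough sketch of that idea, assuming a simple threshold model (no biological detail): an artificial neuron sums its weighted inputs and "fires" once that sum is large enough. The values below are made up for illustration.

    import numpy as np

    def artificial_neuron(inputs, weights, threshold):
        """Fires (returns 1) when the weighted input activation is high enough."""
        total_activation = np.dot(inputs, weights)
        return 1 if total_activation >= threshold else 0

    # Three incoming signals, each with a different influence on this neuron.
    inputs = np.array([0.9, 0.2, 0.7])
    weights = np.array([0.5, 0.1, 0.8])
    print(artificial_neuron(inputs, weights, threshold=1.0))  # enough activation -> fires (1)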
Predicting the price of a house
- Inputs: area of the house, age of the house, distance to the train station
- Higher activation -> higher price
- Weights: the influence of each input neuron on the output neuron
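A minimal sketch of that single output neuron. The weights and input values here are made up, not learned; they only show how each input pushes the price activation up or down.

    import numpy as np

    # Example inputs for one house (made-up, roughly normalised values).
    area = 1.2                 # larger area pushes the price up
    age = 0.8                  # an older house pushes the price down
    distance_to_station = 0.3  # being far from the station pushes the price down

    inputs = np.array([area, age, distance_to_station])
    weights = np.array([0.9, -0.4, -0.6])  # influence of each input on the output neuron

    price_activation = np.dot(inputs, weights)  # higher activation -> higher predicted price
    print(price_activation)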
Predicting the price of a house: hidden layer
- Inputs: area of the house, age of the house, distance to the train station
- Hidden neurons can represent combinations of inputs, e.g. "close to the station AND small"
- Prediction too high or too low? Adjust the weights!
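A hedged sketch of the "adjust the weights" idea for a single linear neuron (this is the delta rule; real deep networks adjust all layers with backpropagation, which the talk keeps math-free):

    import numpy as np

    inputs = np.array([1.2, 0.8, 0.3])   # area, age, distance to station (made-up values)
    weights = np.array([0.1, 0.1, 0.1])  # start with arbitrary weights
    true_price = 0.58                    # the price we should have predicted
    learning_rate = 0.1

    for step in range(50):
        predicted_price = np.dot(inputs, weights)
        error = true_price - predicted_price        # positive: too low, negative: too high
        weights += learning_rate * error * inputs   # nudge each weight in the right direction

    print(np.dot(inputs, weights))  # close to true_price after a few dozen updates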
Activation function
- Turns a neuron's summed input into its output activation (non-linearly)
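Two commonly used activation functions, sketched in plain numpy. The talk does not name specific ones, so sigmoid and ReLU here are just typical examples.

    import numpy as np

    def sigmoid(x):
        """Squashes any summed input into an activation between 0 and 1."""
        return 1.0 / (1.0 + np.exp(-x))

    def relu(x):
        """Passes positive activation through, silences negative activation."""
        return np.maximum(0.0, x)

    summed_inputs = np.array([-2.0, 0.0, 1.5, 4.0])
    print(sigmoid(summed_inputs))  # roughly [0.12 0.5 0.82 0.98]
    print(relu(summed_inputs))     # [0.  0.  1.5 4. ]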
Representations for characters
- Flatten: take the 28x28 pixel images and turn each one into a list of 784 input neurons
- Output: an activation per class, 10 output neurons
- Probably want even more hidden layers, for combinations of combinations of pixels
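A sketch of that fully connected network in Keras, assuming the MNIST digit dataset (28x28 grayscale images, 10 classes); the hidden layer sizes are illustrative, not from the talk.

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),    # 28x28 pixels -> 784 input neurons
        tf.keras.layers.Dense(128, activation="relu"),    # hidden layer: combinations of pixels
        tf.keras.layers.Dense(64, activation="relu"),     # combinations of combinations
        tf.keras.layers.Dense(10, activation="softmax"),  # one activation per class
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    (x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
    model.fit(x_train / 255.0, y_train, epochs=1)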
Problems with this approach
Create a "feature extractor"
- Small detectors for simple features such as a line or an arc
- Slide the same detector over the image: the network gets the chance to learn the same feature at multiple locations
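A rough sketch of that idea in plain numpy: one small 3x3 "feature detector" is slid over the image, so the same weights look for the same pattern at every location (this is what a convolutional layer does; the filter values and the random image are made up).

    import numpy as np

    image = np.random.rand(28, 28)  # a fake 28x28 grayscale image

    # One 3x3 feature detector, e.g. something that responds to a vertical line.
    vertical_line_filter = np.array([[-1.0, 2.0, -1.0],
                                     [-1.0, 2.0, -1.0],
                                     [-1.0, 2.0, -1.0]])

    feature_map = np.zeros((26, 26))  # activation of the feature at each location
    for row in range(26):
        for col in range(26):
            patch = image[row:row + 3, col:col + 3]
            feature_map[row, col] = np.sum(patch * vertical_line_filter)

    print(feature_map.shape)  # (26, 26): the same feature is checked at every location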
Finishing our convolutional network
- Final prediction with a "normal" feedforward (dense) layer: an activation per class
- Adjust the weights with the same approach: "this neuron that predicted this feature should have been more active"
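Putting it together as a Keras sketch (the filter counts and layer sizes are illustrative, not from the talk): convolutional layers extract the features, then a dense layer turns them into one activation per class.

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(16, kernel_size=3, activation="relu",
                               input_shape=(28, 28, 1)),              # simple features: lines, arcs
        tf.keras.layers.MaxPooling2D(pool_size=2),
        tf.keras.layers.Conv2D(32, kernel_size=3, activation="relu"), # combinations of features
        tf.keras.layers.MaxPooling2D(pool_size=2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),                 # "normal" feedforward layer
        tf.keras.layers.Dense(10, activation="softmax"),              # final prediction per class
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()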
What we learned