Applications of Deep Learning and How to Get Started with Implementing Deep Learning
Presentation by: Manaswi
Advisor: Dr. Chinmay Hegde
Application 1: Automatic Colorization of Black and White Images
Application 2: Automatic Machine Translation
Application 3: Automatic Image Caption Generation (demo: https://www.clarifai.com/demo)
All About Building Your First Deep Learning Model

Tutorials:
http://neuralnetworksanddeeplearning.com/index.html – briefer than the book we are studying (implements deep learning on the MNIST dataset with Theano)
https://www.tensorflow.org/get_started/mnist/beginners – implements deep learning on the MNIST dataset with TensorFlow

Packages:
Install Anaconda – contains most of the scientific libraries used in machine learning: NumPy, SciPy, matplotlib, scikit-learn (https://www.continuum.io/downloads)
Install TensorFlow – contains deep learning algorithms and different optimization techniques (https://www.tensorflow.org/install/)

And that's it – you are ready to build your first deep learning model; a first example is sketched below.
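A minimal sketch of that first model, along the lines of the TensorFlow MNIST-for-beginners tutorial linked above (softmax regression with the TensorFlow 1.x API; the batch size, step count, and learning rate follow the tutorial, not any original code from this deck):

import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

# Download MNIST and load it with one-hot labels.
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

# Softmax regression: y = softmax(Wx + b) over the 10 digit classes.
x = tf.placeholder(tf.float32, [None, 784])
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.matmul(x, W) + b
y_ = tf.placeholder(tf.float32, [None, 10])

# Cross-entropy loss, minimized with plain gradient descent.
cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y))
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
        batch_xs, batch_ys = mnist.train.next_batch(100)
        sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})
    # Fraction of test digits classified correctly (~92% for this simple model).
    correct = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
    accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))
    print(sess.run(accuracy, feed_dict={x: mnist.test.images,
                                        y_: mnist.test.labels}))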
For those who want to get started with Python: https://learnpythonthehardway.org is a decent resource.
Datasets
MNIST – large database of handwritten digits
Caltech 101 – pictures of objects belonging to 101 categories
ImageNet – 14,197,122 images of objects belonging to 21,841 categories
[Sample images from MNIST, Caltech-101, and ImageNet]
Forums
Kaggle community – hundreds of datasets to work with and 536,000 fellow members to help you (https://www.kaggle.com/)
http://deeplearning.net/ – datasets, research groups, jobs, tutorials: everything in one place
Results on MNIST dataset: k-NN
Got accuracy up to 85%; according to Yann LeCun, with some tactics (such as one-vs-all classification) accuracy can reach up to 95%.
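A minimal sketch of such a k-NN baseline with scikit-learn (not the exact setup behind the number above; the value of k and the train/test split here are assumptions):

from sklearn.datasets import fetch_openml
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Download MNIST: 70,000 digit images flattened to 784 pixel features.
X, y = fetch_openml('mnist_784', version=1, return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=10000, random_state=0)

# Plain k-NN on raw pixels; k=3 is an assumption, not the setting behind 85%.
# Note: prediction on the full test set is slow, since k-NN compares each
# test image against all 60,000 training images.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)
print("test accuracy:", knn.score(X_test, y_test))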
Results on MNIST dataset: Neural Networks

Quadratic cost function:
1 hidden layer, 30 neurons, 30 epochs, mini-batch size = 10, learning rate = 3 – 95.42% accuracy
1 hidden layer, 100 neurons, 30 epochs, mini-batch size = 10, learning rate = 3 – 96.59% accuracy

Cross-entropy cost function:
1 hidden layer, 30 neurons, 30 epochs, mini-batch size = 10, learning rate = 0.5 – 95.49% accuracy
1 hidden layer, 100 neurons, 30 epochs, mini-batch size = 10, learning rate = 0.5 – 96.82% accuracy
1 hidden layer, 100 neurons, 60 epochs, mini-batch size = 10, learning rate = 0.1, regularization parameter = 5.0 – 98.04% accuracy
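Runs like these can be reproduced with the Python 2 code that accompanies http://neuralnetworksanddeeplearning.com (the mnist_loader, network, and network2 modules from the book's GitHub repository); a sketch, assuming that repository is on the Python path:

import mnist_loader
import network    # chapter-1 network: quadratic cost
import network2   # chapter-3 network: cross-entropy cost + L2 regularization

training_data, validation_data, test_data = mnist_loader.load_data_wrapper()

# Quadratic cost: 1 hidden layer of 30 neurons, 30 epochs, batch size 10, eta = 3.0
net = network.Network([784, 30, 10])
net.SGD(training_data, 30, 10, 3.0, test_data=test_data)

# Cross-entropy with regularization: 100 hidden neurons, 60 epochs,
# eta = 0.1, lambda = 5.0 (the 98.04% run above)
net2 = network2.Network([784, 100, 10], cost=network2.CrossEntropyCost)
net2.SGD(training_data, 60, 10, 0.1, lmbda=5.0,
         evaluation_data=test_data, monitor_evaluation_accuracy=True)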
Results on MNIST dataset: Convolutional Nets
Convolutional layer with 20 feature maps, 60 epochs, mini-batch size = 10, learning rate = 0.1 – 99.06% accuracy
Bottom line: convolutional nets perform better than plain neural nets and k-NN when there is enough data.
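A sketch of this kind of run using the book's Theano-based network3 module (the layer classes and SGD call follow chapter 6 of the book; the 100-neuron fully connected layer is the book's choice and an assumption here, since the slide does not state it):

import network3
from network3 import Network, ConvPoolLayer, FullyConnectedLayer, SoftmaxLayer

training_data, validation_data, test_data = network3.load_data_shared()
mini_batch_size = 10

net = Network([
    # 20 feature maps of 5x5 filters over the 28x28 input, with 2x2 max pooling
    ConvPoolLayer(image_shape=(mini_batch_size, 1, 28, 28),
                  filter_shape=(20, 1, 5, 5),
                  poolsize=(2, 2)),
    FullyConnectedLayer(n_in=20*12*12, n_out=100),
    SoftmaxLayer(n_in=100, n_out=10)], mini_batch_size)

# 60 epochs, learning rate 0.1, as in the run above
net.SGD(training_data, 60, mini_batch_size, 0.1, validation_data, test_data)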
Features learnt by conv nets on the MNIST dataset
Whiter blocks mean a smaller (typically more negative) weight, so the feature map responds less to the corresponding input pixels; darker blocks mean a larger weight, so the feature map responds more to the corresponding input pixels.
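Plots like these can be produced with matplotlib; a minimal sketch, assuming weights is a NumPy array of shape (n_maps, 5, 5) holding the learned first-layer filters (the variable name and shape are assumptions, not the original code):

import numpy as np
import matplotlib.pyplot as plt

def plot_filters(weights):
    # One grayscale image per feature map; with the reversed colormap,
    # darker blocks correspond to larger weights, as described above.
    n = weights.shape[0]
    cols = 5
    rows = int(np.ceil(n / float(cols)))
    fig, axes = plt.subplots(rows, cols)
    for i, ax in enumerate(np.atleast_1d(axes).flat):
        if i < n:
            ax.imshow(weights[i], cmap='gray_r')
        ax.axis('off')
    plt.show()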
GDXray Dataset (a public database of X-ray images used in nondestructive testing research)
Accuracy on GDXray Dataset
Got 96% accuracy with 2 convolutional layers (each with 5x5 filters, 16 feature maps, and 2x2 max pooling), followed by a fully connected layer (100 neurons) and a softmax layer.
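A sketch of the described architecture written with Keras (not the original code; the 32x32 grayscale input size and the 2 output classes are assumptions, since the slide does not state them):

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

model = Sequential([
    # Two conv layers, each: 16 feature maps of 5x5 filters + 2x2 max pooling
    Conv2D(16, (5, 5), activation='relu', input_shape=(32, 32, 1)),
    MaxPooling2D((2, 2)),
    Conv2D(16, (5, 5), activation='relu'),
    MaxPooling2D((2, 2)),
    # Fully connected layer (100 neurons) followed by a softmax output layer
    Flatten(),
    Dense(100, activation='relu'),
    Dense(2, activation='softmax'),
])
model.compile(optimizer='sgd', loss='categorical_crossentropy',
              metrics=['accuracy'])
model.summary()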
Weights learnt by conv nets on the GDXray dataset
Closing thought: "The wonderful and terrifying implications of computers that can learn" (Jeremy Howard, TED talk)