Digit Recognizer: A Kaggle Project by Ryan Bambrough


Digit Recognizer

Problem To Solve
- Given a 28 × 28 pixel image, determine which digit is present
- Data includes 42,000 training samples and 28,000 testing samples
- Used for a multitude of applications:
  - Recognizing addresses on letters
  - Digital ink to text
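In the Kaggle data each image arrives as a flat row of 784 pixel values with the label in front. A minimal sketch of unpacking one such row (the row itself is made up here for illustration):

```python
import numpy as np

# Hypothetical CSV row for illustration: the label comes first,
# followed by 784 pixel intensities (0-255) in row-major order.
row = [7] + [0] * 784

label = row[0]
pixels = np.array(row[1:], dtype=np.float64)

# Reshape the flat 784-value vector back into the 28 x 28 image grid.
image = pixels.reshape(28, 28)
print(label, image.shape)
```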

Approaches
- This is an 'older' neural network problem with a large dataset (MNIST)
- Quite a few ways to solve it, each with its own benefits and complexities
- Many optimizations can be added to squeeze out a little more accuracy
  - Law of diminishing returns

Chosen Approach
- A simple multi-layer network: basic neural network, no bells or whistles
- Sigmoidal activation function
- 784 input nodes (28 × 28)
- 30 to 100 hidden nodes
- 10 output nodes (one for each possible digit)
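A minimal sketch of this architecture in the style of the Nielsen chapter cited in the references, not the exact project code; the random weights and the dummy input are placeholders:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# 784 inputs -> 30 hidden nodes -> 10 outputs (one per digit).
sizes = [784, 30, 10]
weights = [rng.standard_normal((y, x)) for x, y in zip(sizes[:-1], sizes[1:])]
biases = [rng.standard_normal((y, 1)) for y in sizes[1:]]

def feedforward(a):
    # Apply each layer's weights, biases, and sigmoid in turn.
    for w, b in zip(weights, biases):
        a = sigmoid(w @ a + b)
    return a

x = rng.random((784, 1))     # dummy flattened 28 x 28 input
out = feedforward(x)
print(out.shape)             # (10, 1)
print(int(np.argmax(out)))   # index of the strongest output node
```

The predicted digit is simply the index of the output node with the largest activation.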

Problems Encountered
- Setting up a useful Python environment on Windows
  - Had used Python before, just not for neural networks
- Pre-processing the data, while not terribly difficult, took time to ensure accuracy
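The deck does not spell out the pre-processing steps, but a common minimal treatment for sigmoid networks on this data is scaling pixels into [0, 1] and one-hot encoding the label; a sketch under that assumption, with made-up values:

```python
import numpy as np

# Hypothetical raw values for illustration: pixel intensities 0-255
# as read from the CSV, plus the sample's label.
raw_pixels = np.array([0, 64, 128, 255], dtype=np.float64)
label = 3

# Scale intensities into [0, 1]; keeping inputs small stops the
# sigmoid units from starting out saturated.
scaled = raw_pixels / 255.0

# One-hot encode the label so it lines up with the 10 output nodes.
target = np.zeros(10)
target[label] = 1.0

print(scaled.min(), scaled.max(), int(target.argmax()))
```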

Quality The accuracy of the network depended greatly on its composition:
-  30 hidden, eta 2.75, 60 epochs, 20 samples/epoch: ~94.77 %
- 100 hidden, eta 2.75, 60 epochs, 20 samples/epoch: ~86.99 %
-  30 hidden, eta 5.00, 60 epochs, 20 samples/epoch: ~12.59 %
-  30 hidden, eta 5.00, 60 epochs, 50 samples/epoch: ~94.39 %
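One plausible reading of the collapse at eta = 5.00 is that the learning rate was large enough for gradient steps to overshoot. A toy sketch on a one-dimensional quadratic, not the project's network:

```python
# Toy illustration: plain gradient descent on f(w) = w**2, whose
# minimum is at w = 0. A moderate learning rate converges, while too
# large a step overshoots the minimum and the iterates diverge.
def descend(eta, steps=50, w=1.0):
    for _ in range(steps):
        w -= eta * 2 * w   # gradient of w**2 is 2*w
    return w

small = descend(eta=0.1)   # |1 - 2*eta| < 1: steps shrink toward 0
large = descend(eta=1.5)   # |1 - 2*eta| > 1: steps grow without bound
print(abs(small) < 1e-3, abs(large) > 1e3)   # prints "True True"
```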

Conclusions
- For this Kaggle competition it is fairly easy to reach decent accuracy with a simple network
- Even simple networks, given nicely pre-processed data, can perform extremely well and be highly efficient

References
- Professor Hu's lecture slides
- Michael Nielsen, "Using neural nets to recognize handwritten digits", http://neuralnetworksanddeeplearning.com/chap1.html