MNIST Dataset Training with Tensorflow


MNIST Dataset Training with Tensorflow Avinash More

Tensorflow Example of NumPy: NumPy does expensive operations such as matrix multiplication outside Python, using highly efficient code implemented in another language. Even so, there can still be a lot of overhead from switching back to Python after every operation, which is especially bad if you want to run computations on GPUs or in a distributed manner, where there can be a high cost to transferring data. Instead of running a single expensive operation independently from Python, TensorFlow lets us describe a graph of interacting operations that runs entirely outside Python.
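
A minimal sketch of that contrast, assuming the TensorFlow 1.x graph-and-session API used throughout these slides (array sizes and variable names are just illustrative):

    import numpy as np
    import tensorflow as tf

    # NumPy: each operation runs immediately and returns to Python every time.
    a_np = np.random.rand(100, 100)
    b_np = np.random.rand(100, 100)
    c_np = np.matmul(a_np, b_np)

    # TensorFlow: these lines only describe a graph; nothing is computed yet.
    a = tf.constant(a_np)
    b = tf.constant(b_np)
    c = tf.matmul(a, b)

    # The whole graph is executed outside Python in a single call.
    with tf.Session() as sess:
        result = sess.run(c)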

Dataset The MNIST data set is available on Yann LeCun's website. It has 70,000 images in 3 parts: 55,000 data points of training data (mnist.train), 10,000 points of test data (mnist.test), and 5,000 points of validation data (mnist.validation). This split is important because we want to make sure that the model we train generalizes to data it has not seen.
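
As a sketch, the three splits can be loaded with the TF 1.x tutorial helper tensorflow.examples.tutorials.mnist (deprecated in later TensorFlow versions); the "MNIST_data/" directory name is just an example:

    from tensorflow.examples.tutorials.mnist import input_data

    # Downloads the data on first use and returns the three splits.
    mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

    print(mnist.train.num_examples)       # 55000
    print(mnist.test.num_examples)        # 10000
    print(mnist.validation.num_examples)  # 5000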

Data point Each MNIST data point has 2 parts: an image of a handwritten digit (mnist.train.images) and the corresponding label (mnist.train.labels).

MNIST data – image Each image is a 28 * 28 array of numbers. We can flatten this array into a vector of 28 * 28 = 784 pixels. Flattening loses the 2-D structure, but for the softmax model it won't matter. The result is a tensor of dimension [55000, 784].
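
A quick check of the flattened shape, assuming the mnist object loaded above:

    # Each 28 * 28 image is stored as a flat vector of 784 floats.
    print(mnist.train.images.shape)   # (55000, 784)

    # A single image can be reshaped back to 28 x 28, e.g. for display.
    first_image = mnist.train.images[0].reshape(28, 28)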

MNIST data – label Each label is a number between 0 and 9 representing the digit drawn in the image. We encode it as a "one-hot vector": a vector that is 0 in most dimensions and 1 in a single dimension. For example, 3 would be [0,0,0,1,0,0,0,0,0,0]. The dimension of mnist.train.labels is [55000, 10].
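
Again assuming the mnist object was loaded with one_hot=True as above, the encoding can be inspected directly:

    print(mnist.train.labels.shape)   # (55000, 10)

    # One row is a one-hot vector; exactly one of its 10 entries is 1.
    print(mnist.train.labels[0])      # e.g. [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.] for a 3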

Softmax Regression Goal: look at an image and give the probability of it being each digit. For example, the model might look at a picture of a nine and be 80% sure it's a nine, but give a 5% chance to it being an eight (because of the top loop) and a bit of probability to all the other digits because it isn't 100% sure. If you want to assign probabilities to an object being one of several different things, softmax is the thing to do, because softmax gives us a list of values between 0 and 1 that add up to 1.
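
A tiny numeric sketch of what softmax does to a list of raw scores (plain NumPy, scores chosen only for illustration):

    import numpy as np

    scores = np.array([2.0, 1.0, 0.1])
    probs = np.exp(scores) / np.sum(np.exp(scores))

    print(probs)        # approximately [0.659, 0.242, 0.099]
    print(probs.sum())  # 1.0 -- softmax outputs always add up to 1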

TensorFlow terminology Placeholder: a placeholder is simply a variable that we will assign data to at a later date. It allows us to create our operations and build our computation graph without needing the data; in TensorFlow terminology, we then feed data into the graph through these placeholders. Variable: a Variable is a modifiable tensor that lives in TensorFlow's graph of interacting operations. It can be used and even modified by the computation.
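
A sketch of both concepts in the TF 1.x API, using the MNIST shapes from the earlier slides (variable names are illustrative):

    import tensorflow as tf

    # Placeholder: the shape is declared now, data is fed in when the graph runs.
    # None means the first dimension (the batch size) can be of any length.
    x = tf.placeholder(tf.float32, [None, 784])

    # Variables: modifiable tensors that hold the model parameters.
    W = tf.Variable(tf.zeros([784, 10]))
    b = tf.Variable(tf.zeros([10]))

    # The softmax regression model from the previous slide.
    y = tf.nn.softmax(tf.matmul(x, W) + b)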

TensorFlow graph and session TensorFlow separates the definition of computations from their execution. tf.Graph: a graph defines the computation. It doesn't compute anything and doesn't hold any values; it just defines the operations that you specified in your code. tf.Session: a session allows you to execute graphs or parts of graphs. It allocates resources (on one or more machines) for that and holds the actual values of intermediate results and variables.
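
A sketch of this definition/execution split, reusing the placeholder, Variables, and mnist object defined above (TF 1.x API; the batch size of 100 is arbitrary):

    import tensorflow as tf

    graph = tf.Graph()
    with graph.as_default():
        # Definition only: nothing is computed here.
        x = tf.placeholder(tf.float32, [None, 784])
        W = tf.Variable(tf.zeros([784, 10]))
        b = tf.Variable(tf.zeros([10]))
        y = tf.nn.softmax(tf.matmul(x, W) + b)
        init = tf.global_variables_initializer()

    # Execution: the Session allocates resources and holds the variable values.
    with tf.Session(graph=graph) as sess:
        sess.run(init)
        batch_xs, batch_ys = mnist.train.next_batch(100)
        probs = sess.run(y, feed_dict={x: batch_xs})
        print(probs.shape)  # (100, 10)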