Overview of TensorFlow

Overview of TensorFlow
Saba Shah, Machine Learning Engineer @ FAM Global
18/9/2018

Things to do today
What is TensorFlow
Motivation
Features
Execution System
What does TensorFlow bring to the table
Basic Usage
Data Flow Graph
TensorBoard: Visualizing Learning
The new world ahead: where TensorFlow may take you today and tomorrow

What is TensorFlow
TensorFlow is an open-source software library for machine learning across a wide range of perceptual and language-understanding tasks. [1] It was originally developed by the Google Brain team for Google's research and production purposes, and was released under the Apache 2.0 open-source license on November 9, 2015. [1]

Motivation for TensorFlow [3]
DistBelief was Google's 1st-generation system.
DistBelief was scalable (forward and back propagation).
DistBelief was not flexible enough for complex models such as those used in reinforcement learning.
TensorFlow aims to provide both: flexibility for research and scalability.

TensorFlow Features [1]
Deep flexibility
True portability
Connects research and production
Auto-differentiation
Language options
Maximized performance
Extensible: it is easy to define new operations and kernels [3]
A library of operations specialized for neural nets [3]:
High-level operations: convolutions, pooling, softmax, etc.
Standard losses: L1, L2, cross-entropy
Different optimizers: gradient descent, AdaGrad, etc. (see the sketch below)
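As a minimal sketch of how the standard losses, optimizers, and auto-differentiation fit together (the shapes and learning rate here are illustrative, not from the slides):

import tensorflow as tf  # TensorFlow 1.x, the API described in this presentation

x = tf.placeholder(tf.float32, shape=[None, 1])
y = tf.placeholder(tf.float32, shape=[None, 1])
w = tf.Variable(tf.zeros([1, 1]))
b = tf.Variable(tf.zeros([1]))
y_hat = tf.matmul(x, w) + b
loss = tf.reduce_mean(tf.square(y_hat - y))  # a standard L2 loss
# The optimizer uses auto-differentiation to add the gradient ops for us.
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(loss)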

Core TensorFlow Execution System [3]
The core is written in C++.
Different front ends specify and drive computation, e.g. Python and C++.
The Python front end is a natural fit, since Python is already prevalent among machine learning enthusiasts.
Computations can run on whatever devices a user has, which gives us portability.

What does TensorFlow bring to the table [4]
TensorFlow is unique in its ability to perform partial subgraph computation. This specifically allows a neural network to be partitioned, so that distributed training becomes possible. Subgraph computation enables TensorFlow to support model parallelism; data parallelism is supported through the use of stateful nodes in the graph. A sketch of model parallelism via device placement follows.
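A hedged sketch of what partitioning a model across devices can look like; the device names and layer sizes are illustrative assumptions, and the devices available depend on your machine:

import tensorflow as tf

inputs = tf.placeholder(tf.float32, shape=[None, 256])
w1 = tf.Variable(tf.random_normal([256, 128]))
w2 = tf.Variable(tf.random_normal([128, 10]))

with tf.device('/gpu:0'):
    hidden = tf.nn.relu(tf.matmul(inputs, w1))  # first partition of the network
with tf.device('/gpu:1'):
    logits = tf.matmul(hidden, w2)              # second partition, on another device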

What does TensorFlow bring to the table [5]
Theano invented the genre; it is the Ford Motors of compiling code for deep learning. Theano got a lot of things right, and fortunately TensorFlow appears to mostly embrace the Theano way. TensorFlow appears well on its way to emerging as the Tesla Motors of the genre: it offers a better interface and faster compile times. Caffe is a terrific library for training convolutional neural networks, but it is not really in the same category of tools for prototyping and training arbitrary neural networks. Torch appears to have a comparable offering.

Basic Usage [1]
To use TensorFlow you need to understand how TensorFlow:
represents computations as graphs;
executes graphs in the context of Sessions;
represents data as tensors;
maintains state with Variables;
uses feeds and fetches to get data into and out of arbitrary operations.

Overview of the Flow [1]
TensorFlow is a programming system in which you represent computations as graphs. Nodes in the graph are called ops (short for operations). An op takes zero or more Tensors, performs some computation, and produces zero or more Tensors. A Tensor is a typed multi-dimensional array. For example, you can represent a mini-batch of images as a 4-D array of floating-point numbers with dimensions [batch, height, width, channels].
A TensorFlow graph is a description of computations. To compute anything, a graph must be launched in a Session. A Session places the graph ops onto Devices, such as CPUs or GPUs, and provides methods to execute them. These methods return tensors produced by ops as numpy ndarray objects in Python, and as tensorflow::Tensor instances in C and C++.
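For instance, a mini-batch of images as a 4-D tensor (the batch and image sizes here are made-up values for illustration):

import numpy as np
import tensorflow as tf

# 32 images, 28x28 pixels, 3 color channels: [batch, height, width, channels]
images = tf.constant(np.zeros([32, 28, 28, 3], dtype=np.float32))
print(images.shape, images.dtype)  # => (32, 28, 28, 3) <dtype: 'float32'>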

The computation graph [1]
TensorFlow programs are usually structured into a construction phase, which assembles a graph, and an execution phase, which uses a session to execute ops in the graph. For example, it is common to create a graph that represents and trains a neural network in the construction phase, and then repeatedly execute a set of training ops from that graph in the execution phase.
TensorFlow can be used from C, C++, and Python programs. It is presently much easier to use the Python library to assemble graphs, as it provides a large set of helper functions not available in the C and C++ libraries. The session libraries have equivalent functionality across the three languages.
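The two phases in a minimal sketch:

import tensorflow as tf

# Construction phase: assemble the graph (nothing is computed yet).
a = tf.constant(2.0)
b = tf.constant(3.0)
c = a * b

# Execution phase: a session runs the requested ops.
with tf.Session() as sess:
    print(sess.run(c))  # => 6.0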

Building the graph [1]
To build a graph, start with ops that do not need any input (source ops), such as Constant, and pass their output to other ops that do computation. The op constructors in the Python library return objects that stand for the output of the constructed ops; you can pass these to other op constructors to use as inputs. The TensorFlow Python library has a default graph to which op constructors add nodes. The default graph is sufficient for many applications.
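A sketch along the lines of the library documentation's example: two constant source ops feeding a matmul op, all added to the default graph:

import tensorflow as tf

matrix1 = tf.constant([[3., 3.]])      # source op: a 1x2 constant matrix
matrix2 = tf.constant([[2.], [2.]])    # source op: a 2x1 constant matrix
product = tf.matmul(matrix1, matrix2)  # takes the constants' outputs as inputs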

Launching the graph in a session [1]
Launching follows construction. To launch a graph, create a Session object. Without arguments, the session constructor launches the default graph.
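Continuing the matmul sketch from the previous slide:

import tensorflow as tf

matrix1 = tf.constant([[3., 3.]])
matrix2 = tf.constant([[2.], [2.]])
product = tf.matmul(matrix1, matrix2)

# No arguments: the Session launches the default graph.
with tf.Session() as sess:
    print(sess.run(product))  # => [[12.]]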

Interactive Usage [1]
The Python examples in the documentation launch the graph with a Session and use the Session.run() method to execute operations. For ease of use in interactive Python environments such as IPython, you can instead use the InteractiveSession class and the Tensor.eval() and Operation.run() methods. This avoids having to keep a variable holding the session.
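A small interactive sketch (the values are illustrative):

import tensorflow as tf

sess = tf.InteractiveSession()
x = tf.Variable([1.0, 2.0])
a = tf.constant([3.0, 3.0])
x.initializer.run()              # Operation.run() uses the implicit default session
print(tf.subtract(x, a).eval())  # Tensor.eval() likewise; => [-2. -1.]
sess.close()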

Tensors [1]
TensorFlow programs use a tensor data structure to represent all data: only tensors are passed between operations in the computation graph. A TensorFlow tensor is an n-dimensional array or list. A tensor has a static type, a rank, and a shape.
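For example, a rank-2 tensor and its static properties:

import tensorflow as tf

t = tf.constant([[1, 2, 3], [4, 5, 6]])
print(t.dtype)  # static type: <dtype: 'int32'>
print(t.shape)  # shape: (2, 3), i.e. rank 2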

Variables [1]
Variables maintain state across executions of the graph. You typically represent the parameters of a statistical model as a set of Variables. For example, you would store the weights of a neural network as a tensor in a Variable; during training, you update this tensor by running a training graph repeatedly.
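A minimal sketch of state persisting across run() calls, in the spirit of the documentation's counter example:

import tensorflow as tf

state = tf.Variable(0, name="counter")   # stateful: survives across run() calls
update = tf.assign(state, state + 1)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):
        print(sess.run(update))          # => 1, 2, 3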

Fetches [1]
To fetch the outputs of operations, execute the graph with a run() call on the Session object and pass in the tensors to retrieve. All the ops needed to produce the values of the requested tensors are run once (not once per requested tensor).
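A sketch fetching multiple tensors in a single run() call (the documentation uses a similar example):

import tensorflow as tf

input1 = tf.constant([3.0])
input2 = tf.constant([2.0])
input3 = tf.constant([5.0])
intermed = tf.add(input2, input3)
mul = tf.multiply(input1, intermed)

with tf.Session() as sess:
    print(sess.run([mul, intermed]))  # both fetched in one pass: [21.] and [7.]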

Feeds [1]
TensorFlow also provides a feed mechanism for patching a tensor directly into any operation in the graph. A feed temporarily replaces the output of an operation with a tensor value. Supply feed data as an argument to a run() call; the feed is only used for the run call to which it is passed. The most common use case involves designating specific operations as "feed" operations by creating them with tf.placeholder().
Example code:
input1 = tf.placeholder(tf.float32)
input2 = tf.placeholder(tf.float32)
output = tf.multiply(input1, input2)
with tf.Session() as sess:
    print(sess.run([output], feed_dict={input1: [7.], input2: [2.]}))
A placeholder() operation generates an error if you do not supply a feed for it.

What is a Data Flow Graph? [1]
Data flow graphs describe mathematical computation with a directed graph of nodes and edges. Nodes typically implement mathematical operations, but can also represent endpoints to feed in data, push out results, or read/write persistent variables. Edges describe the input/output relationships between nodes. These data edges carry dynamically-sized multidimensional data arrays, or tensors. The flow of tensors through the graph is where TensorFlow gets its name. Nodes are assigned to computational devices and execute asynchronously and in parallel once all the tensors on their incoming edges become available.

Data Flow Graph [1]
Tensors flow between the nodes.
Tensors are n-dimensional arrays.
Some nodes are stateful (e.g. a Variable holding a bias) and some are stateless.
Execution can be distributed: users can specify which device each operation runs on.
ReLU: rectified linear unit (a sketch of the classic ReLU-layer graph follows).
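A sketch of the ReLU-layer graph often used to illustrate data flow (the layer sizes are illustrative):

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 784])  # endpoint where data is fed in
W = tf.Variable(tf.random_normal([784, 100]))      # stateful node: weights
b = tf.Variable(tf.zeros([100]))                   # stateful node: the bias
relu = tf.nn.relu(tf.matmul(x, W) + b)             # stateless ops the tensors flow through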

TensorBoard: Visualizing Learning [1]
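A minimal sketch of wiring summaries into a graph so TensorBoard can plot them; the log directory and the quantity being logged are illustrative assumptions:

import tensorflow as tf

loss = tf.placeholder(tf.float32)
tf.summary.scalar("loss", loss)   # expose a scalar to TensorBoard
merged = tf.summary.merge_all()

with tf.Session() as sess:
    writer = tf.summary.FileWriter("/tmp/tf_logs", sess.graph)  # log dir is illustrative
    for step in range(100):
        summary = sess.run(merged, feed_dict={loss: 1.0 / (step + 1)})
        writer.add_summary(summary, step)
    writer.close()

Then point TensorBoard at the log directory: tensorboard --logdir /tmp/tf_logs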

The New World Ahead
Reinforcement learning
Artificial image generation
Artificially generated text
What are your thoughts on its possible applications?

References
[1] TensorFlow documentation: https://www.tensorflow.org/
[2] Wikipedia, "TensorFlow": https://en.wikipedia.org/wiki/TensorFlow
[3] BayLearn15 Keynote 3: https://www.youtube.com/watch?v=90-S1M7Ny_o
[4] Carlos E. Perez: https://www.quora.com/What-is-unique-about-Tensorflow-from-the-other-existing-Deep-Learning-Libraries
[5] KDnuggets: http://www.kdnuggets.com/2015/12/tensor-flow-terrific-deep-learning-library.html
All materials accessed on 03-08-16.