Overview of TensorFlow


1 Overview of TensorFlow
Saba Shah, Machine Learning, FAM Global, 18/9/2018

2 Things to do today
What is TensorFlow
Motivation
Features
Execution System
What does TensorFlow bring to the table
Basic Usage
Data Flow Graph
TensorBoard: Visualizing Learning
The new world ahead: places you've never been before, where TensorFlow may take you today and tomorrow

3 What is TensorFlow TensorFlow is an open source software library for machine learning in various kinds of perceptual and language understanding tasks.[1] TensorFlow was originally developed by the Google Brain team for Google's research and production purposes and later released under the Apache 2.0 open source license on November 9, 2015.[1]

4 Motivation for TensorFlow [3]
DistBelief – 1st generation system
DistBelief was scalable (forward and back propagation)
DistBelief was not as flexible for complex models like reinforcement learning
Flexibility for research
Scalability

5 TensorFlow Features [1]
Deep flexibility
True portability
Connect research and production
Auto-differentiation
Language options
Maximize performance
Extensible – easy to define new operations and kernels [3]
Library of operations specialized for neural nets [3]
High-level operations – convolutions, pooling, softmax, etc.
Standard losses – L1, L2, cross entropy
Different optimizers – gradient descent, AdaGrad, etc.

6 Core TensorFlow Execution System [3]
The core is written in C++
Different front ends for specifying/driving computation – Python and C++
The Python front end is already prevalent among machine learning enthusiasts
Computations can run on any device a user has, which gives portability

7 What does TensorFlow bring to the table [4]
TensorFlow is unique in its ability to perform partial subgraph computation. This feature allows a neural network to be partitioned so that distributed training becomes possible. Subgraph computation enables TensorFlow to support model parallelism. Data parallelism is supported through the use of stateful nodes in the graph.

8 What does TensorFlow bring to the table [5]
Theano invented the genre; it is the Ford Motors of compiling code for deep learning. Theano got a lot of things right, and fortunately TensorFlow appears to mostly embrace the Theano way. TensorFlow appears well on its way to emerging as the Tesla Motors of the genre: it offers a better interface and faster compile times. Caffe is a terrific library for training convolutional neural networks, but it is not really in the same category of tools for prototyping and training arbitrary neural networks. Torch appears to have a comparable offering.

9 [Figure: see reference [3]]

10 Basic Usage [1] To use TensorFlow you need to understand how TensorFlow:
Represents computations as graphs.
Executes graphs in the context of Sessions.
Represents data as tensors.
Maintains state with Variables.
Uses feeds and fetches to get data into and out of arbitrary operations.

11 Overview of the Flow [1] TensorFlow is a programming system in which you represent computations as graphs. Nodes in the graph are called ops (short for operations). An op takes zero or more Tensors, performs some computation, and produces zero or more Tensors. A Tensor is a typed multi-dimensional array. For example, you can represent a mini-batch of images as a 4-D array of floating point numbers with dimensions [batch, height, width, channels].
A TensorFlow graph is a description of computations. To compute anything, a graph must be launched in a Session. A Session places the graph ops onto Devices, such as CPUs or GPUs, and provides methods to execute them. These methods return tensors produced by ops as numpy ndarray objects in Python, and as tensorflow::Tensor instances in C and C++.
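A minimal sketch of this idea, assuming the TensorFlow 1.x Python API that was current when this deck was written; the image dimensions are invented for illustration:
    import tensorflow as tf

    # A hypothetical mini-batch of RGB images: [batch, height, width, channels].
    # None for the batch dimension lets the batch size vary at run time.
    images = tf.placeholder(tf.float32, shape=[None, 28, 28, 3], name="images")

    # An op that consumes the 4-D tensor and produces another tensor.
    mean_pixel = tf.reduce_mean(images, axis=[1, 2, 3])

    print(images)      # Tensor("images:0", shape=(?, 28, 28, 3), dtype=float32)
    print(mean_pixel)  # a 1-D tensor, one mean value per image in the batch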

12 The computation graph [1]
TensorFlow programs are usually structured into a construction phase, which assembles a graph, and an execution phase, which uses a session to execute ops in the graph. For example, it is common to create a graph to represent and train a neural network in the construction phase, and then repeatedly execute a set of training ops in the graph in the execution phase.
TensorFlow can be used from C, C++, and Python programs. It is presently much easier to use the Python library to assemble graphs, as it provides a large set of helper functions not available in the C and C++ libraries. The session libraries have equivalent functionality for the three languages.

13 Building the graph [1] To build a graph, start with ops that do not need any input (source ops), such as Constant, and pass their output to other ops that do computation. The ops constructors in the Python library return objects that stand for the output of the constructed ops. You can pass these to other ops constructors to use as inputs. The TensorFlow Python library has a default graph to which ops constructors add nodes. The default graph is sufficient for many applications.
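A small sketch of the construction phase, assuming the TensorFlow 1.x Python API; the constant matrices are illustrative values:
    import tensorflow as tf

    # Source ops (no inputs): two constants added to the default graph.
    matrix1 = tf.constant([[3., 3.]])    # 1x2 matrix
    matrix2 = tf.constant([[2.], [2.]])  # 2x1 matrix

    # A matmul op that takes the two constants as inputs; 'product' stands for
    # the op's output but holds no value until the graph is run.
    product = tf.matmul(matrix1, matrix2)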

14 Launching the graph in a session [1]
Launching follows construction. To launch a graph, create a Session object. Without arguments, the session constructor launches the default graph.
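Continuing the sketch from the previous slide (it assumes the product op defined there and the TensorFlow 1.x Session API):
    # Launch the default graph; the 'with' block closes the session automatically.
    with tf.Session() as sess:
        result = sess.run(product)  # runs only the ops needed to produce 'product'
        print(result)               # [[12.]]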

15 Interactive Usage [1] The Python examples in the documentation launch the graph with a Session and use the Session.run() method to execute operations. For ease of use in interactive Python environments, such as IPython, you can instead use the InteractiveSession class and the Tensor.eval() and Operation.run() methods. This avoids having to keep a variable holding the session.
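A brief sketch of interactive usage, assuming the TensorFlow 1.x Python API; the variable and constant values are arbitrary:
    import tensorflow as tf

    sess = tf.InteractiveSession()  # installs itself as the default session

    x = tf.Variable([1.0, 2.0])
    a = tf.constant([3.0, 3.0])

    x.initializer.run()              # Operation.run() uses the default session
    print(tf.subtract(x, a).eval())  # Tensor.eval() likewise; prints [-2. -1.]

    sess.close()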

16 Tensors [1] TensorFlow programs use a tensor data structure to represent all data -- only tensors are passed between operations in the computation graph. A TensorFlow tensor is an n-dimensional array or list. A tensor has a static type, a rank, and a shape.
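A small illustration of type, rank, and shape, assuming the TensorFlow 1.x Python API:
    import tensorflow as tf

    t = tf.constant([[1.0, 2.0, 3.0],
                     [4.0, 5.0, 6.0]])

    print(t.dtype)        # static type: <dtype: 'float32'>
    print(t.shape)        # shape: (2, 3)
    print(t.shape.ndims)  # rank: 2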

17 Variables [1] Variables maintain state across executions of the graph. You typically represent the parameters of a statistical model as a set of Variables. For example, you would store the weights for a neural network as a tensor in a Variable. During training you update this tensor by running a training graph repeatedly.
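A sketch of a stateful counter, assuming the TensorFlow 1.x Python API (tf.Variable, tf.assign_add, tf.global_variables_initializer):
    import tensorflow as tf

    # A Variable holds state (here a scalar counter) across run() calls.
    state = tf.Variable(0, name="counter")
    update = tf.assign_add(state, 1)           # op that increments the counter

    init = tf.global_variables_initializer()   # Variables must be initialized first

    with tf.Session() as sess:
        sess.run(init)
        for _ in range(3):
            print(sess.run(update))            # prints 1, 2, 3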

18 Fetches [1] To fetch the outputs of operations, execute the graph with a run() call on the Session object and pass in the tensors to retrieve. All the ops needed to produce the values of the requested tensors are run once (not once per requested tensor).
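A sketch of fetching several tensors in one run() call, assuming the TensorFlow 1.x Python API; the constants are illustrative:
    import tensorflow as tf

    input1 = tf.constant([3.0])
    input2 = tf.constant([2.0])
    input3 = tf.constant([5.0])
    intermed = tf.add(input2, input3)
    mul = tf.multiply(input1, intermed)

    with tf.Session() as sess:
        # Both tensors are fetched in a single run; the shared ops execute once.
        result = sess.run([mul, intermed])
        print(result)  # [array([21.], dtype=float32), array([7.], dtype=float32)]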

19 Feeds [1] TensorFlow also provides a feed mechanism for patching a tensor directly into any operation in the graph. A feed temporarily replaces the output of an operation with a tensor value. Supply feed data as an argument to a run() call. The feed is only used for the run call to which it is passed. The most common use case involves designating specific operations to be "feed" operations by using tf.placeholder() to create them.
Example code (the placeholder and output definitions are filled in so the snippet runs):
    input1 = tf.placeholder(tf.float32)
    input2 = tf.placeholder(tf.float32)
    output = tf.multiply(input1, input2)
    with tf.Session() as sess:
        print(sess.run([output], feed_dict={input1: [7.], input2: [2.]}))  # [array([14.], dtype=float32)]
A placeholder() operation generates an error if you do not supply a feed for it.

20 What is a Data Flow Graph? [1]
Data flow graphs describe mathematical computation with a directed graph of nodes and edges. Nodes typically implement mathematical operations, but can also represent endpoints to feed in data, push out results, or read/write persistent variables. Edges describe the input/output relationships between nodes. These data edges carry dynamically-sized multidimensional data arrays, or tensors. The flow of tensors through the graph is where TensorFlow gets its name. Nodes are assigned to computational devices and execute asynchronously and in parallel once all the tensors on their incoming edges become available.

21 Data Flow Graph [1]
Tensors flow around the nodes.
Tensors are n-dimensional arrays.
Some nodes are stateful and some are stateless; Variables (for example, biases) are stateful.
Execution is distributed.
Users can specify which operations should run on which device (see the sketch below).
ReLU – rectified linear units
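A sketch of explicit device placement, assuming the TensorFlow 1.x Python API and that a GPU named "/gpu:0" is available; allow_soft_placement lets TensorFlow fall back to another device if it is not:
    import tensorflow as tf

    # Pin the matrix multiply to a specific device ("/cpu:0" works on any machine).
    with tf.device("/gpu:0"):
        a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
        b = tf.constant([[1.0, 1.0], [0.0, 1.0]])
        c = tf.matmul(a, b)

    with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as sess:
        print(sess.run(c))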

22 TensorBoard: Visualizing Learning [1]
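A sketch of logging data for TensorBoard, assuming the TensorFlow 1.x summary API; the log directory /tmp/tf_logs and the tracked scalar are invented for illustration:
    import tensorflow as tf

    loss = tf.placeholder(tf.float32, name="loss")
    loss_summary = tf.summary.scalar("loss", loss)

    with tf.Session() as sess:
        # FileWriter serializes the graph and summaries for TensorBoard to display.
        writer = tf.summary.FileWriter("/tmp/tf_logs", sess.graph)
        for step in range(3):
            summary = sess.run(loss_summary, feed_dict={loss: 1.0 / (step + 1)})
            writer.add_summary(summary, step)
        writer.close()

    # Then run: tensorboard --logdir /tmp/tf_logs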


24 The New World Ahead
Reinforcement learning
Artificial image building
Artificially created pieces of text
What are your thoughts on its possible applications?

25 References
[1] https://www.tensorflow.org/
[2] Wikipedia
[3] BayLearn15-Keynote3
[4] Carlos E. Perez
[5] –
All materials accessed on 18/9/2018.

