Torch 02/27/2018 Hyeri Kim Good afternoon, everyone. I'm Hyeri. Today, I'm going to talk about Torch.

Contents Introduction to Torch Strong points of Torch Core features Torch structures PyTorch structures Comparison of Torch and PyTorch Conclusion My presentation is organized in this order.

Introduction to Torch An open source machine learning library and a scientific computing framework based on the Lua programming language (similar to JavaScript), with strong CPU and CUDA backends. Goal: to have maximum flexibility and speed in building your scientific algorithms while making the process extremely simple. Original authors: Ronan Collobert, Koray Kavukcuoglu, Clément Farabet. Initial release: October 2002 (NYU/Facebook). Operating systems: Linux, Android, Mac OS X, iOS. Applications: it is used by the Facebook AI Research group, IBM, Yandex, and the Idiap Research Institute. Torch is an open source machine learning library and a scientific computing framework.

Strong points of Torch It’s simple to use, while having maximum flexibility in implementing complex neural network topologies. Efficient Tensor library with an efficient CUDA backend Good community and industry support(several hundred community) built and maintained packages We can make arbitrary graphs of neural networks, and parallelize them over CPUs and GPUs in an efficient manner. It has tons of built-in modules and loss functions

❖ CPU vs GPU CPU (Central Processing Unit) vs. GPU (Graphics Processing Unit). * Extracted image from cs231n, Spring 2017: Lecture 8

Core features A powerful N-dimensional array. Lots of routines for indexing, slicing, and transposing. Amazing interface to C, via LuaJIT. Neural network and energy-based models. Numeric optimization routines. Fast and efficient GPU support. Embeddable, with ports to iOS and Android backends.
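The Lua code for these routines is not in the transcript; below is a minimal sketch of the indexing, slicing, and transposing routines using the PyTorch tensor API, which inherits Torch's tensor library. Treat it as a present-day analogue, not the original Lua syntax.

    import torch

    t = torch.arange(24).reshape(4, 6)  # a small N-dimensional array

    row = t[1]          # indexing: the second row
    block = t[:2, 3:]   # slicing: first two rows, last three columns
    tt = t.t()          # transposing: shape becomes (6, 4)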

Torch structures [1] Tensors: a tensor is the general name for multi-way array data. It can change data type easily, like numpy, and unlike numpy it is easy to run on the GPU. [2] Modules: modules are classes written in Lua using tensor methods; they are easy to read and write. Lots of packages: nn lets you easily build and train neural networks; cunn makes running on the GPU easy; optim implements different update rules to optimize the algorithm (see the sketch below). Torch has two levels: Tensor and Module. Forward and backward passes are written in Lua using Tensor methods, and there are tons of built-in modules and loss functions.
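The packages themselves are Lua, but their direct descendants in PyTorch keep the same names and division of labor. A hedged sketch of the nn + optim pattern, where the layer sizes and learning rate are illustrative assumptions:

    import torch
    from torch import nn, optim
    import torch.nn.functional as F

    model = nn.Linear(4, 2)                      # an nn-style module
    opt = optim.SGD(model.parameters(), lr=0.1)  # an optim-style update rule

    x, y = torch.randn(8, 4), torch.randn(8, 2)
    loss = F.mse_loss(model(x), y)  # forward pass through the module
    opt.zero_grad()                 # clear old gradients
    loss.backward()                 # backward pass
    opt.step()                      # optim applies the update rule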

❖ What is a tensor? Tensor is the general name for multi-way array data. Defined mathematically, tensors are simply arrays of numbers, or functions, that transform according to certain rules under a change of coordinates. (Illustrated on the slide as a vector, a matrix, and a cube.)
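To make "multi-way array data" concrete, a small sketch in PyTorch, used here as a stand-in for the Lua Torch tensor API; the shapes are arbitrary:

    import torch

    v = torch.randn(3)        # 1-way: vector
    m = torch.randn(3, 4)     # 2-way: matrix
    c = torch.randn(3, 4, 5)  # 3-way: "cube"

    m_double = m.double()     # changing data type is as easy as in numpy

    if torch.cuda.is_available():
        m_gpu = m.cuda()      # unlike numpy, moving to the GPU is one call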

Torch structures (cont.) The same two-level structure, with two additions: Torch is the direct ancestor of PyTorch, and it is written in Lua, not Python.

ex) nn modules They let you easily build and train neural networks: in the forward pass you can easily compute scores and losses, and in the backward pass you can compute gradients. This is an example of the module called "nn"; it lets you easily build and train neural networks. * Extracted image from cs231n, Winter 2016: Lecture 12
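The slide's Lua code is only an image; here is a rough PyTorch analogue of the same forward/backward pattern. The layer sizes, loss function, and batch are assumptions for illustration:

    import torch
    from torch import nn

    model = nn.Sequential(              # build a network from nn modules
        nn.Linear(784, 100),
        nn.ReLU(),
        nn.Linear(100, 10),
    )
    loss_fn = nn.CrossEntropyLoss()

    x = torch.randn(32, 784)            # a fake batch of inputs
    y = torch.randint(0, 10, (32,))     # fake class labels

    scores = model(x)                   # forward pass: compute scores
    loss = loss_fn(scores, y)           # ... and the loss
    loss.backward()                     # backward pass: compute gradients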

PyTorch: Three levels of abstraction [1] Tensor: an imperative N-dimensional array, like numpy, but it runs on the GPU unlike numpy. [2] Variable: a node in a computational graph; it stores data and gradient. [3] Module: a neural network layer; it may store state or learnable weights. Torch is the direct ancestor of PyTorch: Torch is based on Lua, but PyTorch is based on Python. These days many people use Python, so PyTorch is getting more popular.

PyTorch: Three levels of abstraction (cont.) To run on the GPU: dtype = torch.cuda.FloatTensor. * Extracted image from cs231n, Spring 2017: Lecture 8
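A minimal sketch of that idiom, with a CPU fallback added so it runs without CUDA (since PyTorch 0.4 you would pass device='cuda' instead, but the slide predates that):

    import torch

    if torch.cuda.is_available():
        dtype = torch.cuda.FloatTensor  # the slide's pre-0.4 idiom
    else:
        dtype = torch.FloatTensor       # CPU fallback so the sketch still runs

    x = torch.randn(64, 1000).type(dtype)  # tensor now lives on the chosen device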

PyTorch: Three levels of abstraction (cont.) A PyTorch Variable is a node in a computational graph. For example, in this code, x.data is a Tensor, x.grad is a Variable of gradients, and x.grad.data is a Tensor of gradients. We do not want gradients of the loss with respect to the data; we want gradients with respect to the weights: compute the gradient of the loss with respect to w1 and w2 (zeroing out the gradients first), then make a gradient step on the weights. * Extracted image from cs231n, Spring 2017: Lecture 8
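The image's code is not in the transcript; below is a hedged reconstruction of that training-loop pattern in the pre-0.4 Variable style the slide uses. The shapes, learning rate, and iteration count are assumptions:

    import torch
    from torch.autograd import Variable

    N, D_in, H, D_out = 64, 1000, 100, 10

    # data needs no gradients; weights do
    x = Variable(torch.randn(N, D_in), requires_grad=False)
    y = Variable(torch.randn(N, D_out), requires_grad=False)
    w1 = Variable(torch.randn(D_in, H), requires_grad=True)
    w2 = Variable(torch.randn(H, D_out), requires_grad=True)

    learning_rate = 1e-6
    for t in range(500):
        y_pred = x.mm(w1).clamp(min=0).mm(w2)    # forward pass
        loss = (y_pred - y).pow(2).sum()         # scalar loss

        if w1.grad is not None:                  # zero out grads first
            w1.grad.data.zero_()
            w2.grad.data.zero_()
        loss.backward()                          # compute dloss/dw1, dloss/dw2

        w1.data -= learning_rate * w1.grad.data  # gradient step on weights
        w2.data -= learning_rate * w2.grad.data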

❖ Compare to numpy (computational graphs) The slide shows side-by-side numpy and PyTorch code for the same computational graph: define the variables, run the forward pass, and compute the gradients. * Extracted image from cs231n, Spring 2017: Lecture 8
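The side-by-side code lives in the image; the sketch below reconstructs the kind of tiny graph cs231n uses there, c = sum(x*y + z), with hand-written gradients in numpy and autograd in PyTorch. The exact expression is an assumption:

    import numpy as np
    import torch

    N, D = 3, 4

    # numpy: forward pass, then gradients written by hand
    x = np.random.randn(N, D)
    y = np.random.randn(N, D)
    z = np.random.randn(N, D)
    a = x * y
    b = a + z
    c = np.sum(b)
    grad_c = 1.0
    grad_b = grad_c * np.ones((N, D))  # dc/db
    grad_x = grad_b * y                # chain rule through a = x * y
    grad_y = grad_b * x
    grad_z = grad_b.copy()             # db/dz = 1

    # PyTorch: the same graph, gradients via autograd
    xt = torch.randn(N, D, requires_grad=True)
    yt = torch.randn(N, D, requires_grad=True)
    zt = torch.randn(N, D, requires_grad=True)
    ct = (xt * yt + zt).sum()          # forward pass
    ct.backward()                      # xt.grad, yt.grad, zt.grad now filled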

❖ Compare to numpy: run on GPU. The only change from the previous slide is how the variables are defined: x = Variable(torch.randn(N, D).cuda(), requires_grad=True); y = Variable(torch.randn(N, D).cuda(), requires_grad=True); z = Variable(torch.randn(N, D).cuda(), requires_grad=True). The forward pass and gradient computation are unchanged. * Extracted image from cs231n, Spring 2017: Lecture 8

Module example Define new modules: the initializer sets up two children, and the forward pass is defined using the child modules and autograd operations on Variables. There is no need to define backward; autograd will handle it. In PyTorch, we can also define our own autograd functions by writing forward and backward for Tensors. A sketch follows below. * Extracted image from cs231n, Spring 2017: Lecture 8
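A sketch of that pattern; the network shape is an assumption:

    import torch
    from torch import nn

    class TwoLayerNet(nn.Module):
        def __init__(self, D_in, H, D_out):
            super().__init__()
            self.linear1 = nn.Linear(D_in, H)   # initializer sets up
            self.linear2 = nn.Linear(H, D_out)  # two child modules

        def forward(self, x):
            # forward pass uses child modules and autograd ops;
            # no backward method is needed: autograd derives it
            h = self.linear1(x).clamp(min=0)
            return self.linear2(h)

    model = TwoLayerNet(64, 100, 10)
    y_pred = model(torch.randn(8, 64))          # usage: just call the module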

Torch vs PyTorch

                Torch                   PyTorch
    Language    Lua                     Python
    Autograd    no autograd             autograd
    Maturity    more stable             newer, still changing
    Ecosystem   lots of existing code   less existing code
    Speed       fast                    fast

Torch is a first-generation deep learning framework (Torch: NYU/Facebook, academia; PyTorch: Facebook, industry). Usually the first generation of a framework was developed in academia, while the next generation originated in industry; now industry is giving us powerful frameworks to work with. Torch is in Lua, and learning Lua is a bit of a turn-off for some people; it is older, so it is more stable, meaning less susceptible to bugs, and there is more example code for Torch. PyTorch is in Python and has autograd, which makes it a lot simpler to write complex models; in Lua Torch you end up writing a lot of your own back-propagation code. But PyTorch is newer, there is less existing code, and it is still subject to change. Probably use PyTorch for a new project!

Conclusion Torch is an open source machine learning library and a scientific computing framework based on the Lua programming language. It provides a wide range of algorithms (modules) for deep machine learning. Its tensors can change data type as easily as numpy arrays, and unlike numpy it is easy to run on the GPU. PyTorch is the next generation of Torch, based on Python; it has three levels of abstraction: Tensor, Variable, and Module.

References CS231n, Winter 2016: Lecture 12. CS231n, Spring 2017: Lecture 8. http://torch.ch https://en.wikipedia.org/wiki/Torch_(machine_learning)

Thank you. Do you have any questions?