
1 MXNet Internals
Cyrus M. Vahid, Principal Solutions Architect, AWS Deep Learning, June 2017

2 Computational Dependency
3 ๐‘ข ๐‘ก + x 2 ๐‘ง ๐œ† k x x x a y b 1 1 ๐‘ง=๐‘ฅโ‹…๐‘ฆ ๐‘˜=๐‘Žโ‹…๐‘ ๐‘ก=๐œ†๐‘ง+๐‘˜

3 Execution Dependency in Matrix Operations
$$\begin{pmatrix} x_{1,1} & \cdots & x_{1,n} \\ \vdots & \ddots & \vdots \\ x_{n,1} & \cdots & x_{n,n} \end{pmatrix} \begin{pmatrix} y_{1,1} & \cdots & y_{1,n} \\ \vdots & \ddots & \vdots \\ y_{n,1} & \cdots & y_{n,n} \end{pmatrix} = \Big( \sum_{k} x_{i,k}\, y_{k,j} \Big)_{i,j}$$

4 MXNet Architecture

5 Modules and Components
- Runtime Dependency Engine: Schedules and executes the operations according to their read/write dependencies.
- Storage Allocator: Efficiently allocates and recycles memory blocks on the host (CPU) and devices (GPUs).
- Resource Manager: Manages global resources, such as the random number generator and temporal space.
- NDArray: Dynamic, asynchronous n-dimensional arrays, which provide flexible imperative programs for MXNet.
- Symbolic Execution: Static symbolic graph executor, which provides efficient symbolic graph execution and optimization.
- Operator: Operators that define static forward and gradient calculation (backprop).
- SimpleOp: Operators that extend NDArray operators and symbolic operators in a unified fashion.
- Symbol Construction: Symbolic construction, which provides a way to construct a computation graph (net configuration).
- KVStore: Key-value store interface for efficient parameter synchronization.
- Data Loading (IO): Efficient distributed data loading and augmentation.

6 MXNet Basics
- NDArray: Manipulate multi-dimensional arrays in an imperative (command-style) paradigm.
- Symbol: Symbolic expressions for neural networks (declarative).
- Module: Intermediate- and high-level interface for neural network training and inference.
- Loading Data: Feeding data into training/inference programs.
- Mixed Programming: Training algorithms developed using NDArrays in concert with Symbols.

7 NDArray
The intention is to replicate numpy's API, optimized for the GPU. It provides matrix operations. API docs are located here. Tutorials are here.
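A minimal sketch of the imperative style (shapes and values are arbitrary):

```python
import mxnet as mx

# Imperative, numpy-like calls; each line executes eagerly (asynchronously).
a = mx.nd.ones((2, 3))               # 2x3 array of ones on the CPU
b = mx.nd.full((2, 3), 2.0)          # 2x3 array filled with 2
c = a * b + 1                        # elementwise multiply-add
print(c.asnumpy())                   # copy to numpy: [[3. 3. 3.] [3. 3. 3.]]

# The same array placed on a GPU, if one is available:
# g = mx.nd.ones((2, 3), ctx=mx.gpu(0))
```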

8 Symbols
A symbol represents a multi-output symbolic expression. Symbols are composed of operators, such as simple matrix operations or neural network layers. An operator can take several input variables, produce more than one output variable, and have internal state variables. A variable can be either free, to be bound to a value later, or the output of another symbol. Tutorial is here.
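A small sketch of composing symbols into a network (the layer sizes are arbitrary; this net is reused in the module slides below):

```python
import mxnet as mx

# Free variables; values are bound later.
data = mx.sym.Variable('data')
fc1  = mx.sym.FullyConnected(data=data, name='fc1', num_hidden=128)
act1 = mx.sym.Activation(data=fc1, name='relu1', act_type='relu')
fc2  = mx.sym.FullyConnected(data=act1, name='fc2', num_hidden=10)
out  = mx.sym.SoftmaxOutput(data=fc2, name='softmax')

# All free variables of the composed expression, including the
# implicitly created fc1_weight, fc1_bias, etc.
print(out.list_arguments())
```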

9 Symbols vs NDArray
Both are made to deliver multi-dimensional array operators.

| Symbol | NDArray |
| --- | --- |
| Declarative | Imperative |
| Hard to debug | Easy to debug |
| Complicated to work with | Easy to work with |
| ANN-related plus tensor operations | Provides tensor operations |
| Automatic differentiation | No pre-defined differentiation |
| Easy to build complex computations | Must be developed by hand |
| Easy to save, load, and visualize | |
| Back-end optimization | No back-end optimization |

10 Modules
Commonly used code for training and inference is modularized in the module API. We first create a network using the symbol API, then create a module by passing in the symbol, context, data variable names, and label variable names. Then, using module.fit, we can train a model as in the tutorial (sketched below).
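A sketch following the tutorial's pattern; out is the symbol built on the Symbols slide, and train_iter/val_iter are assumed to be data iterators (see the data-loading slides):

```python
import mxnet as mx

# Wrap the symbol in a module, naming its data and label variables.
mod = mx.mod.Module(symbol=out,
                    context=mx.cpu(),
                    data_names=['data'],
                    label_names=['softmax_label'])

# fit() binds the symbol, initializes parameters, and runs the
# training loop over the iterator.
mod.fit(train_iter,
        eval_data=val_iter,
        optimizer='sgd',
        optimizer_params={'learning_rate': 0.1},
        eval_metric='acc',
        num_epoch=8)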

11 Modules and Checkpoints
After each epoch, minibatch, or evaluation, we can save the state of training to a checkpoint using callbacks. This lets us stop training once a model stops converging and simply pick the outcome of the best epoch. We can load the model back from a saved checkpoint using load_checkpoint.
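A sketch of checkpointing with callbacks, assuming the mod and train_iter from the previous slide; the prefix 'mynet' is illustrative:

```python
import mxnet as mx

# Save the symbol and parameters at the end of every epoch
# under files named with the prefix 'mynet'.
checkpoint = mx.callback.do_checkpoint('mynet')
mod.fit(train_iter, num_epoch=8, epoch_end_callback=checkpoint)

# Later: restore the symbol and parameters from, e.g., epoch 3.
sym, arg_params, aux_params = mx.model.load_checkpoint('mynet', 3)
mod = mx.mod.Module(symbol=sym)
mod.bind(data_shapes=train_iter.provide_data,
         label_shapes=train_iter.provide_label)
mod.set_params(arg_params, aux_params)
```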

12 Loading Data with Data Iterators
Training and inference modules in MXNet accept data iterators, which simplify this procedure, especially when reading large datasets from the filesystem. A data iterator reads data batch by batch and is used to load data into symbols. MXNet data iterators are similar to Python iterators: each call to next returns a batch of data, and a StopIteration exception is raised when next is called at the end of the data.
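A minimal sketch using io.NDArrayIter over random in-memory data (the shapes are arbitrary):

```python
import numpy as np
import mxnet as mx

data = np.random.rand(100, 3)                 # 100 examples, 3 features
label = np.random.randint(0, 2, (100,))
data_iter = mx.io.NDArrayIter(data=data, label=label, batch_size=25)

batch = data_iter.next()        # one DataBatch per next() call
print(batch.data[0].shape)      # (25, 3)
data_iter.reset()               # rewind to read from the beginning again
```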

13 Data Batch
Iterators operate on data batches.
- data is a list of NDArray, each of which has the batch size n as the length of its first dimension. Example: a batch of RGB images of size 224 x 224 has array shape (n, 3, 224, 224).
- label is a list of NDArray, each of which is often 1-dimensional with shape (n,).
- pad is an integer showing how many examples are merely used for padding and should be ignored in the results.
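A short sketch of pad in action: with 100 examples and a batch size of 30, the last batch carries only 10 real examples and reports pad = 20:

```python
import numpy as np
import mxnet as mx

data = np.random.rand(100, 3)
label = np.random.randint(0, 2, (100,))
data_iter = mx.io.NDArrayIter(data=data, label=label, batch_size=30)

for batch in data_iter:
    # data[0] keeps the full (30, 3) batch shape; pad tells us how many
    # trailing examples are filler and should be ignored.
    print(batch.data[0].shape, batch.label[0].shape, 'pad =', batch.pad)
# (30, 3) (30,) pad = 0   -- first three batches
# (30, 3) (30,) pad = 20  -- last batch has only 10 real examples
```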

14 MXNet Data Iterators

| Iterator | Description |
| --- | --- |
| io.NDArrayIter | Iterates over either mx.nd.NDArray or numpy.ndarray. |
| io.CSVIter | Iterates over CSV files. |
| io.ImageRecordIter | Iterates over image RecordIO files. |
| io.ImageRecordUInt8Iter | Creates an iterator for a dataset packed in RecordIO. |
| io.MNISTIter | Iterates over the MNIST dataset. |
| recordio.MXRecordIO | Reads/writes RecordIO-format data. |
| recordio.MXIndexedRecordIO | Reads/writes RecordIO-format data, supporting random access. |
| image.ImageIter | Image data iterator with a large number of augmentation choices. |

15 Building a Data Iterator
An iterator should:
- Return a data batch, or raise a StopIteration exception upon reaching the end.
- Have a reset() method to restart reading from the beginning.
- Have provide_data and provide_label attributes. provide_data returns a list of (str, tuple) pairs storing each data variable's name and shape; provide_label does the same for the input labels.

16 Building a Data Iterator
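A minimal sketch of such an iterator over an in-memory numpy array; the class name SimpleIter and the variable names 'data'/'softmax_label' are illustrative:

```python
import numpy as np
import mxnet as mx

class SimpleIter(mx.io.DataIter):
    """Minimal custom iterator over an in-memory array."""
    def __init__(self, data, label, batch_size):
        super(SimpleIter, self).__init__()
        self.data, self.label = data, label
        self.batch_size = batch_size
        self.cursor = 0
        # (name, shape) pairs describing one batch of data and labels
        self.provide_data = [('data', (batch_size,) + data.shape[1:])]
        self.provide_label = [('softmax_label', (batch_size,))]

    def reset(self):
        self.cursor = 0                  # restart reading from the beginning

    def next(self):
        if self.cursor + self.batch_size > self.data.shape[0]:
            raise StopIteration          # no full batch left
        i, j = self.cursor, self.cursor + self.batch_size
        self.cursor = j
        return mx.io.DataBatch(data=[mx.nd.array(self.data[i:j])],
                               label=[mx.nd.array(self.label[i:j])],
                               pad=0)

it = SimpleIter(np.random.rand(100, 3), np.random.randint(0, 2, (100,)), 25)
```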

17 Imperative vs Symbolic Programming

| Imperative | Symbolic |
| --- | --- |
| Execution flow is the same as the flow of the code. | Abstract functions are defined and compiled first; data binding happens next. |
| Flexible but inefficient. | Efficient. |
| Memory: 4 * 10 * 8 = 320 bytes | Memory: 2 * 10 * 8 = 160 bytes |
| Interim values are available. | Interim values are not available. |
| No operation folding. | Operation folding: multiple operations are folded into one, so we run one op instead of many on the GPU. This is possible because the whole computation graph is available. |
| Familiar coding paradigm. | |
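A sketch contrasting the two styles on a toy computation (the expression is illustrative; the slide's byte counts assume arrays of 10 float64 values):

```python
import mxnet as mx

# Imperative: every line executes, so a, b, c, and d all hold live
# results (4 arrays of 10 values in the slide's count).
a = mx.nd.ones((10,))
b = a + 1
c = b * 2
d = c + 1

# Symbolic: the whole graph is known before execution, so the backend
# can reuse buffers and fold the chained ops into fewer kernels.
A = mx.sym.Variable('A')
D = ((A + 1) * 2) + 1
ex = D.bind(ctx=mx.cpu(), args={'A': mx.nd.ones((10,))})
print(ex.forward()[0].asnumpy())   # ten 5.0 values
```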

18 Mixed Programming
MXNet permits mixing both styles in your code. The Module API abstracts away the need for many of the symbolic operations and provides functions that simply run within your flow.
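A small sketch of mixing the styles: a symbolically declared layer driven imperatively with NDArrays (the shapes and the 0.1 weight fill are arbitrary):

```python
import mxnet as mx

# Declare the network symbolically...
net = mx.sym.FullyConnected(mx.sym.Variable('data'), num_hidden=4)

# ...then drive it imperatively with NDArrays through a bound executor.
ex = net.simple_bind(ctx=mx.cpu(), data=(2, 8))
ex.arg_dict['data'][:] = mx.nd.ones((2, 8))       # imperative writes
ex.arg_dict['fullyconnected0_weight'][:] = 0.1
ex.arg_dict['fullyconnected0_bias'][:] = 0.0
print(ex.forward()[0].asnumpy())                  # each entry: 0.8
```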

19 Cyrus M. Vahid

