Advanced User Guide
Application – Train an MLP model for digit recognition
Dataset – MNIST: 60K training images, 10K test images, 10 labels
The MLP model

myproto.proto
Configuration fields:
– Name, type, source layers, parameters
– Number of units
Insert the user-specific config into singa::LayerProto via a protobuf extension:

  package singa;
  import "job.proto";

  message HiddenProto {
    required int32 num_output = 1;
  }

  extend LayerProto {
    optional HiddenProto hidden_conf = 102;
  }

job.conf
Config the neural net – the hidden layer config is shown below:

  layer {
    name: "hid1"
    user_type: "kHidden"
    srclayers: "mnist"
    [singa.hidden_conf] {
      num_output: 10
    }
    param {
      name: "w1"
      init {
        type: kUniform
        low: -0.05
        high: 0.05
      }
    }
    param {
      name: "b1"
      init {
        type: kUniform
        low: -0.05
        high: 0.05
      }
    }
  }

hidden_layer.h
Declare the hidden layer:

  class HiddenLayer : public NeuronLayer {
   public:
    ~HiddenLayer();
    void Setup(const LayerProto& proto, int npartitions) override;
    void ComputeFeature(int flag, Metric* perf) override;
    void ComputeGradient(int flag, Metric* perf) override;
    const std::vector<Param*> GetParams() const override {
      std::vector<Param*> params{weight_, bias_};
      return params;
    }

   private:
    int batchsize_, vdim_, hdim_;
    Param *weight_, *bias_;
  };

hidden_layer.cc – HiddenLayer::Setup

  void HiddenLayer::Setup(const LayerProto& proto, int npartitions) {
    Layer::Setup(proto, npartitions);
    CHECK_EQ(srclayers_.size(), 1);
    const auto& src = srclayers_[0]->data(this);
    batchsize_ = src.shape()[0];
    vdim_ = src.count() / batchsize_;
    hdim_ = layer_proto_.GetExtension(hidden_conf).num_output();
    data_.Reshape(vector<int>{batchsize_, hdim_});
    grad_.ReshapeLike(data_);
    weight_ = Param::Create(proto.param(0));
    bias_ = Param::Create(proto.param(1));
    weight_->Setup(vector<int>{hdim_, vdim_});
    bias_->Setup(vector<int>{hdim_});
  }

hidden_layer.cc – HiddenLayer::ComputeFeature

  void HiddenLayer::ComputeFeature(int flag, Metric* perf) {
    ...
    data = dot(src, weight.T());
    data += expr::repmat(bias, batchsize_);
    data = expr::F(data);  // element-wise activation
  }

hidden_layer.cc – HiddenLayer::ComputeGradient

  void HiddenLayer::ComputeGradient(int flag, Metric* perf) {
    ...
    grad = expr::F(data) * grad;  // gradient of the activation
    gbias = expr::sum_rows(grad);
    gweight = dot(grad.T(), src);
    if (srclayers_[0]->mutable_grad(this) != nullptr) {
      auto gsrc = NewTensor2(srclayers_[0]->mutable_grad(this));
      gsrc = dot(grad, weight);
    }
  }

main.cc
Register HiddenLayer:

  ...
  #include "hidden_layer.h"
  #include "myproto.pb.h"

  int main(int argc, char **argv) {
    ...
    // users can register new subclasses of layer, updater, etc.
    driver.RegisterLayer("kHidden");
    ...
  }

job.conf

  name: "mlp"
  train_one_batch {
    alg: kBP
  }
  updater {
    type: kSGD
    learning_rate {
      type: kStep
      base_lr:
      step_conf {
        change_freq: 60
        gamma:
      }
    }
  }
  neuralnet { ... }
  cluster { ... }

Compile and run
Compile:
– cp Makefile.example Makefile
– make
Run:
– export LD_LIBRARY_PATH=.libs:$LD_LIBRARY_PATH
– ./bin/singa-run -exec examples/mlp/mlp.bin -conf examples/mlp/job.conf
– ./bin/singa-run -exec examples/mlp/mlp.bin -conf examples/mlp/deep.conf