Mantis: Automatic Performance Prediction for Smartphone Applications Yongin Kwon, Sangmin Lee, Hayoon Yi, Donghyun Kwon, Seungjun Yang, Byung-Gon Chun, Ling Huang, Petros Maniatis, Mayur Naik, Yunheung Paek

Presentation transcript:

Mantis: Automatic Performance Prediction for Smartphone Applications Yongin Kwon, Sangmin Lee, Hayoon Yi, Donghyun Kwon, Seungjun Yang, Byung-Gon Chun, Ling Huang, Petros Maniatis, Mayur Naik, Yunheung Paek USENIX ATC’13

Performance Prediction Problem Predict the execution time of a program on a given input before running it.

Two Kinds of Approaches Most existing techniques can be classified into two broad categories. ◦ Domain-specific programs, automatically extracted features. ◦ General-purpose programs, manually specified features.

Mantis A new framework to automatically predict the performance of general-purpose bytecode programs on given inputs. Four components: ◦ Feature instrumentor ◦ Profiler ◦ Performance model generator ◦ Predictor code generator

Architecture

Feature Instrumentor Instruments the program to collect the values of the features (f1, …, fM) specified by feature schemes. Feature schemes: ◦ Branch counts ◦ Loop counts ◦ Method-call counts ◦ Variable values
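To illustrate the idea, here is a hypothetical Python sketch (Mantis actually instruments Java bytecode, and the counter names here are invented): a loop-count and a branch-count feature are recorded as the program runs.

```python
# Hypothetical sketch of instrumented code: the instrumentor inserts
# counters that record feature values as the program executes.
features = {"loop_count": 0, "branch_true": 0}

def process(items):
    total = 0
    for x in items:
        features["loop_count"] += 1       # loop-count feature
        if x > 0:
            features["branch_true"] += 1  # branch-count feature
            total += x
    return total

process([3, -1, 4, -1, 5])
print(features)  # → {'loop_count': 5, 'branch_true': 3}
```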

Examples

Profiler Outputs a data set of observations. ◦ ti: the i-th observed execution time. ◦ vi: the i-th observed vector of M feature values.
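A minimal sketch of the profiling loop in Python (illustrative assumptions: in Mantis the feature values come from running the instrumented program, whereas here they are computed by separate functions, and `square_sum` is an invented workload):

```python
import time

def square_sum(n):
    # stand-in workload whose running time grows with n
    return sum(i * i for i in range(n))

def profile(program, inputs, feature_fns):
    """Collect (v_i, t_i) pairs: feature vector and execution time per input."""
    data = []
    for inp in inputs:
        v = [f(inp) for f in feature_fns]  # v_i: the M feature values
        start = time.perf_counter()
        program(inp)
        t_i = time.perf_counter() - start  # t_i: observed execution time
        data.append((v, t_i))
    return data

data = profile(square_sum, [1000, 2000, 4000], [lambda n: n])
```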

Performance Modeling Performs a sparse nonlinear regression on the feature values and execution times. Produces a function t ≈ f(fi1, …, fiK). ◦ f approximates the execution time. ◦ {fi1, …, fiK} is a subset of the features {f1, …, fM}. ◦ In practice, K << M.

Performance Modeling (Cont.) However, regression with best subset selection is NP-hard. ◦ Find the subset of size K that gives the smallest Residual Sum of Squares (RSS). ◦ A discrete optimization problem.

SPORE-FoBa Sparse POlynomial REgression with Forward-Backward (FoBa) feature selection. ◦ A feature from the candidate set is added to the model if and only if adding it makes the RSS decrease substantially.  If the drop is greater than ε. ◦ A feature is removed from the active set if deleting it increases the RSS the least.  If the increase is smaller than ε'.
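The forward-backward loop can be sketched as follows (a simplified illustration, not Mantis's implementation; the thresholds, the plain least-squares fit via NumPy, and the stopping rule are all assumptions):

```python
import numpy as np

def rss(X, y):
    """Residual sum of squares of a least-squares fit of y on X (plus intercept)."""
    A = np.column_stack([np.ones(len(y)), X]) if X.shape[1] else np.ones((len(y), 1))
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

def foba(X, y, eps=1e-3, eps_back=5e-4):
    """Greedy forward-backward feature selection (a sketch of the FoBa idea)."""
    n, m = X.shape
    active = []
    while True:
        base = rss(X[:, active], y)
        # Forward step: add the feature with the largest RSS drop, if > eps.
        best_j, best_drop = None, eps
        for j in range(m):
            if j not in active:
                drop = base - rss(X[:, active + [j]], y)
                if drop > best_drop:
                    best_j, best_drop = j, drop
        if best_j is None:
            break
        active.append(best_j)
        # Backward step: drop features whose removal barely increases RSS.
        removed = True
        while removed and len(active) > 1:
            removed = False
            cur = rss(X[:, active], y)
            for j in list(active):
                rest = [k for k in active if k != j]
                if rss(X[:, rest], y) - cur < eps_back:
                    active.remove(j)
                    removed = True
                    break
    return sorted(active)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3 * X[:, 0] + 2 * X[:, 2]  # only features 0 and 2 matter
selected = foba(X, y)
print(selected)  # → [0, 2]
```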

Example Degree-2 polynomial with two selected features x1 and x2. ◦ Expand (1 + x1 + x2)^2 to get the terms 1, x1, x2, x1^2, x1x2, x2^2. ◦ Construct the regression function from these terms: f(x1, x2) = β0 + β1x1 + β2x2 + β3x1^2 + β4x1x2 + β5x2^2.
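The expansion step can be sketched generically (a hypothetical helper, not Mantis's code): enumerate all monomials of the selected features up to the chosen degree, then regress on them.

```python
from itertools import combinations_with_replacement

def poly_terms(xs, degree=2):
    """All monomials of the variables in xs up to `degree`, including the constant 1."""
    terms = []
    for d in range(degree + 1):
        for combo in combinations_with_replacement(range(len(xs)), d):
            prod = 1.0
            for i in combo:
                prod *= xs[i]
            terms.append(prod)
    return terms

# For two variables and degree 2, this yields the six terms
# 1, x1, x2, x1^2, x1*x2, x2^2 evaluated at (x1, x2) = (2, 3).
print(poly_terms([2.0, 3.0]))  # → [1.0, 2.0, 3.0, 4.0, 6.0, 9.0]
```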

Predictor Code Generator Produces a code snippet, called a slice, for each chosen feature. ◦ Slice: an executable sub-program that yields the same value v of a feature at a program point p as the given program does on all inputs. ◦ Automatically evaluates feature values for each input by executing the slices.
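A hypothetical Python sketch of why slices are cheap (Mantis computes slices over Java bytecode; `expensive_transform` is an invented stand-in): the loop-count feature depends only on control flow, so its slice can skip the costly loop body.

```python
def expensive_transform(x):
    # invented stand-in for costly per-iteration work
    return sum(i * x for i in range(1000))

def original(items):
    loop_count = 0  # the chosen feature
    total = 0
    for x in items:
        loop_count += 1
        total += expensive_transform(x)
    return total, loop_count

def slice_loop_count(items):
    # Executable sub-program: yields the same loop_count as the
    # original program on all inputs, without the expensive body.
    loop_count = 0
    for _ in items:
        loop_count += 1
    return loop_count

print(slice_loop_count([3, -1, 4]))  # → 3, matching original([3, -1, 4])[1]
```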

Example

Prototype Toolchain

Experiment Setup A machine running Ubuntu with a 3.1 GHz quad-core CPU and 8 GB of RAM. A Galaxy Nexus running Android with a dual-core 1.2 GHz CPU and 1 GB of RAM. Six CPU-intensive Android applications. ◦ Each with 1,000 randomly generated inputs. ◦ Train the predictor on 100 inputs.

Experimental Results

Features and Models

Effect of the Number of Training Inputs

Compare with Linear Model

Prediction Time of Mantis and PE

Prediction Error of Mantis and PE

Prediction on Different Hardware Platform

Prediction under Background Load

Offline Stage Processing Time

Conclusion Mantis is a framework that automatically generates program performance predictors. ◦ Combines program slicing and sparse regression in a novel way. Evaluation shows that the generated predictors estimate execution time accurately and efficiently for smartphone applications.