1
ASAC: Automatic Sensitivity Analysis for Approximate Computing
Pooja Roy, Rajarshi Ray, Chundong Wang, Weng-Fai Wong
National University of Singapore
LCTES 2014
2
Why Approximate?
3
Quality of Service
Outputs are judged against a QoS band, ranging from an acceptable QoS (relaxed accuracy) up to a high QoS.
4
Exploring Previous Works
Approximation has been explored at the algorithm, programming (API), compilation, architecture, and circuit levels:
Ansel et al. (CGO'11), Baek et al. (PLDI'10), Carbin et al. (ISSTA'10), Carbin et al. (OOPSLA'13), Chippa et al. (DAC'10), Esmaeilzadeh et al. (ASPLOS'12), Gupta et al. (ISLPED'11), Hoffmann et al. (ASPLOS'11), Kahng et al. (DAC'12), Misailovic et al. (TECS'13), Sampson et al. (PLDI'11), Sampson et al. (MICRO'13), Sidiroglou-Douskos et al. (FSE'11), Venkataramani et al. (MICRO'13), Zhu et al. (POPL'12)
6
Approximation-based Programming Paradigm
A new programming paradigm: program data (variables, methods, etc.) is explicitly classified as approximable or non-approximable, and a compilation framework uses this classification to support approximation.
7
Need for Automation
In existing approaches, the original code must be rewritten using new language constructs, with programmer annotations and provision of multiple versions, before the compilation framework can separate approximable from non-approximable data.
8
Need for Automation
Rewrite 'binutils' from scratch? Expect app developers to provide many versions? Recompile and test 'Picasa' and 'VLC' with multiple QoS requirements? Do the same for entire Android/iOS kernels?
9
Our Approach: ASAC
Automatic sensitivity analysis: a statistical, perturbation-based framework that is scalable and, specifically, considers internal program data for approximation.
10
Key Idea
Perturb each variable in the code and compare the perturbed output against the acceptable QoS.
Sensitivity: a particular variable's contribution towards the output.
Variables are ranked by sensitivity: low-ranked variables can be approximated; high-ranked variables are critical.
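To make the key idea concrete, here is a minimal, self-contained C sketch (not the talk's implementation): it perturbs one variable of a made-up toy computation across a range of values, re-runs it, and counts how many runs keep the output within an assumed QoS tolerance of 5%. The toy function, the perturbation range, and the tolerance are all illustrative assumptions.

    #include <math.h>
    #include <stdio.h>

    /* Toy computation standing in for the program under analysis. */
    static double compute(double a) {
        double sum = 0.0;
        for (int i = 0; i < 10; i++)
            sum += a / 10;
        return sum;
    }

    int main(void) {
        const double reference = compute(0.1);  /* unperturbed output */
        const double qos_tol = 0.05;            /* assumed QoS band: 5% relative error */
        int good = 0, total = 0;

        /* Perturb the variable `a` and check whether the output stays in the band. */
        for (double a = 0.05; a <= 0.15; a += 0.01) {
            double out = compute(a);
            double rel_err = fabs(out - reference) / fabs(reference);
            if (rel_err <= qos_tol)
                good++;
            total++;
        }
        printf("%d of %d perturbed runs stayed within the QoS band\n", good, total);
        return 0;
    }

ASAC itself perturbs all variables jointly via hyperbox sampling and scores them statistically rather than by a simple pass count, as the following slides describe.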
11
Key Idea
Two questions follow: How to systematically perturb the variables? How to translate the perturbed output into a sensitivity ranking?
12
Hyperbox Sampling
    double sum() {
        int i;
        double a = 0.1, sum = 0.0;
        for (i = 0; i < 10; i++) {
            sum += a / 10;
        }
        return sum;
    }
Creating a hyperbox from the value range of each variable (dimensions: sum, a, i).
13
Hyperbox Sampling
Each dimension of the hyperbox is discretized into k intervals.
14
Hyperbox Sampling
Sample points are chosen using Latin Hypercube Sampling.
15
Hyperbox Sampling
Controlled perturbation: each chosen sample point supplies the perturbed values for the variables (a sketch of the sampling step follows below).
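A minimal sketch of how such samples could be drawn, assuming a standard Latin hypercube construction (one random permutation of the k intervals per variable, so every interval of every variable is covered exactly once). The variable ranges, k, and the number of dimensions below are illustrative, not taken from the talk.

    #include <stdio.h>
    #include <stdlib.h>

    #define N_VARS 3   /* hyperbox dimensions, e.g. sum, a, i */
    #define K      5   /* discretization constant: intervals per dimension */

    /* Assumed value range [lo, hi] for each variable (illustrative only). */
    static const double lo[N_VARS] = { 0.0, 0.0, 0.0 };
    static const double hi[N_VARS] = { 1.0, 1.0, 10.0 };

    /* Fisher-Yates shuffle of 0..K-1: one random interval order per variable. */
    static void shuffle(int *p) {
        for (int i = K - 1; i > 0; i--) {
            int j = rand() % (i + 1);
            int t = p[i]; p[i] = p[j]; p[j] = t;
        }
    }

    int main(void) {
        int perm[N_VARS][K];
        for (int v = 0; v < N_VARS; v++) {
            for (int s = 0; s < K; s++) perm[v][s] = s;
            shuffle(perm[v]);
        }
        /* K sample points; sample s takes, for variable v, a random value
         * inside interval perm[v][s] of that variable's range. */
        for (int s = 0; s < K; s++) {
            printf("sample %d:", s);
            for (int v = 0; v < N_VARS; v++) {
                double width = (hi[v] - lo[v]) / K;
                double u = (double)rand() / RAND_MAX;           /* position inside the cell */
                double val = lo[v] + (perm[v][s] + u) * width;  /* controlled perturbation value */
                printf("  var%d=%.3f", v, val);
            }
            printf("\n");
        }
        return 0;
    }

Each sample point is then used for one run of the program, and the resulting output is recorded.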
16
Perturbed Outputs
Rule 1: For a program with n variables, discretization constant k, and m randomly chosen points, the number of perturbed outputs is m × ∏_{i=0}^{k-1} (k - i)^(n-1)  (= m × (k!)^(n-1)).
Not trivial!
17
Key Idea (recap)
Hyperbox sampling answers the first question (how to systematically perturb the variables).
Remaining: how to translate the perturbed outputs into a sensitivity ranking?
18
Perturbed Outputs
A 'good' sample falls within the QoS band; a 'bad' sample falls outside it.
For each variable, a cumulative distribution function (CDF) is built over the good samples and over the bad samples.
19
Hypothesis Testing
The Kolmogorov-Smirnov test computes the maximum distance between the two curves.
Rule 2: The maximum distance between the curves is the sensitivity score for the variable. The higher the score, the more the variable contributes towards the program output.
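A minimal sketch of this scoring step, assuming we already have, for one variable, the values it took in the 'good' runs and in the 'bad' runs (the arrays below are made-up numbers): sort both sets, sweep their empirical CDFs, and report the maximum vertical distance between the curves, in the spirit of a two-sample Kolmogorov-Smirnov statistic.

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    static int cmp(const void *a, const void *b) {
        double x = *(const double *)a, y = *(const double *)b;
        return (x > y) - (x < y);
    }

    /* Two-sample KS statistic: max |CDF_good(x) - CDF_bad(x)|. */
    static double ks_statistic(double *good, int ng, double *bad, int nb) {
        qsort(good, ng, sizeof(double), cmp);
        qsort(bad, nb, sizeof(double), cmp);
        int i = 0, j = 0;
        double d = 0.0;
        while (i < ng && j < nb) {
            if (good[i] <= bad[j]) i++; else j++;
            double diff = fabs((double)i / ng - (double)j / nb);
            if (diff > d) d = diff;
        }
        return d;
    }

    int main(void) {
        /* Illustrative data: values assigned to one variable in runs whose
         * output stayed inside (good) or fell outside (bad) the QoS band. */
        double good[] = { 0.08, 0.09, 0.10, 0.11, 0.12 };
        double bad[]  = { 0.02, 0.04, 0.18, 0.20 };
        printf("sensitivity score = %.2f\n", ks_statistic(good, 5, bad, 4));
        return 0;
    }

A larger distance means the variable's values are distributed very differently in good and bad runs, i.e. the variable matters more to the output.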
20
Approximable vs. Critical
A variable with a sensitivity score above 0.5 is treated as critical (non-approximable).
For evaluation, errors were injected at three levels:
Mild error injection: 1/6 of the approximable variables
Medium error injection: 1/3 (or 1/2) of the approximable variables
Aggressive error injection: all of the approximable variables
Programs: SciMark2, MiBench (JPEG), SPEC CPU2006 (464.h264ref)
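Tying Rule 2 to the classification, a tiny sketch that applies the 0.5 cut-off to illustrative, made-up sensitivity scores (the variable names and scores are not from the talk):

    #include <stdio.h>

    struct var_score { const char *name; double sensitivity; };

    int main(void) {
        /* Illustrative scores, e.g. produced by the KS step above. */
        struct var_score vars[] = {
            { "sum", 0.82 }, { "a", 0.64 }, { "i", 0.31 }, { "tmp", 0.12 },
        };
        int n = sizeof(vars) / sizeof(vars[0]);
        for (int k = 0; k < n; k++) {
            /* Score above 0.5 => critical (keep exact); otherwise approximable. */
            printf("%-4s  %.2f  %s\n", vars[k].name, vars[k].sensitivity,
                   vars[k].sensitivity > 0.5 ? "critical" : "approximable");
        }
        return 0;
    }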
21
ASAC Correctness
22
ASAC Correctness
Benchmark   TP   FP   FN   TN   Precision   Recall   Accuracy
SOR          5    1    0    2      0.83       1.00      0.88
SMM          1    1    0    6      0.50       1.00      0.88
Monte        2    1    0    2      0.67       1.00      0.80
FFT         15    2    2   12      0.88       0.88      0.87
LU           7    1    1    5      0.88       0.88      0.86
Average                            0.75       0.95      0.86
* as compared to the manually annotated baseline (EnerJ, PLDI'11)
23
ASAC: JPEG
Input image alongside the encoded and decoded outputs under mild and aggressive error injection.
24
ASAC: H264
Error Rate    SNR_Y   SNR_U   SNR_V   BitRate
No Error      36.67   40.74   42.32   149.62
Mild          36.69   37.64   37.65   146.60
Medium        34.05   36.92   36.79   147.12
Aggressive    29.78   32.89   32.99   146.03
25
ASAC Runtime
26
ASAC Sanity Check
JPEG: encode and decode with errors injected in variables marked as 'non-approximable'.
H264: application crash.
27
Concluding ASAC
Automatic classification of approximable and non-approximable data.
Scalable; no profiling; can be applied to programs whose source code is not available.
Approximation saves energy without performance loss.
28
Thank you