The future is cloudy: Reflecting prediction error in mobile applications
Brett D. Higgins^, Kyungmin Lee*, Jason Flinn*, T.J. Giuli+, Brian Noble*, and Christopher Peplin+
^ Arbor Networks, * University of Michigan, + Ford Motor Company
Mobile applications are adaptive
How do applications adapt?
Make predictions → Choose the optimal strategy → Execute it!
(CloneCloud '11, MAUI '10, Chroma '07, Spectra '02)

What can possibly go wrong?
Predictions are not perfect
We need to consider predictor errors!
Need to consider redundancy
Re-evaluate the environment
The system needs to constantly re-evaluate the environment.
Embracing uncertainty
Our library chooses the best strategy:
– Incorporates prediction errors
– Picks a single strategy or a redundant one
– Balances the cost and benefit of redundancy: benefit is time saved; cost is energy plus cellular data
– Re-evaluates the environment
Outline
Motivation
Uncertainty-aware decision-making methods
– Library overview
– Our three methods
– Re-evaluation from new information
Evaluation
Conclusion
Library overview
The application provides: strategies, predictors.
Our library provides: error distributions, environment re-evaluation, and the decision mechanism.
Remote vs. Local
Local expected time: 20 sec. Remote expected time: 10.9 sec (uncertain server load).
Let's consider redundancy
Redundant (remote + local) expected time: 2.9 sec, versus 10.9 sec for remote alone under uncertain server load.
Incorporating prediction errors
When to use redundancy?
– When predictions are too uncertain
– When the benefit (time saved) exceeds the cost (energy + cellular data)
Our library provides three methods: brute force, error bounds, and Bayesian estimation. It hides their complexity from the application.
Brute force
Recompute the error upon each new measurement, then take a weighted sum over the joint error distribution. For redundant strategies, time is the min across all strategies and cost is the sum across all strategies. Simple, but computationally expensive.
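A minimal sketch of this style of computation (the function name and inputs are my own, not the library's API): enumerate every combination of error samples in the joint distribution, take the min of the times and the sum of the costs for the redundant strategy set, and weight by joint probability.

```python
from itertools import product

def brute_force_expectation(strategies, error_dists):
    """Expected (time, cost) of executing all given strategies redundantly.

    strategies:  list of (time_fn, cost_fn) pairs; each function maps an
                 error sample to a predicted time or cost.
    error_dists: one discrete error distribution per strategy, as a list
                 of (probability, error) pairs.
    """
    expected_time = 0.0
    expected_cost = 0.0
    # Enumerate every combination of error samples: the joint distribution.
    for combo in product(*error_dists):
        joint_p = 1.0
        times, costs = [], []
        for (time_fn, cost_fn), (p, err) in zip(strategies, combo):
            joint_p *= p
            times.append(time_fn(err))
            costs.append(cost_fn(err))
        # Redundant execution: done when the fastest strategy finishes,
        # but we pay for every strategy we launched.
        expected_time += joint_p * min(times)
        expected_cost += joint_p * sum(costs)
    return expected_time, expected_cost
```

The nested enumeration is exactly what makes this simple method expensive: its running time grows as the product of the error-distribution sizes.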
Error bounds
Obtain a bound for each new measurement, then calculate a bound on the net gain of redundancy:
max(benefit) – min(cost) = max(net gain)
[Figure: bounded network-bandwidth estimates (B_P1, B_P2, in Mbps) translate into bounds (T1, T2) on the time to send 10 Mb; the gap between them gives the max time savings from redundancy.]
Bayesian estimation
Basic idea:
– Given a prior belief about the world,
– and some new evidence,
– update our beliefs to account for the evidence (i.e., obtain the posterior distribution),
– using the likelihood of the evidence.
Via Bayes' theorem: posterior = likelihood × prior / p(evidence), where p(evidence) is the normalization factor that ensures the posterior sums to 1.
Bayesian estimation
Applied to decision making:
– Prior: completion time measurements
– Evidence: a completion time prediction and the decision it implies
– Likelihood: when local wins, how often has the prediction agreed? When remote wins, how often has the prediction agreed?
– Posterior: new belief about completion time
Again via Bayes' theorem: posterior = likelihood × prior / p(evidence).
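The update itself is a one-line normalization. A generic discrete sketch (hypothesis names and numbers are illustrative, not from the evaluation): a priori, local and remote are equally likely to win; a new prediction says "local wins", and historically the prediction has agreed with the true winner 90% of the time when local wins but only 40% of the time when remote wins.

```python
def bayes_update(prior, likelihood):
    """Discrete Bayes update.

    prior:      dict hypothesis -> P(hypothesis)
    likelihood: dict hypothesis -> P(evidence | hypothesis)
    Dividing by p(evidence), the sum of the unnormalized terms, is the
    normalization factor that ensures the posterior sums to 1.
    """
    unnormalized = {h: likelihood[h] * p for h, p in prior.items()}
    p_evidence = sum(unnormalized.values())
    return {h: v / p_evidence for h, v in unnormalized.items()}

# Illustrative numbers, not measured data:
posterior = bayes_update(
    prior={"local_wins": 0.5, "remote_wins": 0.5},
    likelihood={"local_wins": 0.9, "remote_wins": 0.4},
)
```

Here the posterior probability that local wins is 0.45 / 0.65 ≈ 0.69: the prediction shifts belief toward local, but the imperfect track record keeps it from being taken at face value.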
Re-evaluation: conditional distributions
Under uncertain server load (local expected time: 20 sec; remote expected time: 10.9 sec), the decision is re-evaluated as time elapses (0, 11 s, 31 s, …, 100 s): the initial choice of remote alone switches to remote & local once enough time has passed without a response that a loaded server becomes likely.
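A sketch of the conditioning step, reusing the hypothetical bimodal server-load distribution that reproduces the 10.9 s expectation (0.9 probability of a 1 s response, 0.1 probability of a 100 s response): once 11 seconds have elapsed with no response, only the slow outcome remains possible, so the conditional expectation jumps and launching the local strategy as well becomes attractive.

```python
def conditional_expected_time(dist, elapsed):
    """Expected completion time given that the task has NOT finished after
    `elapsed` seconds: keep only the outcomes still possible and
    renormalize their probabilities."""
    alive = [(p, t) for p, t in dist if t > elapsed]
    total_p = sum(p for p, _ in alive)
    return sum(p * t for p, t in alive) / total_p

# Hypothetical server-load distribution (assumed for illustration).
remote_dist = [(0.9, 1.0), (0.1, 100.0)]

before = conditional_expected_time(remote_dist, 0.0)   # 10.9 s a priori
after = conditional_expected_time(remote_dist, 11.0)   # 100 s: the server must be loaded
```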
Evaluation: methodology
Network trace replay (walking & driving) with speech recognition and network selection applications.
Metric: weighted cost = time + c_energy × energy + c_data × data.

Setting     c_energy   Battery-life reduction under average use (normally 20 hours)
No-cost     0          N/A
Low-cost    0.00001    6 min
Mid-cost    0.0001     36 sec
High-cost   0.001      3.6 sec
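The metric is a direct weighted sum; a sketch (parameter names are my own, and the c_data weights, which the table above does not list, are left to the caller):

```python
def weighted_cost(time_s, energy, data, c_energy, c_data):
    """Evaluation metric from the slides: completion time plus weighted
    energy and cellular-data usage. Units of `energy` and `data` are
    whatever the traces measure; the weights convert them to seconds."""
    return time_s + c_energy * energy + c_data * data

# c_energy settings from the table: no-, low-, mid-, and high-cost.
C_ENERGY = {"no": 0.0, "low": 0.00001, "mid": 0.0001, "high": 0.001}
```

For instance, under the high-cost setting an execution taking 10 s and consuming 1000 energy units scores 10 + 0.001 × 1000 = 11.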
Speech recognition, server load
[Chart: weighted cost, normalized] Our library matches the best strategy (23%). Redundancy is less beneficial as cost increases.
Network selection, walking trace
[Chart: weighted cost, normalized] Our library matches the best strategy (24%, 2x).
Discussion
Our library provides the best strategy. Which method is the best?
– Brute force: accurate, but expensive
– Error bounds: leans toward redundancy
– Bayesian: mixed bag
No clear winner.
Conclusion
Need to consider uncertainty in predictions. Redundancy is powerful! Our library helps apps choose the best strategy.
Source code:
– https://github.com/brettdh/instruments
– https://github.com/brettdh/libcmm
Questions?
Speech recognition, server load
[Chart: weighted cost, normalized] Error bounds leans towards redundancy.
Network selection, walking trace
[Chart: weighted cost, normalized; Simple vs. our library] Meatballs matches the best strategy (24%, 2x). Low-resource strategies improve. Error bounds leans towards redundancy.
Speech recognition, server load
[Chart: weighted cost, normalized; Simple vs. our library] Meatballs matches the best strategy (23%). Error bounds leans towards redundancy.
Network selection, driving trace
[Chart: weighted cost, normalized; Simple vs. our library] Not much benefit from using WiFi.
Speech recognition, walking trace
[Chart: weighted cost, normalized; Simple vs. our library] Meatballs matches the best strategy (23-35%, >2x). The benefit of redundancy persists more.