A Prediction-based Real-time Scheduling Advisor Peter A. Dinda Carnegie Mellon University


A Prediction-based Real-time Scheduling Advisor
Peter A. Dinda
Carnegie Mellon University

2 Outline
- Real-time scheduling advisor model and interface
- Prediction-based implementation
- Randomized evaluation using load trace playback

3 The Problem Solved by the Real-time Scheduling Advisor
At time t_now, the application gives you a task with compute requirement t_nom, a deadline t_now + t_nom(1+slack), a confidence level c, and a list of hosts in a shared, unreserved distributed computing environment. The application can run the task on any of the hosts. Choose a host from the list such that the task, if run on that host, will meet the deadline with probability c or better, if possible.

4 Model
Task model:
- Compute-bound
- Initiated by user actions (interactive applications)
- Arrive aperiodically
- Do not overlap
- Must be started immediately (at t_now)
Application model:
- Knows the task's compute requirement (t_nom)
- Knows the appropriate slack for the task (deadline = t_now + (1+slack) t_nom)
- Can run the task on one of a set of hosts
The real-time scheduling advisor recommends the most appropriate host.

5 RTSA Interface

int RTAdviseTask(RTSchedulingAdvisorRequest &req,
                 RTSchedulingAdvisorResponse &resp);

struct RTSchedulingAdvisorRequest {
  double tnom;   // deadline = t_now + tnom*(1+slack)
  double slack;
  double conf;   // required certainty of meeting the deadline
  Host hosts[];  // hosts to choose from
};

struct RTSchedulingAdvisorResponse {
  double tnom;
  double slack;
  double conf;
  Host host;     // most appropriate host
  RunningTimePredictionResponse runningtime;  // confidence interval for
                                              // running time on that host
};
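For concreteness, a caller-side sketch of this interface follows. The Host and RunningTimePredictionResponse types here are illustrative stand-ins (the slide does not define them), and a std::vector replaces the flexible array member; the deadline arithmetic follows the slide.

```cpp
#include <vector>

// Illustrative stand-ins for types the slide does not define.
using Host = int;
struct RunningTimePredictionResponse {
  double low, high;  // assumed: confidence interval on running time (seconds)
};

struct RTSchedulingAdvisorRequest {
  double tnom;              // nominal compute time (seconds)
  double slack;             // deadline = t_now + tnom*(1+slack)
  double conf;              // required certainty of meeting the deadline
  std::vector<Host> hosts;  // hosts to choose from
};

// The deadline the advisor works against, relative to t_now.
double RelativeDeadline(const RTSchedulingAdvisorRequest &req) {
  return req.tnom * (1.0 + req.slack);
}
```

For example, a task with t_nom = 2 s and slack = 0.5 must finish within 3 s of t_now.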

6 Prediction-based Implementation

7 Anchoring this talk
- Studied statistical properties of host load signals
- Found appropriate predictive models for host load signals
- Developed the RPS toolkit for building fast, low-overhead resource prediction systems
- Developed the load trace playback technique for reconstructing load
- Built a host load prediction system (assume this works; later talk)
This talk: description and evaluation of the real-time scheduling advisor.

8 Scheduling Strategies
Prediction-based (MEAN, LAST, AR(16)):
- Acquire running time predictions for each host
- Select a host at random from those whose confidence interval falls below the deadline
- If none exist, choose the host with the lowest expected running time
- Return the host and the running time prediction
MEASURE:
- Return the host with the currently lowest measured load
- No running time prediction
RANDOM:
- Return a random host
- No running time prediction
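The prediction-based selection rule above can be sketched as follows. This is a minimal illustration, assuming each prediction carries a [low, high] confidence interval on running time; the names are hypothetical, not the advisor's actual internals.

```cpp
#include <random>
#include <vector>

struct Prediction {
  int host;
  double low, high;  // confidence interval on running time (seconds)
};

// Pick a host for a task with the given (relative) deadline.
int ChooseHost(const std::vector<Prediction> &preds, double deadline,
               std::mt19937 &rng) {
  if (preds.empty()) return -1;
  // Candidates: hosts whose entire confidence interval is below the deadline.
  std::vector<int> ok;
  for (const auto &p : preds)
    if (p.high <= deadline) ok.push_back(p.host);
  if (!ok.empty()) {
    // Pick one at random to spread load across equally acceptable hosts.
    std::uniform_int_distribution<size_t> pick(0, ok.size() - 1);
    return ok[pick(rng)];
  }
  // Otherwise fall back to the host with the lowest expected running time,
  // taking the interval midpoint as the expectation.
  int best = preds[0].host;
  double bestmid = (preds[0].low + preds[0].high) / 2;
  for (const auto &p : preds) {
    double mid = (p.low + p.high) / 2;
    if (mid < bestmid) { bestmid = mid; best = p.host; }
  }
  return best;
}
```

The random choice among acceptable hosts is what gives the "number of possible hosts" metric its meaning later in the talk.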

9 Performance Metrics
Fraction of deadlines met ("Will the deadline be met?"):
- Depends on (at least) the strategy, the slack, and resource availability
Fraction of deadlines met when possible ("If the strategy claims the deadline will be met, will it be met?"):
- Should depend only on the strategy
- The application can try other values of t_nom and slack
Number of possible hosts ("How much randomness is introduced?"):
- Helps to avoid disastrous advisor synchronization
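As a sketch, the first two metrics can be computed from a log of task runs like this (the record layout and names are illustrative, not from the talk):

```cpp
#include <vector>

struct TaskRecord {
  bool claimed_met;  // advisor predicted the deadline would be met
  bool met;          // the deadline was actually met
};

// Fraction of all tasks whose deadlines were met.
double FractionMet(const std::vector<TaskRecord> &runs) {
  if (runs.empty()) return 0.0;
  int met = 0;
  for (const auto &r : runs) met += r.met;
  return static_cast<double>(met) / runs.size();
}

// "Fraction of deadlines met when possible": among tasks where the strategy
// claimed the deadline would be met, how often was it actually met?
double FractionMetWhenPossible(const std::vector<TaskRecord> &runs) {
  int claimed = 0, met = 0;
  for (const auto &r : runs)
    if (r.claimed_met) { ++claimed; met += r.met; }
  return claimed ? static_cast<double>(met) / claimed : 0.0;
}
```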

10 Methodology
- Recreate a "scenario" (load on a set of hosts) on the manchester testbed using load trace playback
- Schedule and run randomized tasks:
  - Random arrival times (5 to 15 seconds apart)
  - t_nom randomly selected from 0.1 to 10 seconds
  - Slack randomly selected from 0 to 2
  - Randomly selected strategy
- Data-mine the results
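The randomized task generation above can be sketched like this; uniform distributions are an assumption, since the slide gives only the ranges.

```cpp
#include <random>

struct TaskSpec {
  double arrival_gap;  // seconds until the next task arrives
  double tnom;         // nominal compute time (seconds)
  double slack;
};

// Draw one randomized task, per the ranges on the slide.
TaskSpec RandomTask(std::mt19937 &rng) {
  std::uniform_real_distribution<double> gap(5.0, 15.0);
  std::uniform_real_distribution<double> tnom(0.1, 10.0);
  std::uniform_real_distribution<double> slack(0.0, 2.0);
  return {gap(rng), tnom(rng), slack(rng)};
}
```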

11 4LS Scenario
- Four PSC Alpha cluster hosts: axp0 (interactive); axp4, axp5, axp10 (batch)
- High load, high variability
- Traces start Tuesday, August 12
- ,000 tasks run in 36 hours

12 Terminology I Will Use
Scheduling feasibility:
- How likely it is that a host exists on which the deadline can be met
- Increases with slack, decreases with t_nom
- Also depends on variation among the hosts
Predictor sensitivity:
- How likely it is that the deadline will be missed due to a bad prediction
- Low when scheduling feasibility is high or low; highest near the critical slack
Critical slack:
- The slack at which scheduling feasibility is 50%
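Given an empirical feasibility curve, the critical slack defined above can be read off as follows (a hypothetical helper, not from the talk):

```cpp
#include <utility>
#include <vector>

// curve: (slack, feasibility) points, sorted by increasing slack.
// Returns the first slack at which feasibility reaches 50%.
double CriticalSlack(const std::vector<std::pair<double, double>> &curve) {
  for (const auto &pt : curve)
    if (pt.second >= 0.5) return pt.first;
  return curve.empty() ? 0.0 : curve.back().first;
}
```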

13 Overview of Results
The AR(16) prediction-based strategy is superior:
- Fraction of deadlines met is at least as good as MEASURE's, and much improved at the critical slack
- Fraction of deadlines met when possible is higher than all competitors' and is most independent of slack and nominal time
- Introduces randomness similar to the other prediction-based strategies
Performance metrics depend on slack and nominal time.

14 Fraction of Deadlines Met Versus Slack

15 Fraction of Deadlines Met Versus t_nom

16 Fraction of Deadlines Met Versus t_nom (Near Critical Slack)

17 Fraction of Deadlines Met When Possible Versus Slack

18 Fraction of Deadlines Met When Possible Versus t_nom

19 Fraction of Deadlines Met When Possible Versus t_nom (Near Critical Slack)

20 Number of Possible Hosts Versus Slack

21 Number of Possible Hosts Versus t_nom

22 Number of Possible Hosts Versus t_nom (Near Critical Slack)

23 Conclusions
- MEASURE greatly increases the chance of meeting deadlines compared to RANDOM
- AR(16) increases that chance further, with minuscule additional overhead, especially near the critical slack and for short tasks
- In addition, AR(16) can tell the application, with high accuracy, whether the deadline will be met before the task is run, giving the application an opportunity to negotiate
- AR(16) introduces appropriate randomness into applications' host choices, reducing the chance of conflict
The AR(16) prediction-based real-time scheduling advisor is a useful tool.