
1 Harini Ramaprasad, Frank Mueller
North Carolina State University, Center for Embedded Systems Research
Bounding Preemption Delay within Data Cache Reference Patterns for Real-Time Tasks

2 Motivation
Timing Analysis
— Calculation of Worst Case Execution Times (WCETs) of tasks
— Required for scheduling of real-time tasks
– Schedulability theory requires a-priori knowledge of WCET
— Estimates need to be safe
— Static Timing Analysis – an efficient method to calculate the WCET of a program!
— Data caches introduce unpredictability in timing analysis
Data caches: improve performance significantly, but complicate static timing analysis for a task

3 Preemptive Scheduling
Practical real-time systems
— Multiple tasks with varying priorities
— A higher-priority task may preempt a lower-priority task at any time
— Additional D$ misses occur when the lower-priority task is restarted
— WCET with preemption delay required
Static timing analysis becomes even more complicated!

4 Data Cache Reference Patterns (Prior Work)
Data Cache Analyzer added to the Static Timing Analysis framework
Enhanced the Cache Miss Equations (Ghosh et al.) framework → D$ miss/hit patterns for memory references
Used for loop-nest oriented code
— Scalar and array references analyzed
Considers only a single task with no preemptions
Patterns fed to the timing analyzer to tighten the WCET estimate
Necessary terminology:
— Iteration point: represents one iteration of a loop nest
— Iteration space: the set of all iteration points

5 Static Timing Analyzer Framework

6 Methodology
Task schedulability → Response Time Analysis used
Steps involved in calculating WCET with preemption delay:
— Calculate the max. # of preemptions possible for a task
— Identify the placement of preemption points in the iteration space
— Calculate the preemption delay at a given point
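
To make the role of Response Time Analysis concrete, here is a minimal sketch of the standard fixed-priority response-time recurrence, with a parameter for folding a precomputed preemption delay into the task's WCET. This is an illustration of the textbook formula, not the authors' implementation; the task representation is an assumption.

```python
from math import ceil

def response_time(tasks, i, preemption_delay=0):
    """Standard response-time iteration R = C_i + sum_j ceil(R / T_j) * C_j.

    tasks: list of (period, wcet) pairs sorted by priority, highest first
    (hypothetical representation).  i: index of the task under analysis.
    preemption_delay: total cache-related preemption delay folded into C_i
    (0 gives plain RTA, i.e. "R w/o delay").
    Returns the worst-case response time, or None if it exceeds the period.
    """
    period_i, wcet_i = tasks[i]
    c = wcet_i + preemption_delay
    r = c
    while True:
        # interference from every higher-priority task released during r
        r_next = c + sum(ceil(r / t_j) * c_j for t_j, c_j in tasks[:i])
        if r_next > period_i:
            return None          # response time exceeds the deadline (= period here)
        if r_next == r:
            return r             # fixed point reached
        r = r_next

# Task set used on a later slide: T1=(50,8), T2=(100,10), T3=(200,35)
print(response_time([(50, 8), (100, 10), (200, 35)], 2))   # -> 61 without any delay
```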

7 Methodology: Analysis Phases
Phase 1: Single-Task Analysis
— For every task:
– Calculate Base Time
– Build D$ Reference Patterns assuming NO preemptions
— Performed once for every task
Phase 2: Preemption Delay Calculation (in task-set context)
— Step 1: Identification of preemption points
— Step 2: Calculation of WCET with preemption delay
— Performed once for a task in the context of the task set

8 Phase 2: Preemption Delay Calculation
Max # of preemption points for task Ti
— For every higher-priority task Tj:
– Find the max. time for which Tj can preempt Ti
– Subtract this from the time remaining before Ti's deadline
— Termination:
– No more higher-priority tasks, or
– No time left before the deadline (whichever occurs first)
– The sum over all higher-priority tasks gives the max # of preemptions

Example task set:
Task   Period   WCET
T1     50       8
T2     100      10
T3     200      35

For T1: no higher-priority task, so # preemptions = 0
For T2: T_rem = 100 – (2 * 8) = 84; no more higher-priority tasks; # preemptions = 2
For T3: T_rem = 200 – (4 * 8) = 168; T_rem = 168 – (2 * 10) = 148; no more higher-priority tasks; # preemptions = 4 + 2 = 6
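
The preemption-count bound can be illustrated with a short sketch that reproduces the numbers worked out on this slide. The task-set representation and the use of ceil() for the number of releases are assumptions (the slide's example divides evenly, so the result is the same either way); this is not the authors' code.

```python
from math import ceil

def max_preemptions(tasks):
    """Sketch of the slide's bound on the # of preemptions per task.

    tasks: list of (name, period, wcet), sorted by priority, highest first.
    Returns a dict mapping each task name to its max # of preemptions.
    """
    result = {}
    for i, (name, period_i, _) in enumerate(tasks):
        t_rem = period_i                 # time remaining before the deadline (= period)
        preemptions = 0
        for _, period_j, wcet_j in tasks[:i]:     # every higher-priority task T_j
            if t_rem <= 0:
                break                    # no time left before the deadline
            n_j = ceil(period_i / period_j)       # releases of T_j within T_i's period
            preemptions += n_j
            t_rem -= n_j * wcet_j        # subtract the execution time T_j takes from T_i
        result[name] = preemptions
    return result

# Task set from this slide
print(max_preemptions([("T1", 50, 8), ("T2", 100, 10), ("T3", 200, 35)]))
# -> {'T1': 0, 'T2': 2, 'T3': 6}
```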

9 Phase 2: Preemption Delay Calculation
Identification of preemption points
— Access-chain building:
– Build a time-ordered list of all memory references in the task
– Connect all references accessing the same D$ set to form a chain
– (On the slide, different cache sets are shown in different colors)
— Assign a weight to every access point:
– Weight = # of distinctly colored chains that cross the point
– Indicates the # of misses if a preemption occurs at that point
– Count only chains for D$ sets used by a higher-priority task
– Count a chain only if the next point in the chain is a HIT
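
The weight rule can be sketched directly from the bullets above. The trace representation — a time-ordered list of (cache set, hit?) pairs — and the set of cache sets shared with higher-priority tasks are assumed inputs for illustration; the authors' actual data structures come from their D$ analyzer.

```python
def preemption_weights(refs, hp_sets):
    """Weight of each access point = # of chains crossing it that turn into misses.

    refs: time-ordered list of (cache_set, is_hit) for the task's memory
    references (hypothetical representation).  hp_sets: D$ sets also used by
    some higher-priority task.  weights[i] bounds the extra misses if a
    preemption occurs just before reference i.
    """
    n = len(refs)
    weights = [0] * n
    for i in range(n):
        started = {s for s, _ in refs[:i]}        # chains with an access before point i
        seen_after = set()
        for j in range(i, n):                     # walk forward from point i
            s, is_hit = refs[j]
            if s in seen_after:
                continue                          # already handled this chain's next access
            seen_after.add(s)
            if s not in started:
                continue                          # chain starts after i, so it does not cross i
            # count the chain only if its D$ set is used by a higher-priority task
            # and the next access in the chain is a HIT (the preemption turns it into a miss)
            if s in hp_sets and is_hit:
                weights[i] += 1
    return weights

# Tiny illustrative trace: (cache set, hit?) for four references
print(preemption_weights([(0, False), (1, False), (0, True), (1, True)], {0, 1}))
# -> [0, 1, 2, 1]
```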

10 Phase 2: Preemption Delay Calculation
Calculation of WCET with preemption delay
— Identification of the worst-case preemption scenario
– General observation: a large chunk of iteration points have the max. preemption delay
– Reason: high temporal/spatial reuse in the code
– Considering the n highest costs gives an upper bound on the delay
– n = max # of preemptions for the task
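
Given per-point costs (for example, the weights above multiplied by the D$ miss penalty), the observation on this slide yields a simple upper bound; the helper below is a minimal sketch of that idea, not the authors' implementation.

```python
def delay_upper_bound(point_costs, n):
    """Upper bound on total preemption delay: sum of the n most expensive points.

    Whatever n iteration points the preemptions actually hit, their combined
    cost cannot exceed the cost of the n most expensive points in the space.
    """
    return sum(sorted(point_costs, reverse=True)[:n])

print(delay_upper_bound([4, 9, 1, 9, 6, 2], n=2))   # -> 18
```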

11 Distribution of preemption costs

12 Experimental Results – Task Set 1

Benchmark        Period   Stand-alone WCET   R w/o delay   WCET w/ delay   R w/ delay
dot-product      50000    750                750           750             750
convolution      62500    7491               8241          12491           13241
fir              125000   9537               17778         22037           35278
lms              125000   14536              32314         29136           77655
n-real-updates   250000   16738              48692         79138           235198
matrix1          250000   54168              111851        104568          > period

Without delay, the task set seems schedulable
Adding the delay to the response time → safe
The above task set is actually unschedulable!

13 Ratios – Task Set 1

Benchmark        R w/o delay / stand-alone WCET   R w/ delay / WCET w/ delay   WCET w/ delay / stand-alone WCET
dot-product      1                                1                            1
convolution      1.1                              1.06                         1.67
fir              1.87                             1.6                          2.31
lms              2.22                             2.6                          2
n-real-updates   2.91                             2.97                         4.73
matrix1          2.06                             -                            1.93

Preemption delay calculation: no significant change in the R/WCET factor
The increase in the WCET itself is significant → pessimistic analysis

14 Spreading Preemption Points – Key Idea
Find the n most expensive points
Spread them out in the iteration space (one possible interpretation is sketched below)
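
The slide does not spell out the spreading rule, so the sketch below shows just one plausible reading, assumed purely for illustration: partition the (linearized) iteration space into n equal windows and keep the costliest point of each, so no two chosen preemption points cluster together. This is not necessarily the authors' approach.

```python
def spread_expensive_points(point_costs, n):
    """Illustrative only: pick n expensive, well-separated preemption points.

    point_costs: preemption cost per (linearized) iteration point.
    Partitions the iteration space into n windows and takes the costliest
    point in each window; the spreading rule itself is an assumption.
    """
    total = len(point_costs)
    window = max(1, total // n)
    chosen = []
    for start in range(0, total, window):
        block = range(start, min(start + window, total))
        chosen.append(max(block, key=lambda i: point_costs[i]))  # index of costliest point
    return chosen[:n]

print(spread_expensive_points([4, 9, 1, 9, 6, 2], n=2))   # -> [1, 3]
```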

15 Related Work
S. Basumallick and K. Nilsen. Cache issues in real-time systems. In ACM SIGPLAN Workshop on Language, Compiler, and Tool Support for Real-Time Systems, 1994.
C.-G. Lee, J. Hahn, Y.-M. Seo, S. L. Min, R. Ha, S. Hong, C. Y. Park, M. Lee, and C. S. Kim. Analysis of cache-related preemption delay in fixed-priority preemptive scheduling. IEEE Transactions on Computers, 47(6):700–713, 1998.
C.-G. Lee, K. Lee, J. Hahn, Y.-M. Seo, S. L. Min, R. Ha, S. Hong, C. Y. Park, M. Lee, and C. S. Kim. Bounding cache-related preemption delay for real-time systems. IEEE Transactions on Software Engineering, 27(9):805–826, Nov. 2001.
J. Staschulat and R. Ernst. Multiple process execution in cache related preemption delay analysis. In ACM International Conference on Embedded Software, 2004.
J. Staschulat, S. Schliecker, and R. Ernst. Scheduling analysis of real-time systems with precise modeling of cache related preemption delay. In Euromicro Conference on Real-Time Systems, 2005.

16 Conclusions
Derivation of data cache reference patterns for every task
Construction of data cache access chains from these
— Used to calculate the preemption delay at a point
Determination of the max # of preemptions, n, for a given task
— In the context of a task set
Identification of the worst-case scenarios of preemptions
— Current work: choose the n most expensive points
First work addressing data cache related preemption delay

17 Thank you! Questions?

