1
Software Dynamics: A New Method of Evaluating Real-Time Performance of Distributed Systems
Janusz Zalewski, Computer Science, Florida Gulf Coast University, Ft. Myers, FL 33965-6565
http://www.fgcu.edu/zalewski/
FALSE2002, Nashville, Nov. 14-15, 2002
2
Talk Outline
* RT Software Architecture
* Evaluating S/W Architectures
* Timeliness & S/W Dynamics
* Conclusion
3
Feedback Control System
4
Generic Real-Time Software Architecture
5
Basic Components of Real-Time Software Architecture
* Sensor/Actuator component
* User Interface component
* Communication Link component
* Database component
* Processing component
* Timing component
6
Air-Traffic Control System Physical Diagram
7
Air-Traffic Control System Context Diagram
8
The idea of grouping I/O information into different categories, which later determine the software architecture, follows the fundamental software engineering principle of separation of concerns (Parnas, 1970s).
9
Model of a Distributed Embedded Simulation
10
We are missing good measures (indeed, any measures) to characterize the Behavioral Properties of a software module, that is, its dynamics.
11
Interrupt Latency
The time interval between the occurrence of an external event and the start of the first instruction of the interrupt service routine.
12
Interrupt Latency Involves
* H/W logic processing
* Interrupt disable time
* Handling higher H/W priorities
* Switching to handler code
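The slides show no measurement code; as an illustration only, here is a minimal sketch, assuming a POSIX system with timer_create available, that approximates event-to-handler latency by arming an absolute one-shot timer and timestamping the first action of the signal handler. Since kernel interrupt entry is invisible from user space, this only bounds true interrupt latency from above.

```c
/* Hedged sketch: approximate event-to-handler latency with a POSIX timer.
 * Build with: cc latency.c -o latency -lrt (older glibc needs -lrt) */
#include <signal.h>
#include <stdio.h>
#include <time.h>
#include <unistd.h>

static struct timespec expected;              /* when the timer should fire */
static volatile sig_atomic_t fired = 0;

static void handler(int sig)
{
    struct timespec now;
    clock_gettime(CLOCK_MONOTONIC, &now);     /* first action in the handler */
    long ns = (now.tv_sec - expected.tv_sec) * 1000000000L
            + (now.tv_nsec - expected.tv_nsec);
    printf("approx. latency: %ld ns\n", ns);  /* toy code: printf is not
                                                 async-signal-safe */
    fired = 1;
}

int main(void)
{
    struct sigaction sa;
    sa.sa_handler = handler;
    sigemptyset(&sa.sa_mask);
    sa.sa_flags = 0;
    sigaction(SIGALRM, &sa, NULL);

    struct sigevent ev = { .sigev_notify = SIGEV_SIGNAL,
                           .sigev_signo  = SIGALRM };
    timer_t t;
    timer_create(CLOCK_MONOTONIC, &ev, &t);

    clock_gettime(CLOCK_MONOTONIC, &expected);
    expected.tv_nsec += 100000000L;           /* fire 100 ms from now */
    if (expected.tv_nsec >= 1000000000L) {
        expected.tv_sec++;
        expected.tv_nsec -= 1000000000L;
    }
    struct itimerspec its = { .it_value = expected };
    timer_settime(t, TIMER_ABSTIME, &its, NULL);

    while (!fired)
        pause();
    return 0;
}
```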
13
Real-Time System Responsiveness
14
Dispatch Latency
The time interval between the end of the interrupt handler code and the first instruction of the process activated (made runnable) by this interrupt.
15
Dispatch Latency Involves
* OS decision time to reschedule (non-preemptive kernel state)
* Context switch time
* Return from OS call
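Again as an illustration only (not from the slides): a hedged sketch of measuring something close to dispatch latency with POSIX threads, where the moment of sem_post stands in for the end of the handler and the first statement after sem_wait returns stands in for the first instruction of the activated process.

```c
/* Hedged sketch: approximate dispatch latency via semaphore wakeup.
 * Build with: cc dispatch.c -o dispatch -lpthread */
#include <pthread.h>
#include <semaphore.h>
#include <stdio.h>
#include <time.h>

static sem_t sem;
static struct timespec t_signal;              /* "end of handler" timestamp */

static void *waiter(void *arg)
{
    struct timespec t_run;
    sem_wait(&sem);                           /* block until made runnable */
    clock_gettime(CLOCK_MONOTONIC, &t_run);   /* first statement after wakeup */
    long ns = (t_run.tv_sec - t_signal.tv_sec) * 1000000000L
            + (t_run.tv_nsec - t_signal.tv_nsec);
    printf("approx. dispatch latency: %ld ns\n", ns);
    return NULL;
}

int main(void)
{
    pthread_t th;
    sem_init(&sem, 0, 0);
    pthread_create(&th, NULL, waiter, NULL);

    struct timespec d = { 0, 10000000L };     /* 10 ms: let the waiter block */
    nanosleep(&d, NULL);

    clock_gettime(CLOCK_MONOTONIC, &t_signal);
    sem_post(&sem);                           /* make the waiter runnable */
    pthread_join(th, NULL);
    return 0;
}
```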
16
Real-Time Properties
* Responsiveness
* Timeliness
* Schedulability
* Predictability
17
How to measure these properties?
* Responsiveness - just outlined
* Timeliness - proposed below
* Schedulability - rate monotonic and deadline monotonic analyses
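For the schedulability item, the classical rate-monotonic test is the Liu & Layland utilization bound U <= n(2^(1/n) - 1). A small sketch, with hypothetical task parameters (not from the talk):

```c
/* Sketch: rate-monotonic schedulability test via the Liu & Layland bound.
 * Build with: cc rma.c -o rma -lm */
#include <math.h>
#include <stdio.h>

#define N 3
int main(void)
{
    /* hypothetical periodic tasks: (computation time, period), in ms */
    double c[N] = { 10.0, 15.0, 20.0 };
    double p[N] = { 50.0, 80.0, 100.0 };

    double u = 0.0;
    for (int i = 0; i < N; i++)
        u += c[i] / p[i];                       /* total CPU utilization */
    double bound = N * (pow(2.0, 1.0 / N) - 1.0);

    /* the bound is sufficient but not necessary: U > bound is inconclusive */
    printf("U = %.3f, bound = %.3f -> %s\n", u, bound,
           u <= bound ? "schedulable under RMA" : "inconclusive");
    return 0;
}
```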
18
Two measures of timeliness:
* Overall time deadlines are missed (by a task)
* Number of times deadlines are missed by X percent
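Both measures are straightforward to compute from logged completion times. A minimal sketch; the task data and X = 2% are illustrative assumptions, not the benchmark's actual values:

```c
/* Sketch: the two timeliness measures from (completion, deadline) pairs. */
#include <stdio.h>

#define N 5
int main(void)
{
    /* hypothetical measurements, in milliseconds */
    double completion[N] = {  98.0, 104.0,  99.5, 110.0, 100.0 };
    double deadline[N]   = { 100.0, 100.0, 100.0, 100.0, 100.0 };
    double x_percent = 2.0;

    double overall_miss = 0.0;   /* measure 1: total time missed */
    int misses_over_x = 0;       /* measure 2: misses exceeding X percent */
    for (int i = 0; i < N; i++) {
        double miss = completion[i] - deadline[i];
        if (miss > 0.0) {
            overall_miss += miss;
            if (miss > deadline[i] * x_percent / 100.0)
                misses_over_x++;
        }
    }
    printf("overall time deadlines missed: %.1f ms\n", overall_miss);
    printf("deadlines missed by more than %.0f%%: %d\n",
           x_percent, misses_over_x);
    return 0;
}
```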
19
5-task Benchmark
20
Overall time the deadlines are missed for 100 experiments.
21
The number of times the deadlines are missed by 2%.
22
Overall time the deadlines are missed for 100 experiments (CORBA).
23
The number of times the deadlines are missed by 2% (CORBA).
24
ATCS: Software Components Communicating via CORBA
25
Overall time (in milliseconds) deadlines are missed for 20 aircraft (in 100 experiments).
26
Number of times deadlines are missed by more than 20% for 20 aircraft (in 100 experiments).
29
Satellite Ground Control Station
30
SGCS Implementation
31
SGCS Physical Architecture
33
Single DB Client Request Processing Time.
34
Percent of deadlines missed for one DB Client.
35
Five DB Clients Request Processing Time.
36
Percent of deadlines missed for five DB Clients.
37
Sensitivity: a measure of the magnitude of a system's response to changes.
38
Sensitivity:
$$S = \frac{(y_1 - y_0)\,/\,[(y_1 + y_0)/2]}{(x_1 - x_0)\,/\,[(x_1 + x_0)/2]}$$
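In words: the relative change in output y divided by the relative change in input x, each normalized by its midpoint. A small sketch implementing the slide's formula, with hypothetical data points:

```c
/* Sketch: sensitivity as the ratio of midpoint-normalized relative changes. */
#include <stdio.h>

static double sensitivity(double x0, double x1, double y0, double y1)
{
    double dy_rel = (y1 - y0) / ((y1 + y0) / 2.0);  /* relative output change */
    double dx_rel = (x1 - x0) / ((x1 + x0) / 2.0);  /* relative input change  */
    return dy_rel / dx_rel;
}

int main(void)
{
    /* hypothetical points: load doubles, response time goes 10 -> 28 */
    printf("S = %.2f\n", sensitivity(1.0, 2.0, 10.0, 28.0));
    return 0;
}
```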
39
Sensitivity = 1.73
40
Sensitivity = 1.00
41
Sensitivity = 1.64
42
First Order Dynamics
$$G(s) = \frac{K}{\tau s + 1}$$
43
Time constant τ: a measure of the speed of a system's response to changes.
44
Settling Time: the time at which the response curve comes within 2% of its final (maximum) value.
Time Constant: τ = 0.25 × Settling Time
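The 0.25 factor is not arbitrary: for the first-order model above, it follows from the 2% settling-time definition. A short derivation (standard control-theory math, not shown on the slides):

```latex
y(t) = K\left(1 - e^{-t/\tau}\right)
\quad\Longrightarrow\quad
e^{-t_s/\tau} = 0.02
\quad\Longrightarrow\quad
t_s = \tau \ln 50 \approx 3.91\,\tau \approx 4\tau
\quad\Longrightarrow\quad
\tau \approx 0.25\,t_s
```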
45
τ = 165 ms
46
τ = 87.5 ms
47
τ = 15 ms
48
Distributed Embedded Simulation Architecture
49
Statistical measures of timeliness:
* Round-trip time stability
* Service time effect
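One plausible reading of round-trip time stability (my interpretation, not the talk's definition) is the mean and standard deviation, i.e., the jitter, of measured round-trip times. A minimal sketch with hypothetical measurements:

```c
/* Sketch: summarize round-trip time stability as mean and std deviation.
 * Build with: cc rtt.c -o rtt -lm */
#include <math.h>
#include <stdio.h>

#define N 8
int main(void)
{
    /* hypothetical round-trip times, in milliseconds */
    double rtt[N] = { 12.1, 11.8, 12.4, 12.0, 13.5, 11.9, 12.2, 12.1 };

    double mean = 0.0, var = 0.0;
    for (int i = 0; i < N; i++)
        mean += rtt[i];
    mean /= N;
    for (int i = 0; i < N; i++)
        var += (rtt[i] - mean) * (rtt[i] - mean);
    var /= N;

    printf("mean RTT = %.2f ms, std dev (jitter) = %.2f ms\n",
           mean, sqrt(var));
    return 0;
}
```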
50
Service time effect for a specific architecture
51
Round-trip message time for 5-task simulation
52
Conclusion
* Behavioral Properties are crucial for successful software development
* Sensitivity is one important property
* Software Dynamics seems to be a measurable property as well