1 Apdex (Application Performance Index): A Rational Way to Report Delivered Service Quality to Users of the IT Resource?
David Halbig
Rocky Mountain CMG, Denver, Colorado
Thursday, Feb 26, 2009

2 Agenda
- Are there metrics that are of primary importance?
- What kind of presentation should we use? Raw data? Derived data? Formatting?
- How do we determine thresholds?

3 What Metrics are Important?
- How busy is a given resource?
- What is the end-user perceiving?

4 What Metrics are Important?
- How busy is a given resource?
- What is the end-user perceiving?
- Delivered Service isn't EVERYTHING!

5 What Metrics are Important?
- How busy is a given resource?
- What is the end-user perceiving?
- Delivered Service isn't EVERYTHING!
- It's the ONLY thing! (with apologies to Vince Lombardi)

6 What kind of presentation should we use?

7 What kind of presentation should we use?

8 What kind of presentation should we use?

9 What kind of presentation should we use?
Problems with raw data presentation:
- Does not reflect user satisfaction
- Does not capture the effects of a multi-modal distribution
- The definition of 'good' varies with each application, making cross-comparisons difficult

10 Apdex Defined
- Apdex is a numerical measure of user satisfaction with the performance of enterprise applications
- It defines a method that converts many measurements into one number
- Uniform 0-1 scale: 0 = no users satisfied, 1 = all users satisfied
- Standardized method that is a comparable metric across applications, measurement approaches, and enterprises
©Netforecast.com - use by permission

11 How Apdex Works
- Define T for the application:
  - T = target time (satisfied-tolerating threshold)
  - F = 4T (tolerating-frustrated threshold)
- Define a Report Group (application, user group, time period)
- Extract the data set from existing task response time measurement samples
- Count the number of samples in the three zones (Satisfied, Tolerating, Frustrated)
- Calculate the Apdex formula:
  Apdex_T = (Satisfied + Tolerating/2) / Total samples
- Display the Apdex value showing T, e.g. 0.91 [6]
- Optionally display the value using quality colors: Unacceptable 0.00-0.50, Poor 0.50-0.70, Fair 0.70-0.85, Good 0.85-0.94, Excellent 0.94-1.00
©Netforecast.com - use by permission
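The counting and formula steps above reduce to a few lines of code. Here is a minimal Python sketch; the function name, argument names, and the sample list are illustrative assumptions, not part of the Apdex specification.

```python
def apdex(response_times, t):
    """Apdex_T for a list of task response times in seconds.

    Samples at or under T count as satisfied, samples between T and 4T
    as tolerating, and anything slower as frustrated.
    """
    if not response_times:
        raise ValueError("need at least one sample")
    f = 4 * t                                    # tolerating-frustrated threshold
    satisfied = sum(1 for rt in response_times if rt <= t)
    tolerating = sum(1 for rt in response_times if t < rt <= f)
    # Apdex_T = (Satisfied + Tolerating/2) / Total samples
    return (satisfied + tolerating / 2) / len(response_times)


# Example with T = 4 s and a handful of made-up sample times
print(round(apdex([0.8, 2.1, 3.9, 6.5, 12.0, 30.0], t=4), 2))   # 0.67
```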

12 Apdex Example (Computation)
T = 4 seconds (important in setting up the buckets, but not in the arithmetic)
- # of tx representing 'satisfied' = 3000
- # of tx representing 'tolerating' = 1000
- # of tx representing 'frustrated' = 500
- Total = 4500
Apdex score = (3000 + 1000/2) / 4500 = 0.78 (out of a possible 1.0) = 'fair' response time result
However, if we compute the 'average' response time, we get roughly 2.8 seconds, which is 'good/excellent'....
( (0.1*3000 + 4.1*1000 + 16.1*500) / 4500 ≈ 2.8 )
Hmmmm......
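For reference, the same arithmetic as runnable Python; the counts and per-bucket times come from the slide, the variable names are mine.

```python
# Slide 12 numbers: T = 4 s, 3000 satisfied, 1000 tolerating, 500 frustrated.
satisfied, tolerating, frustrated = 3000, 1000, 500
total = satisfied + tolerating + frustrated              # 4500 transactions

apdex_4 = (satisfied + tolerating / 2) / total           # 3500 / 4500 ≈ 0.78 -> 'fair'

# Average response time, using the slide's representative per-bucket times
# (0.1 s, 4.1 s, 16.1 s), looks deceptively healthy:
avg = (0.1 * satisfied + 4.1 * tolerating + 16.1 * frustrated) / total   # ≈ 2.8 s

print(f"Apdex[4] = {apdex_4:.2f}, average response time = {avg:.1f} s")
```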

13 (Another) Example
- Major eCommerce site ($4B annual on-line sales)
- North American broadband users accessing the San Francisco data center
- Response time distribution: 52% Satisfied, 42% Tolerating, 6% Frustrated
- This site had an average response time of 4 seconds, so it looked like all was well
- But: Apdex = 0.52 + 0.42/2 = 0.73 = Fair
[Slide chart: probability of experiencing a given load time for a typical business page, 0-120 seconds, shaded into the Satisfied / Tolerating / Frustrated zones.]
©Netforecast.com - use by permission
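Because the three categories partition the sample set, the score can also be computed straight from the percentages; a one-liner under that assumption, using the slide's figures:

```python
# Slide 13 distribution: 52% satisfied, 42% tolerating, 6% frustrated.
satisfied_share, tolerating_share = 0.52, 0.42
apdex_score = satisfied_share + tolerating_share / 2    # 0.52 + 0.21 = 0.73 -> 'fair'
```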

14 Soooo... What should T be set to?
Methods for determining T:
- BOGGSAT
- External SLA mandates
- Type of work:
  - Heads-down data entry
  - Research with a certain # of items of interest per entry
  - Competitive information submission/retrieval environment
  - Creative / analytical work
  - Control systems work
(What role does VARIABILITY play? How would you adjust the Apdex methodology to penalize high variability? One possible adjustment is sketched below.)
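The variability question on this slide is left open for discussion. Purely as a discussion aid, here is one hedged sketch of how a penalty could be bolted on: start from the ordinary Apdex score and subtract a term proportional to the coefficient of variation of the response times. The penalty form and the `weight` parameter are my assumptions, not part of the Apdex standard.

```python
import statistics

def apdex_with_variability_penalty(response_times, t, weight=0.1):
    """Non-standard variant: ordinary Apdex_T minus a penalty proportional to
    the coefficient of variation of the response times, clamped to [0, 1]."""
    f = 4 * t
    n = len(response_times)
    satisfied = sum(1 for rt in response_times if rt <= t)
    tolerating = sum(1 for rt in response_times if t < rt <= f)
    score = (satisfied + tolerating / 2) / n

    mean = statistics.fmean(response_times)
    cv = statistics.pstdev(response_times) / mean if mean > 0 else 0.0
    return min(1.0, max(0.0, score - weight * cv))
```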

15 Counting Interest Elements
- One
  - Simple check box
  - One data entry field: enter part number
- Few
  - Select among the following options
  - Expected few lines: headers of recently arrived
- Several
  - Type your first name, last name, address, phone number
  - Information on product, prices, shipping alternatives, etc. (the user will typically be interested in only a few of these information fields; do not assume that if you present 20, the user will read 20)
- Many
  - An interesting report that is read
  - Scrolling down the page for more content
©Netforecast.com - use by permission

16 Rating Repetitiveness
- Very High: there are many short tasks to the process
- High: there are a few tasks to the process
- Low: sometimes there are a few tasks, sometimes there is browsing
- Very Low: the user is browsing; there is no actual process being performed
©Netforecast.com - use by permission

17 Satisfied-Tolerating Threshold
The user is satisfied if the task completes within T seconds.
[Slide table: recommended values of T in seconds, indexed by Number of Elements Viewed (columns) and Task Repetitiveness (rows: Very High, High, Low, Very Low); the visible values range from 1 to 16 seconds. Source: NetForecast, Inc.]

18 Apdex SLA / SLT / SLO
[Slide chart: daily Apdex [4] for California, Colorado, Florida, Minnesota, and New York over the weekdays 2/25 through 3/17, plotted on a 0.55-1.00 scale against an SLO line at 0.85 and an SLA line at 0.60.]
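A chart like this usually gets paired with a mechanical check of each day's score against the SLO and SLA lines. A small sketch follows; the threshold values are read off the chart, while the function and the per-region scores are invented purely for illustration.

```python
# Hypothetical daily check against the thresholds drawn on the chart
# (SLO = 0.85, SLA = 0.60). Region names come from the slide; the scores
# below are illustrative only.
SLO, SLA = 0.85, 0.60

def status(score):
    if score < SLA:
        return "SLA breach"
    if score < SLO:
        return "below SLO"
    return "meets SLO"

daily_apdex = {"California": 0.92, "Colorado": 0.81, "Florida": 0.58,
               "Minnesota": 0.88, "New York": 0.74}
for region, score in sorted(daily_apdex.items()):
    print(f"{region}: Apdex[4] = {score:.2f} ({status(score)})")
```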

19 Resources
- Peter Sevcik, NetForecast, Inc., peter@netforecast.com, 955 Emerson Drive, Charlottesville, VA
- Apdex Alliance –
- "Designing the User Interface", Shneiderman / Plaisant
- "Information Dashboard Design", Stephen Few

20 Questions?

