
1 Apdex (Application Performance Index): A Rational Way to Report Delivered Service Quality to Users of the IT Resource? David Halbig, Rocky Mountain CMG, Denver, Colorado, Thursday, Feb 26, 2009

2 Agenda
- Are there metrics that are of primary importance?
- What kind of presentation should we use?
  - Raw data?
  - Derived data?
  - Formatting?
- How do we determine thresholds?

3 What Metrics are Important?
- How busy is a given resource?
- What is the end-user perceiving?

4 What Metrics are Important?
- How busy is a given resource?
- What is the end-user perceiving?
Delivered Service isn't EVERYTHING!

5 What Metrics are Important?
- How busy is a given resource?
- What is the end-user perceiving?
Delivered Service isn't EVERYTHING! It's the ONLY thing! (with apologies to Vince Lombardi)

6 What kind of presentation should we use?

7-8 [Image-only slides; no transcript text survives.]

9 Problems with Raw Data Presentation
- Does not reflect user satisfaction
- Does not capture the effects of a multi-modal distribution
- The definition of 'good' varies with each application, making cross-comparisons difficult

10 Apdex Defined
- Apdex is a numerical measure of user satisfaction with the performance of enterprise applications
- It defines a method that converts many measurements into one number
  - Uniform 0-1 scale
  - 0 = no users satisfied
  - 1 = all users satisfied
- A standardized method that yields a comparable metric across
  - Applications,
  - Measurement approaches, and
  - Enterprises
©Netforecast.com - use by permission

11 How Apdex Works
1. Define T for the application (T = target time, the satisfied-tolerating threshold; F = 4T, the tolerating-frustrated threshold)
2. Define a Report Group (application, user group, time period)
3. Extract the data set from existing task response time measurement samples
4. Count the number of samples in the three zones: Satisfied (<= T), Tolerating (between T and F), Frustrated (> F)
5. Calculate the Apdex formula: Apdex_T = (Satisfied + Tolerating/2) / Total samples
6. Display the Apdex value showing T, e.g. 0.91 [6]
7. Optionally display the value using quality colors: Excellent 0.94-1.00, Good 0.85-0.94, Fair 0.70-0.85, Poor 0.50-0.70, Unacceptable 0.00-0.50
©Netforecast.com - use by permission
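The seven steps above can be sketched in a few lines of Python. This is an illustrative implementation, not official Apdex Alliance code; the function and table names are my own, but the formula and rating bands come from the slide.

```python
def apdex(samples, t):
    """Apdex_T for a list of task response times in seconds.

    T is the satisfied-tolerating threshold; F = 4T is the
    tolerating-frustrated threshold.
    """
    f = 4 * t
    satisfied = sum(1 for s in samples if s <= t)
    tolerating = sum(1 for s in samples if t < s <= f)
    # Frustrated samples (> F) contribute nothing to the numerator.
    return (satisfied + tolerating / 2) / len(samples)

# Quality-color boundaries from the slide above.
RATINGS = [(0.94, "Excellent"), (0.85, "Good"), (0.70, "Fair"),
           (0.50, "Poor"), (0.00, "Unacceptable")]

def rating(score):
    return next(label for bound, label in RATINGS if score >= bound)
```

For example, with T = 6, the samples [1, 3, 5, 8, 30] give 3 satisfied, 1 tolerating (8 <= 24), and 1 frustrated, so Apdex_6 = (3 + 0.5)/5 = 0.70, a "Fair" result.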

12 Apdex Example (Computation)
T = 4 seconds (important in setting up the buckets, but not in the math)
# of tx representing 'satisfied' = 3000
# of tx representing 'tolerating' = 1000
# of tx representing 'frustrated' = 500
Total = 4500
Apdex score = (3000 + 1000/2) / 4500 ≈ 0.78 (out of a possible 1.0) = a 'fair' response time result
However, if we compute 'average' response time using representative per-zone times, we get roughly 2.8-second response time, which looks 'good/excellent'....
((0.1 * 3000) + (4.1 * 1000) + (16.1 * 500)) / 4500 ≈ 2.8
Hmmmm......
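The slide's arithmetic can be checked directly. A quick sketch; the per-zone times of 0.1 s, 4.1 s, and 16.1 s are the slide's representative values, not measured data:

```python
def apdex_from_counts(satisfied, tolerating, frustrated):
    """Apdex from zone counts: (satisfied + tolerating/2) / total."""
    total = satisfied + tolerating + frustrated
    return (satisfied + tolerating / 2) / total

score = apdex_from_counts(3000, 1000, 500)
print(round(score, 2))  # 0.78 -> 'fair'

# The same data summarized as an average, using the slide's
# representative per-zone response times:
avg = (0.1 * 3000 + 4.1 * 1000 + 16.1 * 500) / 4500
print(round(avg, 1))  # 2.8 seconds -> looks 'good/excellent'
```

The point of the slide survives the code: the average hides the 500 frustrated users, while the Apdex score does not.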

13 (Another) Example
[Chart: load time of a typical business page (sec) vs. probability of experiencing that time, with zones marked 52% Satisfied, 42% Tolerating, 6% Frustrated]
Major eCommerce site ($4B annual on-line sales); North American broadband users accessing the San Francisco data center
This site had an average response time of 4 seconds, so it looked like all was well
But: Apdex = 0.73 [10] = Fair
©Netforecast.com - use by permission

14 Soooo... what should T be set to? Methods for determining T:
- BOGGSAT ("bunch of guys/gals sitting around a table")
- External SLA mandates
- Type of work:
  - Heads-down data entry
  - Research with a certain number of items of interest per entry
  - Competitive information submission/retrieval environment
  - Creative / analytical work
  - Control-systems work
(What role does VARIABILITY play? How would you adjust the Apdex methodology to penalize high variability?)

15 Counting Interest Elements
One
- Simple check box
- One data entry field: enter part number
Few
- Select among the following options
- Expected few lines: headers of recently arrived email
Several
- Type your first name, last name, address, phone number
- Information on product, prices, shipping alternatives, etc. The user will typically be interested in only a few of these information fields; do not assume that if you present 20, the user will read 20
Many
- Interesting report that is read
- Scrolling down the page for more content
©Netforecast.com - use by permission

16 Rating Repetitiveness
Very High - There are many short tasks in the process
High - There are a few tasks in the process
Low - Sometimes there are a few tasks, sometimes there is browsing
Very Low - The user is browsing; there is no actual process being performed
©Netforecast.com - use by permission

17 Satisfied-Tolerating Threshold (the user is satisfied if the task completes within T seconds)

                        Number of Elements Viewed
  Task Repetitiveness     1     2     3     4
  Very High               1     2     3     4
  High                    2     4     6     8
  Low                     3     6     9    12
  Very Low                4     8    12    16

Source: NetForecast, Inc.
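The table reduces to "seconds of T per interest element, scaled by repetitiveness." A minimal lookup sketch; the names are mine, the values are from the table above:

```python
# Seconds of T per interest element, by task repetitiveness
# (each row of the table is a multiple of the element count).
SECONDS_PER_ELEMENT = {
    "very high": 1,  # many short tasks in the process
    "high": 2,       # a few tasks in the process
    "low": 3,        # mixed tasks and browsing
    "very low": 4,   # browsing; no real process
}

def target_time(elements_viewed, repetitiveness):
    """Satisfied-tolerating threshold T in seconds.

    The published table only covers 1-4 interest elements.
    """
    if not 1 <= elements_viewed <= 4:
        raise ValueError("table covers 1-4 interest elements")
    return elements_viewed * SECONDS_PER_ELEMENT[repetitiveness]
```

For instance, a highly repetitive two-element task gets T = 4 seconds, while a browsing-style four-element task gets T = 16 seconds.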

18 Apdex SLA
[Chart: daily weekday Apdex [4] values from 2/25 through 3/17 for California, Colorado, Florida, Minnesota, and New York, plotted on a 0.55-1.00 scale against SLT, SLO, and SLA reference lines; the SLA level shown is 0.92]

19 Resources
Peter Sevcik, NetForecast, Inc.
peter@netforecast.com | www.netforecast.com
955 Emerson Drive, Charlottesville, VA 22901
434 249 1310
Apdex Alliance - www.apdex.org
"Designing the User Interface", Shneiderman / Plaisant
"Information Dashboard Design", Stephen Few

20 Questions?

