Apdex (Application Performance Index): A Rational Way to Report Delivered Service Quality to Users of the IT Resource?
David Halbig
Rocky Mountain CMG, Denver, Colorado
Thursday, Feb 26
Agenda
- Are there metrics that are of primary importance?
- What kind of presentation should we use? Raw data? Derived data? Formatting?
- How do we determine thresholds?
What Metrics are Important?
- How busy is a given resource?
- What is the end-user perceiving?
Delivered Service isn't EVERYTHING! It's the ONLY thing! (with apologies to Vince Lombardi)
What kind of presentation should we use?
Problems with Raw Data Presentation
- Does not reflect user satisfaction
- Does not capture the effects of multi-modal distributions
- The definition of 'good' varies with each application, making cross-comparisons difficult
Apdex Defined
Apdex is a numerical measure of user satisfaction with the performance of enterprise applications. It defines a method that converts many measurements into one number:
- Uniform 0-1 scale
- 0 = no users satisfied
- 1 = all users satisfied
It is a standardized method, so the resulting metric is comparable across:
- Applications,
- Measurement approaches, and
- Enterprises
©Netforecast.com - use by permission
How Apdex Works

    Apdex_T = (Satisfied count + Tolerating count / 2) / Total samples

Quality bands: 0.94-1.00 Excellent, 0.85-0.94 Good, 0.70-0.85 Fair, 0.50-0.70 Poor, 0.00-0.50 Unacceptable

1. Define T for the application (T = target time, the satisfied-tolerating threshold; F = 4T, the tolerating-frustrated threshold)
2. Define a Report Group (application, user group, time period)
3. Extract the data set from existing measurements
4. Count the number of samples in each of the three zones
5. Calculate the Apdex formula
6. Display the Apdex value showing T
7. Optionally display the value using quality colors
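The formula and quality bands above can be sketched in Python; the function names are illustrative, not from any Apdex library.

```python
def apdex(satisfied, tolerating, frustrated):
    """Apdex = (satisfied + tolerating/2) / total samples, on a 0-1 scale."""
    total = satisfied + tolerating + frustrated
    if total == 0:
        raise ValueError("no samples in the report group")
    return (satisfied + tolerating / 2) / total

def rating(score):
    """Map an Apdex score to the standard quality bands."""
    if score >= 0.94:
        return "Excellent"
    if score >= 0.85:
        return "Good"
    if score >= 0.70:
        return "Fair"
    if score >= 0.50:
        return "Poor"
    return "Unacceptable"
```

For example, `rating(apdex(3000, 1000, 500))` returns "Fair".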
Apdex Example (Computation)
T = 4 seconds (important in setting up the buckets, but not in the math)
- # of tx representing 'satisfied' = 3000
- # of tx representing 'tolerating' = 1000
- # of tx representing 'frustrated' = 500
- Total = 4500
Apdex score = (3000 + 1000/2) / 4500 = 3500 / 4500 ≈ 0.78 (out of a possible 1.0) = 'fair' response time result
However, if we compute 'average' response time, we get ((0.1 * 3000) + (4.1 * 1000) + (16.1 * 500)) / 4500 ≈ 2.77 seconds, which looks 'good/excellent'....
Hmmmm......
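The divergence between the two numbers is easy to reproduce; the per-bucket representative times (0.1 s, 4.1 s, 16.1 s) come from the slide's average calculation.

```python
counts = {"satisfied": 3000, "tolerating": 1000, "frustrated": 500}
times  = {"satisfied": 0.1,  "tolerating": 4.1,  "frustrated": 16.1}  # seconds

total = sum(counts.values())
apdex = (counts["satisfied"] + counts["tolerating"] / 2) / total
mean  = sum(counts[k] * times[k] for k in counts) / total

print(f"Apdex = {apdex:.2f}, mean response = {mean:.2f} s")
# prints "Apdex = 0.78, mean response = 2.77 s"
```

The same data set rates as only 'fair' under Apdex while its average looks comfortably fast, which is exactly the slide's point.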
(Another) Example
[Chart: load time of a typical business page (sec) vs. probability of experiencing that time]
- 52% Satisfied, 42% Tolerating, 6% Frustrated
- Major eCommerce site ($4B annual on-line sales); North American broadband users accessing the San Francisco data center
- This site had an average response time of 4 seconds, so it looked like all was well
- But: Apdex = 0.52 + 0.42/2 = 0.73 = Fair
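Deriving the three zone counts from raw response times needs only T (with F = 4T). A minimal sketch, assuming times in seconds; the sample list is illustrative:

```python
def bucket(times, T):
    """Split raw response times into Apdex zones:
    t <= T satisfied, T < t <= 4T tolerating, t > 4T frustrated."""
    F = 4 * T
    satisfied = sum(t <= T for t in times)
    tolerating = sum(T < t <= F for t in times)
    frustrated = len(times) - satisfied - tolerating
    return satisfied, tolerating, frustrated

samples = [0.8, 2.1, 3.9, 5.0, 7.5, 17.0]  # illustrative measurements
s, t, f = bucket(samples, T=4)
score = (s + t / 2) / len(samples)
```

With T = 4 s these six samples split 3 / 2 / 1, giving an Apdex of about 0.67.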
Soooo... What should T be set to?
Methods for determining T:
- BOGGSAT (a Bunch Of Guys/Gals Sitting Around A Table)
- External SLA mandates
- Type of work:
  - Heads-down data entry
  - Research with a certain # of items of interest per entry
  - Competitive information submission/retrieval environment
  - Creative / analytical work
  - Control systems work
(What role does VARIABILITY play? How would you adjust the Apdex methodology to penalize high variability?)
Counting Interest Elements
- One
  - Simple check box
  - One data entry field: enter part number
- Few
  - Select among the following options
  - Expected few lines: headers of recently arrived
- Several
  - Type your first name, last name, address, phone number
  - Information on product, prices, shipping alternatives, etc. The user will typically only be interested in a few of these information fields; do not assume that if you present 20, the user will read 20
- Many
  - Interesting report that is read
  - Scrolling down the page for more content
Rating Repetitiveness
- Very High: there are many short tasks to the process
- High: there are a few tasks to the process
- Low: sometimes there are a few tasks, sometimes there is browsing
- Very Low: the user is browsing; there is no actual process being performed
Satisfied-Tolerating Threshold
User is satisfied if the task completes by T seconds.

    Task Repetitiveness | Number of Elements Viewed
                        |  One   Few   Several   Many
    Very High           |   1     2       3        4
    High                |   2     4       6        8
    Low                 |   3     6       9       12
    Very Low            |   4     8      12       16

(T in seconds)
Source: NetForecast, Inc.
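The threshold guidance can be encoded as a simple lookup. The Very High, High, and Low rows are transcribed from the slide; the Very Low row is inferred from the table's multiplicative pattern, so treat the whole thing as guidance rather than a spec.

```python
# Suggested T (seconds) by task repetitiveness and number of elements viewed.
# Very Low row is inferred from the pattern (row factor x element count).
T_TABLE = {
    "very high": {"one": 1, "few": 2, "several": 3,  "many": 4},
    "high":      {"one": 2, "few": 4, "several": 6,  "many": 8},
    "low":       {"one": 3, "few": 6, "several": 9,  "many": 12},
    "very low":  {"one": 4, "few": 8, "several": 12, "many": 16},
}

def suggest_T(repetitiveness, elements):
    """Look up a starting T value for an application's Apdex setup."""
    return T_TABLE[repetitiveness.lower()][elements.lower()]
```

For example, a highly repetitive task showing a few elements would start from T = 4 seconds.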
Apdex SLA
[Chart: daily Apdex [4] for California, Colorado, Florida, Minnesota, and New York, weekdays 2/25-3/17, plotted against SLT, SLO, and SLA lines; SLA = 0.92]
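An SLA check like the one the chart supports could be sketched as follows. The region names and the 0.92 threshold come from the slide; the function name and the sample scores are illustrative.

```python
def sla_breaches(daily_apdex, sla=0.92):
    """Return the (region, day) pairs whose Apdex fell below the SLA threshold."""
    return [(region, day)
            for region, days in daily_apdex.items()
            for day, score in days.items()
            if score < sla]

scores = {  # illustrative weekday Apdex values, not the chart's actual data
    "California": {"3/1": 0.95, "3/2": 0.90},
    "Colorado":   {"3/1": 0.93, "3/2": 0.94},
}
breaches = sla_breaches(scores)  # [("California", "3/2")]
```

A per-region, per-day breach list like this is the raw material for the colored SLA chart on the slide.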
Resources
- Peter Sevcik, NetForecast, 955 Emerson Drive, Charlottesville, VA - www.netforecast.com
- Apdex Alliance
- "Designing the User Interface", Shneiderman / Plaisant
- "Information Dashboard Design", Stephen Few
Questions?