Presentation transcript:

Apdex (Application Performance Index): A Rational Way to Report Delivered Service Quality to Users of the IT Resource?
David Halbig, Rocky Mountain CMG, Denver, Colorado, Thursday, February 26, 2009

Agenda
- Are there metrics that are of primary importance?
- What kind of presentation should we use? Raw data? Derived data? Formatting?
- How do we determine thresholds?

What Metrics are Important?
- How busy is a given resource?
- What is the end-user perceiving?
Delivered Service isn't EVERYTHING! It's the ONLY thing! (with apologies to Vince Lombardi)

What kind of presentation should we use? [Three slides of example data presentations; the charts were not captured in the transcript]

What kind of presentation should we use?
Problems with Raw Data Presentation:
- Does not reflect user satisfaction
- Does not capture the effects of a multi-modal distribution
- The definition of 'good' varies with each application, making cross-comparisons difficult

Apdex Defined
- Apdex is a numerical measure of user satisfaction with the performance of enterprise applications
- It defines a method that converts many measurements into one number
- Uniform 0-1 scale: 0 = no users satisfied, 1 = all users satisfied
- Standardized method that yields a comparable metric across applications, measurement approaches, and enterprises
©Netforecast.com - use by permission

How Apdex Works
1. Define T for the application: T = target time (satisfied-tolerating threshold); F = 4T (tolerating-frustrated threshold)
2. Define a report group (application, user group, time period)
3. Extract a data set from existing task response time measurements
4. Count the number of samples in the three zones: Satisfied (up to T), Tolerating (T to F), Frustrated (over F)
5. Calculate the Apdex formula:
   Apdex_T = (Satisfied count + Tolerating count / 2) / Total samples
6. Display the Apdex value showing T, optionally using the quality colors: Unacceptable (below 0.50), Poor (0.50-0.70), Fair (0.70-0.85), Good (0.85-0.94), Excellent (0.94-1.00)
[Diagram: task response time samples sorted into the Satisfied, Tolerating, and Frustrated zones, with an example result displayed as 0.91 [6], i.e. an Apdex of 0.91 at T = 6]
©Netforecast.com - use by permission
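To make the calculation concrete, here is a minimal Python sketch of the formula above applied to raw task response times; the function name and the sample values are illustrative, not from the slides.

```python
# Minimal sketch of the Apdex calculation described above, assuming raw task
# response times (in seconds) are available as a sequence of samples.
from typing import Iterable

def apdex(response_times: Iterable[float], t: float) -> float:
    """Apdex_T = (satisfied + tolerating / 2) / total samples."""
    f = 4 * t                                   # tolerating-frustrated threshold
    samples = list(response_times)
    satisfied = sum(1 for rt in samples if rt <= t)
    tolerating = sum(1 for rt in samples if t < rt <= f)
    # Frustrated samples (rt > f) contribute nothing to the numerator.
    return (satisfied + tolerating / 2) / len(samples)

# Example: six measurements scored against T = 6 seconds -> 0.67
print(round(apdex([1.2, 3.5, 5.9, 7.0, 10.4, 30.0], t=6.0), 2))
```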

Apdex Example (Computation)
- T = 4 seconds (important in setting up the buckets, but not in the math)
- # of tx representing 'satisfied' = 3000
- # of tx representing 'tolerating' = 1000
- # of tx representing 'frustrated' = 500
- Total = 4500
- Apdex score = (3000 + 1000/2) / 4500 ≈ 0.78 (out of a possible 1.0) = a 'fair' response time result
- However, if we compute the 'average' response time, we get about 2.8 seconds ((0.1*3000 + 4.1*1000 + 16.1*500) / 4500 ≈ 2.8), which looks 'good/excellent'…
- Hmmmm……
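As a quick check of the arithmetic above, the same numbers can be recomputed from the bucket counts; the per-bucket representative times of 0.1, 4.1, and 16.1 seconds are the slide's illustrative values.

```python
# Recomputing the worked example from the bucket counts (T = 4 seconds).
satisfied, tolerating, frustrated = 3000, 1000, 500
total = satisfied + tolerating + frustrated

apdex_score = (satisfied + tolerating / 2) / total
avg_response = (0.1 * satisfied + 4.1 * tolerating + 16.1 * frustrated) / total

print(f"Apdex[4]         = {apdex_score:.2f}")     # ~0.78 -> a 'fair' rating
print(f"Average response = {avg_response:.1f} s")  # ~2.8 s, looks fine next to T = 4 s
```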

(Another) Example
- Major eCommerce site ($4B annual on-line sales); North American broadband users accessing the San Francisco data center
- 52% Satisfied, 42% Tolerating, 6% Frustrated
- This site had an average response time of 4 seconds, so it looked like all was well
- But: Apdex = 0.7310 = Fair
[Chart: probability of experiencing a given load time for a typical business page, 0-120 seconds, with the Satisfied, Tolerating, and Frustrated regions marked]
©Netforecast.com - use by permission
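The Fair rating follows directly from the satisfaction shares and the quality bands listed on the "How Apdex Works" slide; a short sketch of that mapping (the rating function is illustrative, the band boundaries come from that slide):

```python
# Mapping an Apdex value to the quality bands from the 'How Apdex Works' slide.
def apdex_rating(score: float) -> str:
    if score >= 0.94:
        return "Excellent"
    if score >= 0.85:
        return "Good"
    if score >= 0.70:
        return "Fair"
    if score >= 0.50:
        return "Poor"
    return "Unacceptable"

# Rounded shares from this example: 52% satisfied, 42% tolerating.
print(apdex_rating(0.52 + 0.42 / 2))   # 0.73 -> 'Fair'
```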

Soooo… What should T be set to?
Methods for determining T:
- BOGGSAT
- External SLA mandates
- Type of work:
  - Heads-down data entry
  - Research with a certain # of items of interest per entry
  - Competitive information submission/retrieval environment
  - Creative / analytical work
  - Control systems work
(What role does VARIABILITY play? How would you adjust the Apdex methodology to penalize high variability? One possible answer is sketched below.)
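The slide leaves the variability question open. Purely as a hypothetical illustration, and not part of the Apdex specification, one could discount the score by a penalty tied to the coefficient of variation of the response times:

```python
# Hypothetical illustration only: discount the Apdex score by a penalty
# proportional to the coefficient of variation (stdev / mean) of the samples.
# This is NOT part of the Apdex standard; it sketches the question the slide raises.
import statistics

def variability_adjusted_apdex(response_times, t, penalty_weight=0.1):
    f = 4 * t
    satisfied = sum(1 for rt in response_times if rt <= t)
    tolerating = sum(1 for rt in response_times if t < rt <= f)
    score = (satisfied + tolerating / 2) / len(response_times)
    cv = statistics.stdev(response_times) / statistics.mean(response_times)
    return max(0.0, score - penalty_weight * cv)    # clamp at zero

# Two sample sets: similar central tendency, very different spread.
steady  = [3.0, 3.2, 3.1, 3.3, 3.0, 3.2]
erratic = [0.5, 6.5, 0.4, 6.8, 0.6, 6.2]
print(round(variability_adjusted_apdex(steady, t=4.0), 2))   # ~1.0, tiny penalty
print(round(variability_adjusted_apdex(erratic, t=4.0), 2))  # noticeably lower
```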

Counting Interest Elements
- One: A simple check box; one data entry field (e.g., enter a part number)
- Few: Select among the following options; an expected few lines (e.g., the headers of recently arrived email)
- Several: Type your first name, last name, address, phone number; information on product, prices, shipping alternatives, etc. (The user will typically be interested in only a few of these information fields; do not assume that if you present 20, the user will read 20.)
- Many: An interesting report that is read; scrolling down the page for more content
©Netforecast.com - use by permission

Rating Repetitiveness
- Very High: There are many short tasks in the process
- High: There are a few tasks in the process
- Low: Sometimes there are a few tasks, sometimes there is browsing
- Very Low: The user is browsing; there is no actual process being performed
©Netforecast.com - use by permission

Satisfied-Tolerating Threshold
The user is satisfied if the task completes within T seconds.
[Table: suggested T values (in seconds) by number of elements viewed versus task repetitiveness (Very High, High, Low, Very Low); only scattered values (1, 2, 3, 4, 6, 8, 9, 12, 16) survived extraction, so the full table is not reproduced here. Source: NetForecast, Inc.]

Apdex SLA (SLT / SLO / SLA)
[Chart: daily Apdex [4] values for weekdays from 2/25 through 3/17, one line per location (California, Colorado, Florida, Minnesota, New York), plotted on a 0.55-1.00 scale against SLO and SLA threshold lines]
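The exact SLO and SLA levels on the chart are not recoverable from the transcript; the sketch below assumes illustrative thresholds of 0.85 and 0.60 to show the kind of daily check such a chart supports.

```python
# Illustrative only: checking a day's Apdex per site against SLO/SLA levels.
# The 0.85 / 0.60 thresholds and the sample values are assumptions, not slide data.
def sla_status(apdex_value: float, slo: float = 0.85, sla: float = 0.60) -> str:
    if apdex_value >= slo:
        return "meets SLO"
    if apdex_value >= sla:
        return "below SLO, within SLA"
    return "SLA violation"

daily_apdex = {"California": 0.91, "Colorado": 0.78, "New York": 0.57}
for site, value in daily_apdex.items():
    print(f"{site}: Apdex[4] = {value:.2f} -> {sla_status(value)}")
```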

Resources
- Peter Sevcik, NetForecast, Inc., 955 Emerson Drive, Charlottesville, VA 22901; 434 249 1310; peter@netforecast.com; www.netforecast.com
- Apdex Alliance – www.apdex.org
- "Designing the User Interface", Shneiderman / Plaisant
- "Information Dashboard Design", Stephen Few

Questions?