1 Predictors of customer perceived software quality Paul Luo Li (ISRI – CMU) Audris Mockus (Avaya Research) Ping Zhang (Avaya Research)

1 Predictors of customer perceived software quality Paul Luo Li (ISRI – CMU) Audris Mockus (Avaya Research) Ping Zhang (Avaya Research)

2 Need to View Quality from the Customer's Perspective
"... We translate these advanced technologies into value for our customers ..." - IBM (#9 on the Fortune 500)
"... Our strategy is to offer products, services and solutions that are high tech, low cost and deliver the best customer experience." - HP (#11 on the Fortune 500)
"... We deliver unparalleled value to our customers. Only by serving our customers well do we justify our existence as a business." - Avaya (#401 on the Fortune 500)

3 What Would be Ideal
Predict customer perceived quality:
- Using customer characteristics
- For each customer
Key idea: Focus on the customer

4 Possible Applications of Predictions
- How do I plan deployment to meet the quality expectations of the customer?
- How do I target improvement efforts?
- How do I allocate the right resources to deal with customer problems?
Corresponding predictions:
- Predict customer experience for each customer
- Identify possible causes of problems
- Predict customer interactions

5 Solutions for Software Producers
- How do I plan deployment to meet the quality expectations of the customer?
- How do I target improvement efforts?
- How do I allocate the right resources to deal with customer problems?
Corresponding predictions:
- Predict customer experience for each customer
- Identify possible causes of problems
- Predict customer interactions

6 To Improve Customer Perceived Quality
- How do I plan deployment to meet the quality expectations of the customer?
- How do I target improvement efforts?
- How do I allocate the right resources to deal with customer problems?
Corresponding predictions:
- Predict customer experience for each customer
- Identify possible causes of problems
- Predict customer interactions

7 Gaps in Current Research
Prior work examined:
- Software defect prediction for a single customer (Musa et al. 1987, Lyu et al. 1996)
- Software defect prediction for modules or features (Jones et al. 1999, Khoshgoftaar et al. 1996)
This approach is not scalable.

8 Not Focused on Customers
Prior work examined:
- Software defect prediction for a single customer (Musa et al. 1987, Lyu et al. 1996)
- Software defect prediction for modules or features (Jones et al. 1999, Khoshgoftaar et al. 1996)
These approaches tell us nothing about a specific customer.

9 Does not Capture Other Aspects of Customer Perceived Quality
Prior work examined:
- Software defect prediction for a single customer (Musa et al. 1987, Lyu et al. 1996)
- Software defect prediction for modules or features (Jones et al. 1999, Khoshgoftaar et al. 1996)
It does not predict aspects of customer perceived quality that are not code related.

10 Research Contributions
- Predict software defects for each customer in a cost-effective manner
- Predict other aspects of customer perceived quality for each customer
- Empirically validate deployment, usage, software, and hardware predictors

11 Rest of This Talk
- The setting
- Customer interactions (outputs)
- Customer characteristics (inputs)
- Results
- Conclusion

12 Empirical Results from a Real World Software System
Avaya telephone call processing software system:
- 7 million+ lines of C/C++
- Fixed release schedule
- Process improvement efforts
- Tens of thousands of customers
  - 90% of Fortune 500 companies use it
- Professional support organization

13 Data Used are Commonly Available
- Customer issue tracking system
  - Trouble ticket database
- The equipment database
- Change management
  - Sablime database
Data collected as a part of everyday operations. Data sources available at other organizations, e.g. IBM and HP.

14 At Other Organizations
- Customer issue tracking system
  - Trouble ticket database
- The equipment database
- Change management
  - Sablime database
Data collected as a part of everyday operations. Data sources available at other organizations, e.g. IBM and HP.

15 Customer Interactions (Outputs)
We assume customer interactions == customer perceived quality.
Five customer interactions (Chulani et al. 2001, Buckley and Chillarege 1995) within 3 months of deployment:
- Software defects: high-impact problem
- System outages: high-impact problem
- Technician dispatches
- Calls
- Automated alarms
Important for Avaya, and likely for other organizations as well.

16 Examine Customer Installations
[Chart: number of deployments vs. months after general availability]

17 Capture Characteristics of Each Installation
[Chart: number of deployments vs. months after general availability]
- Customer 1: Deployed first month, a large system, Linux...
- Customer 2: Deployed first month, a small system, Windows...
- Customer 3: Deployed first month, a large system, proprietary OS...
- Customer 4: Deployed first month, a small system, Linux...
- Customer 5: Deployed first month, a large system, Linux...

18 Analyze Using Statistical Analysis
[Chart: number of deployments vs. months after general availability]
- Customer 1: Deployed first month, a large system, Linux...
- Customer 2: Deployed first month, a small system, Windows...
- Customer 3: Deployed first month, a large system, proprietary OS...
- Customer 4: Deployed first month, a small system, Linux...
- Customer 5: Deployed first month, a large system, Linux...
Identify similarities and differences.
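Per-installation characteristics like the ones on this slide have to become numeric features before statistical analysis. A minimal sketch of one way to do that; the field names and the one-hot encoding scheme are illustrative assumptions, not details from the talk:

```python
# Hypothetical per-installation records mirroring the slide's examples.
installations = [
    {"deploy_month": 1, "size": "large", "os": "linux"},
    {"deploy_month": 1, "size": "small", "os": "windows"},
    {"deploy_month": 1, "size": "large", "os": "proprietary"},
    {"deploy_month": 1, "size": "small", "os": "linux"},
    {"deploy_month": 1, "size": "large", "os": "linux"},
]

def encode(inst):
    """One-hot encode categorical fields into a numeric feature vector.

    Linux is the baseline OS category: both OS indicators are zero.
    """
    return [
        inst["deploy_month"],
        1 if inst["size"] == "large" else 0,
        1 if inst["os"] == "windows" else 0,
        1 if inst["os"] == "proprietary" else 0,
    ]

features = [encode(i) for i in installations]
```

Each row can then be paired with that installation's observed outcomes (defects, outages, calls) for model fitting.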

19 Category of Predictors (Kinds of Inputs)
We examine:
- Deployment issues
- Usage patterns
- Software platform
- Hardware configurations
Prior work examines:
- Software product
- Development process
Common-sense issues, but they lack empirical validation.

20 Category of Predictors (Kinds of Inputs)
We examine:
- Deployment issues
- Usage patterns
- Software platform
- Hardware configurations
Prior work examines:
- Software product
- Development process
Key idea: From the customer's perspective, software product and development process metrics are not good predictors (they do not vary for a single release).

21 Specific Predictors (Inputs)
- Total deployment time → deployment issues
- Operating system → software platform, hardware configurations
- System size → hardware configurations, software platform, usage patterns
- Ports → usage patterns, hardware configurations
- Software upgrades → deployment issues

22 Recap
Predict for each customer (outputs):
- Software defects
- System outages
- Technician dispatches
- Calls
- Automated alarms
Using logistic regression and linear regression, with predictors (inputs):
- Total deployment time
- Operating system
- System size
- Ports
- Software upgrades
For a real-world software system.
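The logistic-regression setup the recap describes can be sketched in a few lines. This is a toy illustration, not the study's actual model or data: the feature set is reduced to two predictors, the data are invented, and the fitting routine is a bare-bones gradient ascent rather than the authors' statistical tooling:

```python
import math

def sigmoid(z):
    # Numerically safe logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic regression by plain stochastic gradient ascent
    on the log-likelihood. X: feature vectors; y: 0/1 outcomes."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = yi - sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            w = [wj + lr * err * xj for wj, xj in zip(w, xi)]
            b += lr * err
    return w, b

# Invented data: features are [months_until_deployment, system_size];
# outcome 1 means the customer reported a software defect.
X = [[1, 2], [2, 3], [1, 1], [6, 1], [8, 2], [7, 3]]
y = [1, 1, 1, 0, 0, 0]
w, b = fit_logistic(X, y)

def predict(x):
    """Predicted probability of a defect for one customer."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

p_early = predict([1, 2])  # deployed early in the release cycle
p_late = predict([8, 2])   # deployed late in the release cycle
```

On this toy data, a customer who deploys late gets a lower predicted defect probability than one who deploys early, matching the direction of the deployment-time finding discussed later in the talk.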

23 Example: Field Defect Predictions

24 Predictors

25 Nuisance Variables

26 All Predictors are Important

27 The Most Important Predictor
Total deployment time (deployment issue)
- Systems deployed halfway into our observational period are 13 to 25 times less likely to experience a software defect.

28 May Enable Deployment Adjustments
Total deployment time (deployment issue)
- Systems deployed halfway into our observational period are 13 to 25 times less likely to experience a software defect.
- May be due to software patching, better tools, and more experienced technicians.

29 Another Important Predictor
Total deployment time (deployment issue)
Operating system (software platform, hardware configurations)
- Systems running on the proprietary OS are 3 times less likely to experience a software defect than systems running on the open OS (Linux).
- Systems running on the commercial OS (Windows) are 3 times more likely to experience a software defect than systems running on the open OS (Linux).
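Multipliers like "3 times more likely" come out of a logistic regression as exponentiated coefficients (odds ratios). A short sketch of that arithmetic; the coefficient values below are illustrative stand-ins, not the model's actual fitted values:

```python
import math

def odds_ratio(coef):
    """exp(coef): the multiplicative change in the odds of the outcome
    when the indicator predictor is 1 instead of 0."""
    return math.exp(coef)

# Illustrative coefficients for OS indicators, with Linux as the baseline:
beta_windows = math.log(3.0)       # odds of a defect ~3x the Linux baseline
beta_proprietary = -math.log(3.0)  # odds ~3x lower than the Linux baseline

or_windows = odds_ratio(beta_windows)
or_proprietary = odds_ratio(beta_proprietary)
```

Note that odds ratios equal probability ratios only approximately, when the outcome is rare; slide-level statements like "3 times more likely" gloss over that distinction.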

30 May Allow for Targeted Improvement or Improved Testing
Total deployment time (deployment issue)
Operating system (software platform, hardware configurations)
- May be due to familiarity with the operating system
- May be due to operating system complexity

31 More Results in Paper
- The complete results and analyses for field defects
- Predictions for other customer interactions

32 Validation of Results and Method
We accounted for data reporting differences:
- Included indicator variables in the models to identify populations (e.g. US or international customers)
We independently validated the data collection process:
- Independently extracted data and performed analyses
We interviewed personnel to validate findings:
- Programmers
- Field technicians
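The indicator-variable adjustment mentioned on this slide amounts to appending a 0/1 column per population to the feature matrix. A minimal sketch, where the US/international split and all names are illustrative assumptions:

```python
def add_population_indicator(features, is_international):
    """Append a 0/1 indicator column so the model can absorb systematic
    reporting differences between populations (e.g. US vs. international)."""
    return [row + [1 if intl else 0]
            for row, intl in zip(features, is_international)]

# Two hypothetical customers with two existing predictor columns each.
X = [[1.0, 2.0], [3.0, 4.0]]
X_adj = add_population_indicator(X, [False, True])
```

The fitted coefficient on the indicator then captures the population-level shift, keeping the other predictors' estimates from being distorted by reporting differences.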

33 Summary: Identified Predictors of Customer Perceived Quality
We identified and quantified characteristics, like time of deployment, that can affect customer perceived quality by more than an order of magnitude.

34 Summary: Modeled Customer Interactions
We identified and quantified characteristics, like time of deployment, that can affect customer perceived quality by more than an order of magnitude.
We created models that can predict various customer interactions and found that predictors have consistent effects across interactions.

35 Summary: Deployment is Important for High Reliability
We identified and quantified characteristics, like time of deployment, that can affect customer perceived quality by more than an order of magnitude.
We created models that can predict various customer interactions and found that predictors have consistent effects across interactions.
We learned that controlled deployment may be the key to high-reliability systems.

36 Improve Customers' Experiences
- You can target improvement efforts
- You can allocate the right resources to deal with customer-reported problems
- You can adjust deployment to meet the quality expectations of your customers

37 Predictors of customer perceived software quality Paul Luo Li Audris Mockus (Avaya Research) Ping Zhang (Avaya Research)

38 Predicted Number of Calls Matches Actual Number of Calls
[Chart: calls over time for the next release; predictions are made at the marked point]