
Slide 1: The DESMET Methodology
Evaluating software methods and tools using the DESMET Methodology
Barbara Kitchenham, Steve Linkman, Susan Linkman
Keele University
© 1997, BAK.

Slide 2: Agenda
Evaluation methods
Methods
Selecting an appropriate method

Slide 3: Evaluation methods
Two aspects:
Nature of evaluation outcome
– Assessment of suitability: qualitative/subjective
– Measurable benefits: quantitative/objective
Organisation of evaluation
– formal experiment
– case study
– survey

Slide 4: Qualitative methods
Feature analysis
– User requirements mapped to method/tool features
Subjective assessment
– how well is the feature supported?
– how usable is the functionality?
Problems:
– Selection of features
– Subjectivity of rating
– Collation of results
– Too many features
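
A minimal sketch (not from the original slides) of how feature-analysis ratings might be collated once features have been selected and rated; the feature names, importance weights, and 0–5 rating scale are illustrative assumptions, not part of DESMET.

    # Illustrative only: collating feature-analysis ratings with a simple
    # weighted score. Feature names, weights, and the 0-5 rating scale are
    # assumptions made for this sketch.

    # importance weight (1 = nice to have ... 5 = mandatory) per feature
    weights = {"traceability": 5, "ease_of_use": 4, "report_generation": 2}

    # subjective ratings (0 = no support ... 5 = full support) for two candidate tools
    ratings = {
        "Tool A": {"traceability": 4, "ease_of_use": 3, "report_generation": 5},
        "Tool B": {"traceability": 5, "ease_of_use": 2, "report_generation": 1},
    }

    def weighted_score(tool_ratings, weights):
        """Weighted sum of ratings, normalised to a percentage of the maximum."""
        total = sum(weights[f] * r for f, r in tool_ratings.items())
        maximum = 5 * sum(weights.values())
        return 100.0 * total / maximum

    for tool, tool_ratings in ratings.items():
        print(f"{tool}: {weighted_score(tool_ratings, weights):.1f}% of maximum score")

The collation problem noted on the slide shows up here as the choice of weights and scale: different but equally defensible choices can reorder the tools.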

Slide 5: Quantitative methods
Measured benefits of a method/tool
Objective assessment
– measure quality and/or productivity
– compare results using different methods/tools
Problems:
– Not all benefits are quantitative
– Some quantitative benefits are hard to measure
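
A minimal sketch, again not from the slides, of the kind of comparison a quantitative evaluation makes: an objective measure (here defect density, defects per KLOC) collected for projects using the current method and for pilot projects using the new one. All figures are invented for illustration; a real evaluation would also test whether the observed difference is statistically significant.

    # Illustrative only: comparing a simple quantitative measure (defect density,
    # defects per KLOC) between the current method and a new method.
    current_method = [4.2, 5.1, 3.8, 4.9]   # defects/KLOC on baseline projects (invented)
    new_method     = [3.1, 2.8, 3.5]        # defects/KLOC on pilot projects (invented)

    def mean(xs):
        return sum(xs) / len(xs)

    improvement = 100.0 * (mean(current_method) - mean(new_method)) / mean(current_method)
    print(f"Mean defect density: {mean(current_method):.2f} vs {mean(new_method):.2f} "
          f"({improvement:.0f}% apparent reduction)")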

Slide 6: Hybrid methods
Specific techniques:
Benchmarking
– objective performance measures
– subjective selection of “tests”
Qualitative effects analysis
– subjective expert opinion about quantitative benefits
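
A minimal benchmarking sketch under assumed names: the test suite is chosen subjectively, but the performance measure (elapsed time) is objective. run_tool is a hypothetical stand-in for invoking the tool being evaluated.

    # Illustrative only: a tiny benchmarking harness. The "tests" are chosen
    # subjectively; the measure (elapsed time) is objective.
    import time

    def run_tool(test_case):
        """Hypothetical stand-in for invoking the tool under evaluation."""
        time.sleep(0.01 * len(test_case))   # placeholder workload
        return len(test_case)

    test_suite = ["small input", "a medium sized input", "a rather larger benchmark input"]

    for case in test_suite:
        start = time.perf_counter()
        run_tool(case)
        elapsed = time.perf_counter() - start
        print(f"{case!r}: {elapsed * 1000:.1f} ms")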

Slide 7: Formal experiment
Scientific paradigm
– Many subjects (engineers)
– Perform specified task(s)
– Subjects assigned at random to methods
Randomisation and replication
– minimise bias
– ensure results are trustworthy
Best for precise answers to limited questions
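
A small sketch of the randomisation step described above: subjects are shuffled and dealt out to the methods under comparison so that each method gets an equal-sized group (replication). The subject identifiers and method names are placeholders, and the fixed seed is only there so the assignment can be reproduced when reporting the experiment.

    # Illustrative only: random, balanced assignment of subjects to methods.
    import random

    subjects = ["S01", "S02", "S03", "S04", "S05", "S06", "S07", "S08"]
    methods = ["Method A", "Method B"]

    random.seed(42)          # fixed seed so the assignment is reproducible
    random.shuffle(subjects)

    # Deal shuffled subjects out in turn: equal group sizes give replication
    assignment = {m: subjects[i::len(methods)] for i, m in enumerate(methods)}
    for method, group in assignment.items():
        print(method, "->", group)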

Slide 8: Case studies
Method/tool tried out on a “real” project
– Results scale to the real world
– Limited replication, so problems with comparisons

Slide 9: Surveys
For “mature” methods/tools
– People/groups that use the method or tool are polled
– Database of results analysed

Slide 10: Nine evaluation methods
Feature analysis
– Formal experiment
– Case study
– Survey
– Screening mode
Quantitative evaluation
– Formal experiment
– Case study
– Survey
Qualitative effects analysis
Benchmarking

Slide 11: Problem
9 evaluation methods
An embarrassment of riches
Which method should you use?
It depends on what you want to do

Slide 12: 7 Selection Criteria
Evaluation project goals
Evaluation capability of the organisation
Nature of the evaluation object
Nature of impact
Scope of impact
Maturity of the evaluation object
Learning curve

Slide 13: Evaluation goals
Choice of methods for an individual project
Selection of methods & tools for an organisation
Monitoring changes as part of a process improvement programme
– evaluation of a proposed change
– effect of adoption of the change
Selection of a method/tool for resale

Slide 14: Evaluation capability
Characteristics of an organisation affect its ability to perform evaluations
Four types of organisational capability:
1. Severely limited
– each project is different
2. Qualitative evaluation capability
– projects follow the same standards
3. Quantitative & qualitative
– all projects keep project metrics
4. Full evaluation capability
– the organisation maintains a store of project data

Slide 15: Nature of the evaluation object
Method (or method/tool combination)
– likely to have a major impact
– quantitative assessment advisable
Tool
– comparing alternatives suggests feature analysis
– tool v. no tool suggests quantitative assessment
Generic method
– e.g. object-oriented v. structured methods
– can only try out specific methods/tools
– generic assessment needs expert opinion

Slide 16: Scope of impact
Product granularity:
– whole product
– modules
Extent of impact:
– seen immediately
– seen over several phases or the whole lifecycle
– seen on subsequent projects

Slide 17: Impact on selection of method
Formal experiments are more viable for impacts with small scope
– easier to impose the necessary control
– easier to provide replication
Case studies are appropriate for larger scope
For impacts affecting later projects (e.g. the effect of reusability), consider surveys

Slide 18: Maturity of item
If currently in widespread use: surveys are possible
If a new method/tool: case study or formal experiment

Slide 19: Learning time
– time to understand the principles
– time to become proficient
A long learning curve reduces the feasibility of a formal experiment

Slide 20: Feasibility of selection
Other non-technical factors affect method selection:
– Timescales for the evaluation
– Level of confidence required in the result
– Cost of the evaluation

Slide 21: Timescales for evaluation
Long (3 months plus):
– Case study (quantitative or qualitative)
Medium (several months):
– Feature analysis - survey
Short (several weeks):
– Experiments (quantitative or qualitative)
– Benchmarking
– Feature analysis - screening mode
Very short (a few days):
– Quantitative survey
– Qualitative effects analysis

Slide 22: Risk of a “wrong” result
Very high:
– Qualitative effects analysis
– Feature analysis - screening mode
High:
– Quantitative case study (“sister project”)
– Feature analysis case study
Medium:
– Quantitative case study (“organisation baseline”)
– Feature analysis survey

Slide 23: Risk of a wrong result (continued)
Low:
– Quantitative case study (“within-project baseline”)
– Formal feature analysis experiment
– Quantitative survey
Very low:
– Formal quantitative experiment

Slide 24: Cost of an evaluation
High:
– Formal experiment
Medium:
– Case study
– Feature analysis survey or screening mode
– Benchmarking
Low (assuming infrastructure exists):
– Quantitative survey
– Qualitative effects analysis
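
The last three slides (timescale, risk of a wrong result, cost) can be read together as a rough screening table. The sketch below encodes a subset of the methods with ordinal levels (1 = shortest/lowest) taken from those slides and filters them against stated limits; the numeric encoding and the choice of subset are assumptions made for illustration, not part of DESMET.

    # Illustrative only: filtering evaluation methods against timescale, risk,
    # and cost limits. Ordinal levels (1 = shortest / lowest) are read off the
    # three preceding slides; the encoding itself is an assumption.
    methods = {
        "Formal quantitative experiment":                    {"time": 2, "risk": 1, "cost": 3},
        "Quantitative case study (within-project baseline)": {"time": 4, "risk": 2, "cost": 2},
        "Feature analysis - screening mode":                 {"time": 2, "risk": 5, "cost": 2},
        "Feature analysis - survey":                         {"time": 3, "risk": 3, "cost": 2},
        "Quantitative survey":                               {"time": 1, "risk": 2, "cost": 1},
        "Qualitative effects analysis":                      {"time": 1, "risk": 5, "cost": 1},
    }

    def candidates(max_time, max_risk, max_cost):
        """Methods whose timescale, risk, and cost all fall within the given limits."""
        return [name for name, c in methods.items()
                if c["time"] <= max_time and c["risk"] <= max_risk and c["cost"] <= max_cost]

    # Example: quick and cheap, with a medium level of confidence acceptable
    print(candidates(max_time=2, max_risk=3, max_cost=2))   # -> ['Quantitative survey']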

Slide 25: Summary
There is no best evaluation method
An appropriate evaluation method is context dependent
An “appropriate” technical choice can be infeasible if it:
– takes too long
– costs too much
– doesn’t provide sufficiently trustworthy results