Towards Common Standards for Studies of Software Engineering Tools and Tool Features Timothy C. Lethbridge University of Ottawa.

Premise: It is desirable to guide researchers studying SE tools
Proposal: Create an inventory of practices to guide such studies
Researchers could then create papers that would be:
 More comparable
 More easily reviewable
 More indexable

Types of Evaluation Commonly Found in Tools Papers
 a) None: just a description
 b) Includes rationale
 c) Demonstration of adoption
 d) Anecdotes and lessons learned
 e) Informal studies, including descriptive statistics
 f) Formal experiments involving students
 g) Formal experiments involving practitioners
Case study papers: some combination of b-e
Experimental papers: f and g, but beware of overconfidence in results
Papers of types e, f and g would benefit from following certain consistency patterns to facilitate comparability

Inventory of Measures
The following are purely examples that might be found in such an inventory:
 M1. Time taken to perform a given task
 M2. Amount of a given task completed correctly in a fixed time (the fixed time might depend on the task)
 M3. Errors made in a given task
 M4. Subjective answers on a scale to specific questions (the questions to be listed in the inventory)
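As a purely illustrative sketch (the field names, task and tool identifiers are hypothetical, not part of the talk), measures M1-M4 for one participant performing one task could be recorded in a single structure, which makes the later study types directly comparable:

```python
from dataclasses import dataclass, field

@dataclass
class TaskObservation:
    """One participant performing one task with one tool (hypothetical schema)."""
    participant_id: str
    task_id: str
    tool_id: str
    seconds_taken: float       # M1: time taken to perform the task
    fraction_completed: float  # M2: amount completed correctly in the fixed time
    error_count: int           # M3: errors made in the task
    likert_answers: dict = field(default_factory=dict)  # M4: question id -> rating

obs = TaskObservation("P01", "rename-method", "ToolA",
                      seconds_taken=412.0, fraction_completed=0.8,
                      error_count=2, likert_answers={"Q_ease": 4})
```

Keeping the four measures together per observation is what lets the same data feed any of the study types ST1-ST6 below.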

Inventory of study types
ST1. Usability evaluation of a specific feature or tool implementation
 Helps ensure that results from other study types are not confounded purely by poor usability
Provides evidence for these research questions:
 Q1a. To what extent is the feature or tool usable? Measures: M1, M2 and M3 (compared against a threshold)
 Q1b. What usability defects are present, and which ones should be repaired? (qualitative)
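The threshold comparison in Q1a can be sketched as follows; the threshold values here are invented for illustration and would in practice come from the inventory or from pilot data:

```python
# Hypothetical usability thresholds for ST1 (values are illustrative only).
THRESHOLDS = {"max_seconds": 600, "min_fraction_completed": 0.75, "max_errors": 3}

def passes_usability(seconds, fraction_completed, errors, t=THRESHOLDS):
    """Q1a: does one observation meet all three quantitative thresholds (M1-M3)?"""
    return (seconds <= t["max_seconds"]
            and fraction_completed >= t["min_fraction_completed"]
            and errors <= t["max_errors"])

print(passes_usability(412, 0.8, 2))   # within all thresholds -> True
print(passes_usability(700, 0.9, 1))   # too slow (M1 exceeded) -> False
```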

Study types - continued
ST2. Comparison of a small number of different feature implementations, each providing roughly the same functionality
Provides evidence for these research questions:
 Q2a. What is the best user interface for a certain feature? Measures: M1, M2, M3, M4 (measured separately for each implementation)
 Q2b. What comments do users have about each implementation? (qualitative)
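"Measured separately for each implementation" can be made concrete with a small sketch; the implementation names and timing data are hypothetical, and a real ST2 study would repeat this breakdown for M2, M3 and M4 before drawing conclusions:

```python
from statistics import mean, median

# Hypothetical M1 values (seconds per task) for two UI implementations
# of the same feature, one list entry per participant.
times = {
    "ImplA": [412, 388, 455, 430],
    "ImplB": [365, 340, 398, 372],
}

# Summarize each implementation separately, as Q2a requires.
summary = {impl: {"mean_s": mean(ts), "median_s": median(ts)}
           for impl, ts in times.items()}
for impl, stats in summary.items():
    print(impl, stats)
```

With per-implementation summaries in the same shape, papers using this study type become directly comparable.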

Study types - continued
ST3. Comparison of two alternative feature sets that achieve roughly the same goal, but in different ways
Provides evidence for this research question:
 Q3. What is the 'best' functionality for a certain task? Measures: M1, M2, M3, M4 (measured separately for each feature set)

Study types - continued
ST4. Comparison of the presence and absence of a feature (or of a small feature set) in a tool
Provides evidence for these research questions:
 Q4a. Is the feature worth including in a final tool set? Measures: M1, M2, M3 (measured separately for the tool with and without the feature)
 Q4b. What benefits are provided by the feature? (qualitative)
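A minimal sketch of the ST4 comparison, using invented M2 data (fraction of the task completed correctly) for sessions with and without the feature; a real study would also compare M1 and M3 and apply an appropriate significance test:

```python
from statistics import mean

# Hypothetical M2 scores with and without the feature, one entry per participant.
with_feature    = [0.90, 0.85, 0.95, 0.80]
without_feature = [0.70, 0.65, 0.80, 0.60]

# Q4a: mean improvement in correct completion attributable to the feature.
effect = mean(with_feature) - mean(without_feature)
print(f"mean improvement in completion: {effect:.2f}")
```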

Study types - continued
ST5. Determination of which specific combinations of features are most useful as the context varies
Provides evidence for this research question:
 Q5. Which features should be available in a given tool so that the tool can be used in a variety of contexts? Measures: M1, M2, M3, M4a, M4c (measured as the feature sets and contexts are varied in different combinations)
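Because ST5 varies feature sets and contexts together, the study design is a cross product; the feature-set and context names below are invented placeholders for whatever a concrete study would define:

```python
from itertools import product

# Hypothetical ST5 design: every feature set crossed with every usage context.
feature_sets = ["search-only", "search+visualize", "full"]
contexts = ["maintenance", "code-review"]

# Each cell would later hold the M1-M4 measures collected for that combination.
design = {(fs, ctx): None for fs, ctx in product(feature_sets, contexts)}
print(len(design), "feature-set/context combinations to measure")
```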

Study types - continued
ST6. Comparison of entire tools
 Incorporating sets of features
 Less abstract than ST3
Provides evidence for this research question:
 Q6. Which of several tools is best for a given task? Measures: M1, M2, M3, M4 (measured separately for each tool)