1 f02laitenberger7 An Internally Replicated Quasi-Experimental Comparison of Checklist and Perspective-Based Reading of Code Documents, Laitenberger et al.


1 f02laitenberger7 An Internally Replicated Quasi-Experimental Comparison of Checklist and Perspective-Based Reading of Code Documents, Laitenberger et al., IEEE TOSE May 2001

2 f02laitenberger7 Exp Design Guidelines - TTYP
D1: Identify the population from which the subjects and objects are drawn.
D2: Define the process by which the subjects and objects were selected.
D3: Define the process by which subjects and objects are assigned to treatments.

3 f02laitenberger7 Experimental Design

4 f02laitenberger7 Conduct and collect – DC5 and 6?
DC5: For observational studies and experiments, record data about subjects who drop out from the studies.
DC6: For observational studies and experiments, record data about other performance measures that may be affected by the treatment, even if they are not the main focus of the study.

5 f02laitenberger7 Analysis
A1: Specify any procedures used to control for multiple testing.
A2: Consider using blind analysis.
A3: Perform sensitivity analyses.
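Guideline A1 can be made concrete: when several hypotheses are tested on the same data (as in the PSP study's reliability and estimation-accuracy hypotheses), the raw p-values need adjusting. A minimal sketch of the Holm step-down adjustment, one common choice; the p-values below are invented for illustration, and any statistics package offers an equivalent routine:

```python
# Holm step-down adjustment for multiple testing (guideline A1).
# Illustrative sketch; the input p-values are invented, not from the papers.

def holm_adjust(pvalues):
    """Return Holm-adjusted p-values in the original order."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])  # ascending p-values
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        adj = min(1.0, (m - rank) * pvalues[i])
        running_max = max(running_max, adj)  # enforce monotonicity
        adjusted[i] = running_max
    return adjusted

raw = [0.01, 0.04, 0.03, 0.20]
print(holm_adjust(raw))
```

A study reporting four tests at the nominal 0.05 level would then compare these adjusted values, not the raw ones, against 0.05.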

6 f02laitenberger7 Analysis
A4: Ensure that the data do not violate the assumptions of the tests used on them.
A5: Apply appropriate quality control procedures to verify your results.
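Guideline A4 in practice: before running a pooled-variance t-test, check the equal-variance assumption. A hedged sketch using the common 4:1 variance-ratio rule of thumb (one heuristic among several; the defect scores below are invented, not data from either paper):

```python
import statistics

def variances_comparable(group_a, group_b, max_ratio=4.0):
    """Rule-of-thumb check for the equal-variance assumption:
    flag trouble when the larger sample variance exceeds the
    smaller by more than max_ratio."""
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    hi, lo = max(va, vb), min(va, vb)
    return hi / lo <= max_ratio

# Invented illustrative scores for two reading techniques.
checklist = [4.1, 3.8, 4.5, 4.0, 3.9]
pbr = [5.0, 5.4, 4.8, 5.2, 5.1]
if variances_comparable(checklist, pbr):
    print("equal-variance t-test is defensible")
else:
    print("consider Welch's t-test or a nonparametric test")
```

When the check fails, switching to Welch's t-test or a rank-based test is the usual remedy rather than forcing the pooled test.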

7 f02laitenberger7 Presentation
P1: Describe or cite a reference for all statistical procedures used.
P2: Report the statistical package used.
P3: Present quantitative results as well as significance levels. Quantitative results should show the magnitude of effects and the confidence limits.
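Guideline P3 asks for magnitudes and confidence limits, not just significance levels. A self-contained sketch reporting a mean difference with an approximate 95% interval (normal approximation, so only rough for samples this small; the defect counts are invented for illustration):

```python
import math
import statistics

def mean_diff_ci(a, b, z=1.96):
    """Mean difference between two samples with an approximate
    95% confidence interval (normal approximation)."""
    diff = statistics.mean(a) - statistics.mean(b)
    se = math.sqrt(statistics.variance(a) / len(a)
                   + statistics.variance(b) / len(b))
    return diff, (diff - z * se, diff + z * se)

# Invented illustrative defect counts per inspection.
defects_a = [12, 15, 11, 14, 13]
defects_b = [9, 8, 10, 9, 11]
diff, (lo, hi) = mean_diff_ci(defects_a, defects_b)
print(f"difference = {diff:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

Reporting "a difference of 3.6 defects, 95% CI roughly (1.9, 5.3)" tells the reader far more than "p < 0.05" alone.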

8 f02laitenberger7 Presentation
P4: Present the raw data whenever possible. Otherwise, confirm that they are available for confidential review by the reviewers and independent auditors.
P5: Provide appropriate descriptive statistics.
P6: Make appropriate use of graphics.

9 f02laitenberger7 Interpretation
I1: Define the population to which inferential statistics and predictive models apply.
I2: Differentiate between statistical significance and practical importance.
I3: Define the type of study.
I4: Specify any limitations of the study.

10 f02laitenberger7 Results?

11 f02laitenberger7 Detection Effectiveness

12 f02laitenberger7 Cost per defect

13 f02laitenberger7 An Experiment Measuring the Effects of Personal Software Process (PSP) Training, Prechelt and Unger, IEEE TOSE May 2000

14 f02laitenberger7 What did you learn? Two issues:
- PSP
- Experimentation

15 f02laitenberger7 What are the goals of PSP?

16 f02laitenberger7 PSP success
These data show, for instance, that estimation accuracy increases considerably, the number of defects introduced per 1,000 lines of code (KLOC) decreases by a factor of two, the number of defects per KLOC found late in development (i.e., in test) decreases by a factor of three or more, and productivity is not reduced.

17 f02laitenberger7 What is the problem/issue?
C2: If a specific hypothesis is being tested, state it clearly prior to performing the study and discuss the theory from which it is derived, so that its implications are apparent.
The experiment investigated the following hypotheses (plus a few less important ones not discussed here, see [11]):
- Reliability. PSP-trained programmers produce a more reliable program for the phoneword task than non-PSP-trained programmers.
- Estimation Accuracy. PSP-trained programmers estimate the time they need for solving the phoneword task more accurately than non-PSP-trained programmers.

18 f02laitenberger7 Exp Context Guidelines
C1: Be sure to specify as much of the industrial context as possible. In particular, clearly define the entities, attributes, and measures that are capturing the contextual information.

19 f02laitenberger7 Exp Context Guidelines
C3: If the research is exploratory, state clearly and, prior to data analysis, what questions the investigation is intended to address and how it will address them.
C4: Describe research that is similar to, or has a bearing on, the current research and how current work relates to it.

20 f02laitenberger7 Exp Design Guidelines
D1: Identify the population from which the subjects and objects are drawn.
D2: Define the process by which the subjects and objects were selected.
D3: Define the process by which subjects and objects are assigned to treatments.

21 f02laitenberger7 Exp Design Guidelines
D4: Restrict yourself to simple study designs or, at least, to designs that are fully analyzed in the statistical literature. If you are not using a well-documented design and analysis method, you should consult a statistician to see whether yours is the most effective design for what you want to accomplish.

22 f02laitenberger7 Exp Design Guidelines
D7: Use appropriate levels of blinding.
D8: If you cannot avoid evaluating your own work, then make explicit any vested interests (including your sources of support) and report what you have done to minimize bias.
D9: Avoid the use of controls unless you are sure the control situation can be unambiguously defined.

23 f02laitenberger7 Exp Design Guidelines
D10: Fully define all treatments (interventions).
D11: Justify the choice of outcome measures in terms of their relevance to the objectives of the empirical study.

24 f02laitenberger7 Conduct and collect
DC1: Define all software measures fully, including the entity, attribute, unit, and counting rules.
DC2: For subjective measures, present a measure of interrater agreement, such as the kappa statistic or the intraclass correlation coefficient for continuous measures.
DC3: Describe any quality control method used to ensure completeness and accuracy of data collection.
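Guideline DC2's kappa statistic can be computed directly. A minimal sketch of Cohen's kappa for two raters classifying the same items (the defect/ok labels below are toy data invented for illustration; libraries such as scikit-learn provide an equivalent function):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: observed agreement between two raters,
    corrected for the agreement expected by chance."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    # Chance agreement from each rater's marginal label frequencies.
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented illustrative labels from two inspectors.
r1 = ["defect", "defect", "ok", "ok", "defect", "ok"]
r2 = ["defect", "ok", "ok", "ok", "defect", "ok"]
print(round(cohens_kappa(r1, r2), 3))
```

Values near 1 indicate strong agreement beyond chance; values near 0 mean the raters agree no more often than chance would predict.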

25 f02laitenberger7 Conduct and collect
DC4: For surveys, monitor and report the response rate and discuss the representativeness of the responses and the impact of nonresponse.
DC5: For observational studies and experiments, record data about subjects who drop out from the studies.
DC6: For observational studies and experiments, record data about other performance measures that may be affected by the treatment, even if they are not the main focus of the study.

26 f02laitenberger7 Analysis
A1: Specify any procedures used to control for multiple testing.
A2: Consider using blind analysis.
A3: Perform sensitivity analyses.

27 f02laitenberger7 Analysis
A4: Ensure that the data do not violate the assumptions of the tests used on them.
A5: Apply appropriate quality control procedures to verify your results.

28 f02laitenberger7 Presentation
P1: Describe or cite a reference for all statistical procedures used.
P2: Report the statistical package used.
P3: Present quantitative results as well as significance levels. Quantitative results should show the magnitude of effects and the confidence limits.
P4: Present the raw data whenever possible. Otherwise, confirm that they are available for confidential review by the reviewers and independent auditors.
P5: Provide appropriate descriptive statistics.
P6: Make appropriate use of graphics.

29 f02laitenberger7 Interpretation
I1: Define the population to which inferential statistics and predictive models apply.
I2: Differentiate between statistical significance and practical importance.
I3: Define the type of study.
I4: Specify any limitations of the study.
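Guideline I2 is where a standardized effect size helps: with enough subjects, a practically trivial difference can still be statistically significant. A sketch of Cohen's d, one conventional magnitude measure (the effort figures below are invented for illustration, not data from the papers):

```python
import math
import statistics

def cohens_d(a, b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a)
                  + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(pooled_var)

# Invented illustrative effort data (hours).
trained = [9.5, 10.5, 9.8, 10.2]
untrained = [9.6, 10.6, 9.9, 10.3]
d = cohens_d(trained, untrained)
print(f"d = {d:.2f}")  # |d| well under 0.5 here: a small effect
```

By the usual (and debatable) convention, |d| around 0.2 is small, 0.5 medium, and 0.8 large, so a significant test on a |d| of 0.2 may still lack practical importance.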

30 f02laitenberger7 Results?

31 f02laitenberger7 Output reliability

32 f02laitenberger7 Surprising data (no digit)

33 f02laitenberger7 Effort estimation error

34 f02laitenberger7 Estimated/actual productivity

35 f02laitenberger7 Productivity

36 f02laitenberger7 Productivity

37 f02laitenberger7 Effort

38 f02laitenberger7 For Thurs 9/19: Smith, Hale and Parrish, “An Empirical Study Using Task Assessment to Improve the Accuracy of Software Effort Estimation”, IEEE TOSE Mar 2001