Presentation on theme: "ISERN Distributed Experiment: Planned Design" — Presentation transcript:

Slide 1: ISERN Distributed Experiment: Planned Design
Forrest Shull, Fraunhofer Center for Experimental Software Engineering - Maryland

Contents:
1. Session Goals
2. Design Goals
3. Design Levels
4. Known Issues
5. Picking a Specific Object of Study

Slide 2: Goals for this Meeting
- Present a framework for the distributed experiment
  - The broad strokes that are not negotiable!
- Decide whether there is sufficient consensus to go forward with a distributed experiment
- If yes:
  - Gauge the interest in specific hypotheses that can be explored using this framework.
  - Decide on a specific set of hypotheses for study and sign up participants.

Slide 3: Distributed Design Goals
- Collect useful information about inspections for researchers and project managers
  - Each local experiment is interesting
  - "Meta-analysis" across a critical mass of experiments yields another level of information
- Understand and incorporate local differences
- Identify factors influencing effectiveness across
  - Different types of organizations
  - Different types of development environments
- Allow several options for collaboration
  - Allowing organizations to match required effort with anticipated benefits
  - All of which support the larger analysis
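
To make the "meta-analysis" goal concrete, here is a minimal sketch (not part of the original slides) of pooling one effect size per local experiment with inverse-variance (fixed-effect) weighting; the site names and all numbers are hypothetical:

```python
# Hypothetical fixed-effect meta-analysis across local experiments:
# each site reports an effect size (e.g., improvement in defect
# detection rate) and its variance; pool with inverse-variance weights.
from math import sqrt

sites = {              # site name -> (effect_size, variance); invented values
    "site_A": (0.42, 0.031),
    "site_B": (0.18, 0.054),
    "site_C": (0.35, 0.027),
}

weights = {name: 1.0 / var for name, (_, var) in sites.items()}
total_w = sum(weights.values())
pooled = sum(w * sites[name][0] for name, w in weights.items()) / total_w
se = sqrt(1.0 / total_w)

print(f"Pooled effect: {pooled:.2f} "
      f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f})")
```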

Slide 4: Collaboration Options: Level 1
Industry Survey
- Descriptive analysis, no benchmark.
- Estimated effort for organization: one respondent, 4 to 6 staff hours.
- Output: a report characterizing the types of responding organizations and their inspection processes.
- Benefits:
  - Characterization of the state-of-the-practice
  - Compilation of best practices
  - Understanding of problem areas that could be improved
  - [Measure distance between "standards" and local processes, local documents.]

Slide 5: Collaboration Options: Level 1
Industry Survey: Process
- Identify/contact a respondent.
- The respondent answers for his/her development team.
- Responses are aggregated across all respondents and reflected in the final report, along with a lessons-learned analysis.
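
As a purely illustrative sketch (not part of the original slides) of the aggregation step, tallying categorical survey answers across respondents for the final report might look like the following; the question keys and answers are hypothetical:

```python
# Hypothetical tally of categorical survey answers across respondents,
# feeding the report that characterizes responding organizations.
from collections import Counter

responses = [  # one dict per respondent; keys and values are invented
    {"org_type": "telecom", "inspects_requirements": "yes", "uses_checklists": "no"},
    {"org_type": "finance", "inspects_requirements": "yes", "uses_checklists": "yes"},
    {"org_type": "telecom", "inspects_requirements": "no",  "uses_checklists": "no"},
]

for question in ("org_type", "inspects_requirements", "uses_checklists"):
    tally = Counter(r[question] for r in responses)
    print(f"{question}: {dict(tally)}")
```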

Slide 6: Collaboration Options: Level 2
Benchmark an inspection technique
- Pick some inspection variant and study it across different contexts.
- Estimated effort for organization: a contact person and >10 participants for 1 day, 65 to 85 staff hours total.
- Benefits:
  - Provides training to participants in a beneficial inspection approach.
  - Some understanding of the potential benefit due to process change in the local organization.
  - [Some understanding of the expected improvement due to the variant process in different contexts.]

Slide 7: Collaboration Options: Level 2
Benchmarking a technique: Process
- Complete the survey steps
- Produce local documents (version control, seeding, …)
- Training
- Inspection using the new technique
- Analysis: compare inspection results to the organization's historical baseline (qualitative or quantitative)
- Feedback to participants
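
For the quantitative variant of the analysis step, a minimal sketch (not part of the original slides) of comparing defects found per staff hour in the benchmark inspection against the organization's historical baseline; all numbers are invented:

```python
# Hypothetical Level 2 analysis: defects found per staff hour for the
# new-technique inspection vs. the organization's historical baseline.

historical = [  # (defects_found, staff_hours) from past inspections; invented
    (14, 22.0), (9, 18.5), (11, 20.0), (16, 25.0),
]
benchmark_run = (19, 21.0)  # inspection performed with the new technique

baseline_rate = sum(d for d, _ in historical) / sum(h for _, h in historical)
new_rate = benchmark_run[0] / benchmark_run[1]
change = 100 * (new_rate - baseline_rate) / baseline_rate

print(f"Baseline: {baseline_rate:.2f} defects/staff-hour")
print(f"New technique: {new_rate:.2f} defects/staff-hour ({change:+.0f}%)")
```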

Slide 8: Collaboration Options: Level 3
Controlled experiment
- Get very accurate data about the improvement due to some inspection variant
- Estimated effort for organization:
  - 1 contact person, 8-10 hours
  - >10 participants for 1.5 days
- Benefits:
  - Provides training to participants in a beneficial inspection approach.
  - Accurate understanding of the potential benefit due to process change in the local organization.
  - ["Meta-analysis" across all organizations.]

Slide 9: Collaboration Options: Level 3
Controlled experiment: Process
- Complete the survey steps
- Produce 2 local documents
- Inspection of a "local" document using the usual technique
- Training
- Inspection of the "baseline" document using the new technique
- Inspection of a "local" document using the new technique
- Analysis:
  - Compare results on the local documents for the new vs. usual inspection techniques
  - Compare results for the new technique on the local vs. baseline documents
- Feedback to participants
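
A minimal sketch (not part of the original slides) of the first comparison in the analysis step: a paired test of per-inspector defect counts on the local documents, usual technique vs. new technique. The counts are invented, and a real analysis would also check assumptions and report effect sizes:

```python
# Hypothetical paired comparison: defects found per inspector on the
# local documents, usual technique vs. new technique (same inspectors).
from math import sqrt
from statistics import mean, stdev

usual = [5, 7, 4, 6, 8, 5, 6]   # invented per-inspector defect counts
new   = [8, 7, 6, 9, 9, 7, 8]   # same inspectors, new technique

diffs = [n - u for n, u in zip(new, usual)]
t = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))  # paired t, df = len(diffs) - 1

print(f"Mean improvement: {mean(diffs):.1f} defects/inspector, paired t = {t:.2f}")
```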

Slide 10: Collaboration Options: Shadow Experiments
- For industrial collaborators who want to lower risk before investing in Level 2 or Level 3.
  - Must make a representative, "anonymized" local document available.
  - ISERN will match the industrial partner to a university course that would be willing to perform a first run of the same study.
  - The industrial experiment will only occur if the results from the academic environment are promising.

Slide 11: Known Issues
- Most organizations don't have a common inspection process.
  - Contact points and analysis must be done at the level of development teams.
- Too little emphasis on accurate measures of effectiveness.
  - Some benefit can be gained from qualitative reflections, even at the survey level.
- No "benchmark technique" will be equally effective in all environments.
  - Level 2 and the shadow experiment give an opportunity to test the hypothesis with less commitment.
- Seeding defects in any document is inaccurate.
  - We will emphasize using previously version-controlled documents with a defect history whenever possible, but have to rely on an organization's own assessment of what the important issues are.

Slide 12: Goals for the Technique to Be Benchmarked
- There must be evidence of its likely benefit
- Should be widely applicable (to maximize the potential pool of participants)
- Some version should be teachable in a "reasonable" time
- Should be of genuine interest to the target audience
- Results should be actionable

Slide 13: Some Options for the Technique to Be Benchmarked
- PBR (requirements) / OORTs (UML)
  - Training materials (including webcasts) with a history of reuse are available to assist consistency of training
  - PBR: reusable ICSE tutorial
- Inspection meeting approaches
  - Video/telecon; document circulation; NetMeeting; DOORS-based protocol
  - Alternatives are commonly available and require minimal training

