1 Research and Technology: Object Oriented Defect Detection
Frank Houdek, DaimlerChrysler AG
Forrest Shull, Fraunhofer Center - Maryland

2 Agenda
- Overview, Goals of the Session
- Murray Wood: Challenges in Object-Oriented Code Inspection
- ISERN Survey Outcome
- Classification Schemes
  - Thilo Schwinn: Quality Gate Driven Definition of Classification Schemes
- Reading Techniques and Strategies
  - Forrest Shull: A Set of OO Design Reading Techniques
  - Stefan Biffl: Comparison of Checklists to Scenario-Based Reading
- Partitioning of Artifacts
  - Andreas Birk: PBR Applied to OO Designs
- Discussion and Planning of Future Steps

3 OO Inspections: A Multi-Faceted Selection Problem
Input:
  - Artifact type
  - Properties (size, language, standards used)
  - Domain
  - Inspectors
Inspection Goals:
  - Effectiveness (% of defects found)
  - Efficiency (# defects / time) (both measures are sketched below)
  - Focussed follow-up activities
  - Learning / Training
Inspection Method:
  - Classification scheme
  - Reading technique and strategy
  - Partitioning of artifacts
  - Auxiliary material
  - Meeting & documentation techniques
Outcome:
  - Effectiveness
  - Effort
  - Efficiency
  - Focussed follow-up
  - Learning / Training
  - Satisfaction
Selection: choosing the inspection method that best maps the given input and goals onto the desired outcome.
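The two quantitative goals above are simple ratios over an inspection's findings and effort. As a minimal illustration (not part of the original slides; the function names and numbers are hypothetical), they could be computed like this:

```python
# Minimal sketch: computing the two quantitative inspection goals named above.
# All names and numbers are illustrative, not from the original presentation.

def effectiveness(defects_found: int, defects_present: int) -> float:
    """Effectiveness: percentage of the defects present that were found."""
    return 100.0 * defects_found / defects_present

def efficiency(defects_found: int, effort_hours: float) -> float:
    """Efficiency: number of defects found per unit of inspection effort."""
    return defects_found / effort_hours

# Example: an inspector finds 12 of 20 seeded defects in 3 person-hours.
print(f"effectiveness = {effectiveness(12, 20):.0f}%")            # 60%
print(f"efficiency    = {efficiency(12, 3.0):.1f} defects/hour")  # 4.0
```

Note that effectiveness requires knowing (or estimating) the total number of defects present, which is why controlled experiments often use documents with seeded defects.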

4 Open Issues
- Empirical results for many combinations of Input / Goal / Method
- Consistent scheme for reporting results (-> framework)
- Evolution and tailoring of existing methods
- Emphasis in this track:
  - Classification schemes
  - Reading techniques and strategies
  - Partitioning of artifacts

5 Potential Outcome of the Session
- Building a common framework for OO-DD experiments (e.g. by using the proposed multi-faceted selection problem framework)
- Building a repository of knowledge, e.g. a common repository of OO-DD experiment descriptions (using a single common form?)
- Post-mortem analysis of experiments already performed
- A first version of a selection mechanism (and, in particular, identification of the most important influence factors) for:
  - Classification schemes
  - Reading techniques and strategies
  - Partitioning of artifacts

6 Discussion Question 0: The Right Taxonomy?
- Is the proposed framework for inspection methods (slide 3) a useful one for organizing research?
  - Is some aspect missing?
  - Is it a logical way to organize the work?

7 Discussion Question 1: Building a Body of Knowledge?
- Are there common trends in the results reported here? E.g.:
  - OO design inspections seem feasible...
  - OO design inspections seem effective...
  - OO reading is an effective way to perform inspections...
  - OO reading is an effective way to perform inspections for certain users...
- What re-analysis can be done to further support these hypotheses?
  - Who is going to do it? We need names!
- Output: Can we get a joint paper that summarizes the results of our independent studies and draws some conclusions?

8 Discussion Question 2: Component Pieces of a Process?
- Can we aggregate the results into an approach to doing OO inspections?
  - E.g. we have discussed ways to do defect classification, reading, partitioning of documents, etc.
  - Can they work together?
  - Do we know when to use the different approaches?

9 Discussion Question 3: An Experience Repository?
- ISERN members often talk about experience/data repositories... (Have any ever really gotten going?)
- Imagine a "lightweight" repository for OO inspections.
  - What would you want to get out of it? What would make it worth the effort?
  - How much effort / what kind of contribution is reasonable to expect from participants?
  - Who will contribute? Who will manage? We need names!

10 Survey: Object Oriented Defect Detection Experience in ISERN (1)
- Sent out to all ISERN members
- Returns: 9 (6 completed questionnaires [2 industry, 4 university] from 5 partners; 4 references or 'no contribution')
- Questions:
  - Artifacts
  - Scope and size of one inspection (partitioning)
  - Mechanism for walking through the artifacts and identifying objects
  - Classification scheme used
  - Findings

11 Survey: Object Oriented Defect Detection Experience in ISERN (2)
Sources: Ericsson, TU Vienna, IESE, DC, Univ. Ulm, UMD

Artifacts:
  - Ericsson: use cases, sequence diagrams, state diagrams
  - TU Vienna: requirements, use cases, high-level design
  - IESE: UML design
  - DC: requirements
  - Univ. Ulm: Octopus OOA
  - UMD: requirements, use cases, UML diagrams

Unit:
  - Ericsson: around classes; TU Vienna: all; IESE: logical entities; DC: whole document; Univ. Ulm: whole document; UMD: pairs of documents

Artifact size (in one inspection):
  - Ericsson: 1-2 classes; TU Vienna: 20-30 pages; IESE: 30 pages; DC: 40-80 pages; Univ. Ulm: 10-20 pages; UMD: 6 classes

Partitioning criteria:
  - none reported (--- or n.a.) for all sources

Naming elements in the meeting:
  - DC: line number; Univ. Ulm: page and line; others: ---

Classification schemes:
  - reported schemes range from severity levels (Non-critical / Important / Critical) to defect-type taxonomies (Omission / Incorrect fact / Inconsistency / Ambiguity / Extra information); other reported elements include Error, Superfluous, Missing, Question, Improvement, Optimization, and vertical/horizontal; one source used no scheme

Difference to structured artifacts:
  - Easier (Ericsson); None or n.a. for the others

Defect classification (experience):
  - Ok; fine enough for analysis of defects; fast, stable, but robust; no problems; reading predominates
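To make the surveyed classification schemes concrete, the following is a small sketch (illustrative only; the class and field names are hypothetical and not from the original slides) of how an inspection finding could be recorded against the omission / incorrect fact / inconsistency / ambiguity / extra information taxonomy listed above:

```python
# Illustrative sketch only: recording inspection findings against one of the
# surveyed classification schemes. All names and sample data are hypothetical.
from dataclasses import dataclass
from enum import Enum

class DefectClass(Enum):
    OMISSION = "omission"
    INCORRECT_FACT = "incorrect fact"
    INCONSISTENCY = "inconsistency"
    AMBIGUITY = "ambiguity"
    EXTRA_INFORMATION = "extra information"

@dataclass
class Finding:
    artifact: str          # e.g. a use case or UML diagram
    location: str          # e.g. page/line, as in the "naming elements" row above
    description: str
    defect_class: DefectClass

findings = [
    Finding("use case UC-3", "page 4, line 12",
            "postcondition is never established", DefectClass.OMISSION),
    Finding("class diagram", "page 9",
            "association multiplicity contradicts UC-3", DefectClass.INCONSISTENCY),
]

# A per-class tally like this is one possible input to focussed follow-up
# activities and to comparing effectiveness across defect classes.
for cls in DefectClass:
    count = sum(1 for f in findings if f.defect_class is cls)
    print(f"{cls.value:18s}: {count}")
```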

