Defect Localization Using Dynamic Call Tree Mining and Matching
Anis Yousefi, PhD Candidate, Department of Computing and Software, McMaster University, May 28, 2012


Slide 1: Defect Localization Using Dynamic Call Tree Mining and Matching
Anis Yousefi, PhD Candidate
Department of Computing and Software, McMaster University
May 28, 2012

Slide 2: Scope
– Defect detection: a test fails, a customer reports a problem, etc.
– Defect localization: identifying where the bugs are in the source code (manual or tool-supported)
– Defect fixing

Slide 3: Problem statement
Input:
– Failure setting: the scenario and input causing the failure
– Failure description: the distorted features
– A set of passing scenarios and the features they exercise
Output:
– The methods manifesting the failure
– A path for root cause analysis
– The root cause of the problem

Slide 4: Approach
Defect localization using dynamic call tree mining and matching:
– Dynamic analysis at method granularity
– Targets defects that affect the call tree
– Rationale: a failure is associated with a deviation from the correct execution path
– Detect the correct execution paths (call tree mining)
– Detect the deviation points (call tree matching)

Slide 5: Step I, Pre-processing
– Instrument the target system
– Collect execution traces
– Build dynamic call trees
– Reduce the dynamic call trees
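The tree-building part of this step can be sketched as follows. This is a minimal illustration, not the thesis's actual tooling: it assumes the instrumented system emits balanced enter/exit events per method call, and the method names come from the slides' later example.

```python
# Hypothetical sketch: build a dynamic call tree from an enter/exit event
# trace produced by an instrumented execution (trace format is an assumption).

class Node:
    def __init__(self, method):
        self.method = method
        self.children = []

def build_call_tree(events):
    """events: list of ("enter"|"exit", method) pairs from one execution."""
    root = Node("<root>")
    stack = [root]
    for kind, method in events:
        if kind == "enter":
            node = Node(method)
            stack[-1].children.append(node)   # child of the current caller
            stack.append(node)
        else:  # "exit"
            assert stack[-1].method == method, "unbalanced trace"
            stack.pop()
    return root

trace = [("enter", "OnMsg"), ("enter", "Main"),
         ("enter", "Init"), ("exit", "Init"),
         ("enter", "Ack"), ("exit", "Ack"),
         ("exit", "Main"), ("exit", "OnMsg")]
tree = build_call_tree(trace)
```

The explicit call stack mirrors the program's own stack, so nesting in the trace maps directly to parent-child edges in the tree.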

Slide 6: Step II, Mining
– Mine frequent subtrees from the passing test cases
[Figure: dynamic call trees of three passing test cases, with nodes OnMsg, Main, Init, RejOrder, Ack, FillOrder, and DFD]
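A minimal sketch of this mining step: enumerate every rooted subtree of the passing executions' call trees (encoded as nested tuples) and keep those whose support meets a threshold. Real frequent-subtree miners also enumerate partial (embedded or induced) subtrees; this simplification counts only complete ones. The example trees are taken from the slides' test cases 1 and 2.

```python
# Simplified frequent-subtree mining over passing-test call trees.
from collections import Counter

def encode(tree):
    """Canonical nested-tuple encoding of a tree given as (method, [children])."""
    method, children = tree
    return (method, tuple(encode(c) for c in children))

def subtrees(tree):
    """Yield the encoding of the subtree rooted at every node."""
    _, children = tree
    yield encode(tree)
    for c in children:
        yield from subtrees(c)

def frequent_subtrees(trees, minsup):
    support = Counter()
    for t in trees:
        for s in set(subtrees(t)):   # count each pattern once per tree
            support[s] += 1
    return {s for s, n in support.items() if n >= minsup}

# Call trees of two passing test cases (structure assumed from the slides).
t1 = ("OnMsg", [("Main", [("Init", []), ("RejOrder", [])])])
t2 = ("OnMsg", [("Main", [("Init", []), ("Ack", [("FillOrder", [])])])])
patterns = frequent_subtrees([t1, t2], minsup=2)
```

Subtrees shared by both passing runs (such as the Init leaf) survive the support threshold; subtrees unique to one run (such as RejOrder) do not.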

Slide 7: Step III, Pattern analysis
– Rank the patterns according to their relevance to the failing feature (here, "Acknowledge")
[Figure: the call trees of test cases 1-3 with their patterns labeled feature-specific, shared, or irrelevant; nodes include OnMsg, Main, Init, RejOrder, Ack, FillOrder, and DFD]

Slide 8: Ranking based on relevance
Relevance is determined by:
– The relation between the support set of a pattern and that of the target (failing) feature
– Method frequency differences between executions that exercise the target feature and those that do not
Ranked patterns: P10, P11, P12, P5, P6 (feature-specific); P3, P1, P2, P8, P9, P4, P7 (shared and irrelevant)
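The second relevance signal above can be sketched as a per-method frequency comparison. The scoring formula here (average frequency in target-feature runs minus average frequency in the other runs) is an illustrative assumption, not the thesis's exact measure; the traces reuse the slides' method names.

```python
# Sketch of one relevance signal: methods that execute more often in runs
# exercising the failing feature than in other runs suggest feature-specific
# patterns. The exact formula is an assumption for illustration.

def relevance(method, target_runs, other_runs):
    """Average per-run frequency difference for one method."""
    f_target = sum(run.count(method) for run in target_runs) / len(target_runs)
    f_other = sum(run.count(method) for run in other_runs) / len(other_runs)
    return f_target - f_other

# Method sequences of executions that exercise the failing "Acknowledge"
# feature versus one that does not (hypothetical traces).
target = [["OnMsg", "Main", "Init", "Ack", "FillOrder"]]
others = [["OnMsg", "Main", "Init", "RejOrder"]]
```

Under this scoring, Ack gets a positive score (it appears only in the target-feature run) while Init scores zero (it appears equally in both), matching the feature-specific versus shared split on the slide.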

Slide 9: Step IV, Matching
– Input: the feature-specific patterns (P10, P11, P12, P5, P6)
– Match highly relevant patterns against the call tree of the failing execution to identify suspicious call tree changes

Slide 10: Locating matching roots
[Figure]

Slide 11: Approximate matching
[Figure: an example diff between a pattern and the failing call tree, with DiffSize = 1 and ParentDiffSize = 1]
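A hedged sketch of how the two metrics might be computed. The interpretation assumed here, which the slides do not spell out, is that DiffSize counts pattern nodes with no counterpart in the failing execution's call tree, while ParentDiffSize counts only those missing subtrees whose parent did match (i.e. directly observable deviation points). Tree shapes and method names are illustrative.

```python
# Assumed interpretation of the approximate-matching metrics; not the
# thesis's exact algorithm. Trees are (method, [children]) tuples.

def count_nodes(tree):
    return 1 + sum(count_nodes(c) for c in tree[1])

def match(pattern, tree):
    """Return (DiffSize, ParentDiffSize) for pattern matched at tree's root."""
    diff = parent_diff = 0
    _, p_children = pattern
    # Index the failing tree's children by method name (assumes unique names,
    # a simplification for this sketch).
    t_children = {c[0]: c for c in tree[1]}
    for child in p_children:
        if child[0] in t_children:
            d, pd = match(child, t_children[child[0]])
            diff += d
            parent_diff += pd
        else:
            diff += count_nodes(child)   # whole expected subtree is missing
            parent_diff += 1             # but its parent did match
    return diff, parent_diff

pattern = ("Main", [("Init", []), ("Ack", [("FillOrder", [])])])
failing = ("Main", [("Init", [])])       # the Ack branch never executed
```

For this example the Ack subtree (two nodes) is missing, so DiffSize is 2, while only one deviation point hangs off a matched parent, so ParentDiffSize is 1.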

Slide 12: Step V, Providing the defect localization report
– Rank the results (patterns/methods) according to their likelihood of being defective
– Provide a report

Slide 13: Ranking based on likelihood of defectiveness
Matches of the feature-specific patterns in the failing tree (ParentDiffSize/DiffSize):

                               P10 (root = m1)  P11 (root = m1)  P12 (root = m1)  P5 (root = m2)  P6 (root = m2)
  m1's match @ location x      1/1              2/32             1/8              NA              NA
  m1's match @ location y      1/10             2/42             0/0              NA              NA
  m2's match (root not found)  NA               NA               NA               1/1 (RNF)       NA

Ranking results, Approach #1 (by ParentDiffSize/DiffSize): 0/0, 1/1, 1/8, 1/10, 2/32, 2/42
Ranking results, Approach #2: 0/0, 1/10, 2/42, 1/1, 1/8, 2/32, 1/1 (RNF)
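Approach #1's ordering can be reproduced by a simple lexicographic sort on the (ParentDiffSize, DiffSize) pair, an assumption about the exact tie-breaking rule; the match names and values come from the slide's table.

```python
# Illustrative sketch: rank pattern matches by their difference metrics,
# smallest first, sorting on (ParentDiffSize, DiffSize). The exact
# tie-breaking rule is an assumption.

matches = [
    ("P10 @ x", 1, 1), ("P11 @ x", 2, 32), ("P12 @ x", 1, 8),
    ("P10 @ y", 1, 10), ("P11 @ y", 2, 42), ("P12 @ y", 0, 0),
]
ranked = sorted(matches, key=lambda m: (m[1], m[2]))
order = [name for name, _, _ in ranked]
```

This yields the slide's Approach #1 order (0/0, 1/1, 1/8, 1/10, 2/32, 2/42), putting the closest matches, i.e. the most suspicious deviation points, first.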

Slide 14: Defect localization report
0/0 (P12 and m1's match @ location y):
– Wrong branch(es) are executed in m1 or its children
– Suggested locations: P10: method m_i; P11: methods m_j, m_k
1/1 (P10 and m1's match @ location x):
– Missing/additional call(s) in the execution of m1 or its children
– Incorrect branch(es) executed in m1 or its children
– Suggested locations: P10: method m_l; P12: method m_m; P11: methods m_n, m_o
1/1 (RNF) (P5; root m2 not found):
– Missing call to method m2!
– Prospective callers are …

Slide 15: [Figure]

Slide 16: Thank You!
Anis Yousefi, yousea2@mcmaster.ca

