Research Heaven, West Virginia
WVU UI: Quantitative Relations Between Static and Dynamic Software Metrics
FY2003 Initiative: Hany Ammar, Mark Shereshevsky, Walid AbdelMoez, Rajesh Gunnalan, and Ahmad Hassan
LANE Department of Computer Science and Electrical Engineering, West Virginia University
The OSMA Software Assurance Symposium, July 30, 2003, Lakeview, WV
Two main views of software systems:
– the static view of structure and composition,
– the dynamic view during execution.
Static and dynamic properties of software have been widely studied in the literature.
So how good is static analysis?
IV&V testing is mostly focused on static analysis:
– software is rarely executed at Fairmont. Why?
– NASA software supports NASA hardware,
– so some software cannot run without its associated hardware.
Traditional view:
– dynamic execution is more informative than static analysis.
This research:
– To what extent are static measures surrogates for dynamic measures?
– Compare insights gained from static and dynamic measures.
This talk
Propose three hypotheses about possible correlations between selected static and dynamic metrics:
– Hypothesis I: static coupling metrics correlate with error propagation in software architectures.
– Hypothesis II: static error propagation correlates with dynamic error propagation.
– Hypothesis III: "change proneness" correlates with dynamic coupling of components.
Experiments to measure:
– static metrics,
– dynamic metrics,
– for selected case studies.
Statistical analysis of the data:
– computing correlations,
– developing linear or non-linear regression models (a sketch of this analysis step follows below).
In summary:
– Hypothesis I: not quite supported.
– Hypothesis II: supported.
– Hypothesis III: rejected.
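The slides report only the resulting correlation and R² values, not the analysis scripts themselves. As a minimal sketch of the kind of statistical step described above (a Pearson correlation plus a simple linear regression and its R²), assuming Python with NumPy and made-up placeholder values rather than the study's measurements:

```python
# Illustrative sketch only: the slides do not show the actual analysis scripts.
# The paired values below are made-up placeholders, not the study's data.
import numpy as np

static_metric  = np.array([0.10, 0.25, 0.40, 0.55, 0.70, 0.85])
dynamic_metric = np.array([0.08, 0.30, 0.35, 0.60, 0.65, 0.90])

# Pearson correlation coefficient between the two metric vectors
r = np.corrcoef(static_metric, dynamic_metric)[0, 1]

# Simple linear regression y = a*x + b and its R^2
a, b = np.polyfit(static_metric, dynamic_metric, 1)
predicted = a * static_metric + b
ss_res = np.sum((dynamic_metric - predicted) ** 2)
ss_tot = np.sum((dynamic_metric - dynamic_metric.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print(f"correlation r = {r:.4f}, regression R^2 = {r_squared:.4f}")
```

For a simple linear fit of one metric against another, the R² of the regression equals the square of the Pearson correlation, which is why the later slides can report both numbers side by side.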
Project Overview
FY03:
– Conduct empirical study of the relationship between the static and dynamic metrics.
– Measure static metrics.
– Measure related dynamic metrics.
– Conduct statistical analysis of the data.
FY04:
– Establish relationships between static and dynamic metrics.
Tools
Tools for Static Metrics
We can automatically collect several static metrics (OO metrics, complexity metrics, and size/volume metrics).
We are currently using Understand for Java and C++ by Scientific Toolworks for collecting static metrics.
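As a toy illustration of what a static coupling count involves (not of how Understand actually works, and the study's case studies were C++/Java rather than Python), the hypothetical snippet below counts, for each class in a source file, how many other classes it references; this is the kind of "distinct classes referenced" quantity that CBO-style metrics are built on:

```python
# Toy illustration only: NOT the tool or language used in the study.
import ast

SOURCE = """
class Sensor:
    def read(self):
        return 42

class Logger:
    def log(self, msg):
        print(msg)

class Controller:
    def __init__(self):
        self.sensor = Sensor()
        self.logger = Logger()

    def step(self):
        self.logger.log(self.sensor.read())
"""

tree = ast.parse(SOURCE)
class_names = {node.name for node in ast.walk(tree) if isinstance(node, ast.ClassDef)}

for cls in ast.walk(tree):
    if not isinstance(cls, ast.ClassDef):
        continue
    # Count distinct *other* classes whose names appear inside this class body
    referenced = {
        node.id
        for node in ast.walk(cls)
        if isinstance(node, ast.Name) and node.id in class_names and node.id != cls.name
    }
    print(f"{cls.name}: coupled to {len(referenced)} other class(es) -> {sorted(referenced)}")
```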
Tools for Dynamic Metrics
The Rational Rose RealTime simulation facility is used to simulate UML models.
At the code level, Rational PurifyPlus produces dynamic sequence diagrams at run time.
The JProbe profiler by Quest Software serves a similar purpose for Java code but does not produce sequence diagrams.
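These are commercial instrumentation tools. As a rough, language-shifted analogue of the underlying idea (intercept execution and record which object's methods call which other object's methods), here is a small hypothetical Python sketch using the standard sys.setprofile hook:

```python
# Hedged sketch: the study used PurifyPlus / JProbe on C++/Java code; this is only
# a tiny Python analogue of run-time interaction counting, with made-up classes.
import sys
from collections import Counter

interactions = Counter()  # (caller_class, callee_class) -> call count

def tracer(frame, event, arg):
    if event != "call":
        return
    callee = frame.f_locals.get("self")
    caller = frame.f_back.f_locals.get("self") if frame.f_back else None
    if callee is not None and caller is not None and caller is not callee:
        interactions[(type(caller).__name__, type(callee).__name__)] += 1

class Sensor:
    def read(self):
        return 42

class Controller:
    def __init__(self, sensor):
        self.sensor = sensor

    def step(self):
        return self.sensor.read()

sys.setprofile(tracer)
Controller(Sensor()).step()
sys.setprofile(None)

for (caller, callee), count in interactions.items():
    print(f"{caller} -> {callee}: {count} call(s)")
```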
Hypothesis I: Static coupling metrics correlate with error propagation in software architectures.
Study of Static Coupling Metrics and Error Propagation
Correlate the experimental data on error propagation with three static coupling measures (connector based):
– static information-flow coupling,
– the NAS (number of associations) coupling measure,
– CBO (coupling between objects).
Table highlighting non-zero values: information-flow coupling vs. error propagation (EP).
Table highlighting non-zero values: CBO vs. error propagation (EP).
Correlation Values
– Correlation between information-flow coupling and error propagation: 0.545657.
– Correlation between CBO and error propagation: 0.130478.
– Correlation between NAS and error propagation: 0.0.
The correlations are computed only over the values where the number of fault injections is greater than 25 (a sketch of this filtering step follows below).
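As a minimal sketch of this filtering-plus-correlation step, assuming Python with NumPy and made-up placeholder records (the real fault-injection data are not reproduced in the slides):

```python
# Illustrative sketch only: the slide reports resulting correlations, not the script;
# the records below are made-up placeholders, not the study's measurements.
import numpy as np

# Each record: (static coupling value, measured error propagation, #fault injections)
records = [
    (0.12, 0.05,  40),
    (0.30, 0.22,  10),   # fewer than 25 injections -> excluded
    (0.45, 0.31,  60),
    (0.58, 0.29, 120),
    (0.70, 0.55,  80),
]

kept = [(c, ep) for c, ep, n in records if n > 25]
coupling, error_prop = map(np.array, zip(*kept))

r = np.corrcoef(coupling, error_prop)[0, 1]
print(f"kept {len(kept)} of {len(records)} points, correlation r = {r:.4f}")
```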
Discussion of Results
The information-flow coupling metric shows a higher correlation with dynamic error propagation than CBO does.
On further analysis, the R² between information flow and EP was 0.297742 (the square of the 0.545657 correlation reported above).
This low value could be partly due to the small size of our sample (9 data points).
This result is not sufficient to validate Hypothesis I.
Hypothesis II: Static error propagation correlates with dynamic error propagation.
Static and Dynamic Error Propagation
We use the experimental data on error propagation and the static error propagation measure (connector based).
The static error propagation measure is based on the relation developed in the USIP software architecture metrics, using the information flow between components.
Experimental (dynamic) error propagation matrix of the HCS case study.
Static error propagation matrix of the HCS case study.
Correlation Results
There is a strong correlation between static error propagation and dynamic error propagation:
– correlation = 0.875098,
– R² = 0.765797 (the square of the correlation above).
This supports Hypothesis II and our conjecture that some static metrics correlate with related dynamic metrics. A sketch of how two error propagation matrices can be compared in this way follows below.
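The two HCS matrices appear as images in the slides and are not reproduced here. The hypothetical sketch below only illustrates how the entries of a static and a dynamic error propagation matrix can be compared with a single correlation, using made-up 3x3 matrices:

```python
# Illustrative sketch only: the actual HCS matrices are not reproduced in this
# transcript; the 3x3 matrices below are placeholders.
import numpy as np

# Hypothetical error propagation matrices: entry [i, j] is the estimated (static)
# or measured (dynamic) probability that an error in component i propagates to j.
static_ep = np.array([
    [0.0, 0.60, 0.10],
    [0.50, 0.0, 0.30],
    [0.20, 0.40, 0.0],
])
dynamic_ep = np.array([
    [0.0, 0.55, 0.15],
    [0.45, 0.0, 0.35],
    [0.25, 0.30, 0.0],
])

# In this sketch the diagonal is excluded and the off-diagonal entries are
# compared pairwise across the two matrices.
mask = ~np.eye(static_ep.shape[0], dtype=bool)
r = np.corrcoef(static_ep[mask], dynamic_ep[mask])[0, 1]
print(f"correlation r = {r:.4f}, R^2 = {r**2:.4f}")
```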
Hypothesis III: "Change proneness" correlates with dynamic coupling of components.
Change Proneness and Change Propagation
Hypothesis III ("change proneness correlates with dynamic coupling of components") is based on the study by Erik Arisholm, "Dynamic Coupling Measures for Object-Oriented Software," Eighth IEEE Symposium on Software Metrics, June 2002.
Correlate:
– dynamic import coupling and
– dynamic export coupling metrics
with change proneness, measured using the interface change propagation probabilities of the USIP software architecture metrics.
Dynamic Coupling Metrics
The dynamic coupling metrics we focus on are:
1. dynamic export coupling,
2. dynamic import coupling,
3. total dynamic coupling, the sum of import and export coupling.
A sketch of how such counts can be derived from a run-time interaction trace follows below.
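As a minimal sketch of deriving these counts from a recorded run-time trace (the actual measurements were taken with the profiling tools listed earlier; the component names are assumptions, and the import/export convention follows Arisholm's definitions of sent vs. received messages):

```python
# Illustrative sketch only: the study obtained these counts with run-time tools on
# the SharpTool case study; the trace below is a made-up placeholder.
from collections import Counter

# Hypothetical run-time trace of inter-component messages: (sender, receiver)
trace = [
    ("GUI", "Engine"),
    ("GUI", "Engine"),
    ("Engine", "Storage"),
    ("Storage", "Engine"),
    ("GUI", "Storage"),
]

import_coupling = Counter()   # messages a component sends (it imports services of others)
export_coupling = Counter()   # messages a component receives (it exports services to others)

for sender, receiver in trace:
    if sender != receiver:
        import_coupling[sender] += 1
        export_coupling[receiver] += 1

components = sorted(set(import_coupling) | set(export_coupling))
for c in components:
    imp, exp = import_coupling[c], export_coupling[c]
    print(f"{c}: import={imp}, export={exp}, total={imp + exp}")
```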
Experiment
The experiment was conducted on the SharpTool case study, which has 32 components.
Dynamic coupling was obtained by counting object interactions dynamically at run time.
During the run, all functionalities of the tool were executed.
Experiment Results
Discussion of Results
We found zero correlation between change proneness and dynamic coupling.
We also computed CBO and found zero correlation between change proneness and CBO.
This negates Hypothesis III.
The study by Erik Arisholm was done using actual project change data; in our experiment we used static change proneness data based on the interface change propagation probabilities defined in the USIP architecture metrics.
So how good is static analysis?
– Hypothesis I: static coupling metrics correlate with error propagation in software architectures. Not sure.
– Hypothesis II: static error propagation correlates with dynamic error propagation. Yes.
– Hypothesis III: "change proneness" correlates with dynamic coupling of components. No.
And so, what does all this mean?
– IV&V can avoid running code
– and still offer added value to NASA systems.