1
SLOC and Size Reporting
Pongtip Aroonvatanaporn [Material by Ray Madachy] CSCI577b Spring 2011 February 4, 2011
2
Outline
Size Reporting Process
SLOC Counting Rules
Reused and Modified Software
COCOMO Model
The Unified Code Count Tool
Conclusion
3
Goal of Presentation
Understanding the size data required at IOC, and why:
Important historical data on the 577 process (process performance)
Can be used for COCOMO calibration, including a specially calibrated COCOMO for 577; the current COCOMO calibration utilizes 200+ projects
Helps identify COCOMO deficiencies and additional needs
4
Size Reporting Process
Determine what you produced and quantify it: code developed new, reused, and modified.
Apply a code counter to the system modules.
Apply reuse parameters to all reused and modified code to get the equivalent size.
5
Size Reporting Process
Identify any software not counted and provide as much background as possible, so someone can follow up and fill in the gaps. It is acceptable to use function point counts as a last resort.
Counting COTS code is a problem: provide COCOTS inputs if doing COTS-intensive development, where COTS development contributes the majority of the effort but its size cannot be counted.
6
Size Reporting Process
Finalizing the report:
Add up all equivalent lines of code; this is the same top-level size measure that COCOMO uses.
Count by modules. The modules should be consistent with your COCOMO estimation; otherwise it is nearly impossible to compare actuals with estimates.
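A minimal sketch of the aggregation step, assuming the per-module equivalent sizes have already been computed (module names and numbers are illustrative):

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Illustrative only: sum per-module equivalent SLOC into the single
    // top-level size figure that COCOMO uses, keeping the module breakdown
    // so actuals can be compared against the per-module COCOMO estimates.
    public class SizeReport {
        public static void main(String[] args) {
            Map<String, Double> equivalentSlocByModule = new LinkedHashMap<>();
            equivalentSlocByModule.put("UI", 3200.0);          // new code
            equivalentSlocByModule.put("Services", 2556.0);    // adapted code, already converted
            equivalentSlocByModule.put("Persistence", 1100.0); // new code

            double total = equivalentSlocByModule.values().stream()
                    .mapToDouble(Double::doubleValue).sum();
            equivalentSlocByModule.forEach((m, s) -> System.out.printf("%-12s %8.0f%n", m, s));
            System.out.printf("%-12s %8.0f%n", "TOTAL", total);
        }
    }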
7
Lines of Code
Source lines of code (SLOC) are logical source statements, NOT physical statements:
Data declarations: non-executable statements that affect an assembler's or compiler's interpretation
Executable statements: statements that cause runtime actions
8
Lines of Code
Example 1 (8 physical lines, LOC = 1):
    String[] command =
    {
        "cmd.exe",
        "/C",
        "-arg1",
        "-arg2",
        "-arg3"
    };
Example 2 (2 physical lines, LOC = 5):
    int arg1=0; int arg2=4; String ans;
    ans = "Answer is"; System.out.println(ans +arg1+arg2);
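Why the examples count as 1 and 5: a logical-statement counter counts statement terminators rather than physical lines. The naive sketch below is an assumption for illustration only: it simply counts ';' characters and ignores comments, string literals containing semicolons, for-loop headers, and other cases that real counters such as UCC handle, yet it reproduces the counts above.

    // Naive illustration only: approximate logical SLOC by counting ';'
    // terminators instead of physical lines.
    public class NaiveLogicalSlocCounter {
        static long countLogicalSloc(String source) {
            return source.chars().filter(c -> c == ';').count();
        }

        public static void main(String[] args) {
            String example1 = "String[] command =\n{\n  \"cmd.exe\",\n  \"/C\",\n  \"-arg1\",\n  \"-arg2\",\n  \"-arg3\"\n};";
            String example2 = "int arg1=0; int arg2=4; String ans;\n"
                            + "ans = \"Answer is\"; System.out.println(ans +arg1+arg2);";
            System.out.println("Example 1: " + countLogicalSloc(example1)); // prints 1
            System.out.println("Example 2: " + countLogicalSloc(example2)); // prints 5
        }
    }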
9
SLOC Counting Rules
A standard definition for counting lines, based on the SEI definition and modified for COCOMO.
When a line or statement contains more than one type, classify it as the type with the highest precedence.
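A small illustration of the precedence rule, assuming the usual SEI/COCOMO ordering in which executable statements take precedence over declarations (see the counting-rules tables for the exact ordering):

    // Illustration only: a statement that mixes types is counted once,
    // as the highest-precedence type.
    public class PrecedenceExample {
        public static void main(String[] args) {
            int total = Integer.parseInt("42") + 1; // declaration + executable call: one SLOC, classified as executable
            System.out.println(total);              // executable: one SLOC
        }
    }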
10
SLOC Counting Rules (counting rules table)
11
SLOC Counting Rules (counting rules table, continued)
12
Reused and Modified Software
Also categorized as "adapted software."
Problem: the effort for adapted software is not the same as for new software. How do we compare the effort for reused and modified software with the effort for new software?
Counting approach: convert adapted software into an equivalent size of new software.
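As an illustration with made-up numbers: if 10,000 SLOC of adapted code receives an adaptation adjustment multiplier (AAM) of 0.24 under the model on the following slides, it is reported as 10,000 × 0.24 = 2,400 equivalent new SLOC.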
13
Reuse Size-Cost Model
The cost curve does not cross the origin, due to the cost of assessing, selecting, and assimilating reusable components (~5%).
It is non-linear because small modifications generate disproportionately large costs: the cost of understanding the software and the relative cost of interface checking.
14
COCOMO Reuse Model
A non-linear estimation model that converts adapted software into an equivalent size of new software, using:
Percent Design Modified (DM)
Percent Code Modified (CM)
Percent of effort for integration and test of modified software (IM)
Assessment and Assimilation effort (AA)
Software Understanding (SU)
Unfamiliarity (UNFM)
These combine into the Adaptation Adjustment Factor (AAF) and the Adaptation Adjustment Multiplier (AAM), which yield the equivalent SLOC.
15
Reuse Model Parameters
DM – Percent Design Modified: the percentage of the adapted software's design that is modified to fit it to the new objectives.
CM – Percent Code Modified: the percentage of the "reused" code that is modified to fit it to the new objectives.
IM – Percent of effort for integration and test of modified software, relative to new software of comparable size:
    IM = 100 * I&T Effort (modified software) / I&T Effort (new software)
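A quick worked example with assumed numbers: if integrating and testing the modified component takes 20 staff-hours, while integrating and testing new software of comparable size would take 40 staff-hours, then IM = 100 * 20 / 40 = 50.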
16
Reuse Model Parameters (AA)
Assessment & Assimilation (AA): the effort needed to determine whether fully-reused software is appropriate, and to integrate its description into the overall product description.
17
Reuse Model Parameters (SU)
Software Understanding (SU) effort. When the code isn't modified (DM = 0, CM = 0), SU = 0. Otherwise, take the subjective average of the three rating categories (structure, application clarity, and self-descriptiveness).
18
Reuse Model Parameters (UNFM)
Unfamiliarity (UNFM): the effect of the programmer's unfamiliarity with the software, rated from completely familiar to completely unfamiliar.
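The parameters above combine into the equivalent-size calculation. A minimal sketch, assuming the standard COCOMO II reuse formulation (AAF = 0.4·DM + 0.3·CM + 0.3·IM, with the AAM expression switching form at AAF = 50); consult the COCOMO model definition for the authoritative equations and parameter ranges:

    // Sketch of the COCOMO II reuse model under the assumptions stated above.
    // Assumed parameter ranges: DM, CM, IM in percent; AA in percent (0-8);
    // SU in percent (10-50); UNFM on a 0.0-1.0 scale.
    public class EquivalentSize {

        static double equivalentSloc(double adaptedSloc, double dm, double cm, double im,
                                     double aa, double su, double unfm) {
            double aaf = 0.4 * dm + 0.3 * cm + 0.3 * im;     // Adaptation Adjustment Factor
            double aam;                                      // Adaptation Adjustment Multiplier
            if (aaf <= 50) {
                aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100.0;
            } else {
                aam = (aa + aaf + su * unfm) / 100.0;
            }
            return adaptedSloc * aam;                        // equivalent new SLOC
        }

        public static void main(String[] args) {
            // Illustrative numbers only: 10,000 adapted SLOC, DM = 10, CM = 20,
            // IM = 30, AA = 2, SU = 30, UNFM = 0.4.
            double esloc = equivalentSloc(10_000, 10, 20, 30, 2, 30, 0.4);
            System.out.printf("Equivalent SLOC: %.0f%n", esloc);  // about 2,556
        }
    }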
19
Improved Reuse Model
A unified model for both reuse and maintenance, with a new calibration performed by Dr. Vu Nguyen. SLOC modified and deleted are treated as equivalent to SLOC added.
20
Reuse Parameter Guidelines
21
Data Collection
22
Data Collection
Refer to the COCOMO model definition for details on the various parameters (DM, CM, IM, etc.).
Indicate the counting method you used: manual or automated?
Available code counters:
The CSSE code counter: UCC
A code counter developed as part of the CSC 665 Advanced Software Engineering project
Third-party counters, provided the counting rules are consistent
23
The Unified Code Count Tool
Developed at USC-CSSE, based on the counting-rule standards established by the SEI.
Has evolved to count all major languages, including web platforms.
Can be used to determine modified code (changed and deleted); use this data to find the equivalent "new" code.
24
Conclusion
Software sizing and reporting is more than just simple line counting: actual effort is derived from equivalent sizing, and only logical source code contributes to effort.
Accurate reporting is essential for research purposes, for process performance evaluation and calibration, and for future planning and productivity predictions.
Give background on software pieces that were not counted.