1. Software Sizing
Ali Afzal Malik & Vu Nguyen
University of Southern California Center for Systems and Software Engineering (USC-CSSE)
CSCI 510, 10/16/2009

2. Outline
- Nature, value, and challenges of sizing
- Sizing methods and tools
- Sizing metrics
- Maintenance sizing
- Conclusions and references

3. Nature and Value of Sizing
- Definitions and general characteristics
- Value provided by sizing
- When does sizing add less value?

4. Size: Bigness/Bulk/Magnitude
Liquids are measured in gallons; buildings in usable square feet; software is measured in ???

5. Sizing Characteristics
- Generally considered additive: Size(A ∪ B) = Size(A) + Size(B)
- Often weighted by complexity: some artifacts are "heavier" to put in place (e.g., requirements, function points)

6. Software Size is an Important Metric
A key metric used to determine:
- software project effort and cost: (effort, cost) = f(size, factors)
- time to develop
- staff
- quality
- productivity
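The relationship (effort, cost) = f(size, factors) can be made concrete with a small sketch. This is an illustration, not a tool from the deck: the constant 2.94 and the power-law form echo the published COCOMO II.2000 nominal calibration, the fixed exponent 1.10 stands in for COCOMO's scale-factor computation, and the multiplier values are invented.

```python
# Illustrative sketch of (effort, cost) = f(size, factors).
# A = 2.94 is the COCOMO II.2000 nominal constant; B = 1.10 is a
# simplified stand-in for the model's scale-factor exponent.

def effort_person_months(ksloc, multipliers=(), a=2.94, b=1.10):
    """Estimate effort from size in KSLOC and cost-driver multipliers."""
    eaf = 1.0
    for m in multipliers:   # effort adjustment factor: product of drivers
        eaf *= m
    return a * (ksloc ** b) * eaf

# A 10-KSLOC project with two hypothetical cost drivers (1.10 and 0.95):
print(round(effort_person_months(10, [1.10, 0.95]), 1))
```

Cost then follows by multiplying effort by a labor rate, which is why size errors propagate directly into cost errors.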

7. When Does Sizing Add Less Value?
It is often easier to go directly to estimating effort when:
- size parameters are imprecise (GUI builders, COTS integration)
- the application is familiar and similar in size to previous ones (analogy to previous effort: "yesterday's weather")
- size is a dependent variable (e.g., time-boxing with prioritized features)

8. Sizing Challenges
- Brooks' factors: software system, product
- Cone of uncertainty

9. Brooks' Factor of 9 for a Programming Systems Product
Moving from a program to a programming product (handles off-nominal cases, tested, well-documented) roughly triples the effort (x3); moving from a program to a programming system (handles interoperability, data management, business management) also triples it (x3); a programming systems product combines both, for a factor of about 9.
Telecom example: 1/6 basic call processing; 1/3 off-nominals; 1/2 billing.
(Adapted from Brooks, 1995)

10. The Cost and Size Cone of Uncertainty
If you don't know what you're building, it's hard to estimate its size or cost. (Boehm et al., 2000)

11. Outline
- Nature, value, and challenges of sizing
- Sizing methods and tools
- Sizing metrics
- Maintenance sizing
- Conclusions and references

12. Basic Methods, Strengths, and Weaknesses (Adapted from Boehm, 1981)
- Pair-wise comparison. Strengths: accurate assessment of relative size. Weaknesses: absolute size of a benchmark must be known.
- Expert judgment. Strengths: assessment of representativeness, interactions, exceptional circumstances. Weaknesses: no better than the participants; biases, incomplete recall.
- Analogy. Strengths: based on representative experience. Weaknesses: representativeness of that experience.
- Parkinson. Strengths: correlates with some experience. Weaknesses: reinforces poor practice.
- Price to win. Strengths: often gets the contract. Weaknesses: generally produces large overruns.
- Top-down. Strengths: system-level focus; efficient. Weaknesses: less detailed basis; less stable.
- Bottom-up. Strengths: more detailed basis; more stable; fosters individual commitment. Weaknesses: may overlook system-level costs; requires more effort.

13. Comparison with Previous Projects
- Comparable metadata: domain, user base, platform, etc.
- Pair-wise comparisons
- Differential functionality (analogy; "yesterday's weather")

14. Group Consensus
Wideband Delphi:
- Anonymous estimates
- Facilitator provides a summary
- Estimators discuss results and rationale
- Iterative process; estimates converge after revision in subsequent rounds
- Improves understanding of the product's nature and scope
- Works when estimators are collocated
Planning Poker (Cohn, 2005):
- Played as a game with a deck of cards
- Moderator presents the item to be estimated
- Participants privately choose an appropriate card from the deck
- Divergence is discussed
- Iterative; convergence in subsequent rounds
- Ensures everyone participates
- Useful for estimation in agile projects

15. Probabilistic Methods
PERT (Putnam and Fitzsimmons, 1979):
- Three estimates per component: optimistic (a), most likely (m), pessimistic (b)
- Expected size: E = (a + 4m + b) / 6
- Standard deviation: δ = (b - a) / 6
- Easy to use; the δ/E ratio reveals uncertainty
Example (sizes in SLOC):
Component   a     m     b     E       δ
SALES       6K    10K   20K   11K     2.33K
DISPLAY     4K    7K    13K   7.5K    1.5K
INEDIT      8K    12K   19K   12.5K   1.83K
TABLES      4K    8K    12K   8K      1.33K
TOTALS      22K   37K   64K   39K     δ_E = 3.6K
(δ_E is the root-sum-square of the component deviations.)
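The PERT arithmetic above is easy to script. The sketch below reproduces the slide's component table; it is a minimal illustration, not a tool from the deck.

```python
# PERT sizing (Putnam and Fitzsimmons, 1979): E = (a + 4m + b) / 6,
# delta = (b - a) / 6; component deviations combine root-sum-square.
import math

def pert(a, m, b):
    return (a + 4 * m + b) / 6, (b - a) / 6

components = {            # (optimistic, most likely, pessimistic), KSLOC
    "SALES":   (6, 10, 20),
    "DISPLAY": (4, 7, 13),
    "INEDIT":  (8, 12, 19),
    "TABLES":  (4, 8, 12),
}

expected = sum(pert(*est)[0] for est in components.values())
total_sd = math.sqrt(sum(pert(*est)[1] ** 2
                         for est in components.values()))
print(expected, round(total_sd, 1))   # 39.0 KSLOC, delta_E of about 3.6
```

Note that the total deviation (3.6K) is smaller than the sum of the component deviations (7.0K): independent component errors partially cancel, which is one argument for bottom-up estimation.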

16. Why Do People Underestimate Size? (Boehm, 1981)
- Estimators are basically optimistic and desire to please
- They have incomplete recall of previous experience
- They are generally not familiar with the entire software job

17. Outline
- Nature, value, and challenges of sizing
- Sizing methods and tools
- Sizing metrics
- Maintenance sizing
- Conclusions and references

18. Sizing Metrics vs. Time and Degree of Detail (Stutzke, 2005)
As a project moves through its process phases (Concept, Elaboration, Construction), the possible size measures, and the primary aids used to obtain them, grow increasingly detailed:
- Subsystems; key features (aids: product vision, analogies)
- User roles, use cases (aids: operational concept, context diagram)
- Screens, reports, files, application points (aids: specification, feature list)
- Function points (aids: architecture, top-level design)
- Components, objects (aid: detailed design)
- Source lines of code, logical statements (aid: code)

19. Sizing Metrics and Methodologies
- Requirements: Function Points, COSMIC Function Points, Use Case Points, Mark II Function Points
- Architecture and design: CK object-oriented metrics, Class Points, Object Points
- Code: Source Lines of Code (SLOC), module (file) count

20. Counting Artifacts
Artifacts include requirements, inputs, outputs, classes, use cases, modules, and scenarios.
- Often weighted by relative difficulty
- Easy to count at an initial stage
- Estimates may differ based on the level of detail

21. Cockburn Use Case Level of Detail Scale (Cockburn, 2001)

22. Function Point Analysis - 1
FPA measures software size by quantifying the functionality the software provides to its users.
Five function types:
- Data: Internal Logical File (ILF), External Interface File (EIF)
- Transactions: External Input (EI), External Output (EO), External Query (EQ)
Each function is weighted by its complexity. The total FP count is then adjusted using 14 general system characteristics (GSCs).
(Figure: MS Word as the system being counted, with a human user and external applications, MS Excel and MS Outlook, exchanging EI/EO/EQ transactions across the boundary; ILFs lie inside the boundary, EIFs outside.)
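As an illustration of the weighting and adjustment steps, here is a minimal unadjusted-FP calculator. The complexity weights are the standard IFPUG values; the function list and GSC ratings are invented for the example, and a real count follows detailed IFPUG rules for rating each function's complexity.

```python
# Function point sketch: sum standard IFPUG complexity weights, then
# adjust by the value adjustment factor derived from 14 GSC ratings.

WEIGHTS = {  # function type -> (low, average, high) weight
    "EI": (3, 4, 6), "EO": (4, 5, 7), "EQ": (3, 4, 6),
    "ILF": (7, 10, 15), "EIF": (5, 7, 10),
}
LEVEL = {"low": 0, "average": 1, "high": 2}

def unadjusted_fp(functions):
    """functions: list of (type, complexity) pairs."""
    return sum(WEIGHTS[t][LEVEL[c]] for t, c in functions)

def value_adjustment_factor(gsc_ratings):
    """14 general system characteristics, each rated 0..5."""
    return 0.65 + 0.01 * sum(gsc_ratings)

counted = [("EI", "average"), ("EO", "high"), ("EQ", "low"),
           ("ILF", "average"), ("EIF", "low")]
ufp = unadjusted_fp(counted)               # 4 + 7 + 3 + 10 + 5 = 29
fp = ufp * value_adjustment_factor([3] * 14)
print(ufp, round(fp, 2))
```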

23. Function Point Analysis - 2
Advantages:
- Independent of technology and implementation
- Can be measured early in the project lifecycle
- Can be applied consistently across projects and organizations
Disadvantages:
- Difficult to automate
- Requires technical experience and estimation skill
- Difficult to use for sizing maintenance projects

24. COSMIC FP
COSMIC measures the functionality of software through the movement of data (Entry, Exit, Read, Write) across logical layers.
Advantages:
- Same as FPA
- Applicable to real-time systems
Disadvantages:
- Difficult to automate
- Requires experience and estimation skill
- Requires a detailed architecture
- Difficult for customers to understand

25. Use Case Points
UCP counts the functional size of software by measuring the complexity of its use cases:
UCP = (#Actors + #Use cases) * technical and environmental factors
Advantages:
- Independent of technology and implementation
- Can be measured early in the project lifecycle
- Easier to count than FPA
Disadvantages:
- Lack of standardization
- Difficult to automate
- High variance (low accuracy)
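A sketch of the UCP formula above. The complexity weights are those commonly attributed to Karner's original method (actors: simple=1, average=2, complex=3; use cases: simple=5, average=10, complex=15); the project counts and adjustment factors are invented, which also illustrates the "lack of standardization" point, since published weightings vary.

```python
# Use case point sketch: weight actors and use cases by complexity,
# then scale by technical (TCF) and environmental (ECF) factors.

ACTOR_W = {"simple": 1, "average": 2, "complex": 3}
USECASE_W = {"simple": 5, "average": 10, "complex": 15}

def ucp(actors, use_cases, tcf=1.0, ecf=1.0):
    """actors / use_cases: dict mapping complexity level -> count."""
    uaw = sum(ACTOR_W[c] * n for c, n in actors.items())
    uucw = sum(USECASE_W[c] * n for c, n in use_cases.items())
    return (uaw + uucw) * tcf * ecf

size = ucp({"simple": 2, "complex": 1},      # invented project data
           {"average": 4, "complex": 2},
           tcf=0.95, ecf=1.05)
print(round(size, 1))
```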

26. CK Object-Oriented Metrics
Six OO metrics (Chidamber and Kemerer, 1994):
- Weighted Methods per Class (WMC)
- Depth of Inheritance Tree (DIT)
- Number of Children (NOC)
- Coupling Between Object Classes (CBO)
- Response For a Class (RFC)
- Lack of Cohesion in Methods (LCOM)
Advantages:
- Correlated with effort
- Can be automated using tools
Disadvantages:
- Difficult to estimate early
- Useful only for object-oriented code

27. Source Lines of Code (SLOC)
SLOC measures software size or length by counting the logical or physical lines of source code:
- Physical SLOC = number of physical lines in the code
- Logical SLOC = number of source statements
Advantages:
- Can be automated easily using tools (e.g., USC's CodeCount)
- Easy for customers and developers to understand
- Reflects the developer's view of the software
- Supported by most cost estimation models
Disadvantages:
- Dependent on programming language and programming style
- Difficult to estimate early
- Inconsistent definitions across languages and organizations
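To make the physical/logical distinction concrete, here is a deliberately naive counter for C-like source. Real tools such as CodeCount apply far more careful rules (block comments, strings, continuation lines); this sketch skips blanks and // comments and treats each ';' or block-opening '{' as one logical statement.

```python
# Rough physical-vs-logical SLOC counter for C-like source.
# Physical SLOC: non-blank, non-comment lines.
# Logical SLOC: approximated by counting ';' and '{' occurrences.

def count_sloc(source):
    physical = logical = 0
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("//"):
            continue                      # skip blanks and // comments
        physical += 1
        logical += stripped.count(";") + stripped.count("{")
    return physical, logical

code = """// sum the first n integers
int total = 0;
for (int i = 0; i < n; i++) { total += i; }
"""
print(count_sloc(code))
```

The same fragment yields 2 physical lines but 5 logical statements, which is exactly why definitions must be pinned down before SLOC counts are compared.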

28. SLOC Counting Rules (Adapted from Madachy, 2005)
A standard definition for counting lines:
- Based on the SEI definition checklist from CMU/SEI-92-TR-20
- Modified for COCOMO II

29. Relationship Among Sizing Metrics
Two broad categories of sizing metrics:
- Implementation-specific, e.g., Source Lines of Code (SLOC)
- Implementation-independent, e.g., Function Points (FP)
The two categories need to be related, e.g., via SLOC/FP backfire ratios.

30. SLOC/FP Backfiring Table (Jones, 1996)
Other published backfire ratios run up to 60% higher.
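A sketch of backfiring as described above. The ratios below are representative figures often quoted from Jones's backfiring tables (roughly 320 SLOC/FP for assembly, 128 for C, 107 for COBOL, 53 for Java); as the slide notes, published values vary by source, by as much as 60%, so treat any converted figure as a rough approximation.

```python
# Backfiring sketch: convert a SLOC count to an approximate FP count
# using a language-level ratio. Ratios are representative, not exact.

SLOC_PER_FP = {"assembly": 320, "c": 128, "cobol": 107, "java": 53}

def backfire_fp(sloc, language):
    """Approximate function points implied by a SLOC count."""
    return sloc / SLOC_PER_FP[language]

print(round(backfire_fp(12800, "c")))   # 12,800 SLOC of C ~ 100 FP
```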

31. Multisource Estimation
Implementation-independent estimators: use cases, function points, requirements.
- Advantage of implementation independence: good for productivity comparisons when using VHLLs, COTS, or reusable components
- Weakness of implementation independence: gives the same estimate whether development uses VHLLs, COTS, reusable components, or 3GL development
Multisource estimation reduces risk.

32. Outline
- Nature, value, and challenges of sizing
- Sizing methods and tools
- Sizing metrics
- Maintenance sizing
- Conclusions and references

33. Sizing in Reuse and Maintenance
(Figure: across project execution from Release 1 to Release 2, the delivered code base combines new code, adapted code, reused code, and acquired code drawn from the existing system's code.)

34. Calculating Total Equivalent SLOC - 1
In the COCOMO II reuse model, the adaptation adjustment factor is
AAF = 0.4 * DM + 0.3 * CM + 0.3 * IM
where DM = % design modified, CM = % code modified, and IM = % of integration and test needed.

35. Calculating Total Equivalent SLOC - 2

36. Adaptation Adjustment Modifier (AAM)
(Figure: relative cost plotted against the relative amount of modification, AAF.)
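The reuse slides above can be pulled together in a short sketch of the COCOMO II reuse model (Boehm et al., 2000): AAF combines the modification percentages, AAM adds assessment-and-assimilation (AA), software-understanding (SU), and unfamiliarity (UNFM) effects, and adapted code is then converted to equivalent new SLOC. The parameter values in the example are invented.

```python
# Equivalent-SLOC sketch following the COCOMO II reuse model.

def aaf(dm, cm, im):
    """Adaptation adjustment factor from % design/code modified, % I&T."""
    return 0.4 * dm + 0.3 * cm + 0.3 * im

def aam(dm, cm, im, aa=0.0, su=0.0, unfm=0.0):
    """Adaptation adjustment modifier, as a fraction of new-code cost."""
    f = aaf(dm, cm, im)
    if f <= 50:
        return (aa + f * (1 + 0.02 * su * unfm)) / 100
    return (aa + f + su * unfm) / 100

def equivalent_sloc(new_sloc, adapted_sloc, dm, cm, im, **kw):
    """Total equivalent SLOC = new code + modifier-weighted adapted code."""
    return new_sloc + adapted_sloc * aam(dm, cm, im, **kw)

# 2,000 new SLOC plus 8,000 adapted SLOC with 10% of the design and
# 20% of the code modified, and 30% of integration and test needed:
print(equivalent_sloc(2000, 8000, dm=10, cm=20, im=30))
```

Here AAF = 19%, so the 8,000 adapted SLOC count as 1,520 equivalent new SLOC; the nonlinear AAM curve on the slide reflects that even small modifications carry understanding and assimilation costs.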

37. Conclusions
- Size plays an important role in estimation and project management activities; software estimation is a "garbage in, garbage out" process: bad size in, bad cost out
- A number of challenges make size estimation difficult
- Different sizing metrics exist: FP, COSMIC FP (CFU), Use Case Points, SLOC, CK OO metrics
- A plethora of sizing methods exists; each has strengths and weaknesses

38. References
Books:
- Boehm, Barry W., Software Engineering Economics, Prentice Hall, 1981.
- Boehm, Barry W., et al., Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
- Brooks, Frederick P., Jr., The Mythical Man-Month, Addison-Wesley, 1995.
- Cockburn, A., Writing Effective Use Cases, Addison-Wesley, 2001.
- Cohn, M., Agile Estimating and Planning, Prentice Hall, 2005.
- Jones, C., Applied Software Measurement: Assuring Productivity and Quality, McGraw-Hill, 1996.
- Stutzke, Richard D., Estimating Software-Intensive Systems, Addison-Wesley, 2005.
Journals:
- Chidamber, S., and C. Kemerer, "A Metrics Suite for Object Oriented Design," IEEE Transactions on Software Engineering, 1994.
- Putnam, L. H., and A. Fitzsimmons, "Estimating Software Costs," Datamation, 1979.
Tutorials/Lectures/Presentations:
- Boehm, Barry W., and Ali A. Malik, "System & Software Sizing," COCOMO Forum Tutorial, 2007.
- Boehm, Barry W., "Cost Estimation with COCOMO II," CSCI 577a Lecture, 2004.
- Madachy, Ray, "COCOMO II Overview," CSCI 510 Lecture, 2005.
- Valerdi, Ricardo, "The Constructive Systems Engineering Cost Model," INCOSE Presentation, 2005.
Websites:
- http://www.crisp.se/planningpoker/
- http://www.ifpug.org/
- http://www.spr.com/

