Software Sizing, Estimation, and Tracking


1 Software Sizing, Estimation, and Tracking
Pongtip Aroonvatanaporn
CSCI 577, Spring 2012
February 10, 2012
(C) USC-CSSE

2 Outline
Terms and Definitions
Software Sizing
Software Estimation
Software/Project Tracking

3 Terms and Definitions
Software Sizing: mechanism to estimate size and complexity
Software Estimation: mechanism to estimate effort, time, and duration
Software/Project Tracking: mechanism to manage project progress

4 Software Sizing
Agile techniques: story points, Planning Poker
Traditional techniques: expert judgment, function points, application points
Uncertainty treatment: PERT sizing, Wideband Delphi, COCOMO-U

5 Story Points

6 Story Points: What?
An estimation mechanism based on user stories
Features/capabilities = story points
Often used by Scrum teams
Strong focus on the agile process
A way to estimate difficulty without committing to a time duration
Measures size and complexity: essentially, how hard it is

7 Story Points: Why?
Better than hours: humans are not good at estimating hours
Standish Group survey: 68% of projects failed to meet their original estimates
Hours completed tell you nothing: no useful information for clients/customers
Story points can provide a roadmap of the capabilities to be delivered
Less variation

8 Story Points: How To?
Involves the entire team
Process:
  Look at the backlog of features
  Pick the easiest one
  Give that feature a score (e.g., 2)
  Estimate the other features relative to that point
Cohn Scale (Fibonacci-like): 0, 1, 2, 3, 5, 8, 13, 20, 40, 100

9 Story Points: How To?
Velocity
  First sprint: a guess
  After 2-3 sprints, average the story points completed per sprint
  Velocity is then used for planning future iterations (see the sketch below)
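As a minimal sketch of velocity-based planning (the sprint numbers and backlog size are invented for illustration), the idea is to average completed points and divide the remaining backlog:

```python
# Sketch of velocity-based planning; all numbers are illustrative.
import math

completed_per_sprint = [18, 23, 21]          # story points finished in sprints 1-3
velocity = sum(completed_per_sprint) / len(completed_per_sprint)

backlog_points = 120                          # story points remaining in the backlog
sprints_left = math.ceil(backlog_points / velocity)

print(f"velocity ~ {velocity:.1f} points/sprint")
print(f"estimated sprints remaining: {sprints_left}")
```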

10 Story Points: The Good
Estimate the backlog
Focus on the product, not tasks: items that are valuable to clients/customers
Track progress based on results delivered
Hours are bad: one hour for the most productive team is not one hour for the least productive team
In industry, story point estimation has been reported to cut estimation time by 80%, while doing more estimation and tracking than a typical waterfall process, and to be up to 48 times faster than traditional waterfall estimation

11 Story Points: The Bad
Publishing vs. Development: publishing takes less effort
Complexity vs. Time: some stories are intellectually complex, while others are simply time consuming; less complex but repetitive tasks get lower numbers
Not accurate about the actual effort required
Some developers prefer hours and days
Difficult to determine completion time without a velocity

12 Story Points: Example
Students can purchase monthly parking passes online
Parking passes can be paid via credit cards
Parking passes can be paid via PayPal
Professors can input student marks
Students can obtain their current seminar schedule
Students can order official transcripts
Students can only enroll in seminars for which they have prerequisites
Transcripts will be available online via a standard browser

13 Planning Poker

14 Planning Poker: What?
A mechanism to introduce estimation and invoke discussion
Like playing poker: each person has cards, and everyone reveals their cards at the same time

15 Planning Poker: Why?
Multiple expert opinions: knowledgeable people are best suited for estimation tasks
Estimates require justifications
Improves accuracy and compensates better for missing information
Good for story point estimation
The average of the estimates gives better results

16 Planning Poker: How?
Include all developers
Process (see the sketch after this list):
  Each estimator is given a deck of cards
  For each user story, the moderator reads the description
  Discuss the story until all questions are answered
  Each estimator selects a card representing his/her estimate
  Everyone shows their cards at the same time (to avoid bias)
  The high and low estimators explain their estimates
  Discuss the estimates; estimators re-select cards
  If the estimates converge, take the average; if they do not, repeat the process
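A minimal sketch of one estimation round, assuming the Cohn-scale deck from the story-points slides; the convergence rule here (high and low cards adjacent on the deck) is an assumption, not from the slides:

```python
# Sketch of one Planning Poker round; the convergence rule is an assumption.
DECK = [0, 1, 2, 3, 5, 8, 13, 20, 40, 100]

def poker_round(estimates):
    """Check a round of revealed cards for convergence on the Cohn deck."""
    lo, hi = min(estimates), max(estimates)
    # Treat estimates as converged when high and low are adjacent deck values.
    if DECK.index(hi) - DECK.index(lo) <= 1:
        return sum(estimates) / len(estimates)   # converged: take the average
    return None                                   # discuss outliers, re-estimate

print(poker_round([5, 5, 8, 5]))    # adjacent values -> averaged estimate
print(poker_round([2, 13, 5, 40]))  # wide spread -> None, repeat the round
```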

17 Planning Poker: How?
Done at two different times
First: before the project begins, estimate a large number of items (the initial set of user stories)
Second: at the end of each iteration, estimate for the upcoming iteration

18 Planning Poker: The Good
Fun and enjoyable
Convergence of estimates: more accurate, backed by justifications
Invokes group discussions: improves understanding and perspectives

19 Planning Poker: The Bad
Easy to get into an excessive amount of discussion
Inaccurate estimates produce bad results
Requires a high level of expertise behind the opinions and analogies
High and low estimators may be viewed as “attackers”

20 Function Points

21 Function Points: What?
Quantifies the functionality of an application
Measures development and maintenance independently of technology, consistently across all projects
A unit of measure representing function size: application size = the number of functions delivered
Based on the user's perspective: what the user asked for, not what is delivered
Low cost and repeatable
Good for estimating use cases

22 Function Points: How?
Process (summarized by the formula below):
1. Determine function counts by type
2. Determine complexity levels: classify each function count by its complexity level
3. Apply complexity weights
4. Compute Unadjusted Function Points: add all the weighted function counts to get one number (UFP)
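In formula form (the standard unadjusted function point definition), where $N_{t,c}$ is the number of functions of type $t$ counted at complexity $c$ and $w_{t,c}$ is the corresponding weight from the Complexity Levels slide:

$$\mathrm{UFP} = \sum_{t \in \{\mathrm{ILF},\,\mathrm{EIF},\,\mathrm{EI},\,\mathrm{EO},\,\mathrm{EQ}\}} \; \sum_{c \in \{\mathrm{low},\,\mathrm{avg},\,\mathrm{high}\}} N_{t,c} \, w_{t,c}$$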

23 Function Points: How?
Data functions: Internal Logical Files, External Interface Files
Transactional functions: External Inputs, External Outputs, External Inquiries

24 Internal Logical Files
Data that is stored and maintained within your application
Data that your application is built to maintain
Examples: tables in a database, flat files, application control information (configuration, preferences), LDAP data stores

25 External Interface Files
Data that your application uses/references but does not maintain
Any data that your application needs
Examples: the same as Internal Logical Files, but not maintained by your system

26 External Inputs
Unique user data or user control input that enters the application
Comes from outside the application boundary
Examples: data entry by users, data or file feeds from external applications

27 External Outputs
User data or control information that leaves the application
Crosses the external boundary to present information
Retrieval of data or control information
Examples: reports, data displayed on screen

28 External Inquiries
A unique input-output combination: the input causes/generates an immediate output
No mathematical formulas or calculations; creates no derived data
No Internal Logical Files are maintained during processing
The behavior of the system is not altered
Examples: reports that do not involve derived data (direct queries)

29 Complexity Levels

Complexity matrix (Record Elements vs. Data Elements):

Record Elements   Data Elements: 1-19   20-50   51+
1                 Low                   Low     Avg
2-5               Low                   Avg     High
6+                Avg                   High    High

Complexity weights:

Function Type              Low   Average   High
Internal Logical Files      7      10       15
External Interface Files    5       7       10
External Inputs             3       4        6
External Outputs            4       5        7
External Inquiries          3       4        6
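As a sketch of the four-step process using the weight table above (the function counts here are invented for illustration):

```python
# Unadjusted Function Point computation using the standard weight table above.
WEIGHTS = {  # function type -> (low, average, high) weights
    "ILF": (7, 10, 15),   # Internal Logical Files
    "EIF": (5, 7, 10),    # External Interface Files
    "EI":  (3, 4, 6),     # External Inputs
    "EO":  (4, 5, 7),     # External Outputs
    "EQ":  (3, 4, 6),     # External Inquiries
}

# Illustrative counts: (low, average, high) occurrences of each function type.
counts = {"ILF": (2, 1, 0), "EIF": (1, 0, 0), "EI": (4, 2, 1),
          "EO": (2, 1, 0), "EQ": (3, 0, 0)}

ufp = sum(n * w
          for ftype, ns in counts.items()
          for n, w in zip(ns, WEIGHTS[ftype]))
print(f"UFP = {ufp}")  # 77 with these illustrative counts
```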

30 Function Point to SLOC
COCOMO II has a built-in calibration for converting Unadjusted Function Points to SLOC
First specify the implementation language/technology, then apply the multiplier (SLOC/UFP); a sketch follows
More information can be found in the COCOMO II book
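A minimal sketch of the conversion; the multipliers here are commonly cited COCOMO II backfiring values and should be treated as assumptions (check the calibrated table in the COCOMO II book):

```python
# FP -> SLOC conversion sketch. Multipliers are assumed values; see Boehm et al. 2000.
SLOC_PER_UFP = {"C": 128, "C++": 55, "Java": 53}

def fp_to_sloc(ufp: int, language: str) -> int:
    """Convert Unadjusted Function Points to estimated SLOC for a language."""
    return ufp * SLOC_PER_UFP[language]

print(fp_to_sloc(77, "Java"))  # the 77 UFP from the earlier sketch -> ~4081 SLOC
```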

31 Function Points: The Good
Independent of programming language and technology
Helps derive productivity and quality performance indicators: benchmarking, productivity rate, cost per FP
Guards against increases in scope (function creep)

32 Function Points: The Bad
Requires subjective evaluations: a lot of judgment is involved
Many cost estimation models do not support function points directly; they must be converted to SLOC first
Not as much research data available compared to LOC
Can only be performed after design specification

33 Estimating with Uncertainty?

34 Uncertainty Treatment
PERT Sizing: uses a distribution; specify pessimistic, optimistic, and most likely sizes (though the inputs may be biased); see the sketch below
Wideband Delphi: experts discuss, then estimate individually; discussions focus on the points where estimates vary widely; reiterate as necessary
COCOMO-U: an extension of COCOMO II that uses a Bayesian Belief Network to address uncertain parameters and provides a range of possible values
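A sketch of the classic PERT three-point size estimate that the "distribution" bullet refers to; the beta-distribution mean and standard deviation approximations are standard PERT formulas, and the sizes are invented:

```python
# Classic PERT three-point size estimate (standard formulas).
def pert_estimate(optimistic: float, most_likely: float, pessimistic: float):
    """Return the PERT mean and standard deviation for a three-point estimate."""
    mean = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return mean, std_dev

# Illustrative sizes in SLOC.
mean, sd = pert_estimate(3000, 4500, 9000)
print(f"expected size ~ {mean:.0f} SLOC, std dev ~ {sd:.0f} SLOC")
```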

35 Workshop time!

36 Scenario
Develop a software system for effort reporting (sounds familiar?)
Software Requirements:
  User authentication
  User capabilities: select week; submit weekly effort; view/update weekly effort; view weekly total
  Admin capabilities: view grade report by user (on-time submission); add/view/edit effort categories
An illustrative function-point take on this scenario follows.
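Purely as an illustration of how the function-point process from the earlier slides might apply here; every classification and count below is an assumption about the workshop exercise, not a model answer:

```python
# Hypothetical FP classification of the Effort Reporting scenario.
# All counts and complexity levels are assumptions, not a model answer.
WEIGHTS = {"ILF": (7, 10, 15), "EIF": (5, 7, 10), "EI": (3, 4, 6),
           "EO": (4, 5, 7), "EQ": (3, 4, 6)}  # (low, average, high)
counts = {
    "ILF": (2, 0, 0),  # effort records, effort categories
    "EIF": (1, 0, 0),  # assumed external user directory for authentication
    "EI":  (3, 0, 0),  # submit effort, update effort, edit categories
    "EO":  (0, 2, 0),  # grade report, weekly total
    "EQ":  (2, 0, 0),  # view weekly effort, view categories
}
ufp = sum(n * w for t, ns in counts.items() for n, w in zip(ns, WEIGHTS[t]))
print(f"UFP = {ufp}")  # 44 with these assumed counts
```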

37 Outline
Terms and Definitions
Software Sizing
Software Estimation
Software/Project Tracking

38 Project Tracking
Goal-Question-Metric
PERT Network Chart
Gantt Chart
Burn Up and Burn Down Charts

39 Goal-Question-Metric: What?
By Victor Basili, University of Maryland and NASA
A software metrics approach
Captures measurement on three levels:
  Conceptual level (goal): defined for an object
  Operational level (question): defines models of the object of study
  Quantitative level (metric): metrics associated with each question in a measurable way

40 Goal-Question-Metric: Why?
Used within the context of software quality improvement
Effective for the following purposes:
  Understanding an organization's software practices
  Guiding and monitoring software processes
  Assessing new software engineering technologies
  Evaluating improvement activities

41 Goal-Question-Metric: How?
Six-step process (a data-structure sketch follows):
1. Develop a set of corporate, division, and project business goals
2. Generate questions that define those goals
3. Specify the measures that need to be collected to answer the questions
4. Develop mechanisms for data collection
5. Collect, validate, and analyze the data; provide feedback in real time
6. Analyze the data post mortem; provide recommendations for future improvements
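A minimal sketch of how the goal/question/metric hierarchy from steps 1-3 might be represented; the concrete goal, question, and metric strings are invented for illustration:

```python
# Sketch of a GQM hierarchy. The example goal/questions/metrics are invented.
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    metrics: list[str] = field(default_factory=list)

@dataclass
class Goal:
    purpose: str
    questions: list[Question] = field(default_factory=list)

goal = Goal(
    purpose="Improve the timeliness of weekly effort reporting",
    questions=[
        Question("What fraction of reports are submitted on time?",
                 metrics=["on-time submissions / total submissions"]),
        Question("Where are the delays?",
                 metrics=["hours between week end and submission"]),
    ],
)
for q in goal.questions:
    print(q.text, "->", q.metrics)
```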

42 Goal-Question-Metric: The Good
Aligns with the organization's environment: its objectives and goals, and the project context
Flexible

43 Goal-Question-Metric: The Bad
Only useful when used correctly: you must specify the right goals, questions, and metrics
Requires experience and a high level of knowledge to use
No explicit support for integrating with higher-level business goals and strategies
Some things cannot be measured

44 GQM+Strategies: What?
An extension of GQM, built on top of it
Links software measurement goals to higher-level goals of the software organization and of the entire business

45 GQM+Strategies: Example
Want: increase customer satisfaction
Strategy: improve product reliability (both hardware and software)
Software development's contribution: reduce defect slippage; improve the testing process
Team leaders decide on a set of actions to take: implement improvements, then measure the results of those improvements
This creates a tie between test defect data and customer satisfaction

46 GQM+Strategies: Example

47 Other Project Management Methods

48 PERT Network Chart
Identifies critical paths
Nodes are updated to show progress
Grows quickly and becomes unusable when large, especially in smaller agile environments
Eventually gets thrown away

49 “Burn” Charts
Burn Up and Burn Down
Effective for tracking progress; good for story points (see the sketch below)
Not good at responding to major changes
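As a hedged sketch with invented sprint data: a burn-down chart plots remaining story points against time, while a burn-up chart plots cumulative completed points against the total scope:

```python
# Burn chart data sketch; sprint numbers are invented for illustration.
total_scope = 120
completed_per_sprint = [0, 18, 41, 60, 84, 103]  # cumulative points done (burn-up)

for sprint, done in enumerate(completed_per_sprint):
    remaining = total_scope - done               # burn-down value
    print(f"sprint {sprint}: done={done:3d}  remaining={remaining:3d}")
```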

50 Workshop Time!

51 References
Story Points
Planning Poker
Function Points
  Boehm, B., Abts, C., Brown, A.W., Chulani, S., Horowitz, E., Madachy, R., Reifer, D.J., and Steece, B. Software Cost Estimation with COCOMO II. Prentice-Hall, 2000.
Goal-Question-Metric
GQM+Strategies
PERT
  Wiest, J.D. and Levy, F.K. A Management Guide to PERT/CPM. Prentice-Hall, Englewood Cliffs, 1977.
Burn Charts
  Cockburn, A. “Earned-Value and Burn Charts (Burn Up and Burn Down).” Crystal Clear, Addison-Wesley, 2004.

