CS-499G Presentation Transcript

1 Costs without Maintenance

2 Costs with Maintenance

3 Waterfall model
- Requirements Definition
- System & Software Design
- Implementation & Unit Testing
- Integration & System Testing
- Operation & Maintenance
The drawback of the waterfall model is the difficulty of accommodating change after the process is underway.

4 Waterfall Model with Feedback
- Requirements Definition
- System & Software Design
- Implementation & Unit Testing
- Integration & System Testing
- Operation & Maintenance

5 Iterative Models
- Prototyping
- RAD

6 An Evolutionary (Spiral) Model

7 Why Do Projects Fail?
- an unrealistic deadline is established
- changing customer requirements
- an honest underestimate of effort
- predictable and/or unpredictable risks
- technical difficulties
- miscommunication among project staff
- failure in project management

8 Software Teams
The following factors must be considered when selecting a software project team structure:
- the difficulty of the problem to be solved
- the size of the resultant program(s) in lines of code or function points
- the time that the team will stay together (team lifetime)
- the degree to which the problem can be modularized
- the required quality and reliability of the system to be built
- the rigidity of the delivery date
- the degree of sociability (communication) required for the project

9 Defining the Problem
- establish scope: a narrative that bounds the problem
- decomposition: establishes functional partitioning

10 To Get to the Essence of a Project
- Why is the system being developed?
- What will be done? By when?
- Who is responsible for a function?
- How will the job be done technically and managerially?
- How much of each resource (e.g., people, software, tools, database) will be needed?
(Barry Boehm)

11 Task duration and dependencies

12 Activity network

13 Activity timeline
(bar chart: a weekly timeline from 4/7 to 19/9 showing the schedule of tasks T1-T12 and milestones M1-M8 from start to finish)

14 Staff allocation

15 Measurement & Metrics
Common objections: collecting metrics is too hard... it's too time-consuming... it's too political... it won't prove anything...
"Anything that you need to quantify can be measured in some way that is superior to not measuring it at all." (Tom Gilb)

16 Metrics
- Project Metrics
  - Effort/time per SE task
  - Errors uncovered per review hour
  - Error rate as a function of project time
  - Scheduled vs. actual milestone dates
  - Changes (number) and their characteristics
  - Distribution of effort on SE tasks
  - Estimated vs. actual effort required
- Product Metrics
  - focus on the quality of deliverables
  - measures of the analysis model
  - complexity of the design
    - internal algorithmic complexity (e.g., McCabe)
    - architectural complexity
    - data flow complexity
  - code measures
  - measures of process effectiveness (e.g., defect removal efficiency)
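One of the listed design measures, McCabe's internal algorithmic complexity, reduces to a one-line formula. A minimal Python sketch follows; the flow-graph counts in the example are invented for illustration:

    def cyclomatic_complexity(edges: int, nodes: int, components: int = 1) -> int:
        """McCabe's cyclomatic complexity: V(G) = E - N + 2P for a control-flow graph."""
        return edges - nodes + 2 * components

    # e.g., a routine whose control-flow graph has 9 edges and 7 nodes
    print(cyclomatic_complexity(9, 7))  # -> 4 independent paths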

17 Typical Metrics
- Size Oriented
  - errors per KLOC (thousand lines of code)
  - defects per KLOC
  - $ per LOC
  - pages of documentation per KLOC
  - errors per person-month
  - LOC per person-month
  - $ per page of documentation
- Function Oriented
  - errors per FP (function point)
  - defects per FP
  - $ per FP
  - pages of documentation per FP
  - FP per person-month
A small calculation sketch follows this list.
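As a rough illustration of how the size-oriented figures are derived, here is a minimal Python sketch; all of the project totals below are hypothetical:

    # Hypothetical project totals, used only to show the arithmetic.
    loc = 12_100          # delivered lines of code
    errors = 134          # problems found before release
    defects = 29          # problems found after release
    effort_pm = 24.0      # person-months of effort
    cost = 168_000        # total cost in dollars
    doc_pages = 365       # pages of documentation

    kloc = loc / 1000
    print(f"errors per KLOC:  {errors / kloc:.1f}")
    print(f"defects per KLOC: {defects / kloc:.1f}")
    print(f"$ per LOC:        {cost / loc:.2f}")
    print(f"pages of documentation per KLOC: {doc_pages / kloc:.1f}")
    print(f"LOC per person-month: {loc / effort_pm:.0f}")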

18 Measuring Quality
- Correctness: the degree to which a program operates according to specification
- Maintainability: the degree to which a program is amenable to change
- Integrity: the degree to which a program is impervious to outside attack
- Usability: the degree to which a program is easy to use

19 Defect Removal Efficiency
DRE = errors / (errors + defects)
where
- errors = problems found before release
- defects = problems found after release
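A minimal sketch of the calculation; the counts in the example are invented:

    def defect_removal_efficiency(errors: int, defects: int) -> float:
        """DRE = errors / (errors + defects), using the slide's definitions."""
        return errors / (errors + defects)

    # e.g., 134 problems found before release, 29 found after release
    print(f"DRE = {defect_removal_efficiency(134, 29):.2%}")  # -> about 82%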

20 Write it Down!
Software Project Plan:
- Project Scope
- Estimates
- Risks
- Schedule
- Control strategy

21 To Understand Scope...
- understand the customer's needs
- understand the business context
- understand the project boundaries
- understand the customer's motivation
- understand the likely paths for change
- understand that... even when you understand, nothing is guaranteed!

22 Cost Estimation
- Project scope must be explicitly defined
- Task and functional decomposition is necessary
- Historical measures (metrics) are extremely helpful
- Always use more than one metric
- Always use more than one technique
- Remember: the only certainty is uncertainty

23 Estimation Techniques
- past (similar) project experience
- conventional estimation techniques
  - task breakdown and effort estimates
  - size (e.g., FP) estimates
- tools (e.g., Checkpoint)

24 Functional Decomposition
(diagram: a statement of scope, "perform some task," is broken into smaller functions by functional decomposition)

25 Estimation Guidelines
- Estimate using at least two techniques
- Get estimates from independent sources
- Avoid over-optimism; assume difficulties
- Be a pessimist, not an optimist
- Once you obtain an estimate, sleep on it!
- Adjust for the people doing the job; they have the biggest impact

26 Paul P's Project Plan Guidelines
- Remember that you are not doing the work; someone with less experience will be
- Estimate the time needed to do each task
- Add 25% for overhead (meetings, vacation, etc.)
- Add 25% for contingency
- Add 25% because everyone is always an optimist (Murphy's Law: if anything can go wrong, it will)

27 Example
- Initial project estimate: 80 person-months to complete
- +25% overhead: 100 PM
- +25% contingency: 125 PM
- +25% to cover for Murphy's Law: 156 PM
- The actual size of the project is almost double the initial estimate
- With lots of overtime (and no changes to the project) you may get it done on schedule
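A quick sketch of that compounding; the 80 person-month base and the 25% markups come from the slides, while the function name is purely illustrative:

    def padded_estimate(base_pm: float, markups=(0.25, 0.25, 0.25)) -> float:
        """Apply successive markups for overhead, contingency, and optimism."""
        estimate = base_pm
        for markup in markups:
            estimate *= 1 + markup
        return estimate

    print(padded_estimate(80))  # 80 -> 100 -> 125 -> 156.25 person-months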

28 Project Risks
- What can go wrong?
- What is the likelihood?
- What will the damage be?
- What can we do about it?

29 Risk Due to Product Size
Attributes that affect risk:
- estimated size of the product in LOC or FP
- estimated size of the product in number of modules, files, transactions
- percentage deviation of the size of the product from the average of previous products
- size of the database created or used by the product
- number of users of the product
- number of projected changes to the requirements for the product: total, before delivery, after delivery
- amount of reused software
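As one illustration, the percentage-deviation attribute is simple to compute; a minimal sketch with invented historical sizes:

    # Hypothetical sizes (KLOC) of previous products, used only for illustration.
    previous_sizes_kloc = [34, 48, 27, 55]
    estimated_size_kloc = 62

    average = sum(previous_sizes_kloc) / len(previous_sizes_kloc)
    deviation_pct = (estimated_size_kloc - average) / average * 100
    print(f"{deviation_pct:+.0f}% relative to the historical average")  # -> +51%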

30 Why Project Planning Pays Off
Relative cost to find and fix a defect (log scale) rises by phase:
- requirements: 0.75
- design: 1.00
- code: 1.50
- test: 3.00
- system test: 10.00
- field use: 60.00-100.00

31 Reviews & Inspections
"... there is no particular reason why your friend and colleague cannot also be your sternest critic." (Jerry Weinberg)

32 Reviews
- A review is:
  - a meeting conducted by technical people for technical people
  - a technical assessment of a work product created during the software engineering process
  - a software quality assurance mechanism
  - a training ground
- A review is not:
  - a project budget summary
  - a scheduling assessment
  - an overall progress report
  - a mechanism for reprisal or political intrigue

33 Conducting the Review
1. be prepared: evaluate the product before the review
2. review the product, not the producer
3. keep your tone mild; ask questions instead of making accusations
4. stick to the review agenda
5. raise issues, don't resolve them
6. avoid discussions of style; stick to technical correctness
7. schedule reviews as project tasks
8. record and report all review results

34 Metrics Derived from Reviews
- inspection time per page of documentation
- inspection time per KLOC or FP
- inspection effort per KLOC or FP
- errors uncovered per reviewer hour
- errors uncovered per preparation hour
- errors uncovered per SE task (e.g., design)
- number of minor errors (e.g., typos)
- number of major errors (e.g., nonconformance to requirements)
- number of errors found during preparation

35 Code Reviews
- Purpose:
  - a formalized approach to document reviews
  - intended explicitly for defect DETECTION (not correction)
  - defects may be logical errors, anomalies in the code that might indicate an erroneous condition (e.g., an uninitialized variable; see the short sketch after this slide), or non-compliance with standards
- Preconditions:
  - a precise specification must be available
  - team members must be familiar with the organization's standards
  - syntactically correct code must be available
  - an error checklist should be prepared
  - management must accept that inspection will increase costs early in the software process
  - management must not use inspections for staff appraisal
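A tiny, hypothetical Python fragment showing the kind of "uninitialized variable" anomaly a reviewer is expected to flag:

    def classify(score):
        # Anomaly a reviewer should catch: `label` is assigned only on some
        # paths, so a score of exactly 50 raises UnboundLocalError at runtime.
        if score > 50:
            label = "pass"
        elif score < 50:
            label = "fail"
        return label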

36 Inspection
- Procedure:
  - system overview presented to the inspection team
  - code and associated documents are distributed to the inspection team in advance
  - inspection takes place and discovered errors are noted
  - modifications are made to repair discovered errors
  - re-inspection may or may not be required
- Teams:
  - made up of at least 4 members
    - Author of the code being inspected
    - Reader who reads the code to the team
    - Inspector who finds errors, omissions and inconsistencies
    - Moderator who chairs the meeting and notes discovered errors
  - other roles are Scribe and Chief moderator

