
1 More SQA Reviews and Inspections

2 Types of Evaluations
- Verification: unit test, integration test, usability test, etc.
- Formal Reviews: aka "formal design review", "formal technical review", etc.; conducted by senior personnel or outside experts to uncover potential problems.
- Peer Reviews: aka "inspections", "walkthroughs", etc.; done by peers to detect errors, check adherence to standards, etc.

3 Formal Reviews
- Reviewers should be senior personnel and/or outside experts.
- The review leader should not be the project leader.
- Usually done at the end of a phase:
  - very appropriate for the SRS and the design
  - rarely appropriate for code
- Outcome: approve; approve pending changes; reject.
(Software Quality Assurance by Galin, section 8.2)

4 Sample Checklist: Formal Review of a Design
- Adequate
- Well-structured
- Simple
- Efficient
- Flexible
- Practical
- Implementable

5 General:
1. Does the architecture convey a clear vision of the system that can be used for further development?
2. Is the architecture structured to support likely changes?
3. Does the architecture describe the system at a high level, with no interface or implementation details?
4. Does the architecture cleanly decompose the system?
5. Is the architecture independent of the infrastructure used to develop the system?
6. Has maintainability been considered?
7. No duplicate functionality in the architecture?
Complete:
1. Are the software requirements reflected in the software architecture?
2. Is effective modularity achieved? Are modules functionally independent?
3. Does each module/class have an understandable name?
4. Is each association well named?
5. Is each association's and aggregation's cardinality correct?
Correct:
1. Does each association reflect a relationship that exists over the lives of the related modules/classes?
2. Does the architecture have loose coupling and good cohesion?
www.cs.trincoll.edu/~hellis2/CPSC240/Project/Design Review Checklist.doc
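The modularity, coupling, and cohesion questions above are easiest to see in code. Below is a minimal, hypothetical Python sketch (the Storage and ReportGenerator names are invented for illustration, not taken from the checklist) contrasting a module that depends only on a small abstract interface with the concrete store injected from outside, which is roughly what a reviewer checking for "loose coupling" and "functional independence" would want to see.

```python
# Illustrative sketch only; class names are hypothetical, not from the checklist.
from abc import ABC, abstractmethod


class Storage(ABC):
    """Small, stable interface: callers depend on this, not on any concrete store."""

    @abstractmethod
    def save(self, key: str, data: str) -> None: ...


class InMemoryStorage(Storage):
    def __init__(self) -> None:
        self.items: dict[str, str] = {}

    def save(self, key: str, data: str) -> None:
        self.items[key] = data


class ReportGenerator:
    """Loosely coupled: any Storage can be injected, so this module is
    functionally independent of how (or where) reports are persisted."""

    def __init__(self, storage: Storage) -> None:
        self.storage = storage

    def publish(self, name: str, text: str) -> None:
        self.storage.save(name, text)


# Usage: swapping InMemoryStorage for a file- or database-backed Storage
# requires no change to ReportGenerator.
store = InMemoryStorage()
ReportGenerator(store).publish("q1-report", "quarterly summary")
print(store.items)   # {'q1-report': 'quarterly summary'}
```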

6 Peer Reviews / Inspections / Walkthroughs
Guided by: checklists, standards, past problems.
Attendees:
- review leader
- the author
- scribe
- 1 or 2 people with domain knowledge
- possibly an SQA team member (for standards)
(Galin, section 8.3)
Why schedule a meeting with so many people? Why not just have two people review the item without a meeting?

7 Inspection Process
1. Pre-meeting: review the item ahead of time.
2. Meeting: the author presents an overview; the review team asks questions and expresses opinions.
3. After the meeting: the scribe prepares a summary; the team approves the summary; follow up.
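One way to make these steps concrete is to model an inspection record that moves through the same stages. The Python sketch below is purely illustrative; the class, enum, and field names are invented for this lecture, not part of any standard inspection tool.

```python
# Hypothetical sketch of the inspection workflow above; not a real tool's API.
from dataclasses import dataclass, field
from enum import Enum, auto


class Stage(Enum):
    PRE_MEETING = auto()   # reviewers study the item ahead of time
    MEETING = auto()       # author presents an overview, team raises issues
    SUMMARY = auto()       # scribe writes up findings, team approves
    FOLLOW_UP = auto()     # author addresses findings, closure is verified


@dataclass
class Inspection:
    item: str
    stage: Stage = Stage.PRE_MEETING
    findings: list[str] = field(default_factory=list)

    def record_finding(self, note: str) -> None:
        # Findings are recorded during the meeting, not fixed there.
        self.findings.append(note)

    def advance(self) -> None:
        order = list(Stage)
        self.stage = order[min(order.index(self.stage) + 1, len(order) - 1)]


insp = Inspection("Design document, section 3")
insp.advance()                      # into the meeting
insp.record_finding("Interface X is not consistent with the architecture")
insp.advance()                      # scribe prepares the summary
print(insp.stage, insp.findings)
```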

8 Inspection Guidelines
- Review the product, not the person!
- Find errors; don't try to solve them!
- Keep records: take written notes; review your earlier reviews.
- Allocate resources and schedule time for FTRs (formal technical reviews).
- Use 3 to 5 people.
- Conduct training for reviewers.
- Keep it short: limit debate and rebuttal.
- Set an agenda and keep to it.
- No more than two hours of preparation.
- Review small portions only: a narrow focus increases the likelihood of finding an error.
- Keep the meeting duration under two hours.
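The numeric limits in these guidelines (3 to 5 reviewers, at most two hours of preparation, meetings under two hours) can be encoded in a trivial pre-meeting check. The helper below is hypothetical and simply restates the figures from this slide.

```python
# Hypothetical helper encoding the limits from this slide; not a standard tool.
def inspection_plan_ok(reviewers: int, prep_hours: float, meeting_hours: float) -> list[str]:
    """Return a list of guideline violations (an empty list means the plan fits)."""
    problems = []
    if not 3 <= reviewers <= 5:
        problems.append("team size should be 3 to 5 people")
    if prep_hours > 2:
        problems.append("preparation should take no more than two hours")
    if meeting_hours >= 2:
        problems.append("meeting should last less than two hours")
    return problems


print(inspection_plan_ok(reviewers=6, prep_hours=1.5, meeting_hours=1.0))
# ['team size should be 3 to 5 people']
```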

9 Sample Design Inspection
1. Does the algorithm accomplish the desired function?
2. Is the algorithm logically correct?
3. Is the interface consistent with the architectural design?
4. Is the logical complexity reasonable?
5. Have error handling and "anti-bugging" been specified?
6. Are local data structures properly defined?
7. Are structured programming constructs used throughout?
8. Is the design detail amenable to the implementation language?
9. Which operating-system or language-dependent features are used?
10. Is compound or inverse logic used?
11. Has maintainability been considered?
(stolen from Pressman)
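Item 5, "anti-bugging," refers to defensive checks that an inspector would expect the design to specify. The short Python sketch below is a hypothetical illustration (the function name and validation rules are invented for this example) of the kind of input checking a design inspection would look for.

```python
# Hypothetical illustration of "anti-bugging" (defensive) checks an inspector
# would look for; the function and its rules are invented for this example.
def average_response_time(samples: list[float]) -> float:
    # Anti-bugging: validate inputs instead of assuming callers are well behaved.
    if not samples:
        raise ValueError("no samples provided")
    if any(s < 0 for s in samples):
        raise ValueError("negative response time in samples")
    return sum(samples) / len(samples)


print(average_response_time([12.0, 8.5, 10.1]))   # 10.2
```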

10 Effectiveness of Reviews
IBM (relative costs):
  during design     $1.5
  prior to coding   $1
  during coding     $1.5
  during test       $60
  in field use      $100
Reviews are 2 to 3 times as efficient as testing at finding defects.
Combined design and code reviews have a yield of 60% to 80%.
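To see what these multipliers imply, the hypothetical calculation below prices the same batch of defects at each discovery point, using the relative costs from this slide (the batch size of 100 is an arbitrary illustration, not IBM data).

```python
# Relative costs taken from the slide; the batch of 100 defects is illustrative.
relative_cost = {
    "during design": 1.5,
    "prior to coding": 1.0,
    "during coding": 1.5,
    "during test": 60.0,
    "in field use": 100.0,
}

defects = 100
for phase, cost in relative_cost.items():
    print(f"{defects} defects found {phase}: {defects * cost:,.0f} cost units")
# Finding the same defects in field use costs roughly 67x more than during design (100 / 1.5).
```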

11 Example
Without QA reviews:
- 50 KLOC with 2,500 defects to be found
- average 4 hours of work per defect
- 10,000 hours to remove defects
With QA reviews:
- 50 KLOC with 2,500 defects
- 70% (1,750) of the defects removed via reviews, at an average of 0.5 hours per defect: 875 hours
- 750 defects remaining, at an average of 8 hours each: 6,000 hours
- Total = 6,875 hours
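The arithmetic in this example is easy to reproduce. The sketch below simply recomputes the slide's own figures (2,500 defects in 50 KLOC; 70% caught in review at 0.5 hours each, the rest at 8 hours each, versus 4 hours each without reviews).

```python
# Recomputing the slide's example; all figures come from the slide itself.
total_defects = 2500                 # in 50 KLOC

# Without QA reviews: every defect found late, 4 hours each.
without_reviews = total_defects * 4                      # 10,000 hours

# With QA reviews: 70% caught in review at 0.5 h, the rest found in test at 8 h.
found_in_review = int(total_defects * 0.70)              # 1,750 defects
with_reviews = found_in_review * 0.5 + (total_defects - found_in_review) * 8

print(without_reviews)   # 10000
print(with_reviews)      # 6875.0
```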

12 Real Numbers (from previous lecture)
Cost of Software Quality for 15 Projects at Raytheon's Equipment Division
http://www2.umassd.edu/swpi/costmodeling/papers/scoqpap1.doc

13 Examples
- NASA's Software Review Guidelines
- Case: Software Review. What went wrong?

