CS 350, slide set 11 M. Overstreet Old Dominion University Fall 2004

Reading
- TSP text, ch. 9 and 10
- Remember, you are supposed to have read the chapter on your role, and ch. 16, 17, and 18

Deadlines, Guidelines - 1
- Project: due Wednesday, Dec. 8
- Each group builds one project directory that contains a subdirectory for each module: one for each implemented module, maybe others (e.g., support)
  - All source, Makefiles, test data, test logs, testing results
  - Unit testing data in each module directory
  - Integration testing data in the GCS directory
  - All required TSPi forms
- I must be able to determine who did what: the name (or names) of whoever did it must be included with each item
- I must be able to determine what works: include actual output whenever appropriate
- A sketch of one possible layout appears below
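One way to organize the submission; the directory and file names here are illustrative, not required by the assignment:

    project/
        GCS/                 # top-level module; integration test data goes here
            Makefile
            gcs_main.cpp
            test/            # integration test scripts, inputs, logs, results
        module1/             # one subdirectory per implemented module
            Makefile
            module1.cpp
            test/            # unit test data, logs, and actual output
        support/             # optional: shared utilities
        forms/               # completed TSPi forms
        CHECKLIST.txt        # submission checklist (see "Forms - 3" below)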

Deadlines, forms - 2
- Final forms due by midnight, Thursday, Dec. 9
- Peer evaluation forms submitted by midnight, Friday, Dec. 10

Forms checklist (see app. F):

    Config. Change Req.       omit
    Config. Status Rep.       omit
    Student Info. Sheet       omit
    Inspection (several)      required
    Issue Tracking Rep.       required
    Defect Rec. Log           required
    Time Rec. Log (LOGT)      required
    Test Log                  required
    Test reports              required

Forms checklist - 2

    Team and Peer Eval        required (submitted separately)
    Process Imp. Prop.        required
    Sched. Plan. Template     required
    Def. Inj. Summary         required
    Def. Removed Sum.         required
    Plan Summary (SUMP)       required
    Quality Plan (SUMQ)       required
    Size Summary (SUMS)       required
    Time Summary              required

Forms - 3
- Additional form: submission checklist
  - List of everything submitted
  - Location of everything submitted
  - Identification of who completed each item submitted (including forms)

Project evaluation - 1
I must be able to determine:
- Who did what
  - Put names in documents (code, forms, test reports, etc.); if no name, no credit
- What process steps were completed
  - Will rely on forms
- For each module, what reviews and inspections were done
  - Will rely on forms
- What code actually works
  - Will rely on files produced during testing, so make sure these are included with modules
- How much testing was performed
  - Will rely on test reports

Project evaluation - 2
Individual project grades are determined as follows:
- Your peer evaluations: 15%
- My impression of your contributions: 15%
- Forms: 35%
- Evidence of execution: 30%
  - Note that without execution, some forms must be incomplete
- Quality/completeness of materials: 10%
As I go through your group's submission, I will record what you've done; your grade will be based on that list.

Project code suggestions
- Most errors come from misunderstandings of requirements
- These types of errors should be identified in inspections

Testing: selected case studies
Remember: it is hard to get this kind of data.
- Magellan spacecraft to Venus
  - 22 KLOC, which is small
  - 186 defects found in system test, 42 of them critical
  - Only 1 critical defect found in the 1st year of testing
  - Project a success, but several software-related emergencies
- Galileo spacecraft (1989 launch, arrival 1995)
  - Testing took 6 years
  - Final 10 critical defects found after 288 weeks of testing

PSP/TSP approach
- Find most defects before integration testing, during:
  - Reviews (requirements, HLD, DLD, test plans, code)
  - Inspections
  - Unit testing
- Each of these activities is expensive, but testing is worse
- TSP goal: use testing to confirm that the code is high quality; low-quality code may need to be returned for rework or scrapped
- Data show a strong relationship between defects found in testing and defects found by customers
- A small worked example of this defect-filtering idea follows
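As a rough, illustrative calculation (the numbers are hypothetical, not from the text):

    defects injected:                  100
    removed by reviews/inspections:     60
    removed by unit testing:            25
    remaining at integration test:     100 - 60 - 25 = 15
    pre-integration yield:             (60 + 25) / 100 = 85%

The fewer defects that survive into integration and system test, the more testing can serve its TSP role of confirming quality rather than finding bugs.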

Build and integration strategies: big bang
- Build and test all pieces separately, then put them all together at the end and see what happens
- Out of favor:
  - Debugging all the pieces at the same time makes it harder to identify the real causes of problems
  - Industry experience: 10 defects/KLOC, so an all-too-typical 3 MLOC system enters integration carrying some 30,000 defects

B & I strategies: one subsystem at a time
- Design the system so that it can be implemented in steps, with each step useful
- First test the minimal system, after its components have been unit tested
- Then add one component at a time
  - Defects are more likely to come from the new parts
- Not all systems lend themselves to this approach

B & I strategies: add clusters
- If a system has components with dependencies among them, it may be necessary to add clusters of interacting components rather than single components

B & I strategies: top down
- Top-down integration: integrate top-level components first, with lower-level components stubbed as necessary
- May identify integration issues earlier than other approaches
- I suggest this approach for GCS:
  - Write the top-level driver first; it calls stubbed modules
  - As modules become available, they replace the stubbed versions (see the sketch below)
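A minimal sketch of the driver-plus-stubs idea in C++; the module and function names are made up for illustration and are not the actual GCS interfaces:

    #include <iostream>
    #include <string>

    // Interfaces the driver depends on. Each real module later
    // supplies an implementation with the same signature.
    std::string fetchInput();                    // stub: input module
    std::string process(const std::string& s);   // stub: processing module
    void report(const std::string& s);           // stub: reporting module

    // Top-level driver: written first, so integration issues in the
    // calling structure surface before the real modules exist.
    int main() {
        std::string raw = fetchInput();
        std::string result = process(raw);
        report(result);
        return 0;
    }

    // Stub implementations: just enough behavior to let the driver run.
    // Replace each with the real module as it passes unit test.
    std::string fetchInput() {
        return "stub input";
    }

    std::string process(const std::string& s) {
        std::cout << "[stub] process called with: " << s << "\n";
        return s;  // identity transform for now
    }

    void report(const std::string& s) {
        std::cout << "[stub] report: " << s << "\n";
    }

Because the driver compiles and runs from day one, each replaced stub can be exercised through the real calling structure as soon as it is available.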

Typical testing goals
- Show the system provides all specified functions
  - Does what it is supposed to do
- Show the system meets stated quality goals
  - MTBF, for example (see below)
- Show the system works under stressful conditions
  - Doesn't do "bad" things when other systems fail (e.g., power), the network overloads, or the disk fills
- In reality, schedule/budget considerations may limit testing to only the most frequent or critical behaviors
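For reference, MTBF (mean time between failures) is the standard reliability measure:

    \text{MTBF} = \frac{\text{total operating time}}{\text{number of observed failures}}

For example (illustrative numbers only): if a system runs for 1,000 hours during test and fails 4 times, its observed MTBF is 1000 / 4 = 250 hours.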

Test log includes:
- Date, start and end time of tests
- Name of tester
- Which tests were run
- What code and configuration was tested
- Number of defects found
- Test results
- Other pertinent information
  - Special tools, system configuration, operator actions
- See the sample test log, pg. 172; an illustrative entry follows
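An entry in that spirit (a made-up example, not the book's format from pg. 172; names and tool versions are hypothetical):

    Date: 12/01/2004        Tester: J. Smith
    Start: 14:05            End: 14:50
    Tests run: module1 unit tests 1-5
    Configuration: module1 v1.2, gcc 3.3, Linux lab machine
    Defects found: 2 (recorded in the defect log)
    Results: tests 1-3 pass; test 4 fails (wrong rounding); test 5 pass
    Notes: test 4 requires the operator to supply an empty input file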

Documentation - 1
- Probably needs another course
- Must write from the perspective of the user of the documentation:
  - Other programmers on the team
  - Future maintenance programmers
  - Installers
  - Managers
  - Users
- Better to hire English majors to write documentation? It may be easier to teach them the computing part than to teach technical geeks how to write well.

Documentation - 2
- Developers often do a poor job
  - Even when proofreading, omissions (what you forgot to tell the reader) often go undetected, since the writer already knows the missing information
  - A student just finished an MS thesis on software metrics of open-source code and did not explain what KDSI (thousand delivered source instructions) meant until the end of the thesis. I missed it too!
- Guidelines: include
  - A glossary defining special terms
  - A detailed table of contents
  - A detailed index
  - Sections on error messages, recovery procedures, and troubleshooting procedures

Testing script
- Covered in the text

Postmortem
Why?
- We're still learning how to do this
- Organization goal: learn from this project to improve the next one
  - We shouldn't keep making the same mistakes
- Individual goal: make you a more valuable employee
  - Update your personal checklists, etc.

Postmortem script
- We'll skip it; it works better after 3 cycles
- We will discuss it in class; be ready to tell me:
  - Where the process worked and where it did not
  - How actual performance compared with what was expected
  - Where your team did well, and where it did not

PIP objectives
- While the project is fresh, record good ideas on process improvements
- Implicit goal: be skeptical about TSP as the solution to all software problems
  - Each organization and problem domain probably has its own unique problems; one size does not fit all
- But a request: be tolerant. Learn from others' experience; don't reject it too quickly

Peer evaluations
- Use the form from the text; it includes your impression of
  - Who had the hardest role (percentages sum to 100)
  - Who did the most work (percentages sum to 100)
- You must use team member names and roles (unlike the form)
- Written text:
  - Your evaluation of TSP
  - Your evaluation of how well you fulfilled your role
  - What did you do that worked well? What did you do that did not work well?