CS 350, slide set 9
M. Overstreet
Old Dominion University
Spring 2006

Term Project Requirements - 1
Overview:
Loop M times:
- Generate test data: N numeric values
- Store the N values in a file
- Run P different programs; each reads the test data and stores its output in a file
Read the output files to compare the output of the P programs
Report and count differences
Compute a simple estimate of the reliability of each program

Requirements – 2: Simplifying assumptions
- All programs to be tested read from standard input.
- Test programs have a fixed set of names: pgm01, pgm02, pgm03, etc. Not desirable, but it simplifies coding.
- All input is numeric.
- All programs read their input until EOF.
- All output is numeric, but you don't know how much!
- These programs may be buggy, but they are all supposed to do the same thing.

Requirements - 3
Your program reads 3 values from the command line:
- M, the outer loop parameter: the number of test cases
- N, the number of data values per test case
- P, the number of programs; assume P is at least 2

Requirements – 4: Program Outline
Main:
Loop M times:
- Generate 1 test case with N numbers
- Run each of the P programs; each program reads the test case and produces an output file
- Compare all P output files and report differences
Generate reliability estimates for each of the P programs
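A minimal C++ sketch of this outline (not part of the assignment; names such as generate_test_data, run_programs, compare_outputs, and test_input are hypothetical, and the reliability formula shown is only one plausible estimate):

    #include <cstdlib>
    #include <iostream>
    #include <string>

    // Hypothetical helpers; their details are sketched on later slides.
    void generate_test_data(int n, const std::string& file) { /* write n random values to file */ }
    void run_programs(int p, const std::string& input)      { /* run pgm01..pgmP via system() */ }
    void compare_outputs(int p, int errors[])                { /* read output files, tally disagreements */ }

    int main(int argc, char* argv[]) {
        if (argc != 4) { std::cerr << "usage: tester M N P" << std::endl; return 1; }
        int m = std::atoi(argv[1]);   // number of test cases
        int n = std::atoi(argv[2]);   // data values per test case
        int p = std::atoi(argv[3]);   // number of programs under test (at least 2)
        int errors[100] = {0};        // error count per program; assumes P <= 100

        for (int i = 0; i < m; ++i) {
            generate_test_data(n, "test_input");   // one test case
            run_programs(p, "test_input");         // run pgm01..pgmP on it
            compare_outputs(p, errors);            // compare outputs, count differences
        }
        // One plausible reliability estimate: fraction of test cases with no error charged.
        for (int j = 0; j < p; ++j)
            std::cout << "pgm" << j + 1 << " estimated reliability: "
                      << 1.0 - static_cast<double>(errors[j]) / m << std::endl;
        return 0;
    }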

Requirements – 5: Random Values
- Use an existing random number generator; do "man random" on a UNIX box.
- You should generate values between 0 and 10.
- For debugging, you want to be able to generate exactly the same set of random numbers on different runs: set the seed explicitly to a constant.
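A sketch of the test-data generator, assuming the POSIX random()/srandom() interface from "man random"; the seed constant and file name are arbitrary choices, not requirements:

    #include <cstdlib>
    #include <fstream>
    #include <string>

    void generate_test_data(int n, const std::string& file) {
        static bool seeded = false;
        if (!seeded) {              // fixed seed => identical data on every run (good for debugging)
            srandom(12345);
            seeded = true;
        }
        std::ofstream out(file.c_str());
        for (int i = 0; i < n; ++i) {
            double v = 10.0 * random() / 2147483647.0;   // random() returns 0..2^31-1; scale to [0, 10]
            out << v << "\n";
        }
    }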

Requirements – 6: Computing Reliability - 1
- We will compute only an estimate of reliability.
- If one program produces an output that disagrees with all other programs and they agree, count this as an error for that single program.
- If more than one program produces a different value, assume all programs are in error (see the example on the next slide).
- Two programs is a special case. You can count it either way, but be sure you can handle it.

Requirements – 7: Reliability - 2
If we're testing five programs (p1 through p5), compare their outputs value by value:
- val 1: all five outputs agree: no error detected
- val 2: only p4 disagrees: p4 error
- val 3: more than one program disagrees: all programs counted in error
- val 4: only p3 disagrees: p3 error
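A small C++ sketch of this charging rule (a hypothetical helper; it uses exact equality for brevity, while the real comparison tolerance is defined on a later slide, and it handles the two-program special case by charging the first program):

    #include <vector>

    // Returns the indices of the programs charged with an error for one set of outputs.
    std::vector<int> programs_in_error(const std::vector<double>& out) {
        int n = static_cast<int>(out.size());

        // Case 1: all programs agree -> no error.
        bool all_agree = true;
        for (int i = 1; i < n; ++i)
            if (out[i] != out[0]) { all_agree = false; break; }
        if (all_agree) return std::vector<int>();

        // Case 2: exactly one dissenter while the rest agree -> charge only that program.
        for (int i = 0; i < n; ++i) {
            int first_other = (i == 0) ? 1 : 0;
            bool others_agree = true;
            for (int j = 0; j < n; ++j)
                if (j != i && out[j] != out[first_other]) { others_agree = false; break; }
            if (others_agree && out[i] != out[first_other])
                return std::vector<int>(1, i);
        }

        // Case 3: more than one program differs -> charge every program.
        std::vector<int> all;
        for (int i = 0; i < n; ++i) all.push_back(i);
        return all;
    }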

Requirements – 8: Comparing Floats
- For coding simplicity, assume all data to be compared are floats. (It would be better to treat integer values differently from float or double values.)
- Assume two floats are equal if they differ by less than 0.01%.
- Use the pgm01 output as the basis for the percentage comparison.
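A sketch of this comparison, assuming the 0.01% tolerance is taken relative to the pgm01 value (the zero-base guard is an added assumption):

    #include <cmath>

    bool nearly_equal(double base, double other) {
        if (base == 0.0) return other == 0.0;   // assumption: fall back to exact test when the base is zero
        return std::fabs(other - base) < std::fabs(base) * 0.0001;   // 0.01% of the pgm01 value
    }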

Hints: use the system OS call
- Do "man system" on a UNIX box for documentation.
- Example: to run program abc from a C++ program and have it read from file Input (as standard input) and write to file Output (as standard output), use:
  system("./abc < Input > Output");
- system is an OS call that asks the operating system to fork a new shell and run its argument in that shell as if you had typed that command.
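A sketch of running one numbered test program this way, assuming the fixed names pgm01, pgm02, ... and hypothetical file names test_input and out_01, out_02, ...:

    #include <cstdio>
    #include <cstdlib>

    void run_one_program(int which) {            // which = 1 .. P
        char cmd[128];
        std::snprintf(cmd, sizeof(cmd),
                      "./pgm%02d < test_input > out_%02d", which, which);
        int status = std::system(cmd);           // fork a shell and run the command
        if (status != 0) {
            // A non-zero status may mean the program crashed and produced no output.
            std::fprintf(stderr, "pgm%02d exited with status %d\n", which, status);
        }
    }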

Use of files
- The data generator module generates one file.
- That file is read repeatedly, once by each program being tested.
- Testing generates several output files, one per test program.
- The comparison module:
  - Reads one value from each output file (loop using an array of files)
  - Compares the values, reports and counts differences
  - Loops until end-of-file
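A sketch of the comparison module's read loop, assuming the hypothetical output file names out_01, out_02, ... For brevity it uses exact comparison against pgm01's value and stops at the first end-of-file; the real program should apply the 0.01% rule and the full charging rule from the reliability slides:

    #include <cstdio>
    #include <fstream>
    #include <vector>

    void compare_outputs(int p, int errors[]) {
        std::vector<std::ifstream*> files(p);
        for (int i = 0; i < p; ++i) {
            char name[32];
            std::snprintf(name, sizeof(name), "out_%02d", i + 1);
            files[i] = new std::ifstream(name);
        }
        std::vector<double> vals(p);
        while (true) {
            bool got_all = true;
            for (int i = 0; i < p; ++i)
                if (!(*files[i] >> vals[i])) got_all = false;   // EOF (or bad data) on this file
            if (!got_all) break;                                // simplest behavior: stop at the first end-of-file
            for (int i = 1; i < p; ++i)
                if (vals[i] != vals[0]) ++errors[i];            // tally a difference against pgm01
        }
        for (int i = 0; i < p; ++i) delete files[i];
    }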

High-level Design: Modules
- Main (outlined above)
- Test data generator
- Loop module: runs each of the programs
- Comparison module
- Report module
Each of these is small, easily designed, easily reviewed and inspected. For some you have a base or a reuse source.

Intent
- Design, coding, and testing should be easy, including reviews and group inspections.
- Unit testing is straightforward; integration testing is straightforward.
- So focus on:
  - Group cooperation and mutual aid
  - Inspecting together before running
  - Estimates, scheduling, data collection, forms

What's hard to code? - 1
1. What happens if one version fails and produces no output?
2. What happens if one version loops?
3. What happens if one version incorrectly produces nonnumeric output?
Answer: ignore problems 2 and 3; handle 1.

What's hard to code? - 2
- Use of the UNIX /tmp directory for random data files
  - Advantage: cleaned out by the OS
  - You're not required to do this, but it's the right way.
- If you do this, you must generate a unique file name. Normally, the process id of the generating program is part of the file name.
- Good news: you don't need to do this.
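If you choose this optional route, a sketch of building a unique /tmp name from the process id (the cs350_test_ prefix is a hypothetical choice):

    #include <cstdio>
    #include <string>
    #include <unistd.h>    // getpid()

    std::string unique_tmp_name() {
        char name[64];
        std::snprintf(name, sizeof(name), "/tmp/cs350_test_%d", static_cast<int>(getpid()));
        return std::string(name);   // e.g. "/tmp/cs350_test_12345"
    }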

What's hard to code? - 3
- Unexpected end-of-file: what if the different output files have different numbers of values? Then you hit end-of-file on some but not all.
- Suggestion: get a version running that handles the case where all files have the same number of values and make sure that works. Then add functionality, but don't break anything.

Immediate steps! - 1
Due by next Wed.:
- Team and individual goals
- High-level design: what are the modules? Who codes what?
- Size estimates for each module (sum for the project size estimate)
- Time estimates
- Group discussed and approved

Immediate steps! - 2
Schedule, group discussed & approved: what's to be completed when?
Main mileposts:
- Date each module designed & approved
- Date each module will be coded
- Date each module will be tested
- Date all code integrated
- Date test plans approved by group
- Date testing complete
- Date reporting complete
What's omitted (& not required):
- Code review dates & forms
- Code inspection dates & forms

Testing hints - 1
- Unit testing is required, so you must provide a driver and test data for each unit. Are any stubs required?
- Separate compilation is required.
- Different team members code and test different units.
- Will give examples in class & on the web.
- To test your program, you will need several programs for it to run. Consider using existing programs. If you write them, make them as simple as possible, as long as you can use them to test your code thoroughly.

Testing hints - 2
Consider programs, each consisting of one line:
- pgm1: cout << "1 2 3" << endl;
- pgm2: cout << "1 2 3" << endl;
- pgm3: cout << "1 2 4" << endl;
- pgm4: cout << "1 2" << endl;
etc., as useful for testing.
How many requirements can you test using programs like these?
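One complete dummy test program in this style (a hypothetical pgm03.cpp; compile it to an executable named pgm03 so your tester can run it):

    #include <iostream>
    using namespace std;

    int main() {
        cout << "1 2 4" << endl;   // deliberately differs from pgm01/pgm02 in the last value
        return 0;
    }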

Simplified Task Planning Template (p. 82)
Rows: include only
- Plan-tasks and schedule
- System test plan
- Only one line per module (but see next slide)
- Postmortem
- Totals
Columns: include only
- Task Name
- Planned Value
- Plan hours by person
- Cumulative PV
- Total team hours
- Actual hours
- Cumulative hours
- Size
- Actual Completion Date
- Planned Completion Date

Task Plan/Report per Module
Columns: Task | Size | Who | Plan hrs | PV | Act. hrs | Plan com. date | Act. com. date | EV
Rows (one per task, filled in as you go):
- Design
- Design rev.
- Code
- Code rev.
- Compile
- Code insp.
- Test data
- Test driver
- Test rev.
- Test insp.
- Testing

Another omission
Given the limited time:
- Log only where errors are found; omit noting where errors are inserted.
- You must report errors found for each task.
- Some tasks may be unlikely to discover errors, but include them in your error report as well.

Comments
- Make sure you know what data you will need to report so it will take less time to provide it! Look over the forms NOW!
- You set your own schedules, but we will be checking on what you submit & prompting for things!
- The project is about TSPi (simplified), not coding. Evaluation is based on determining whether you followed TSPi:
  - Did you make reasonable plans (time, sizes, required tasks)?
  - Were you able to handle the inevitable surprises?
  - Did you set goals, design, inspect, test, record errors, log time, etc.?
  - Did you measure your performance against your goals?
  - Did you support your team?

Reading
- TSP text: omit Ch. 6; it is a central part of CS 410.
- TSP text: Ch. 7, 8, 9, 10, App. B (config. mgmt) will be discussed in class.
- TSP text: read Ch. 16 (Managing yourself), Ch. 17 (Being on a team), & Ch. 18 (Teamwork).

What's due when?
April 30:
- Will provide a checklist on the web site.
- Submit all completed components (code, forms, etc.) to userid cs350.
- The grader will check for them there.

Topics
- Designing in teams
- Design script
- Implementation
- Implementation script

Comments on team design
- Design is not easily done by a large group. Some believe all good designs are done either by one individual or at most by a very small team.
- Design can start with a brainstorming session: all team members have input and contribute ideas. Then 1 or 2 people create the design.

Design completion details
Goal:
- The inspection team can determine the correctness of the design.
- Programmers can produce the intended implementation directly from the design and their background knowledge. Sometimes reference materials and domain knowledge may also be needed.

Design standards
- Naming conventions
- Interface formats
- System and error messages
- Design representation notations

Levels of design
Common levels:
- System
- Subsystem
- Product
- Component
- Module
For the term project, you only need:
- System (mostly done)
- Module (for each selected module)

Design goals
- Reuse: depends in part on standard documentation
- Usability: need to review the design from the perspective of all people who will use the software
- Testability: may lead you to choose one design over another

Design script - 1
Step 1 - Design process review: Review the design process, review a sample design, and review how a design inspection is performed.
Step 2 - High-level design: The development manager leads the design team to define the program structure, name the program components, and identify the design tasks to be completed and documented.
Step 3 - Design standards: The quality/process manager leads the effort to produce the name glossary and design standards.
Step 4 - Design tasks: The development manager leads the team through outlining the SDS and the work to produce it.
Step 5 - Task allocation: The team leader helps allocate tasks among team members and obtains commitments from members.

Design script - 2
Step 6 - Design specification: Each team member produces and reviews his/her portions of the SDS document and provides these to the development manager; the development manager produces a composite SDS draft.
Step 7 - Integration test plan: The development manager leads the team in producing and reviewing the integration test plan.
Step 8 - Design & integration plan inspection: The quality/process manager leads the team through inspecting the SDS draft and integration test plan, checking that the design is complete and correct and that the integration test plan is adequate. Each problem is recorded and fix responsibility decided. The inspection is documented on form INS, with defects recorded in LOGD.

Design script - 3
Step 9 - Design update: The development manager obtains the SDS sections, combines them into the final SDS, and verifies traceability to the SRS.
Step 10 - Update baseline: The support manager baselines the SDS.

Standards
- Design standards: let's assume pseudocode is good enough; use another notation if you prefer.
- Coding standards: what we have is good enough.
- Size standards: max size of a single component?
- Defect standards: what we have is good enough.
- Standards need to be reviewed and changed if they don't work, but not this semester.

IMP script - 1
Step 1 - Implementation process overview: The instructor has described the importance of quality implementation, the need for coding standards, and the strategy for handling poor-quality components.
Step 2 - Implementation planning: The development manager leads the work to define and plan the implementation tasks (SUMP & SUMQ).
Step 3 - Task allocation: The team leader helps allocate tasks to team members and obtains commitments for when they will complete them.
Step 4 - Detailed design: Programmers produce detailed designs, do individual thorough design reviews, and complete the LOGD and LOGT forms.
Step 5 - Unit test plan: The tester produces the unit test plans.

IMP script - 2
Step 6 - Test development: Testers follow script UT to produce unit test plans, test procedures, and test data for each component.
Step 7 - Test inspection: The quality/process manager leads the team in an inspection of the test cases, procedures, and data for each component (use script INS and forms INS and LOGD).
Step 8 - Detailed design inspection: The quality/process manager leads the team in an inspection of the DLD of each component (script INS and forms INS and LOGD).
Step 9 - Code: Programmers produce component source code, do a code review using a personalized checklist, compile and fix the code until it compiles, and complete the LOGD and LOGT forms.
Step 10 - Code inspection: The quality/process manager leads the team in a code inspection of each component (script INS and forms INS and LOGD).

IMP script - 3
Step 11 - Unit test: Programmers, following script UT, conduct the unit tests and complete forms LOGD and LOGT.
Step 12 - Component quality review: The quality/process manager reviews each component's data to determine if the component quality meets the established team criteria. If so, the component is accepted for integration testing. If not, the quality/process manager recommends either that the product be reworked and reinspected, or that it be scrapped and redeveloped.
Step 13 - Component release: When components are satisfactorily implemented and inspected, developers release them to the support manager, who enters them in the configuration management system.

Some goals/guidelines
- Design time > coding time
- Design review time > 50% of design time
- Code review time > 50% of coding time, preferably > 75%
- You should find twice as many defects in code review as in compile
- You should find > 3 defects per review hour
- Review rate < 200 LOC/hr.