CS 240, Prof. Sarwar Slide 1 CS 240: Software Project Fall 2003 Sections 1 & 2 Dr. Badrul M. Sarwar San Jose State University Lecture #18.

CS 240, Prof. Sarwar Slide 2 Agenda  Software Testing Principles  Integration Testing  Usability Testing  Practical Testing Aspects: Fault-Based Testing Approach  System Testing

CS 240, Prof. Sarwar Slide 3 Integration testing  Tests complete systems or subsystems composed of integrated components  Integration testing should be black-box testing with tests derived from the specification  Main difficulty is localising errors  Incremental integration testing reduces this problem

CS 240, Prof. Sarwar Slide 4 Incremental integration testing

CS 240, Prof. Sarwar Slide 5 Product with 13 Modules

CS 240, Prof. Sarwar Slide 6 Approaches to integration testing  Big bang testing  Top-down testing  Start with high-level system and integrate from the top-down replacing individual components by stubs where appropriate  Bottom-up testing  Integrate individual components in levels until the complete system is created  In practice, most integration involves a combination of these strategies known as  Sandwich testing

CS 240, Prof. Sarwar Slide 7 Big Bang Integration Testing  It assumes that all components are first tested individually and then tested together as a single system  No additional test stubs or drivers needed  It sounds simple, but it’s expensive  if a test uncovers a failure (which we expect to happen), it is difficult to pinpoint the specific component responsible for the failure  In addition, it is impossible to distinguish failures in the interface from failures within a component  The better approach  Incremental or phased (or snowballing) integration

CS 240, Prof. Sarwar Slide 8 Test Stubs and Test Drivers  To test module a, modules b, c, d must be stubs  Empty module, or  Prints message ("Procedure radarCalc called"), or  Returns precooked values from preplanned test cases  To test module h on its own requires a driver, which calls it  Once, or  Several times, or  Many times, each time checking the value returned  Testing module d requires a driver and two stubs
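The stub-and-driver idea above can be sketched in Python. This is a minimal illustration, not the course's actual code: the module names d and g come from the slide's 13-module example, and the detail that the callee is injected as a parameter (so a stub can stand in for it) is an assumption made for testability.

```python
def stub_g(reading):
    """Stub for module g: prints a message and returns a precooked value
    from a preplanned test case, as described on the slide."""
    print("stub g called")
    return 42.0                      # precooked result

def module_d(reading, compute=stub_g):
    """Hypothetical module under test; its callee is injected so that a
    stub can replace the real module g during module testing."""
    if reading < 0:
        raise ValueError("negative reading")
    return compute(reading) * 2

def driver():
    """Driver: calls the module under test several times, each time
    checking the value returned."""
    results = [module_d(r) for r in (0, 1, 100)]
    assert results == [84.0, 84.0, 84.0]
    return results
```

Once the real module g is implemented, it replaces stub_g and the same driver can be rerun, which is exactly how stubs are expanded during top-down integration.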

CS 240, Prof. Sarwar Slide 9 Problems with: Implementation, Then Integration Approach  Problem 1  Stubs and drivers must be written, then thrown away after module testing is complete  Problem 2  Lack of fault isolation  A fault could lie in any of 13 modules or 13 interfaces  In a large product with, say, 103 modules and 108 interfaces, there are 211 places where a fault might lie  Solution to both problems  Combine module and integration testing  “Implementation and integration phase”

CS 240, Prof. Sarwar Slide 10 Top-down Implementation and Integration  If module m1 calls module m2, then m1 is implemented and integrated before m2  One possible top-down ordering is a, b, c, d, e, f, g, h, i, j, k, l, m  Another possible top-down ordering is a; [a] b, e, h; [a] c, d, f, i; [a, d] g, j, k, l, m

CS 240, Prof. Sarwar Slide 11 Top-down Implementation and Integration: Advantages  Fault isolation  A previously successful test case fails when mNew is added to what has been tested so far  Stubs not wasted  Each stub is expanded into the corresponding complete module at the appropriate step  Major design flaws show up early  Logic modules include decision-making flow of control  In the example, modules a, b, c, d, g, j  Operational modules perform actual operations of module  In the example, modules e, f, h, i, k, l, m  Logic modules are developed before operational modules

CS 240, Prof. Sarwar Slide 12 Top-down Implementation and Integration: Disadvantages  Problem 1  Reusable modules are not properly tested  Lower level (operational) modules are not tested frequently  The situation is aggravated if the product is well designed  Defensive programming (fault shielding)  Example: if (x >= 0) y = computeSquareRoot (x, errorFlag);  Never tested with x < 0  Reuse implications
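The slide's computeSquareRoot example can be made concrete with a short Python sketch. This is an assumption-laden illustration: the error-flag mechanism (a list collecting error messages) is invented here to mirror the slide's errorFlag parameter. The point is that the defensively programmed caller never passes x < 0, so top-down testing through the caller can never exercise the callee's error branch; only a bottom-up driver reaches it.

```python
import math

def compute_square_root(x, error_flag):
    """Operational module (hypothetical sketch): records an error for
    invalid input instead of crashing."""
    if x < 0:
        error_flag.append("negative input")
        return 0.0
    return math.sqrt(x)

def defensive_caller(x):
    """Logic module with fault shielding: guards with x >= 0, so the
    callee's x < 0 branch is never tested through this caller."""
    errors = []
    if x >= 0:
        return compute_square_root(x, errors), errors
    return None, errors

def driver_negative_case():
    """Bottom-up driver: calls the operational module directly, reaching
    the branch that top-down testing leaves untested."""
    errors = []
    result = compute_square_root(-4, errors)
    return result, errors
```

This is why the slide flags reuse implications: a module that was only ever tested behind a defensive caller may misbehave when reused by a less careful one.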

CS 240, Prof. Sarwar Slide 13 Bottom-up Implementation and Integration  If module m1 calls module m2, then m2 is implemented and integrated before m1  One possible bottom-up ordering is l, m, h, i, j, k, e, f, g, b, c, d, a  Another possible bottom-up ordering is h, e, b; i, f, c, d; l, m, j, k, g [d]; a [b, c, d]

CS 240, Prof. Sarwar Slide 14 Bottom-up Implementation and Integration: Advantages  Advantage 1  Operational modules are thoroughly tested  Advantage 2  Operational modules are tested with drivers, not by fault shielding, defensively programmed calling modules  Advantage 3  Fault isolation

CS 240, Prof. Sarwar Slide 15 Bottom-up Implementation and Integration: Disadvantages  Difficulty 1  Major design faults are detected late  Solution  Combine top-down and bottom-up strategies making use of their strengths and minimizing their weaknesses

CS 240, Prof. Sarwar Slide 16 Sandwich Implementation and Integration  Logic modules are implemented and integrated top-down  Operational modules are implemented and integrated bottom-up  Finally, the interfaces between the two groups are tested

CS 240, Prof. Sarwar Slide 17 Sandwich Implementation and Integration (contd)  Advantage 1  Major design faults are caught early  Advantage 2  Operational modules are thoroughly tested  They may be reused with confidence  Advantage 3  There is fault isolation at all times

CS 240, Prof. Sarwar Slide 18 Top-down testing

CS 240, Prof. Sarwar Slide 19 Bottom-up testing

CS 240, Prof. Sarwar Slide 20 Testing approaches  Architectural validation  Top-down integration testing is better at discovering errors in the system architecture  System demonstration  Top-down integration testing allows a limited demonstration at an early stage in the development  Test implementation  Often easier with bottom-up integration testing  Test observation  Problems with both approaches. Extra code may be required to observe tests

CS 240, Prof. Sarwar Slide 21 Interface testing  Takes place when modules or sub-systems are integrated to create larger systems  Objectives are to detect faults due to interface errors or invalid assumptions about interfaces  Particularly important for object-oriented development, as objects are defined by their interfaces

CS 240, Prof. Sarwar Slide 22 Interface testing

CS 240, Prof. Sarwar Slide 23 Interface types  Parameter interfaces  Data passed from one procedure to another  Shared memory interfaces  Block of memory is shared between procedures  Procedural interfaces  Sub-system encapsulates a set of procedures to be called by other sub-systems  Message passing interfaces  Sub-systems request services from other sub-systems

CS 240, Prof. Sarwar Slide 24 Interface errors  Interface misuse  A calling component calls another component and makes an error in its use of its interface, e.g., parameters in the wrong order  Interface misunderstanding  A calling component embeds assumptions about the behaviour of the called component which are incorrect  Timing errors  The called and the calling component operate at different speeds and out-of-date information is accessed

CS 240, Prof. Sarwar Slide 25 Interface testing guidelines  Design tests so that parameters to a called procedure are at the extreme ends of their ranges  Always test pointer parameters with null pointers  Design tests which cause the component to fail  Use stress testing in message passing systems  In shared memory systems, vary the order in which components are activated
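These guidelines can be illustrated with a small Python sketch. Everything here is hypothetical: mean_of_window stands in for a component with a parameter interface, and None plays the role of a null pointer. The tests hit the extreme ends of the parameter ranges, pass a "null pointer", and deliberately try to make the component fail.

```python
def mean_of_window(samples, start, length):
    """Hypothetical component under test: average of
    samples[start:start+length]."""
    if samples is None:
        raise ValueError("samples must not be None")
    if length <= 0 or start < 0 or start + length > len(samples):
        raise IndexError("window out of range")
    return sum(samples[start:start + length]) / length

def run_interface_tests():
    """Guideline-driven tests: range extremes, a None ('null pointer')
    argument, and an input designed to cause failure."""
    outcomes = {}
    outcomes["min_window"] = mean_of_window([5.0], 0, 1)   # smallest legal window
    data = [float(i) for i in range(10)]
    outcomes["max_window"] = mean_of_window(data, 0, 10)   # largest legal window
    try:
        mean_of_window(None, 0, 1)                          # null-pointer test
    except ValueError:
        outcomes["none_rejected"] = True
    try:
        mean_of_window(data, 8, 5)                          # designed to fail
    except IndexError:
        outcomes["overrun_rejected"] = True
    return outcomes
```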

CS 240, Prof. Sarwar Slide 26 Usability (User Interface) Testing  GUIs work as a collection of controls that a human can stimulate via mouse or keyboard  Some controls are simple (e.g., buttons) but some are complex (e.g., list boxes), as they pass control as well as data  Testing data-passing controls is challenging  Another concern is the order in which GUI controls are used  As testers, our job is to understand the events and data combinations that can originate and ensure that all interesting cases get tested

CS 240, Prof. Sarwar Slide 27 Attacking GUI  We’ll look into four categories to attack a GUI  input  output  storing of data  performing computation  Strategies in testing GUI input  Apply inputs that force all error messages to occur  Apply inputs that force the software to establish default values  Explore allowable character sets and data types  typing the URL file://c:\AUX into Internet Explorer v5.5 breaks the program  Overflow input buffers  We have seen one such boundary value attack on MS Word
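The input attacks can be sketched in Python against a hypothetical input handler. parse_age and its messages are inventions for illustration; the attacks are the slide's: force each error message, force the default-value path, and overflow the input length limit.

```python
def parse_age(text, default=18):
    """Hypothetical GUI input handler under attack."""
    if text == "":
        return default                          # default-value path
    if not text.isdigit():
        raise ValueError("age must be numeric") # forced error message
    if len(text) > 3:
        raise ValueError("input too long")      # buffer-length guard
    return int(text)

def attack_input():
    """Apply the slide's input attacks and record what each provokes."""
    results = {"default": parse_age("")}        # force the default
    for bad in ("abc", "9" * 10):               # bad type; oversized input
        try:
            parse_age(bad)
        except ValueError as e:
            results[bad[:3]] = str(e)
    return results
```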

CS 240, Prof. Sarwar Slide 28 Attacking GUI  Strategies in testing GUI input and output  Repeat the same input or series of inputs numerous times  Microsoft Equation Editor fails this test  Force different outputs to be generated for each input  Force invalid outputs to be generated  Win NT (earlier versions) showed 2001 as a leap year  Force properties of an output to change  e.g., MS PowerPoint  Force the screen to refresh  MS PowerPoint fails this test

CS 240, Prof. Sarwar Slide 29 Attacking GUI  Strategies in testing GUI data and computation  Apply inputs using a variety of initial conditions  Force a data structure to store too many or too few values  Use alternative ways to modify internal data constraints  Force a function to call itself recursively  Force a computation result to be too large or too small  Strategies in testing the GUI file system  Fill the file system to its capacity  Force the media to be busy or unavailable  Damage the media  Assign an invalid file name  Vary or corrupt file contents  Change file access permissions, etc.

CS 240, Prof. Sarwar Slide 30 System Testing  Unit and integration testing focus on finding faults in individual components and the interfaces between the components  After integration, system testing is performed  There are several system testing activities  Functional testing  Nonfunctional (performance) testing  Pilot testing  Acceptance testing  Installation testing

CS 240, Prof. Sarwar Slide 31 Functional Testing  a.k.a. requirements testing; finds differences between the functional requirements and the system  System testing is black-box testing  Test cases are derived from the use case model  Although usability testing also uses the use case model, there is a subtle difference between them  Functional: finds differences between the use case model and the observed system behavior  Usability: finds differences between the use case model and the user’s expectation of the system

CS 240, Prof. Sarwar Slide 32 Performance Testing  Finds the differences between the design goals selected during the system design and the observed system  The following tests are performed during performance testing  Stress testing  Volume testing  Security testing  Timing tests  Recovery tests

CS 240, Prof. Sarwar Slide 33 Stress testing  Exercises the system beyond its maximum design load. Stressing the system often causes defects to come to light  Stress testing tests failure behaviour. Systems should not fail catastrophically; stress testing checks for unacceptable loss of service or data  Particularly relevant to distributed systems, which can exhibit severe degradation as a network becomes overloaded
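A stress test can be sketched in Python with a hypothetical BoundedQueue as the system under test (the class and its capacity are assumptions). The test drives the system well past its design load; acceptable degradation is graceful refusal of work, while data loss or a crash would be the unacceptable failure the slide warns about.

```python
class BoundedQueue:
    """Hypothetical system under stress: designed for at most `capacity`
    items, and expected to degrade gracefully beyond that."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []
        self.rejected = 0

    def offer(self, item):
        if len(self.items) >= self.capacity:
            self.rejected += 1   # graceful refusal, not a crash
            return False
        self.items.append(item)
        return True

def stress(queue, load):
    """Drive the system beyond its maximum design load and report how it
    degraded: items accepted vs. requests refused."""
    for i in range(load):
        queue.offer(i)
    return len(queue.items), queue.rejected
```

Here stressing a queue designed for 100 items with 250 requests should leave exactly 100 intact items and 150 clean refusals; losing any accepted item would signal a catastrophic failure mode.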

CS 240, Prof. Sarwar Slide 34 Pilot Testing  a.k.a. field test  The system is installed and used by a selected set of users  Useful when the system is built without a specific set of requirements  An alpha test is a pilot test with users exercising the system in the development environment  A beta test is an acceptance test done by a limited number of end users in the target environment

CS 240, Prof. Sarwar Slide 35 Acceptance Testing  There are three ways a client evaluates a system during acceptance testing  Benchmark test  by using a set of test cases under which the system should operate  Competitor test  if the product is replacing an earlier product then it is tested against it  Shadow test  the new and legacy systems run in parallel and their outputs are compared
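A shadow test boils down to running the new and legacy systems on the same inputs and comparing their outputs. The sketch below makes that concrete; both tax functions are hypothetical stand-ins for the legacy and replacement systems, not anything from the lecture.

```python
def legacy_tax(amount):
    """Stand-in for the legacy system (hypothetical)."""
    return round(amount * 0.20, 2)

def new_tax(amount):
    """Stand-in for the replacement system (hypothetical)."""
    return round(amount * 0.20, 2)

def shadow_test(inputs, legacy=legacy_tax, new=new_tax):
    """Run both systems in parallel on the same inputs and collect every
    case where their outputs diverge."""
    return [(x, legacy(x), new(x)) for x in inputs if legacy(x) != new(x)]
```

An empty mismatch list is the acceptance signal; any entry pinpoints an input on which the new system disagrees with the system it is replacing.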

CS 240, Prof. Sarwar Slide 36 Object-oriented testing  The components to be tested are object classes that are instantiated as objects  Larger grain than individual functions, so approaches to white-box testing have to be extended  No obvious ‘top’ to the system for top-down integration and testing

CS 240, Prof. Sarwar Slide 37 Testing levels  Testing operations associated with objects  Testing object classes  Testing clusters of cooperating objects  Testing the complete OO system

CS 240, Prof. Sarwar Slide 38 Object class testing  Complete test coverage of a class involves  Testing all operations associated with an object  Setting and interrogating all object attributes  Exercising the object in all possible states  Inheritance makes it more difficult to design object class tests as the information to be tested is not localised

CS 240, Prof. Sarwar Slide 39 Weather station object interface  Test cases are needed for all operations  Use a state model to identify state transitions for testing  Examples of testing sequences  Shutdown → Waiting → Shutdown  Waiting → Calibrating → Testing → Transmitting → Waiting  Waiting → Collecting → Waiting → Summarising → Transmitting → Waiting
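State-model testing of this kind can be checked mechanically. The Python sketch below encodes a transition set inferred from the example sequences on the slide (the set itself is an assumption; the weather station's full state model may allow more transitions) and validates that a proposed test sequence only uses legal transitions.

```python
# Transitions assumed from the slide's example sequences (hypothetical
# reconstruction of the weather station state model).
TRANSITIONS = {
    ("Shutdown", "Waiting"), ("Waiting", "Shutdown"),
    ("Waiting", "Calibrating"), ("Calibrating", "Testing"),
    ("Testing", "Transmitting"), ("Transmitting", "Waiting"),
    ("Waiting", "Collecting"), ("Collecting", "Waiting"),
    ("Waiting", "Summarising"), ("Summarising", "Transmitting"),
}

def is_legal(sequence):
    """Check that every consecutive pair of states in a test sequence is
    a legal transition of the state model."""
    return all((a, b) in TRANSITIONS for a, b in zip(sequence, sequence[1:]))
```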

CS 240, Prof. Sarwar Slide 40 Object integration  Levels of integration are less distinct in object-oriented systems  Cluster testing is concerned with integrating and testing clusters of cooperating objects  Identify clusters using knowledge of the operation of objects and the system features that are implemented by these clusters

CS 240, Prof. Sarwar Slide 41 Approaches to cluster testing  Use-case or scenario testing  Testing is based on user interactions with the system  Has the advantage that it tests system features as experienced by users  Thread testing  Tests the system’s response to events as processing threads through the system  Object interaction testing  Tests sequences of object interactions that stop when an object operation does not call on services from another object

CS 240, Prof. Sarwar Slide 42 Scenario-based testing  Identify scenarios from use-cases and supplement these with interaction diagrams that show the objects involved in the scenario  Consider the scenario in the weather station system where a report is generated

CS 240, Prof. Sarwar Slide 43 Collect weather data

CS 240, Prof. Sarwar Slide 44 Weather station testing  Thread of methods executed  CommsController:request → WeatherStation:report → WeatherData:summarise  Inputs and outputs  Input of report request with associated acknowledgement and a final output of a report  Can be tested by creating raw data and ensuring that it is summarised properly  Use the same raw data to test the WeatherData object
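The "create raw data and ensure it is summarised properly" step can be sketched in Python with a minimal stand-in for the WeatherData object. The collect method and the max/min/avg summary fields are assumptions about its interface made for illustration, not the interface from the Sommerville example.

```python
class WeatherData:
    """Hypothetical stand-in for the WeatherData object in the scenario."""
    def __init__(self):
        self.readings = []

    def collect(self, temperature):
        self.readings.append(temperature)

    def summarise(self):
        r = self.readings
        return {"max": max(r), "min": min(r), "avg": sum(r) / len(r)}

def thread_test():
    """Mirror the request → report → summarise thread: feed known raw
    data in and check the summary against hand-computed values."""
    wd = WeatherData()
    for t in (10.0, 20.0, 30.0):
        wd.collect(t)
    return wd.summarise()
```

Because the raw data is chosen by the tester, the expected summary is known in advance, which is what makes this thread directly checkable.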

CS 240, Prof. Sarwar Slide 45 Testing workbenches  Testing is an expensive process phase. Testing workbenches provide a range of tools to reduce the time required and total testing costs  Most testing workbenches are open systems because testing needs are organisation-specific  Difficult to integrate with closed design and analysis workbenches

CS 240, Prof. Sarwar Slide 46 A testing workbench

CS 240, Prof. Sarwar Slide 47 Testing workbench adaptation  Scripts may be developed for user interface simulators and patterns for test data generators  Test outputs may have to be prepared manually for comparison  Special-purpose file comparators may be developed