1 SYSC 4101 – Software Validation, Verification and Testing
Part II – Software Testing Overview

2 Definitions (Verification vs. Validation)
Software Verification:
– The goal is to find as many latent defects as possible before delivery
– Checks whether the system adheres to specified properties, termed verification properties
– "Constructing the system well"
Software Validation:
– The goal is to gain confidence in the software by showing that it meets its specifications and its users' needs
– Relates to other software engineering activities (e.g., requirements elicitation, analysis)
– "Constructing the right system"

3 Definitions (V&V Techniques)
Static techniques, i.e., without any execution of the system:
– Inspections: techniques aimed at systematically verifying non-executable software artifacts, with the intent of finding as many defects as possible, as early as possible
– Mathematical proof: proving the program correct against its formal specification
– Model checking: verifying properties of the system using models (e.g., finite state machines, Petri nets)
Dynamic techniques, i.e., through the execution of the system:
– Symbolic execution: the inputs supplied to the system are symbolic
– Testing (also called verification testing): the inputs supplied to the system are concrete values → the most widely used V&V technique

4 Software Bugs …
Date-handling bug:
– A 104-year-old woman received an invitation to a kindergarten (1992)
Interface misuse:
– An underground train in London left a station without the driver (1990)
Over-budget project:
– Failure of an automated luggage system at an airport (1995)
NASA mission to Mars:
– An incorrect conversion from imperial to metric units led to the loss of a Mars satellite (1999)
Ariane 5 Flight 501:
– The space rocket was destroyed (1996)
Therac-25:
– A radiation therapy and X-ray machine killed several patients (1985–1987)

5 Ariane 5 – Root Cause (Source: ARIANE 5 Flight 501 Failure, Report by the Inquiry Board)
A program segment for converting a floating-point number to a signed 16-bit integer was executed with an input value outside the range representable by a signed 16-bit integer. This run-time error (operand out of range, overflow) arose in both the active and the backup computers at about the same time; it was detected, and both computers shut themselves down. This resulted in the total loss of attitude control: the Ariane 5 turned uncontrollably, and aerodynamic forces broke the vehicle apart. The breakup was detected by an on-board monitor, which ignited the explosive charges to destroy the vehicle in the air. Ironically, the result of this format conversion was no longer needed after lift-off.
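The failure mode above can be illustrated with a minimal sketch (in Python, not the actual Ada flight code): an unchecked conversion silently accepts values outside the 16-bit range, while a guarded version detects the out-of-range condition. The function names are illustrative, not from the inquiry report.

```python
# Illustrative sketch of the Flight 501 failure mode: converting a float
# to a signed 16-bit integer with and without a range check.

INT16_MIN, INT16_MAX = -32768, 32767

def to_int16_unchecked(x: float) -> int:
    # Mimics a raw conversion: out-of-range values slip through silently
    # (in the real system, this raised an unhandled hardware exception).
    return int(x)

def to_int16_checked(x: float) -> int:
    # Defensive version: reject out-of-range inputs explicitly.
    if not (INT16_MIN <= x <= INT16_MAX):
        raise OverflowError(f"{x} does not fit in a signed 16-bit integer")
    return int(x)
```

With an in-range input, both behave the same; with a value like 40000.0 (outside the representable range), only the checked version signals the problem instead of propagating a bad value.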

6 Software Bugs – Cost: "Impact of Inadequate Software Testing on US Economy"
Who?
– The National Institute of Standards and Technology (NIST), a US federal agency
What?
– Studies in the manufacturing and transportation equipment sectors, assessing the cost to the U.S. economy of an inadequate software testing infrastructure
Results (annual cost):
– Estimate for the studied sectors: $5.85 billion
– Projection to the entire U.S. economy: $59.5 billion
http://www.nist.gov/director/prog-ofc/report02-3.pdf

7 Dealing with Software Faults
Fault Handling:
– Fault Avoidance: design methodology, configuration management, inspections
– Fault Detection: testing (component testing, integration testing, system testing) and debugging (correctness debugging, performance debugging)
– Fault Tolerance: atomic transactions, modular redundancy

8 Goals of Testing
"Program testing can be used to show the presence of bugs, but never to show their absence" (Dijkstra, 1972)
– No absolute certainty can be gained from testing
– Testing should be integrated with other verification activities, e.g., inspections
– Main goal: demonstrate that the software can be depended upon, i.e., that it offers sufficient dependability

9 Remarks
– No matter how rigorous we are, software is going to be faulty
– Testing represents a substantial percentage of software development costs and time to market
– It is impossible to test under all operating conditions: from incomplete testing, we must gain confidence that the system has the desired behavior
– Testing large systems is complex: it requires strategy and technology, and is often done inefficiently in practice

10 Qualities of Testing
– Effective at uncovering failures
– Helps locate faults for debugging
– Repeatable, so that a precise understanding of the fault can be gained
– Automated and low-cost (so repeating tests is easy)

11 Basic Testing Definitions
Error:
– People commit errors
Fault:
– A fault is the result of an error in the software documentation, code, etc.
Failure:
– A failure occurs when a fault executes
Incident:
– The consequences of a failure; a failure's occurrence may or may not be apparent to the user
Testing:
– Exercising the software with test cases to find faults or gain confidence in the system
Test case:
– A set of inputs and a list of expected outputs (the expected outputs are sometimes left out)
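The definition of a test case above can be made concrete with a small sketch: pairs of inputs and expected outputs, checked against a trivial function under test (`abs_value` is a hypothetical example, not from the slides).

```python
# A minimal, hypothetical test case set: each entry pairs an input with
# its expected output; the expected output acts as the oracle.

def abs_value(x: int) -> int:
    """Trivial function under test."""
    return -x if x < 0 else x

# (input, expected output) pairs — including a boundary value (0).
test_cases = [(5, 5), (-5, 5), (0, 0)]

def run_tests():
    """Return the list of failing cases: (input, expected, actual)."""
    return [(x, exp, abs_value(x)) for x, exp in test_cases
            if abs_value(x) != exp]
```

An empty result from `run_tests()` means no failure was observed, which, per Dijkstra's remark, shows the absence of these particular failures, not the absence of bugs.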

12 Test Stubs and Drivers
Test stub:
– A partial implementation of a component on which the tested component depends
Test driver:
– A partial implementation of a component that depends on the tested component
Test stubs and drivers enable components to be isolated from the rest of the system for testing
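A minimal sketch of both roles, under assumed names (`BillingService`, `TaxRateStub` are hypothetical): the stub stands in for a dependency of the component under test, and the driver exercises the component from above.

```python
class TaxRateStub:
    """Test stub: partial implementation of a dependency
    (here, an imagined tax-rate lookup service)."""
    def rate_for(self, region: str) -> float:
        return 0.10  # canned answer instead of a real lookup

class BillingService:
    """Component under test: depends on a tax-rate provider."""
    def __init__(self, rates):
        self.rates = rates
    def total(self, amount: float, region: str) -> float:
        return round(amount * (1 + self.rates.rate_for(region)), 2)

def driver():
    """Test driver: creates the inputs, executes the tested
    component (wired to the stub), and returns the output."""
    service = BillingService(TaxRateStub())
    return service.total(100.0, "CA")
```

The stub isolates `BillingService` from the real rate service, so a failure in `driver()` points at the component under test rather than at its dependencies.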

13 Summary of Definitions (1)
The Driver creates Inputs and executes the Program Under Test, which uses Stub(s) and produces Outputs. The Oracle examines the Outputs and delivers a Verdict (correct/incorrect).

14 Summary of Definitions (2)
– A Test suite is composed of Test cases (1…n)
– A Test case exercises a Component (via a Test driver and Test stubs) and finds Failures
– A Failure is caused by a Fault; a Fault is caused by an Error
– A Fault is repaired by a Correction; the Component is revised by the Correction

15 Exhaustive Testing
Exhaustive testing, i.e., testing a software system using all possible inputs, is most of the time impossible.
Examples:
– A program that computes the factorial function (n! = n·(n−1)·(n−2)·…·1): exhaustive testing means running the program with 0, 1, 2, …, 100, … as input!
– A compiler (e.g., javac): exhaustive testing means running the (Java) compiler with every possible (Java) program (i.e., source code)
Technique used to reduce the number of inputs (i.e., test cases):
– Testing criteria group input elements into (equivalence) classes
– One input is selected from each class (the notion of test data coverage)
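The equivalence-class idea can be sketched for the factorial example: instead of every integer, pick one representative per class. The particular class boundaries below are an illustrative choice, not prescribed by the slides.

```python
import math

def factorial(n: int) -> int:
    """Function under test; rejects invalid (negative) inputs."""
    if n < 0:
        raise ValueError("factorial is undefined for negative integers")
    return math.factorial(n)

# One representative input per (assumed) equivalence class.
partitions = {
    "negative (invalid input)": -3,
    "zero (boundary value)": 0,
    "small positive": 5,
    "large positive": 100,
}
```

Four test inputs, one per class, replace an unbounded input space; a criterion is satisfied (the partition is "covered") once each class has contributed a test case.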

16 Test Data Coverage
A software representation (model), with associated criteria, drives the selection of test data: test cases must cover all the … in the model.
– Representation of the specification → black-box testing
– Representation of the implementation → white-box testing

17 Black-box vs. White-box Testing
Black-box testing:
+ Checks conformance with the specification
+ Scales up (different techniques apply at different granularity levels)
− Depends on the specification and its degree of detail
− Gives no indication of how much of the system is being tested
− What if the system performs some unexpected, undesirable task?
White-box testing:
+ Based on control-flow and data-flow coverage criteria
+ Lets you be confident about how much of the system is being tested
− Does not scale up (mostly applicable at the unit and integration testing levels)
− Cannot reveal missing functionality (parts of the specification that are not implemented)
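A tiny sketch of the white-box viewpoint: with the code visible, a control-flow criterion such as branch coverage tells you exactly which inputs are still needed, something a black-box tester cannot see. The function is a hypothetical example.

```python
def classify(x: int) -> str:
    """Function under test with two branches."""
    if x >= 0:
        return "non-negative"
    else:
        return "negative"

# Branch coverage requires at least one input per branch:
# x = 3 takes the 'if' branch, x = -3 takes the 'else' branch.
branch_covering_inputs = [3, -3]
```

A black-box test suite derived only from the specification might happen to exercise just one branch; the white-box criterion makes the gap measurable.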

18 Black-box vs. White-box Testing (Specification vs. Implementation)
– Missing functionality (in the specification but not in the implementation): cannot be revealed by white-box techniques
– Unexpected functionality (in the implementation but not in the specification): cannot be revealed by black-box techniques

19 Test Organization
There are many different potential causes of failure, so testing large systems involves several stages:
– Module, component, or unit testing
– Integration testing
– Function testing
– Performance testing
– Acceptance testing
– Installation testing
– System testing

20 Testing Stages (Pfleeger, 1998)
– Unit test (input: component code) → tested components
– Integration test (input: design description) → integrated modules
– Function test (input: system functional specification) → functioning system
– Performance test (input: other software specifications) → verified, validated software
– Acceptance test (input: customer requirements) → accepted system
– Installation test (input: user environment) → system in use

21 Differences among Testing Activities (Pezze and Young, 1998)
Unit testing:
– Derived from module specifications
– Visibility of code details
– Complex scaffolding
– Tests the behavior of single modules
Integration testing:
– Derived from interface specifications
– Visibility of the integration structure
– Some scaffolding
– Tests interactions among modules
System testing:
– Derived from requirements specifications
– No visibility of code
– No drivers/stubs
– Tests system functionalities

22 Integration Testing
Integrating well-tested components may still lead to failures due to:
– Bad use of interfaces (bad interface specifications or implementations)
– Wrong hypotheses about the behavior/state of related modules (bad functional specifications or implementations), e.g., a wrong assumption about a return value
– Use of overly simple drivers/stubs: a module may behave correctly with (simple) drivers/stubs, but fail when integrated with the actual (complex) modules
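The "wrong assumption about a return value" case can be sketched as follows: each component works in isolation, but the caller assumes a different return convention than the callee actually provides. All names here are hypothetical.

```python
def find_user(user_id: int):
    """Callee's actual contract: returns None when the user is missing."""
    users = {1: "alice", 2: "bob"}
    return users.get(user_id)  # None on a miss, not an exception

def greet(user_id: int) -> str:
    """Caller with a wrong hypothesis: assumes a name is always returned.
    Fails at integration time with TypeError when find_user yields None."""
    name = find_user(user_id)
    return "Hello, " + name

def greet_fixed(user_id: int) -> str:
    """Corrected caller: handles the callee's actual contract."""
    name = find_user(user_id)
    return "Hello, " + name if name is not None else "Hello, guest"
```

Unit tests of `find_user` and of `greet` (with a stub that always returns a name) would both pass; only integrating the real components exposes the mismatch.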

23 System vs. Acceptance Testing
System testing:
– The software is compared with the requirements specifications (verification)
– Usually performed by the developer, who knows the system
Acceptance testing:
– The software is compared with the end-user requirements (validation)
– Usually performed by the customer (buyer), who knows the environment where the system is to be used
– Sometimes split into α-testing and β-testing for general-purpose products

24 Testing through the Lifecycle
– Many of the development artifacts produced across the life cycle provide a rich source of test data
– Identifying test requirements and test cases early helps shorten the development time
– They may help reveal faults
– They may also help identify low-testability specifications or designs early

25 Life Cycle Mapping
– Requirements => acceptance testing
– Analysis => system testing
– Design => integration testing
– Class statecharts, method pre- and post-conditions, structure => class testing

26 Testing Activities
1. Establish the test objectives
2. Design the test cases
3. Write the test cases
4. Test the test cases
5. Execute the tests
6. Evaluate the test results
7. Change the system
8. Do regression testing

27 Testing Activities BEFORE Coding
– Testing is a time-consuming activity
– Devising a test strategy and identifying the test requirements represent a substantial part of it
– Planning is essential
– Testing activities come under huge pressure because testing is run towards the end of the project
– To shorten time-to-market and ensure a certain level of quality, many QA-related activities (including testing) must take place early in the development life cycle

28 Testing Takes Creativity
Testing is often viewed as dirty work. To develop an effective test, one must have:
– A detailed understanding of the system
– Knowledge of the testing techniques
– The skill to apply these techniques in an effective and efficient manner
Testing is done best by independent testers:
– Programmers often stick to the data set that makes the program work
– A program often does not work when tried by somebody else

