
Software Testing
Dr. Samy Abu Nasser, Associate Professor, Faculty of Engineering & Information Technology


1 Software Testing
Dr. Samy Abu Nasser, Associate Professor, Faculty of Engineering & Information Technology

2 Structured Testing

3 Lecture Outline
- The unbearable hardness of testing
- A definition of testing
- Structured testing
- Quality requirement factors
- Black-box testing
- Functional hierarchies

4 Testing Will Not Be Easy
- Non-trivial software has bugs
- We cannot know in advance what will cause failure
- Passing a test does not show the absence of bugs
  - f(x) may be correct for x = 213
  - f(x) may not be correct for x ≠ 213
  - f(x) may not be correct for x = 213 next time!
- Large input/state space
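The f(x) bullets above can be made concrete with a tiny sketch. The function below is hypothetical: a buggy absolute-value routine that happens to pass a single-point test, showing why one passing input proves nothing about the rest of the domain.

```python
# Hypothetical illustration: abs_value() is buggy, yet a test at one
# point (x = 213) passes. The bug is only exposed by other inputs.
def abs_value(x):
    # Bug: forgets to negate for negative inputs.
    return x

assert abs_value(213) == 213   # the test passes...
result = abs_value(-5)         # ...but -5 comes back as -5, not 5
```

A test suite that only ever probes x = 213 would ship this bug; the large input space is exactly why coverage matters.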

5 Testing Stays Hard
- Tests help us find current bugs
- Future bugs will require new tests

6 Testing as Risk Management
- How do we tame this complexity? Ask: what are the risks?
- Testing is about covering the risks
- Risks can be
  - Dependent on implementation (coding errors, etc.)
  - Independent of implementation (specification errors, etc.)

7 A Definition of Testing
Software testing is a formal process (carried out by a specialized testing team) in which a software unit, several integrated software units, or an entire software package is examined by running the programs on a computer. All the associated tests are performed according to approved test procedures on approved test cases.

8 Methodical Testing
- The definition of testing implies testing should be methodical
- Since testing is a statement of confidence, it must be convincing and therefore documented
- Document:
  - Test process(es)
  - Test design
  - Test techniques
- The documentation must be adhered to

9 Testing Objectives
- Direct objectives
  - Identify and reveal as many errors as possible in the tested software
  - Bring the tested software, after correction of the identified errors and retesting, to an acceptable level of quality
  - Perform the required tests efficiently and effectively, within budgetary and scheduling limitations
- Indirect objectives
  - Compile a record of software errors for use in error prevention (by corrective and preventive actions)

10 Testing Strategies
- Big Bang testing
  - Test the software in its entirety
- Incremental testing
  - Test the software as it is built
  - Unit
  - Integration
  - System
  - May be bottom-up or top-down
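A minimal sketch of the unit level of incremental testing, assuming a hypothetical `discount()` function: the unit is exercised in isolation, before any integration with the rest of the system.

```python
# Hypothetical unit under test: price reduction by a fractional rate.
def discount(price, rate):
    """Return price reduced by rate (a fraction between 0.0 and 1.0)."""
    if not 0.0 <= rate <= 1.0:
        raise ValueError("rate must be between 0 and 1")
    return price * (1.0 - rate)

# Unit tests: the function is verified on its own, as it is built.
assert discount(100.0, 0.25) == 75.0
assert discount(100.0, 0.0) == 100.0
```

Integration and system tests would then combine such verified units, rather than testing everything at once Big Bang style.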

11 Structured Testing
- Testing can appear overwhelming initially
- Develop tests systematically
- Develop tests incrementally, according to the systematic process
- Testing will still be labor-intensive: don't cut corners

12 Testing as Development
- Testing and development cannot be separated
- Testing shows the behavior of software
  - Therefore we don't know how software will behave until we test it
- Develop tests from the specification (model)
  - The specification defines correct operation
  - Tests should check for correct operation
  - Specification ↔ Tests ↔ Code

13 Quality Requirement Factors
- Operation
  - Correctness
  - Reliability
  - Efficiency
  - Integrity
  - Usability
- Revision
  - Maintainability, Flexibility, Testability
- Transition
  - Portability, Reusability, Interoperability

14 Test Classifications (Operation)
- Correctness
  - Accuracy and completeness of outputs and data (output correctness tests)
  - Accuracy and completeness of documentation (documentation tests)
  - Availability
  - Correctness of data processing and calculations
  - Coding and documentation standards

15 Test Classifications (Operation)
- Reliability: reliability tests
- Efficiency: stress/load tests
- Integrity: security tests
- Usability
  - Training usability tests
  - Operational usability tests

16 Test Classifications
- Revision
  - Maintainability
  - Flexibility
  - Testability
- Transition
  - Portability
  - Reusability
  - Interoperability
    - With other software
    - With other equipment

17 Black Box Testing (IEEE)
- Testing that ignores the internal mechanism of a system or component and focuses solely on the outputs generated in response to selected inputs and execution conditions
- Testing conducted to evaluate the compliance of a system or component with specified functional requirements
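The definition above can be illustrated with a sketch. `sort_unique` is a hypothetical function: the tests know only its specified input/output behaviour, never its internal mechanism.

```python
# The tester treats this implementation as an opaque box; only the
# specification ("return the distinct items in ascending order") is known.
def sort_unique(items):
    return sorted(set(items))

# Black-box tests: selected inputs, expected outputs, nothing else.
assert sort_unique([3, 1, 3, 2]) == [1, 2, 3]
assert sort_unique([]) == []
```

If the implementation were rewritten entirely, these tests would remain valid, which is exactly what makes them black-box.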

18 Use of Black Box Testing
- Cannot look at a particular implementation
- Testable sub-factors:
  - Output correctness, documentation, availability
  - Reliability
  - Stress/load
  - Security
  - Training, operational (usability)
  - Maintainability
  - Testability
  - Portability
  - Interoperability: software, equipment

19 Black Box Testing Requirements
- What is the system supposed to do?
  - How do we know this?
  - How is it described?
- Need a model of the system
- Need a model of the tests
- Need a working component or (sub)system

20 Results of Black Box Testing
- Types of errors
  - Missing/incorrect capability
  - Inadequate performance, deadlock, etc.
  - Incorrect output
  - Abnormal termination
- Causes of errors
  - Elicitation errors
  - Specification errors
  - Programming errors (via incorrect results)
  - Configuration/integration errors

21 Types of Black Box Testing
- Domain-based testing
- Stress/load testing
- Specification-driven testing
- Risk-based testing
- Function testing
- Regression testing
- Scenario/use-case testing
- User-based testing
- Exploratory testing

22 Domain-Based Testing
- Divide tests into equivalence classes
- Test boundaries/representatives from each class
- Pros
  - Easy
  - Generalizes
  - Can produce results from a small test set
- Cons
  - Domains may be difficult to determine
  - Faults not on boundaries may be missed
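A minimal domain-based sketch, using a hypothetical `grade()` function: the input domain splits into three equivalence classes (invalid, fail, pass), and the tests probe a representative plus both boundaries of each class.

```python
# Hypothetical function: scores 0-100 are valid; 50 and above pass.
def grade(score):
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    return "pass" if score >= 50 else "fail"

# One representative per class plus every boundary value.
for score, expected in [(0, "fail"), (49, "fail"), (50, "pass"), (100, "pass")]:
    assert grade(score) == expected

# The invalid class: values just outside each boundary must be rejected.
for score in (-1, 101):
    try:
        grade(score)
        raise AssertionError("expected ValueError")
    except ValueError:
        pass
```

Six test points cover the whole integer domain, which is the "results from a small test set" advantage; a fault at, say, score 73 that is unrelated to any boundary would still slip through.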

23 Stress/Load Testing
- Push the system to its limits and beyond
  - Large data/connection volumes
  - Long transactions
  - Degradation: partial cluster failure
- Pros
  - Exposes 'field' problems
- Cons
  - What about faults unrelated to stress?
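A small stress sketch, using a hypothetical bounded queue: drive it up to its limit and one step beyond, and check that it degrades cleanly (an error is raised, state stays consistent) rather than corrupting data.

```python
# Hypothetical component with a hard capacity limit.
class BoundedQueue:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []

    def put(self, x):
        if len(self.items) >= self.capacity:
            raise OverflowError("queue full")
        self.items.append(x)

q = BoundedQueue(capacity=1000)
for i in range(1000):
    q.put(i)                      # loading up to the limit succeeds

try:
    q.put(1000)                   # one past the limit must fail cleanly
    raise AssertionError("expected OverflowError")
except OverflowError:
    pass
assert len(q.items) == 1000       # the overload left state intact
```

This only probes stress behaviour; a logic fault inside `put` that appears at low load would need other test types, which is the "cons" point above.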

24 Specification-Driven Testing
- Develop assertions from requirements and/or specifications
- Map assertions to test cases
- Pros
  - Defense against not meeting the specification
- Cons
  - Will not find faults due to missing or incorrect requirements
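A sketch of mapping assertions to test cases, under an assumed specification for a hypothetical `order_total()` function: "the total of an order equals the sum of its line amounts; an empty order totals zero; a total is never negative."

```python
# Hypothetical function derived from the assumed specification above.
def order_total(lines):
    """lines is a list of (quantity, unit_price) pairs."""
    return sum(qty * price for qty, price in lines)

# Each assertion in the specification becomes a test case.
lines = [(2, 3.0), (1, 4.0)]
assert order_total(lines) == 10.0   # total = sum of line amounts
assert order_total([]) == 0         # empty order totals zero
assert order_total(lines) >= 0      # total is never negative
```

If the real requirement had been "apply tax to the total" and the specification omitted it, every test here would still pass, which is the weakness noted above.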

25 Risk-Based Testing
- Schedule testing according to perceived risks
- Pros
  - Optimal division of testing effort
- Cons
  - Risks may not be initially clear
  - A risk may not be identified
  - Wrong assignment of priority may leave a major risk untested

26 Function Testing
- Test each function individually and thoroughly
- Requires functional analysis
- Pros
  - Thorough
- Cons
  - Misses functional interaction
  - Doesn't assess the system as a whole

27 Regression Testing
- Manages the risk of a 'fix' not fixing a bug, or introducing side effects
- Automated regression test suites
- Pros
  - Maintain consistency of behavior
- Cons
  - Doesn't handle new functionality
  - Maintenance ranges from annoying to costly
  - May not cover things well
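A regression sketch, assuming a hypothetical `paginate()` function that once had an off-by-one bug: the input that exposed the bug is kept in the automated suite forever, so any 'fix' that reintroduces the fault fails immediately.

```python
# Hypothetical function after the fix; the original buggy version used
# plain integer division (n_items // per_page) and dropped partial pages.
def paginate(n_items, per_page):
    """Return the number of pages needed to show n_items."""
    return (n_items + per_page - 1) // per_page

# Regression suite: the bug-exposing input stays permanently.
assert paginate(10, 5) == 2
assert paginate(11, 5) == 3    # this input originally returned 2
assert paginate(0, 5) == 0
```

Running such a suite automatically on every change is what maintains consistency of behavior; it still says nothing about functionality added after the suite was written.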

28 Scenario/Use-Case Testing
- Based on real usage scenarios
  - Use cases
  - Transactions
- Pros
  - Realistic (and likely complex)
  - Exposes failures over time
  - Useful as part of user-acceptance testing
- Cons
  - An individual function failure can ruin whole tests
  - Coverage may be an issue
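A scenario sketch against a hypothetical `Cart` API: rather than testing each method in isolation, the test chains several operations the way a real user would in one transaction.

```python
# Hypothetical shopping-cart component.
class Cart:
    def __init__(self):
        self.items = {}

    def add(self, name, price):
        self.items[name] = price

    def remove(self, name):
        del self.items[name]

    def checkout(self):
        return sum(self.items.values())

# Use case: add two items, change your mind about one, then pay.
cart = Cart()
cart.add("pen", 2.0)
cart.add("book", 10.0)
cart.remove("pen")
total = cart.checkout()
assert total == 10.0
```

Note the downside listed above: if `remove` were broken, the final checkout assertion would fail too, and the whole scenario gives one blurred failure instead of a precise one.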

29 User-Based Testing
- Let real users use the software
- Find faults in the interface
  - Internal use sessions with external testers
  - Beta testing
- Pros
  - Exposes design/usability issues
  - May be recorded and/or focused
- Cons
  - Coverage may be patchy
  - Test cases may be poor
  - May not be free

30 Exploratory Testing
- Undocumented fiddling with the software
- Learn-as-you-go testing
- Pros
  - Little preparation needed
- Cons
  - May miss a lot due to ignorance

31 Structured Testing
- Testing is systematic and structured
- Design tests
  - From the specification: functional, performance, security
  - From a model: risks model
- Criteria for correct operation

32 Specifications
- End-users must be included in test design
  - Helps mitigate missing/incomplete documentation and requirements
- The specification may be developed incrementally
- Test design may involve development of the specification

33 Test Objectives
- Directed against a specific component or set of components
- High-level test objectives
  - Decomposed into more detailed objectives
- Test objectives and cases must be documented
  - Audit trail from specification to tests
  - Document templates ensure consistency

34 Functional Analysis
- Model of system functionality
- Develop lists of functions
- Analyze functionality to determine tests

35 Example: FontForge
- An outline font editor which supports
  - Creation of character outlines
  - Development of typographical tables for OpenType and AAT (Apple Advanced Typography)
  - Saving fonts in various formats
- Problems
  - Documentation/tutorial is patchy (inconsistent)
  - The output of this program is used in many different environments

36 Functions
- Identify functions which are part of the system
  - Group functions as a hierarchy
  - The leaves of this tree are individual functions
- The functional hierarchy
  - May mirror menus in the application
  - May mirror documentation
  - May be a mixture

37 Function Hierarchy
Develop a hierarchy of function groups and functions

38 Function Hierarchy

39 Deeper Functionality

40 Documenting Functions
- Document each identified function
  - What the function does
  - How it is supposed to work

41 Add New Substitution Variant
- Accessed from the Glyph Info dialog
- Allows the user to add a new substitution variant for a glyph
- Once entered, the data becomes part of the translation tables in the font

42 Documenting Functions
- The level of detail depends on the end-users of the documentation
  - 'Accessed from the Glyph Info dialog' may or may not be enough information
- Descriptions may be provided up-front or on an as-needed basis during the design of tests

43 Data Flow
- Inputs
  - Target components
  - Substitution tag
  - Script/language tag
  - Flags
- Outputs
  - Entry in the substitution table for the glyph

44 Data Flow Model

45 Analysis
- Analyze each function to define test criteria
- Analyze each function's inputs and outputs to determine required tests
- Analyze the functions'
  - Criteria
  - Inputs
  - Outputs
  - Internal conditions
  - Internal state

46 Criteria
Define post-conditions to determine whether the function is performing correctly
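A sketch of a post-condition check for the "add new substitution variant" function. `add_substitution()` is a hypothetical stand-in for the real FontForge behaviour; the point is that after the operation, criterion GLYPH-SUBS-CR-001 (the new variant exists in the table) must hold.

```python
# Hypothetical stand-in: record a substitution variant for a glyph.
def add_substitution(table, glyph, variant):
    table.setdefault(glyph, []).append(variant)

table = {}
add_substitution(table, "a", "a.small")

# Post-condition (criterion GLYPH-SUBS-CR-001): the variant is present.
assert "a.small" in table["a"]
```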

47 Criteria
- Tools are just that: tools, not solutions
  - XML
  - UML
  - Java
- They should not be used for the sake of it

48 Criteria
# | Criterion                | Id
1 | New substitution variant | GLYPH-SUBS-CR-001

49 Outputs
- Identify output which must be produced by the system to support the function
- Identify how values will be determined to be correct

50 Outputs
# | Output           | Access             | Correctness | Id
1 | Target component | Substitution table | As entered  | GLYPH-SUBS-OU-001
2 | Type tag         | Substitution table | As entered  | GLYPH-SUBS-OU-002
3 | Script           | Substitution table | As entered  | GLYPH-SUBS-OU-003
4 | Flags            | Substitution table | As entered  | GLYPH-SUBS-OU-004

51 Inputs
- Identify input required to generate the outputs
- Identify how invalid inputs are treated

52 Inputs
# | Input                  | Requirements                                                                                | Id
1 | Target component glyph | Target glyph of the substitution; the glyph is entered into the appropriate substitution table | GLYPH-SUBS-CR-001, GLYPH-SUBS-OU-001
2 | Type tag               | Substitution table                                                                          | GLYPH-SUBS-CR-001, GLYPH-SUBS-OU-002
3 | Script                 | Script for which the substitution is to be used                                             | GLYPH-SUBS-CR-001, GLYPH-SUBS-OU-003
4 | Flags                  | Modifier flags                                                                              | GLYPH-SUBS-CR-001, GLYPH-SUBS-OU-004

53 Invalid Inputs
# | Input | Treatment                                                                     | Id
1 | Tag   | If the tag type does not exist, disallow confirmation but do not abort        | GLYPH-SUBS-II-001
2 | Tag   | If the tag type is invalid for the glyph, disallow confirmation but do not abort | GLYPH-SUBS-II-002
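The invalid-input treatments can be sketched as a test. Everything here is hypothetical stand-in code (the tag set and `add_substitution()` are assumptions, not FontForge's API): a bad tag must block the operation (GLYPH-SUBS-II-001/002) without aborting or leaving a partial entry.

```python
# Assumed set of known tag types for the sketch.
VALID_TAGS = {"salt", "smcp", "liga"}

# Hypothetical stand-in for the dialog's confirm action.
def add_substitution(table, glyph, tag):
    if tag not in VALID_TAGS:
        return False              # disallow confirmation, do not abort
    table.setdefault(glyph, []).append(tag)
    return True

table = {}
assert add_substitution(table, "a", "bogus") is False  # GLYPH-SUBS-II-001
assert table == {}                # rejection left no partial entry
assert add_substitution(table, "a", "smcp") is True    # valid tag accepted
```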

54 Internal Conditions
- Identify internal conditions under which outputs may be produced
- Identify consequences when required conditions are not satisfied

55 Internal Conditions
# | Condition                     | Effect                            | Id
1 | Target component glyph exists | Entry added to substitution table | GLYPH-SUBS-SC-001

56 Unsatisfied Internal Conditions
# | Condition                             | Treatment                                    | Id
1 | Target component glyph does not exist | Addition of substitution cannot be completed | GLYPH-SUBS-UC-001

57 Internal State
- Identify how the internal state changes after receiving input
- Identify how the internal state can be observed to check for a correct change of state
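A minimal internal-state sketch, using a hypothetical stateful `Counter` unit: after each input, the observable state must match the specified transition.

```python
# Hypothetical stateful component: value is the observable internal state.
class Counter:
    def __init__(self):
        self.value = 0

    def bump(self):
        self.value += 1

c = Counter()
assert c.value == 0     # observe initial state
c.bump()                # input
assert c.value == 1     # state changed exactly as specified
```

Real components may need a dedicated accessor or debug hook to make their state observable; designing one in is part of testability.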


