Tool support for testing (CAST)
Software Testing: ISTQB / ISEB Foundation Exam Practice, Chapter 6
(1 Principles, 2 Lifecycle, 3 Static testing, 4 Dynamic test techniques, 5 Management, 6 Tools)

Contents
- Types of CAST tool
- Why capture/replay is not test automation
- Automating and testing are separate skills
- Best practice

Testing tool classification
- Requirements testing tools
- Static analysis tools
- Test design tools
- Test data preparation tools
- Test running tools (character-based, GUI)
- Comparison tools
- Test harnesses and drivers
- Performance test tools
- Dynamic analysis tools
- Debugging tools
- Test management tools
- Coverage measurement tools

Where tools fit
[Diagram: the life-cycle phases (requirements analysis, functional design, code, component test, integration test, system test, acceptance test) with the tool types from the previous slide mapped onto the phases they support; test management tools span the whole life cycle.]

Requirements testing tools
- Automated support for verification and validation of requirements models
  - consistency checking
  - animation
- Tool information available from:
  - Ovum Evaluates Software Testing Tools (subscription service)
  - CAST Report, 1999
  - World Wide Web
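To make "consistency checking" concrete, here is a minimal sketch (in Python, purely for illustration) of the kind of check such a tool automates. The record structure, field names and rules are assumptions, not any particular tool's data model: flag duplicate requirement IDs and trace links that point at requirements which do not exist.

```python
# Sketch of an automated requirements consistency check (hypothetical
# data model): flag duplicate IDs and trace links to unknown requirements.
def check_consistency(requirements):
    problems = []
    seen = set()
    for r in requirements:
        if r["id"] in seen:
            problems.append(f"duplicate id: {r['id']}")
        seen.add(r["id"])
    for r in requirements:
        for link in r.get("traces_to", []):
            if link not in seen:
                problems.append(f"{r['id']} traces to unknown {link}")
    return problems

reqs = [
    {"id": "R1", "traces_to": []},
    {"id": "R2", "traces_to": ["R1", "R9"]},  # R9 does not exist
    {"id": "R2", "traces_to": []},            # duplicate id
]
print(check_consistency(reqs))
```

A real requirements tool applies many more rules (and animates the model), but the principle is the same: mechanical checks over a structured requirements repository.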

Static analysis tools
- Provide information about the quality of software
- Code is examined, not executed
- Objective measures
  - cyclomatic complexity
  - others: nesting levels, size
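As a rough illustration of how a static analysis tool derives an objective measure without running the code, the sketch below (Python, for illustration only) approximates cyclomatic complexity by parsing a function and counting decision points. Real analysers count more node types and handle whole programs; this is just the idea.

```python
import ast

# Rough cyclomatic-complexity estimate: 1 plus the number of decision
# points found by walking the parsed syntax tree (a common
# approximation; production tools count more constructs).
def cyclomatic_complexity(source):
    tree = ast.parse(source)
    decisions = sum(
        isinstance(node, (ast.If, ast.For, ast.While, ast.BoolOp))
        for node in ast.walk(tree)
    )
    return 1 + decisions

code = """
def classify(n):
    if n < 0:
        return "negative"
    for d in range(2, n):
        if n % d == 0:
            return "composite"
    return "other"
"""
print(cyclomatic_complexity(code))  # two ifs + one loop -> 4
```

Note the code under analysis is never executed: only its structure is examined, which is exactly the point of the slide.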

Test design tools
- Generate test inputs
  - from a formal specification or CASE repository
  - from code (e.g. code not yet covered)

Test data preparation tools
- Data manipulation
  - selected from existing databases or files
  - created according to some rules
  - edited from other sources
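The "created according to some rules" case can be sketched as follows (Python, illustrative only; the record layout and field names are invented for the example). A fixed random seed makes the generated data reproducible, which matters when tests must be repeatable:

```python
import csv, io, random

# Sketch: create test data according to simple rules (hypothetical
# customer records), the kind of job a data preparation tool automates.
def make_customers(n, seed=0):
    rng = random.Random(seed)          # fixed seed -> reproducible data
    rows = []
    for i in range(1, n + 1):
        rows.append({
            "id": i,
            "name": f"customer_{i:03d}",
            "credit_limit": rng.choice([0, 500, 1000, 5000]),
        })
    return rows

# Write the generated rows out as CSV, ready to load into the test bed
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "name", "credit_limit"])
writer.writeheader()
writer.writerows(make_customers(3))
print(buf.getvalue())
```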

Test running tools (1)
- Interface to the software being tested
- Run tests as though run by a human tester
- Test scripts written in a programmable language
- Data, test inputs and expected results held in test repositories
- Most often used to automate regression testing
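The essential loop of a test running tool can be sketched in a few lines (Python, for illustration; the "software under test" here is a stand-in function, and the repository is a simple in-memory list rather than a real test asset store): inputs and expected results live as data, and one generic script drives them through the interface.

```python
# Sketch of a test running tool's core loop: scripted tests whose
# inputs and expected results are held in a small "repository",
# driven against the software under test (a stand-in function here).
def software_under_test(command, value):
    if command == "double":
        return value * 2
    if command == "negate":
        return -value
    raise ValueError(command)

test_repository = [
    {"name": "doubles", "input": ("double", 21), "expected": 42},
    {"name": "negates", "input": ("negate", 5), "expected": -5},
]

def run_tests(repository):
    results = {}
    for test in repository:
        actual = software_under_test(*test["input"])
        results[test["name"]] = "PASS" if actual == test["expected"] else "FAIL"
    return results

print(run_tests(test_repository))
```

Because the tests are data plus a driver, the same suite can be replayed unattended on every build, which is why these tools suit regression testing.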

Test running tools (2)
- Character-based
  - simulates user interaction from dumb terminals
  - captures keystrokes and screen responses
- GUI (Graphical User Interface)
  - simulates user interaction with WIMP applications (Windows, Icons, Mouse, Pointer)
  - captures mouse movements, button clicks and keyboard inputs
  - captures screens, bitmaps, characters, object states

Comparison tools
- Detect differences between actual test results and expected results
  - screens, characters, bitmaps
  - masking and filtering
- Test running tools normally include comparison capability
- Stand-alone comparison tools exist for files or databases
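Masking is the key idea: fields that legitimately differ on every run (timestamps, session identifiers) are blanked out before actual and expected outputs are compared. A minimal sketch (Python; the regular expressions and field formats are invented for the example):

```python
import re

# Sketch of masked comparison: volatile fields are replaced by fixed
# placeholders so that only meaningful differences cause a mismatch.
def mask(text):
    text = re.sub(r"\d{2}:\d{2}:\d{2}", "<TIME>", text)      # timestamps
    text = re.sub(r"session=\w+", "session=<ID>", text)      # session ids
    return text

actual   = "login ok at 14:03:59 session=a9f3"
expected = "login ok at 09:12:00 session=b007"
print(mask(actual) == mask(expected))  # True: only masked fields differ
```

Choosing what to mask is a test design decision: mask too much and real faults slip through; mask too little and every run reports spurious differences.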

Test harnesses and drivers
- Used to exercise software which does not (yet) have a user interface
- Used to run groups of automated tests or comparisons
- Often custom-built
- Simulators (where testing in the real environment would be too costly or dangerous)
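A driver can be as simple as code that calls the module directly and checks the results, bypassing the missing user interface entirely. A sketch (Python; the `discount` function is a stand-in for the real unit under test):

```python
# Sketch of a test driver: exercise a module that has no user
# interface (yet) by calling it directly and checking the results.
def discount(total, is_member):        # stand-in unit under test
    rate = 0.1 if is_member else 0.0
    return round(total * (1 - rate), 2)

def driver():
    # (inputs, expected result) pairs; returns the failing cases
    cases = [((100.0, True), 90.0), ((100.0, False), 100.0)]
    return [(args, want, discount(*args))
            for args, want in cases if discount(*args) != want]

print(driver())  # an empty list means every case passed
```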

Performance testing tools
- Load generation
  - drive the application via its user interface or a test harness
  - simulate a realistic load on the system and log the number of transactions
- Transaction measurement
  - response times for selected transactions via the user interface
- Reports based on logs; graphs of load versus response times
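The load-generation half can be sketched with a few concurrent "virtual users" that call the system under test and log each transaction's response time (Python threads for illustration; a real tool would drive the actual interface, scale to many more users, and produce the graphs from the log):

```python
import threading, time

# Sketch of load generation: simulated users call the system under
# test concurrently while response times are logged for reporting.
def system_under_test():
    time.sleep(0.01)                   # stand-in for a real transaction
    return "ok"

log = []
lock = threading.Lock()

def virtual_user(transactions):
    for _ in range(transactions):
        start = time.perf_counter()
        system_under_test()
        elapsed = time.perf_counter() - start
        with lock:                     # the log is shared between users
            log.append(elapsed)

users = [threading.Thread(target=virtual_user, args=(5,)) for _ in range(4)]
for u in users:
    u.start()
for u in users:
    u.join()

print(f"{len(log)} transactions, worst response {max(log):.3f}s")
```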

Dynamic analysis tools
- Provide run-time information on the software (while tests are run)
  - allocation, use and de-allocation of resources, e.g. memory leaks
  - flag unassigned pointers or pointer arithmetic faults
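As an illustration of watching resource allocation while tests run, the sketch below uses Python's built-in `tracemalloc` tracer to observe memory growth across repeated calls (a cache that is never emptied plays the part of the leak; real dynamic analysers for C-family languages also catch pointer faults, which Python does not have):

```python
import tracemalloc

# Sketch of dynamic analysis: measure traced allocations before and
# after the code under test runs, and flag suspicious growth.
leaky_cache = []

def leaky_operation():
    leaky_cache.append(bytearray(100_000))   # allocated, never released

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()
for _ in range(10):
    leaky_operation()
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

growth = after - before
print(f"memory grew by ~{growth // 1000} KB across 10 calls")
```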

Debugging tools
- Used by programmers when investigating, fixing and testing faults
- Used to reproduce faults and examine program execution in detail
  - single-stepping
  - breakpoints or watchpoints at any statement
  - examining the contents of variables and other data

Test management tools
- Management of testware: test plans, specifications, results
- Project management of the test process, e.g. estimating, scheduling tests, logging results
- Incident management (may include workflow facilities to track allocation, correction and retesting)
- Traceability (of tests to requirements and designs)

Coverage measurement tools
- Objective measure of which parts of the software structure were executed by tests
- Code is instrumented in a static analysis pass
- Tests are run through the instrumented code
- Tool reports what has and has not been covered by those tests, line by line, with summary statistics
- Different types of coverage: statement, branch, condition, LCSAJ, etc.
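The instrument-then-run-then-report cycle can be sketched with Python's tracing hook instead of source instrumentation (an illustrative shortcut, not how production coverage tools work): record which lines of the target execute while a test runs, then report the lines that were never reached.

```python
import sys

# Sketch of coverage measurement: trace the line numbers executed in
# the target function while a test runs, then report missed lines.
executed = set()

def tracer(frame, event, arg):
    if event == "line" and frame.f_code.co_name == "grade":
        executed.add(frame.f_lineno)
    return tracer                      # keep tracing inside each frame

def grade(score):
    if score >= 50:
        return "pass"
    return "fail"     # never reached by the single test below

sys.settrace(tracer)
grade(80)             # one test: only the "pass" branch runs
sys.settrace(None)

# Body lines of grade() that the test never executed
first = grade.__code__.co_firstlineno
body = set(range(first + 1, first + 4))
print(sorted(body - executed))  # the uncovered line number(s)
```

The report makes the gap objective: a second test with a score below 50 would be needed for full statement coverage of `grade`.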

Contents
- Types of CAST tool
- Why capture/replay is not test automation
- Automating and testing are separate skills
- Best practice

Advantages of recording manual tests
- Documents what the tester actually did
  - useful for capturing ad hoc tests (e.g. by end users)
  - may enable software failures to be reproduced
- Produces a detailed “script”
  - records actual inputs
  - can be used by a technical person to implement a more maintainable automated test
- Ideal for one-off tasks
  - such as long or complicated data entry

Captured test scripts
- Will not be very understandable
  - it is a programming language, after all!
  - during maintenance you will need to know more than can ever be ‘automatically commented’
- Will not be resilient to many software changes
  - a simple interface change can impact many scripts
- Do not include verification
  - though it may be easy to add a few simple screen-based comparisons

Compare seldom vs. compare often
[Diagram: a spectrum from robust tests (compare seldom) to sensitive tests (compare often), traded off along five dimensions: storage space, failure analysis effort, likelihood of missing faults, implementation effort, and susceptibility to changes.]

Too much sensitivity = redundancy
- If all tests are robust, the unexpected change is missed
- If all tests are sensitive, they all show the unexpected change
[Diagram: three tests, each changing a different field of the test output; the unexpected change appears in every test's output.]
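The robust/sensitive trade-off comes down to how much of the output a test verifies. A minimal sketch (Python; the output record and its fields are invented for the example):

```python
# Sketch of the robust vs. sensitive trade-off: both checks look at
# the same output record. The robust check verifies only the field
# the test is about; the sensitive check compares every field.
def robust_check(output):
    return output["total"] == 42          # ignores everything else

def sensitive_check(output):
    expected = {"total": 42, "currency": "EUR", "rows": 3}
    return output == expected             # any unexpected change fails

# An unexpected change in a field this test "isn't about":
output = {"total": 42, "currency": "USD", "rows": 3}
print(robust_check(output), sensitive_check(output))  # True False
```

If every test in the suite is sensitive, one unexpected change makes them all fail, and the failure analysis effort is multiplied; if every test is robust, the change is not noticed at all. A mix, chosen deliberately, avoids both extremes.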

Automated verification
- There are many choices to be made
  - dynamic vs. post-execution; compare lots vs. compare little; resilience to change vs. bug-finding effectiveness
- Scripts can soon become very complex
  - more susceptible to change, harder to maintain
- There is a lot of work involved
  - speed and accuracy of tool use is very important
- Usually there is more verification that can (and perhaps should) be done
  - automation can lead to better testing (not guaranteed!)

Contents
- Types of CAST tool
- Why capture/replay is not test automation
- Automating and testing are separate skills
- Best practice

Effort to automate
- The effort required to automate any one test varies greatly
  - typically between 2 and 10 times the manual test effort
- and depends on:
  - the tool, skills, environment and software under test
  - the existing manual test process, which may be:
    - unscripted manual testing
    - scripted (vague) manual testing
    - scripted (detailed) manual testing

Unscripted manual testing (“Try this”, “Try that”, “What about...”, “What if...”)
- Step 1: identify conditions to test
- Step 2: think up specific inputs
- Step 3: enter the inputs
- Step 4: check it worked OK

Scripted (vague) manual testing
- Step 1: read what to do
- Step 2: think up specific inputs
- Step 3: enter the inputs
- Step 4: check it worked OK

A vague manual test script [example figure omitted]

Scripted (detailed) manual testing
- Step 1: read what to do
- Step 2: enter the inputs
- Step 3: check it worked OK

Contents
- Types of CAST tool
- Why capture/replay is not test automation
- Automating and testing are separate skills
- Best practice

Don’t automate too much long term
- As the test suite grows ever larger, so do the maintenance costs
  - maintenance effort is cumulative, benefits are not
- The test suite takes on a life of its own
  - testers depart, others arrive, the test suite grows larger
  - nobody knows exactly what all the tests do
  - … dare not throw away tests in case they’re important
- Inappropriate tests are automated
  - automation becomes an end in itself

Maintain control
- Keep pruning
  - remove dead wood: redundant, superseded, duplicated, worn-out tests
  - challenge new additions (what’s the benefit?)
- Measure costs and benefits
  - maintenance costs
  - time or effort saved, faults found?

Invest
- Commit and maintain resources
  - a “champion” to promote automation
  - technical support
  - consultancy/advice
- Scripting
  - develop and maintain a library
  - data-driven approach, lots of re-use
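The "data-driven approach" in the last bullet can be sketched as follows (Python, illustrative only; the `vat_price` function and the CSV columns are invented for the example): one reusable control script, with the test cases held as data, so adding a case means adding a row, not writing more script code.

```python
import csv, io

# Sketch of a data-driven script: a single reusable driver reads test
# cases from data, so the scripting effort is invested once.
def vat_price(net, rate):              # stand-in software under test
    return round(net * (1 + rate), 2)

test_data = io.StringIO("""net,rate,expected
100,0.2,120.0
50,0.05,52.5
""")

def run_data_driven(source):
    outcomes = []
    for row in csv.DictReader(source):
        actual = vat_price(float(row["net"]), float(row["rate"]))
        outcomes.append(actual == float(row["expected"]))
    return outcomes

print(run_data_driven(test_data))  # [True, True]
```

This is one way a script library pays for itself: the driver is maintained in one place, and the bulk of the test suite lives in easily reviewed data files.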

Tests to automate
- Run many times
  - regression tests
  - mundane tests
- Expensive to perform manually
  - time-consuming and necessary
  - multi-user tests, endurance/reliability tests
- Difficult to perform manually
  - timing-critical
  - complex/intricate

Tests not to automate
- Not run often
  - if there is no need (rather than because they are expensive to run manually)
  - one-off tests (unless several iterations are likely and the build cost can be minimised)
- Not important
  - will not find serious problems
- Usability tests
  - do the colours look nice?
- Some aspects of multimedia applications

Summary: Key Points
- There are many different types of tool support for testing, covering all areas of the life cycle
- Automation requires planning and up-front effort
- Identify and adopt best practice