Testing & Software Quality. Seminar on Software Quality, 13.5.2005. Karipekka Kaunisto.

Contents
- Role of testing in quality assurance
- Challenges of software testing
- What is test automation?
- Test automation: possible benefits
- Common pitfalls of test automation
- Conclusions
- References

Role of testing in quality assurance
- Quality control
  - The final product meets its requirements
  - Find potential errors and flaws
  - Enforce standards and good design principles
  - Regression testing
- Improving quality
  - Preventive testing
  - Find the cause of an error, not just the symptoms

Role of testing (cont.)
- Testing as a supportive activity
  - Data collected during testing can be used to develop various quality metrics
  - These can be used, to some extent, when evaluating system quality and maturity
  - However, numbers alone do not assure good quality!

Examples of poor testing
- A major U.S. retailer was hit with a large governmental fine in October 2003 due to web site errors that enabled customers to view one another's online orders
- In early 1999 a major computer game company recalled all copies of a popular new product due to software problems; the company made a public apology for releasing the product before it was ready
- A retail store chain filed suit in August 1997 against a transaction processing system vendor (not a credit card company) because the software could not handle credit cards with year 2000 expiration dates

Challenges of software testing
- Complexity of testing
  - Even a seemingly simple program can have a practically infinite number of input and output combinations to test (see the sketch below)
  - What about large software systems with complex control logic and numerous dependencies on other modules and entire systems?
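To make the scale concrete, here is a minimal back-of-the-envelope sketch; the function shape (two 32-bit integer parameters) is a hypothetical example, not something from the slides.

```python
# Back-of-the-envelope illustration (hypothetical function signature):
# two 32-bit integer parameters alone give an astronomical number of
# input combinations.
values_per_argument = 2 ** 32              # distinct values of one 32-bit int
total_combinations = values_per_argument ** 2

print(f"{total_combinations:.3e} input combinations")   # ~1.845e+19
# Even at one million test executions per second, exhaustive testing
# would take on the order of 585,000 years.
print(total_combinations / 1_000_000 / (60 * 60 * 24 * 365), "years")
```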

Complexity of testing (cont.)
- => It is not feasible to get even close to testing all combinations, and thus to finding all possible errors!
- The tester needs to construct the test set carefully so that it minimises the risk of fatal errors in the final product
- Related problem: how do you know when to stop testing?
  - When an acceptable risk level has been reached

Managing large test sets
- Various general techniques have been introduced for managing test sets (see the sketch after this list):
  - Partitioning into smaller subsets (equivalence partitioning)
  - Testing special cases (boundaries, special values, etc.)
  - Testing only the most important functions (focused testing)
  - Invalid inputs and data
  - Program flow and code coverage testing
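As a rough illustration of the first two techniques, here is a minimal pytest-style sketch. The function under test (validate_age) and its accepted range are hypothetical: one representative value is picked from each equivalence class, plus the values on and around the boundaries.

```python
import pytest

def validate_age(age: int) -> bool:
    """Hypothetical function under test: accepts ages 18..120 inclusive."""
    return 18 <= age <= 120

# One representative per equivalence class plus boundary values,
# instead of trying every possible integer.
@pytest.mark.parametrize("age, expected", [
    (-5, False),    # invalid class: negative
    (17, False),    # just below lower boundary
    (18, True),     # lower boundary
    (50, True),     # representative valid value
    (120, True),    # upper boundary
    (121, False),   # just above upper boundary
])
def test_validate_age(age, expected):
    assert validate_age(age) == expected
```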

Are we ready to ship?
- Even with all the available techniques, creating the test plan and making the final decision to approve the product still requires the tester's personal expertise and domain knowledge
- Business issues may also affect this: risk of errors vs. risk of delay
- The test plan and the testing effort correspond closely to the quality-control role of testing

Other challenges
- Testing activities require a significant amount of the project's time and resources => delays, hasty testing
- Testing is often regarded as a dull, monotonous and laborious part of software development => poor effort
- System architecture is often quite complex, which requires special testing effort => reliability suffers, and some tests are not even possible manually

What is test automation?
- "The management and performance of test activities to include the development and execution of test scripts so as to verify test requirements, using an automated test tool." – Dustin, Rashka & Paul
- "Testing supported by software tool." – Faught, Bach

Automation in practice
- The tester describes the test cases for the tool using a special scripting language designed by the tool developers (a sketch follows below)
- Some tools also include a graphical interface and recording options, but in practice scripting has to be used
- The script should also specify how the tool is supposed to recognise the correct result of any given test case
- The tool then takes care of executing the specified tests and examining the results
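The slides do not name a concrete tool, so as a rough sketch of the idea, here is a hypothetical Python "script" that plays both roles: the test cases (inputs plus expected results) are declared as data, and a small driver executes them and reports pass/fail. The program name calculator and its command-line interface are invented for illustration.

```python
import subprocess

# Test cases as the "script": command-line input and the expected output.
# The program "calculator" and its CLI are hypothetical.
TEST_CASES = [
    (["calculator", "add", "2", "3"], "5"),
    (["calculator", "add", "-1", "1"], "0"),
    (["calculator", "div", "6", "2"], "3"),
]

def run_suite():
    failures = 0
    for command, expected in TEST_CASES:
        result = subprocess.run(command, capture_output=True, text=True)
        actual = result.stdout.strip()
        status = "PASS" if actual == expected else "FAIL"
        if status == "FAIL":
            failures += 1
        print(f"{status}: {' '.join(command)} -> {actual!r} (expected {expected!r})")
    print(f"{len(TEST_CASES) - failures}/{len(TEST_CASES)} tests passed")

if __name__ == "__main__":
    run_suite()
```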

Automation in practice (cont.)
- Result validation covers text outputs, elapsed time, screen captures, etc. (see the sketch below)
- This can be a very challenging part to automate and may still require human intervention in some cases!
- Evaluation results are presented in clear test reports that can be used to examine the outcome of a test round
- The produced reports can also be used to gather data for various quality metrics
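As a minimal sketch of validating more than just the text output, the hypothetical test below also treats elapsed time as part of the expected result; the generate_report function and its 2-second budget are assumptions made for illustration.

```python
import time

def generate_report(order_count: int) -> str:
    """Hypothetical function under test."""
    return f"Report: {order_count} orders processed"

def test_report_content_and_response_time():
    start = time.perf_counter()
    output = generate_report(order_count=1000)
    elapsed = time.perf_counter() - start

    # Validate the text output...
    assert output == "Report: 1000 orders processed"
    # ...and the elapsed time against an assumed performance budget.
    assert elapsed < 2.0, f"report took {elapsed:.2f}s, expected under 2s"
```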

Areas of test automation
- Automation mainly suits testing that requires repeated execution of similar test cases (a sketch of separating these areas follows below), for example:
  - Regression testing
  - Portability testing
  - Performance and stress testing
  - Configuration testing
  - Smoke testing
  - ...
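One common way to keep these areas separate in an automated suite is to tag tests and run only the relevant subset. The sketch below uses pytest markers; the marker names and the tests themselves are hypothetical, and custom markers would normally be registered in pytest.ini to avoid warnings.

```python
import pytest

@pytest.mark.smoke
def test_application_version_is_reported():
    # Quick sanity check run on every build (details are placeholders).
    version = "1.0.0"
    assert version != ""

@pytest.mark.regression
def test_sorting_is_stable_for_equal_keys():
    # Slower, detailed check run in the nightly regression round.
    items = [("b", 1), ("a", 1), ("c", 1)]
    assert sorted(items, key=lambda pair: pair[1]) == items
```

A smoke round could then be started with pytest -m smoke, and the full regression round with pytest -m regression or simply pytest.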

Possible benefits
- A more reliable system
- Improved requirements definition
- Improved performance (load & stress) testing
- Better co-operation with developers
- Quality metrics & test optimisation
- An enhanced system development life cycle

Benefits (2)
- A more effective testing process
  - Improved effort in sub-areas such as regression, smoke, configuration and multi-platform compatibility testing
  - Ability to reproduce errors (see the sketch below)
  - Dull routine tests can be executed without human intervention
  - "After-hours testing"
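A common way the "ability to reproduce errors" shows up in practice is to turn each reported defect into a permanent automated test. The sketch below is hypothetical (the parse_price function and the bug report number are invented), but it illustrates the pattern.

```python
def parse_price(text: str) -> float:
    """Hypothetical function that once crashed on prices written with a
    comma as the decimal separator (imaginary bug report #1234)."""
    return float(text.replace(",", "."))

def test_bug_1234_comma_decimal_separator():
    # Reproduces the original failing input exactly; if the defect ever
    # reappears, this unattended regression test catches it immediately.
    assert parse_price("12,50") == 12.5
```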

A more effective testing process (cont.)
- Execution of tests that are not possible manually
- Better focus on more advanced testing issues
- Enhanced business expertise

Benefits (3)
- Reduced test effort and schedule
  - The initial costs of automation are usually very high
  - The payback comes later (possibly much later), once the team has adopted the process and the use of the tools

Pitfalls of test automation
- "Automatic test planning and design"
  - There are no tools that can generate these automatically!
  - Test planning and design require human expertise and domain knowledge
  - A tool just does what it is scripted to do, and nothing else

Pitfalls (2)
- "Immediate cost and time savings"
  - On the contrary, the introduction of automation and new tools will increase the need for resources!
  - The automation process must be planned, the test architecture created, tools evaluated, people trained, scripts programmed...
  - = a lot of work

Immediate...  Potential savings will be archieved (much) later on when organisation has ’learned’ the process and created needed infrastructure for it  If automation is introduced poorly, savings will never be gained at all!  In the worst case automation can just mess things up

Pitfalls (3)
- "One tool does it all"
  - There is a wide array of operating systems, hardware and programming languages
  - Very different systems and architectures are in use
  - Testing requirements differ from system to system and from project to project
  - Result analysis differs as well (graphical, text, time, etc.)

Pitfalls (4)
- "Automation requires no technical skill"
  - Tools rely solely on scripts when executing tests
  - Building maintainable and reusable scripts requires good programming skills and knowledge of the tool (see the sketch below)
  - Testers may have to be able to use several different tools with different scripting technologies!
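As a small sketch of what "maintainable and reusable" means in a test script, the hypothetical example below factors the repeated login steps into one helper so that dozens of tests do not each copy the same sequence. The TestClient API shown is invented for illustration.

```python
class TestClient:
    """Hypothetical stand-in for whatever API or UI driver the tool exposes."""
    def __init__(self):
        self.logged_in_user = None

    def login(self, username: str, password: str) -> None:
        self.logged_in_user = username

def logged_in_client(username: str = "test_user") -> TestClient:
    # Reusable helper: every test gets a ready-to-use, logged-in client,
    # and a change in the login flow is fixed in exactly one place.
    client = TestClient()
    client.login(username, password="secret")
    return client

def test_profile_page_shows_username():
    client = logged_in_client()
    assert client.logged_in_user == "test_user"

def test_admin_login():
    client = logged_in_client(username="admin")
    assert client.logged_in_user == "admin"
```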

Pitfalls (5)
- "100% test automation"
  - Even if automation succeeds, it cannot completely replace manual testing
  - Some tests must be conducted manually, and others require at least some human intervention
  - Automation is really useful only for test cases that are executed repeatedly over time (regression)

Other related tools
- Code analyzers
- Coverage analyzers
- Memory analyzers (purifiers)
- Web test tools
- Test management tools

Conclusions
- Testing has a significant role in software quality assurance
- Automation, when implemented properly, can further improve the testing effort and thus lead to improved quality
- However, many automation attempts have failed because of unrealistic expectations and the improper introduction of automation tools

References
- Dustin E., Rashka J., Paul J.: Automated Software Testing: Introduction, Management, and Performance. Addison-Wesley, 1999
- Craig R., Jaskiel S.: Systematic Software Testing. Artech House, 2002
- Pettichord B.: Presentations and Publications.