Separating Test Execution from Test Analysis
StarEast 2011
Jacques Durand (Fujitsu America, Inc.)

Agenda
- The case for "2-phase" testing
- Case study: the "WS-Interoperability" test suite
- Test analysis with test assertions
- Leveraging XML: the Tamelizer tool (open source)

A Bit of Background
- These ideas come from testing service-oriented environments.
- Testing a service vs. testing a component?
  - A service is a "contract": "right" use gives you good "service".
  - A service is reusable, but there is a lot of variation in its contexts of use.
  - An invocation uses a stack of technologies/standards, the integration of which needs to be included in testing.

A Bit of Background (2)
- It became obvious that:
  - There are far more ways of using a service than you can afford to test, so you need to make the best of each service invocation.
  - Understanding the context of use is key to understanding failures, so you need to factor in prior invocations and the other services and middleware the service is composed with.
  - Application building looks more and more like assembling services: reused components, mash-ups...

System Testing Today (in general)
- Test suite = a workflow of test cases.
- Test case = test execution + analysis + reporting.
[Diagram: test cases 1, 2, and 3 call operations opA, opB, and opC on the SUT through its API, and each writes its verdict directly into the test report: opA FAIL, opB FAIL, opC PASS.]

Test-Case-Level Integration of Execution and Analysis
- Each test case is doing:
  - Prepare the test (input data...)
  - Call operation XYZ on the System Under Test
  - Compare output data with reference data
  - Report error / success
- This mixes two different activities: test execution and test analysis.
- Why is this NOT great?

Issues with Each Test Case Doing It All (test execution + analysis + reporting)
- Under-reporting:
  - A test case is designed to test ONE feature, but needs accessory use of OTHER features.
  - It is designed to detect and report failures of the "main" feature, NOT of the "accessory" features.
- Mis-diagnosis:
  - Accessory features are assumed to work well, but they may fail, causing the test case to fail.
  - As a result, a FAIL is reported for the "main" feature under test.

Main Feature Under Test and Accessory Features
- Example: an "array structure" is under test; test case 123 is testing the "sum" function (the main feature).
- Sequence of operations:
  (1) Create the array, e.g. size 20 (accessory)
  (2) Set each array entry to some value (accessory)
  (3) Calculate the "sum" of all values (main)
  (4) Compare the output of (3) with the reference sum (analysis)
- What if "set" does not work above index 10?
  - The sum test case will FAIL: mis-diagnosis.
  - The "set" failure will not be reported by test case 123: under-reporting.
A sketch of such a test case follows.
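To make the failure mode concrete, here is a minimal sketch of such a "do-it-all" test case. It is not code from the deck: the Array class, its injected bug, and the test function are hypothetical stand-ins.

```python
class Array:
    """Hypothetical SUT: a fixed-size array whose 'set' is silently broken."""
    def __init__(self, size):
        self.data = [0] * size

    def set(self, index, value):
        if index <= 10:              # injected bug: 'set' does nothing above index 10
            self.data[index - 1] = value

    def sum(self):
        return sum(self.data)


def test_case_123_streaming():
    """Do-it-all test case: execution, analysis, and reporting in one place."""
    arr = Array(20)                  # accessory feature: create
    for i in range(1, 21):
        arr.set(i, i)                # accessory feature: set
    result = arr.sum()               # main feature: sum
    # Analysis + reporting: only the *main* feature is ever reported on.
    if result == sum(range(1, 21)):
        return "sum PASS"
    return "sum FAIL"                # mis-diagnosis: the real culprit is 'set'


print(test_case_123_streaming())     # prints "sum FAIL", blaming the wrong feature
```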

The Need to Use Accessory Features
- Example: the same array object is under test; test case 122 is testing the "set" function (the main feature).
- Sequence of operations for test 122:
  (1) Create the array, e.g. size 10 (accessory)
  (2) Set each array entry to some value (main)
  (3) Read each array entry and make a list (accessory)
  (4) Compare the result list with the reference list (analysis)

"Streaming" Test Suites: a Workflow of "Do-It-All" Test Cases
[Diagram: test case 100 executes a scenario against the SUT, exercising its main feature under test along with accessory features F2, F3, F4; an "Analyze" step then writes a single verdict (e.g. FAIL) into the test report, even when an accessory feature is what actually failed.]

Separating Test Execution from Test Analysis
[Diagram: Phase 1 runs test scenarios (test operations) against the SUT and produces an execution report; Phase 2 applies test assertions (test analysis) to the execution report and produces the test report.]

2-Phase Test Suites vs. Streaming Test Suites
- In a streaming test suite:
  - Test suite = a workflow of test cases.
  - Each test case has a main feature in focus.
  - Each test case executes + analyzes, and produces a validation report item (pass/fail).
- In a 2-phase test suite:
  - Phase 1 = a workflow of test cases; Phase 2 is a separate, global analysis phase.
  - Every SUT feature still needs to be exercised, but there is no "main feature" to report on.
  - Each test case executes a test scenario with NO analysis, and produces an execution report item (an operation trace).

Test Tools: Overview (Use Case: WS-Interoperability Testing)
[Diagram: in Phase 1, test scenarios drive client code against the web service SUT(s); a MONITOR (interceptor + logger) captures the message artifacts into an execution report. In Phase 2, an ANALYZER processes the execution report into the test report.]

Testing for EVERY Operation Used (Accessory or Main) in EVERY Test Case
- Example: test the "create" array operation; all initial values must be 0.
- "Tracing" the create operation:
  (1) Trace input (nm="ABC", sze="10")
  (2) Create ABC[10]
  (3) Read ABC[1], ABC[2] ... ABC[10]
  (4) Trace output
- Resulting execution report item: Create ABC: (Sze = 10; Out = (1,0)(2,0)...(10,0))

Testing for EVERY Operation Used (Accessory or Main) in EVERY Test Case (2)
- "Tracing" the set operation:
  (1) Trace input (index="2", val="50")
  (2) Set array entry 2 to input value 50
  (3) Read ABC[2]
  (4) Trace output
- Resulting execution report item: Set ABC: (Ind = 2; Val = 50; Out = (2,50))
A sketch of such a tracing wrapper follows.
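A minimal sketch of what such a tracing wrapper could look like. The deck only shows the resulting trace entries, not wrapper code, so the names and the trace format here are hypothetical.

```python
execution_report = []            # one trace entry per operation call

def trace(op_name, inputs, action, read_back):
    """Execute 'action', then read the observable state back; record both.
    No pass/fail judgment is made here: Phase 1 only records traces."""
    action()
    execution_report.append({"op": op_name, "in": inputs, "out": read_back()})

# A hypothetical array SUT, held in a plain dict of name -> list of values
arrays = {}

trace("Create ABC", {"sze": 10},
      action=lambda: arrays.__setitem__("ABC", [0] * 10),
      read_back=lambda: [(i + 1, v) for i, v in enumerate(arrays["ABC"])])

trace("Set ABC", {"ind": 2, "val": 50},
      action=lambda: arrays["ABC"].__setitem__(1, 50),
      read_back=lambda: [(2, arrays["ABC"][1])])

for entry in execution_report:
    print(entry)   # e.g. {'op': 'Set ABC', 'in': {'ind': 2, 'val': 50}, 'out': [(2, 50)]}
```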

"Tracing" Each Operation
[Diagram: test case 100 executes features F1-F4 on the SUT through a library of test wrappers; each call goes through a wrapper (Trace(F1), Trace(F2), Trace(F3)...), and the resulting traces accumulate in the execution report.]

Rewriting Test Case 123 for Phase 1 in a 2-Phase Test Suite
- Test case 123 now executes (does NOT verify) the "sum" function.
- Sequence of operations:
  (1) Trace(Create the array ABC, size 11)
  (2) Trace(Set each array entry, values -5 to +5)
  (3) Trace(Sum of all values) (should be 0)
- Resulting execution report:
  Create ABC: (Sze = 11; Out = (1,0)...(11,0))
  Set ABC: (Ind = 1; Val = -5; Out = (1,-5))
  Set ABC: (Ind = 2; Val = -4; Out = (2,-4))
  ...
  Set ABC: (Ind = 11; Val = 5; Out = (11,5))
  Sum ABC: (Out = 0)
A sketch of this Phase 1 test case follows.
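Using the hypothetical trace wrapper sketched earlier (and its arrays dict and execution_report list), the Phase 1 version of test case 123 could look like this. Again, an illustrative reconstruction, not code from the deck.

```python
def test_case_123_phase1():
    """Phase 1 only: execute and trace; all verification is deferred to Phase 2."""
    trace("Create ABC", {"sze": 11},
          action=lambda: arrays.__setitem__("ABC", [0] * 11),
          read_back=lambda: [(i + 1, v) for i, v in enumerate(arrays["ABC"])])
    for ind, val in zip(range(1, 12), range(-5, 6)):   # entries 1..11, values -5..+5
        trace("Set ABC", {"ind": ind, "val": val},
              action=lambda i=ind, v=val: arrays["ABC"].__setitem__(i - 1, v),
              read_back=lambda i=ind: [(i, arrays["ABC"][i - 1])])
    trace("Sum ABC", {},
          action=lambda: None,
          read_back=lambda: sum(arrays["ABC"]))        # should be 0, but NOT checked here

test_case_123_phase1()    # appends 13 trace entries to execution_report
```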

"2-Phase" Test Suites
[Diagram: in Phase 1, test case 100 executes features F1-F4 on the SUT, tracing each operation into the execution report; in Phase 2, the ANALYSIS step evaluates the traces and writes per-feature verdicts into the test report, e.g. "F3 FAIL".]

SUT Feature Coverage
[Diagram: four test cases each exercise features A through D. In a "streaming" test suite, each test case reports only on its main feature; the other features are exercised but NOT reported. In a 2-phase test suite, every feature exercised by every test case is both exercised and reported.]

Phase 2: Test Analysis
- Verify the "set" operation = verify EVERY Set trace, from every test case (this addresses under-reporting).
- Verify the "sum" operation = only after its Create and Set traces have been verified (this addresses mis-diagnosis).
- Input: the execution report, e.g.
  Create ABC: (Sze = 11; Out = (1,0)...(11,0))
  Set ABC: (Ind = 1; Val = -5; Out = (1,-5))
  Set ABC: (Ind = 2; Val = -4; Out = (2,-4))
  ...
  Set ABC: (Ind = 11; Val = 5; Out = (11,5))
  Sum ABC: (Out = 0)
A sketch of such an analysis pass follows.
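A minimal sketch of a Phase 2 analysis pass over the hypothetical trace format used in the earlier sketches (the real WS-I analysis uses XML test assertions, shown later in the deck):

```python
def analyze(execution_report):
    """Phase 2: evaluate every trace, for every operation, from every test case."""
    results, state = [], {}       # state: array contents reconstructed from the traces
    for entry in execution_report:
        op, inp, out = entry["op"], entry["in"], entry["out"]
        name = op.split()[-1]     # e.g. "ABC"
        if op.startswith("Create"):
            state[name] = {i: 0 for i in range(1, inp["sze"] + 1)}
            ok = all(v == 0 for _, v in out)              # all initial values must be 0
        elif op.startswith("Set"):
            state[name][inp["ind"]] = inp["val"]
            ok = (inp["ind"], inp["val"]) in out          # read-back must match the set
        elif op.startswith("Sum"):
            ok = out == sum(state[name].values())         # checked against verified traces
        results.append((op, "PASS" if ok else "FAIL"))
    return results

for item in analyze(execution_report):
    print(item)    # one verdict per trace: every operation is reported on, every time
```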

Test Analysis: Test(sum) = { create + set + sum }
- Ensuring correct diagnosis:
  - Failure of an accessory feature may cause failure of the test case.
  - The test output is meaningful for the main feature only if no accessory feature failed.
- In streaming test suites: hard to isolate the cause of a failure.
  - Would need ordering: accessory features tested first, separately, exhaustively.
- In 2-phase test suites: much better.
  - Only the test ANALYSIS needs to be ordered and kept separate.
  - Accessory features are always tested in their real context of use.

Advanced Test Analysis: Tough Diagnosis
- Sometimes you need to look at the outputs of several test cases to understand which feature failed.
- Test(sum) exercises Create + Set + Sum; Test(set) exercises Set + Read.

  Test(sum) | Test(set) | Most likely cause
  OK        | FAIL      | Read FAILED
  FAIL      | OK        | Sum (or Create?) FAILED
  FAIL      | FAIL      | Set (or Read + Sum) FAILED

A toy sketch of this cross-test reasoning follows.
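This cross-test reasoning can itself be mechanized. A toy sketch (my own illustration, not a tool from the deck): a feature is a candidate cause if it is exercised by every failing test and by no passing test.

```python
# Which features each test exercises (from the slide above)
EXERCISES = {"test_sum": {"create", "set", "sum"},
             "test_set": {"set", "read"}}

def likely_causes(outcomes):
    """Given {test_name: 'OK' | 'FAIL'}, return the candidate failed features."""
    failing = [EXERCISES[t] for t, r in outcomes.items() if r == "FAIL"]
    passing = [EXERCISES[t] for t, r in outcomes.items() if r == "OK"]
    if not failing:
        return set()
    candidates = set.intersection(*failing)   # used by every failing test...
    for exercised in passing:
        candidates -= exercised               # ...and by no passing test
    return candidates

print(likely_causes({"test_sum": "OK",   "test_set": "FAIL"}))  # candidates: read
print(likely_causes({"test_sum": "FAIL", "test_set": "OK"}))    # candidates: sum, create
print(likely_causes({"test_sum": "FAIL", "test_set": "FAIL"}))  # candidates: set
```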

Agenda (recap)
- The case for 2-phase test suites
- Case study: the "WS-Interoperability" test suite
- Test analysis with test assertions
- Leveraging XML: the Tamelizer tool (open source)

Test Tools: Overview (Use Case: WS-Interoperability Testing)
- What's under test? The web service run-time (HTTP, SOAP, XML) and its description artifacts (WSDL, XML schemas).
[Diagram: test scenarios drive client code against the web service; a MONITOR (interceptor + logger) captures the messages and meta-data into the execution report; an ANALYZER then produces the test report.]

Use Case: WS-Interoperability Testing
- Several "WS Profiles" to be tested.
  - Profile = a way to combine underlying standards (SOAP, HTTP, XML...).
- Phase 1 for Basic Profile 2.0: about 20 test scenarios.
  - Execution report = an XML-formatted log.
- Phase 2: executable test assertions (XML + XPath).
- Standard, HTML test reports.

WS-Interoperability Testing (1)
[Diagram: HTTP message capture collects Message 1 through Message 100; these are consolidated, together with the WSDL and its schemas (Schema A, Schema B), into a single consolidated XML execution report, which the analyzer (a test assertion engine driven by the WS-I profiles) then processes.]
A sketch of the consolidation step follows.
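As an illustration of the consolidation step, a sketch that folds captured messages and description artifacts into one XML execution report. The element names are my own invention, not the WS-I log schema.

```python
import xml.etree.ElementTree as ET

def consolidate(messages, wsdl_text, schemas):
    """Build one XML execution report from captured messages plus meta-data.
    'messages' is a list of (conversation_id, direction, raw_text) tuples."""
    report = ET.Element("executionReport")
    meta = ET.SubElement(report, "metadata")
    ET.SubElement(meta, "wsdl").text = wsdl_text
    for name, text in schemas.items():
        ET.SubElement(meta, "schema", name=name).text = text
    log = ET.SubElement(report, "messageLog")
    for conv, direction, raw in messages:
        msg = ET.SubElement(log, "message", conversation=conv, direction=direction)
        msg.text = raw
    return ET.tostring(report, encoding="unicode")

print(consolidate([("c1", "request", "<Envelope>...</Envelope>")],
                  "<definitions>...</definitions>", {"SchemaA": "<schema/>"}))
```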

WS-Interoperability Testing (2)
- Basic Profile 2.0, Phase 1: ~20 test scenarios, each producing on average 2 messages in the execution report (~40 messages).
- Phase 2 "coverage": ~150 test assertions (TAs); each message hits an average of 20 TAs, yielding ~800 reported test items (40 messages x 20 TAs).
- COMPARE WITH a "streaming" test suite, which would have required executing more test scenarios (ideally 1 per TA), yet would have produced only 150 to 300 test items.

Agenda (recap)
- The case for 2-phase test suites
- Case study: the "WS-Interoperability" test suite
- Test analysis with test assertions
- Leveraging XML: the Tamelizer tool (open source)

Back to Fundamentals: Test Assertions
- A definition: "a testable or measurable expression for evaluating the adherence of [part of] an implementation to a normative statement in a specification" (OASIS Test Assertions Guidelines TC).
- A test assertion is usually distinct from a test case:
  - Test case = an executable set of test tools, programs, and files.
  - Test assertion = a declarative, logical statement.

The Role of Test Assertions
[Diagram: a specification contains normative statements (which specify SUT behavior, an API, etc.) and a conformance clause that defines a conformance profile. Test assertions (1..n) are derived from normative statements; the test cases (1..n) of a test suite address test assertions, and measure or indicate the conformance of an implementation.]

Test Suite Design Methodologies
- Designing a streaming test suite: from a normative statement, derive a test assertion, then a test case = executable scenario + analysis + reporting.
- Designing a 2-phase test suite: from a normative statement, derive a test assertion, then an executable TA (analysis + report); the test case is only a scenario + tracing.

Anatomy of a Test Assertion (according to the OASIS TAG)
- ID
- Normative source: the normative statement(s) of the specification it derives from
- Target: [part of] an SUT
- Prerequisite: allows conditional execution of the TA
- Predicate: the analysis logic
- Prescription: MUST / SHOULD / MAY
A data-model sketch follows.
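A minimal sketch of this anatomy as a data model. The structure comes from the OASIS TAG; the Python rendering and the sample values are mine.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TestAssertion:
    """One test assertion, following the OASIS TAG anatomy (illustrative model)."""
    id: str
    normative_source: str                # the normative statement(s) it derives from
    target: str                          # what [part of] an SUT the TA applies to
    predicate: str                       # the analysis logic, true/false over a target
    prescription: str                    # "MUST" | "SHOULD" | "MAY"
    prerequisite: Optional[str] = None   # condition gating execution of the TA

ta = TestAssertion(
    id="TA-0001",                        # hypothetical ID
    normative_source="Basic Profile 2.0, a hypothetical normative statement",
    target="every message in the execution report",
    predicate="the message validates against the declared schema",
    prescription="MUST")
```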

Leveraging XML + XPath
- Each part of a test assertion (ID, normative source, target, predicate, prescription, prerequisite) can be written as XML markup, with the target, prerequisite, and predicate expressed in XPath.

Agenda (recap)
- The case for 2-phase test suites
- The "WS-Interoperability" test suite use case
- Test analysis with test assertions
- Leveraging XML: the Tamelizer tool (open source), with a demonstration

Test Assertion Markup + XPath = an Executable Test "Rule"
- The test assertion markup carries: ID, normative source, target (XPath), prerequisite (XPath), predicate (XPath), prescription level, and tags.
- Additions for reporting:
  - Reporting variables, and an ID scheme for target instances
  - Convenient references to other documents
  - Error message and diagnostic details

Test Assertion Mark-up from OASIS
- Simplest evaluation semantics: for every target instance in the input XML document that matches the Target expression, do:
  - if the Predicate expression is false, report "failed"; else report "passed".
A sketch of this evaluation loop follows.
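A minimal sketch of that evaluation loop over an XML execution report, using the third-party lxml library for XPath. The TestAssertion model is the hypothetical one sketched earlier, with target and predicate now holding actual XPath expressions.

```python
from lxml import etree

def evaluate(ta, report):
    """Simplest TA semantics: one pass/fail item per matched target instance."""
    results = []
    for instance in report.xpath(ta.target):              # select the target instances
        if ta.prerequisite and not instance.xpath(ta.prerequisite):
            continue                                      # TA not applicable here
        verdict = "passed" if instance.xpath(ta.predicate) else "failed"
        results.append((ta.id, report.getpath(instance), verdict))
    return results

root = etree.fromstring(
    "<executionReport><message conversation='c1'><status>200</status></message>"
    "<message conversation='c2'><status>500</status></message></executionReport>")
ta = TestAssertion(id="TA-0002", normative_source="(example)", target="//message",
                   predicate="status = '200'", prescription="MUST")
for item in evaluate(ta, etree.ElementTree(root)):
    print(item)   # ('TA-0002', '/executionReport/message[1]', 'passed'), then 'failed'
```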

Test Analysis with Tamelizer: Generating an Analyzer from a Set of XML Test Assertions
[Diagram: at code-generation time, a generator (XSLT 2.0) takes the test assertions (XML + XPath) and generates an analyzer (XSLT 2.0); at testing run-time, the analyzer takes the execution report (XML) as input and produces the test report (XML), which a rendering step (XSLT 1.0) turns into HTML.]
A sketch of this pipeline's wiring follows.
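The shape of that pipeline can be sketched in a few lines of Python with lxml. Caveats: lxml only implements XSLT 1.0, while Tamelizer's generator and generated analyzer use XSLT 2.0 (a processor such as Saxon would be needed in practice), and the file names here are hypothetical. This only illustrates the wiring, not the tool itself.

```python
from lxml import etree

# Code-generation time: the generator stylesheet turns the TA document
# into an analyzer stylesheet.
generator = etree.XSLT(etree.parse("generator.xslt"))        # hypothetical path
analyzer_doc = generator(etree.parse("test_assertions.xml")) # hypothetical path
analyzer = etree.XSLT(analyzer_doc)

# Testing run-time: the generated analyzer transforms the execution report
# into the XML test report.
test_report = analyzer(etree.parse("execution_report.xml"))  # hypothetical path

# Rendering: a final stylesheet turns the XML test report into HTML.
render = etree.XSLT(etree.parse("render.xslt"))              # hypothetical path
with open("test_report.html", "w") as f:
    f.write(str(render(test_report)))   # str() honors the stylesheet's xsl:output
```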

Conclusion
- 2-phase test suites separate execution from analysis: more reliable and more "productive" than conventional (streaming) test suites.
- Test assertions are key to test analysis (Phase 2).
- Mature XML processing technology can automate the analysis phase end-to-end.

Useful Links
- OASIS Test Assertions Guidelines
- Tamelizer analysis tool (open source)

Separating Test Execution from Test Analysis
StarEast 2011
Jacques Durand (Fujitsu America, Inc.)