UTP – Introduction to the UML Testing Profile
Martin Fleck
Business Informatics Group, Institute of Software Technology and Interactive Systems, Vienna University of Technology
Favoritenstraße 9-11/1883, 1040 Vienna, Austria
phone: 43 (1) (secretary), fax: 43 (1)

Introduction 2 Why do we need the UML Testing Profile?
- UML natively lacks concepts for the testing of systems
- Identified need for:
  - Domain-independent test modeling
  - Test case specification
  - Test data specification
  - Test deployment specification
  - Test result visualization specification
- OMG issued an RFP for a test-related UML profile in 2001
[Diagram: Requirements, System Code and Test Code related by «derive» and «generate», with an open question mark on the test side]

Introduction 3 The UML Testing Profile (UTP)
- History (timeline):
  - 2001: RFP
  - 2003: Initial submission
  - July 2005: Adopted as standard, v1.0
  - April 2012: Updated to v1.1
  - April 2013: Updated to v1.2
  - December 2013: Possible RFP for v2.0
- Involved partners:
  - Industrial partners (Ericsson, Telelogic, IBM, Softeam, SINTEF, …)
  - Academic partners (University of Lübeck, University of Göttingen, …)

Introduction 4 What is the UTP?
- Goals
  - Minimality: reuse UML constructs and the benefits of MDE
  - Clarity: separate testing concepts
  - Extensibility: generic profile, extendable for specific domains
- Key aspects
  - Design and configuration of test systems
  - Model-based test specifications on top of existing systems
  - Test cases and test data
  - Test environments
- UTP focuses on black-box testing

Introduction 5 What is not in the UTP?
- Not specifically addressed by the UTP:
  - Test case generation
  - White-box approaches
  - Audits and reviews
  - Test management (only partially addressed)
  - Test methodology
- Combine with other standards and profiles:
  - ISO/IEC – Software Testing
  - SysML for combination with requirements
  - BMM for combination with goals, objectives, and risks
  - …
[Diagram: UTP positioned next to other UML-based profiles and languages: SysML, SoaML, UPDM, TelcoML, BPMN, MARTE]

Introduction 6 How is the UTP organized?
- The UTP standard (154 pages) provides:
  - A pre-defined type library
  - A native UML profile
  - OCL for the abstract syntax description
  - A standalone MOF-based meta model (obsolete in v1.2)
  - Mappings to JUnit and TTCN-3
[Diagram: «profile» UTP, «modelLibrary» UTPTypes, «profile» StandardProfileL2 and UML, connected via «import» and «apply» relationships]

UML Testing Profile 7 Overview
[UTP overview diagram: the profile's stereotypes and types at a glance — test architecture (SUT, TestComponent, TestContext), test behavior (TestCase, Default, DefaultApplication, ValidationAction, LogAction, FinishAction, DetermAlt, timer and timezone actions, TimeOut, TimeOutMessage, TimeOutAction), test data (DataPool, DataPartition, DataSelector, CodingRule, Modification, LiteralAny, LiteralAnyOrNull, LiteralNull, InstanceValue), test management (TestLog, TestLogEntry, TestLogApplication, TestObjectiveSpecification), and the pre-defined types Verdict (none, pass, inconclusive, fail, error), Arbiter, Timer, Timepoint, Duration, Timezone]

Native UML Profile 8 Logical packages
- Type Library: pre-defined types and interfaces
- Test Architecture: structural aspects of a test environment and test configuration
- Test Behavior: dynamic test behavior and defaults to stimulate and observe the SUT
- Test Data: stimuli and responses to and from the SUT, definition of the initial state
- Test Management: test planning and scheduling, test monitoring and control, test result analysis

Example 9 Overview
- Petstore: an online store for animals
[Class diagram of the Petstore:
- Animal (name: String, type: String, price: Double)
- Category (title: String, description: String)
- ApplicationController (+login(String, String), +logout(String), +findAnimal(String), +addAnimalToCart(Animal), +removeAnimalFromCart(Animal), +confirmOrder())
- EntityManager (+findCustomer(String, String), +queryAllItems(), +persist(Entity))
- CustomerService (+login(String, String), +logout(String))
- CatalogService (+findAnimal(String), +createCategory(Category), +removeCategory(Category))
- OrderService (+addAnimalToCart(Animal), +removeAnimalFromCart(Animal), +resetOrder(String), +confirmOrder(String))]
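To make the later test examples concrete, here is a rough Java sketch of the Petstore domain as it appears in the class diagram above. All signatures are assumptions derived from the diagram, not an official implementation; findAllCategories() is added because the test case example later in the slides uses it, and Entity is replaced by Object.

```java
import java.util.List;

// Hypothetical Java rendering of the Petstore domain from the class diagram.
class Animal {
    String name;
    String type;
    double price;

    Animal(String name, String type, double price) {
        this.name = name;
        this.type = type;
        this.price = price;
    }
}

class Category {
    String title;
    String description;

    Category(String title, String description) {
        this.title = title;
        this.description = description;
    }
}

// Only the operations relevant to the test examples are sketched here.
interface CatalogService {
    Animal findAnimal(String name);
    void createCategory(Category category);
    void removeCategory(Category category);
    List<Category> findAllCategories();  // assumed; used in the test case example
}

interface EntityManager {
    Object findCustomer(String firstName, String lastName);  // parameter meaning assumed
    List<Animal> queryAllItems();                            // return type assumed
    void persist(Object entity);                             // Entity replaced by Object (assumption)
}
```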

UML Testing Profile 10 Overview
[UTP overview diagram as before, highlighting the Type Library]

Pre-defined Type Library 11 Overview
- Verdict: result of a running test case; ordered none < pass < inconclusive < fail < error
- Arbiter: assigns and retrieves the verdicts of test cases
- Timer: observes and controls test behavior through time measurement
- Time primitives: data types to define points in time, durations and time zones
[Type definitions: «primitive» Timepoint, «primitive» Duration, «primitive» Timezone; «enumeration» Verdict (none, pass, inconclusive, fail, error); «interface» Timer (isRunning: Boolean, start(Timepoint), start(Duration), stop(), read(): Duration); «interface» Arbiter (getVerdict(): Verdict, setVerdict(Verdict))]
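As an illustration only, the pre-defined types could be rendered in Java roughly as follows; UTP defines them as UML elements, so this sketch is an assumption rather than the standard's JUnit or TTCN-3 mapping. The WorstVerdictArbiter shows one plausible arbitration strategy that exploits the verdict ordering.

```java
import java.time.Duration;
import java.time.Instant;

// Verdicts ordered by severity: none < pass < inconclusive < fail < error.
enum Verdict { NONE, PASS, INCONCLUSIVE, FAIL, ERROR }

// Arbiter: assigns and retrieves the verdict of a test case.
interface Arbiter {
    Verdict getVerdict();
    void setVerdict(Verdict verdict);
}

// Timer: observes and controls test behavior through time measurement.
interface Timer {
    boolean isRunning();
    void start(Instant timepoint);  // corresponds to the Timepoint primitive
    void start(Duration duration);  // corresponds to the Duration primitive
    void stop();
    Duration read();
}

// One plausible arbitration strategy (an assumption, not mandated by UTP):
// keep the "worst" verdict reported so far, using the enum ordering above.
class WorstVerdictArbiter implements Arbiter {
    private Verdict verdict = Verdict.NONE;
    public Verdict getVerdict() { return verdict; }
    public void setVerdict(Verdict v) {
        if (v.ordinal() > verdict.ordinal()) verdict = v;
    }
}
```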

UML Testing Profile 12 Overview
[UTP overview diagram as before, highlighting Test Architecture]

Native UML Profile 13 Test Architecture
- SUT (System Under Test)
  - The system, subsystem, or component being tested
  - Only its public methods are available (black box)
- Test Component
  - Communicates with the SUT and other test components
  - Drives test cases by stimulating the SUT
  - Used in architectural design and deployment specification
[Stereotype definitions: «SUT» extends the metaclass Property; «TestComponent» (zone: Timezone [0..1]) extends the metaclass Class]

Native UML Profile 14 Test Architecture
- Test Context
  - Groups test cases
  - Its classifier behavior can describe the order of execution
- Test Configuration
  - Collection of test components
  - Connections between test components and the SUT
[Stereotype definition: «TestContext» (arbiter: Arbiter, testLevel: ValueSpecification [*]) extends the metaclasses StructuredClassifier and BehavioredClassifier]

Example 15 Test Architecture
- Simple unit test context
  - Prepares for unit tests of EntityManager and CatalogService
  - Defines specific test levels for the Petstore
  - No additional test components required, e.g., no mocking
[Diagram: the package petstore.test «import»s the package petstore; «enumeration» TestContextLevel (Unit, Integration, System, Acceptance); «TestContext» {testLevel = Unit} PetstoreTestContext with the properties «SUT» em: EntityManager and «SUT» cs: CatalogService]
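Since the standard mentions a mapping to JUnit, the Petstore test context might, under such a mapping, correspond roughly to a test class whose fields hold the SUT instances. This is a sketch under that assumption; EntityManagerImpl and CatalogServiceImpl are hypothetical placeholders for obtaining the SUT, and EntityManager/CatalogService refer to the Petstore types sketched earlier.

```java
import org.junit.jupiter.api.BeforeEach;

// Rough JUnit 5 counterpart of «TestContext» {testLevel = Unit} PetstoreTestContext.
class PetstoreTestContext {

    // «SUT» properties of the test context
    EntityManager em;
    CatalogService cs;

    @BeforeEach
    void setUpTestConfiguration() {
        // Hypothetical placeholders; how the SUT is instantiated is not part of the model.
        em = new EntityManagerImpl();
        cs = new CatalogServiceImpl();
    }
}
```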

UML Testing Profile 16 Overview
[UTP overview diagram as before, highlighting Test Behavior]

Native UML Profile 17 Test Behavior
- Test Case
  - Specifies the interaction of test components with the SUT to realize a test objective
  - Can access all parts of the test model through the test context
  - Returns a verdict (arbitrated through the arbiter or not)
[Stereotype definition: «TestCase» (priority: ValueSpecification [0..1], testType: ValueSpecification [*]) extends the metaclasses Operation and Behavior]

Native UML Profile 18 Test Behavior
- Test Actions
  - ValidationAction: sets a verdict in the test behavior, with an optional reason
  - DetermAlt: deterministic alternative; operands are evaluated in the order of the model
  - FinishAction: completes a test case
  - LogAction: creates log entries used during analysis
[Stereotype definitions: «DetermAlt» extends CombinedFragment; «ValidationAction» (reason: ValueSpecification), «FinishAction» and «LogAction» extend UML action metaclasses (CallOperationAction, OpaqueAction, InvocationAction, SendObjectAction)]

Native UML Profile 19 Test Behavior
- Defaults
  - Help make incomplete specifications complete
  - Separate common situations from exceptional ones
  - Different strategies of how to resume after the default behavior has been executed: Repeat, Continue, Conclude
[Stereotype definitions: «Default» extends the metaclass Behavior; «DefaultApplication» (repetition: UnlimitedNatural) extends the metaclass Dependency]

Native UML Profile 20 Test Behavior
- Timer-related concepts
- Timezone co-ordination
[Stereotype definitions: «TimeOut» extends TimeEvent; «TimeOutMessage» extends Message; «TimeOutAction» extends AcceptEventAction; «TimerRunningAction» extends ReadStructuralFeatureAction; «ReadTimerAction», «StartTimerAction» and «StopTimerAction» extend CallOperationAction; «GetTimezoneAction» extends ReadStructuralFeatureAction; «SetTimezoneAction» extends WriteStructuralFeatureAction]

Example 21 Test Behavior
- Test Context
  - Prepares for unit tests of EntityManager and CatalogService (the SUT)
  - Defines a specific test level for the Petstore
- Test cases for the Petstore test context
  - Define test cases
  - Define specific test case priorities
  - Reuse the pre-defined verdicts
[Diagram: «enumeration» TestCaseType (Functionality, Usability, Reliability, Performance, Supportability); «enumeration» TestCasePriority (High, Medium, Low); «enumeration» TestContextLevel (Unit, Integration, System, Acceptance); «enumeration» Verdict (none, pass, inconclusive, fail, error); «TestContext» {testLevel = Unit} PetstoreTestContext with «SUT» em: EntityManager, «SUT» cs: CatalogService and the operations «TestCase» {testType=Functionality, priority=High} queryAllItems_shouldReturn5(): Verdict and «TestCase» {testType=Functionality, priority=High} catalogService_shouldPersist(): Verdict]

Example 22 Test Behavior
- Unit test case for CatalogService
[Activity diagram of «TestCase» {testType=Functionality, priority=High} PetstoreTestContext::catalogService_shouldPersist(): Verdict — create a CatalogService, call findAllCategories() and read the result's size(), create a Category ("Giraffidae", "Ruminant…") and pass it to createCategory(), call findAllCategories() and size() again, compare the first size plus 1 with the second size via assertEquals, and set the verdict to pass or fail via «ValidationAction»]
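For readers who think in xUnit terms, the same test idea could look roughly like this as a JUnit 5 test. This is an illustrative analogue of the UTP activity above, not the normative JUnit mapping; CatalogServiceImpl is a hypothetical placeholder, and assertEquals stands in for the «ValidationAction» that would set the verdict via the arbiter.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;

class CatalogServiceTest {

    // Analogue of «TestCase» catalogService_shouldPersist():
    // creating a category must increase the number of known categories by one.
    @Test
    void catalogService_shouldPersist() {
        CatalogService cs = new CatalogServiceImpl();  // hypothetical placeholder for the SUT
        int before = cs.findAllCategories().size();

        cs.createCategory(new Category("Giraffidae", "Ruminant..."));

        int after = cs.findAllCategories().size();
        // Plays the role of the «ValidationAction» pass/fail decision.
        assertEquals(before + 1, after);
    }
}
```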

UML Testing Profile 23 Overview
[UTP overview diagram as before, highlighting Test Data]

Native UML Profile 24 Test Data
- Test Data Specification
  - Data Pool: physical containers of data
  - Data Partition: subsets of concrete sets of instances
  - Data Selector: specifies which data values should be retrieved
[Stereotype definitions: «DataPool» extends the metaclasses Classifier and Property; «DataPartition» extends the metaclass Classifier; «DataSelector» extends the metaclass Operation]

Native UML Profile 25 Test Data
- Test Data Values
  - Coding rules for encoding/decoding of communication
  - Wildcards for loose specification of test data values
  - Modification: test data value reuse
[Stereotype definitions: «LiteralAny» and «LiteralAnyOrNull» extend the metaclass LiteralSpecification; «Modification» extends the metaclass Dependency; «CodingRule» (coding: String) applies to the metaclasses Namespace, ValueSpecification and Property]

Example 26 Test Data
- Specify the test data to be used in the test context
  - Data: all animals in the shop
  - Restricted to two birds and two dogs
[Diagram: «DataPool» PetstoreAnimalPool with «DataPartition» Dog («DataSelector» getDog(): Animal) and «DataPartition» Bird («DataSelector» getBird(): Animal), each partition holding two Animal instances (name: String, type: String, price: Double); the pool is attached to «TestContext» PetstoreTestContext via «initialTestData»]
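A possible Java rendering of the data pool and its partitions follows; UTP leaves the realization of data pools open, so the round-robin selection strategy and most of the concrete animal data are assumptions (only „Fifi" the dog and the price 30.0 appear in the slides).

```java
import java.util.List;

// Sketch of «DataPool» PetstoreAnimalPool with the «DataPartition»s Dog and Bird.
class PetstoreAnimalPool {

    // Two instances per partition, as in the diagram's multiplicities.
    private final List<Animal> dogs = List.of(
            new Animal("Fifi", "Dog", 30.0),
            new Animal("Rex", "Dog", 45.0));       // second dog invented for illustration
    private final List<Animal> birds = List.of(
            new Animal("Tweety", "Bird", 12.0),    // bird data invented for illustration
            new Animal("Polly", "Bird", 15.0));

    private int dogIndex = 0;
    private int birdIndex = 0;

    // «DataSelector» getDog(): Animal — returns the next dog from the partition.
    Animal getDog() { return dogs.get(dogIndex++ % dogs.size()); }

    // «DataSelector» getBird(): Animal — returns the next bird from the partition.
    Animal getBird() { return birds.get(birdIndex++ % birds.size()); }
}
```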

Example 27 Test Data
- Use modification to re-use InstanceSpecifications
  - The dog specification „fifi" is incomplete (missing price)
  - Full specification through modification
  - No cyclic modification allowed
[Diagram: class Animal (name: String, type: String, price: Double); instance fifi: Animal (name = „Fifi", type = „Dog"); instances streetFifi: Animal (price = 30.0) and luxusFifi: Animal (price not shown) each «modifies» fifi]
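In code, a modification amounts to reusing a partially specified instance and completing or overriding individual values. A minimal sketch under that reading; copying via the constructor is an assumption, since UTP only prescribes the reuse relationship, not the mechanism.

```java
class ModificationExample {
    public static void main(String[] args) {
        // Base (incomplete) specification: fifi's price is not specified on the slide,
        // so 0.0 is only a stand-in here.
        Animal fifi = new Animal("Fifi", "Dog", 0.0);

        // «modifies» fifi: reuse its values and complete only the price.
        Animal streetFifi = new Animal(fifi.name, fifi.type, 30.0);
        Animal luxusFifi  = new Animal(fifi.name, fifi.type, 300.0);  // price invented; the slide leaves it open

        // Note: UTP forbids cyclic modification chains.
        System.out.println(streetFifi.name + " costs " + streetFifi.price);
        System.out.println(luxusFifi.name + " costs " + luxusFifi.price);
    }
}
```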

UML Testing Profile 28 Overview
[UTP overview diagram as before, highlighting Test Management]

Native UML Profile 29 Test Management
- Test Planning and Scheduling: specify the objective of test cases
- Test Monitoring and Control: re-use the already introduced behavior
- Test Result Analysis: logging of test behavior
[Stereotype definitions: «TestObjectiveSpecification» (id: String, text: String, priority: ValueSpecification [0..1], reference: String [*]) extends the metaclass Class; «TestLog» (tester: ValueSpecification [0..1], executedAt: Timepoint [0..1], duration: Duration [0..1], verdict: Verdict, verdictReason: ValueSpecification [*]) extends the metaclass Behavior; «TestLogApplication» extends the metaclass Dependency; «TestLogEntry» (timestamp: Timepoint) extends the metaclass OccurrenceSpecification]

Native UML Profile 30 Test Management (continued)
[Same content as the previous slide, additionally introducing «ManagedElement» (owner: String [0..1], description: String [0..1], version: String [0..1]) extending the metaclass Class]

Example 31 Test Management
- Test Objective Specification
  - Textual description of the objective
  - Multiple ways to connect it with test cases or test contexts (e.g., «trace», «realization»)
[Diagram: «enumeration» TestObjectivePriority (High, Medium, Low); «TestObjectiveSpecification» TestObjective_01 (id = TO_01, text = „Ensure the correct querying of all items for the entity manager", priority = High, reference = „See DoW, Section 2.1"); «TestObjectiveSpecification» TestObjective_02 (id = TO_02, text = „Ensure the correct creation of categories through the catalog service", priority = Medium); both objectives connected via «trace» and «realization» dependencies to «TestContext» PetstoreTestContext with its «TestCase» operations queryAllItems_shouldReturn5() and catalogService_shouldPersist()]
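The test management stereotypes are essentially structured data attached to the model; purely as an illustration (not a UTP-prescribed realization), a test log carrying the attributes listed two slides earlier could be captured in Java like this, reusing the Verdict enum sketched for the type library.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.List;

// Sketch of a «TestLog» as a plain data record.
record TestLog(
        String tester,               // tester: ValueSpecification [0..1]
        Instant executedAt,          // executedAt: Timepoint [0..1]
        Duration duration,           // duration: Duration [0..1]
        Verdict verdict,             // verdict: Verdict
        List<String> verdictReason   // verdictReason: ValueSpecification [*]
) {}

// Example usage (values are made up):
// new TestLog("tester-1", Instant.now(), Duration.ofMillis(42), Verdict.PASS, List.of());
```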

UML Testing Profile 32 Overview
[UTP overview diagram as before, shown in full]

Summary 33 Key Points of UTP
- A profile for the creation, documentation, visualization, specification and exchange of model-based test specifications
- Focus on black-box testing
- Recently more active work to improve the UTP
[Diagram: Requirements, System Code and Test Code related by «derive», «generate», «test» and «reference»]

Thank you!
Martin Fleck
Business Informatics Group, Institute of Software Technology and Interactive Systems, Vienna University of Technology
Favoritenstraße 9-11/1883, 1040 Vienna, Austria
phone: 43 (1) (secretary), fax: 43 (1)