January 2004, WOSP'2004 — UML and SPE, Part II: The UML Performance Profile. Dr. Dorina Petriu, Carleton University, Department of Systems and Computer Engineering.

Slide 1: UML and SPE Part II: The UML Performance Profile. Dr. Dorina Petriu, Carleton University, Department of Systems and Computer Engineering, Ottawa, Canada, K1S 5B6.

Slide 2: Outline
- UML Profile for Schedulability, Performance and Time
  - General Resource Model
  - General Time Modeling
  - General Concurrency Model
  - Schedulability Profile
  - Performance Profile
- Applying the Performance Profile
- Generation of performance models from UML specs
- Validation
- Directions for future work

Slide 3: UML Profile for Schedulability, Performance and Time

Slide 4: UML Profiles
- A package of related specializations of general UML concepts that capture domain-specific variations and usage patterns
  - a domain-specific interpretation of UML
- Fully conformant with the UML standard
  - does not extend the UML metamodel
  - uses only standard extension mechanisms: stereotypes, tagged values, constraints
  - additional semantic constraints cannot contradict the general UML semantics
  - a profile stays within the "semantic envelope" defined by the standard (diagram: Profile X and Profile Y nested inside the standard UML semantics)

Slide 5: Guiding Principles for the SPT Profile
- Adopted as an OMG standard, Version 1 (OMG document formal/….pdf)
- Ability to specify quantitative information directly in UML models
  - key to quantitative analysis and predictive modeling
- Flexibility:
  - users can model their real-time systems using modeling approaches and styles of their own choosing
  - open to existing and new analysis techniques
- Facilitate the use of analysis methods
  - eliminate the need for a deep understanding of analysis methods
  - as much as possible, automate the generation of analysis models and the analysis process itself
- Using analysis results for:
  - predicting system characteristics (detect problems early)
  - analyzing an existing system (sizing, capacity planning)

Slide 6: UML Profile Mechanism
The UML profile mechanism provides a way of specializing the concepts defined in the UML standard according to a specific domain model:
- a stereotype can be viewed as a subclass of an existing UML concept
- a tagged value associated with the stereotype can be viewed as an attribute
The domain model often shows associations between domain concepts, but the UML extension mechanisms do not provide a convenient way of specifying new associations in the metamodel. The following general techniques are used:
- some domain associations map directly to existing associations in the metamodel
- some domain composition associations map to tags associated with the stereotype
- in some cases, a domain association is represented by using the «…» relationship provided by the UML profile mechanisms

Slide 7: Specializing UML: Stereotypes
- Possible to add semantics to any standard UML concept
  - must not violate standard UML semantics
- Example (diagram): «PAhost» as a stereotype of the Node UML metaclass, with added semantics: a processing resource with a scheduling policy, processing rate, context-switching time, utilization, throughput, etc. A tagged value is associated with the «PAhost» stereotype, e.g.:
  «PAhost» MyProcessor {PAschdPolicy = 'FIFO'}

Slide 8: General Process for Model Analysis (diagram spanning the software domain and the schedulability/performance domain)

Slide 9: UML SPT Profile Structure (package diagram)
- Infrastructure models (General Resource Modeling Framework): «profile» RTresourceModeling, «profile» RTconcurrencyModeling, «profile» RTtimeModeling
- Analysis models: «profile» SAProfile, «profile» PAprofile, «profile» RSAprofile, each with «import» dependencies on the infrastructure profiles
- «modelLibrary» RealTimeCORBAModel

Slide 10: SPT Profile: General Resource Model

Slide 11: Quality of Service Concepts
- Quality of Service (QoS): a specification (usually quantitative) of how well a particular service is (to be) performed
  - e.g. throughput, capacity, response time, jitter
- Resource: an element whose service capacity is limited, directly or indirectly, by the finite capacities of the underlying physical elements
- The specification of a resource can include:
  - offered QoS: the QoS that it provides to its clients
  - required QoS: the QoS it requires from other components to support its QoS obligations
- Key analysis question: is requiredQoS ≤ offeredQoS?
  - why it is difficult to answer: the QoS offered by a resource depends not only on the resource capacity itself, but also on the load (i.e., on the competition from other clients for the same resource)
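The load-dependence of offered QoS can be made concrete with a textbook M/M/1 queue (an editorial illustration, not part of the profile: the function names and the choice of an M/M/1 model are mine). The offered response time grows with the arrival rate, so whether requiredQoS ≤ offeredQoS holds depends on the load, not just on the raw service time:

```python
def mm1_response_time(service_time, arrival_rate):
    """Mean response time of an M/M/1 server, or None if saturated.

    Illustrates how the QoS a resource offers degrades as competing
    clients raise the load on it.
    """
    rho = arrival_rate * service_time  # utilization
    if rho >= 1.0:
        return None  # saturated: delays grow without bound
    return service_time / (1.0 - rho)

def qos_satisfied(required_resp_time, service_time, arrival_rate):
    """Check the key analysis question: requiredQoS <= offeredQoS?"""
    offered = mm1_response_time(service_time, arrival_rate)
    return offered is not None and offered <= required_resp_time
```

At low load (rate 0.5 with a 1 ms service time) the offered mean response time is 2 ms; at 99% utilization it is 100 ms, so the same required QoS of 3 ms is met in the first case and violated in the second.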

Slide 12: The General Resource Model (package structure)
- CoreResourceModel
- ResourceUsageModel, containing DynamicUsageModel and StaticUsageModel
- CausalityModel
- ResourceTypes
- ResourceManagement
- RealizationModel

Slide 13: Core Resource Model (Domain Model)
- Parallel hierarchy of instances and descriptors (class diagram): Descriptor is the type of Instance; Resource is the type of ResourceInstance; ResourceService is the type of ResourceServiceInstance; a Resource offers ResourceServices, and QoSvalues (typed by QoScharacteristics) describe the offered QoS.

Slide 14: Basic Resource Usage Model
- Resource usage describes how a set of clients uses a set of resources and their instances:
  - static usage: who uses what
  - dynamic usage: who uses what and how (order, timing, etc.)
- (Class diagram: an AnalysisContext contains ResourceUsages — StaticUsage and DynamicUsage — relating the used ResourceInstances and used ResourceServiceInstances, with a UsageDemand workload and EventOccurences from the CausalityModel.)

Slide 15: Static Usage Model
- Static usage: who uses what (order does not matter)
- (Class diagram: a Client — an Instance from the CoreResourceModel — uses ResourceInstances; the client states its required QoS and the resources state their offered QoS as QoSvalues, typed by QoScharacteristics.)

Slide 16: Basic Causality Loop
- Used in modeling dynamic scenarios; it captures the essentials of the cause-effect chain in the behaviour of run-time instances.
- (Class diagram: a StimulusGeneration is the cause of a Stimulus; its StimulusReception is the cause of a Scenario executed on a receiver Instance and an execution host; EventOccurences tie the chain together, closing the loop.)

Slide 17: Dynamic Usage Model
- Dynamic usage: who uses what and how (order, timing, etc.)
- (Class diagram: a Scenario from the CausalityModel is composed of ordered ActionExecution steps linked by predecessor/successor associations; the scenario's used resources and used services carry offered and required QoSvalues.)
- ActionExecution is the only modification to the UML metamodel proposed in the profile.

Slide 18: Proposed Changes to the UML Metamodel
- The change is reflected in UML version 1.5:
  - addition of a new metamodel element called ActionExecution
  - replacement of the association between Action and Stimulus by a semantically similar association between Stimulus and ActionExecution
  - addition of two new associations: between Action and ActionExecution, and between ActionExecution and Instance, respectively

Slide 19: Categories of Resources
- Categorization of resources (class diagram): a ResourceInstance is classified along three dimensions:
  - activenessKind: ActiveResource vs. PassiveResource
  - protectionKind: ProtectedResource vs. UnprotectedResource
  - purposeKind: CommunicationResource, Processor, Device
- A given resource instance may belong to more than one type.

Slide 20: Exclusive Use Resources and Actions
- To use an exclusive service of a protected resource, the following actions must be executed before and after:
  - an acquire-service action (granted according to an access control policy)
  - a release-service action
- (Class diagram: a ProtectedResource has an AccessControlPolicy and ExclusiveServices; AcquireService — with an isBlocking : Boolean attribute — and ReleaseService bracket each use, and both are ActionExecutions from the DynamicUsageModel.)
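The acquire/release bracketing (and the isBlocking flag of AcquireService) maps directly onto an ordinary mutex. A minimal sketch of my own, using Python's threading.Lock as the access control mechanism (class and method names are illustrative, not from the profile):

```python
import threading
from contextlib import contextmanager

class ProtectedResource:
    """A resource whose exclusive service must be bracketed by
    acquire/release actions, as on the slide above."""

    def __init__(self):
        self._lock = threading.Lock()  # simple mutual-exclusion policy

    def acquire_service(self, is_blocking=True):
        # AcquireService: returns True iff access was granted
        return self._lock.acquire(blocking=is_blocking)

    def release_service(self):
        # ReleaseService: must follow a successful acquire
        self._lock.release()

    @contextmanager
    def exclusive(self):
        # Guarantees the release action even if the use raises
        self.acquire_service()
        try:
            yield self
        finally:
            self.release_service()
```

A non-blocking AcquireService on a busy resource simply reports failure instead of waiting, which is the behaviour the isBlocking attribute distinguishes.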

Slide 21: Resource Management Model
- Utility package for modeling resource management services:
  - resource broker: administers the access control policy
  - resource manager: creates and keeps track of resources according to a resource control policy
- (Class diagram: ResourceBroker and ResourceManager are Instances; the manager's managed resources are ResourceInstances governed by a ResourceControlPolicy, and the broker applies the AccessControlPolicy from ResourceTypes.)

Slide 22: SPT Profile: General Time Modeling

Slide 23: General Time Model
- Concepts for modeling time and time values (TimeModel)
- Concepts for modeling events in time and time-related stimuli (TimedEvents)
- Concepts for modeling timing mechanisms, such as clocks and timers (TimingMechanisms)
- Concepts for modeling timing services, such as those found in real-time operating systems (TimingServices)

Slide 24: Physical and Measured Time
- dense time: corresponds to the continuous model of physical time (represented by real numbers)
- discrete time: time broken into quanta (represented by integers)

Slide 25: Timing Mechanisms: Clock and Timer
- (Class diagram.) A TimingMechanism is a ResourceInstance with stability, drift and skew attributes; a resolution, current value, maximal value and origin (all TimeValues); and operations set(time : TimeValue), get() : TimeValue, reset(), start() and pause().
- Clock: has an accuracy, an offset and a reference clock; generates ClockInterrupts.
- Timer: has a TimeInterval duration and an isPeriodic : Boolean attribute; generates Timeouts.
- TimedEvents carry one or more TimeValue timestamps.

Slide 26: Timed Stimuli
- In UML, events are assumed to occur instantaneously.
- (Class diagram: a TimedStimulus is a Stimulus with associated start, end and time TimeValues; Timeout and ClockInterrupt are causes of StimulusGenerations. The two derived associations come from the general association between EventOccurrence and Stimulus.)

Slide 27: Timed Events and Timed Actions
- Timed event: has one or more timestamps (according to different clocks)
- Timed action: an action that takes a certain time to complete, with known start time, end time and duration
- (Class diagram: a TimedEvent is an EventOccurence with TimeValue timestamps; a TimedAction is a Scenario with start and end TimeValues and a TimeInterval duration; Delay is a kind of TimedAction.)

Slide 28: Notation: Timing Marks and Constraints
- A timing mark identifies the time of an event occurrence:
  - on messages: sendTime(), receiveTime()
  - on action blocks: startTime(), endTime()
- Example (sequence diagram between a Caller and an Operator, exchanging call and ack messages): the timing marks call.sendTime(), call.receiveTime(), callHandler.startTime(), callHandler.endTime(), ack.sendTime() and ack.receiveTime() anchor a timing constraint such as:
  {ack.receiveTime() - call.sendTime() < 10 sec}

Slide 29: Specifying Time Values
Time values can be represented by a special stereotype of Value («RTtimeValue») in different formats, e.g.:
- 12:04 (time of day)
- 5.3, 'ms' (time interval)
- 2000/10/27 (date)
- Wed (day of week)
- $param, 'ms' (parameterized value)
- 'poisson', 5.4, 'sec' (time value with a Poisson distribution)
- 'histogram', 0, …, 0.28, 3, 'ms' (a histogram over 0-3 ms; the accompanying figure shows interval probabilities P=0.28, P=0.44, …)
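Since several of these formats are just comma-separated literals, a tool can read them with very little machinery. A possible helper of mine (the function name and the returned dictionary shape are assumptions; clock-time forms like 12:04 and parameterized values like $param are deliberately out of scope here):

```python
import ast

# Distribution keywords this sketch recognizes in the first field
_DISTRIBUTIONS = {'poisson', 'histogram', 'exponential', 'uniform', 'normal'}

def parse_time_value(text):
    """Parse a comma-separated «RTtimeValue»-style string of literals,
    e.g. "5.3, 'ms'" or "'poisson', 5.4, 'sec'"."""
    # Wrap in parentheses so literal_eval sees a tuple of fields.
    fields = ast.literal_eval('(%s,)' % text)
    if isinstance(fields[0], str) and fields[0] in _DISTRIBUTIONS:
        return {'distribution': fields[0],
                'params': fields[1:-1],   # everything between name and unit
                'unit': fields[-1]}
    return {'value': fields[0],
            'unit': fields[1] if len(fields) > 1 else None}
```

For example, "'poisson', 5.4, 'sec'" comes back as a distribution named poisson with parameter 5.4 and unit 'sec', while "5.3, 'ms'" is a plain interval.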

Slide 30: Specifying Arrival Patterns
- Method for specifying standard arrival pattern values:
  - Bounded: 'bounded', <…>, <…>
  - Bursty: 'bursty', <…>
  - Irregular: 'irregular', <…> [, <…>]*
  - Periodic: 'periodic', <…> [, <…>]
  - Unbounded: 'unbounded', <…>
- Probability distributions supported: Bernoulli, Binomial, Exponential, Gamma, Geometric, Histogram, Normal, Poisson, Uniform
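To see what two of these patterns mean operationally, here is a sketch of mine that samples interarrival times: a periodic pattern produces a constant spacing, while Poisson arrivals have exponentially distributed interarrival times (the parameter layout — name followed by a single period or mean — is an assumption for the illustration):

```python
import random

def interarrival_times(pattern, n, seed=0):
    """Sample n interarrival times for a named arrival pattern.
    Only ('periodic', period) and ('poisson', mean) are sketched here."""
    rng = random.Random(seed)
    kind = pattern[0]
    if kind == 'periodic':
        period = pattern[1]
        return [period] * n          # constant spacing, no jitter modeled
    if kind == 'poisson':
        mean = pattern[1]            # Poisson arrivals -> exponential gaps
        return [rng.expovariate(1.0 / mean) for _ in range(n)]
    raise ValueError('unsupported pattern: %r' % (kind,))
```

With a mean of 5 time units, a long Poisson sample averages close to 5, whereas ('periodic', 100) yields exactly 100 between every pair of arrivals.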

Slide 31: General Concurrency Model
- A domain model rich enough to cover concurrently executing and communicating objects; used in applications, operating systems, etc.

Slide 32: Schedulability Profile

Slide 33: Background
- Schedulability analysis: applied to hard real-time systems to find a schedule that will allow the system to meet all its deadlines.
- Schedulable entities: granular parts of an application that contend for use of the execution resource that executes the schedule.
- Schedule: an assignment of all the schedulable entities in the system to available processors, produced by the scheduler.
- Scheduler: implements scheduling algorithms and resource arbitration policies.
- Scheduling policy: a methodology used to establish the order of schedulable-entity (e.g. process or thread) execution.
  - e.g. rate monotonic, earliest deadline first, minimum laxity first, maximize accrued utility, minimum slack time
- Scheduling mechanism: an implementation technique used by a scheduler to decide the order in which threads are chosen for execution.
  - e.g. fixed-priority schedulers, earliest-deadline schedulers
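For the rate-monotonic policy mentioned above, the classic Liu-and-Layland utilization bound gives a quick sufficient (but not necessary) schedulability test. This is a standard textbook check, included here as an editorial illustration rather than anything defined by the SPT profile:

```python
def rm_utilization_test(tasks):
    """Sufficient rate-monotonic schedulability test.

    tasks: list of (execution_time, period) pairs for periodic tasks
    with deadlines equal to their periods.
    Returns (schedulable, utilization, bound); the task set is certainly
    schedulable under RM if utilization <= n * (2^(1/n) - 1).
    """
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)
    return utilization <= bound, utilization, bound
```

Failing the bound does not prove the set unschedulable; an exact response-time analysis would then be needed, which is what the SAWorstCase results on the later telemetry example represent.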

Slide 34: Schedulability Core Domain Model (class diagram)

Slide 35: Example: Telemetry System Specification

Slide 36: Real-time Situation Modeled as a Sequence Diagram (SD)

Slide 37: Real-time Situation Modeled as a Collaboration Diagram (CD)
«SASituation» with objects: Sensors :SensorInterface, TelemetryGatherer :DataGatherer, TelemetryProcessor :DataProcessor, TelemetryDisplayer :DataDisplayer, Display :DisplayInterface, TGClock :Clock, and «SAResource» SensorData :RawDataStorage {SACapacity=1, SAAccessControl=PriorityInheritance}. The concurrent objects are «SASchedulable».
- A.1: gatherData() — «SATrigger» {SASchedulable=$R1, RTat=('periodic',100,'ms')}, «SAResponse» {SAAbsDeadline=(100,'ms'), SAPriority=2, SAWorstCase=(93,'ms'), RTduration=(33.5,'ms')}; sub-steps A.1.1: main(), A.1.1.1: writeStorage() «SAAction» {RTstart=(16.5,'ms'), RTend=(33.5,'ms')}
- B.1: filterData() — «SATrigger» {SASchedulable=$R2, RTat=('periodic',60,'ms')}, «SAResponse» {SAAbsDeadline=(60,'ms'), SAPriority=1, SAWorstCase=(50.5,'ms'), RTduration=(12.5,'ms')}; sub-steps B.1.1: main(), B.1.1.1: readStorage() «SAAction» {RTstart=(3,'ms'), RTend=(5,'ms')}
- C.1: displayData() — «SATrigger» {SASchedulable=$R3, RTat=('periodic',200,'ms')}, «SAResponse» {SAAbsDeadline=(200,'ms'), SAPriority=3, SAWorstCase=(177,'ms'), RTduration=(46.5,'ms')}; sub-steps C.1.1: main(), C.1.1.1: readStorage() «SAAction» {RTstart=(10,'ms'), RTend=(31.5,'ms')}
The SAWorstCase values are analysis results.

Slide 38: Performance Profile

Slide 39: Background
- Scenarios define execution paths whose end points are externally visible. QoS requirements can be placed on scenarios.
- Each scenario is executed by a workload:
  - open workload: requests arriving in some predetermined pattern
  - closed workload: a fixed number of active or potential users or jobs
- Scenario steps: the elements of a scenario, joined by predecessor-successor relationships which may include forks, joins and loops.
  - a step may be an elementary operation or a whole sub-scenario
- Resources are used by scenario steps. Quantitative resource demands for each step must be given by performance annotations.
  - Important note: the main reason we build and solve performance models is to compute the additional delays due to the competition for resources by different concurrent users!
- Performance results for a system include resource utilizations, waiting times, response times and throughputs.
- Performance analysis is applied to real-time systems with stochastic characteristics and soft deadlines (using mean value analysis methods).
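The mean value analysis mentioned in the last bullet can be surprisingly compact for a single-class closed workload. The following is the standard exact MVA recursion, added here as an illustration (it is not the LQN solver the tutorial uses, and the function name is mine): given a total service demand per station and a fixed customer population, it returns system throughput and per-station response times.

```python
def mva(service_demands, n_customers, think_time=0.0):
    """Exact single-class Mean Value Analysis for a closed queueing
    network of simple queueing stations.

    service_demands[k]: total demand D_k at station k (visits x service).
    think_time: the closed workload's external delay (Z).
    Returns (throughput, per_station_response_times).
    """
    if n_customers < 1:
        raise ValueError('population must be at least 1')
    K = len(service_demands)
    q = [0.0] * K  # mean queue lengths at population n - 1
    for n in range(1, n_customers + 1):
        # Arrival theorem: a newcomer sees the (n-1)-customer queue lengths.
        r = [service_demands[k] * (1.0 + q[k]) for k in range(K)]
        x = n / (think_time + sum(r))      # system throughput
        q = [x * r[k] for k in range(K)]   # Little's law per station
    return x, r
```

With one station of demand 1 and one customer, the throughput is 1 and the response time is 1; adding a second customer to the same single-station network doubles the response time while the throughput stays at the station's capacity, which is exactly the contention delay the slide says performance models exist to compute.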

Slide 40: Performance Domain Model (class diagram)
- PerformanceContext: associates workloads, scenarios and resources
- Workload (responseTime, priority), specialized into:
  - ClosedWorkload (population, externalDelay)
  - OpenWorkload (occurrencePattern)
- PScenario (hostExecutionDemand, responseTime): has a root PStep and ordered steps
- PStep (probability, repetition, delay, operations, interval, executionTime), with predecessor/successor associations; each step runs on a host
- PResource (utilization, schedulingPolicy, throughput), specialized into:
  - PProcessingResource (processingRate, contextSwitchTime, priorityRange, isPreemptible)
  - PPassiveResource (waitingTime, responseTime, capacity, accessTime)
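The PScenario/PStep part of this domain model is essentially an annotated step graph, which is easy to mirror in code. A sketch of mine (class and field names follow the slide's attributes loosely; the aggregation rule and the acyclic-graph assumption are simplifications for illustration):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PStep:
    """One scenario step, mirroring a subset of the PStep attributes:
    a mean host demand plus the probability and repetition controls."""
    name: str
    demand_ms: float = 0.0
    probability: float = 1.0   # PAprob-like branch probability
    repetition: int = 1        # PArep-like loop count
    successors: List["PStep"] = field(default_factory=list)

def mean_demand(step: PStep) -> float:
    """Expected total host demand of the scenario rooted at `step`.
    Assumes the predecessor/successor graph is a tree (no shared steps)."""
    own = step.demand_ms * step.repetition * step.probability
    return own + sum(mean_demand(s) for s in step.successors)
```

For instance, a 3 ms check step followed by a certain 2 ms write and a 1.5 ms read taken with probability 0.4 has an expected demand of 3 + 2 + 0.6 = 5.6 ms, the kind of per-scenario demand a performance model is parameterized with.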

Slide 41: Specifying Performance Values
- A complex structured string with the following format:
  <PA-value> ::= "(" <source-modifier> "," <type-modifier> "," <time-value> ")"
  where:
  - <source-modifier> ::= 'req' | 'assm' | 'pred' | 'msr' (required, assumed, predicted, measured)
  - <type-modifier> ::= 'mean' | 'sigma' | 'kth-mom', <k> | 'max' | 'percentile' | 'dist'
  - <time-value> is a time value described by the RTtimeValue type
- A single characteristic may combine more than one performance value:
  <PA-values> ::= <PA-value> [ <PA-value> ]*
- Examples:
  {PAdemand = ('pred', 'mean', (20, 'ms'))}
  {PArespTime = ('req', 'mean', (1, 'sec'))   — required
                ('pred', 'mean', $RespT)}     — predicted: analysis result
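Because each PA value is a parenthesized tuple of literals and values may be juxtaposed, a small parser can turn an annotation string into structured data. This is a sketch of mine, not tooling from the tutorial; it accepts both the 'assm' spelling on this slide and the 'asmd' spelling used in the later example diagrams, and it leaves $-parameters and 4-field percentile forms to the caller:

```python
import ast

# Source modifiers; both spellings of "assumed" appear in the tutorial.
SOURCES = {'req', 'assm', 'asmd', 'pred', 'msr'}

def parse_perf_values(text):
    """Parse a PA tag value such as
    "('req', 'mean', (1, 'sec')) ('pred', 'mean', (2, 'sec'))"
    into a list of tuples, one per performance value."""
    # Juxtaposed values become elements of one Python tuple-of-tuples.
    joined = text.replace(') (', '), (')
    values = ast.literal_eval('(%s,)' % joined)
    for v in values:
        if v[0] not in SOURCES:
            raise ValueError('unknown source modifier: %r' % (v[0],))
    return list(values)
```

A combined requirement/prediction pair like the PArespTime example then parses into two tuples whose first fields identify which is which.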

Slide 42: Performance Stereotypes (1)

Stereotype | Applies To | Tags | Description
«PAclosedLoad» | Action, ActionExecution, Stimulus, Message, Method, … | PArespTime [0..*], PApriority [0..1], PApopulation [0..1], PAextDelay [0..1] | A closed workload
«PAcontext» | Collaboration, CollaborationInstanceSet, ActivityGraph | — | A performance analysis context
«PAhost» | Classifier, Node, ClassifierRole, Instance, Partition | PAutilization [0..*], PAschdPolicy [0..1], PArate [0..1], PActxtSwT [0..1], PAprioRange [0..1], PApreemptible [0..1], PAthroughput [0..1] | A processor
«PAopenLoad» | Action, ActionExecution, Stimulus, Message, Method, … | PArespTime [0..*], PApriority [0..1], PAoccurrence [0..1] | An open workload

Slide 43: Performance Stereotypes (2)

Stereotype | Applies To | Tags | Description
«PAresource» | Classifier, Node, ClassifierRole, Instance, Partition | PAutilization [0..*], PAschdPolicy [0..1], PAcapacity [0..1], PAmaxTime [0..1], PArespTime [0..1], PAwaitTime [0..1], PAthroughput [0..1] | A passive resource
«PAstep» | Message, ActionState, Stimulus, SubactivityState | PAdemand [0..1], PArespTime [0..1], PAprob [0..1], PArep [0..1], PAdelay [0..1], PAextOp [0..1], PAinterval [0..1] | A step in a scenario

Slide 44: Applying the Performance Profile

Slide 45: What is represented in the UML model?
A UML model should represent the following aspects in order to support early performance analysis:
- key use cases described by representative scenarios
  - frequently executed scenarios with performance constraints
  - performance requirements for each scenario can be defined by the user
    - examples: end-to-end response time, throughput, jitter, etc.
- resources used by each scenario
  - reason why needed: contention for resources has a strong performance impact
  - resources may be active or passive, physical or logical, hardware or software
    - examples: processor, disk, process, software server, lock, buffer
  - quantitative resource demands must be given for each scenario step
    - how much? how many times?
- workload intensity for each scenario (scenario set)
  - open workload: arrival rate of requests for the scenario
  - closed workload: number of simultaneous users executing the scenario

Slide 46: Building Security System: Selected Use Cases

Slide 47: Deployment Diagram
- Processors: ApplicCPU and DB_CPU, connected by a LAN
- On ApplicCPU: the VideoAcquisition and AccessControl subsystems, the Access Controller, AcquireProc and StoreProc processes, and the Buffer Manager
- On DB_CPU: the Database, with its Disk
- Devices: SecurityCard Reader, DoorLock Actuator, Video Camera
- «PAresource» annotations: AccessControl {PAcapacity=2}; Buffer {PAcapacity=$Nbuf}

Slide 48: Acquire/Store Video Scenario (sequence diagram)
Participants: Video Controller, AcquireProc {PAcapacity=2}, BufferManager {PAcapacity=$Bufs}, StoreProc, Database.
Workload on the Video Controller: «PAclosedLoad» {PApopulation = 1, PAinterval = (('req','percentile',95,(1,'s')), ('pred','percentile',95,$Cycle))} — the 'req' entry is the requirement, the 'pred' entry the analysis result; the controller's per-cycle overhead is a «PAstep» {PAdemand=('asmd','mean',(1.8,'ms'))}.
The loop procOneImage(i) is a «PAstep» {PArep=$N, PAdemand=('asmd','mean',(1.5,'ms'))} containing the steps («PAstep», demands ('asmd','mean')):
- getBuffer() / allocBuf(b) — (0.5,'ms') / (0.5,'ms')
- getImage(i, b) — ($P * 1.5,'ms'), with PAextOp=(network, $P)
- passImage(i, b) — (0.9,'ms')
- storeImage(i, b) — (1.1,'ms')
- store(i, b) — (2,'ms')
- writeImg(i, b) — ($B * 0.9,'ms'), with PAextOp=(writeBlock, $B)
- releaseBuf(b) — (0.5,'ms'); freeBuf(b) — (0.2,'ms')

Slide 49: Access Control Scenario (sequence diagram)
Participants: User, CardReader, Access Controller, DoorLock, Alarm, Database, Disk.
Open workload on the user's request: «PAopenLoad» {PAoccurrencePattern = ('poisson', 30, 's'), PArespTime = (('req','percentile',95,(1,'s')), ('pred','percentile',95,$UserR))} — the 'req' entry is the requirement, the 'pred' entry the analysis result.
Message flow: readCard; admit(cardInfo); getRights(); readRights(); [not_in_cache] readData() — «PAstep» {PAdemand=('asmd','mean',(1.5,'ms')), PAprob=0.4}, with {PAextOp=(read, 1)} on the disk; checkRights(); [OK] openDoor() — «PAstep» {PAdelay=('asmd','mean',(500,'ms')), PAprob=1}; [not OK] alarm() — «PAstep» {PAprob=0}; [need to log?] logEvent() — «PAstep» {PAprob=0.2, PAdemand=('asmd','mean',(0.2,'ms'))}; writeRec(); writeEvent(); enterBuilding.
Remaining step demands ('asmd','mean'): (3,'ms'), three steps of (1.8,'ms'), and (0.3,'ms').

Slide 50: Acquire/Store Video Scenario: Top-Level Activity Diagram
- VideoController swimlane: «PAclosedLoad» {PApopulation = 1, PAinterval = (('req','percentile',95,(1,'s')), ('pred','percentile',95,$Cycle))}
- Loop *[$N]: procOneImage — «PAstep» {PArep = $N}, a composite activity detailed on the next slide
- cycleOverhead — «PAstep» {PAdemand = ('asmd','mean',(1.8,'ms'))}

Slide 51: Composite Activity procOneImage (activity diagram)
Swimlanes: AcquireProc, BufferManager, StoreProc, Database; wait_SP and wait_DB mark where one partition waits for another.
Step demands («PAstep», all ('asmd','mean')):
- AcquireProc: getBuffer — (1.5,'ms'); getImage — ($P * 1.5,'ms'), with PAextOp=(network, $P); passImage — (0.9,'ms')
- BufferManager: allocBuf — (0.5,'ms'); releaseBuf — (0.5,'ms')
- StoreProc: storeImage — (1.1,'ms'); freeBuf — (0.2,'ms')
- Database: store — (2,'ms'); writeImg — ($B * 0.9,'ms'), with PAextOp=(writeBlock, $B)

Slide 52: LQN Model (figure)
Tasks, with entries as [phase-1 demand, phase-2 demand] in ms:
- VideoController: acquireLoop [1.8]
- AcquireProc (2 threads): procOneImage [1.5, 0]; AcquireProc2: getImage [12, 0], passImage [0.9, 0]
- BufferManager: alloc [0.5, 0] (bufEntry); BufMgr2: releaseBuf [0.5, 0]; Buffer (pool); Lock: lock [500, 0]
- StoreProc: storeImage [3.3, 0]
- DataBase: writeImg [7.2, 0], readRights [1.8, 0], writeRec [3, 0], readData [1.5, 0]; Disk: writeBlock [1, 0]
- Users (rate = 2/min); CardReader: readCard [1, 0]; AccessController: admit [3.9, 0.2], writeEvent [1.8, 0]; Alarm: alarm [0, 0]; Network: network [0, 1]
Processors: ApplicCPU, DB CPU, DiskP, LockP, AlarmP, CardP, UserP, NetP, Dummy.
Request arcs carry visit counts such as (1, 0), ($P, 0), ($N), ($B, 0), (0.4, 0), (0, 0.2) and (0, 1), with one forwarded call.

Slide 53: LQN Simulation Results (table)
Legend: Bufs — number of buffers; Cycle — mean camera scan cycle in sec; PCam — Prob(camera cycle > 1 sec); PDoor — Prob(door response > 1 sec).

Slide 54: Generation of Performance Models from UML Specs

Slide 55: Direct UML-to-LQN Transformation: Our First Approach
- LQN model structure (tasks, devices and their interconnections) from:
  - the UML model of the high-level software architecture
  - the UML deployment diagram
- LQN detailed elements (entries, phases, activities and parameters) from:
  - UML models of key scenarios with performance annotations
    - scenarios can be represented in UML by the software designers as sequence diagrams or activity diagrams
    - why we prefer activity diagrams in our transformation process: so far, UML sequence diagrams lack convenient constructs for representing branch/merge, fork/join, iteration and decomposition; other authors use extended sequence diagrams to overcome these deficiencies, but the UML metamodel and the present UML tools do not cover such extensions. Hopefully UML 2.0 will!
    - if sequence diagrams are given, we transform them automatically into activity diagrams, then proceed with the performance model derivation

Slide 56: Tool Interoperability
Forward path (implemented): UML tool → UML model (XML format) → UML-to-LQN transformation → performance model → LQN tool → analysis results.
Backward path (not done yet): import the performance results back into the UML model.

Slide 57: UML-to-LQN Transformation Algorithm
1. Generate the LQN model structure:
   1.1 determine LQN software tasks from the high-level components
   1.2 determine LQN hardware devices from the deployment diagram
2. Generate LQN details (entries, phases, activities) from scenarios:
   2.1 for each scenario, process the corresponding activity diagram(s):
       - match the inter-component communication style of the architectural pattern against the messages between components in the activity diagram
       - parse the activity diagram: divide it into partition subgraphs, then divide each partition further into nonterminal subgraphs
       - map the subgraphs to LQN entries, phases, and activities; create the respective LQN elements and compute their parameters
   2.2 merge the LQN submodels corresponding to the different scenarios
3. Traverse the LQN graph and write out the textual model description.
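The numbered steps above can be skeletonized as follows; the data shapes and names are invented for illustration and do not come from the actual tool:

```python
# Skeleton of the UML-to-LQN transformation steps; all structures
# are illustrative stand-ins for the real model elements.

def uml_to_lqn(components, deployment, scenarios):
    lqn = {"tasks": [], "devices": []}

    # Step 1: generate the LQN model structure.
    for comp in components:                       # 1.1 software tasks
        lqn["tasks"].append({"name": comp, "entries": []})
    for node in deployment:                       # 1.2 hardware devices
        lqn["devices"].append({"name": node})

    # Step 2: generate entries from each scenario's activity diagram.
    for scenario in scenarios:
        for partition in scenario["partitions"]:  # one per swimlane
            task = next(t for t in lqn["tasks"]
                        if t["name"] == partition["component"])
            # 2.1: each parsed subgraph becomes an LQN entry whose
            # service demand sums the demands of its steps.
            for subgraph in partition["subgraphs"]:
                task["entries"].append({
                    "name": subgraph["name"],
                    "demand_ms": sum(s["demand_ms"]
                                     for s in subgraph["steps"]),
                })
    # 2.2 (submodel merging) and 3 (textual output) are omitted
    # from this sketch.
    return lqn
```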

Slide 58: Generating the LQN Model Structure
a) High-level architecture: Client, WebServer, and Database components.
b) Deployment diagram: ProcC (client) connected through a Modem and the Internet to ProcS (server), which reaches ProcDB and Disk1 over a LAN.
c) Generated LQN model structure: software tasks are generated for the high-level software components according to the architectural patterns used; hardware tasks are generated for the devices in the deployment diagram.

Slide 59: Parsing Activity Diagrams
- For the ad-hoc Java transformation, a graph grammar was defined and a top-down parsing algorithm was developed:
  - Input: a graph of meta-objects representing a UML model (in particular, activity diagrams with swimlanes and collaborations)
  - Output: a parsing subtree for each swimlane (i.e., concurrent component) showing:
    - compliance with the architectural patterns used in the UML model
    - division of the activity diagram into sub-areas corresponding to LQN entries, phases, and activities
    - identification of the basic structures in each sub-area (sequence, branch, loop, etc.)
- The parsing tree is used to generate the following LQN elements:
  - lower-level components of LQN tasks: entries, phases, activities
  - LQN request arcs from phases and activities to entries
  - LQN parameters: service times and visit ratios.

Slide 60: Mapping AD to LQN Elements
[Two annotated activity diagrams with their LQN submodels:
a) Client-server with a blocking client: the client (entry e1) requests service and waits for the reply; the server's "serve request and reply" maps to entry e2, phase 1 (e2, ph1), and the optional "complete service" after the reply maps to phase 2 (e2, ph2).
b) Client-server with a non-blocking client: the client continues working while the request is served, so its steps map to activities a1..a4 within entry e1, connected by fork/join (&) nodes; the server side maps to e2 with phases ph1 and ph2 as before.]

Slide 61: Sample AD with Performance Annotations
[Activity diagram with partitions A, B, and C; steps t1..t18 plus init, fin, send, and receive; forks f2..f4, joins j1..j3, and wait points wait_b and wait_c (with placeholder steps undef_b and undef_c). Each step carries a «PAstep» annotation such as {PAdemand=('est', 'mean', 2, 'ms')}, with estimated mean demands between 1 and 3 ms; the closed workload is given by {PApopulation = $N}.]

Slide 62: Generating LQN Detailed Elements from the AD
[The sample activity diagram of the previous slide mapped to LQN elements: partition A becomes task A with entry e1 {4, 0}; partition B becomes task B with entry e2 and activities a1 {1.8}, a2 {4}, a3 {0}, a4 [e2] {2.5}, and a5 {1.5}, connected by fork/join (&) nodes; partition C becomes task C with entry e3 {3.8, 2.8}. The request arc from A to B carries (1, 0) visits.]

Slide 63: The PUMA Project
- PUMA: Performance from Unified Model Analysis (web page: )
- Goal: to develop a unified approach for building performance models from scenario-based software specifications.
- The proposed approach to support scenario-based performance engineering: a software design tool (UML, UCM, etc.) produces annotated design specs, which are translated to the Core Scenario Model (CSM, XML-based); a performance model is generated from the CSM and solved by a performance tool (LQN, QN, etc.), aided by sensitivity and optimization tools; the results are used to diagnose performance problems and to place recommendations back in the design context.

Slide 64: PUMA Project Highlights
- Scenario input based on the UML Profile for Schedulability, Performance and Time.
  - Other forms of scenario definition, either based on UML or on other languages such as Use Case Maps.
- Core Scenario Model to integrate a wide variety of scenario specifications:
  - in an XML-based language to be defined
  - close to the domain model of the UML Performance Profile
- Performance modeling using existing tools:
  - initially, Layered Queueing and Queueing Network model formats, to ensure scalability
  - simulation
- Design evaluation by model experimentation, with results fed back as measures attached to the design:
  - design suggestions
  - identified hot spots
  - software resource analysis.

Slide 65: Transformation Steps
Pipeline: UML tool → UML model (in XML format) → transformation from UML to CSM → Core Scenario Model (CSM) → transformation from CSM to performance model → performance model → performance model solver → results and recommendations, fed back to the UML tool as analysis results.
Two kinds of performance models have been generated so far:
1. LQN model
2. CSIM simulation model (C/C++ code)
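The LQN solver consumes a textual model description (step 3 of the earlier algorithm writes one out). The sketch below emits a toy textual form from a task/entry structure; the real LQN file syntax is considerably richer, so this is only an illustrative stand-in:

```python
def write_lqn_text(tasks):
    """Emit a toy textual description of an LQN task/entry structure.
    Illustrative only: the real LQN file format is much richer."""
    lines = []
    for task in tasks:
        lines.append(f"task {task['name']}")
        for entry in task["entries"]:
            # One line per entry with its total host demand.
            lines.append(f"  entry {entry['name']} "
                         f"demand={entry['demand_ms']}ms")
    return "\n".join(lines)

text = write_lqn_text([
    {"name": "Server",
     "entries": [{"name": "eServer", "demand_ms": 3.3}]},
])
# text:
# task Server
#   entry eServer demand=3.3ms
```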

Slide 66: Core Scenario Model
- Requirements:
  - simpler than the UML model
  - must contain all the data needed for generating performance models
- Closely reflects the elements of the Performance Profile:
  - different kinds of resources
  - scenarios, scenario steps, and their resource demands
  - workload
- Challenges for the transformation from UML to CSM:
  - extract only the UML model elements important for building the performance model
  - process multiple UML diagrams and multiple scenarios
  - recognize whether the UML model is incomplete or inconsistent from the performance-annotation viewpoint
  - generate a correct XML file conforming to the CSM schema
  - find a UML tool that properly exports a UML model in XML format according to the full XMI definition.
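To make those requirements concrete, here is a hedged sketch of the kind of in-memory structure a CSM corresponds to: resources, scenario steps with host demands, and a workload. The real CSM is defined by an XML schema; these class and attribute names are illustrative:

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative in-memory shape of a Core Scenario Model; the real
# CSM is an XML-based language, and these names are invented.

@dataclass
class Resource:
    name: str
    kind: str                      # e.g. 'processor', 'task', 'external'

@dataclass
class Step:
    name: str
    host_demand_ms: float          # corresponds to a PAdemand annotation
    resource: Resource             # where the step executes

@dataclass
class Scenario:
    name: str
    workload_population: int       # corresponds to PApopulation
    steps: List[Step] = field(default_factory=list)

    def total_demand_ms(self) -> float:
        """Sum of host demands over all steps of the scenario."""
        return sum(s.host_demand_ms for s in self.steps)

cpu = Resource("ServerCPU", "processor")
scn = Scenario("RetrieveDocument", workload_population=10)
scn.steps.append(Step("parse request", 1.5, cpu))
scn.steps.append(Step("get document", 2.0, cpu))
# scn.total_demand_ms() == 3.5
```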

Slide 67: Generating Different Performance Models
- Challenges for the transformation from CSM to different performance models:
  - realizing the mapping between CSM and the chosen performance model:
    - different degrees of abstraction between input and output
    - generate the performance model structure
    - compute the performance model parameters
    - identify cases in which the chosen performance model cannot express some features of the system under study
  - automatic or semi-automatic transformation?
    - designer guidance may be necessary
  - flexibility of the transformation process:
    - inflexible: the mapping is hard-wired in the transformation code
    - flexible: the performance analyst or software developer may be allowed to set some transformation rules.
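The inflexible-versus-flexible distinction can be illustrated with a rule table: the default mapping is fixed, but an analyst may override individual rules without touching the transformation code. All names here are hypothetical:

```python
# Hypothetical rule-driven (flexible) mapping from CSM resource
# descriptions to performance model elements.

DEFAULT_RULES = {
    # Hard-wired defaults: how each CSM resource kind becomes
    # a performance model element.
    "processor": lambda r: {"element": "device", "name": r["name"]},
    "task":      lambda r: {"element": "task",   "name": r["name"]},
}

def map_resources(resources, rules=None):
    """Apply the mapping rules; analyst-supplied rules override
    the defaults per resource kind."""
    merged = {**DEFAULT_RULES, **(rules or {})}
    return [merged[r["kind"]](r) for r in resources]

# Default (inflexible) behavior: a processor becomes a device.
out = map_resources([{"kind": "processor", "name": "ServerCPU"}])

# Analyst override: model processors as tasks instead.
out2 = map_resources(
    [{"kind": "processor", "name": "ServerCPU"}],
    rules={"processor": lambda r: {"element": "task",
                                   "name": r["name"]}})
```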

Slide 68: Validation

Slide 69: Case Study: Document Exchange System
Steps for validating the methodology:
a) design the UML model
b) implement the system using reusable frameworks (ACE and JAWS)
c) measure the system in a networked environment
d) annotate the UML model with the measured resource demands
e) generate the LQN model automatically from the UML specification
f) solve the LQN model and compare its results with overall performance measurements obtained from the real system
g) use the LQN model to gain insight into the DES performance.
[Deployment diagram: ClientCPU and ServerCPU connected by a LAN; CDisk on the client side and SDisk on the server side; the Server hosts a RetrieveThread and an SDiskProc.]

Slide 70: Activity Diagram for the "Retrieve Document" Scenario
[Partitions Client, RetrieveThread, and SDiskProc; steps: request document, accept request, read request, parse request, get document, read from disk, update logfile, write to logfile, send document, recycle thread, receive document; wait point wait_D with placeholder step undef_D. Each step carries a «PAstep» annotation with measured demands, e.g. {PAdemand=('msrd', 'mean', (220/$cpuS, 'ms'))}; external operations include PAextOp=('network', $DocS) and PAextOp=('readDisk', $DocS); the workload is {PApopulation = $Nusers}, and the "request document" step carries a required mean response time of 1 sec with predicted value $RespT.]

Slide 71: LQN Model for "Retrieve Document"
[LQN diagram: ClientT (with a default entry) on processor ClientP; RequestTask (entry net1) and ReplyTask (entry net2), with dummy1/dummy2 entries, modeling the Ethernet; the Server task (entry eServer) and RetrieveThread on processor ServerP; SDiskProc with the SDisk device.]

Slide 72: Model sensitivity to I/O time for short messages

Slide 73: Sensitivity to network delays for short messages

Slide 74: Sensitivity to I/O time for long messages

Slide 75: Sensitivity to network delays for long messages

Slide 76: Future Research Directions
- Refine the current Performance Profile:
  - add more annotations to allow state-machine-based performance analysis
  - harmonize the Performance Profile with the Schedulability Profile
- Upgrade the SPT Profile for UML 2.0
- Coordinate the SPT Profile with the emerging QoS Profile:
  - a new OMG initiative to support modeling a wide range of QoS concepts: QoS characteristics, constraints, contracts, execution models, adaptation and monitoring, etc.
- Use the Performance Profile in the context of MDA:
  - challenge: how to use UML models at different levels of abstraction (platform-independent and platform-dependent) to generate performance models, which are inherently instance-based and platform-dependent.