1
Model Driven Performance Analysis
James Skene, University College London – j.skene@cs.ucl.ac.uk
2
Outline
–Requirements for the analysis method, as I see them
–Overview of the model-driven performance approach chosen
 –Rationale related to the requirements
–Future work
3
Performance Analysis: Functional Requirements
Assuming the existence of the TAPAS platform…
–Reason about the compositionality of service level agreements
–Predict application capacity
 –Over-provisioning or under-provisioning w.r.t. SLAs has a cost
 –Targeted at ASP technologies
–Enable design-time performance prediction
 –Select an architecture
4
Non-functional Requirements
–Be usable:
 –Performance analysis is outside the usual software engineering competence
 –Must be integrated with standard software engineering practice
 –Minimise the cost of performance analysis
–Be used:
 –Performance analysis is currently not performed despite its benefits
5
Approach
Mappings from analysis to design models within the Model Driven Architecture (MDA).
Qualitatively:
–Includes UML, so is integrated with standard software engineering practice
–Is tool supported, so:
 –Can integrate the technique
 –Can provide assistance with the technique
 –Can automate the technique
Also meets the functional requirements!
6
The Model Driven Architecture (MDA)
A family of specifications:
–UML – The Unified Modelling Language
–MOF – The Meta-Object Facility
–CWM – The Common Warehouse Metamodel
–Also: CORBA – The Common Object Request Broker Architecture
Not really an architecture. Software designs are captured as UML models.
7
PIMs and PSMs
Problem: technical infrastructure changes independently of business rules, but the two are strongly coupled in designs.
Solution: decouple them.
[Diagram: a Platform Specific Model (PSM) «realize»s a Platform Independent Model (PIM)]
8
Semantic Domains
–PIMs and PSMs relate to different types of thing. It is convenient to describe these designs using different languages (e.g. EJB implementation details).
–UML can describe object-oriented designs.
–UML contains extension mechanisms to provide additional interpretations for model elements.
9
Metamodels
[Diagram: at the metamodel level, a Profile extends UML to form a virtual metamodel; at the model level, the PIM and PSM are instances of it]
10
Profiles
The lightweight extension mechanisms:
–Stereotypes extend the meaning of UML model elements.
–Tagged values associate qualities with model elements.
–Constraints govern the form of models, enforcing domain semantics.
These act at the metamodel level.
–Profiles group stereotypes, tagged values and constraints.
Freedom through constraint; an opportunity for standardisation.
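As an illustrative sketch only (not part of any MDA tooling), the three mechanisms can be modelled as plain data, with a profile constraint implemented as a check over the model. All names here are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class ModelElement:
    name: str
    stereotypes: set = field(default_factory=set)  # stereotypes extend meaning, e.g. {"queue"}
    tags: dict = field(default_factory=dict)       # tagged values attach qualities, e.g. {"serviceRate": 0.1}

def unrated_queues(model):
    """A profile constraint: every «queue» element must carry a serviceRate tag."""
    return [e.name for e in model
            if "queue" in e.stereotypes and "serviceRate" not in e.tags]

model = [ModelElement("CPU", {"queue"}, {"serviceRate": 1000}),
         ModelElement("Disk1", {"queue"})]
print(unrated_queues(model))  # the constraint flags Disk1
```

Checking such constraints mechanically is exactly the kind of modeller assistance that tool support makes possible.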
11
Mappings
[Diagram: mappings relate the PIM to a PSM and source code, and to an analysis model and its results, which feed back to the PIM]
12
How Are Mappings Described?
–Imperative mappings specify an algorithm.
–Declarative mappings specify pair-wise constraints.
–Declarative mappings can be captured in a profile using constraints.
[Diagram: a «profile» Mapping relating a «profile» Design to a «profile» QN]
13
Benefits of Mappings
–They can be checked, providing assistance to modellers.
–Declarative mappings need only be partially specified; this flexibility addresses the difficulty of producing feasible analysis models.
–The mappings define a semantics for the design domain in terms of the analysis-domain concepts.
–The declarative mappings provide guidance for subsequent automated mappings.
–They can capture expert modelling techniques.
14
Design Domain: A Soft-Real-Time Profile
Based on the UML Profile for Schedulability, Performance, and Time Specification.
Stereotypes to:
–Identify workload classes
–Identify resources accessed under mutual exclusion
–Identify actions having resource demands
15
A Soft-Real-Time Profile (2)
Tagged values to:
–Specify workload parameters (e.g. population, think-time, or arrival rate)
–Specify resource demands for actions/procedures
–Specify probabilities for choices and the average number of iterations
Constraints:
–An object containing an action with a resource demand must be deployed in a context where the resource is available.
16
Example Design Model – Sequence
[Sequence diagram: an :UpdateBean, a :ManagerBean and an :EmployeeBean interact via update() and ejbCreate() calls, with actions annotated by tagged values such as {repetitions = 100, demand = {cpu:10000}}, {p = 0.5, demand = {cpu:100, disk1:5}} and {demand = {cpu:100, disk1:5}}]
17
Example Design Model – Deployment
[Deployment diagram: :UpdateBean, :ManagerBean and :EmployeeBean run on a Platform that «deploys» the «resource»s CPU, Disk1 and Disk2, tagged with service rates {serviceRate = 0.001s}, {serviceRate = 0.1s} and {serviceRate = 0.1s}]
18
A Performance-Analysis Domain Profile: Queueing Networks
Stereotypes:
–Identify instances as queues, delays or populations.
Tagged values:
–Specify service intervals and probabilities on links.
Constraints:
–Ensure that the network is connected.
19
Example QN Collaboration
[Collaboration diagram: a «client» Workload {thinkTime = 5 sec, Population = 15} is connected to the «queue»s CPU, Disk1 and Disk2, with tagged values {serviceRate = 1000}, {serviceRate = 0.1}, {p = 0.05} and {p = 0.02} on the nodes and links]
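A closed queueing network of this shape (a think-time station plus queueing centres) can be solved with exact Mean Value Analysis. The sketch below is illustrative rather than the profile's tooling, and the per-visit service demands are assumptions standing in for values that would be derived from the diagram's tagged values:

```python
def mva(demands, think_time, population):
    """Exact MVA for a closed, single-class network of queueing centres
    plus one delay (think-time) station."""
    q = [0.0] * len(demands)            # mean queue length at each centre
    for n in range(1, population + 1):  # add customers one at a time
        # residence time at each centre (arrival theorem)
        r = [d * (1.0 + qk) for d, qk in zip(demands, q)]
        x = n / (think_time + sum(r))   # system throughput (Little's law)
        q = [x * rk for rk in r]        # queue lengths via Little's law per centre
    return x, r

# hypothetical per-visit demands (seconds) for CPU, Disk1, Disk2
throughput, residence = mva([0.001, 0.05, 0.02], think_time=5.0, population=15)
print(throughput, residence)
```

The throughput and residence times produced here are exactly the capacity predictions the functional requirements call for.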
20
Mapping from Design to Analysis Domain
–Resources correspond to queues.
–Resource demands translate to probabilities or demand vectors.
–Much more complicated mappings will be required to capture infrastructure details (e.g. the performance of containers).
[Diagram: a «DesignToQN» mapping relating the «model» Design to the «model» QN]
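The first two bullet points above can be sketched as a tiny imperative mapping. The model representation, names and numbers are hypothetical, shaped loosely after the tagged values in the earlier example design model:

```python
# design-domain input: resources with assumed per-unit service times (s),
# and actions with demand vectors like the sequence diagram's tagged values
resources = {"cpu": 0.001, "disk1": 0.1}
actions = [{"demand": {"cpu": 100, "disk1": 5}},   # e.g. one ejbCreate()
           {"demand": {"cpu": 10000}}]             # e.g. the update() loop

# analysis-domain output: one queue per resource (first bullet), carrying a
# total demand aggregated from the actions that use it (second bullet)
queues = {}
for name, service_time in resources.items():
    units = sum(a["demand"].get(name, 0) for a in actions)
    queues[name] = {"serviceTime": service_time,
                    "demand": units * service_time}
print(queues)
```

A declarative version of the same mapping would instead state the constraint "each «resource» has a corresponding «queue» whose demand equals the sum of the demands placed on it", leaving a checker to verify any proposed pairing.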
21
Requirements?
[Diagram: the tool relates PIMs, PSMs (e.g. EJB, MQ server, Oracle) and analysis models (QN, SPA, SPN) across the lifecycle, linked to SLAng]
22
Progress
–SLAng identifies relevant scenarios and technologies.
–Assembling a toolset:
 –Poseidon UML
 –MDR plug-in for NetBeans
 –LUI OCL checker (NEPTUNE project)
 –Libraries and tools for performance analysis
23
Future Work
–Define profiles
–Associate them with SLAng constructs
–Create a tool to automate analysis
–Integrate into a single IDE
–Automate the mappings