Slide 1: Software Design Evaluation: Vision and Tools. The PUMA project
"Performance by Unified Model Analysis"
Murray Woodside, Dorina Petriu, Carleton University. 25 June 2004
- What sort of evaluation?
- How can it be done easily? UML tool integration, PUMA project ideas
- Benefits

Slide 2: Why Design Evaluation?
- System perspective: early, integrated, tradeoffs between parts
- Planning view: understandable by many stakeholders and by groups with subsystem concerns; maintain a view over the life cycle
- Aligned with Model-Driven Design/Architecture: trend to higher-level development techniques; trend to use of predefined components (generative techniques)

Slide 3: Six capabilities
1. "Profile the design": how often are objects called/created, messages sent?
2. Estimate resource loadings: heads-up on saturation
3. Budgeting for resource use and delay: create and verify budgets given to subsystem developers
4. Estimate performance: estimate response time and achievable throughput, and compare to requirements; point estimates, sensitivity
5. Analyze scale-up and find bottlenecks: intrinsic limitations in the design; go beyond the limits of the lab
6. Improve scalability

Slide 4: "Profile the design"
- Estimate the number of events per system response: calls, requests, and messages; relative frequencies of paths
- Based on estimates of call multipliers at each component: for one entry to the component, how many of these will occur?
- This "local" knowledge is more easily grasped; simple math does the rest
- The simplest numbers: a kind of accounting model for events
(Figure: a chain of components A, B, C, D with call multipliers [*5], [*7], [*10]; an input rate of 10/sec at A becomes 3500/sec at D.)
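A minimal sketch of that accounting arithmetic (plain Python, not part of the PUMA tooling), using the slide's A, B, C, D chain:

```python
def event_rates(entry_rate, multipliers):
    """Propagate an entry rate through a chain of local call multipliers."""
    rates = [entry_rate]
    for m in multipliers:
        rates.append(rates[-1] * m)  # each entry triggers m downstream calls
    return rates

# A is entered 10 times/sec; per entry, A calls B 5 times, B calls C 7 times,
# and C calls D 10 times, so D sees 10 * 5 * 7 * 10 = 3500 events/sec.
print(event_rates(10, [5, 7, 10]))  # [10, 50, 350, 3500]
```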

Slide 5: Estimate resource loadings by component
- Estimate utilization (0 to 100%) = (frequency of resource use) × holding time
- Frequency comes from the accounting model
- Holding times may be estimated by judgment, or budgeted, or found from a model
- CPU loading: % utilization
- Logical resources (thread pool, buffer pool): also need estimated holding times; this generally needs a model, or a budget value for the holding time
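The utilization law in miniature (a hedged sketch; the 350/sec rate echoes the accounting example above, and the 2 ms holding time is hypothetical):

```python
def utilization(uses_per_sec, holding_time_sec):
    """U = f * h; the resource saturates as U approaches 1.0 (100%)."""
    return uses_per_sec * holding_time_sec

# A buffer pool acquired 350 times/sec and held for 2 ms on average:
print(utilization(350, 0.002))  # 0.7, i.e. 70% utilization
```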

Slide 6: Budgets for resource use and delay
- Budgeting is a shift in viewpoint: instead of trying to guess or predict, we set targets
- Budgeting is a familiar problem, with familiar uncertainties
- Achieving the targets is a separate job: delayed until implementation; may involve budget adjustments
- A management tool that allows a developer to focus on their own part of the system and still work towards achieving a performance plan
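Verifying a set of delay budgets is simple arithmetic. A hedged sketch with hypothetical subsystem names and numbers:

```python
# Per-subsystem delay budgets along one scenario path, in milliseconds.
budgets_ms = {"front_end": 5.0, "app_logic": 12.0, "database": 20.0}
target_ms = 50.0  # end-to-end delay target for the scenario

total = sum(budgets_ms.values())
print(f"total budget {total} ms vs target {target_ms} ms "
      f"(slack {target_ms - total} ms)")
assert total <= target_ms, "budgets exceed the end-to-end target"
```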

Slide 7: Estimate performance
- A point estimate for delay or throughput at a given set of parameter values: an average, or a probability of meeting a target
- A model may be used to explore parameter changes and uncertainties, by multiple runs: design sensitivity, data sensitivity, environment sensitivity, sensitivity to user behaviour
(Figure: a surface of %miss, the probability of missing the target, plotted against parameter1 and parameter2.)
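A hedged sketch of such multiple-run exploration; the "model" here is a toy M/M/1 response-time formula standing in for a real solve (PUMA would invoke a layered queueing solver instead), and the parameter grids are hypothetical:

```python
import math

def mean_delay(arrival_rate, demand):
    """M/M/1 mean response time R = D / (1 - rho), with rho = lambda * D."""
    rho = arrival_rate * demand
    return demand / (1.0 - rho) if rho < 1.0 else math.inf

def prob_miss(target, mean):
    """Assuming exponential delays, P(delay > target) = exp(-target/mean)."""
    return math.exp(-target / mean) if math.isfinite(mean) else 1.0

for rate in (5.0, 8.0, 9.5):           # parameter 1: arrivals per second
    for demand in (0.05, 0.08, 0.10):  # parameter 2: service demand, seconds
        miss = prob_miss(1.0, mean_delay(rate, demand))
        print(f"rate={rate} demand={demand} %miss={100 * miss:.1f}")
```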

Slide 8: Analyze scale-up, find bottlenecks
- Bottlenecks = limiting factors on increasing the workload
- Study replication strategies, protocol changes, movements of the bottleneck
- Study: "what if a certain function could be speeded up 20%?"
- Study large system configurations that can't be reproduced in the lab
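The simplest bottleneck analysis uses asymptotic bounds: with per-request demand D_i at each resource, maximum throughput is 1 / max(D_i). A hedged sketch with hypothetical demands, including the 20% speed-up what-if:

```python
demands_ms = {"app_cpu": 4.0, "disk": 2.5, "network": 1.0}  # per request

def bottleneck(demands):
    """The resource with the largest per-request demand caps throughput."""
    name = max(demands, key=demands.get)
    return name, 1000.0 / demands[name]  # max requests per second

print(bottleneck(demands_ms))   # ('app_cpu', 250.0)

# What if the app CPU function could be speeded up 20%?
demands_ms["app_cpu"] *= 0.8
print(bottleneck(demands_ms))   # ('app_cpu', 312.5): still the bottleneck
```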

Slide 9: Improve scalability
- By removing bottlenecks in the model: successive limits, similar to work in the lab but faster; the subject of several studies
- By scalability strategies involving replicated resources
- Software redesign and re-architecture can reduce logical resource holding times or frequencies

Slide 10: PUMA project origins
- Perceive: that evaluation is a worthwhile goal; difficulties in getting parameters are real but can be overcome
- Fact: we have scalable performance modelling tools, but creating models is too hard
- Trend to higher-level tools for software development: acceptance of UML, MDA etc., generative programming
- Conclude: we can piggyback performance prediction on design capture in UML
- PUMA: Performance by Unified Model Analysis (an NSERC Strategic Grant Project)

Slide 11: PUMA
- Build models from the UML design specifications, dealing with real limitations such as incompleteness of the UML document
- Transform automatically (or nearly?) to a model: we use layered queueing; others are to be supported
- Assisted exploration of the design space (big question)
- Designer feedback and design suggestions (big question)
- Elements of a "Performance Engineering Assistant": open-ended potential for integration with other tools, other analyses

Slide 12: PUMA: Performance Information Processing overview
(Figure: a pipeline of models and transformations.)
- U: the UML design (deployment; scenarios as sequence and activity diagrams; components)
- S: the Core Scenario Model, which gathers and organizes structure and data
- P: the performance model (layered queues, Petri nets), combined with platform and environment submodels and plans for experiments
- R: the results of analysis
- Transformations: U2S extracts S from U; S2P generates P; "solve" produces R; R2P, R2S, and R2U feed results back into each model
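A hedged sketch of this tool chain as function composition; the function names mirror the slide's transformation labels, and the bodies are placeholders, not the real PUMA tools:

```python
def u2s(uml_design):
    """Extract a Core Scenario Model from an annotated UML design."""
    ...

def s2p(csm):
    """Generate a performance model (e.g. layered queues) from the CSM."""
    ...

def solve(perf_model):
    """Run an analytic solver or simulator, returning results R."""
    ...

def r2u(results, uml_design):
    """Annotate the UML design with the predicted results."""
    ...

# The round trip, conceptually:
#   design -> u2s -> s2p -> solve -> r2u -> annotated design
```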

Slide 13: Correspondence between a UML spec and its performance model
A Building Security System (BSS) example:
- UML description with performance annotations, using the standard profile for Schedulability, Performance and Time (SPT)
- Layered queueing model
- Sample results

Slide 14: Performance Profile: the domain model
(Figure: the SPT domain model as a class diagram; classes and attributes summarized below.)
- PerformanceContext: ties together workloads, scenarios, and resources
- Workload (responseTime, priority), specialized as ClosedWorkload (population, externalDelay) and OpenWorkload (occurrencePattern)
- PScenario (hostExecDemand, responseTime): composed of PSteps, starting at a root step
- PStep (probability, repetition, delay, operations, interval, executionTime), with {ordered} predecessor/successor associations
- PResource (utilization, schedulingPolicy, throughput), specialized as:
- PProcessingResource (processingRate, contextSwitchTime, priorityRange, isPreemptible), which plays the host role for scenario steps
- PPassiveResource (waitingTime, responseTime, capacity, accessTime)
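To make the structure concrete, a hedged rendering of a few of these classes as Python dataclasses (an illustrative mapping, not an official one; attribute subsets and units are assumptions):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PStep:
    probability: float = 1.0
    repetition: int = 1
    host_exec_demand_ms: float = 0.0
    successors: List["PStep"] = field(default_factory=list)  # {ordered}

@dataclass
class PScenario:
    root: PStep
    response_time_ms: Optional[float] = None  # required or predicted value

@dataclass
class ClosedWorkload:
    population: int                           # circulating users
    external_delay_ms: float                  # think time between responses
    scenario: Optional[PScenario] = None
```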

Slide 15: Case study: Building Security System (BSS) use cases
- Use cases: access control; log entry/exit; acquire/store video; manage access rights
- Actors: User, Manager, Database, Video Camera

Slide 16: Deployment diagram of BSS
(Figure not captured in the transcript.)

Slide 17: Sequence diagram for the Video Acquire/Store scenario
(Figure: an annotated sequence diagram, summarized here.)
- Participants, each stereotyped <<PAresource>>: VideoController, AcquireProc, BufferManager, StoreProc, and Database; a note marks BufferManager as the object that manages the resource Buffer, with {PAcapacity=10}
- The driving workload is <<PAclosedLoad>> {PApopulation = 1, PAinterval = (('req','percentile',95,(1,'s')), ('pred','percentile',95,$Cycle))}
- The loop *[$N] procOneImage(i) is marked <<PAstep>> {PArep = $N}; each iteration acquires a buffer (<<GRMacquire>> allocBuf(b), getBuffer()), then getImage(i,b) and passImage(i,b), then storeImage(i,b) with writeImg(i,b) and store(i,b) at the Database, and finally releases the buffer (<<GRMrelease>> releaseBuf(b), freeBuf(b))
- Steps carry assumed mean demands, e.g. {PAdemand=('asmd','mean',(1.5,'ms'))}; getImage scales with the image size, {PAdemand=('asmd','mean',($P * 1.5,'ms')), PAextOp=(network,$P)}, and the database write scales with blocks, {PAdemand=('asmd','mean',($B * 0.9,'ms')), PAextOp=(writeBlock,$B)}; other step demands shown are 0.2, 0.5, 0.9, 1.1, 1.8, and 2 ms
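The <<GRMacquire>>/<<GRMrelease>> pair is the classic bounded-resource pattern. A hedged, plain-Python illustration (not generated from the model) of a buffer pool with the diagram's capacity of 10:

```python
import threading

class BufferPool:
    """Fixed-capacity pool guarded by a counting semaphore (PAcapacity=10)."""
    def __init__(self, capacity=10):
        self._slots = threading.Semaphore(capacity)

    def alloc_buf(self):
        self._slots.acquire()        # <<GRMacquire>>: blocks when pool is empty
        return bytearray(64 * 1024)  # stand-in for a real image buffer

    def release_buf(self, buf):
        self._slots.release()        # <<GRMrelease>>: frees a slot for others

pool = BufferPool()
b = pool.alloc_buf()
# ... getImage / passImage / storeImage would run while holding b ...
pool.release_buf(b)
```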

Slide 18: Sequence diagram for the Video Acquire/Store scenario (detail)
(Figure: a zoomed excerpt of the previous diagram, showing VideoController, AcquireProc, and BufferManager: the *[$N] procOneImage(i) loop with its <<PAclosedLoad>> and <<PAstep>> {PArep = $N} annotations, and the <<GRMacquire>> allocBuf(b) / getBuffer() interaction with step demands of 1.5 ms and 0.5 ms. A note marks BufferManager as the object that manages the resource Buffer.)

Slide 19: Sequence diagram for the Access Control scenario
(Figure: an annotated sequence diagram, summarized here.)
- Participants: User, CardReader, DoorLock, Alarm, AccessController, Database, Disk
- Flow: readCard, then admit(cardInfo); checkRights() calls getRights()/readRights(), and on a cache miss ([not_in_cache], PAprob = 0.4) readData() goes to the Disk ({PAextOp=(read,1)}); if OK, openDoor() and the user enters the building; if not OK, alarm() (shown with PAprob = 0); if logging is needed (PAprob = 0.2), logEvent(), writeRec(), and writeEvent()
- Step demands are assumed means between 0.2 and 3 ms, e.g. {PAdemand=('asmd','mean',(3,'ms'))} for the rights check; one step adds {PAextOp = (network, 1)}
- The workload is open: {PAoccurrencePattern = ('poisson', 120, 's'), PArespTime = (('req','percentile',95,(1,'s')), ('pred','percentile',95,$RT))}
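With branch probabilities on steps, the expected demand along the scenario is a weighted sum. A hedged sketch using the probabilities above; the demand values are partly hypothetical where the transcript lost them:

```python
# (probability, demand_ms) per annotated step along the access-control path
steps = [
    (1.0, 1.8),  # admit(cardInfo) processing
    (1.0, 3.0),  # checkRights()
    (0.4, 1.5),  # [not_in_cache] readData() on a cache miss
    (1.0, 0.5),  # [OK] openDoor()
    (0.2, 1.8),  # [need to log?] logEvent() / writeRec()
]
expected_ms = sum(p * d for p, d in steps)
print(f"expected demand per request: {expected_ms:.2f} ms")  # 6.26 ms
```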

Slide 20: Layered Queueing Network (LQN) model
Advantages of LQN modelling:
- models layered resources and logical resources in a natural way
- gives insight into resource dependencies
- scales up well for large systems
What we can get from the LQN solver:
- service time (mean, variance), waiting time, probability of missing a deadline, throughput, utilization, confidence intervals
(Figure: a small LQN example with a client task ClientT (entry clientE) on the Client CPU, a database task DB (entries DBRead, DBWrite) on the DB CPU, and disk operations DKRead, DKWrite [1, 10] on the DB Disk.)
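For a feel for what an analytic solver computes, a hedged single-server stand-in: exact Mean Value Analysis for a closed model with N clients, think time Z, and one queueing server with demand D (a real LQN layers many such resources and adds logical ones):

```python
def mva(n_clients, think_time, demand):
    """Exact MVA recursion: one queueing center plus a delay (think) center."""
    q = 0.0
    for n in range(1, n_clients + 1):
        r = demand * (1.0 + q)     # response time seen at the server
        x = n / (r + think_time)   # system throughput (Little's law)
        q = x * r                  # mean queue length at the server
    return {"throughput": x, "delay": r, "utilization": x * demand}

print(mva(n_clients=20, think_time=1.0, demand=0.05))
# -> roughly 16.8/sec throughput with the server about 84% utilized
```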

Slide 21: Using the model to study and improve a design
(Figure: the same U / S / P / R processing pipeline as on slide 12, with transformations U2S, S2P, solve, R2P, R2S, and R2U, here driving an iterative study-and-improve loop.)

Slide 22: LQN model
(Figure: the BSS layered queueing model; not captured in the transcript.)

Slide 23: Using the model: improvements to BSS
- Base case: system capacity 20 cameras; software bottleneck at the buffers
- Adding software resources (4 buffers and 2 StoreProc threads): 40 cameras, a 100% improvement; hardware bottleneck at the processor
- Replicating the processor (dual application CPU): 50 cameras, a 150% improvement
- Increasing concurrency (moving a first-phase call to the second phase): 100 cameras, a 400% improvement

Slide 24: Feedback: modified BSS sequence diagram
(Figure not captured in the transcript.)

Slide 25: Scalability study of a telecom system
(Figure: deployment diagram. A ServiceBuilder node connected to a Database node over a network link; the link's bit/sec annotation was lost in transcription. Each ServiceBuilder component instance contains several concurrent high-level objects.)

Slide 26: Telecommunication system architecture
(Figure: component diagram of the ServiceBuilder, summarized here; component stereotypes were lost in transcription.)
- RequestHandler, 1..n instances
- IO, containing IOin and IOout
- Stack, containing StackIn and StackOut
- doubleBuffer, containing inBuffer and outBuffer
- Shared memory areas ShMem1 (alloc() {sequential}, free() {sequential}) and ShMem2 (update() {sequential})
- DataBase

Slide 27: Architectural design patterns
(Figure: the same architecture annotated with the patterns it instantiates.)
- Pipeline with buffer (UpStrmFilter, DownStrmFilter, Buffer): filters communicating through the inBuffer and outBuffer of the doubleBuffer
- Pipeline with message (UpStrmFilter, DownStrmFilter): direct hand-offs between filters
- Critical section (Accessor, Shared): access to ShMem1 (alloc()/free() {sequential}) and ShMem2 (update() {sequential})
- Client-server (Client, Server): RequestHandler (1..n) calling the DataBase

Slide 28: Scenario with performance annotations
(Figure: annotated sequence diagram, summarized here.)
- Inbound: StackIn enqueue(req); IOin input(req), reserve() and write(req) into inBuffer, then signal (awake); ReqHandler read(req) and process(req)
- Details of process(req): interpret(script), alloc() on ShMem1, get(script) from the DataBase, update() on ShMem2, free()
- Outbound: write(result) into outBuffer, signal (awake); IOout read(result) and pass(result); StackOut sendBack(result)
- Workload: closed, {PAPopulation = $Nusers, PAextDelay = ('mean', 'asgn', 20, 'ms')}
- Steps carry measured mean demands, e.g. {PAdemand = ('meas', 'mean', 0.120, 'ms')}, {PAdemand = ('meas', 'mean', 0.105, 'ms')}, and {PAdemand = ('meas', 'mean', 1.998, 'ms')}; several other demand values were lost in transcription

Slide 29: Average execution times per request
(Table not captured in the transcript.)

Slide 30: LQN model for the telecom system
(Figure: the LQN, with tasks StackIn, StackOut, IOin, IOout, Buffer (entries push, pull), RequestHandler, ShMem1 (entries alloc, free), ShMem2 (entry update), and DataBase, deployed on processors StackExec, IOexec, Proc, ShMem2's processor, ProcDB, and a dummy processor.)

Slide 31: Maximum throughput for the 1-, 4- and 6-processor configurations
(Results figure not captured in the transcript.)

Slide 32: Base case, 1 processor: hardware bottleneck
(Results figure not captured in the transcript.)

Slide 33: Base case, 4 processors: software bottleneck
(Results figure not captured in the transcript.)

Slide 34: Base case, 6 processors: stronger software bottleneck
(Results figure not captured in the transcript.)

Slide 35: What is the bottleneck task doing?
- IOout does only 0.12 ms of useful work on behalf of a system request; nonetheless, it is the system bottleneck
- RequestHandler does 2.75 ms of useful work on behalf of a system request; however, it is not the system bottleneck
(Figure: bars splitting each task's busy time into useful work and waiting.)

Slide 36: Modified system architecture
- The original software architecture suffers from serialization constraints at the level of the IO process, the Stack process, and the doubleBuffer
- Eliminate the serialization constraints in two steps (see the sketch below):
  - run each pipeline filter in its own process (thread of control)
  - split the pipeline buffer into two separate buffers, each controlled by its own semaphore
(Figure: the revised ServiceBuilder, with StackIn, StackOut, IOin, and IOout as separate processes, independent inBuffer and outBuffer, RequestHandler 1..n, ShMem1 with alloc()/free() {sequential}, ShMem2 with update() {sequential}, and the DataBase.)
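A hedged, plain-Python illustration of the second step: two independent bounded buffers, each with its own semaphores, replacing the single shared doubleBuffer (capacities are hypothetical):

```python
import threading
from collections import deque

class Buffer:
    """A bounded buffer guarded by its own pair of counting semaphores."""
    def __init__(self, capacity):
        self._items = deque()
        self._slots = threading.Semaphore(capacity)  # free slots
        self._avail = threading.Semaphore(0)         # filled slots
        self._mutex = threading.Lock()

    def put(self, x):
        self._slots.acquire()          # blocks only on *this* buffer
        with self._mutex:
            self._items.append(x)
        self._avail.release()

    def get(self):
        self._avail.acquire()
        with self._mutex:
            x = self._items.popleft()
        self._slots.release()
        return x

# Inbound and outbound traffic no longer serialize on one shared buffer:
in_buffer = Buffer(capacity=8)   # IOin -> RequestHandler
out_buffer = Buffer(capacity=8)  # RequestHandler -> IOout
```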

Slide 37: Modified system, 4-processor configuration: software bottleneck eliminated
(Results figure not captured in the transcript.)

Slide 38: Modified system, 6-processor configuration: a new software bottleneck is emerging (DataBase)
(Results figure not captured in the transcript.)

Slide 39: Research on the SPT Profile
- The SPT profile needs to be adapted to UML 2, with many changes to behaviour specification, and aligned with the new QoS profile, which describes how to specify QoS measures
- Enhancements can be made: to harmonize the performance and schedulability aspects; to provide a more formal domain model; to annotate components; possible changes to measures and parameters (the topic of a workshop in Toronto in May)
- We will be participating, with several other groups

Slide 40: PUMA will enhance model-based development
- We see a paradigm shift in the direction of MDA or MDD (Model-Driven Development): more emphasis on work on models; generation of code and system artifacts
- Model evaluation/verification becomes more important: performance verification supports this trend, and may provide a leading example for non-functional verification (e.g. reliability)

Slide 41: PUMA opens new prospects...
- Libraries of submodels for platform-specific aspects
- Instrument and calibrate generated code as it is produced
- Use the performance model to manage later stages of performance engineering: testing, debugging
- Adapt the transformations for other kinds of evaluation, based on the approach of the QVT (Query/View/Transformation) standard: introduced for transforming platform-independent specs to platform-specific versions, but capable of much more...

Slide 42: Research on transformation tools and QVT
- At present our transformation tools are ad hoc creations based on the specific analysis, but QVT seems to be applicable:
  - the profile annotations support a scenario view
  - the Core Scenario Model is a first target model, with its own metamodel described in MOF
  - the performance model is a second target: can it be described in MOF?
- Explore the use of QVT: most uses are only UML to UML, and this is more general; it may help in defining QVT; it may support other transformations for evaluation