Slide 1: Software Design Evaluation: Vision and Tools
The PUMA project: "Performance by Unified Model Analysis"
Murray Woodside, Dorina Petriu, Carleton University, 25 June 2004
- what sort of evaluation?
- how can it be done easily? UML tool integration, PUMA project ideas
- benefits
www.sce.carleton.ca/rads/puma/

Slide 2: Why Design Evaluation?
- System perspective: early, integrated... tradeoffs between parts
- Planning view: understandable by many stakeholders and by groups with subsystem concerns; maintains a view over the life cycle
- Aligned with Model-Driven Design/Architecture: the trend to higher-level development techniques and to predefined components (generative techniques)

Slide 3: Six capabilities
- "Profile the design": how often are objects called/created, messages sent?
- Estimate resource loadings: heads-up on saturation
- Budgeting for resource use and delay: create and verify budgets given to subsystem developers
- Estimate performance: estimate response time and achievable throughput, compare to requirements; point estimates, sensitivity
- Analyze scale-up and find bottlenecks: intrinsic limitations in the design; go beyond the limits of the lab
- Improve scalability

Slide 4: "Profile the design"
Estimate the numbers of events per system response: calls, requests, and messages, and the relative frequencies of paths.
Based on estimates of call multipliers at each component: for one entry to the component, how many of these calls will occur? This "local" knowledge is more easily grasped; simple math does the rest.
The simplest numbers form a kind of accounting model for events. Example from the slide: a chain of components A -> B -> C -> D with call multipliers [*5], [*7], [*10]; an arrival rate of 10/sec at A yields 10 x 5 x 7 x 10 = 3500 calls/sec at D.
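
A minimal sketch of this accounting model (illustrative only; the chain and multipliers are the example numbers from the slide, and the code is not part of any PUMA tool):

    # Accounting model for event frequencies: each component's call multiplier
    # says how many downstream calls one incoming call generates, so frequencies
    # simply multiply along the call chain A -> B -> C -> D.
    call_multipliers = {"A": 5, "B": 7, "C": 10}    # calls made per call received
    chain = ["A", "B", "C", "D"]                    # D makes no further calls
    arrival_rate = 10.0                             # requests/sec entering A

    rate = arrival_rate
    rates = {}
    for component in chain:
        rates[component] = rate                     # calls/sec arriving here
        rate *= call_multipliers.get(component, 1)  # fan-out to the next stage

    print(rates)   # {'A': 10.0, 'B': 50.0, 'C': 350.0, 'D': 3500.0}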

Slide 5: Estimate resource loadings by component
Estimate utilization (0 to 100%) as (frequency of resource use) x (holding time).
- The frequency comes from the accounting model; holding times may be estimated by judgment, budgeted, or found from a model.
- CPU loading: % utilization.
- Logical resources (thread pool, buffer pool): estimating holding times generally needs a model, or use a budget value for the holding time.
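
The estimate itself is just the utilization law, frequency times holding time; a tiny sketch with made-up numbers rather than values from the case studies:

    # Utilization law: U = (uses per second) * (holding time per use, in seconds).
    # A resource approaching U = 1.0 (100%) is a saturation warning.
    uses_per_sec = 350.0            # e.g. from the accounting model of slide 4
    holding_time_sec = 0.0015       # 1.5 ms of CPU per use (estimated or budgeted)

    utilization = uses_per_sec * holding_time_sec
    print(f"CPU utilization: {utilization:.1%}")    # 52.5%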

Slide 6: Budgets for resource use and delay
Budgeting is a shift in viewpoint: instead of trying to guess or predict, we set targets.
- Budgeting is a familiar problem, with familiar uncertainties.
- Achieving the targets is a separate job, delayed until implementation; it may involve budget adjustments.
It is a management tool that allows developers to focus on their own part of the system and still work towards achieving a performance plan; a simple consistency check is sketched below.
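
As an illustration of how such budgets can be checked mechanically, the sketch below sums hypothetical per-subsystem delay budgets against an end-to-end target; the component names borrow from the BSS example, but the budget figures and the contention allowance are invented:

    # Verify that the delay budgets handed to subsystem developers, plus an
    # allowance for contention, still fit within the end-to-end response target.
    response_target_ms = 1000.0                     # e.g. a 1 s response requirement
    budgets_ms = {"CardReader": 50, "AccessController": 200,
                  "Database": 300, "DoorLock": 100}
    contention_allowance_ms = 250                   # reserve for queueing delays

    total = sum(budgets_ms.values()) + contention_allowance_ms
    print(f"budgeted: {total} ms of a {response_target_ms:.0f} ms target")
    assert total <= response_target_ms, "budgets exceed the response-time target"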

Slide 7: Estimate performance
A point estimate for delay or throughput at a given set of parameter values: an average, or the probability of meeting a target.
A model may be used to explore parameter changes and uncertainties through multiple runs: design sensitivity, data sensitivity, environment sensitivity, sensitivity to user behaviour. (The slide shows a surface plot of %miss, the probability of missing the target, against two parameters.)
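
To make the idea of multiple runs concrete, the sketch below sweeps two parameters of a single-queue (M/M/1) approximation and reports %miss for a 1-second target; the M/M/1 assumption (exponentially distributed response time) is ours, chosen only to keep the example self-contained:

    import math

    # Toy sensitivity sweep: probability of missing a 1 s response-time target,
    # swept over service demand and arrival rate. For an M/M/1 queue the response
    # time is exponential with rate (mu - lambda), so P(miss) = exp(-(mu - lam)*t).
    target_s = 1.0
    for demand_s in (0.05, 0.10, 0.20):             # parameter 1: service demand
        mu = 1.0 / demand_s                         # service rate
        for lam in (2.0, 5.0, 8.0):                 # parameter 2: arrival rate
            miss = 1.0 if lam >= mu else math.exp(-(mu - lam) * target_s)
            print(f"demand={demand_s:.2f}s  rate={lam}/s  %miss={miss:.2%}")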

Slide 8: Analyze scale-up, find bottlenecks
Bottlenecks are the limiting factors on increasing the workload.
- Study replication strategies, protocol changes, and movements of the bottleneck.
- Study "what if a certain function could be sped up 20%?"
- Study large system configurations that can't be reproduced in the lab.

Slide 9: Improve scalability
- By removing bottlenecks in the model: successive limits, similar to work in the lab but faster; the subject of several studies.
- By scalability strategies involving replicated resources.
- Software redesign and re-architecture can reduce logical resource holding times or frequencies.

Slide 10: PUMA project origins
- Perceive: that evaluation is a worthwhile goal; difficulties in getting parameters are real but can be overcome.
- Fact: we have scalable performance modelling tools, but creating models is too hard.
- Trend to higher-level tools for software development: acceptance of UML, MDA, etc., generative programming.
- Conclude: we can piggyback performance prediction on design capture in UML.
PUMA: Performance by Unified Model Analysis (NSERC Strategic Grant Project)

Slide 11: PUMA
- Build models from the UML design specifications, dealing with real limitations like incompleteness of the UML document.
- Transform automatically (or nearly so) to a performance model; we use "layered queueing", and others are to be supported.
- Assisted exploration of the design space (big question).
- Designer feedback and design suggestions (big question).
- Elements of a "Performance Engineering Assistant": open-ended potential for integration with other tools and other analyses.

Slide 12: PUMA: Performance Information Processing overview
The slide shows the PUMA tool chain as a data flow:
- U: UML Design (deployment; scenarios as sequence or activity diagrams; components)
- U2S: transform the annotated UML into S, the Core Scenario Model, which gathers and organizes structure and data
- S2P: transform the scenario model into P, the Performance Model (layered queues or Petri nets), combined with platform and environment submodels and plans for experiments
- solve: produce R, the Results of analysis
- R2P, R2S, R2U: feed results back into the performance model, the scenario model, and the UML design
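
A purely schematic sketch of that flow as code, with stub transformations standing in for the real model-to-model tools (nothing here reflects PUMA's actual implementation):

    # U -> S -> P -> R data flow with results fed back to every earlier artifact.
    def u2s(uml_design):             # gather scenario structure and annotations
        return {"scenarios": uml_design["scenarios"]}

    def s2p(scenario_model):         # build a performance model (layered queues, ...)
        return {"submodels": scenario_model["scenarios"]}

    def solve(performance_model):    # run the solver; returns results R (stubbed here)
        return {"note": "solver results would appear here"}

    def evaluate(uml_design):
        s = u2s(uml_design)
        p = s2p(s)
        r = solve(p)
        # R2P / R2S / R2U: attach results back onto each model for feedback
        for artifact in (p, s, uml_design):
            artifact["results"] = r
        return r

    evaluate({"scenarios": ["AcquireStoreVideo", "AccessControl"]})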

Slide 13: Correspondence between a UML spec and its performance model
A Building Security System (BSS) example:
- UML description with performance annotations, using the standard UML Profile for Schedulability, Performance and Time (SPT)
- Layered queueing model
- Sample results

Slide 14: Performance Profile: the domain model
The slide reproduces the performance domain model of the SPT profile as a class diagram. Classes and their attributes:
- PerformanceContext
- Workload: responseTime, priority; specialized as ClosedWorkload (population, externalDelay) and OpenWorkload (occurrencePattern)
- PScenario: hostExecDemand, responseTime
- PStep: probability, repetition, delay, operations, interval, executionTime
- PResource: utilization, schedulingPolicy, throughput
- PProcessingResource: processingRate, contextSwitchTime, priorityRange, isPreemptible
- PPassiveResource: waitingTime, responseTime, capacity, accessTime
The associations relate a PerformanceContext to its workloads, scenarios and resources, order the PSteps of a scenario (+root, +predecessor/+successor, {ordered}), and give each scenario a +host processing resource.

Slide 15: Case Study - Building Security System (BSS) use cases
Use cases: Access control; Log entry/exit; Acquire/store video; Manage access rights.
Actors: User, Manager, Database, Video Camera.

Slide 16: Deployment Diagram of BSS

Slide 17: Sequence Diagram for the Video Acquire/Store Scenario
Participants, each stereotyped <<PAresource>>: VideoController, AcquireProc, BufferManager (noted as the object that manages the resource Buffer), StoreProc and Database {PAcapacity = 10}; the scenario as a whole is a <<PAcontext>>.
The driving workload is a <<PAclosedLoad>> with {PApopulation = 1, PAinterval = (('req', 'percentile', 95, (1, 's')), ('pred', 'percentile', 95, $Cycle))}: a required 95th-percentile cycle time of 1 s, with the predicted value reported in $Cycle.
The VideoController executes *[$N] procOneImage(i) ({PArep = $N}, one iteration per camera). Each iteration: <<GRMacquire>> allocBuf(b) with getBuffer() in the BufferManager; getImage(i, b); passImage(i, b); storeImage(i, b); store(i, b); writeImg(i, b) on the Database; then <<GRMrelease>> releaseBuf(b) and freeBuf(b).
Steps are annotated <<PAstep>> with assumed mean demands ('asmd') between 0.2 ms and 2 ms; the image transfer carries {PAdemand = ('asmd', 'mean', ($P * 1.5, 'ms')), PAextOp = (network, $P)} and the database write carries {PAdemand = ('asmd', 'mean', ($B * 0.9, 'ms')), PAextOp = (writeBlock, $B)}.
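
These tagged values are what the U2S step must read. A small helper, assumed for illustration (PUMA's real parser is not shown in the slides), that extracts the source modifier, statistic, value and unit from a PAdemand string:

    import re

    def parse_pademand(tag: str):
        """Parse e.g. "PAdemand=('asmd', 'mean', (1.5, 'ms'))" into its parts.
        Symbolic demands such as "$P * 1.5" are returned unevaluated as strings."""
        m = re.search(r"PAdemand\s*=\s*\(\s*'(\w+)'\s*,\s*'(\w+)'\s*,"
                      r"\s*\(([^,]+),\s*'(\w+)'\s*\)\s*\)", tag)
        if not m:
            return None
        source, statistic, value, unit = m.groups()
        try:
            value = float(value)
        except ValueError:
            value = value.strip()                # keep symbolic expressions as text
        return {"source": source, "statistic": statistic, "value": value, "unit": unit}

    print(parse_pademand("PAdemand=('asmd', 'mean', (1.5, 'ms'))"))
    # {'source': 'asmd', 'statistic': 'mean', 'value': 1.5, 'unit': 'ms'}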

Slide 18: Sequence Diagram for the Video Acquire/Store Scenario (detail)
A zoomed-in fragment of the previous diagram, showing the VideoController, AcquireProc and BufferManager, the <<PAclosedLoad>> annotation, the {PArep = $N} loop over procOneImage(i), and the <<GRMacquire>> allocBuf(b)/getBuffer() steps with their <<PAstep>> mean demands (1.5 ms and 0.5 ms).

Slide 19: Sequence Diagram for the Access Control Scenario
Participants: User, CardReader, DoorLock, Alarm, AccessController, Database, Disk.
The workload is open: {PAoccurrencePattern = ('poisson', 120, 's'), PArespTime = (('req', 'percentile', 95, (1, 's')), ('pred', 'percentile', 95, $RT))}, i.e. Poisson arrivals and a required 95th-percentile response time of 1 s, with the predicted value reported in $RT.
Message flow: readCard; admit(cardInfo); getRights(); readRights(); [not_in_cache] readData() from the Disk ({PAextOp = (read, 1)}); checkRights(); [OK] openDoor(); [not OK] alarm(); [need to log?] logEvent(); writeRec(); writeEvent(); enterBuilding.
Steps carry <<PAstep>> annotations with assumed mean demands between 0.2 ms and 3 ms; conditional steps carry probabilities (PAprob values of 0.4, 1, 0.2 and 0 appear), and one step adds {PAextOp = (network, 1)}.

Slide 20: Layered Queueing Network (LQN) model
http://www.sce.carleton.ca/rads/lqn/lqn-documentation
Advantages of LQN modelling:
- Models layered resources and logical resources in a natural way
- Gives insight into resource dependencies
- Scales up well for large systems
What we can get from the LQN solver:
- Service time (mean, variance)
- Waiting time
- Probability of missing a deadline
- Throughput
- Utilization
- Confidence intervals
(The slide shows a small example LQN: a client task ClientT with entry clientE on the Client CPU, calling DBRead and DBWrite entries of a DB task on the DB CPU, which in turn call DKRead and DKWrite entries of a Disk task [1, 10] on the DB Disk.)
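
To give a flavour of the quick reasoning such a model enables, the sketch below computes the classical asymptotic throughput bound, 1/D_max, from per-request resource demands; the demand numbers are invented, and the real LQN solver of course also accounts for contention and layered waiting:

    # Asymptotic throughput bound: the system cannot exceed 1 / D_max requests/sec,
    # where D_max is the largest total demand (busy seconds per request) on any resource.
    demands_per_request_s = {        # hypothetical per-request demands
        "Client CPU": 0.004,
        "DB CPU":     0.010,
        "DB Disk":    0.025,
    }
    bottleneck, d_max = max(demands_per_request_s.items(), key=lambda kv: kv[1])
    print(f"bottleneck: {bottleneck}; max throughput <= {1.0 / d_max:.0f} requests/sec")
    # bottleneck: DB Disk; max throughput <= 40 requests/sec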

Slide 21: Using the Model to Study and Improve a Design
(The slide repeats the processing-overview diagram of slide 12: UML Design -> U2S -> Core Scenario Model -> S2P -> Performance Model, with platform/environment submodels and experiment plans, -> solve -> Results, and R2P/R2S/R2U feeding results back.)

Slide 22: LQN Model

Slide 23: Using the Model: Improvements to BSS
- Base case: system capacity 20 cameras; software bottleneck at the buffers.
- Adding software resources (4 buffers and 2 StoreProc threads): 40 cameras, a 100% performance improvement; hardware bottleneck at the processor.
- Replicating the processor (dual application CPU): 50 cameras, a 150% performance improvement.
- Increasing concurrency (moving a first-phase call to the second phase): 100 cameras, a 400% performance improvement.

Slide 24: Feedback - Modified BSS Sequence Diagram

Slide 25: Scalability study of a telecom system - deployment diagram
The deployment diagram shows the ServiceBuilder node connected to the Database over a network link with a given bit/sec rate; a ServiceBuilder component instance contains several concurrent high-level objects.

Slide 26: Telecommunication system architecture
Inside the ServiceBuilder:
- RequestHandler processes (1..n)
- the IO process, with objects IOin and IOout
- the Stack process, with objects StackIn and StackOut
- the doubleBuffer, with inBuffer and outBuffer
- shared memory areas ShMem1, with alloc() {sequential} and free() {sequential}, and ShMem2, with update() {sequential}
- the DataBase

Slide 27: Architectural Design Patterns
The architecture instantiates several familiar patterns:
- Pipeline with buffer (UpStrmFilter, Buffer, DownStrmFilter) and pipeline with message (UpStrmFilter, DownStrmFilter): the Stack, IO and RequestHandler stages connected through the doubleBuffer's inBuffer and outBuffer
- Critical section (Accessor, Shared): access to ShMem1 (alloc()/free() {sequential}) and ShMem2 (update() {sequential})
- Client-server (Client, Server): calls to the DataBase

Slide 28: Scenario with performance annotations
The annotated scenario traces a request through the pipeline: the lifelines are :client, StackIn, inBuffer, IOin, ReqHandler, outBuffer, IOout and StackOut, with messages enqueue(req), input(req), reserve(), write(req), signal(awake), read(req), process(req), write(result), signal(awake), read(result), pass(result) and sendBack(result).
Details of process(req): the ReqHandler interprets the script, calls alloc() on ShMem1, gets the script from the DataBase, calls update() on ShMem2, and calls free().
The driving workload is a closed load with {PApopulation = $Nusers, PAextDelay = ('asgn', 'mean', 20, 'ms')}.
Every step carries a measured mean demand, {PAdemand = ('meas', 'mean', ..., 'ms')}, with values between roughly 0.1 ms and 2 ms per step.

Slide 29: Average execution times per request

Slide 30: LQN model for the telecom system
(The slide shows the generated LQN: tasks for StackIn, StackOut, IOin, IOout, the Buffer, the RequestHandler, ShMem1, ShMem2 and the DataBase, with entries such as push, pull, alloc, free and update, running on processors including StackExec, IOexec, ProcDB and a dummy processor.)

Slide 31: Maximum throughput for 1-, 4- and 6-processor configurations

Slide 32: Base case, 1-processor configuration: hardware bottleneck

Slide 33: Base case, 4-processor configuration: software bottleneck

Slide 34: Base case, 6-processor configuration: stronger software bottleneck

Slide 35: What is the bottleneck task doing?
IOout is doing only 0.12 ms of useful work on behalf of a system request; nonetheless, it is the system bottleneck. RequestHandler is doing 2.75 ms of useful work on behalf of a system request; however, it is not the system bottleneck. (The slide's charts break each task's busy time into useful work and waiting.)

Slide 36: Modified system architecture
The original software architecture suffers from serialization constraints at the level of the IO process, the Stack process and the doubleBuffer. The serialization constraints are eliminated in two steps (a code sketch follows):
- Each pipeline filter runs in its own process (thread of control): separate IOin, IOout, StackIn and StackOut processes.
- The pipeline buffer is split into two separate buffers, inBuffer and outBuffer, each controlled by its own semaphore.
The rest of the ServiceBuilder (RequestHandler 1..n, ShMem1 with alloc()/free() {sequential}, ShMem2 with update() {sequential}, and the DataBase) is unchanged.
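
A minimal threading sketch of the two changes (our own illustration, not the project's code; Python's queue.Queue stands in for a semaphore-controlled buffer):

    import threading, queue

    # Each pipeline filter runs in its own thread, and the former doubleBuffer is
    # split into two independent bounded buffers, so the input path and the output
    # path no longer serialize on a single shared buffer/lock.
    in_buffer = queue.Queue(maxsize=8)    # StackIn/IOin  -> RequestHandler
    out_buffer = queue.Queue(maxsize=8)   # RequestHandler -> IOout/StackOut
    DONE = object()                       # sentinel marking end of the request stream

    def io_in():
        for req in range(100):            # stand-in for requests arriving via StackIn
            in_buffer.put(req)            # blocks only when the *input* buffer is full
        in_buffer.put(DONE)

    def request_handler():
        while (req := in_buffer.get()) is not DONE:
            out_buffer.put(("result", req))   # interpret script, update ShMem, call DB...
        out_buffer.put(DONE)

    def io_out():
        while (res := out_buffer.get()) is not DONE:
            pass                          # hand the result back via StackOut

    threads = [threading.Thread(target=f) for f in (io_in, request_handler, io_out)]
    for t in threads: t.start()
    for t in threads: t.join()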

Slide 37: Modified system, 4-processor configuration: software bottleneck eliminated

Slide 38: Modified system, 6-processor configuration: a new software bottleneck is emerging (DataBase)

Slide 39: Research on the SPT Profile
The SPT profile needs to be adapted to UML 2, with its many changes to behaviour specification, and to the new QoS profile, which describes how to specify QoS measures.
Enhancements can be made:
- to harmonize the performance and schedulability aspects
- to provide a more formal domain model
- to annotate components
- possible changes to measures and parameters (the topic of a workshop in Toronto in May)
We will be participating with several other groups.

Slide 40: PUMA will enhance Model-based Development
We see a paradigm shift in the direction of MDA or MDD (Model-Driven Development), which emphasizes more work on models and the generation of code and system artifacts. Model evaluation/verification becomes more important:
- performance verification supports this trend
- it may provide a leading example for non-functional verification, e.g. reliability

Slide 41: PUMA opens new prospects...
- Libraries of submodels for platform-specific aspects
- Instrument and calibrate generated code as it is produced
- Use the performance model to manage later stages of performance engineering: testing, debugging
- Adapt the transformations for other kinds of evaluation, based on the approach of the QVT (Query, View, Transformation) standard: introduced for transforming platform-independent specs to platform-specific versions, but capable of much more...

Slide 42: Research on Transformation Tools and QVT
At present our transformation tools are ad hoc creations based on the specific analysis, but QVT seems to be applicable:
- the Profile annotations support a scenario view
- the Core Scenario Model is a first target model, with its own metamodel described in MOF
- the performance model is a second target: can it be described in MOF?
Explore the use of QVT:
- most uses are only UML to UML; this is more general
- may help in defining QVT
- may support other transformations for evaluation

