1 Software Architecture Evaluation Methodologies Presented By: Anthony Register

2 Software architecture (SWA) - “The fundamental organization of a system embodied in its components, their relationships to each other, and to the environment, and the principles guiding its design and evolution” as defined by IEEE

3 SWA Quality
Determined by quality attributes such as:
- Flexibility
- Usability
- Performance
- Maintainability

4 Evaluation Methods
- SAAM: Software Architecture Analysis Method
- ALMA: Architecture-Level Modifiability Analysis
- ATAM: Architecture Trade-off Analysis Method
- PASA: Performance Assessment of Software Architecture

5 Assumption - Several of the most common SWA evaluation methods are similar in the techniques applied, but have evolved to serve different purposes or viewpoints for evaluating the quality of a specific SWA.

6 Scenario Based
- Documented in OO design and user-interface design as a technique to elicit requirements
- Used for comparing SWA options
- Simulate system events or changes
- Determine the impact on the SWA
- Result is a brief description of a functional requirement that may eventually be implemented in the system, at design time or in the future

7 Scenario Based
Benefits:
- Breaks down and clearly describes the architectural description and its analysis points
- Acts as a catalyst for generating more questions

8 Types of Scenarios
- Direct: requires no change to the system architecture (the architecture supports the scenario as-is)
- Indirect: requires a change to the system architecture (see the sketch below)
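A minimal sketch of how a change scenario might be recorded and classified as direct or indirect; the class and component names are hypothetical and not part of the original presentation:

```python
from dataclasses import dataclass, field

@dataclass
class Scenario:
    """A change scenario, as used in scenario-based SWA evaluation."""
    description: str
    affected_components: list = field(default_factory=list)  # components that would have to change

    @property
    def kind(self) -> str:
        # Direct: the architecture supports the scenario with no change.
        # Indirect: at least one component or connection must be modified.
        return "direct" if not self.affected_components else "indirect"

s1 = Scenario("User views the current report")
s2 = Scenario("Add support for a new output format", ["ReportGenerator", "Exporter"])
print(s1.kind, s2.kind)  # -> direct indirect
```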

9 Scenario Example

10 Scenario-Based Architecture Analysis (SAAM)
5 steps:
1. Describe the candidate architecture
2. Develop scenarios
3. Perform scenario evaluations
4. Reveal scenario interaction
5. Overall SWA evaluation
http://www.sei.cmu.edu/pub/documents/00.reports/pdf/00tr004.pdf#search=%22ATAM%20%22

11 SAAM (Steps 1 & 2) Describe the candidate architecture  Use a common syntactic and semantic notation that all designers and stakeholders can easily understand. Develop scenarios  Direct and indirect

12 SAAM (Steps 3 & 4) Perform scenario evaluations  Evaluate quality attributes (flexibility, usability, performance, maintenance, etc.) Reveal scenario interaction  Indirect scenario affects need to be at a minimum or the component may be a risk to the SWA  Scenario analysis indicates that many indirect scenarios affect the same component, then too much architectural level coupling may be indicated

13 SAAM (Step 5)
Overall SWA evaluation
- A weighted ranking or scoring is performed for the overall SWA evaluation
- A scoring technique using an appropriate weighting analysis can be applied to track the effects of the indirect scenarios on the SWA for future decision making (see the sketch below)
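One way the weighted scoring could look in practice: a minimal sketch that totals stakeholder-assigned scenario weights against per-candidate scores. The weights, scores, and candidate names are assumptions for illustration, not taken from the presentation:

```python
# Stakeholder-assigned weights for the evaluation scenarios (illustrative).
weights = {
    "Add a new report format": 3,
    "Switch database vendor":  2,
    "Handle 10x load":         5,
}

# Per-candidate scores (e.g. 0-10) for how well each scenario is accommodated.
candidates = {
    "Layered SWA":      {"Add a new report format": 8, "Switch database vendor": 9, "Handle 10x load": 4},
    "Event-driven SWA": {"Add a new report format": 6, "Switch database vendor": 5, "Handle 10x load": 9},
}

def weighted_score(scores):
    # Higher totals indicate a candidate that better absorbs the weighted scenarios.
    return sum(weights[s] * scores[s] for s in weights)

for name, scores in candidates.items():
    print(name, weighted_score(scores))
```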

14 SAAM
Benefits:
- Detects issues early
- SWA documentation through the documentation of scenarios, outcomes, and effects on the system architecture
- Improved understanding of possible SWA issues
Disadvantages:
- Lacking in tool support for managing large amounts of data for system and scenario descriptions
- Lacks architectural metrics for assisting designers when making decisions between alternative designs

15 SAAM Strengths
- Evaluates or compares future and existing systems, for example using a low-cost method (tabular calculations and comparisons) for measuring change when choosing between SWA designs
- The evaluation result enables the designers and users to focus on the details of the SWA and not be distracted by the less important areas
- Provides a guided approach to evaluation and allows open dialogue between the stakeholders of the system under evaluation

16 Architecture-Level Modifiability Analysis (ALMA)
- Focuses on the modifiability of the SWA
- Risk assessments
- Distinguishes multiple analysis goals
- Makes its assumptions explicit
- Provides repeatable techniques for performing the steps
- Predicts future maintenance costs
- Evaluates the flexibility of a system at the architectural level
- Main inputs are the SWA specification and the quality requirements
http://www.sei.cmu.edu/pub/documents/00.reports/pdf/00tr004.pdf#search=%22ATAM%20%22

17 ALMA Process Steps
5 steps:
1. Set the goal and determine the focus of the analysis
2. Create a description of the SWA
3. Create scenarios from the functional requirements
4. Evaluate the effects of the scenarios
5. Analyze the results of the independent scenario evaluations

18 ALMA (Steps 1 & 2) Goal Setting  Maintenance cost prediction, risk assessment, and SWA selection Description of SWA  SWA information in order to derive an architectural description of the system  Activity evaluates the decomposition of the system components and evaluates the relationships between the components

19 ALMA (Steps 3 & 4) Create scenarios from functional requirements  Determine the impact of change to the SWA  Providing a metric for analysis performed in the next step for evaluating the different scenarios Evaluate effects of scenarios  Collection of information to be used in SWA impact analysis  Identify components affected by the change, determine what the effects were, and determine the ripple effects on relational components

20 ALMA (Step 5)
Analyze the results of the independent scenario evaluations
- The interpretation of the results needs to align with the goal set for the evaluation
- Goals:
  - For maintenance cost prediction, the scenarios should cover future events of the system
  - For risk assessment, the scenarios should provide complex change events in order to determine and interpret the effects of the changes

21 ALMA Benefits:
- Identification of SWA risks
- Measurement of the amount of effort required for changes
- Deciding between available SWA options
- Reduction in the number of scenarios, and a process that provides guidance on when to stop generating scenarios
- All change categories explicitly considered
- New change scenarios do not affect the classification structure

22 Architecture Trade-off Analysis Method (ATAM)
Evaluates a SWA against quality-attribute target goals, but focuses more on the trade-offs between the quality attributes.
http://www.sei.cmu.edu/architecture/ata_method.html

23 ATAM
Main inputs:
- Business goals
- Software specifications
- SWA description
Main outputs:
- List of scenario sensitivity points
- Trade-off points
- Risks
- Different approaches to the SWA
- Utility tree
- Quality attribute questions with their responses

24 ATAM Process Steps
The 4 phases are:
1. Presentation
2. Investigation and Analysis
3. Testing
4. Reporting

25 ATAM (Phase 1 & 2) Presentation  Activities:  Presenting the ATAM  Presenting the business requirements  Presenting the architecture Investigation and analysis  Activities:  Identify the architectural approaches suggested by the architect before they are analyzed  Create an attribute utility tree  Analyze the architectural approaches

26 ATAM – Utility Tree
An analytical method that provides a top-down approach for decomposing the quality attributes designated by the ATAM goals.
http://www.sei.cmu.edu/architecture/ata_method.html
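A minimal sketch of how a utility tree might be represented for analysis, with quality attributes refined into concrete scenarios that are rated for business importance and architectural difficulty (High/Medium/Low). The attributes, refinements, and ratings shown are illustrative only:

```python
# utility -> quality attribute -> refinement -> [(scenario, importance, difficulty)]
# Attributes, refinements, and H/M/L ratings below are illustrative only.
utility_tree = {
    "Performance": {
        "Latency": [("Process a customer order in under 2 s at peak load", "H", "M")],
    },
    "Modifiability": {
        "New features": [("Add a new payment provider in under 1 person-month", "H", "H")],
    },
    "Availability": {
        "Hardware failure": [("Recover from a node failure within 30 s", "M", "H")],
    },
}

# (High, High) scenarios are typically analyzed first: high business value
# combined with high architectural difficulty or risk.
high_priority = [
    scenario
    for refinements in utility_tree.values()
    for leaves in refinements.values()
    for scenario, importance, difficulty in leaves
    if (importance, difficulty) == ("H", "H")
]
print(high_priority)
```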

27 ATAM (Phase 3 & 4) Testing  Activities  Brainstorming scenarios  Analyzing the architectural approaches  Utilizes the designated high priority scenarios to be used for test cases  The goal is targeted to identify any hidden architectural approaches, risks, sensitivity points, and tradeoff points Reporting  Activities  Present the results  Presented to the stakeholders in the form of a final analysis report

28 ATAM Benefits:
- The quality-attribute requirements are clarified
- Provides improved SWA documentation that can be used as the foundation for future SWA decisions
- Promotes communication between the stakeholders, such as customers, architects, and testers
- Identifies system risks early in the solution life cycle

29 Performance Assessment of Software Architecture (PASA)
- Evaluates performance issues for SWA-based systems
- Relies on performance-based scenarios to meet its goal
- Unique in that the documentation extracted comes from the developers and the source code (since the method can be performed during the development cycle)
- Only includes interaction with the development team, which differs from the other techniques
Source: Comparison of Scenario-Based Software Architecture Evaluation Methods

30 PASA Process
- Starts by setting goals, identifying the required information, understanding stakeholder expectations, and describing the method of approach
- Selects key performance scenarios as elicited from the developers
- Focuses on the architectural style or patterns used
- The output of the process is a presentation of the results (see the sketch below)
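A minimal back-of-the-envelope sketch in the spirit of the performance analysis PASA performs on a key scenario: sum the estimated resource demand of each step on the scenario's execution path and compare the total with the performance objective. The step names, demands, and objective are assumptions for illustration:

```python
# Estimated resource demand (milliseconds) for each step on the execution
# path of one key performance scenario; all numbers are assumptions.
scenario_steps = {
    "parse request":     2.0,
    "authenticate user": 5.0,
    "query database":    40.0,
    "render response":   8.0,
}

objective_ms = 50.0  # performance objective for this scenario

estimated_ms = sum(scenario_steps.values())
print(f"Estimated response time: {estimated_ms:.1f} ms (objective {objective_ms:.1f} ms)")

if estimated_ms > objective_ms:
    # Point the team at the dominant contributor, a typical PASA-style finding.
    worst = max(scenario_steps, key=scenario_steps.get)
    print(f"Objective missed; dominant cost is '{worst}' ({scenario_steps[worst]:.1f} ms)")
```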

31 Common Goals and Activities of All Methods
Common goal: evaluate and predict the quality attributes as applied to a SWA evaluation analysis
Common activities:
- Evaluating
- Planning and preparation
- Explanation of SWA approaches
- Elicitation of quality-sensitive scenarios
- Analysis of SWA options
- Interpretation and presentation of the evaluation results for final SWA decision making

32 Methods Overall
- SAAM attempts to identify the potential risks to the SWA and assess its modifiability.
- ALMA attempts to predict modifiability based on risk assessment, support costs, and SWA comparisons.
- ATAM analyzes the sensitivity and trade-off points to determine what may prevent realizing the best combination of quality attributes for a given SWA.
- PASA evaluates the performance risks.

33 Differences
- Participants:
  - SAAM and ATAM involve the architects, designers, and the end users
  - ALMA mostly includes only the architect and designer
  - PASA only includes the developers
- SAAM and ATAM are the only two methodologies that come close to providing details on the costs or resource requirements of a SWA evaluation
- ATAM is one of the few, if not the only, evaluation methods that provides sufficiently detailed process steps; the other evaluation methods describe the required activities but do not provide enough granular detail
- All of the methods are influenced by non-technical issues, such as stakeholder interests and political factors, but ATAM is the only method that provides detailed guidelines and techniques for managing these social issues

34 Original Assumption The most common SWA evaluation techniques are similar in the methodology applied, but have evolved to serve different purposes or perspectives for evaluating the quality of a specific SWA.

35 Final Assumption
- The original assumption is somewhat reversed: the methodologies applied are different and unique in their approaches, but they all serve the common goal of determining the best SWA option for a given set of business scenarios
- No single evaluation technique provides a clearly decisive result by specifying a single SWA selection out of multiple available SWA options

36 Conclusion
The various techniques of SAAM, ALMA, ATAM, and PASA use different approaches and steps in the quest to select the best SWA solution. Ultimately, the common goal of all of the techniques is the selection of the SWA that provides maximum usability and satisfies all of the quality attributes defined by the business goals.

37 Questions and Discussion

