Lecture 17: Architecture Tradeoff Analysis Method (ATAM)


Lecture 17: Architecture Tradeoff Analysis Method (ATAM)
CSCE 742 Software Architectures
October 22, 2003
Topics: Evaluating software architectures; what an evaluation should tell you; what can be examined and what can't; ATAM
Next time: Case study: the Nightingale system
Ref: the "Evaluating Software Architectures" book and Chap. 11

Overview
Last time: why and when to evaluate; cost/benefits; techniques; properties of successful evaluations
New: analysis of architectures (ATAM)
Next time: case study
References

Last Time
Why evaluate architectures? When do we evaluate?
Cost/benefits of evaluating a SA; non-financial benefits
Evaluation techniques: Active Design Review, SAAM (SA Analysis Method), ATAM (Architecture Tradeoff Analysis Method), CBAM (Cost Benefit Analysis Method, 2001/SEI)
Planned or unplanned evaluations
Properties of successful evaluations
Results of an evaluation
Principles behind the Agile Manifesto
Conceptual flow of the Architecture Tradeoff Analysis Method (ATAM)

ATAM Overview
ATAM is based upon a set of attribute-specific measures of the system:
Analytic measures, based upon formal models: performance, availability
Qualitative measures, based upon formal inspections: modifiability, safety, security

ATAM Benefits
Clarified quality attribute requirements
Improved architecture documentation
Documented basis for architectural decisions
Risks identified early in the life cycle
Increased communication among stakeholders

Conceptual Flow of ATAM
Figure taken from http://www.sei.cmu.edu/ata/ata_method.html

For What Qualities Can We Evaluate?
From the architecture alone we can't tell whether the resulting system will meet all of its quality goals. Why?
Usability, for example, is largely determined by the user interface, which is typically at a lower level of detail than the SA (see UMLi: http://www.ksl.stanford.edu/people/pp/papers/PinheirodaSilva_ksl_02_04.pdf)
ATAM concentrates on evaluating a SA with respect to certain quality attributes.

ATAM Quality Attributes
Performance, reliability, availability, security, modifiability, portability, functionality
Variability – how well the architecture can be expanded to produce new SAs in preplanned ways (important for product lines)
Subsetability
Conceptual integrity

Non-Suitable Quality Attributes (for ATAM)
Some quality attributes are just too vague to be used as the basis for an evaluation. Examples:
"The system must be robust."
"The system shall be highly modifiable."
Quality attributes are evaluated in some context:
A system is modifiable with respect to a specific kind of change.
A system is secure with respect to a specific kind of threat.

Outputs of an Architecture Evaluation
Prioritized statement of quality attribute requirements
"You've got to be very careful if you don't know where you're going, because you might not get there." – Yogi Berra (who also said, "I didn't really say everything I said.")
Mapping of approaches to quality attributes: how the architectural approaches will achieve, or fail to achieve, the quality attributes; provides some "rationale" for the architecture
Risks and non-risks: risks are potentially problematic architectural decisions
Each specific analysis technique produces further outputs; we will consider ATAM's soon.

Documenting Risks and Non-Risks
Documenting a risk or non-risk consists of:
An architectural decision (or a decision that has not yet been made)
A specific quality attribute response being addressed by that decision
A rationale for the positive or negative effect that the decision has on satisfying the quality attribute
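The three-part structure above can be captured as a simple record. This is only an illustrative sketch: ATAM prescribes the content, not any data format, and the class name, field names, and sample text below are assumptions.

```python
from dataclasses import dataclass

@dataclass
class RiskRecord:
    """One documented risk or non-risk; field names are illustrative, not ATAM-mandated."""
    decision: str      # the architectural decision (or a decision not yet made)
    response: str      # the specific quality attribute response being addressed
    rationale: str     # why the decision's effect is positive or negative
    is_risk: bool      # True for a risk, False for a non-risk

# Hypothetical entry, loosely modeled on the business-logic example on the next slide.
entry = RiskRecord(
    decision="Rules for writing business logic modules are not clearly articulated",
    response="Modifiability may be compromised by replicated functionality",
    rationale="Unarticulated rules can cause unintended coupling of components",
    is_risk=True,
)
print("risk" if entry.is_risk else "non-risk")  # → risk
```

Keeping the rationale as a mandatory field mirrors the method's insistence that every risk or non-risk be argued, not merely asserted.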

Example of a Risk*
"The rules for writing business logic modules in the second tier of your three-tier client-server system are not clearly articulated." (a decision yet to be made)
"This could result in replication of functionality, thereby compromising modifiability of the third tier." (a quality attribute response and its consequences)
"Unarticulated rules for writing business logic can result in unintended and undesired coupling of components." (rationale for the negative effect)
*Example from Evaluating Software Architectures: Methods and Case Studies by Clements, Kazman, and Klein

Example of a Non-Risk*
"Assuming message arrival rates of once per second, a processing time of less than 30 milliseconds, and the existence of one higher-priority process" (the architectural decisions)
"a one-second soft deadline seems appropriate" (the quality attribute response and its consequence)
"since the arrival rate is bounded and the preemptive effects of higher-priority processes are known and can be accommodated" (the rationale)
*Example from Evaluating Software Architectures: Methods and Case Studies by Clements, Kazman, and Klein
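The non-risk above rests on simple response-time arithmetic. Here is a sketch of the check: the 30 ms processing time and one-second deadline come from the example, but the preemption cost of the higher-priority process is an assumed figure, since the slide does not give one.

```python
# Back-of-the-envelope check of the one-second soft deadline.
processing_time = 0.030    # own processing: less than 30 ms (from the example)
preemption_cost = 0.100    # ASSUMED worst-case preemption by the higher-priority process
deadline = 1.000           # one-second soft deadline (from the example)

# The arrival rate is bounded at once per second, so at most one job per period;
# worst-case response is own processing plus the preemption it can suffer.
worst_case_response = processing_time + preemption_cost
print(worst_case_response <= deadline)  # → True
```

The point of the rationale is exactly that such a bound can be computed: if the preemption cost were unknown or unbounded, the same decision would have to be recorded as a risk instead.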

Participants in ATAM
The evaluation team:
Team leader
Evaluation leader
Scenario scribe
Proceedings scribe
Timekeeper
Process observer
Process enforcer
Questioner
Project decision makers – people empowered to speak for the development project
Architecture stakeholders – including developers, testers, …, users, and builders of systems interacting with this one

Evaluation Team Roles and Responsibilities
Team leader:
Sets up the evaluation (sets the evaluation contract), forms the team, interfaces with the client
Oversees the writing of the evaluation report
Evaluation leader:
Runs the evaluation
Facilitates elicitation of scenarios
Administers the scenario selection and prioritization process
Facilitates evaluation of scenarios against the architecture
Scenario scribe:
Records scenarios on a flip chart
Captures (and insists on) agreed wording

Evaluation Team Roles and Responsibilities (cont)
Proceedings scribe:
Captures the proceedings in electronic form: raw scenarios with motivating issues, and the resolution of each scenario when applied to the architecture
Generates a list of adopted scenarios
Timekeeper:
Helps the evaluation leader stay on schedule
Controls the time devoted to each scenario
Process observer: keeps notes on how the evaluation process could be improved
Process enforcer: helps the evaluation leader stay "on process"
Questioner: raises issues of architectural interest that stakeholders may not have thought of

Outputs of the ATAM
A concise presentation of the architecture
Frequently there is "too much"; ATAM forces a one-hour presentation of the architecture, which makes it concise and clear
Articulation of business goals
Quality requirements in terms of collections of scenarios
Mapping of architectural decisions to quality attribute requirements
A set of sensitivity and tradeoff points
E.g., a backup database positively affects reliability (a sensitivity point with respect to reliability) but negatively affects performance (a tradeoff)
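The backup-database example shows the distinction: a decision that materially affects one attribute is a sensitivity point, while one pulling several attributes in opposite directions is a tradeoff point. A hypothetical sketch of that classification; the decision names and effect signs are assumptions for illustration.

```python
# Map each architectural decision to the quality attributes it affects
# ('+' helps, '-' hurts). Contents are illustrative only.
effects = {
    "maintain a backup database": {"reliability": "+", "performance": "-"},
    "index the customer table": {"performance": "+"},
}

# A decision affecting several attributes with mixed signs is a tradeoff point;
# otherwise it is (at most) a sensitivity point for the attributes it touches.
kinds = {}
for decision, attrs in effects.items():
    mixed = len(attrs) > 1 and len(set(attrs.values())) > 1
    kinds[decision] = "tradeoff point" if mixed else "sensitivity point"

print(kinds["maintain a backup database"])  # → tradeoff point
```

Tradeoff points flagged this way are exactly the decisions the evaluation team scrutinizes most closely in Step 6.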

More Outputs of the ATAM
A set of risks and non-risks
A risk is an architectural decision that may lead to undesirable consequences with respect to a stated quality attribute requirement
A non-risk is an architectural decision that, after analysis, is deemed to be safe
Identified risks can form the basis of an "architectural risk mitigation" plan
A set of risk themes
Examine the collection of risks produced, looking for themes that are the result of systematic weaknesses in the architecture
Other outputs: more documentation, rationale, and a sense of community among stakeholders, the architect, …

Phases of the ATAM
Phase | Activity | Participants | Duration
0 | Preparation | Team leadership / key project decision makers | Over a few weeks
1 | Evaluation | Evaluation team and project decision makers | 1 day, plus a hiatus of 2 to 3 weeks
2 | Evaluation with stakeholders | Evaluation team, project decision makers, and stakeholders | 2 days
3 | Follow-up | Evaluation team and client | 1 week

Steps of the Evaluation Phase(s)
1. Present the ATAM
2. Present business drivers
3. Present the architecture
4. Identify architectural approaches
5. Generate the quality attribute utility tree
6. Analyze architectural approaches
(Hiatus and start of Phase 2)
7. Brainstorm and prioritize scenarios
Present results

Phase 0: Partnership
Client – someone who can exercise control over the project whose architecture is being evaluated
Perhaps a manager
Perhaps someone in an organization considering a purchase
Issues that must be resolved in Phase 0:
The client must understand the evaluation method and process (give them a book, show them a video)
The client should describe the system and architecture
A "go/no-go" decision is made here by the evaluation team leader
A statement of work is negotiated
Issues of proprietary information are resolved

Phase 0: Preparation
Forming the evaluation team
Holding an evaluation kickoff meeting
Assignment of roles
It is a good idea not to get into ruts; try varying assignments
Roles are not necessarily one-to-one
The minimum evaluation team size is four: one person can be process observer, timekeeper, and questioner
The team leader's responsibilities are mostly "outside" the evaluation, so that person can double up (often as the evaluation leader)
Questioners should be chosen to represent the spectrum of expertise in performance, modifiability, …

Phase 1: Activities in Addition to the Steps
Organizational meeting of the evaluation team and key project personnel:
Form the schedule
Ensure the right people attend the meetings, that they are prepared (know what is expected of them), and that they have the right attitude
Besides carrying out the steps, the team needs to gather as much information as possible to determine:
Whether the remainder of the evaluation is feasible
Whether more architectural documentation is required
Which stakeholders should be present for Phase 2

Step 1: Present the ATAM
The evaluation leader presents the ATAM to all participants:
To explain the process
To answer questions
To set the context and expectations for the process
Using a standard presentation, the leader briefly discusses the ATAM steps and the outputs of the evaluation

A Typical ATAM Agenda for Phases 1 and 2
Figure 3.9 from Evaluating Software Architectures: Methods and Case Studies by Clements, Kazman, and Klein

Step 2: Present Business Drivers
Everyone needs to understand the primary business drivers motivating and guiding the development of the system.
A project decision maker presents a system overview from the business perspective:
The system's functionality
Relevant constraints: technical, managerial, economic, or political
Business goals
Major stakeholders
The architectural drivers – the quality attributes that shape the architecture

Step 3: Present the Architecture
The lead architect makes the presentation at the appropriate level of detail.
Architecture presentation (~20 slides; 60 minutes):
Architectural drivers, and existing standards/models/approaches for meeting them (2-3 slides)
Important architectural information (4-8 slides):
Context diagram
Module or layer view
Component-and-connector view
Deployment view
Architectural approaches, patterns, or tactics employed

Step 3: Present the Architecture (cont)
Architectural approaches, patterns, or tactics employed
Use of COTS (commercial off-the-shelf) products
Trace of the 1-3 most important use case scenarios
Trace of the 1-3 most important change scenarios
Architectural issues/risks with respect to meeting the driving architectural requirements
Glossary
The presentation should have a high signal-to-noise ratio; don't get bogged down too deeply in details.
It should cover technical constraints such as the operating system, hardware, and middleware.

Step 4: Identify Architectural Approaches
The ATAM analyzes an architecture by focusing on its architectural approaches.
These are captured, but not analyzed (here), by the evaluation team.
The evaluation team should explicitly ask the architect to name these approaches.

Step 5: Generate Quality Attribute Utility Tree

Scenarios
Types of scenarios:
Use case scenarios, e.g., "The user wants to examine budgetary data."
Growth scenarios, e.g., "Change the maximum number of tracks from 150 to 200 and keep the latency from disk to screen at 200 ms or less."
Exploratory scenarios, e.g., "Switch the OS from Unix to Windows" or "Improve availability from 98% to 99.99%."
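Scenarios like these feed the Step 5 utility tree, which hangs concrete scenarios under quality attributes and their refinements and ranks each by (importance, difficulty). A minimal Python sketch: the nesting convention follows the ATAM literature, but the particular attributes, refinements, and H/M/L rankings below are assumptions for illustration.

```python
# Utility tree as nested dicts: quality attribute -> refinement -> scenarios,
# each scenario tagged (importance, difficulty) with H/M/L rankings (assumed here).
utility_tree = {
    "performance": {
        "latency": [
            ("Keep disk-to-screen latency at 200 ms or less with 200 tracks", ("H", "M")),
        ],
    },
    "modifiability": {
        "platform change": [
            ("Switch the OS from Unix to Windows", ("M", "H")),
        ],
    },
}

# Flatten and order so the most important, hardest scenarios are analyzed first (Step 6).
rank = {"H": 0, "M": 1, "L": 2}
flat = [
    (scenario, imp, diff)
    for refinements in utility_tree.values()
    for scenarios in refinements.values()
    for scenario, (imp, diff) in scenarios
]
flat.sort(key=lambda s: (rank[s[1]], rank[s[2]]))
print(flat[0][1], flat[0][2])  # → H M
```

Sorting by importance first reflects the method's intent: high-importance, high-difficulty scenarios are the ones the evaluation team walks through with the architect first.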

Step 6: Analyze Architectural Approaches
The evaluation team examines the highest-priority scenarios one at a time, and the architect is asked how the architecture supports each one.

Step 7: Brainstorm and Prioritize Scenarios
The hiatus: the evaluation team distills what has been learned and informally contacts the architect for more information where needed.
After the hiatus, Phase 2 begins: the stakeholders brainstorm scenarios, which are then prioritized.

References
Evaluating Software Architectures: Methods and Case Studies by Clements, Kazman, and Klein. Addison-Wesley, 2002. Part of the SEI Series in Software Engineering.
SEI software architecture publications: http://www.sei.cmu.edu/ata/pub_by_topic.html