Evaluating Architectures: ATAM

Evaluating Architectures: ATAM
"We evaluate the services that anyone renders to us according to the value he puts on them, not according to the value they have for us." --- Friedrich Nietzsche

Analyzing Architectures
- Before we consider specific methods for evaluating software architectures, we will consider some context.
- A software architecture (SA) tells you important properties of a system even before the system exists.
- Architects know (or can estimate) the effects of architectural design decisions.

Why and When Do We Evaluate?
Why evaluate architectures?
- Because so much is riding on the architecture before it becomes the blueprint for the project.
- Flaws detected during the architecture development stage save a great deal of money.
When do we evaluate?
- As early as possible, even while the SA is still being developed: the earlier problems are found, the earlier and more cheaply they can be fixed.
- Certainly after the architecture is complete, you should validate it before development begins.
- Later, to ensure consistency between design and implementation, especially for legacy systems.
- Before acquiring a new system.
- The real answer: early and often!

Cost / Benefits of Evaluating an SA
Cost: the staff time required of the participants.
- AT&T performed ~300 full-scale SA evaluations, at an average cost of 70 staff-days.
- The SEI ATAM method averages 36 staff-days, plus stakeholder time.
Benefit:
- AT&T estimated that the 300 evaluations saved 10% in project costs.
- Anecdotal evidence of savings from evaluations: one company avoided a multi-million dollar purchase when an evaluation showed the product's inadequacy.
- Anecdotes about evaluations that should have been done: a system rewrite estimated at two years took three rewrites and seven years; a large engineering database system's design decisions prevented integration testing, and the project was cancelled after $20,000,000 had been invested.
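As a rough illustration of this cost/benefit arithmetic, a minimal sketch follows. Only the 36 staff-day average and the 10% savings figure come from the slide; the project sizes in the example are hypothetical.

```python
# Back-of-the-envelope check of whether an architecture evaluation pays off.
# The 36 staff-day cost and 10% savings rate are the figures quoted above;
# the project sizes passed in below are purely illustrative.

def evaluation_pays_off(project_cost_days: float,
                        evaluation_cost_days: float = 36.0,
                        savings_rate: float = 0.10) -> bool:
    """Return True if expected savings exceed the evaluation's staff cost."""
    expected_savings = savings_rate * project_cost_days
    return expected_savings > evaluation_cost_days

# A 2,000 staff-day project with 10% expected savings (200 days) easily
# justifies a 36 staff-day ATAM; a 300 staff-day project (30 days saved) does not.
print(evaluation_pays_off(2000))   # True
print(evaluation_pays_off(300))    # False
```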

Qualitative Benefits
- Forced preparation for the review: presenters will focus on clarity.
- Captured rationale: the evaluation focuses on questions, and answering them yields explanations of design decisions that are useful throughout the software life cycle. Capturing rationale after the fact ("why was that done?") is much more difficult.
- Early detection of problems with the architecture.
- Validation of requirements: discussing how well a requirement is met opens up the requirement itself; some requirements are easy to demand but hard to satisfy.
- Improved architectures, through iterative improvement.

Planned or Unplanned Evaluations
Planned:
- A normal part of the software life cycle, built into the project's work plans, budget, and schedule.
- Scheduled right after completion of the SA (or ???).
- Planned evaluations are "proactive" and "team-building".
Unplanned:
- Held as the result of some serious problem, as a "mid-course correction".
- Can be seen as a challenge to the technical authority of the team.
- Unplanned evaluations are "reactive" and "tension-filled".

Evaluation Techniques
- "Active Design Review" (Parnas, 1985) – pre-architecture.
- SAAM (SEI, 1994) – Software Architecture Analysis Method.
- ATAM (SEI) – Architecture Tradeoff Analysis Method.

Basic Reason for Evaluation
Because the architecture is so vital to the success of any system, assess and evaluate it: identify flaws and risks early, and mitigate those risks before the costs become too high to manage effectively.

Goals of an Architectural Assessment
In a perfect world, the requirements would identify and prioritize the business goals, and the corresponding quality attributes, to be achieved by the system. In practice, you need to collect the system's quality attributes by merging:
- the requirements document, and
- the plans of the project manager.

Next Steps…
After the architectural evaluation, you should:
- Know whether the current architecture is suitable for achieving the system's quality attributes, or have a list of suggested changes for achieving them.
- Have a list of the quality attributes that will be achieved fully and those that will be achieved only partially.
- Have a list of quality attributes that carry associated risks.
- Have a better understanding of the architecture and the ability to articulate it.

Architecture Evaluation Methods
After extensive research, the Carnegie Mellon Software Engineering Institute (SEI) has developed three architecture evaluation methods:
- Architecture Tradeoff Analysis Method (ATAM)
- Software Architecture Analysis Method (SAAM)
- Active Reviews for Intermediate Designs (ARID)
In many experts' opinion, ATAM is the best of the three, and it is the one discussed in more detail here.

ATAM
According to the SEI, to conduct a detailed evaluation you should divide the evaluation process into the following groups of steps:
- Presentation
- Analysis
- Testing
- Report

Presentation
During this phase of the ATAM, the project lead presents the architectural evaluation process, the business drivers behind the project, and a high-level overview of the architectures under evaluation for achieving the stated business needs.

Analysis
In the analysis phase, the architect discusses the different architectural approaches in detail, identifies business needs from the requirements documents, generates a quality attribute utility tree, and analyzes the architectural approaches.

Testing
This phase is similar to the analysis phase: the architect identifies groups of quality attributes and evaluates the architectural approaches against these groups.

Report
During the report phase, the architect documents the ideas and views collected and the list of architectural approaches outlined during the previous phases.

Architecture Tradeoff Analysis Method (ATAM)
"We evaluate the services that anyone renders to us according to the value he puts on them, not according to the value they have for us." --- Friedrich Nietzsche
- Evaluating an architecture is a complicated undertaking.
- A large system means a large architecture.
- The Architecture Business Cycle (ABC) plus Nietzsche's quote yields: a computer system is intended to support business goals, and the evaluation needs to make the connections between those goals and the design decisions.
- Multiple stakeholders: acquiring all perspectives requires careful management.

ATAM – Cont'd
The ATAM is based upon a set of attribute-specific measures of the system:
- Analytic measures, based upon formal models: performance, availability.
- Qualitative measures, based upon formal inspections: modifiability, safety, security.
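As a sketch of what such analytic measures can look like, the formulas below are standard textbook models (steady-state availability from MTBF/MTTR, and the mean response time of an M/M/1 queue); they are illustrative only, and a real evaluation would use models calibrated to the system under review.

```python
# Illustrative analytic measures of the kind an ATAM analysis might draw on.
# Standard textbook formulas, shown only as a sketch.

def steady_state_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Availability = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def mm1_response_time(arrival_rate: float, service_rate: float) -> float:
    """Mean response time of an M/M/1 queue: 1 / (mu - lambda)."""
    if arrival_rate >= service_rate:
        raise ValueError("Unstable system: arrival rate >= service rate")
    return 1.0 / (service_rate - arrival_rate)

# Example: 500 h MTBF and 2 h MTTR give ~99.6% availability;
# 80 req/s offered to a 100 req/s server gives a 50 ms mean response time.
print(round(steady_state_availability(500, 2), 4))   # 0.996
print(mm1_response_time(80, 100))                    # 0.05 seconds
```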

ATAM Benefits
- Clarified quality attribute requirements
- Improved architecture documentation
- A documented basis for architectural decisions
- Risks identified early in the life cycle
- Increased communication among stakeholders

Conceptual Flow of the ATAM
Ref: http://www.sei.cmu.edu/ata/ata_method.html

Participants in the ATAM
- The evaluation team
- Project decision makers
- Architecture stakeholders

Outputs of the ATAM
- A concise presentation of the architecture
- Articulation of the business goals
- Quality requirements expressed as a collection of scenarios
- A mapping of architectural decisions to quality requirements
- A set of identified sensitivity points and tradeoff points
- A set of risks and non-risks
- A set of risk themes
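The scenarios mentioned here follow the six-part form used in Software Architecture in Practice (source, stimulus, artifact, environment, response, response measure). A minimal sketch of how an evaluation team might record one is shown below; the concrete example values are hypothetical, not taken from the slides.

```python
# Minimal sketch of a six-part quality attribute scenario record, following
# the form in Software Architecture in Practice. Example values are hypothetical.
from dataclasses import dataclass

@dataclass
class QualityAttributeScenario:
    source: str            # who or what generates the stimulus
    stimulus: str          # the condition arriving at the system
    artifact: str          # the part of the system that is stimulated
    environment: str       # the conditions under which the stimulus occurs
    response: str          # the activity the system undertakes
    response_measure: str  # how the response is measured

availability_scenario = QualityAttributeScenario(
    source="Internal fault monitor",
    stimulus="Primary application server crashes",
    artifact="Order-processing service",
    environment="Normal operation, peak load",
    response="Fail over to the standby server and resume processing",
    response_measure="No more than 30 seconds of downtime, no lost orders",
)
print(availability_scenario.response_measure)
```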

Phases of the ATAM
- Phase Zero: Partnership and preparation
- Phase One: Evaluation
- Phase Two: Evaluation, continued
- Phase Three: Follow-up

Steps of Evaluation Phase One
- Step 1: Present the ATAM
- Step 2: Present business drivers
- Step 3: Present architecture
- Step 4: Identify architectural approaches
- Step 5: Generate quality attribute utility tree
- Step 6: Analyze architectural approaches

Step 1: Present the ATAM
The evaluation team presents an overview of the ATAM, including:
- The ATAM steps
- Techniques: utility tree generation; architecture elicitation and analysis; scenario brainstorming and mapping
- Outputs: architectural approaches; utility tree; scenarios; risks vs. "non-risks"; sensitivity points and tradeoff points

Step 2: Present Business Goals
Business representatives describe:
- The system's most important (high-level) functions
- Any relevant technical, managerial, economic, or political constraints
- The business goals and context as they relate to the project
- Architectural drivers: the quality attributes that shape the architecture
- Critical requirements: the quality attributes most important to the success of the software
- The major stakeholders

Step 3: Present Architecture
The software architect presents:
- An overview of the architecture
- Technical constraints such as OS, hardware, and languages
- Other systems that interface with the current system
- Architectural approaches used to address the quality attribute requirements
- Important architectural information:
  - Context diagram
  - Views, e.g. module or layer view, component-and-connector view, deployment view

Step 3: Present Architecture – Cont'd
- Architectural approaches, patterns, and tactics employed; which quality attributes they address and how they address them
- Use of COTS (commercial off-the-shelf) components and their integration
- The evaluation team begins probing for and capturing risks:
  - The most important use case scenarios
  - The most important change scenarios
  - Issues/risks with respect to meeting the given requirements

Step 4: Identify Architectural Approaches
- Catalog the evident patterns and approaches, based on Step 3.
- Start to identify the places in the architecture that are key to realizing the quality attribute requirements.
- Identify any architectural patterns useful for the current problems, e.g. client-server, publish-subscribe, redundant hardware.

Step 5: Generate Quality Attribute Utility Tree
- The utility tree is a top-down tool for refining and structuring the quality attribute requirements.
- Select the general, important quality attributes as the high-level nodes, e.g. performance, modifiability, security, and availability.
- Refine them into more specific categories.
- All leaves of the utility tree are scenarios.
- Prioritize the scenarios (most important first, most difficult to achieve first, and so on):
  - Importance with respect to system success: High, Medium, Low
  - Difficulty to achieve: High, Medium, Low
  - (H,H), (H,M), and (M,H) scenarios are the most interesting.
- Present the quality attribute goals in detail.

Step 5: Generate Quality Attribute Utility Tree (Example)
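The example utility tree figure from this slide is not reproduced in the transcript; as a stand-in, here is a minimal sketch of how such a tree and its (importance, difficulty) prioritization might be represented. The attributes, refinements, and scenarios below are hypothetical examples, not the content of the original figure.

```python
# Minimal sketch of a quality attribute utility tree with (importance,
# difficulty) ratings on its leaf scenarios. All example entries are hypothetical.

# tree: quality attribute -> refinement -> list of (scenario, importance, difficulty)
utility_tree = {
    "Performance": {
        "Transaction latency": [
            ("Process a customer order in under 2 seconds at peak load", "H", "M"),
        ],
    },
    "Availability": {
        "Hardware failure": [
            ("Recover from a server crash within 30 seconds", "H", "H"),
        ],
    },
    "Modifiability": {
        "New features": [
            ("Add a new payment provider in under 2 person-weeks", "M", "H"),
            ("Change the report layout in under 1 person-day", "L", "L"),
        ],
    },
}

rank = {"H": 0, "M": 1, "L": 2}

# Flatten the leaves and sort so that (H,H), (H,M), (M,H) scenarios surface first.
leaves = [
    (scenario, importance, difficulty)
    for refinements in utility_tree.values()
    for scenarios in refinements.values()
    for scenario, importance, difficulty in scenarios
]
for scenario, importance, difficulty in sorted(
        leaves, key=lambda leaf: (rank[leaf[1]], rank[leaf[2]])):
    print(f"({importance},{difficulty}) {scenario}")
```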

Step 6: Analyze Architectural Approaches
- Examine the highest-ranked scenarios.
- The goal is for the evaluation team to be convinced that the approach is appropriate for meeting the attribute-specific requirements.
- Use scenario walkthroughs.
- Identify and record a set of sensitivity points, tradeoff points, risks, and non-risks.
- Sensitivity points and tradeoff points are candidate risks.

Sensitivity Points, Tradeoff Points, Risks, and Non-Risks in Step 6
- Sensitivity point: a property of one or more components that is critical for achieving a particular quality attribute response. E.g., for security, the level of confidentiality is sensitive to the number of bits in the encryption key.
- Tradeoff point: a property that is a sensitivity point for more than one attribute. E.g., the encryption level affects both security and performance.
- Risk: a potentially problematic architectural decision (e.g., one exposed by unacceptable response values when tested).
- Non-risk: a good architectural decision.
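As a minimal sketch of how an evaluation team might record these Step 6 findings during a scenario walkthrough (the record structure and example entries are illustrative assumptions; the ATAM prescribes the concepts, not this format):

```python
# Illustrative record of Step 6 findings for one scenario/approach pair.
# The fields and example content are assumptions made for this sketch.
from dataclasses import dataclass, field

@dataclass
class ApproachAnalysis:
    scenario: str
    architectural_decision: str
    sensitivity_points: list = field(default_factory=list)
    tradeoff_points: list = field(default_factory=list)
    risks: list = field(default_factory=list)
    non_risks: list = field(default_factory=list)

analysis = ApproachAnalysis(
    scenario="Protect customer data against eavesdropping",
    architectural_decision="Encrypt all client-server traffic",
    sensitivity_points=[
        "Confidentiality is sensitive to the encryption key length",
    ],
    tradeoff_points=[
        "Key length affects both security (positively) and performance (negatively)",
    ],
    risks=[
        "The chosen key length may push peak-load latency past its target",
    ],
    non_risks=[
        "Key management is isolated in one module, so changing key length is cheap",
    ],
)
print(analysis.tradeoff_points[0])
```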

Next time…
- Continue with the remaining phases.
- Present a case study.

The Essentials
Architectural evaluation is an essential part of system development. Here we have emphasized the importance of architecture and outlined a formal method for evaluating it, the ATAM. Architectural evaluation matters for the success of all systems, irrespective of size, but formal evaluation methods normally become most worthwhile in medium- to large-scale projects.

References
- Paul Clements, Rick Kazman, and Mark Klein, Evaluating Software Architectures: Methods and Case Studies. (Chapter 11)
- CBAM (SEI, 2001) – Cost Benefit Analysis Method. (Chapter 12)
- Len Bass, Paul Clements, and Rick Kazman, Software Architecture in Practice. (Chapter 11)