Architectural Evaluation: Overview, Questioning Techniques
Henrik Bærbak Christensen, DAIMI

Literature

Main
– [Bass et al., 2003] chap. 11 (ATAM): Bass, L., Clements, P., Kazman, R. Software Architecture in Practice. Addison-Wesley.
– [Barbacci et al., 2003] Barbacci, M., Ellison, R., Lattanze, A., Stafford, J., Weinstock, C., and Wood, W. (2003). Quality Attribute Workshops, 3rd Edition. Technical Report CMU/SEI-2003-TR-016, Software Engineering Institute.
– [Bardram et al., 2004] Bardram, J., Christensen, H. B., and Hansen, K. M. (2004). Architectural Prototyping: An Approach for Grounding Architectural Design and Learning. In Proceedings of the 4th Working IEEE/IFIP Conference on Software Architecture (WICSA 2004), pages 15-24, Oslo, Norway.

A Simplified Development Process

NB! Iterative, incremental, parallel, … process?

Rationale for Architectural Evaluation

Software architecture allows or precludes almost all system quality attributes
– Need to evaluate the impact of architectural design on quality attributes
– Should be possible…
Architecture needs to be designed early
– E.g., a prerequisite for work assignment
Architecture embodies fundamental design decisions
– Hard/costly to change these
– The later the change, the costlier it is
Cheap techniques for architectural evaluation exist

When?

Architecture designed, design not implemented
– I.e., early in a cycle in an iteration
– E.g., “We have designed this architecture; will it possibly fit the quality requirements?”
Early
– Evaluate design decisions, either decisions already made or decisions merely considered
– An architecture description may not be available; completeness and fidelity are a product of the state of the architectural description
– E.g., “Will any architecture meet these quality requirements?”
  “You can have any combination of features the Air Ministry desires, so long as you do not also require that the resulting airplane fly” (Messerschmitt)
Late
– Evaluate the architecture of an existing system: possible redesign, evolution, …
– E.g., “We need to integrate with this product; will it be suitable in our architecture?”

Who?

Roles
– Evaluation team
  Preferably not project staff (for objectivity reasons…)
– Project stakeholders
  Articulate requirements; both “garden variety” stakeholders and decision makers
  Producers: software architect, developer, maintainer, integrator, standards expert, performance engineer, security expert, project manager, reuse czar
  Consumers: customer, end user, application builder (for a product line)
  Servicers: system administrator, network administrator, service representatives
Needs depend on the actual type of evaluation

What?

A probe of the suitability of the architecture(s)
– Will the system meet the quality goals?
– Will it still be buildable?
Context-specific
– E.g., modifiability depends on the modification scenarios
Need articulated goals
– A major part of the evaluation process
– Particularly in large, organizationally complex projects
The result is most often not a scalar
– Not a grade on a scale (“00, 03, …, 13”)
– Often the focus is on finding risks: which decisions affect which qualities?

Questioning Techniques

Analytical
– Any project area; typically a review
– Any state of completeness
Types
– Questionnaires
– Checklists
– Scenario-based methods

Overview: Checklists / Questionnaires

Questionnaires

A list of relatively open questions
– Apply to all architectures
Process and product questions
– Cf. the Capability Maturity Model (CMM) for software
E.g., Kruchten

Checklists

Detailed questions
– Based on experience
– Domain-specific
E.g., AT&T

Overview: Scenario-Based Methods

Scenario-Based Methods

Scenarios
– A description of an interaction with the system from the point of view of a stakeholder
– System-specific: developed as part of the project
– Cf. quality attribute scenarios as in [Bass et al., 2003]
Types
– Quality Attribute Workshops (QAW)
  A structured way of involving stakeholders in scenario generation and prioritization
– Architecture Tradeoff Analysis Method (ATAM)
  Creates utility trees to represent quality attribute requirements
  Analyzes architectural decisions to identify sensitivity points, tradeoffs, and risks
– Software Architecture Analysis Method (SAAM)
  Brainstorming of modifiability and functionality scenarios
  Scenario walkthrough to verify functionality support and to estimate change costs
– Active Reviews for Intermediate Designs (ARID)
  An Active Design Review of the software architecture
  E.g., “Is the performance of each component adequately specified?” vs. “For each component, write down its maximum execution time and list the shared resources that it may consume”
– Survivable Network Analysis method (SNA)
  Focus on survivability as the quality attribute
  Process: determine essential components based on essential services and assets; map intrusion scenarios onto the architecture to find “soft-spot” components (essential, but vulnerable); analyze these with respect to resistance to, recognition of, and recovery from attacks

Quality Attribute Workshops (QAW)

A facilitated method that engages stakeholders in discovering and prioritizing quality attribute requirements
– [Barbacci et al., 2003]
– Typically started before the architecture is designed/finished
Steps
1. QAW Presentation and Introductions
2. Business/Mission Presentation
3. Architectural Plan Presentation
4. Identification of Architectural Drivers
5. Scenario Brainstorming
6. Scenario Consolidation
7. Scenario Prioritization
8. Scenario Refinement

QAW Steps (2)

1. Presentation and Introductions
2. Business/Mission Presentation
– Stakeholders present the drivers for the system
  E.g., profitability, time-to-market, better quality of service, …
– High-level functional requirements, constraints, quality attribute requirements
3. Architectural Plan Presentation
– How will the business/mission drivers be satisfied?
– Key technical requirements
  E.g., mandated technical decisions
– Existing, preliminary architectural descriptions
4. Identification of Architectural Drivers
– Keys to realizing the quality attribute goals of the system
  E.g., key requirements, business drivers, quality attributes
– A distilled version of steps 2 and 3

QAW Steps (3)

5. Scenario Brainstorming
– Goal: come up with as many well-formed quality attribute scenarios as possible
  Stimulus, environment, response
– Participants come up with quality attribute scenarios
  No critique as such, only clarification questions
– The facilitator writes scenarios on the whiteboard and ensures that they are usable
  “The system shall be modifiable” vs. “The user interface of … is changed to a different look & feel in two person-days”
  Makes sure the architectural drivers are covered
– Runs either for a fixed time period or until participants run out of good ideas
– It is usually easy to create scenarios

QAW Steps (4) – Scenario Types

Use case scenarios
– Expected use of the system
Growth scenarios
– Anticipated changes to the system
Exploratory scenarios
– Unanticipated stresses to the system

QAW Steps (5)

6. Scenario Consolidation
– Merge similar scenarios
– Requires stakeholder agreement; majority consensus if need be
7. Scenario Prioritization
– Each stakeholder has N = ceil(#scenarios * 0.3) votes on the consolidated scenarios
– Round-robin voting in two passes; in each pass a stakeholder allocates half of his/her votes (see the sketch below)
– The resulting vote count gives the prioritization, typically into high, medium, and low priority
8. Scenario Refinement
– Develop the high-priority scenarios according to the scheme of [Bass et al., 2003]
  Stimulus, source of stimulus, environment, artifact stimulated, response, response measure
– Describe the relevant quality attributes
– Elicit questions and issues
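
To make the step 7 arithmetic concrete, here is a minimal Java sketch of the vote allocation; the class name and the scenario count are made up for illustration and are not part of the QAW definition.

```java
/** Illustrative sketch of QAW step 7 vote arithmetic. */
public class QawVoting {

  /** Each stakeholder gets ceil(0.3 * number of consolidated scenarios) votes. */
  static int votesPerStakeholder(int scenarioCount) {
    return (int) Math.ceil(scenarioCount * 0.3);
  }

  public static void main(String[] args) {
    int scenarios = 18;                         // hypothetical consolidated scenario count
    int votes = votesPerStakeholder(scenarios); // ceil(18 * 0.3) = 6
    int firstPass = (votes + 1) / 2;            // half of the votes, rounded up
    int secondPass = votes - firstPass;         // the remaining votes
    System.out.printf("%d scenarios: %d votes per stakeholder (%d + %d over two passes)%n",
        scenarios, votes, firstPass, secondPass);
  }
}
```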

QAW Results

Outputs
– List of architectural drivers
– Raw scenario list
– Prioritized scenario list
– Refined scenarios
Uses
– Update the architectural vision, refine the requirements
– Guide design and development
– Facilitate further analysis/evaluation
  E.g., ATAM

Architecture Tradeoff Analysis Method (ATAM)

Context
– “Large US public projects”
  Government, military, space: DoD, NASA, …
Goals
– Reveal how well the architecture satisfies the quality goals
– Give insight into tradeoffs among quality attributes
– Identify risks
A structured method
– Repeatable analysis: should yield the same results when applied twice
– Guides its users to look for conflicts and resolutions
Parts
– Presentation
– Investigation and analysis
– Testing
– Reporting

ATAM Steps (1) – Presentation

1. Present the ATAM
– The evaluation leader describes the method…
2. Present Business Drivers
– Most important functions
– Relevant constraints: technical, managerial, economic
– Business goals
– Major stakeholders
– Architectural drivers: the major quality attribute goals that shape the architecture
3. Present the Architecture
– In terms of views…
– Focus on how the drivers are met

ATAM Steps (2) – Investigation and Analysis

4. Identify Architectural Approaches Used
– The architect names the approaches used
  Approach: here, a set of architectural decisions, e.g., the tactics that are used
– Goal: eventually match desired qualities with decisions

ATAM Steps (3) – Investigation and Analysis

5. Generate the Quality Attribute Utility Tree
– Decision makers refine the most important quality attribute goals
  Quality: Performance
  Refinement: Data latency
  Scenario: Deliver video in real time
– Specify a prioritization for each scenario
  Importance with respect to system success: High, Medium, Low
  Difficulty in achieving: High, Medium, Low
  (H,H), (H,M), (M,H) are the most interesting (see the sketch below)
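
As an illustration of the data structure (not part of the ATAM definition; all names below are made up), a utility tree maps quality attributes to refinements to prioritized scenarios:

```java
import java.util.*;

/** Illustrative sketch of an ATAM utility tree; all names are made up. */
public class UtilityTree {
  enum Rank { HIGH, MEDIUM, LOW }

  /** A leaf of the tree: a concrete scenario prioritized by (importance, difficulty). */
  record Scenario(String description, Rank importance, Rank difficulty) {}

  /** (H,H), (H,M), and (M,H) scenarios deserve the most analysis attention. */
  static boolean interesting(Scenario s) {
    return s.importance() != Rank.LOW && s.difficulty() != Rank.LOW
        && (s.importance() == Rank.HIGH || s.difficulty() == Rank.HIGH);
  }

  public static void main(String[] args) {
    // quality attribute -> refinement -> scenarios
    Map<String, Map<String, List<Scenario>>> tree = Map.of(
        "Performance", Map.of(
            "Data latency", List.of(
                new Scenario("Deliver video in real time", Rank.HIGH, Rank.HIGH))));

    tree.forEach((quality, refinements) ->
        refinements.forEach((refinement, scenarios) ->
            scenarios.stream().filter(UtilityTree::interesting).forEach(s ->
                System.out.printf("%s / %s: %s (%s,%s)%n", quality, refinement,
                    s.description(), s.importance(), s.difficulty()))));
  }
}
```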

ATAM Steps (4) – Investigation and Analysis

6. Analyze Architectural Approaches
– Match the quality requirements in the utility tree with the architectural approaches
  How is each high-priority scenario realized?
  Quality- and approach-specific questions are asked, e.g., about well-known weaknesses of an approach
– Decision makers identify
  Sensitivity points: a property of one or more components that is critical in achieving a particular quality attribute response
  E.g., security: the level of confidentiality is sensitive to the number of bits in the encryption key
  Tradeoff points: a property that is a sensitivity point for more than one attribute
  E.g., the encryption level affects both security and performance, in particular if hard real-time guarantees are required
  Risks: potentially unacceptable values of responses

ATAM Steps (6) – Testing

7. Brainstorm and Prioritize Scenarios
– Akin to QAW, but with a larger group of stakeholders
– Place the results in the utility tree
8. Analyze Architectural Approaches
– As in step 6, now driven by the newly prioritized scenarios
9. Present Results
– Outputs
  Documented architectural approaches
  The set of scenarios and their prioritizations
  The utility tree
  Risks and non-risks
  Sensitivity and tradeoff points

ATAM and QAW – Discussion

How do we ensure completeness?
– And thus, eventually, the suitability of the evaluated architecture
Does process quality imply product quality?
How do the approaches fare in an iterative setting?
– Scenarios etc. would need to be reworked iteratively…

Overview: Measuring Techniques

Measuring Techniques

Prerequisite
– Artifacts to do measurements on…
May answer specific quality attribute scenarios
– Cf. experimental vs. exploratory prototyping
Types
– Metrics
– Simulations, prototypes, experiments
– Rate-Monotonic Analysis
– ADL-based analysis

Metrics

A quantitative interpretation of observable measurements of a software architecture
– Cf. [ISO, 2001]
E.g., measuring complexity to predict modifiability
– For real-time object-oriented telecommunication systems
  Number of events reacted to
  Number of {asynchronous, synchronous} calls made
  Number of component clusters (object decomposition units, e.g., a car as wheels, transmission, steering, …)
  Depth of inheritance tree
  …
E.g., flow metrics to predict reliability (see the sketch below)
– Call-graph metrics
  Number of modules used by a module
  Total calls to other modules
  Unique calls to other modules
– Control-flow metrics
  Number of if-then conditional arcs
  Number of loop arcs, given a control-flow graph
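
A minimal sketch of how the call-graph metrics could be computed; the map-based graph representation and the module names are made up for illustration.

```java
import java.util.*;

/** Illustrative call-graph metrics; the graph representation is made up. */
public class CallGraphMetrics {
  public static void main(String[] args) {
    // module -> modules it calls, one entry per call site (so duplicates count)
    Map<String, List<String>> calls = Map.of(
        "ui",          List.of("logic", "logic", "persistence"),
        "logic",       List.of("persistence"),
        "persistence", List.of());

    for (var e : calls.entrySet()) {
      int total  = e.getValue().size();                 // total calls to other modules
      int unique = new HashSet<>(e.getValue()).size();  // unique calls to other modules
      long usedBy = calls.values().stream()             // modules that use this module
          .filter(targets -> targets.contains(e.getKey()))
          .count();
      System.out.printf("%-12s total=%d unique=%d usedBy=%d%n",
          e.getKey(), total, unique, usedBy);
    }
  }
}
```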

Simulations, Prototypes, Experiments

Architectural prototyping [Bardram et al., 2004]
– Exploratory: finding solutions
– Experimental: evaluating solutions

Rate-Monotonic Analysis

A static, quantitative analysis
– For ensuring dependability in hard real-time systems
– Assumes preemptive multitasking: always execute the task with the highest priority
Basic idea
– Assign a priority to each process according to its period (rate): the shorter the period, the higher the priority
RMA ensures schedulability of the processes
– No process misses its execution deadline
– Worst-case schedulable utilization bound: W(n) = n(2^(1/n) − 1), which decreases towards ln 2 ≈ 0.69 as n → ∞ (see the sketch below)
– Guarantees schedulability if the concurrency model is observed in the implementation
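
A minimal sketch of the utilization-bound test behind RMA (the Liu & Layland bound); the task set below is made up for illustration. A set of periodic tasks with worst-case execution times C_i and periods T_i is guaranteed schedulable under rate-monotonic priorities if the total utilization Σ C_i/T_i does not exceed W(n); exceeding the bound is inconclusive, not a proven failure.

```java
/** Illustrative rate-monotonic utilization-bound test; task data is made up. */
public class RmaCheck {

  /** Liu & Layland worst-case schedulable utilization for n tasks: n(2^(1/n) - 1). */
  static double bound(int n) {
    return n * (Math.pow(2, 1.0 / n) - 1);
  }

  public static void main(String[] args) {
    double[] c = { 20, 40, 100 };   // worst-case execution times (ms)
    double[] t = { 100, 150, 350 }; // periods (ms); shorter period => higher priority

    double utilization = 0;
    for (int i = 0; i < c.length; i++)
      utilization += c[i] / t[i];

    double w = bound(c.length);     // W(3) ~ 0.780; W(n) -> ln 2 ~ 0.693 as n grows
    System.out.printf("U = %.3f, W(%d) = %.3f -> %s%n", utilization, c.length, w,
        utilization <= w ? "schedulable" : "inconclusive: needs exact analysis");
  }
}
```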

Automated Tools

UML-based
– Simulation
– Semantic checks
– Code generation to ensure conformance
– …

Summary

A wide range of evaluation approaches exists
– Questioning techniques
– Measuring techniques
The main goal is establishing the suitability of an architecture
– Does the architecture fulfill the quality goals?
– Is the architecture buildable within the project constraints?
The utility of an approach depends on context
– Project state
– Expertise
– Tools used
– Domain
– …