
Slide 1: T-76.5650 Software Engineering Seminar
Quality attributes within software architectures
Varvana Myllärniemi, SoberIT (Software Business and Engineering Institute), Helsinki University of Technology, 2006

Slide 2: Outline
 Software architecture and quality attributes
 Understanding quality attributes within the architectural context
 Achieving quality attributes in architecture design
 Evaluating quality attributes from architecture

Slide 3: Software architecture and quality attributes

Slide 4: Software architecture
 Architecture is the fundamental organization of a system embodied in its components, their relationships to each other and to the environment, and the principles guiding its design and evolution (IEEE Std 1471-2000)
 Every software system has an architecture, whether it is documented or not (Bass et al., 2003)
 Why architectures are important (Bass et al., 2003):
    Architecture is a common vehicle for stakeholder communication
    Architecture manifests the earliest design decisions

Slide 5: Why is architecture important for QAs?
 Software architecture substantially determines many of the system's quality attributes (Bass et al., 2003)
 If the architecture has not been designed to support certain quality attributes, it is hard to add that support through detailed design and implementation alone
    The realisation of many quality attributes does not depend on one module alone
    The decomposition of the system affects its quality attributes
 In other words, many quality attributes are architecturally sensitive
    Software architecture is critical to their realisation
    These quality attributes should be designed in and can be evaluated at the architectural level (Bass et al., 2003)

Slide 6: Not all QAs are architectural
 Architecture, by itself, is unable to achieve quality attributes (Bass et al., 2003)
    Architecture provides a foundation, not a guarantee
    Detailed design and implementation also have an effect
 Many aspects of quality attributes are not architecturally sensitive
 Some quality attributes are more architecturally sensitive than others
    E.g. modifiability vs. usability?

Slide 7: Functionality vs. QAs within SA
 The mapping of the system's functionality onto software structures determines the architecture's support for quality attributes
 However, functionality itself is not architecturally sensitive
    Functionality is orthogonal to structure
    Any number of possible structures can be conceived to implement any given functionality
    If achieving the functionality were the only requirement, the system could exist as a single monolithic component with no internal structure at all (Bass et al., 2003)

Slide 8: Understanding quality attributes within the architectural context

Slide 9: Why are requirements not enough?
 Quality requirements are often poorly defined, stated at too high a level, or completely missing
    Therefore, requirements may not communicate enough
    Misunderstandings and differing interpretations
 Scenarios provide a simple yet powerful means of bridging the gap between requirements and architecture
    They concretise requirements and communicate them to all stakeholders
    They highlight the most critical aspects of the system

Slide 10: Scenarios
 A scenario is a short statement describing an interaction of one of the stakeholders with the system (Clements et al., 2002)
    Note: a stakeholder instead of a user
 An architectural scenario is a crisp, concise description of a situation that the system is likely to face in its production environment, along with the definition of the response required of the system (Rozanski and Woods, 2005)

Slide 11: A scenario is not a use case
 Although sometimes treated as synonyms, scenarios are not use cases
 A scenario is an instance of a use case (Kruchten, 1995; Jacobson, 1995)
    A use case describes a set of sequences of actions
    A scenario is a concrete instantiation of a use case
 Use cases are often treated more formally (Jacobson, 1995)
 Use cases describe the system strictly from a black-box viewpoint, whereas scenarios are often linked to the internal elements of the system (Jacobson, 1995)

Slide 12: A scenario is not a use case (continued)
 However, not all scenarios are instances of use cases
 Scenarios can capture many aspects that cannot be captured by use cases:
    Quality requirements
       Important from the viewpoint of SA
       If only use cases are used to capture requirements, QA requirements are easily forgotten
       The focus of this course!
    Scenarios initiated by the system itself, or by the passing of time

Slide 13: Uses for scenarios
 Input for the architectural design (AD) process
    A driver to help designers discover architectural elements during AD (Kruchten, 1995)
 Architecture evaluation
 Communication among stakeholders
 Concretisation of requirements
 Identifying missing requirements
 Driving testing (Rozanski and Woods, 2005)

Slide 14: Scenario elicitation
 Sources for scenarios:
    Requirements
    Stakeholders
    Experience
 After scenarios have been identified, it is important to prioritise them
    By importance to stakeholders and by implementation risk
 Focus should be on the most critical scenarios
    Too many scenarios lead to confusion
    A manageable set of scenarios = max. 15 to 20 (Rozanski and Woods, 2005)

Slide 15: Types of scenarios
 Rozanski and Woods (2005):
    Functional scenarios
    System quality scenarios
 Clements et al. (2002):
    Use case scenarios: a user's interaction with a completed, running system; can be used to capture certain quality aspects
    Growth scenarios: anticipated changes to the system
    Exploratory scenarios: push the envelope and stress the system

Slide 16: Format of quality attribute scenarios
 Consists of six parts (Bass et al., 2003):
    Source of stimulus
    Stimulus
    Environment
    Artifact
    Response
    Response measure
 Unnecessary parts may be omitted
 (A small data-structure encoding of the format is sketched below)
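To make the six-part format concrete, here is a minimal sketch encoding it as a data structure. This is an illustrative aid only, not part of Bass et al.'s method; the class and field names are my own.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class QualityAttributeScenario:
    """Six-part quality attribute scenario (Bass et al., 2003).

    Parts that do not apply may be left as None, reflecting the
    rule that unnecessary parts may be omitted.
    """
    source: Optional[str]            # who or what generates the stimulus
    stimulus: Optional[str]          # the condition arriving at the system
    environment: Optional[str]       # e.g. normal mode, overload mode
    artifact: Optional[str]          # the part of the system stimulated
    response: Optional[str]          # the activity after the stimulus arrives
    response_measure: Optional[str]  # how the response is measured
```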

Slide 17: Performance scenario parts (Bass et al., 2003)
 Source: internal or external source(s)
 Stimulus: periodic, sporadic, or stochastic events arrive
 Artifact: system
 Environment: normal mode; overload mode
 Response: processes stimuli; changes level of service
 Response measure: latency; deadline; throughput; jitter; miss rate; data loss

Slide 18: Example performance scenario
 Users initiate 1000 transactions per minute stochastically under normal operation, and these transactions are processed with an average latency of two seconds
 Broken into the six parts:
    Source: users
    Stimulus: initiate 1000 transactions per minute, stochastically
    Artifact: system
    Environment: under normal operation
    Response: transactions are processed
    Response measure: average latency of two seconds
 (An encoding of this scenario follows below)
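As a usage example, the scenario above could be written with the QualityAttributeScenario dataclass sketched under slide 16. The latency check at the end is a hypothetical illustration of the point that a response measure makes the scenario testable; the measurement values are invented.

```python
# The example scenario, encoded with the dataclass from the earlier sketch.
perf_scenario = QualityAttributeScenario(
    source="users",
    stimulus="initiate 1000 transactions per minute, stochastically",
    environment="under normal operation",
    artifact="system",
    response="transactions are processed",
    response_measure="average latency of two seconds",
)

# A response measure is useful because it is checkable: given measured
# latencies (in seconds, hypothetical values), verify the 2 s average.
measured_latencies = [1.2, 0.8, 3.1, 1.9]
assert sum(measured_latencies) / len(measured_latencies) <= 2.0
```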

Slide 19: Availability scenario parts (Bass et al., 2003)
 Source: internal or external
 Stimulus: a fault: omission, crash, timing, wrong response
 Artifact: the resource that should be available: processor, communication channel, persistent storage, processes
 Environment: normal mode; degraded mode
 Response: detects the event and does one or more of the following:
    Records it
    Notifies appropriate parties
    Disables sources of events
    Is unavailable for a predefined time interval
    Continues to operate in normal or degraded mode
 Response measure: time interval when the system must be available; availability time; time interval in which the system can be in degraded mode; repair time

Slide 20: Example availability scenario
 An unanticipated external message is received by a process during normal operation. The process informs the operator of the receipt of the message, and the system continues to operate with no downtime.
 Broken into the six parts:
    Source: external to the system
    Stimulus: unanticipated message
    Artifact: process
    Environment: under normal operation
    Response: inform the operator, continue to operate
    Response measure: no downtime

Slide 21: Modifiability scenario parts (Bass et al., 2003)
 Source: end user, developer, system administrator
 Stimulus: wishes to add / delete / modify / vary functionality, a quality attribute, or capacity
 Artifact: system user interface, platform, environment, or a system that interoperates with the target system
 Environment: at runtime, compile time, build time, or design time
 Response: locates the places in the architecture to be modified; makes the modification without affecting other functionality; tests the modification; deploys the modification
 Response measure: cost in terms of the number of elements affected, effort, or money; the extent to which the modification affects other functions or quality attributes

Slide 22: Example modifiability scenario
 A developer wishes to change the UI code at design time; the modification is made in three hours with no side effects.
 Broken into the six parts:
    Source: developer
    Stimulus: wishes to change the UI
    Artifact: code
    Environment: at design time
    Response: modification is made with no side effects
    Response measure: in three hours

Slide 23: Achieving quality attributes in architecture design

Slide 24: Outline
 Architectural design processes
 Styles
 Patterns
 Architectural tactics

Slide 25: Architectural design process
 Since quality attributes should be designed in at the architectural level (Bass et al., 2003), an architectural design process should take quality attributes into account
 Typical activities in such a process:
    Gather input
    Design
    Evaluate
    Iterate

Slide 26: Functionality-based architectural design by Bosch (2000)
 First draft the architecture to satisfy only the functional requirements, then iteratively transform it to match the QA requirements (Bosch, 2000; Bosch & Molin, 1999)
 Process: from the requirement specification, perform functionality-based architectural design; estimate the quality attributes of the resulting architecture; if not acceptable, apply an architecture transformation and re-estimate; if acceptable, the application architecture is done
 QA-optimizing solutions used in the transformations: architectural styles, architectural patterns, design patterns, transforming a QA into functionality

Slide 27: Architecture definition process by Rozanski and Woods (2005)
 Quality attributes are taken into account from the start
    Scenarios
    Styles
 Evaluation and iteration, if the architecture does not fulfil its requirements
 Input for the process: architecturally significant requirements, context and scope
 Process: identify scenarios; identify relevant architectural styles; produce a candidate architecture; explore architectural options; evaluate the architecture with stakeholders; if not acceptable, rework the architecture and/or the requirements and repeat (Rozanski & Woods, 2005)

Slide 28: Attribute-Driven Design (ADD) (Bass et al., 2001)
 A decomposition method that relies on:
    Architectural drivers
    Styles and tactics
    Quality scenarios
 One level of decomposition is designed and evaluated at a time:
    1. Select a design element to decompose (initially the whole system)
    2. Identify the architectural drivers (either functional or quality drivers)
    3. Choose styles and tactics that satisfy the architectural drivers
    4. Instantiate design elements and allocate functionality to them
    5. Verify use cases and quality scenarios, and make them constraints for the child design elements
 Repeat for every child design element that needs further decomposition, until all design elements are decomposed

Slide 29: How to achieve QAs in the design
 Design processes rely on the architect to find a solution that satisfies the quality attribute requirements
 Mechanisms mentioned in the literature for achieving desired quality attributes during design construction:
    Architectural styles
    Patterns
    Transforming a QA into functionality (Bosch & Molin, 1999)
    Tactics (Bass et al., 2003)
    Perspectives (Rozanski & Woods, 2005)

Slide 30: Architectural styles
 Definitions:
    An architectural style defines a vocabulary of components and connector types, and a set of constraints on how they can be combined (Shaw & Garlan, 1996)
    An architectural style expresses a fundamental structural organisation schema for a software system. It provides a set of predefined element types, specifies their responsibilities, and includes rules and guidelines for organising the relationships between them. (Buschmann, 1996)
 Large-scale best practices for solving known problems
 Recorded effect on quality attributes
    For example, a layered style enhances modifiability but hinders performance (see the sketch below)
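To illustrate the modifiability/performance trade-off just mentioned, here is a minimal, hypothetical layered sketch: each layer talks only to the one directly below it, so a layer can be swapped out without touching the others, at the cost of extra call indirection on every request. All class and method names are invented for illustration.

```python
class DataLayer:
    """Bottom layer: owns the storage details."""
    def __init__(self):
        self._rows = {1: "alice", 2: "bob"}

    def fetch(self, key):
        return self._rows.get(key)


class LogicLayer:
    """Middle layer: business rules; knows only the layer below."""
    def __init__(self, data: DataLayer):
        self._data = data

    def user_name(self, user_id):
        name = self._data.fetch(user_id)
        return name.title() if name else "<unknown>"


class UILayer:
    """Top layer: presentation; knows only the layer below."""
    def __init__(self, logic: LogicLayer):
        self._logic = logic

    def render(self, user_id):
        return f"User: {self._logic.user_name(user_id)}"


# Replacing DataLayer with, say, a database-backed class would not
# touch UILayer; but every render pays for two layers of indirection.
print(UILayer(LogicLayer(DataLayer())).render(1))  # User: Alice
```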

Slide 31: Common architectural styles
 One of the first listings, by Shaw & Garlan (1996):
    Pipes and filters (sketched in code below)
    Batch sequential
    Main program and subroutines
    Object-oriented systems
    Layers
    Communicating processes
    Event-based systems
    Interpreters
    Rule-based systems
    Databases
    Hypertext systems
    Blackboards
 Further styles listed by Rozanski & Woods (2005):
    Client-server
    Tiered computing
    Peer-to-peer
    Publisher-subscriber
    Asynchronous data replication
    Distribution tree
    Integration hub
    Tuple space
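As one example from the first list, the pipes-and-filters style maps naturally onto Python generators: each filter consumes a stream and yields a transformed stream. This is a toy sketch, not a reference implementation of the style.

```python
def source(lines):
    """Source filter: emit raw items into the pipeline."""
    yield from lines

def strip_blanks(items):
    """Filter: drop empty items from the stream."""
    for item in items:
        if item.strip():
            yield item

def upper(items):
    """Filter: transform each item."""
    for item in items:
        yield item.upper()

# Plain function composition acts as the "pipe" connecting filters;
# items flow through lazily, one at a time.
pipeline = upper(strip_blanks(source(["hello", "", "world"])))
print(list(pipeline))  # ['HELLO', 'WORLD']
```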

Slide 32: Patterns
 Definition:
    A design pattern provides a schema for refining the elements of a software system or the relations between them. It describes a commonly occurring structure of interconnected design elements that solves a general design problem within a particular context. (Buschmann, 1996)
 Patterns are more fine-grained than styles
    Design-level solutions
 Recorded best practices, with known effects on quality attributes
 Sometimes a distinction is made between architectural patterns and design patterns
 A book about pattern-oriented architectures: Buschmann (1996)

Slide 33: Tactics
 Typically, one style or pattern implements several mechanisms for achieving a quality attribute requirement
 A primitive mechanism for achieving a quality attribute is called an architectural tactic (Bass et al., 2003)
 Relation to styles and patterns:
    Tactics are the foundational "building blocks" of design, from which patterns and styles are created (Bass et al., 2003)
    Patterns and styles are more concrete, and perhaps more easily applied

Slide 34: Performance tactics (Bass et al., 2003)
 Stimulus: events arrive. Desired response: a response is generated within time constraints.
 Resource demand: increase computation efficiency; reduce computational overhead; manage event rate; control frequency of sampling
 Resource management: introduce concurrency; maintain multiple copies; increase available resources
 Resource arbitration: scheduling policy
 (The "maintain multiple copies" tactic is sketched below)
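As a tiny illustration of one resource-management tactic, "maintain multiple copies" is often realised as caching: keep a local copy of data so repeated stimuli avoid the expensive resource demand. The function and data below are hypothetical stand-ins.

```python
from functools import lru_cache

@lru_cache(maxsize=1024)
def exchange_rate(currency: str) -> float:
    # Stand-in for an expensive remote call; the cache keeps a local
    # copy, so repeated requests are served without the slow lookup.
    print(f"fetching rate for {currency}...")
    return {"EUR": 1.0, "USD": 1.08}.get(currency, 0.0)

exchange_rate("USD")  # slow path: performs the "remote" fetch
exchange_rate("USD")  # fast path: served from the cached copy
```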

Slide 35: Availability tactics (Bass et al., 2003)
 Stimulus: a fault occurs. Desired response: the fault is masked or a repair is made.
 Fault detection: ping/echo; heartbeat; exceptions (see the sketch below)
 Recovery (preparation and repair): voting; active redundancy; passive redundancy; spare
 Recovery (reintroduction): shadow; state resynchronisation; rollback
 Prevention: removal from service; transactions; process monitor
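A heartbeat detector can be sketched in a few lines: a monitor flags a component as failed if its most recent heartbeat is older than a timeout. All names and thresholds here are illustrative assumptions, not part of any particular framework.

```python
import time

class HeartbeatMonitor:
    """Flag components whose last heartbeat is older than `timeout` seconds."""
    def __init__(self, timeout: float = 5.0):
        self.timeout = timeout
        self._last_beat: dict[str, float] = {}

    def beat(self, component: str) -> None:
        # Called periodically by each monitored component.
        self._last_beat[component] = time.monotonic()

    def failed(self) -> list[str]:
        # Any component whose heartbeat has gone stale is reported.
        now = time.monotonic()
        return [c for c, t in self._last_beat.items()
                if now - t > self.timeout]

monitor = HeartbeatMonitor(timeout=5.0)
monitor.beat("billing-service")
print(monitor.failed())  # [] while heartbeats keep arriving in time
```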

Slide 36: Security tactics (Bass et al., 2003)
 Stimulus: an attack. Desired response: the system detects, resists, or recovers from the attack.
 Resisting attacks: authenticate users; authorise users; maintain data confidentiality; maintain integrity; limit exposure; limit access (authorisation is sketched below)
 Detecting attacks: intrusion detection
 Recovering from an attack: restoration (see availability tactics); identification (audit trail)
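The "authorise users" tactic might look like the following sketch: a decorator that refuses calls from users lacking a required role. The user model, role names, and operations are invented purely for illustration.

```python
from functools import wraps

def require_role(role):
    """Resisting attacks: refuse requests from users lacking `role`."""
    def decorator(func):
        @wraps(func)
        def wrapper(user, *args, **kwargs):
            if role not in user.get("roles", ()):
                raise PermissionError(f"{user['name']} lacks role {role!r}")
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@require_role("admin")
def delete_account(user, account_id):
    return f"account {account_id} deleted by {user['name']}"

print(delete_account({"name": "eve", "roles": ["admin"]}, 42))
# A user without the "admin" role would raise PermissionError instead.
```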

Slide 37: Modifiability tactics (Bass et al., 2003)
 Stimulus: changes arrive. Desired response: changes are made, tested, and deployed within time and budget.
 Localise changes: semantic coherence; anticipate expected changes; generalise the module; limit possible options; abstract common services
 Prevention of ripple effects: hide information; maintain existing interfaces; restrict communication paths; use an intermediary
 Defer binding time: runtime registration; configuration files; polymorphism; component replacement; adherence to defined protocols
 (Runtime registration through an intermediary is sketched below)
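"Runtime registration" combined with "use an intermediary" can be sketched as a registry that decouples callers from concrete implementations: handlers bind at runtime, and clients never import them directly. Everything here is an illustrative assumption.

```python
# The registry acts as the intermediary: callers go through dispatch(),
# so a handler can be added or replaced without touching its clients.
_handlers = {}

def register(name):
    def decorator(func):
        _handlers[name] = func   # binding deferred to runtime
        return func
    return decorator

def dispatch(name, *args):
    return _handlers[name](*args)

@register("csv")
def export_csv(rows):
    return "\n".join(",".join(map(str, r)) for r in rows)

# Adding a new export format later means registering one new function;
# existing dispatch() call sites are untouched.
print(dispatch("csv", [(1, "a"), (2, "b")]))
```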

Slide 38: Usability tactics (Bass et al., 2003)
 Stimulus: a user request. Desired response: the user is given appropriate feedback and assistance.
 Separate the user interface from the rest of the application
 Support user initiative: cancel; undo; aggregate (undo is sketched below)
 Support system initiative: user model; system model; task model
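The "undo" tactic is commonly built on a stack of inverse actions. A minimal sketch, with invented names and a deliberately trivial "document":

```python
class UndoStack:
    """Support user initiative: record inverse actions and replay them."""
    def __init__(self):
        self._undo = []

    def do(self, action, inverse):
        action()                      # perform the user's action
        self._undo.append(inverse)    # remember how to reverse it

    def undo(self):
        if self._undo:
            self._undo.pop()()        # run the most recent inverse

text = []
stack = UndoStack()
stack.do(lambda: text.append("hello"), lambda: text.pop())
print(text)   # ['hello']
stack.undo()
print(text)   # []
```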

Slide 39: Evaluating quality attributes from architecture

Slide 40: Why evaluate?
 To determine whether quality goals can be reached before committing enormous resources
    At this point the architecture is only a paper draft, so it is easy to make changes
 To validate abstractions and technical correctness
    There are often differing interpretations and misunderstandings between stakeholders
 To sell and explain the architecture

Slide 41: Evaluation methods
 Different ways of evaluating architectures, ordered roughly from light-weight to heavy-weight (Rozanski and Woods, 2005):
    Presentations
    Formal reviews and structured walkthroughs
    Evaluation using scenarios
    Prototypes and proof-of-concept systems
       Prototype = a functional subset of the system
       Proof-of-concept = some code to prove that a risky element is feasible
    Skeleton systems
       An implementation of the system's structure that contains a minimal subset of the system's functionality

Slide 42: Evaluation using scenarios
 A structured approach to evaluating how well the architecture meets stakeholder needs
    In terms of the quality attributes the architecture exhibits
 Scenarios act as a quality attribute specification that the architecture must be able to respond to
 The realisation of each scenario is analysed against the architecture to identify possible problems
    E.g. if an event traverses many processes, it probably will not meet a latency requirement of 10 ms
       Create time budgets along the path and sum them up (see the sketch below)
    E.g. if a change request affects many components in the system, the developer probably cannot complete it in 3 hours
       Estimate the number of changed LOC
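The latency-budget analysis mentioned above amounts to summing per-hop budgets along the scenario's path and comparing the total against the requirement. The process names and budget figures below are invented for illustration.

```python
# Estimated time budgets (ms) for each process an event traverses.
path_budgets_ms = {
    "ingress": 2.0,
    "validation": 1.5,
    "business logic": 4.0,
    "persistence": 5.0,
}
requirement_ms = 10.0

total = sum(path_budgets_ms.values())
print(f"estimated latency: {total} ms (requirement: {requirement_ms} ms)")
if total > requirement_ms:
    # 12.5 ms > 10 ms: the scenario is probably not met; flag a risk.
    print("scenario likely NOT met; flag as a risk")
```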

Slide 43: Evaluation using scenarios (continued)
 Pros: provides a sophisticated understanding of the architecture and of how it handles critical situations related to achieving quality attributes
 Cons: considerably more complex and expensive than simple reviews or walkthroughs; often requires extensive stakeholder participation
 Two methods in more detail:
    ATAM
    Scenario profile evaluation

Slide 44: ATAM (Architecture Tradeoff Analysis Method)
 The purpose of the ATAM is to assess the consequences of architectural decisions in light of quality attribute requirements
 ATAM is a method for detecting risks in a complex software-intensive system
    Risk = an architecturally important decision that is potentially problematic with regard to one or more quality attributes
 ATAM also provides insight into how design decisions affect QAs and their trade-offs (Clements et al., 2002; Kazman et al., 2000)

Slide 45: ATAM analysis concepts (Kazman et al., 2000)
 Inputs to the analysis: high-priority scenarios; attribute-specific questions; architectural approaches
 Sensitivity points: properties that are critical for achieving a quality attribute response
 Tradeoff points: properties that are sensitivity points for more than one quality attribute
 Risks: potentially problematic architectural decisions

Slide 46: ATAM steps (Ferber et al., 2001)
 ATAM input:
    1. Present the ATAM: method, process, agenda
    2. Present business drivers: market situation, history of products, market differentiators, business constraints, technical constraints, critical requirements, quality attributes
    3. Present the architecture: architectural drivers, high-level architectural views, architectural styles, important scenarios, risks related to architectural drivers
 Analysis:
    4. Identify architectural approaches: architectural decisions
    5/7. Generate the quality attribute utility tree: quality tree, scenarios
    6/8. Analyse the architectural approaches: decisions, risks, trade-offs, sensitivity points, non-risks, other issues
 ATAM output:
    9. Present results: risk themes, business drivers, and the impact of the risk themes on the business drivers

Slide 47: Scenario profile evaluation
 Based on creating a scenario profile and analysing its impact (Bosch, 2000; Bengtsson & Bosch, 1999)
    Developed for maintainability, but may be applicable to other quality attributes
 Method steps:
    1. Identify scenarios (together they form a scenario profile)
    2. Assign each scenario a probability estimate (a weight)
    3. Estimate the impact of each scenario on the architecture (e.g. a change scenario implies a number of LOC changed)
    4. Calculate the overall quality by weighing each scenario's impact with its weight and summing up (see the sketch below)
 Is the method's result a good indicator as such, or should it be used for comparing two architectural alternatives?
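The weighted-sum calculation of step 4 can be written directly. The scenarios, weights, and impact numbers below are invented for illustration only.

```python
# (scenario, probability weight, estimated impact in changed LOC)
profile = [
    ("add new payment method",  0.5, 400),
    ("replace database vendor", 0.1, 2000),
    ("change report layout",    0.4, 150),
]

# Overall impact: weigh each scenario's impact by its probability, sum up.
overall = sum(weight * impact for _, weight, impact in profile)
print(f"expected maintenance impact: {overall:.0f} LOC")  # 460 LOC

# Because the absolute number rests on rough estimates, it is arguably
# more useful for comparing two candidate architectures than on its own,
# echoing the question raised on the slide above.
```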

Slide 48: Prediction techniques for component-based systems
 The initial objective was to support the prediction of quality attributes for an application built from reusable components
    Component-based engineering, software product families, dynamically configurable systems
 Predictable assembly (Crnkovic et al., 2002): given a set of components and their composition, how can a system-wide quality attribute value be predicted?
    No human involvement in the prediction process
 Several methods have been proposed
    E.g. for performance (Bondarev et al., 2006) and reliability (Reussner et al., 2003)
 Not all quality attributes are easy to predict this way! (Larsson, 2004)

Slide 49: References
 Bass, Len; Clements, Paul and Kazman, Rick. Software Architecture in Practice. 2nd edition. Addison-Wesley, 2003.
 Bass, Len; Klein, Mark and Bachmann, Felix. Quality Attribute Design Primitives and the Attribute Driven Design Method. Proceedings of the International Workshop on Product Family Engineering, 2001.
 Bengtsson, Per Olof and Bosch, Jan. Architecture Level Prediction of Software Maintenance. Proceedings of the International Conference on Software Maintenance and Reengineering, 1999.
 Bosch, Jan. Design and Use of Software Architectures: Adopting and Evolving a Product-Line Approach. Addison-Wesley, 2000.
 Bosch, Jan and Molin, Peter. Software Architecture Design: Evaluation and Transformation. Proceedings of the IEEE Conference and Workshop on Engineering of Computer-Based Systems, 1999.
 Bondarev, Egor; Chaudron, Michel R. V. and de With, Peter H. N. A Process for Resolving Performance Trade-Offs in Component-Based Architectures. Proceedings of the International Symposium on Component-Based Software Engineering, 2006.

Slide 50: References (continued)
 Buschmann, Frank. Pattern-Oriented Software Architecture: A System of Patterns. Wiley, 1996.
 Clements, Paul; Kazman, Rick and Klein, Mark. Evaluating Software Architectures. Addison-Wesley, 2002.
 Crnkovic, Ivica; Schmidt, Heinz; Stafford, Judith and Wallnau, Kurt. Anatomy of a Research Project in Predictable Assembly. Proceedings of the Workshop on Component-Based Software Engineering (CBSE), 2002.
 IEEE Std 1471-2000. IEEE Recommended Practice for Architectural Description of Software-Intensive Systems, 2000.
 Ferber, Stefan; Heidl, Peter and Lutz, Peter. Reviewing Product Line Architectures: Experience Report of ATAM in an Automotive Context. Proceedings of the International Workshop on Product Family Engineering, 2001.
 Jacobson, Ivar. The Use-Case Construct in Object-Oriented Software Engineering. Chapter 12 in Scenario-Based Design: Envisioning Work and Technology in System Development, edited by John M. Carroll. New York: John Wiley & Sons, 1995.

Slide 51: References (continued)
 Kazman, Rick; Klein, Mark and Clements, Paul. ATAM: Method for Architecture Evaluation. Technical report CMU/SEI-2000-TR-004, 2000.
 Kruchten, Philippe. The 4+1 View Model of Architecture. IEEE Software, 12(6), November 1995.
 Larsson, Magnus. Predicting Quality Attributes in Component-Based Software Systems. Ph.D. dissertation, Mälardalen University, 2004.
 Reussner, Ralf; Schmidt, Heinz and Poernomo, Iman. Reliability Prediction for Component-Based Software Architectures. Journal of Systems and Software, 66(3), 2003.
 Rozanski, Nick and Woods, Eoin. Software Systems Architecture: Working with Stakeholders Using Viewpoints and Perspectives. Addison-Wesley, 2005.
 Shaw, Mary and Garlan, David. Software Architecture: Perspectives on an Emerging Discipline. Chapter 2. Prentice Hall, 1996.

