Slide 1: Model-based SOA Performance Profiling using CloudLauncher
Sumant Tambe (sutambe@dre.vanderbilt.edu), Vanderbilt University, Nashville, TN
Summer Intern at Telcordia Technologies, June-Aug 2009
Other project members: Francesco Caruso, Josephine Micallef
Copyright © 2009, Telcordia Technologies, Inc. All Rights Reserved.
Slide 2: Outline
Background: Service-Oriented Architecture (SOA) and Quality of Service (QoS)
The QoS design problem in SOA
Overview of the state of the art
Project objectives
Solution: SOA Dynamic Designer for QoS
A model-driven deployment and performance profiling tool for SOA QoS design: CloudLauncher
Amazon Elastic Compute Cloud (EC2) testbed
Concluding remarks
Slide 3: Service-Oriented Architecture (SOA) & QoS
An SOA system is layered: a business process layer composes services, a service interface layer exposes composite and atomic services, and an implementation layer realizes them on .NET, J2EE, or legacy platforms.
QoS applies at every level: end-to-end QoS (the SLA), per-service QoS, and environment QoS.
Slide 4: Motivation
Mission-critical systems with high-assurance requirements, in defense, communications, and healthcare, are increasingly adopting SOA.
Non-intuitive tradeoffs are often necessary between multiple QoS concerns (e.g., security, reliability, availability, performance).
There is no engineering methodology for SOA system QoS:
QoS design today is an ad-hoc, manual, and costly one-off job for every SOA system.
This increases the risk and cost of successfully deploying these solutions.
No procedures or tools are available to intelligently explore the solution space and find the best tradeoffs.
Slide 5: Overview of the State of the Art
SOA standard specifications for security and reliable-messaging policies are maturing, and implementations exist for enterprise middleware platforms. However, there is no support from a higher-level QoS design methodology or tool.
Several research projects address SOA QoS, but their primary goal is "service matching" (selecting the service that meets the required QoS), not system design. They provide point solutions to very specific problems, not end-to-end QoS.
QoS modeling exists for Distributed Real-time Embedded (DRE) systems, e.g., the Component QoS Modeling Language (CQML) from Vanderbilt University, covering fault tolerance, network QoS, etc.
Slide 6: Project Objectives
Provide system engineers with a methodology and supporting toolset that enables them to design SOA-based systems that achieve end-to-end QoS.
Use model-driven techniques to generate configurations adapted to the target deployment environment that meet the QoS requirements; this reduces development burden and configuration errors and facilitates deployment onto heterogeneous environments.
Enable adaptation to changes in the system's QoS requirements or deployment environment by re-computing a new target configuration: incremental computation minimizes rework, and minimizing the configuration delta reduces impact.
Slide 7: Solution: SOA Dynamic Designer for QoS
Design-time workflow: (1) model the system's end-to-end QoS requirements, (2) suggest composition patterns for QoS, (3) build the service QoS profile, and (4) deploy the promising candidates.
The tool intelligently presents design options based on all constraints, allows users to do "what-if" analysis on alternative designs, and automatically configures the "tunables" on services and the deployment environment.
Slide 8: Model-based Design Specification (1/3)
The system model specifies the composition of SOA services: its structure plus behavioral abstractions (serial and parallel composition).
Example composition: an Order Manager invokes a Credit Check and an Inventory Check in parallel; the Credit Check uses Customer Data and an External Credit Check in series; the Inventory Check uses Inventory Data.
E2E QoS requirement: support up to N orders/hour while responding with an order confirmation in no more than X sec.
Response times compose with the behavioral abstractions: for parallel branches, the order-confirmation response time is T = max(Ta, Tb); for serial steps, the credit-check response time is Ta = Tx + Ty.
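This composition algebra is small enough to capture in a few lines. The following is a minimal sketch (my illustration, not part of the original tool) that evaluates an end-to-end response-time estimate over a serial/parallel composition tree; the service names and timings are assumptions made for the example.

```python
# Minimal sketch: estimate end-to-end response time from a serial/parallel
# composition of services. Service names and timings are illustrative only.

def serial(*branches):
    # Serial composition: branch response times add up.
    return ("serial", branches)

def parallel(*branches):
    # Parallel composition: the slowest branch dominates.
    return ("parallel", branches)

def response_time(node, service_times):
    """Evaluate a composition tree; leaves are service names."""
    if isinstance(node, str):
        return service_times[node]
    kind, branches = node
    times = [response_time(b, service_times) for b in branches]
    return sum(times) if kind == "serial" else max(times)

# Illustrative model of the order-processing example on this slide.
composition = parallel(
    serial("CreditCheck", "ExternalCreditCheck"),   # Ta = Tx + Ty
    serial("InventoryCheck", "InventoryData"),      # Tb
)

assumed_times = {"CreditCheck": 0.20, "ExternalCreditCheck": 0.50,
                 "InventoryCheck": 0.15, "InventoryData": 0.10}

print("Estimated E2E response time:", response_time(composition, assumed_times))
```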
Slide 9: Model-based Design Specification (2/3)
Specify QoS attributes on service connections, e.g., a secure channel (encryption strength) or a reliable channel (at-least-once or exactly-once delivery, etc.).
Apply composition patterns that realize those attributes, e.g., an encryption mediator realizing a secure channel, or a persistent message queue realizing a reliable channel, inserted on the connections between the Order Manager, Credit Check, Inventory Check, and their data services.
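To make the attribute-to-pattern step concrete, here is a hypothetical, simplified in-memory sketch of annotating channels with QoS attributes and looking up the composition pattern that realizes each one. The class names, attribute keys, and pattern names are my assumptions; the actual tool expresses this in a GME model rather than plain Python.

```python
# Hypothetical sketch: annotate service channels with QoS attributes and map
# each attribute to a composition pattern. Names are illustrative only; the
# real tool models this in GME, not in plain Python.
from dataclasses import dataclass, field

@dataclass
class Channel:
    source: str
    target: str
    qos: dict = field(default_factory=dict)   # e.g. {"secure": {...}}

# Map a QoS attribute to the composition pattern that realizes it.
PATTERNS = {
    "secure":   "EncryptionMediator",
    "reliable": "PersistentMessageQueue",
}

def apply_patterns(channel):
    """Return the composition patterns implied by a channel's QoS attributes."""
    return [PATTERNS[attr] for attr in channel.qos if attr in PATTERNS]

order_to_credit = Channel("OrderManager", "CreditCheck",
                          qos={"secure": {"encryption_strength": 256}})
order_to_inventory = Channel("OrderManager", "InventoryCheck",
                             qos={"reliable": {"delivery": "exactly-once"}})

for ch in (order_to_credit, order_to_inventory):
    print(ch.source, "->", ch.target, ":", apply_patterns(ch))
```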
Slide 10: Model-based Design Specification (3/3)
Specify deployment pattern alternatives:
Single vs. multiple core(s); small, medium, or large memory resources
Clustering vs. standalone
Collocated vs. distributed
Example: the Inventory Check and Inventory Data services can be collocated and clustered on a resourceful machine, or deployed individually on less resourceful machines. The design space spanned by these alternatives is enumerated in the sketch after this slide.
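One way to see why these alternatives need tool support is simply to enumerate them. The sketch below is my illustration of the cross product of the choices listed on this slide; the dimension values are assumptions, not the tool's actual catalog or selection algorithm.

```python
# Illustrative enumeration of the deployment design space from this slide;
# the dimension values are assumptions, not the tool's actual catalog.
from itertools import product

instance_sizes = ["small", "medium", "large"]            # host resource settings
clustering     = ["standalone", "2-host cluster", "4-host cluster"]
placement      = ["collocated", "distributed"]

alternatives = list(product(instance_sizes, clustering, placement))
print(len(alternatives), "candidate deployment patterns")
for size, cluster, place in alternatives[:5]:
    print(f"  {size} instances, {cluster}, services {place}")
```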
Slide 11: Build the Service QoS Profile with CloudLauncher
CloudLauncher is a model-based tool for empirical load testing that uses Amazon Elastic Compute Cloud (EC2) as its testbed.
It automatically instantiates different deployment patterns, different deployment host resource settings, and different QoS composition patterns (in progress).
It uses Grinder to generate dynamic load (in progress), measures data points (response time, load, utilization), and interpolates them to estimate the service QoS profile (in progress).
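The deck does not show CloudLauncher's EC2 automation itself. As a rough illustration of what "automatically instantiate a deployment pattern" involves, the sketch below launches the hosts for one pattern using the modern boto3 SDK (the 2009 tool would have used the EC2 API of its day); the AMI ID, instance types, and role tags are placeholders.

```python
# Rough sketch of launching EC2 hosts for one deployment pattern with boto3.
# AMI ID, instance types, and the pattern layout are placeholders; the actual
# CloudLauncher implementation is not described in the slides.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def launch_pattern(ami_id, hosts):
    """hosts: list of (role, instance_type) pairs, one entry per EC2 instance."""
    instance_ids = []
    for role, instance_type in hosts:
        resp = ec2.run_instances(
            ImageId=ami_id,
            InstanceType=instance_type,
            MinCount=1,
            MaxCount=1,
            TagSpecifications=[{
                "ResourceType": "instance",
                "Tags": [{"Key": "Role", "Value": role}],
            }],
        )
        instance_ids.append(resp["Instances"][0]["InstanceId"])
    return instance_ids

# "Distributed on 2 machines" pattern: app server and database on separate hosts.
ids = launch_pattern("ami-12345678", [("app-server", "m5.large"),
                                      ("database",   "m5.xlarge")])
print("Launched:", ids)
```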
Slide 12: CloudLauncher Supported Deployment Patterns
All patterns run on Amazon EC2 instances in the US-East-1 zone:
Single machine instance: app server (Order Manager) and DB (Customer Data) on one instance, connected via JDBC.
Distributed on 2 machines: app server and DB on separate instances.
Clustered app server: a load balancer in front of multiple app-server instances (Order Manager, Order Status) sharing one Customer Data DB.
Replicated DB: a load balancer in front of multiple app server plus Customer Data DB pairs.
Slide 13: Grinder Architecture
The SOA Dynamic Designer for QoS drives The Grinder controller, which distributes the test plan to Grinder agents and collects test statistics.
The agents act as test clients and generate load against the system under test, e.g., a clustered app-server deployment (load balancer, Order Manager and Order Status app servers, Customer Data DB over JDBC) running on EC2 instances in the US-East-1 zone.
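The slides do not include the test plan itself. For context, a Grinder 3 test plan is a Jython script executed by each agent's worker threads; the sketch below is a minimal, assumed example that hits a hypothetical Order Manager endpoint, not the project's actual script.

```python
# Minimal Grinder 3 test script (Jython). The endpoint URL is a placeholder;
# this is an assumed example, not the project's actual test plan.
from net.grinder.script.Grinder import grinder
from net.grinder.script import Test
from net.grinder.plugin.http import HTTPRequest

# Each Test has a number and description; statistics are reported per Test.
order_test = Test(1, "Submit order to Order Manager")
request = order_test.wrap(HTTPRequest())   # wrapped calls are timed and recorded

class TestRunner:
    """Instantiated once per worker thread; __call__ is one iteration of load."""
    def __call__(self):
        result = request.GET("http://load-balancer.example.com/orderManager/submit")
        grinder.sleep(100)   # think time in milliseconds between requests
```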
Slide 14: Example Service Profile for Different Deployment Patterns
Two plots of response time vs. load: one per machine type (S, L, and XL hosts) and one per clustered configuration (1-host, 2-host, and 4-host clusters).
Within the measured range, each curve is approximated by a linear model, ResponseTime(Load) = a + b * Load, fitted from measured (load, response time) points such as (L1, T1) and (L2, T2).
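The slide gives the linear form but not the fitting step. Here is a minimal sketch of how the coefficients a and b could be estimated from measured points; the measurements are made up, and an ordinary least-squares fit with NumPy is assumed since the deck does not say which method the tool uses.

```python
# Fit ResponseTime(Load) = a + b * Load from measured data points.
# The measurements below are made-up examples; an ordinary least-squares fit
# is assumed (the slides do not state the tool's actual fitting method).
import numpy as np

loads = np.array([100, 200, 400, 800])           # requests/hour
resp_times = np.array([0.21, 0.28, 0.43, 0.74])  # seconds

b, a = np.polyfit(loads, resp_times, 1)          # slope b, intercept a
print(f"ResponseTime(Load) = {a:.3f} + {b:.5f} * Load  (least-squares fit)")

# Interpolate the profile at a load that was not measured directly.
print("Predicted response time at 600 req/hour:", a + b * 600)
```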
Slide 15: Concluding Remarks
SOA Dynamic Designer for QoS capabilities:
Model end-to-end QoS requirements
Intelligently suggest QoS patterns
Generate the configuration for the target deployment environment
Build the QoS profile of the system by empirical testing on cloud platforms
Slide 16: Thank you!
Slide 17: Extra slides
Slide 18: Generic Modeling Environment (GME)
Developed at the Institute for Software Integrated Systems (ISIS), Vanderbilt University; under development since 1995 (GME 9 in beta testing).
Enables layered, multi-view system modeling, model transformation, model analysis and validation, and model execution.
API support in C++, Java, and Python.
Reference: Akos Ledeczi et al., "Composing Domain-Specific Design Environments," IEEE Computer, Nov 2001.
The GME screenshot shows the system model, a tree view, and a parts browser with custom icons.
Slide 19: Cloud vs. Grid Computing
Workload: Cloud - transactional apps (e.g., web sites); Grid - high performance, one application.
Resource allocation: Cloud - real-time allocation ("fork a server now!"); Grid - best-effort scheduling of queued allocation requests.
Request size: Cloud - up to 10-50 nodes; Grid - up to 500-1000 nodes.
Administration: Cloud - same administrative domain (so far); Grid - sharing of resources belonging to different admin domains.
Pricing and elasticity: Cloud - pay-as-you-go with frequent scale out and scale down; Grid - infrequent.
Virtualization: Cloud - crucial; Grid - just entering.
Usability: Cloud - easy (PaaS, SaaS); Grid - heavyweight.