1 Rajeev R. Raje, Andrew M. Olson, Barrett R. Bryant, Carol C. Burt, Mikhail Auguston
Funded by the DoD and the Office of Naval Research under the CIP/SW Program
2 Overview (SERC Showcase, Dec. 6-7, 2001)
- Objective
- UniFrame Approach
- Parts of UniFrame
- UMM
- Standards and OMG
- QoS
- Summary
3 Objective
To create a unified framework (UniFrame) that allows seamless integration of heterogeneous and distributed software components.
4 Sounds like "Web Services"!
[Diagram: the Web Services triangle - a Service Provider publishes to a Service Broker, a Service Requestor finds the service through the broker and binds to the provider.]
The Web Services focus is on using Internet protocols for messaging (SOAP/HTTP and XML) and a UDDI directory for locating services defined in WSDL.
5 That leaves a lot of interesting problems…
- Need for a component meta-model in support of generative techniques for mappings to existing component models
- Need for multi-approach, highly intelligent location services
- QoS instrumentation and metrics
- Unified approach to using generative techniques with strict QoS requirements
- Validation of dynamic system compositions
7 UniFrame Approach
- UMM: components, QoS, infrastructure
- GDM: domain model, composition/decomposition rules, generative programming
- TLG: formalism based on two-level grammars
- Process: for integration
8 Unified Meta-Model (UMM)
- Component: autonomous and non-uniform
- Service and its guarantees: offered by each component with QoS
- Infrastructure: environment
  – Headhunters
  – Internet Component Broker
(An illustrative sketch of such a component descriptor follows this slide.)
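The sketch below is not part of the original slides; it is a minimal Java illustration of what a UMM-style component descriptor could look like, assuming a component is described by its identity, its native component model, the services it offers, and a map of advertised QoS guarantees. All class and field names are hypothetical.

    import java.util.List;
    import java.util.Map;

    // Illustrative sketch only: one possible in-memory shape for a UMM-style
    // component descriptor. The names are assumptions, not the UMM specification.
    public final class UmmComponentDescriptor {
        private final String componentId;        // unique identity of the component
        private final String componentModel;     // e.g. "RMI", "CORBA", "EJB"
        private final List<String> services;     // services offered by the component
        private final Map<String, String> qos;   // advertised QoS guarantees, e.g. "endToEndDelayMs" -> "20"

        public UmmComponentDescriptor(String componentId, String componentModel,
                                      List<String> services, Map<String, String> qos) {
            this.componentId = componentId;
            this.componentModel = componentModel;
            this.services = List.copyOf(services);
            this.qos = Map.copyOf(qos);
        }

        public String componentId() { return componentId; }
        public String componentModel() { return componentModel; }
        public List<String> services() { return services; }
        public Map<String, String> qos() { return qos; }
    }

A head-hunter or broker could hold a collection of such descriptors and match them against a client's QoS constraints.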
9 Aspects of Components
- Computational: reflects the tasks carried out by an object
- Cooperative: expected collaborators
- Auxiliary: mobility, security, fault tolerance
10 Service and Guarantees
- Each component must specify its QoS and ensure it
- QoS Parameter Catalog
  – static: design oriented
  – dynamic: run-time oriented
- QoS of a component/integrated DCS based on "event traces" (see the sketch after this slide)
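As an illustration of the dynamic, run-time side of QoS, the hedged Java sketch below records an event trace of service invocations and derives one metric (average end-to-end delay) from it; the class and method names are assumptions, not UniFrame APIs.

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative sketch only: dynamic QoS computed from an "event trace" -
    // here a list of (start, end) timestamps recorded around service calls,
    // from which an average end-to-end delay is derived at run time.
    public final class EventTraceQos {
        private record Event(long startNanos, long endNanos) {}
        private final List<Event> trace = new ArrayList<>();

        // Instrumentation hook: record one observed service invocation.
        public void record(long startNanos, long endNanos) {
            trace.add(new Event(startNanos, endNanos));
        }

        // Dynamic QoS metric: mean end-to-end delay in milliseconds over the trace.
        public double averageDelayMillis() {
            return trace.stream()
                        .mapToLong(e -> e.endNanos() - e.startNanos())
                        .average()
                        .orElse(0.0) / 1_000_000.0;
        }
    }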
11 Infrastructure
- Head-hunters (a minimal interface sketch follows this slide)
  – Pro-active discovery of new components
  – Multi-level and multi-approach
- Internet Component Broker (ICB)
  – Allows heterogeneous components to talk to one another
  – Analogous to an Object Request Broker
  – Generated adapter technology
- Instrumentation as a part of the architecture
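The following is a minimal, hypothetical sketch of a head-hunter contract in Java, assuming it proactively scans native registries, caches UMM-style descriptors (reusing the descriptor sketch shown earlier), and federates with peers; it is not the UniFrame implementation.

    import java.util.List;
    import java.util.Map;

    // Illustrative sketch only: a possible head-hunter contract.
    public interface HeadHunter {
        // Proactively scan a native registry (RMI registry, CORBA naming/trading
        // service, EJB container, ...) and cache descriptors of new components.
        void discover(String registryUrl);

        // Return cached descriptors whose service and QoS match the query.
        List<UmmComponentDescriptor> search(String serviceName,
                                            Map<String, String> qosConstraints);

        // Federate with another head-hunter for multi-level, multi-approach searches.
        void federate(HeadHunter peer);
    }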
12 Architecture for Component Discovery
[Diagram: head-hunters and numbered interaction steps connecting services (S1-S8) hosted in RMI (RMI Registry), CORBA (ORB), and EJB (EJB Container) models with a Meta-Registry, Domain Security Manager, Search Engine, ICB, and Client.]
13 Component Development and Deployment Process
[Diagram: a UMM specification and domain knowledge base feed a TLG translator, interface generator (IG), implementation (Imp), and QoS validation (QV); if the QoS is satisfied the component is deployed, otherwise the specifications/implementations are refined.]
14 System Integration
[Diagram: a UMM/TLG query with QoS constraints goes to a query processor backed by the domain KB and head-hunters (HHs); a system generator applies GDM generative rules and iterative experiments; if the QoS constraints are satisfied the system is deployed, otherwise the query is refined or another option is selected.]
15 Leverage & Drive OMG Work
- Infrastructure & Interoperability
  – CORBA, CORBA Services, CCM, IIOP, COM/CORBA, SOAP/CORBA, CSIv2
  – Head-hunters, Internet Component Broker
- Validation
  – Metrics / Instrumentation
- Model Driven Architecture
  – PIM to PSM mapping
  – Consistent with our meta-model approach
  – Concept of a QoS Catalog & interface generation
16 Interoperability & Infrastructure
- Internet Component Broker
  – Leverage "lessons learned" in the development of ORBs: standard protocol, standard component mappings & portable component adapters
  – Leverage SOAP/CORBA and XML valuetypes
  – Native protocol?
- Headhunter
  – Use Naming/Trading, Interface Repository
  – Need a standard Implementation Repository?
  – Federation, native protocol, API?
17 Model Driven Architecture
- We need a standard QoS catalog for Model Driven Architectures
  – Static: design oriented
  – Dynamic: runtime oriented
- We need to standardize the way QoS parameters are used to generate interfaces (static QoS)
- We need to standardize how QoS parameters are used for generated instrumentation (dynamic QoS)
(A hypothetical sketch of a QoS-annotated generated interface follows this slide.)
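One possible, purely hypothetical, way to attach standardized static QoS to generated interfaces is sketched below in Java: a QoS parameter from the catalog is carried as an annotation on a generated service interface. Both the annotation and the interface names are invented for illustration and are not an OMG standard.

    import java.lang.annotation.ElementType;
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import java.lang.annotation.Target;

    // Hypothetical annotation carrying one static QoS guarantee so that
    // generators and instrumentation tooling can read it at run time.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface QosGuarantee {
        String parameter();   // catalog name of the QoS parameter, e.g. "dependability"
        double value();       // guaranteed value advertised at design time
    }

    // A (hypothetical) generated service interface carrying its static QoS guarantee.
    interface AccountService {
        @QosGuarantee(parameter = "dependability", value = 0.95)
        double balance(String accountId);
    }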
18 Quality of Service Reference Model
- A general categorization of the different kinds of QoS, including QoS that is fixed at design time as well as QoS that is managed dynamically
- Identification of the basic conceptual elements involved in QoS and their mutual relationships; this involves the ability to associate QoS parameters with model elements (specification)
19 Quality of Service Parameters
These are parameters that describe the fundamental aspects of the various specific kinds of QoS, based on the QoS categorization identified in the reference model. This includes, but is not limited to, the following:
- time-related characteristics (delays, freshness)
- importance-related characteristics (priority, precedence)
- capacity-related characteristics (throughput, capacity)
- integrity-related characteristics (accuracy)
- safety-related characteristics
- availability and reliability characteristics
20 QoS Catalog
Motivation: creation of a QoS Catalog for software components would help the component developer by:
- Acting as a reference manual for incorporating QoS attributes into the components being developed
- Allowing the developer to enhance the performance of a component in an iterative fashion by being able to quantify its QoS attributes
- Enabling the developer to advertise the quality of components by utilizing the QoS metrics
21 QoS Catalog
...and the system developer by:
- Enabling the developer to specify the QoS requirements of the components that are incorporated into the system
- Allowing the developer to verify and validate the claims of the component developer
- Allowing an objective comparison of the quality of components having the same functionality
- Empowering the developer with the means to choose the best-suited components for the system
22 QoS Catalog
The catalog is broadly based upon the software patterns catalog and uses the following entry format (an illustrative data-type sketch follows this slide):
- Name
- Intent
- Description
- Motivation
- Applicability
- Model Used
- Metrics Used
- Error Situation
- Aliases
- Influencing Factors
- Evaluation Procedure
- Evaluation Formulae
- Result Type
- Static / Dynamic
- Consequence
- Related Parameters
- Domain of Usage
- Resources
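As an illustration only, the entry format above could be captured as a plain Java record so that catalog entries can be stored and compared programmatically; the type and field names simply mirror the slide and are not a standardized schema.

    import java.util.List;

    // Illustrative sketch only: the catalog entry format as a plain data type.
    public record QosCatalogEntry(
            String name,
            String intent,
            String description,
            String motivation,
            String applicability,
            String modelUsed,
            List<String> metricsUsed,
            String errorSituation,
            List<String> aliases,
            List<String> influencingFactors,
            List<String> evaluationProcedure,
            String evaluationFormulae,
            String resultType,
            boolean isStatic,                // static (design-time) vs. dynamic (run-time)
            List<String> consequences,
            List<String> relatedParameters,
            String domainOfUsage,
            List<String> resources) {}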
23 QoS Catalog
Methodologies are incorporated into the catalog based on their:
- Reproducibility
- Indicativeness (capability to identify parts of the component which need to be improved)
- Correctness
- Objectivity
- Precision
- Meaningfulness of measure
- Suitability to the component framework
24 QoS Catalog
Name: DEPENDABILITY
Intent: a measure of confidence that the component is free from errors.
Description: defined as the probability that the component is defect free.
Motivation: allows the degree of dependability of a given component to be evaluated, the dependability of different components to be compared, and a component to be modified to increase its dependability.
Applicability: can be used in any system that requires its components to offer a specific level of dependability. Using the model, the dependability of a given component can be calculated before it is incorporated into the system.
Model Used: dependability model by Jeffrey Voas
Metrics Used: Testability Score, Dependability Score. Testability is a measure of the likelihood that a particular statement in a component will hide a defect during testing.
25 QoS Catalog
Influencing Factors:
1. Degree of testing
2. Fault hiding ability of the code
3. The likelihood that a statement in a component is executed
4. The likelihood that a mutated statement will infect the component's state
5. The likelihood that a corrupted state will propagate and cause the component output to be mutated
Evaluation Procedure:
1. Perform execution analysis on the component
2. Perform propagation analysis on the component
3. Calculate the Testability value of each statement in the component
4. From the Testability scores of all statements, select the lowest score as the Testability score of the component
5. Calculate the Dependability Score of the component
26 QoS Catalog
Evaluation Formulae (a worked numerical sketch follows the end of this catalog entry):
T = E * P, where T is the Testability Score, E the Execution Estimate, and P the Propagation Estimate
D = 1 - (1 - T)^N, where D is the Dependability Score and N the number of successful tests
Result Type: floating-point value between 0 and 1
Static/Dynamic: static
Consequence:
1. Greater amounts of testing and greater Testability scores result in greater Dependability
2. Lower amounts of testing and lower Testability scores result in lesser Dependability
3. Doing additional testing can improve a poor score
27 QoS Catalog
4. Less testing is required to reach a fixed Dependability score when Testability scores are higher
Related Parameters: Availability, Error Rate, Stability
Domain of Usage: domain independent
Error Situation: low dependability results in:
1. Unreliable component behavior
2. Improper execution/termination
3. Erroneous results
Aliases: Maturity, Fault Hiding Ability, Degree of Testing
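To make the evaluation formulae concrete, the sketch below applies T = E * P and D = 1 - (1 - T)^N with made-up numbers; it is a worked example only, not tooling from the catalog.

    // Illustrative sketch only: the Voas-style dependability formulae from the
    // catalog entry above, with invented inputs to show the arithmetic.
    public final class DependabilityExample {
        // Testability of a statement: T = E * P
        static double testability(double executionEstimate, double propagationEstimate) {
            return executionEstimate * propagationEstimate;
        }

        // Dependability after N successful tests: D = 1 - (1 - T)^N
        static double dependability(double testability, int successfulTests) {
            return 1.0 - Math.pow(1.0 - testability, successfulTests);
        }

        public static void main(String[] args) {
            double t = testability(0.4, 0.05);   // T = 0.02 (component score = lowest statement score)
            double d = dependability(t, 200);    // D = 1 - 0.98^200 ≈ 0.982
            System.out.printf("T = %.3f, D = %.3f%n", t, d);
        }
    }

With E = 0.4, P = 0.05, and 200 successful tests, T = 0.02 and D ≈ 0.98, consistent with the consequence that more testing and higher testability raise dependability.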
28 Summary of Approach
- Addresses key issues that need to be resolved to assist organizations in managing their distributed software systems
- The meta-model allows seamless integration of heterogeneous components
- Formal specifications assist in the automated construction and verification of parts, and of the whole, of a distributed computing system (DCS)
- Supports a unified, iterative approach as a pragmatic solution for the software development of a DCS
- Incorporation and validation of QoS implies the creation of more reliable DCSs
- Interactions with industry and standards organizations provide practical feedback and enable proliferation of research results in a timely manner
29 Salient Features
- A meta-model and a unified approach
- QoS-based generative process
- Generation based on distributed resources in the form of components – use of HHs
- Event grammars for dynamic QoS metrics
- Automation (to the extent feasible) for system generation
30 Webpage
http://www.cs.iupui.edu/uniFrame.html