1
“OMG: Not your Father’s CORBA Organization Any Longer”
The OMG System Assurance Task Force’s SwA Ecosystem
Dr. Ben Calloni, P.E., CISSP, CEH, OCRES Professional
Lockheed Martin Fellow, Software Security, Lockheed Martin Aeronautics Company, FTW
OMG Board of Directors; Co-chair, OMG System Assurance Task Force
The Open Group: Former Vice Chairman, Board of Directors; Chair, Customer Council
Djenana Campara, CEO, KDM Analytics
OMG Board of Directors; Co-chair, OMG System Assurance Task Force
2
Who Is OMG? Object Management Group (OMG) factoids:
–Founded in 1989
–Over 470 member companies
–The largest and longest-standing not-for-profit, open-membership consortium which develops and maintains computer industry specifications
–Continuously evolving to remain current while retaining a position of thought leadership
2 November 9, 2012
3
OMG’s Best-Known Successes
–Common Object Request Broker Architecture: CORBA® remains the only language- and platform-neutral interoperability standard
–Unified Modeling Language: UML™ remains the world’s only standardized modeling language
–Business Process Modeling Notation: BPMN™ provides businesses with the capability of understanding their internal business procedures
–Common Warehouse Metamodel: CWM™, the integration of the last two data warehousing initiatives
–Meta-Object Facility: MOF™, the repository standard
–XML Metadata Interchange: XMI™, the XML-UML standard
–Semantics of Business Vocabulary and Rules (SBVR™)
3 November 9, 2012
4
Who Are OMG-ers? ACORD Atego BAE Systems Boeing CA Capgemini Cordys CSC DND Canada FICO Deloitte Fujitsu General Dynamics HP/EDS Harris Hitachi HSBC IBM KDM Analytics Lockheed Martin Mega Practical MetaStorm Microsoft Navy UWC & SWC NEC Sphere Northrop Grumman No Magic Oracle Penn National PrismTech Progress Red Hat SAP Selex Software AG Sopra Sparx Systems Tata Tibco Vangent Some of the hundreds of member companies; 4 November 9, 2012
5
Liaison Relationships 5 November 9, 2012
6
6 OMG cf. TOG
The Open Group: Leading the development of open, vendor-neutral IT standards (approximately 12 per year) and certifications
–Capture, understand and address current and emerging requirements, and establish policies and share best practices
–Facilitate interoperability, develop consensus, and evolve and integrate specifications and open source technologies
–Offer a comprehensive set of services to enhance the operational efficiency of consortia
–Operate the industry’s premier certification service
OMG: Develop, with our worldwide membership, enterprise integration standards that provide real-world value (approximately 90 per year)
–Bring together communities of practice to share experiences in transitioning to new management and technology approaches like Cloud Computing
–Standardize vendor technologies in collaboration with end users, government agencies, etc. to ensure interoperability and product support
The Open Group Trusted Technology Forum’s objective is to shape global procurement strategies and best practices to help reduce threats and vulnerabilities in the global supply chain. OMG System Assurance specifications provide an approach companies can use to meet Trusted Technology Forum requirements.
7
OMG Organization (chart)
–Architecture Board
–Domain TC (vertical); Platform TC (horizontal)
–Subcommittees: Liaison SC, Object & Reference Model SC, Spec Mgmt SC, IPR SC
–SIGs: MDA Users’ SIG, Process Metamodels SIG, SOA SIG, Sustainability SIG, Architecture Ecosystems SIG, Business Architecture SIG
–Platform Task Forces: A & D PTF, ADM PTF, MARS PTF, SysA PTF
–Platform SIGs: Agent PSIG, Data Distribution PSIG, Japan PSIG, Korea PSIG, Ontology PSIG, Telecoms PSIG
–Domain Task Forces: BMI DTF, C4I DTF, Finance DTF, Government DTF, Healthcare DTF, Life Sciences DTF, Mfg Tech & Ind. Systems DTF, Robotics DTF, S/W Based Comm DTF, Space DTF
–Domain SIGs: Crisis Mgmt DSIG, Regulatory Compl. DSIG, SDO DSIG, Sys Eng DSIG
Lockheed Martin Leadership / Participation
7 November 9, 2012
8
OMG System Assurance Task Force (SysA TF)
Strategy
–Establish a common framework for analysis and exchange of information related to system assurance and trustworthiness. This trustworthiness will assist in facilitating systems that better support Security, Safety, Software and Information Assurance.
The immediate focus of the SysA TF is to complete work related to the
–SwA Ecosystem: a common framework for presenting and analyzing properties of system trustworthiness that leverages and connects existing OMG specifications and identifies new specifications that need to be developed to complete the framework; provides an integrated tooling environment for different tool types; and is architected to improve software system analysis and achieve higher automation of risk analysis
8 November 9, 2012
9
Delivering System Assurance: Delivering System Predictability and Reducing Uncertainty
Software Assurance (SwA) is a 3-step process:
1. Specify the Assurance Case: enable a supplier to make bounded assurance claims about the safety, security and/or dependability of systems, products or services
2. Obtain Evidence for the Assurance Case: perform a software assurance assessment to justify claims of meeting a set of requirements through a structure of sub-claims, arguments, and supporting evidence. Collecting evidence and verifying claims’ compliance is a complex and costly process.
3. Use the Assurance Case to calculate and mitigate risk: examine non-compliant claims and their evidence to calculate risk and identify courses of action to mitigate it. Each stakeholder will have their own risk assessment metrics, e.g. security, safety, liability, performance, compliance.
Currently, the SwA 3-step process is informal, subjective and manual.
9 November 9, 2012
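As an illustration only (not an OMG-specified data model), the following Python sketch shows one way the three steps could be represented: claims with supporting evidence, a compliance check over the claim structure, and a simple roll-up of non-compliant claims for risk assessment. All class and function names are hypothetical.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Evidence:
    description: str
    supports_claim: bool  # outcome of an assessment activity

@dataclass
class Claim:
    text: str
    importance: str = "Med"  # e.g. Low / Med / High
    sub_claims: list["Claim"] = field(default_factory=list)
    evidence: list[Evidence] = field(default_factory=list)

    def is_satisfied(self) -> bool:
        """Step 2: a claim holds if its own evidence supports it and all sub-claims hold."""
        own_ok = all(e.supports_claim for e in self.evidence) if self.evidence else True
        subs_ok = all(c.is_satisfied() for c in self.sub_claims)
        has_support = bool(self.evidence or self.sub_claims)
        return has_support and own_ok and subs_ok

def residual_risk(claim: Claim) -> list[Claim]:
    """Step 3: collect non-compliant claims so stakeholders can assess and mitigate risk."""
    unmet = [] if claim.is_satisfied() else [claim]
    for c in claim.sub_claims:
        unmet.extend(residual_risk(c))
    return unmet

# Step 1: specify the assurance case as a structure of claims and sub-claims.
top = Claim("System is acceptably secure", importance="High",
            sub_claims=[Claim("All threats are identified",
                              evidence=[Evidence("Threat model reviewed", True)]),
                        Claim("Identified threats are adequately mitigated",
                              evidence=[Evidence("Penetration test report", False)])])
print([c.text for c in residual_risk(top)])
```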
10
10 Summary of Challenges
Key Challenges
–Systematic coverage of the weakness space: a key step that feeds into the rest of the process; if it is not properly done, the rest of the process is considered ad hoc. Currently there is a lot of ambiguity associated with the system weakness space (often due to requirements and design gaps), including its coverage, definitions and impact.
–Objective and cost-effective assurance process: current assurance assessment approaches resist automation due to lack of traceability and transparency between high-level security policies/requirements and the system artifacts that implement them.
–Effective and systematic measurement of the risk: today, the risk management process often does not consider assurance issues in an integrated way, resulting in project stakeholders unknowingly accepting assurance risks that can have unintended and severe security consequences.
–Actionable tasks to achieve high system trustworthiness.
Overcoming these challenges will enable automation, a key requirement for a cost-effective, comprehensive, and objective assurance process and an effective measure of trustworthiness.
November 9, 2012
11
Improving System Assessments: Systematic, Objective and Automated
Key Requirements:
1. Specified assurance compliance points through formal specification
2. Transparency of software process & systems
3. End-to-end traceability: from code to models to evidence to arguments to security requirements to policy
4. Standards-based integrated tooling environment
Together, these requirements enable the management of system knowledge and knowledge about properties, providing a high degree of transparency, traceability and automation.
11 November 9, 2012
12
Establishing Assurance and Trust (diagram). Elements shown: Assurance, Architecture and Implementation layers; System Facts and Architecture facts (DoDAF, UPDM); Environment facts (Operational Environment, NVDB through SCAP, Threat Model, CONOPS (DoDAF), etc.); System Artifacts (requirements, STIGs, design, etc.) processed by the KDM Engine; Software Fault Patterns; a Common Fact Model supplying Evidence to the Assurance Case (SACM).
12 November 9, 2012
13
Assurance as Part of the System Lifecycle (ISO 15288:2008) (diagram)
Lifecycle processes: Stakeholder requirements definition, Requirements analysis, Architectural design, Implementation, Integration, Verification, Transition, Validation, Operation and Maintenance, Disposal; plus the Risk management process and the Enterprise, Agreement, Project Management and Technical process groups.
Assurance activities mapped across the lifecycle: Safety & Security Needs, Initial Assurance Agreement, Threat & Risk Analysis, Security Plan, CONOPS, Preliminary Assessment, Full Assessment, Security Monitoring & Incident Response, Assurance Case.
13 November 9, 2012
14
The Software Assurance Ecosystem: Turning Challenge into Solution
The SwA Ecosystem is a formal framework for analysis and exchange of information related to software security and trustworthiness.
It provides a technical environment where formalized claims, arguments and evidence can be brought together with formalized and abstracted software system representations to support high automation and high-fidelity analysis.
Based entirely on ISO/OMG open standards:
–Knowledge Discovery Metamodel (KDM)
–Semantics of Business Vocabulary and Rules (SBVR)
–Structured Metrics Metamodel (SMM)
–Structured Assurance Case Metamodel (SACM)
Architected with a focus on providing fundamental improvements in analysis.
14 November 9, 2012
15
Software Assurance Ecosystem: The Formal Framework for System Assessments with Focus on Automation (diagram)
–Software System / Architecture Evaluation: many integrated and highly automated tools to assist evaluators; claims and evidence in a formal vocabulary; a combination of tools and ISO/OMG standards; standardized software system representation in KDM (supported by ISO/IEC 19506); large-scope capable (system of systems); iterative extraction and analysis for rules; executable and formalized specifications; software-system technical evidence drawn from software system artifacts, requirements/design documents and artifacts, data structures, the hardware environment, reports, threat risk analysis, etc.
–Process, People & Documentation Evaluation environment: some point tools to assist evaluators, but mainly manual work; claims and evidence in a formal SBVR vocabulary; large scope requires large effort; supported by The Open Group’s UDEF; inputs include IA Controls, Security Policy, Protection Profiles, CWE; process documents and artifacts as evidence.
–Assurance Case Repository: formalized in SBVR vocabulary; automated verification of claims against evidence; highly automated and sophisticated risk assessments using transitive inter-evidence point relationships. Supported by the following standards: ISO/IEC 15026, ISO/TC 37 / OMG SBVR, OMG Structured Assurance Case Metamodel, Software Fault Patterns (target 2013), UML Security Policy Extensions (2014).
–Tools interoperability and unified reporting environment.
15 November 9, 2012
16
Knowledge Discovery Metamodel: the hinge pin of the SwA Ecosystem
KDM is a publicly available specification from the Object Management Group (OMG). KDM is a common intermediate representation for existing software systems and their operating environments that defines the common metadata required for deep semantic integration of Application Lifecycle Management tools. KDM:
–ensures interoperability between tools for maintenance, evolution, assessment and modernization
–represents entire enterprise and embedded software systems, not just code
–defines a precise semantic foundation for representing behavior
–facilitates incremental analysis of existing software systems
–is a uniform language- and platform-independent representation; its extensibility mechanism allows the addition of domain-, application- and implementation-specific knowledge
16 November 9, 2012
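To make the idea of a language- and platform-independent fact representation concrete, here is a toy Python sketch in the spirit of KDM's entity/relationship view of a system. It is emphatically not the KDM metamodel or its XMI serialization; all class names, relation names and the Capture/HardDisk example (borrowed from the Wireshark walk-through later in this deck) are illustrative assumptions.

```python
# Illustrative only: a toy, language-independent fact base in the spirit of KDM's
# entity/relationship representation. Not the KDM metamodel or its XMI format.
from dataclasses import dataclass

@dataclass(frozen=True)
class Entity:
    kind: str   # e.g. "Component", "CallableUnit", "Resource"
    name: str

@dataclass(frozen=True)
class Fact:
    source: Entity
    relation: str   # e.g. "calls", "reads", "writes", "deployedOn"
    target: Entity

# Facts extracted from code and build artifacts, regardless of source language.
capture = Entity("Component", "Capture")
disk    = Entity("Resource", "HardDisk")
net     = Entity("Resource", "Network")
facts = [
    Fact(capture, "reads", net),
    Fact(capture, "writes", disk),   # an architecture rule might prohibit this
]

def relations_of(component: Entity, fact_base: list) -> list:
    """All facts in which the given component participates."""
    return [f for f in fact_base if component in (f.source, f.target)]

print(relations_of(capture, facts))
```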
17
System Assurance Process: Achieving Automation
Two key models in the Assurance Case:
–Assurance Traceability Model (ATM): connects evidence to high-level policy
–Common Fact Model (CFM): connects system artifacts to evidence
The ATM defines compliance points that the system will be evaluated against
–a set of formal knowledge models within the System Security Domain
–defined at the lowest level of the ATM
–serving as queries to the Common Fact Model
The CFM defines a unified, precise and normalized System Model
–a set of formal knowledge models within the System Engineering Domain, populated by artifacts from the system under evaluation
(Diagram: Assurance Case, Compliance Points, Assurance Traceability Model, system artifacts, evidence.)
Unified and formal knowledge models of the System Security and Engineering Domains are fundamental requirements for supporting and enhancing risk management approaches.
17 November 9, 2012
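A small, self-contained sketch of what "compliance points serve as queries to the Common Fact Model" could look like in practice. The triple format, the rule, and the component names are hypothetical, not anything defined by the ATM or CFM specifications.

```python
# Illustrative only: a compliance point expressed as a query over a fact base,
# in the way ATM compliance points are described as queries against the CFM.
# Facts here are simple (subject, relation, object) triples.
FACTS = [
    ("Capture",  "reads",  "Network"),
    ("Capture",  "writes", "HardDisk"),
    ("Analyzer", "reads",  "HardDisk"),
]

def compliance_point_no_disk_write(component: str, facts) -> dict:
    """Claim: the named component never writes to the hard disk."""
    violations = [f for f in facts
                  if f[0] == component and f[1:] == ("writes", "HardDisk")]
    return {"claim": f"{component} does not write to the hard disk",
            "satisfied": not violations,
            "evidence": violations}   # the matching facts become (counter-)evidence

print(compliance_point_no_disk_write("Capture", FACTS))
```

The output of such a query is exactly the kind of evidence the assurance case consumes: either the claim is satisfied, or the violating facts explain why it is not.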
18
Confidence Analysis through the Assurance Process: Reducing Uncertainty
While assurance does not provide additional security services or safeguards, it does serve to reduce the uncertainty associated with the systematic identification of weak links and, ultimately, of vulnerabilities (e.g. the Common Criteria “Security Assurance Requirements” vs. “Security Functional Requirements”).
The product of System Assurance is justified confidence, delivered in the form of an Assurance Case.
(Diagram: types of evidence for the assurance case supporting an assurance claim.)
18 November 9, 2012
19
Goal Structuring Notation (GSN, from the safety domain): the basis of the Structured Assurance Case Metamodel
Note: textbook example only. “Is safe” must be quantifiable and measurable!
19 November 9, 2012
20
OMG’s Structured Assurance Case Metamodel (example structure)
Node types shown: Goals (claims) G0 (top claim), G1, G2, G3, G1.1, G1.2; Strategy S001; Evidence References (solutions) ER1.1, ER1.2; Justification J0001; Assumption A0001; Context C0001; Security Criteria Cr0001; Model Reference M001 (e.g. UML, SysML, DoDAF models).
Relationships: goals are solved by strategies, strategies are solved by sub-goals, and leaf goals are solved by evidence references; justifications, assumptions, contexts, criteria and model references are attached “in context of” the goals and strategies they qualify.
20 November 9, 2012
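For readers who think in code, the following Python sketch mirrors the GSN-style node types named on this slide. It is an illustrative data structure only, not the normative SACM metamodel or its XMI exchange format; the node identifiers reuse the slide's labels, everything else is assumed.

```python
# Illustrative only: GSN-style node types as plain Python classes.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    ident: str
    text: str

@dataclass
class Evidence(Node):      # GSN "solution" / evidence reference
    pass

@dataclass
class Context(Node):       # context, assumption, justification, criteria, model ref
    kind: str = "Context"

@dataclass
class Strategy(Node):
    solved_by: List["Goal"] = field(default_factory=list)

@dataclass
class Goal(Node):          # a claim
    in_context_of: List[Context] = field(default_factory=list)
    solved_by: List[object] = field(default_factory=list)  # Strategy, Goal or Evidence

g0 = Goal("G0", "Top claim",
          in_context_of=[Context("C0001", "Operational context"),
                         Context("Cr0001", "Security criteria", kind="Criteria")],
          solved_by=[Strategy("S001", "Argument over sub-claims",
                              solved_by=[Goal("G1", "Sub-claim",
                                              solved_by=[Evidence("ER1.1", "Test report")])])])
print(g0.solved_by[0].solved_by[0].ident)   # -> "G1"
```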
21
Establishing the Security Assurance Case (example)
–Goal G1: System is acceptably secure, with Model M1 (integrated system model) and Contexts CG1.1 (security criteria are defined), CG1.2 (assessment scope is defined), CG1.3 (assessment rigor is defined), CG1.4 (concept of operations) and CG1.5 (subject to declared assumptions and limitations)
–Strategy S1: argument based on end-to-end risk mitigation analysis
–Goal G2: All threats are identified and adequately mitigated
–Goal G3: All threats to the system are identified
–Goal G4: Identified threats are adequately mitigated
–Goal G5: Residual risk is acceptable
21 November 9, 2012
22
System Security Risk Analysis and Reduction (diagram)
A threat agent gives rise to a threat; the threat exploits a vulnerability; the vulnerability leads to risk (probability of attack); the risk can damage an asset and causes an exposure ($$ losses); the exposure can be countermeasured by a safeguard, which directly affects the threat agent.
A flaw may not produce a Vulnerability (security) or Fault (safety).
22 November 9, 2012
23
Safeguard/Control Impact: must balance the C-I-A triad
Financial transaction servers process roughly $100,000 / min (source: VP, Visa International, 2004)
–Maximum potential annual revenue = $52,560,000,000 (60 min x 24 hr x 365 days x $100,000)
–Realistically one-half of that: $26,280,000,000
Credit card losses are $2-3 million annually
–Proposal: implement a confidentiality control
Applying the confidentiality control affects Availability
–Reduces throughput utilization by 0.5%
–That is $131,400,000 in lost revenue!
What is the correct answer?
23 November 9, 2012
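The slide's arithmetic can be reproduced directly; all figures below come from the slide itself, with the $2-3M fraud loss taken at its midpoint for comparison.

```python
# Reproducing the slide's arithmetic.
rate_per_min = 100_000                       # $ processed per minute
max_annual   = rate_per_min * 60 * 24 * 365  # = 52_560_000_000
realistic    = max_annual / 2                # = 26_280_000_000
availability_hit = 0.005                     # 0.5% throughput reduction
lost_revenue = realistic * availability_hit  # = 131_400_000
fraud_losses = 2_500_000                     # midpoint of the $2-3M annual card losses
print(lost_revenue, fraud_losses, lost_revenue > fraud_losses)
```

The comparison makes the slide's rhetorical question concrete: the availability cost of the proposed confidentiality control is roughly fifty times the annual fraud losses it is meant to prevent.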
24
Identifying the Threats
–Goal G3: All threats to the system are identified
–Strategy S2: argument based on various confidence factors affecting threat identification
–Goal G3.1: All known risk factors related to similar systems are identified
–Goal G3.1.1: All operational activities of the system are identified
–Goal G3.1.2: All assets of the system are identified
–Goal G3.1.3: All undesired events are identified
–Goal G3.1.4: All threat scenarios are identified
–Goal G3.1.5: All threat agents are identified
–Goal G3.2: All risk factors for the system are systematically identified
–Goal G3.3: Risk analysis team is adequately experienced
24 November 9, 2012
25
DoDAF Mandate / UPDM Certification
“DoD and its Program Managers require DoDAF-compliant architectures. Competing companies will prefer to staff an architecture project with UPDM/DoDAF-certified architects,” says Walt Okon, Senior Architecture Engineer, DoD-CIO, Architecture & Interoperability Directorate. “Now included in the DoD IT Standards Registry (DISR) 12-1.0, UPDM Version 2.0 is mandated for projects using DoDAF Version 2+ and OMG SysML or UML.” (Walt Okon, September 2010 OMG Board of Directors meeting)
“Technical credentials are as important in our profession as degrees and experience,” says Brian Wilczynski, Director, Architecture and Interoperability, DoD CIO/DCIO(IE). “The OMG’s UPDM Certification program will help advance the effective use of UPDM in the DoD.”
“UPDM certification is the easiest way to comply with the mandate,” says Len Levine of the DoD Executive Agent for IT Standards. “By certifying its staff, a company will confirm to DoD that it has made a commitment to building quality architectures conformant to this DoD-mandated standard. The OCUDA certification program, to include a Model User level certification plus two Model Builder levels, will test architects’ readiness to review, require, or deliver sharable architectures and designs. Working together, certified practitioners and users of UPDM and DoDAF will reap the benefits of reuse of information and sharing of architectural artifacts, models, and viewpoints.”
Lockheed Martin has co-funded the development by OMG of the UPDM Certification Tests and will fund LM architects (enterprise and embedded) to take the exam!
25 November 9, 2012
26
Architecture Risk Analysis using DoDAF
Problem
–Identify critical components with high confidence in order to minimize the residual risks to the mission of the enterprise
Challenge
–Systematic analysis of threats and risks
–Projecting risk analysis onto components in a systematic way
–Integrating confidence into risk analysis
–Integrating confidence into architecture analysis
Solution
–Common Fact Model: all information elements (DoDAF, threats, risks, attacks, vulnerabilities, etc.) are represented in a uniform way, so that they can be collated and managed together; it is critical to represent complex statements involving all of these models (and a few more); semantic integration of various preexisting models (such as importing DoDAF facts into the common normalized model); uniform analytics across the Common Fact Model
–Systematic methodology for identifying risks
–Systematic methodology for Architecture Risk Analysis
26 November 9, 2012
27
DoDAF into the SwA Ecosystem
1. (Prelude) Importing static images from DoDAF
2. Importing DoDAF elements and semantic integration into the Common Fact Model
3. Importing DoDAF facts and semantic integration into the Common Fact Model
4. DoDAF model analytics using the KWB
5. Threat and Risk Analysis using the KWB and the FORSA methodology
6. Projecting the identified risks to the architectural components
All screenshots on tool slides 27-41 are © KDM Analytics and used with permission.
27 November 9, 2012
28
1. Importing static images of DoDAF views
Captions: DoDAF views imported from the No Magic tool; DoDAF OV-1 view (static image) imported from the No Magic tool.
© KDM Analytics
28 November 9, 2012
29
2. Importing DoDAF elements
Captions: DoDAF OV-2 view (static image) imported from the No Magic tool; elements of the Common Fact Model corresponding to the OV-2 view (imported from the No Magic tool).
© KDM Analytics
29 November 9, 2012
30
3. Importing DoDAF facts 30 November 9, 2012 Live KDM view showing all DoDAF Performers (collected from all DoDAF views) and related facts. © KDM Analytics
31
Imported DoDAF facts (1)
Captions: all facts related to each Performer, e.g. “Exchange Element is consumed by Performer”; the full list of the atomic facts for the selected relationship. The “ENV:SRC” element is contextual, representing subjects that are not part of the current view but are somewhere else in the fact model; thus each KWB view represents ALL facts for the selected elements.
© KDM Analytics
31 November 9, 2012
32
Imported DoDAF facts (2)
Captions: all facts related to each Performer, e.g. “Exchange Element is produced by Performer”; the full list of the atomic facts for the selected relationship, with pop-up navigation menu. The “ENV:SNK” element is contextual, representing objects that are not part of the current view but are somewhere else in the fact model; thus each KWB view represents ALL facts for the selected elements.
© KDM Analytics
32 November 9, 2012
33
Navigation across facts 33 November 9, 2012 © KDM Analytics
34
4. DoDAF model analytics 34 November 9, 2012 © KDM Analytics
35
DoDAF analytics (1) 35 November 9, 2012 © KDM Analytics
36
DoDAF analytics (2) 36 November 9, 2012 © KDM Analytics
37
5. Threat and Risk Analysis 37 November 9, 2012 © KDM Analytics
38
Overview of the Threat and Risk Analysis model
Severity in the TRA model (Minor / Medium / High) maps nicely to DIACAP Mission Assurance CAT I, II, and III, and to the JAFAN 6/3 Levels of Concern (Basic / Medium / High).
© KDM Analytics
38 November 9, 2012
39
Assets in the TRA model 39 November 9, 2012 © KDM Analytics
40
Analysis of Risks 40 November 9, 2012 © KDM Analytics
41
6. Projecting Risks to Architectural Components 41 November 9, 2012 © KDM Analytics
42
Application to S/W (Wireshark – OSS) 42 November 9, 2012 © KDM Analytics
43
43 November 9, 2012 Compare the as-is with the intended architecture to identify the attack surface and associated risks, and find out how identified code security issues contribute to increased risk
44
44 November 9, 2012
45
45 November 9, 2012 Identified an enlarged attack surface: 4 components accessing the hard disk (HD) instead of one
46
46 November 9, 2012 Most importantly, the Capture component is prohibited from accessing the HD, since that component has access to the network and is vulnerable to remote access
47
47 November 9, 2012 By double-clicking on the interface line, we can directly access the line in the source code where this is happening
48
48 November 9, 2012 Line of code where the Capture component is accessing the HD
49
49 November 9, 2012 Find where in the code security issues are identified, and what impact they have on the architecture
50
50 November 9, 2012
51
51 November 9, 2012
52
52 November 9, 2012 The architectural component where the buffer overflow is happening
53
53 November 9, 2012 Finding how that buffer overflow is affecting the risk
54
54 November 9, 2012 Activate the relationship filter and filter the model by R3 risk
55
55 November 9, 2012
56
56 November 9, 2012
57
57 November 9, 2012
58
58 November 9, 2012 Finding the flow in the code, from entry point to buffer overflow
59
59 November 9, 2012 An exploitation path for the buffer overflow was found that feeds directly into a possible remote-attack risk
60
Visibility into Best Practices implementation in software lifecycle Developers Automated Analysis of: Quality defects SW reliability defects Security vulnerabilities Impact analysis data Code discovery Architecture rules Security Engineering Management T&E Software Architects Development Management Customized reporting Security Analysis supporting security policies & risk management (Security Engineering and Audit) Assessment based on established Assurance Case (quality, reliability, security) Architecture understanding, architecture robustness & rules Security & Quality Policies Policy Creation & Administration Continuous Assurance: SwA Integrated within System Development Lifecycle Control Points Policy enforcement on SE Data - SE Data discovered in context Information Value Chain Feedback Loop Individual developer code System watchdog: continuous integration to verify that nothing is sneaking into the delivery software stream November 9, 2012 60
61
System Assurance TF Roadmap (status table; meetings Sep 2009 through Jun 2012: San Antonio, Long Beach, JAX, Minn, Cambridge, Santa Clara, DC, SLC, Orlando, Santa Clara, DC, Cambridge)
Work in process (item: status / POC):
–Structured Assurance Case Metamodel: revised submission, vote to adopt, FTF, adopted, RTF / Emmett
–System Risk Assessment Metamodel RFP: review, presentation, review / Nakabo
–Repository of Assurance Case white paper: review, presentation; discussion in June / Emmett
–Assurance Extensions to Semantic Interoperability (UDEF) RFI: review / Calloni
–Software Implementation Patterns Metamodel RFC: review, discussion, review, presentation, review / Mansourov, ADM
–Creation of Metrics Repository RFC: review, discussion, review / Bob Martin, ADM
–Repository of Patterns RFC: review; discussion in June / KDM
–Guardol Language Downgrader RFP: discussion, presentation / Calloni
–Security Policy Extensions to UML (RFI): discussion, presentation; WG formed Jun 11 / Jess Irwin
–Data Tagging and Labeling RFP: collaboration, discussion, review, issue, done, vote to kill; reformed new RFP WG / MARS
–Safety-Critical RT-CORBA PUB-SUB Extensions: discussion; WG formed Jun 11 / OIS
–High-Dependability DDS Specification: discussion, review / RTI
–Assurance Case for High-Dependability DDS: discussion, review / RTI
–Assuring Dependability of Consumer Devices RFI: vote to issue, review feedback / Ohata
–Assuring Dependability of Consumer Devices RFP: review / Ohata
–Interoperability of UPDM models for data exchange / Calloni
61 November 9, 2012
62
62 Thank You! Dr. Ben Calloni, P.E., CISSP, CEH, OCRES –LM Fellow, Software Security –Lockheed Martin Aeronautics Company –Fort Worth, TX –ben.a.calloni@lmco.com –Object Management Group Board of Directors OMG System Assurance Task Force Co-Chair –Member of DHS-OSD Software Assurance Forum / WG’s –Future Airborne Capabilities Environment consortium Member of Safety / Security Technical WG Member of FACE Conformance Business WG –Network Centric Operations Industrial Consortium Information Assurance Working Group –SMU Adjunct Professor, System Security Engineering November 9, 2012
63
Backup Slides 63 November 9, 2012
64
Confidence as a Product
Confidence is produced by system assurance activities, which include a planned, systematic set of multi-disciplinary activities to achieve the acceptable measures of system assurance and manage the risk of exploitable vulnerabilities.
Characteristics of confidence as a product:
–Acceptable
–Reproducible
–Transferable
–Measurable
Challenge: an assurance process that produces justified confidence in an objective and cost-effective way
Confidence = f(Importance of Claim, Strength of Evidence)
(3 x 3 matrix: confidence rated Low / Med / High as a function of Claims Rating and Evidence Rating)
64 November 9, 2012
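The slide's matrix is essentially a lookup table over qualitative ratings. The sketch below shows that idea in Python; the exact cell values are not recoverable from the slide, so the table contents are an assumption (confidence rises with evidence strength and falls when highly important claims are weakly supported).

```python
# Sketch of the confidence lookup the slide's matrix implies.
# Cell values are assumed, not taken from the slide.
CONFIDENCE = {   # (claim importance, evidence strength) -> confidence
    ("Low",  "Low"): "Med",  ("Low",  "Med"): "High", ("Low",  "High"): "High",
    ("Med",  "Low"): "Low",  ("Med",  "Med"): "Med",  ("Med",  "High"): "High",
    ("High", "Low"): "Low",  ("High", "Med"): "Med",  ("High", "High"): "High",
}

def confidence(claim_importance: str, evidence_strength: str) -> str:
    return CONFIDENCE[(claim_importance, evidence_strength)]

print(confidence("High", "Low"))   # -> "Low": important claim, weak evidence
```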
65
Risk Calculation
Risk = f(Severity of Impact, Likelihood)
(3 x 3 matrix: risk rated Low / Med / High as a function of Impact Rating and Likelihood Rating)
Challenge: effective and systematic measurement of the risk
65 November 9, 2012
66
Measure of a System’s Trustworthiness
Examine confidence and risk to determine trustworthiness
–If the level of confidence resulting from the assurance case is low, the trust in the system’s resilience to malicious attacks is low, regardless of the risk rating
–If the risk to assets is high, the trust in the system’s resilience to malicious attacks is low, regardless of what rating is assigned to confidence
–Higher confidence, together with lower risk, results in higher trustworthiness
–It is easy to rate trustworthiness as low when at least one vulnerability is detected; however, if no vulnerability was detected, that fact alone does not mean that the system can be trusted
Improve trustworthiness by
–Mitigating the risk
–Increasing confidence
(3 x 3 matrix: trustworthiness rated Low / Med / High as a function of Confidence Rating and Risk Rating)
Challenge: producing actionable tasks to achieve high trustworthiness
66 November 9, 2012
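A companion sketch to the confidence lookup earlier: trustworthiness as a function of the confidence and risk ratings, as this slide's matrix suggests. The individual cell values are again assumed, but they follow the rules stated on the slide: low confidence gives low trust regardless of risk, and high risk gives low trust regardless of confidence.

```python
# Trustworthiness lookup sketch; cell values assumed, rules taken from the slide:
# low confidence => low trust regardless of risk; high risk => low trust regardless
# of confidence; otherwise trust rises with confidence and falls with risk.
TRUSTWORTHINESS = {   # (confidence rating, risk rating) -> trustworthiness
    ("Low",  "Low"): "Low",  ("Low",  "Med"): "Low", ("Low",  "High"): "Low",
    ("Med",  "Low"): "Med",  ("Med",  "Med"): "Med", ("Med",  "High"): "Low",
    ("High", "Low"): "High", ("High", "Med"): "Med", ("High", "High"): "Low",
}

def trustworthiness(confidence_rating: str, risk_rating: str) -> str:
    return TRUSTWORTHINESS[(confidence_rating, risk_rating)]

# Improve trustworthiness either by mitigating risk or by increasing confidence:
print(trustworthiness("High", "High"))  # -> "Low"  (high risk dominates)
print(trustworthiness("High", "Low"))   # -> "High"
```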