Chapter 21: Evaluating Systems Dr. Wayne Summers Department of Computer Science Columbus State University



2 Goals of Formal Evaluation
• Provide a set of requirements defining the security functionality for the system or product
• Provide a set of assurance requirements that delineate the steps for establishing that the system or product meets its functional requirements
• Provide a methodology for determining that the product or system meets the functional requirements, based on analysis of the assurance evidence
• Provide a measure of the evaluation result that indicates how trustworthy the product or system is with respect to the security functional requirements defined for it

3 TCSEC
• Trusted Computer System Evaluation Criteria (Orange Book): D, C1, C2, B1, B2, B3, A1
– Emphasized confidentiality
• TCSEC Functional Requirements
– Discretionary access control (DAC)
– Object reuse requirements
– Mandatory access control (MAC) (>= B1)
– Label requirements (>= B1)
– Identification and authentication requirements
– Trusted path requirements (>= B2)
– Audit requirements
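The contrast between the first and third bullets above can be sketched in code: under DAC an owner-managed permission list decides access, while under MAC a label comparison the user cannot override decides it. This is an illustrative sketch only; the names, levels, and Bell-LaPadula-style rules ("read down, write up") are chosen for the example, not taken from any TCSEC-evaluated system.

```python
# Ordered sensitivity levels for the MAC example (illustrative).
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

# Discretionary access control: owner-managed permission lists.
dac_table = {
    ("alice", "report.txt"): {"read", "write"},
    ("bob", "report.txt"): {"read"},
}

def dac_allows(subject, obj, right):
    return right in dac_table.get((subject, obj), set())

# Mandatory access control (>= B1): label comparison enforced by the system.
# Bell-LaPadula style: read down (no read up), write up (no write down).
def mac_allows(subject_level, object_level, right):
    if right == "read":
        return LEVELS[subject_level] >= LEVELS[object_level]
    if right == "write":
        return LEVELS[subject_level] <= LEVELS[object_level]
    return False

print(dac_allows("bob", "report.txt", "write"))      # False: bob may only read
print(mac_allows("secret", "confidential", "read"))  # True: read down
print(mac_allows("secret", "top_secret", "write"))   # True: write up
```

Note that the MAC decision ignores who owns the object: no subject can grant an exception to the label policy, which is exactly what distinguishes it from DAC.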

4 TCSEC
• TCSEC Assurance Requirements
– Configuration management (>= B2)
– Trusted distribution (A1)
– TCSEC systems architecture (C1–B3): mandates modularity, minimized complexity, and keeping the TCB as small and simple as possible
– Design specification and verification (>= B1)
– Testing requirements
– Product documentation requirements

5 TCSEC
• TCSEC Evaluation Classes
– C1 – discretionary protection
– C2 – controlled access protection
– B1 – labeled security protection
– B2 – structured protection
– B3 – security domains
– A1 – verified protection

6 International Efforts and the ITSEC
• Information Technology Security Evaluation Criteria – European standard since 1991 (E0, E1, E2, E3, E4, E5, E6)
– Did not include tamperproof reference validation mechanisms, process isolation, the principle of least privilege, a well-defined user interface, or a requirement for system integrity
– Did require assessment of the security measures used in the development and maintenance environment, submission of source code, delivery procedures, and an ease-of-use analysis

7 ITSEC
• E1 – requires a security target and an informal description of the architecture
• E2 – requires an informal description of the detailed design, configuration control, and a distribution control process
• E3 – imposes more stringent requirements on the detailed design and on the correspondence between source code and security requirements
• E4 – requires a formal model of the security policy, a more rigorous, structured approach to architectural and detailed design, and a design-level vulnerability analysis
• E5 – requires correspondence between the detailed design and the source code, and a source-code-level vulnerability analysis
• E6 – requires extensive use of formal methods

8 Common Criteria: 1998–Present
• CC – de facto standard for the U.S. and many other countries; an ISO standard (ISO/IEC 15408)
– TOE (Target of Evaluation) – the product or system that is the subject of the evaluation
– TSP (TOE Security Policy) – the set of rules that regulate how assets are managed, protected, and distributed
– TSF (TOE Security Functions) – the hardware, software, and firmware that must be relied on to enforce the TSP (a generalization of TCSEC's trusted computing base (TCB))

9 Common Criteria
• CC Protection Profile (PP) – an implementation-independent set of security requirements for a category of products/systems that meet specific consumer needs
– Introduction (PP Identification and PP Overview)
– Product/System Family Description
– Product/System Family Security Environment
– Security Objectives (product/system; environment)
– IT Security Requirements (functional and assurance)
– Rationale (objectives and requirements)

10 Common Criteria
• Security Target (ST) – a set of security requirements and specifications to be used as the basis for evaluation of an identified product/system
– Introduction (ST Identification and ST Overview)
– Product/System Family Description
– Product/System Family Security Environment
– Security Objectives (product/system; environment)
– IT Security Requirements (functional and assurance)
– Product/System Summary Specification
– PP Claims (claims of conformance)
– Rationale (objectives, requirements, TOE summary specification, PP claims)
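The relationship between the two outlines on these slides is that an ST follows the PP structure but adds a summary specification and PP conformance claims. A minimal sketch of that relationship, with the section names taken from the slides (the draft document and the completeness check are illustrative, not part of the CC):

```python
# Required PP sections, in order (from the Protection Profile outline).
PP_SECTIONS = [
    "Introduction",
    "Product/System Family Description",
    "Product/System Family Security Environment",
    "Security Objectives",
    "IT Security Requirements",
    "Rationale",
]

# An ST contains the PP sections plus a summary specification and PP claims,
# with the Rationale still appearing last.
ST_SECTIONS = PP_SECTIONS[:-1] + [
    "Product/System Summary Specification",
    "PP Claims",
    "Rationale",
]

def missing_sections(draft_sections, required):
    """Return the required sections absent from a draft document."""
    return [s for s in required if s not in draft_sections]

# A hypothetical incomplete ST draft:
draft_st = ["Introduction", "Security Objectives", "IT Security Requirements"]
print(missing_sections(draft_st, ST_SECTIONS))
```

A real evaluation of course checks far more than section presence, but the superset relation between ST and PP outlines is exactly as modeled here.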

11 Common Criteria – Security Functional Requirements
• Class FAU: Security Audit
• Class FCO: Communication
• Class FCS: Cryptographic Support
• Class FDP: User Data Protection
• Class FIA: Identification and Authentication
• Class FMT: Security Management
• Class FPR: Privacy
• Class FPT: Protection of Security Functions
• Class FRU: Resource Utilization
• Class FTA: TOE Access
• Class FTP: Trusted Path
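CC requirement identifiers begin with one of the class codes listed above, so the class of a component can be resolved from its prefix. A small sketch using the classes from this slide; the requirement identifier `FIA_UID.1` is used here only as an example of the naming pattern:

```python
# CC security functional requirement classes (from the list above).
SFR_CLASSES = {
    "FAU": "Security Audit",
    "FCO": "Communication",
    "FCS": "Cryptographic Support",
    "FDP": "User Data Protection",
    "FIA": "Identification and Authentication",
    "FMT": "Security Management",
    "FPR": "Privacy",
    "FPT": "Protection of Security Functions",
    "FRU": "Resource Utilization",
    "FTA": "TOE Access",
    "FTP": "Trusted Path",
}

def class_of(requirement_id):
    """Resolve a requirement identifier (e.g. 'FIA_UID.1') to its class name."""
    return SFR_CLASSES.get(requirement_id.split("_")[0], "unknown")

print(class_of("FIA_UID.1"))  # Identification and Authentication
```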

12 Common Criteria – Assurance Requirements
• Class APE: Protection Profile Evaluation
• Class ASE: Security Target Evaluation
• Class ACM: Configuration Management
• Class ADO: Delivery and Operation
• Class ADV: Development
• Class AGD: Guidance Documentation
• Class ALC: Life Cycle
• Class ATE: Tests
• Class AVA: Vulnerability Assessment
• Class AMA: Maintenance of Assurance

13 Common Criteria – Evaluation Assurance Levels
• EAL1: Functionally Tested
• EAL2: Structurally Tested
• EAL3: Methodically Tested and Checked
• EAL4: Methodically Designed, Tested, and Reviewed
• EAL5: Semiformally Designed and Tested
• EAL6: Semiformally Verified Design and Tested
• EAL7: Formally Verified Design and Tested
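Because the EALs form a strict ordering of increasing evaluation rigor, a procurement rule such as "this product must be evaluated at EAL3 or higher" reduces to an index comparison. A toy sketch of that ordering (the `meets` helper is illustrative, not part of the CC):

```python
# EALs in increasing order of evaluation rigor (from the list above).
EALS = ("EAL1", "EAL2", "EAL3", "EAL4", "EAL5", "EAL6", "EAL7")

def meets(achieved, required):
    """True if an achieved EAL satisfies a minimum required EAL."""
    return EALS.index(achieved) >= EALS.index(required)

print(meets("EAL4", "EAL3"))  # True
print(meets("EAL2", "EAL5"))  # False
```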

14 SSE-CMM: 1997–Present
• Systems Security Engineering Capability Maturity Model – a process-oriented methodology for developing secure systems, based on the SE-CMM
– Assesses the capabilities of security engineering processes
– Provides guidance for designing and improving them
– Provides an evaluation technique for an organization's security engineering

15 SSE-CMM Model
• Process capability – the range of expected results that can be achieved by following the process
• Process performance – a measure of the actual results achieved
• Process maturity – the extent to which a process is explicitly defined, managed, measured, controlled, and effective

16 SSE-CMM Process Areas
• Administer Security Controls
• Assess Impact
• Assess Security Risks
• Assess Threat
• Assess Vulnerability
• Build Assurance Argument
• Coordinate Security
• Monitor System Security Posture
• Provide Security Input
• Specify Security Needs
• Verify and Validate Security

17 SSE-CMM Capability Maturity Levels
• Performed Informally – base processes are performed
• Planned and Tracked – project-level definition, planning, and performance verification issues are addressed
• Well Defined – focus on defining and refining a standard practice and coordinating it across the organization
• Quantitatively Controlled – focus on establishing measurable quality goals and objectively managing their performance
• Continuously Improving – organizational capability and process effectiveness are improved
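The five capability levels above are ordered: each level presumes the practices of the ones below it, so "at least Well Defined" is a meaningful threshold. A sketch of that ordering as an integer enum (the numbering 1–5 and the threshold helper are illustrative conventions, not quoted from the SSE-CMM):

```python
from enum import IntEnum

# SSE-CMM capability maturity levels in increasing order (from the list above).
class Maturity(IntEnum):
    PERFORMED_INFORMALLY = 1
    PLANNED_AND_TRACKED = 2
    WELL_DEFINED = 3
    QUANTITATIVELY_CONTROLLED = 4
    CONTINUOUSLY_IMPROVING = 5

def has_standard_practice(level):
    """At Well Defined and above, a standard practice is defined org-wide."""
    return level >= Maturity.WELL_DEFINED

print(has_standard_practice(Maturity.PLANNED_AND_TRACKED))  # False
print(has_standard_practice(Maturity.WELL_DEFINED))         # True
```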