Lecture 20 Trusted Computing and Multilevel Security modified from slides of Lawrie Brown and Hesham El-Rewini.


Computer Security Models. Two fundamental computer security facts: (1) all complex software systems have eventually revealed flaws or bugs that need to be fixed; (2) it is extraordinarily difficult to build computer hardware and software that is not vulnerable to security attacks. These problems involve both design and implementation, and they led to the development of formal security models.

Trusted OS Functions

Multilevel Security (MLS). RFC 2828 defines multilevel security as follows: “A class of system that has system resources (particularly stored information) at more than one security level (i.e., has different types of sensitive resources) and that permits concurrent access by users who differ in security clearance and need-to-know, but is able to prevent each user from accessing resources for which the user lacks authorization.”

Bell-LaPadula (BLP) Model: a formal model for access control, developed in the 1970s. Subjects and objects are assigned a security class – a subject has a security clearance, an object has a security classification – and the classes form a hierarchy, referred to as security levels: top secret > secret > confidential > restricted > unclassified. Security classes control the manner in which a subject may access an object.

BLP Model Access Modes. READ – the subject is allowed only read access to the object. APPEND – the subject is allowed only write access to the object. WRITE – the subject is allowed both read and write access to the object. EXECUTE – the subject is allowed neither read nor write access to the object but may invoke the object for execution.

Multi-Level Security. No read up – a subject can only read an object of less than or equal security level – referred to as the simple security property (ss-property). No write down – a subject can only write into an object of greater than or equal security level – referred to as the *-property.

Multi-Level Security

ds-property: an individual (or role) may grant to another individual (or role) access to a document, based on the owner's discretion and constrained by the MAC rules; site policy overrides any discretionary access controls.

BLP Formal Description: based on the current state of the system (b, M, f, H):
– current access set b: triples (s, o, a) – subject s has current access to object o in access mode a
– access matrix M: matrix of entries M[Si, Oj], the access modes of subject Si to object Oj
– level function f: security levels of subjects and objects; fo(Oj) is the classification level of object Oj, fs(Si) is the security clearance of subject Si, and fc(Si) is the current security level of subject Si
– hierarchy H: a directed rooted tree of objects

BLP Formal Description: three BLP properties:
– ss-property: every (Si, Oj, read) in b has fc(Si) ≥ fo(Oj)
– *-property: every (Si, Oj, append) in b has fc(Si) ≤ fo(Oj), and every (Si, Oj, write) in b has fc(Si) = fo(Oj)
– ds-property: a current access (Si, Oj, Ax) implies Ax ∈ M[Si, Oj]
BLP gives formal theorems: it is theoretically possible to prove that a system is secure, though in practice this is usually not possible.
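The three properties can be read as simple predicates over security levels. Below is a minimal sketch (not part of the original slides) that checks whether a requested (subject, object, mode) access is consistent with the ss-, *- and ds-properties, assuming integer security levels and a discretionary access matrix M held as nested dictionaries; the names `blp_allows` and the example levels are illustrative assumptions.

```python
# Minimal BLP access check: higher integer = higher security level.
# Illustrative sketch only; names and data structures are assumptions,
# not the lecture's notation.

def blp_allows(mode, f_c_subject, f_o_object, M, subject, obj):
    """Return True if the request satisfies the ss-, *-, and ds-properties."""
    # ds-property: the mode must appear in the access matrix entry M[s][o]
    if mode not in M.get(subject, {}).get(obj, set()):
        return False
    if mode == "read":        # ss-property: no read up
        return f_c_subject >= f_o_object
    if mode == "append":      # *-property: no write down
        return f_c_subject <= f_o_object
    if mode == "write":       # *-property: read+write requires equal levels
        return f_c_subject == f_o_object
    if mode == "execute":     # neither read nor write access is implied
        return True
    return False

# Example: a Secret (2) subject may append to a Top Secret (3) object
# but may not read it.
M = {"s1": {"o1": {"read", "append", "write", "execute"}}}
print(blp_allows("append", 2, 3, M, "s1", "o1"))  # True
print(blp_allows("read", 2, 3, M, "s1", "o1"))    # False
```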

Illustration (figure): subjects s1 and s2 and objects o1–o5 arranged from Low to High security level, with the Read, Write, and Append accesses each subject is permitted.

BLP Rules: (1) get access; (2) release access; (3) change object level; (4) change current level; (5) give access permission; (6) rescind access permission; (7) create an object; (8) delete a group of objects.

BLP Example *-property

BLP Example ss-property and *-property

BLP Example: “downgrade” must occur in a controlled and monitored manner; “classification creep”: information aggregated from a range of sources and levels; “covert channels”: (untrusted) low classified executable data allowed to be executed by a high clearance (trusted) subject. What about integrity?

Limitations of the BLP Model. Incompatibility of confidentiality and integrity within a single MLS system – MLS can work either for powers or for secrets, but not for both; this mutual exclusion excludes some interesting power- and integrity-centered technologies from being used effectively in BLP-style MLS environments. Cooperating conspirator problem in the presence of covert channels – in the presence of shared resources, the *-property may become unenforceable. In essence, the BLP model effectively breaks down when (untrusted) low classified executable data are allowed to be executed by a high clearance (trusted) subject.

Biba Integrity Model – strict integrity policy. Access modes: Modify – to write or update information in an object; Observe – to read information in an object; Execute – to execute an object; Invoke – communication from one subject to another. Rules: simple integrity: I(S) ≥ I(O); integrity confinement: I(S) ≤ I(O); invocation property: I(S1) ≥ I(S2).
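As a companion to the BLP sketch above, the strict-integrity rules can be expressed the same way. The sketch below is illustrative only (integrity levels as integers, higher meaning more trusted); it is not code from the lecture.

```python
# Minimal Biba strict-integrity check (sketch, not from the lecture).
# Integrity levels are integers; higher means more trusted.

def biba_allows(mode, i_subject, i_object):
    if mode == "modify":   # simple integrity: I(S) >= I(O) to write
        return i_subject >= i_object
    if mode == "observe":  # integrity confinement: I(S) <= I(O) to read
        return i_subject <= i_object
    return False

def biba_can_invoke(i_subject1, i_subject2):
    # invocation property: a subject may invoke only subjects of equal or lower integrity
    return i_subject1 >= i_subject2

print(biba_allows("modify", 3, 1))   # True: high-integrity subject may write down
print(biba_allows("observe", 3, 1))  # False: must not observe low-integrity data
print(biba_can_invoke(2, 3))         # False: may not invoke a higher-integrity subject
```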

Clark-Wilson Integrity Model: closely models commercial operations. Well-formed transactions – a user should not manipulate data arbitrarily. Separation of duty among users – a person who creates or certifies a well-formed transaction may not execute it.

Clark-Wilson Integrity Model – principal components of the model:
– Constrained data items (CDIs): subject to strict integrity controls
– Unconstrained data items (UDIs): unchecked data items
– Integrity verification procedures (IVPs): intended to assure that all CDIs conform to some application-specific model of integrity and consistency
– Transformation procedures (TPs): system transactions that change the set of CDIs from one consistent state to another
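One way to picture how these components interact is a small table of certified (TP, CDI) relations plus a user-to-TP authorization table. The sketch below is an assumption-laden illustration (names such as `certified`, `allowed`, and `certifier_of` are invented for the example): it enforces that CDIs are changed only through certified TPs and that the person who certified a TP may not execute it, echoing separation of duty.

```python
# Clark-Wilson style enforcement sketch (illustrative; not the lecture's code).

certified = {("TP_post_payment", "CDI_ledger")}                  # certifier-approved (TP, CDI) pairs
allowed   = {("alice", "TP_post_payment"), ("bob", "TP_post_payment")}  # (user, TP) execution rights
certifier_of = {"TP_post_payment": "bob"}                        # who certified each TP

def run_tp(user, tp, cdi):
    if (tp, cdi) not in certified:
        raise PermissionError("CDI may only be changed by a certified TP")
    if (user, tp) not in allowed:
        raise PermissionError("user is not authorized to execute this TP")
    if certifier_of.get(tp) == user:
        raise PermissionError("separation of duty: the certifier may not execute the TP")
    print(f"{user} ran {tp} on {cdi}")   # the well-formed transaction proceeds

run_tp("alice", "TP_post_payment", "CDI_ledger")   # permitted
# run_tp("bob", "TP_post_payment", "CDI_ledger")   # rejected: bob certified this TP
```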

Clark-Wilson Integrity Model

Chinese Wall Model: addresses both integrity and confidentiality, using both discretionary and mandatory access controls.
– Subjects: active entities that may wish to access protected objects
– Information: organized into a hierarchy – objects: individual items of information, each concerning a single corporation; dataset (DS): all objects that concern the same corporation; conflict of interest (CI) class: all datasets whose corporations are in competition
– Access rules: rules for read and write access

Chinese Wall Model access rules. Simple security rule: a subject S can read an object O only if O is in the same DS as an object already accessed by S, OR O belongs to a CI class from which S has not yet accessed any information. *-property rule: a subject S can write an object O only if S can read O according to the simple security rule, AND all objects that S can read are in the same DS as O. Sanitized data (information stripped of sensitive corporate details) is exempt from these restrictions.
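The two rules can be checked against a per-subject history of the objects already read. Below is a minimal sketch under assumed structures (each object carries a dataset `ds` and a conflict-of-interest class `ci`); it is an illustration, not the lecture's own code.

```python
# Chinese Wall (Brewer-Nash) access rules, as a sketch; structures are assumptions.

history = {}  # subject -> set of (ds, ci) pairs already read

def can_read(subject, obj_ds, obj_ci):
    seen = history.get(subject, set())
    same_ds = any(ds == obj_ds for ds, _ in seen)
    same_ci = any(ci == obj_ci for _, ci in seen)
    return same_ds or not same_ci          # simple security rule

def read(subject, obj_ds, obj_ci):
    if can_read(subject, obj_ds, obj_ci):
        history.setdefault(subject, set()).add((obj_ds, obj_ci))
        return True
    return False

def can_write(subject, obj_ds, obj_ci):
    # *-property: the subject must be able to read the object, and every dataset
    # it has read from must be the object's own dataset
    if not can_read(subject, obj_ds, obj_ci):
        return False
    return all(ds == obj_ds for ds, _ in history.get(subject, set()))

read("analyst", "Bank-A", "Banking")               # first access is allowed
print(read("analyst", "Bank-B", "Banking"))        # False: conflict of interest
print(can_write("analyst", "Bank-A", "Banking"))   # True while only Bank-A has been read
```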

Terminology Related to Trust.
– Trust: the extent to which someone who relies on a system can have confidence that the system meets its specifications, i.e., that the system does what it claims to do and does not perform unwanted functions.
– Trusted system: a system believed to enforce a given set of attributes to a stated degree of assurance.
– Trustworthiness: assurance that a system deserves to be trusted, such that the trust can be guaranteed in some convincing way, such as through formal analysis or code review.
– Trusted computer system: a system that employs sufficient hardware and software assurance measures to allow its use for simultaneous processing of a range of sensitive or classified information.
– Trusted computing base (TCB): the portion of a system that enforces a particular policy; the TCB must be resistant to tampering and circumvention, and should be small enough to be analyzed systematically.
– Assurance: a process that ensures a system is developed and operated as intended by the system's security policy.
– Evaluation: assessing whether the product has the security properties claimed for it.
– Functionality: the security features provided by a product.

Reference Monitors – properties: complete mediation, isolation, verifiability; a reference monitor with these properties supports a trustworthy system.

Trojan Horse Defense

Trusted Platform Module (TPM): a concept from the Trusted Computing Group – a hardware module at the heart of a hardware/software approach to trusted computing (TC). It uses a TPM chip (on the motherboard, in a smart card, or in the processor) working with approved hardware and software, and generating and using cryptographic keys. The TPM provides three basic services: authenticated boot, certification, and encryption.

TPM Functions

Authenticated Boot Service: responsible for booting the entire OS in stages and ensuring each stage is valid and approved for use. At each stage the digital signature associated with the code is verified, and the TPM keeps a tamper-evident log of the loading process that records the versions of all code running. The trust boundary can then be expanded to include additional hardware and application and utility software, confirming that each component is on the approved list, is digitally signed, and that its serial number has not been revoked. The result is a well-defined configuration with approved components.
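The flavor of a measured, authenticated boot can be conveyed with a few lines of code: each stage is hashed, checked against an approved list, and folded into a running "extend" value so the log is tamper-evident. This is a purely illustrative sketch with made-up component names and digests; a real TPM verifies digital signatures and accumulates measurements in platform configuration registers (PCRs).

```python
# Measured-boot sketch: hash each component, check it against an allowlist,
# and chain the measurements into one running value. Illustrative only.
import hashlib

approved = {  # assumed allowlist of component digests
    "bootloader-1.2": hashlib.sha256(b"bootloader code v1.2").hexdigest(),
    "kernel-5.10":    hashlib.sha256(b"kernel image v5.10").hexdigest(),
}

def extend(pcr_hex, measurement_hex):
    # fold a new measurement into the running value (PCR-extend-like)
    return hashlib.sha256(bytes.fromhex(pcr_hex) + bytes.fromhex(measurement_hex)).hexdigest()

pcr, log = "00" * 32, []
for name, code in [("bootloader-1.2", b"bootloader code v1.2"),
                   ("kernel-5.10", b"kernel image v5.10")]:
    digest = hashlib.sha256(code).hexdigest()
    if digest != approved.get(name):
        raise SystemExit(f"{name} is not an approved component; halt the boot")
    log.append((name, digest))       # tamper-evident record of what was loaded
    pcr = extend(pcr, digest)        # running value summarizing the whole sequence

print("boot log:", log)
print("final measurement:", pcr)
```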

Certification Service: once a configuration is achieved and logged, the TPM can certify the configuration to others by producing a digital certificate. Confidence that the configuration is unaltered follows because the TPM is considered trustworthy and only this TPM possesses its private key; a challenge value is included in the certificate to also ensure it is timely. This provides a hierarchical certification approach: the TPM certifies the hardware/OS configuration, the OS certifies application programs, and the user gains confidence in the application configuration.

Encryption Service: encrypts data so that it can only be decrypted by a machine with a certain configuration. The TPM maintains a master secret key unique to the machine, which is used to generate a secret encryption key for every possible configuration of that machine. The scheme can be extended upward: an encryption key can be provided to an application so that decryption can only be done by the desired version of the application running on the desired version of the desired OS. The encrypted data can be stored locally or transmitted to a peer application on a remote machine.
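The "decrypt only on a machine in a given configuration" idea can be approximated by deriving the encryption key from the master secret and the configuration measurement, so any other configuration yields a different, useless key. The sketch below uses HMAC-SHA-256 as the derivation step; the key material and configuration strings are assumptions for illustration, not the TPM specification.

```python
# Configuration-bound ("sealed") key derivation sketch; illustrative only.
import hmac, hashlib

master_secret = b"per-machine secret held inside the TPM"   # assumed value

def sealed_key(configuration_measurement: bytes) -> bytes:
    # Same secret + same configuration -> same key; any change in the
    # measured configuration yields an unrelated key.
    return hmac.new(master_secret, configuration_measurement, hashlib.sha256).digest()

good = hashlib.sha256(b"approved OS + approved app").digest()
bad  = hashlib.sha256(b"tampered OS + approved app").digest()
print(sealed_key(good) == sealed_key(good))  # True: reproducible on this configuration
print(sealed_key(good) == sealed_key(bad))   # False: other configurations cannot recover it
```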

Protected Storage Function

Security Assurance: “…degree of confidence that the security controls operate correctly and protect the system as intended. Assurance is not, however, an absolute guarantee that the measures work as intended.”

Assurance and Evaluation.
– assurance: deals with the security features of IT products; applies to requirements, security policy, product design, product implementation, and system operation
Target audiences:
– consumers: select security features and functions; determine the required levels of security assurance
– developers: respond to security requirements; interpret statements of assurance requirements; determine assurance approaches and level of effort
– evaluators: use the assurance requirements as criteria when evaluating security features and controls; may be in the same organization as consumers or a third-party evaluation team

Scope of Assurance.
– system architecture: addresses both the system development phase and the system operations phase
– system integrity: addresses the correct operation of the system hardware and firmware
– system testing: ensures that security features have been tested thoroughly
– design specification and verification: addresses the correctness of the system design and implementation with respect to the system security policy
– covert channel analysis: attempts to identify any potential means for bypassing the security policy

Scope of Assurance (continued).
– trusted facility management: deals with system administration
– trusted recovery: provides for correct operation of security features after a system recovers from failures, crashes, or security incidents
– trusted distribution: ensures that protected hardware, firmware, and software do not undergo unauthorized modification during transit from the vendor to the customer
– configuration management: requirements are included for configuration control, audit, management, and accounting

Assurance. Verification – each of the system's functions works correctly. Validation – the developer is building the right product according to the specification. Testing – based on the actual product being evaluated, not on an abstraction.

Testing: considers observable effects rather than internal structure. Testing can demonstrate the existence of a problem, but passing tests does not imply the absence of problems. It is hard to achieve adequate test coverage within reasonable time: with many inputs and internal states, it is hard to keep track of all states. Penetration testing (tiger team analysis, ethical hacking): a team of experts in the design of the OS tries to crack the system.

Formal verification: the most rigorous method; it uses the rules of mathematical logic to demonstrate that a system has a certain security property. It amounts to proving a theorem, which is a time-consuming and complex process.

Example: find minimum (flowchart). Entry → min ← A[1] → i ← 1 → [loop] i ← i + 1 → if i > n, Exit; otherwise, if min < A[i], return to the loop; if not, set min ← A[i] and return to the loop.

Finding the minimum value – assertions:
P: n > 0
Q: n > 0 and 1 ≤ i ≤ n and min ≤ A[1]
R: n > 0 and 1 ≤ i ≤ n and for all j, 1 ≤ j ≤ i − 1, min ≤ A[j]
S: n > 0 and i = n + 1 and for all j, 1 ≤ j ≤ i − 1, min ≤ A[j]
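To make the P/Q/R/S assertions concrete, here is a small Python rendering of the flowchart with the assertions written as `assert` statements at the corresponding points. This is an illustrative translation only; a formal verification would prove the implications between these assertions rather than test them at run time.

```python
# Find-minimum flowchart with the assertions P, Q, R, S made executable.
# Illustrative translation of the example, not a formal proof.

def find_minimum(A):
    n = len(A)
    assert n > 0                                              # P
    minimum, i = A[0], 1                                      # min <- A[1], i <- 1 (1-based)
    assert n > 0 and 1 <= i <= n and minimum <= A[0]          # Q
    while True:
        i += 1                                                # i <- i + 1
        if i > n:
            break                                             # leave the loop
        # R: before the comparison, min is <= every element examined so far
        assert n > 0 and 1 <= i <= n and all(minimum <= A[j] for j in range(i - 1))
        if not (minimum < A[i - 1]):                          # A[i] in 1-based indexing
            minimum = A[i - 1]
    # S: on exit, i = n + 1 and min is <= every element of the array
    assert i == n + 1 and all(minimum <= A[j] for j in range(i - 1))
    return minimum

print(find_minimum([7, 3, 9, 1, 4]))  # 1
```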

Validation. Requirements checking – the system does the things it should do and also does not do things it is not supposed to do. Design and code reviews – traceability from each requirement to design and code components. System testing – data expected from reading the requirements document can be confirmed in the actual running of the system.

Common Criteria (CC): the Common Criteria for Information Technology Security Evaluation – ISO standards for security requirements and for defining evaluation criteria. The aim is to provide greater confidence in IT product security through development using secure requirements, evaluation confirming that the product meets those requirements, and operation in accordance with those requirements. Following successful evaluation, a product may be listed as CC certified; NIST/NSA publish lists of evaluated products.

CC Requirements: a common set of potential security requirements for use in evaluation.
– functional requirements: define desired security behavior
– assurance requirements: the basis for gaining confidence that the claimed security measures are effective and implemented correctly
– class: a collection of requirements that share a common focus or intent
– component: describes a specific set of security requirements; the smallest selectable set
– target of evaluation (TOE): the part of the product or system subject to evaluation

CC Security Functional Requirements

CC Security Assurance Requirements

Construction of CC Requirements

CC Security Paradigm

Protection Profile (PP): a smart card provides a simple PP example; it describes the IT security requirements for smart card use by sensitive applications. Threats that must be addressed: physical probing, invalid input, linkage of multiple operations. Security objectives: reflect the stated intent to counter the identified threats and comply with identified organizational security policies. Security requirements: provided to thwart specific threats and to support specific policies under specific assumptions.

CC Assurance Levels. EAL 1: functionally tested. EAL 2: structurally tested. EAL 3: methodically tested and checked. EAL 4: methodically designed, tested, and reviewed. EAL 5: semi-formally designed and tested. EAL 6: semi-formally verified design and tested. EAL 7: formally verified design and tested.

Evaluation: ensures that security features work correctly and effectively and show no exploitable vulnerabilities; performed in parallel with or after the development of the TOE. Higher levels entail greater rigor, more time, and more cost. The principal inputs are the security target, the evidence, and the actual TOE; the result confirms that the security target is satisfied for the TOE. The process relates the security target to the high-level design, low-level design, functional specification, source code implementation, and object code and hardware realization of the TOE. The degree of rigor and depth of analysis are determined by the desired assurance level.

Evaluation Parties and Phases. Evaluation parties: sponsor – customer or vendor; developer – provides evidence for evaluation; evaluator – confirms that requirements are satisfied; certifier – agency monitoring the evaluation process. Evaluation is monitored and regulated by a government agency in each country; the Common Criteria Evaluation and Validation Scheme (CCEVS) is operated by NIST and the NSA.

Evaluation Parties and Phases (continued). Phases: preparation – initial contact between sponsor and developer; conduct of evaluation – confirms satisfaction of the security target; conclusion – the final report is given to the certifiers for acceptance.

Evaluation approaches – review of requirements, design, implementation, and assurance. US “Orange Book” evaluation – Trusted Computer System Evaluation Criteria (TCSEC). European ITSEC evaluation – Information Technology Security Evaluation Criteria. US Combined Federal Criteria – 1992, developed jointly by NIST and NSA.

Summary
– The Bell-LaPadula model for computer security: computer security models; general description; formal description of the model; abstract operations; example of BLP use; implementation example (Multics); limitations of the BLP model
– Other formal models for computer security: Biba integrity model; Clark-Wilson integrity model; Chinese Wall model
– The concept of trusted systems: reference monitors; Trojan horse defense
– Trusted computing and the trusted platform module: authenticated boot service; certification service; encryption service; TPM functions; protected storage
– Common Criteria for information technology security evaluation: requirements; profiles and targets; example of a protection profile
– Assurance and evaluation: target audience; scope of assurance; Common Criteria evaluation assurance levels; evaluation

Role-Based Access Control

RBAC Elements

Role Hierarchy and User Assignments

Database Classification: table-level and column-level granularity

Database Classification: row-level and element-level granularity

Database Security: Read Access. The DBMS enforces the simple security rule (no read up). This is easy if the granularity is the entire database or the table level. Inference problems arise with column granularity: if a user can query on restricted data, the user can infer its existence, e.g., SELECT Ename FROM Employee WHERE Salary > 50K; the solution is to check access to all data referenced by the query. Problems also arise with row granularity: a null response indicates a restricted or empty result. There are no extra concerns with element granularity.
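The "check access to all query data" fix amounts to classifying every column a query touches, including columns that appear only in the WHERE clause. A minimal sketch of that idea follows; the classification tables, level names, and the `can_run` helper are illustrative assumptions, not part of any real DBMS API.

```python
# Column-granularity read check: the query is allowed only if the user's
# clearance dominates the classification of every referenced column,
# including WHERE-clause columns. Classifications are illustrative.

LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2}
column_class = {("Employee", "Ename"): "unclassified",
                ("Employee", "Salary"): "secret"}

def can_run(user_clearance, referenced_columns):
    user = LEVELS[user_clearance]
    return all(user >= LEVELS[column_class[col]] for col in referenced_columns)

# SELECT Ename FROM Employee WHERE Salary > 50K references BOTH columns:
query_columns = [("Employee", "Ename"), ("Employee", "Salary")]
print(can_run("confidential", query_columns))  # False: Salary is secret
print(can_run("secret", query_columns))        # True
```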

Database Security: Write Access. The DBMS enforces the *-security rule (no write down). A problem arises if a low-clearance user wants to insert a row with a primary key that already exists in a higher-level row: the insert can be rejected, but then the user knows the row exists; the existing row can be replaced, which compromises data integrity; or polyinstantiation can be used to insert multiple rows with the same key, which creates conflicting entries. The same alternatives occur on update. The problem is avoided if database or table granularity is used.
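Polyinstantiation keeps both versions of the row by making the security level part of the effective key, so the low-level insert neither reveals nor overwrites the high-level row. The sketch below shows the insert logic under that assumption; the table layout, column names, and level names are illustrative.

```python
# Polyinstantiation sketch: the effective key is (primary key, security level),
# so a low-level insert neither reveals nor overwrites the high-level row.
# Table layout and level names are illustrative assumptions.

table = {("Bob", "secret"): {"name": "Bob", "salary": 150_000, "level": "secret"}}

def insert(row, user_level):
    key = (row["name"], user_level)
    if key in table:
        raise ValueError("duplicate key at this level")   # only same-level duplicates are visible
    table[key] = dict(row, level=user_level)              # keep both instances of the row

# A low-clearance user inserts "Bob" without learning that a secret row exists:
insert({"name": "Bob", "salary": 60_000}, "unclassified")
for key, row in table.items():
    print(key, row)   # two rows share the primary key, one per security level
```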

Example of Polyinstantiation