CMSC 414 Computer and Network Security Lecture 11 Jonathan Katz

Announcements
• Midterm
  – Closed book, closed notes
  – Covers material through today’s lecture
  – Everything linked from the course syllabus
• HW2 out

“Capability myths…”
• Equivalence myth: ACLs and capabilities are “just” two views of the AC matrix
• Confinement myth: capability systems cannot enforce confinement
  – That is, cannot restrict delegation
• Irrevocability myth: capabilities cannot be revoked

Equivalence myth
• ACLs have “arrows” from objects to subjects; capabilities have “arrows” from subjects to objects
• Capabilities do not require subjects to “know” object names a priori
• Capabilities do not require subjects to “know” whether they have authority
  – They have authority by virtue of the fact that they have a capability!
  – In contrast, with ACLs how do I obtain a list of all files I am allowed to read?
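
A minimal sketch of the two views (illustrative only; the file names, user names, and dictionaries below are invented for the example): an ACL stores, per object, the subjects allowed to use it, while a capability list stores, per subject, the objects it may use. Answering "what can I read?" is then a local lookup in the capability view but a scan of every object in the ACL view.

# Illustrative only: toy ACL vs. capability-list views of the same AC matrix.
acls = {
    "grades.txt": {"alice": {"read", "write"}, "bob": {"read"}},
    "notes.txt":  {"bob": {"read", "write"}},
}
caps = {
    "alice": {"grades.txt": {"read", "write"}},
    "bob":   {"grades.txt": {"read"}, "notes.txt": {"read", "write"}},
}

def readable_via_caps(subject):
    # Capability view: inspect only the subject's own capability list.
    return [o for o, rights in caps.get(subject, {}).items() if "read" in rights]

def readable_via_acls(subject):
    # ACL view: must scan the ACL of every object in the system.
    return [o for o, entries in acls.items() if "read" in entries.get(subject, set())]

print(readable_via_caps("bob"))   # ['grades.txt', 'notes.txt']
print(readable_via_acls("bob"))   # same answer, but only after a full scan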

Equivalence myth
• Capabilities allow for finer-grained treatment of subjects
  – Processes rather than user accounts
• ACLs potentially require objects to be aware of all subjects
• Capabilities allow greater flexibility to delegate permissions
  – In ACLs, usually all-or-nothing
  – In capability-based systems, can delegate a subset of the rights you have
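
The subset-delegation point can be sketched the same way (again illustrative; the delegate function and the toy caps table are assumptions made for the example, not anything from the lecture): a subject can pass along only rights it actually holds, and can choose to pass fewer.

# Illustrative only: delegating a *subset* of one's rights with capabilities.
caps = {"alice": {"grades.txt": {"read", "write"}}, "carol": {}}

def delegate(giver, receiver, obj, wanted):
    held = caps.get(giver, {}).get(obj, set())
    granted = held & set(wanted)                  # cannot pass rights you lack
    if granted:
        caps.setdefault(receiver, {}).setdefault(obj, set()).update(granted)
    return granted

print(delegate("alice", "carol", "grades.txt", {"read"}))   # {'read'}
print(caps["carol"])   # {'grades.txt': {'read'}}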

Confinement myth
• Myth: capabilities can be delegated “at will” and therefore cannot be confined
• But… the system can be set up so that A can delegate a capability to B only if A is authorized to pass capabilities to B
  – If B is untrusted, then the latter capability will not exist

Origin of confinement myth
• Mistaken assumption that the ability to write/read files translates into the ability to read/write capabilities
  – Capabilities should not be viewed as “just” files; they can be typed by the OS

Revocation
• One solution: indirection
  – Capabilities name an entry in a table, rather than the object itself
  – To revoke access to an object, invalidate or change the entry in the table
  – Difficult to revoke the access of a single user
• Capabilities can also expire with time
• If the OS stores capabilities, it can delete them upon request
  – Requires the object to recall to whom capabilities were given
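
A rough sketch of the indirection idea, under the assumption that the OS keeps the table (all names below are invented): a capability is just an index into the table, so invalidating the table entry revokes every outstanding copy at once, while revoking a single user would require separate entries per user.

# Illustrative only: revocation through one level of indirection.
table = {}            # slot id -> (object name, rights), or None once revoked
next_slot = 0

def grant(obj, rights):
    global next_slot
    slot = next_slot
    next_slot += 1
    table[slot] = (obj, set(rights))
    return slot                      # the "capability" handed to the subject

def use(slot, right):
    entry = table.get(slot)
    return entry is not None and right in entry[1]

def revoke(slot):
    table[slot] = None               # every holder of this slot loses access

cap = grant("payroll.db", {"read"})
print(use(cap, "read"))    # True
revoke(cap)
print(use(cap, "read"))    # False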

Advantages of capabilities
• Better at enforcing the “principle of least privilege”
  – Provide access to minimal resources, to the minimal set of subjects
  – We have already seen that capabilities allow much finer-grained control over subjects (process-level instead of user-level)

Advantages…
• Avoiding the “confused deputy” problem
  – “Deputy” = a program managing authorities from multiple sources
  – In the example we have seen, the problem was not that the compiler had the wrong authority, but that it exercised its authority for the wrong purpose

Confused deputy…
• Capabilities give the ability to identify the authority a subject is using
  – Can designate use of the authority for a specific purpose
• Capabilities also tie together designation and authority
  – Don’t “know” about a resource if you don’t have the capability to access it!
  – Any request to access a resource must include the necessary authority to do so; the “deputy” can now examine the context of the request
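
One way to picture this, as a hedged sketch rather than any real system's API (the class and function names are invented): the caller's request carries the write capability for the output file, so the deputy never falls back on its own ambient authority when deciding where to write.

# Illustrative only: a "deputy" that writes output using the capability the
# caller supplies.  Because the request *is* the authority, the deputy cannot
# be tricked into writing to a file the caller could not name itself.
class Capability:
    def __init__(self, name, rights):
        self.name, self.rights = name, set(rights)

def compile_source(source, output_cap):
    if "write" not in output_cap.rights:
        raise PermissionError("caller did not supply write authority")
    return f"wrote object code for {len(source)} characters to {output_cap.name}"

user_cap = Capability("/home/user/prog.out", {"write"})
print(compile_source("int main(){}", user_cap))
# A request naming the deputy's own billing file would fail unless the caller
# actually holds a write capability for it.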

Disadvantages of capabilities
• Overhead
• Revocation more difficult
• Controlling delegation more difficult
• Making files world-readable more difficult (impossible?)

Mandatory access control

“Military security policy”
• Primarily concerned with secrecy
• Objects given “classification” (rank; compartments)
• Subjects given “clearance” (rank; compartments)
• “Need to know” basis
  – Subject with clearance (r, C) dominates object with classification (r’, C’) only if r ≥ r’ and C’ ⊆ C
  – Defines a lattice… classifications/clearances not necessarily hierarchical
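
A small sketch of the dominance check (the rank ordering and compartment names below are assumptions made for the example, not part of the lecture):

# Illustrative only: the dominance relation over (rank, compartment-set) pairs.
RANKS = {"unclassified": 0, "secret": 1, "top-secret": 2}

def dominates(clearance, classification):
    r, comps = clearance
    r2, comps2 = classification
    # (r, C) dominates (r', C') iff r >= r' and C' is a subset of C
    return RANKS[r] >= RANKS[r2] and set(comps2) <= set(comps)

print(dominates(("top-secret", {"nuclear", "crypto"}), ("secret", {"crypto"})))  # True
print(dominates(("secret", {"crypto"}), ("secret", {"nuclear"})))                # False: incomparable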

Security models
• Bell-LaPadula model
  – Identifies allowable communication flows
  – Concerned primarily with ensuring secrecy
• Biba model
  – Concerned primarily with “trustworthiness”/integrity of data
• Chinese wall
  – Developed for commercial applications

Bell-LaPadula model
• Simple security condition: S can read O if and only if l(O) ≤ l(S)
• *-property: S can write O if and only if l(S) ≤ l(O)
  – Why?
• “Read down; write up”
  – Information flows upward
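
The two conditions amount to simple comparisons once the levels are totally ordered; a toy sketch (representing levels as plain integers is an assumption made for the example):

# Illustrative only: Bell-LaPadula checks over integer levels (0 = low).
def blp_can_read(subject_level, object_level):
    return object_level <= subject_level      # simple security: read down

def blp_can_write(subject_level, object_level):
    return subject_level <= object_level      # *-property: write up

S, low_doc, high_doc = 1, 0, 2
print(blp_can_read(S, low_doc), blp_can_read(S, high_doc))    # True False
print(blp_can_write(S, low_doc), blp_can_write(S, high_doc))  # False True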

Dynamic rights
• Could consider dynamic rights
  – Once a process reads a file at one security level, it cannot write to any file at a lower security level

Basic security theorem
• If a system begins in a secure state, and always preserves the simple security condition and the *-property, then the system will always remain in a secure state
  – I.e., information never flows down…

Communicating down…
• How to communicate from a higher security level to a lower one?
  – Max. security level vs. current security level
  – The maximum security level must always dominate the current security level
  – Reduce the current security level to write down…
     • The security theorem no longer holds
     • Must rely on users to be security-conscious

Commercial vs. military systems
• The Bell-LaPadula model does not work well for commercial systems
  – Users given access to data as needed
     • Discretionary access control vs. mandatory access control
  – Would require large numbers of categories and classifications
  – Centralized handling of “security clearances”

Biba model
• Concerned with integrity
  – “Dual” of the Bell-LaPadula model
• The higher the level, the more confidence
  – More confidence that a program will act correctly
  – More confidence that a subject will act appropriately
  – More confidence that data is trustworthy
• Integrity levels may be independent of security classifications
  – Confidentiality vs. trustworthiness
  – Information flow vs. information modification

Biba model
• Simple integrity condition: S can read O if and only if I(S) ≤ I(O)
  – I(S), I(O) denote the integrity levels of S and O
• (Integrity) *-property: S can write O if and only if I(O) ≤ I(S)
  – Why?
  – The information obtained from a subject cannot be more trustworthy than the subject itself
• “Read up; write down”
  – Information flows downward
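
Again as a toy sketch with integer integrity levels (an assumption made for the example), the checks are exactly the Bell-LaPadula checks with the inequalities flipped:

# Illustrative only: Biba checks over integer integrity levels (higher = more trusted).
def biba_can_read(subject_int, object_int):
    return subject_int <= object_int          # simple integrity: read up

def biba_can_write(subject_int, object_int):
    return object_int <= subject_int          # integrity *-property: write down

S, junk, audited = 1, 0, 2
print(biba_can_read(S, junk), biba_can_read(S, audited))      # False True
print(biba_can_write(S, junk), biba_can_write(S, audited))    # True False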

Security theorem
• An information transfer path is a sequence of objects o_1, …, o_n and subjects s_1, …, s_(n-1) such that, for all i, s_i can read o_i and write to o_(i+1)
  – Information can be transferred from o_1 to o_n via a sequence of read-write operations
• Theorem: if there is an information transfer path from o_1 to o_n, then I(o_n) ≤ I(o_1)
  – Informally: information transfer does not increase the trustworthiness of the data
• Note: says nothing about secrecy…
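
A quick way to see the theorem on a toy path (the integer levels below are invented for the example): each step needs I(s_i) ≤ I(o_i) to read and I(o_(i+1)) ≤ I(s_i) to write, which chains into I(o_n) ≤ I(o_1).

# Illustrative only: checking a transfer path against the Biba rules.
def path_allowed(obj_levels, subj_levels):
    # obj_levels = [I(o_1), ..., I(o_n)]; subj_levels = [I(s_1), ..., I(s_(n-1))]
    return all(s <= o and o_next <= s
               for s, o, o_next in zip(subj_levels, obj_levels, obj_levels[1:]))

objs, subjs = [3, 2, 1], [2, 1]
print(path_allowed(objs, subjs))   # True: every read/write step is permitted
print(objs[-1] <= objs[0])         # True: I(o_n) <= I(o_1), as the theorem says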

“Low-water-mark” policy
• Variation of the “pure” Biba model
• If S reads O, then the integrity level of S is changed to min(I(O), I(S))
  – The subject may be relying on data less trustworthy than itself
  – So, its integrity level is lowered
• Drawback: the integrity level of a subject is non-increasing!
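
A one-line sketch of the update rule (integer levels assumed for the example), which also shows the drawback: once lowered, the subject's level never recovers.

# Illustrative only: the low-water-mark update applied after each read.
def low_water_mark_read(subject_int, object_int):
    # Subject's integrity drops to the minimum of the two levels.
    return min(object_int, subject_int)

s = 3
for o in (3, 1, 2):
    s = low_water_mark_read(s, o)
    print(s)        # prints 3, then 1, then 1: the level never goes back up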

Chinese wall
• Intended to prevent conflicts of interest
• Rights are dynamically updated based on actions of the subjects

Chinese wall -- basic setup
[Figure: company datasets grouped into conflict-of-interest (CI) classes: {Bank A, Bank B} form one CI class and {School 1, School 2, School 3} another; each company dataset contains its own files.]

Chinese wall rules
• Subject S is allowed to read from at most one company dataset in any CI class
  – This rule is dynamically updated as accesses occur
  – See next slide…

Example
[Figure: S reads from one company dataset (Bank A in the running example); after that read, S may no longer read from Bank B, the other dataset in the same CI class, but may still read from one of the school datasets.]

Chinese wall rules II
• S can write to O only if
  – S can read O, and
  – All objects that S can read are in the same dataset as O
• This is intended to prevent an indirect flow of information that would cause a conflict of interest
  – E.g., S reads from Bank A and writes to School 1; S’ can read from School 1 and Bank B
  – S’ may find out information about Banks A and B!
• Note that S can write to at most one dataset…
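
The two rules can be combined into a toy reference monitor; this is a simplification (it tracks only which datasets a subject has read from, and every name below is invented), not the model's formal statement.

# Illustrative only: a toy Chinese-wall monitor with two CI classes.
CI = {"BankA": "banks", "BankB": "banks",
      "School1": "schools", "School2": "schools", "School3": "schools"}
history = {}          # subject -> set of datasets it has read from

def can_read(s, dataset):
    read = history.get(s, set())
    same_class = {d for d in read if CI[d] == CI[dataset]}
    return not same_class or dataset in same_class   # at most one per CI class

def do_read(s, dataset):
    if can_read(s, dataset):
        history.setdefault(s, set()).add(dataset)
        return True
    return False

def can_write(s, dataset):
    # May write only if everything s can read lives in this same dataset.
    return can_read(s, dataset) and history.get(s, set()) <= {dataset}

print(do_read("S", "BankA"))       # True
print(can_read("S", "BankB"))      # False: same CI class as Bank A
print(can_read("S", "School1"))    # True: different CI class
print(can_write("S", "School1"))   # False: S can also read Bank A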