1 Introduction to Information Security 0368-3065, Spring 2014 Lecture 3: Access control (cont.) Eran Tromer Slide credits: John Mitchell (Stanford), Max Krohn (MIT), Ian Goldberg and Urs Hengartner (University of Waterloo)

2 Multi-Level Security (MLS) Concepts Military security policy: classification involves sensitivity levels and compartments; do not let classified information leak to unclassified files Group individuals and resources Use some form of hierarchy to organize policy Other policy concepts: separation of duty; “Chinese Wall” policy (managing conflicts of interest)

3 Military security policy Sensitivity levels Top Secret Secret Confidential Restricted Unclassified Compartments Satellite data Micronesia Middle East Israel

4 Example: military security policy Classification of personnel and data Class = ⟨rank, compartment⟩ Dominance relation: C1 ≤ C2 iff rank1 ≤ rank2 and compartment1 ⊆ compartment2 Example: ⟨Restricted, Israel⟩ ≤ ⟨Secret, Middle East⟩ Applies to Subjects – users or processes Objects – documents or resources
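The dominance relation above can be sketched in a few lines of code. This is a toy illustration, not code from the lecture: the rank ordering and compartment names are assumptions chosen to match the earlier slide, and compartments are modeled as plain sets.

```python
# Illustrative rank ordering (lowest to highest), following slide 3.
RANKS = ["Unclassified", "Restricted", "Confidential", "Secret", "Top Secret"]

def dominates(c_high, c_low):
    """True iff c_low <= c_high: rank is lower-or-equal AND
    compartments are a subset."""
    rank_hi, comp_hi = c_high
    rank_lo, comp_lo = c_low
    return (RANKS.index(rank_lo) <= RANKS.index(rank_hi)
            and comp_lo <= comp_hi)  # set <= set is the subset test

# A label that is higher in rank and has strictly more compartments dominates:
dominates(("Secret", {"Middle East", "Israel"}), ("Restricted", {"Israel"}))
```

Note that the relation is only a partial order: two labels with incomparable compartment sets dominate each other in neither direction.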

5 Example: commercial policy Internal Proprietary Public Product specifications In production OEM Discontinued

6 Generalizing: lattices A dominance relationship that is transitive and antisymmetric defines a lattice For two levels a and b, maybe neither a ≥ b nor b ≥ a However, for every a and b, there is a least upper bound u for which u ≥ a and u ≥ b, and a greatest lower bound l for which a ≥ l and b ≥ l There are also two elements U and L that dominate / are dominated by all levels In the next slide’s example, U = (“Top Secret”, {“Middle East”, “Micronesia”}) and L = (“Unclassified”, ∅)

7 Example lattice (diagram; nodes from top to bottom):
(“Top Secret”, {“Middle East”, “Micronesia”})
(“Top Secret”, {“Middle East”})
(“Secret”, {“Middle East”, “Micronesia”})
(“Secret”, {“Middle East”})
(“Secret”, {“Micronesia”})
(“Unclassified”, ∅)
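For ⟨rank, compartments⟩ labels, the least upper bound and greatest lower bound have a simple closed form: take the max/min of the ranks and the union/intersection of the compartments. A hedged sketch (rank names are assumptions matching the earlier slides):

```python
RANKS = ["Unclassified", "Restricted", "Confidential", "Secret", "Top Secret"]

def lub(a, b):
    """Least upper bound: max of ranks, union of compartments."""
    (r1, c1), (r2, c2) = a, b
    return (RANKS[max(RANKS.index(r1), RANKS.index(r2))], c1 | c2)

def glb(a, b):
    """Greatest lower bound: min of ranks, intersection of compartments."""
    (r1, c1), (r2, c2) = a, b
    return (RANKS[min(RANKS.index(r1), RANKS.index(r2))], c1 & c2)
```

For example, the lub of (“Secret”, {“Middle East”}) and (“Secret”, {“Micronesia”}) is (“Secret”, {“Middle East”, “Micronesia”}), exactly as in the lattice diagram above.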

8 Bell-LaPadula Confidentiality Model When is it OK to release information? Two properties No read up: a subject S may read object O only if C(O) ≤ C(S) No write down: a subject S with read access to O may write to object P only if C(O) ≤ C(P) You may only read below your classification and write above your classification Mandatory Access Control: protect even against malicious software / users / trojan horses who try to violate policy (by contrast: Discretionary Access Control)
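The two Bell-LaPadula checks can be sketched directly. This is a minimal toy model, assuming labels have already been flattened to comparable integers (the subject and object names are hypothetical):

```python
# Hypothetical labels: higher integer = more sensitive.
C = {"doc_public": 0, "alice": 1, "doc_secret": 2}

def can_read(subject, obj):
    # "No read up": S may read O only if C(O) <= C(S)
    return C[obj] <= C[subject]

def can_write(src_obj, dst_obj):
    # "No write down": info may flow from O into P only if C(O) <= C(P)
    return C[src_obj] <= C[dst_obj]
```

So `alice` (level 1) may read `doc_public` but not `doc_secret`, and information read from `doc_secret` may never be written into `doc_public`.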

9 Picture: Confidentiality (diagram: a subject S shown between “Public” below and “Proprietary” above; one panel labeled “read below, write above”, the other “read above, write below”)

10 Biba Integrity Model Rules that preserve integrity of information (dual to confidentiality) Two properties No write up: a subject S may write object O only if C(S) ≥ C(O) (only trust S to modify O if S has higher rank) No read down: a subject S with read access to O may write object P only if C(O) ≥ C(P) (only move info from O to P if O is more trusted than P) You may only write below your classification and read above your classification

11 Picture: Integrity (diagram: a subject S shown between “Public” below and “Proprietary” above; one panel labeled “read above, write below”, the other “read below, write above”)

12 Problems Covert channels Declassification (when is it OK to violate the strict policy?) Composability (e.g., one-time pad) Aggregation (many “low” facts may enable deduction of “high” information) Overclassification (label creep) Incompatibilities (e.g., upgraded files “disappear”) Polyinstantiation (maintaining consistency when different users have different views of the data, and perhaps even see “cover stories”)

13 Other policy concepts Separation of duty If the amount is over $10,000, a check is only valid if signed by two authorized people The two people must be different Policy involves role membership and distinctness of users Chinese Wall Policy Lawyers L1, L2 in same firm If company C1 sues C2, L1 and L2 can each work for either C1 or C2, but no lawyer can work for opposite sides in any case Permission depends on use of other permissions These policies cannot be represented using an access matrix
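The separation-of-duty rule above is stateful in a way an access matrix is not: validity depends on *how many distinct* authorized principals signed. A hedged sketch (the authorized set and threshold are illustrative assumptions from the slide's example):

```python
AUTHORIZED = {"alice", "bob", "carol"}  # hypothetical role membership

def check_is_valid(amount, signers):
    """A check over $10,000 needs two *different* authorized signers."""
    distinct = set(signers)              # duplicates collapse: same person twice
    if not distinct <= AUTHORIZED:       # every signer must hold the role
        return False
    required = 2 if amount > 10_000 else 1
    return len(distinct) >= required
```

Note how signing twice yourself fails: `check_is_valid(20_000, ["alice", "alice"])` collapses to one distinct signer.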

14 Access control in web browsers
Operating system: Objects: system calls, processes, disk files. Principals: users. Discretionary access control. Vulnerabilities: buffer overflow, root exploit.
Web browser: Objects: document object model, frames, cookies / localStorage. Principals: “origins”. Mandatory access control. Vulnerabilities: cross-site scripting, universal scripting.

15 Components of browser security policy Frame-frame relationships: canScript(A,B): can Frame A execute a script that manipulates arbitrary/nontrivial DOM elements of Frame B? canNavigate(A,B): can Frame A change the origin of content for Frame B? Frame-principal relationships: readCookie(A,S), writeCookie(A,S): can Frame A read/write cookies from site S?
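A first approximation of canScript(A,B) is the same-origin policy: two frames may script each other only if their origins, the (scheme, host, port) triple, match exactly. The sketch below is an assumption-laden toy model, not actual browser code; real browsers layer on document.domain, sandboxing, and other nuances.

```python
from urllib.parse import urlsplit

def origin(url):
    """Origin as a (scheme, host, port) triple, with default ports filled in."""
    parts = urlsplit(url)
    default_port = {"http": 80, "https": 443}.get(parts.scheme)
    return (parts.scheme, parts.hostname, parts.port or default_port)

def can_script(frame_a_url, frame_b_url):
    # Simplified same-origin check: all three components must match.
    return origin(frame_a_url) == origin(frame_b_url)
```

Note that an explicit default port still matches (`https://a.com` vs `https://a.com:443`), while a scheme or host mismatch, even for the "same site", denies scripting.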

16 Policies: further reading Anderson, Stajano, Lee: “Security Policies”; Levy: “Capability-Based Computer Systems”

17 Information Flow Control An information flow policy describes authorized paths along which information can flow For example, Bell-LaPadula describes a lattice-based information flow policy

18 Program-level IFC Input and output variables of a program each have a (lattice-based) security classification S() associated with them For each program statement, the compiler verifies whether information flow is secure For example, x = y + z is secure only if S(x) ≥ lub(S(y), S(z)), where lub() is the least upper bound Program is secure if each of its statements is secure
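The per-statement rule can be shown concretely. A minimal sketch over a two-level lattice (LOW < HIGH), which is an assumption; real systems like Jif use richer label lattices:

```python
LOW, HIGH = 0, 1  # two-point lattice: HIGH dominates LOW

def lub(*labels):
    # For a totally ordered lattice, the least upper bound is just max().
    return max(labels)

def assignment_secure(s_lhs, *s_rhs):
    """x = y + z is secure only if S(x) >= lub(S(y), S(z))."""
    return s_lhs >= lub(*s_rhs)
```

Writing any HIGH operand into a LOW variable is rejected, while the reverse direction (LOW data flowing into a HIGH variable) is always allowed.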

19 Implicit information flow Conditional branches carry information about the condition: secret bool a; public bool b; … if (a) { b = 1; } else { b = 0; } is equivalent to b = a; Possible solution: assign a label to the program counter and include it in every operation. (Too conservative!)
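The "label the program counter" idea can be sketched as a tiny dynamic tracker. This is an illustrative toy, not any particular system's mechanism: assignments are tainted not only by their operands but also by the label of every branch condition currently in scope.

```python
LOW, HIGH = 0, 1

class PcTracker:
    """Toy pc-label tracker for straight-line code with one branch level."""
    def __init__(self):
        self.pc = LOW       # label of the program counter
        self.labels = {}    # variable name -> current label

    def branch_on(self, cond_label):
        # Entering a branch raises the pc to (at least) the condition's label.
        self.pc = max(self.pc, cond_label)

    def assign(self, var, value_label):
        # Implicit flow: the assignment carries the pc label too.
        self.labels[var] = max(value_label, self.pc)

# Mirroring the slide: branch on secret a, then assign a public constant to b.
t = PcTracker()
t.branch_on(HIGH)     # if (a) ...        where a is secret
t.assign("b", LOW)    # b = 1;  the constant is LOW, but the pc is HIGH
```

Even though the assigned constant is public, `b` ends up labeled HIGH, which is exactly what catches the `if (a) { b = 1; } else { b = 0; }` leak; the conservatism noted on the slide comes from the pc label tainting assignments in branches that leak nothing.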

20 IFC Issues Label creep Examples: branches, logically-false flows, “doesn’t really matter” branches Declassification Example: encryption Catching all flows Exceptions (math division by 0, null pointer, …) Signals Covert channels

21 Noninterference [Goguen & Meseguer ’82] (diagram: two experiments; a “HIGH” process p with label S_p = {a} runs alongside a “LOW” process q with label S_q = {}; in experiment #2, p is replaced by a different process p’ with S_p’ = {a}; the system is noninterfering if q observes the same thing in both experiments)
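Noninterference can be phrased as a directly testable property: fix the LOW input, vary the HIGH input, and check that the LOW-observable output is unchanged. The programs below are hypothetical examples to make the definition concrete:

```python
def leaky(high, low):
    # LOW output depends on the HIGH input: interferes.
    return low + (1 if high else 0)

def safe(high, low):
    # LOW output ignores the HIGH input entirely.
    return low * 2

def noninterfering(prog, low, high_a, high_b):
    """Does swapping the HIGH input leave the LOW output unchanged?"""
    return prog(high_a, low) == prog(high_b, low)
```

Testing over pairs of HIGH inputs can only refute noninterference; proving it for all inputs is what the static analyses on the next slide are for.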

22 Implementation approaches Language-based IFC (Jif & others) – compile-time tracking for Java, Haskell – static analysis – provable security – label creep (false positives) Dynamic analysis – language / bytecode / machine code – tainting – false positives, false negatives – performance – hybrid solutions OS-based DIFC (Asbestos, HiStar, Flume) – run-time tracking enforced by a trusted kernel – works with any language, compiled or scripting, closed or open