

Week 8 - Monday

- What did we talk about last time?
  - Access control
  - Authentication

Andrew Sandridge

- Some systems have a special function f that a user (or the user's system) must know
- Thus, the system will give the user a prompt, and the user must respond
- Perhaps the system issues a random value to the user, who must then encrypt it with his secret key and send it back to the system
- Perhaps it's just some other way of processing the data
- Monkey Island 2: LeChuck's Revenge hand puzzle
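The encrypt-a-random-value scheme above can be sketched with a keyed MAC. This is a minimal illustration, not a protocol from the slides; HMAC-SHA256 stands in for the user's secret function f, and all function names are made up for the example.

```python
import hashlib
import hmac
import secrets

def make_challenge() -> bytes:
    """Server side: issue an unpredictable random challenge (nonce)."""
    return secrets.token_bytes(16)

def respond(secret_key: bytes, challenge: bytes) -> bytes:
    """User side: apply the secret function f (here HMAC-SHA256) to the challenge.
    The key itself never crosses the wire, only the MAC of the fresh nonce."""
    return hmac.new(secret_key, challenge, hashlib.sha256).digest()

def verify(secret_key: bytes, challenge: bytes, response: bytes) -> bool:
    """Server side: recompute the expected response; compare in constant time."""
    expected = hmac.new(secret_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

Because the challenge is random and never reused, a recorded response is useless for a later login attempt, which is the point of challenge-response over plain passwords.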

- A one-time password is invalidated as soon as it is used
- Thus, an attacker stealing the password can do limited damage
  - He can only log in once
  - He has to act quickly, before the legitimate user logs in first
- How do you generate all these passwords?
- How do you synchronize the user and the system?

- RSA SecurID tokens change the password every 30 or 60 seconds
  - The user must be synchronized with the system within a few seconds to keep this practical
- Using a secure hash function h, we start with a seed value k, then
  - h(k) = k1, h(k1) = k2, ..., h(kn-1) = kn
- Then passwords are used in reverse order
  - p1 = kn, p2 = kn-1, ..., pn-1 = k2, pn = k1
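The hash-chain construction can be written out in a few lines. This is a minimal sketch assuming SHA-256 as the secure hash function; the function names are illustrative.

```python
import hashlib

def make_chain(seed: bytes, n: int) -> list[bytes]:
    """Build k1..kn by repeated hashing: h(k) = k1, h(k1) = k2, ..."""
    ks = []
    k = seed
    for _ in range(n):
        k = hashlib.sha256(k).digest()
        ks.append(k)
    return ks

def passwords(seed: bytes, n: int) -> list[bytes]:
    """Passwords are the chain in reverse order: p1 = kn, ..., pn = k1."""
    return list(reversed(make_chain(seed, n)))

def verify_next(prev_password: bytes, candidate: bytes) -> bool:
    """Each new password hashes to the previous one: h(p_{i+1}) = p_i.
    The system only needs to store the last password it accepted."""
    return hashlib.sha256(candidate).digest() == prev_password
```

The reversal is what makes this secure: seeing password p_i (some k_j) does not let an attacker compute p_{i+1} (which is k_{j-1}), because that would require inverting the hash.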

- Biometrics means identifying humans by their physical and biological characteristics
- This technology is often seen in spy and science fiction movies
- It does exist, but it is far from perfect
- Like passwords, the actual biometric scans are usually not stored
  - Instead, specific features are stored for later comparison
- Biometrics pose unique privacy concerns because the information collected can reveal health conditions

- Historically, fingerprints are one of the most heavily used forms of biometric identification
  - Especially useful for solving crimes
- Even identical twins have different fingerprints
  - Fun fact: koalas have fingerprints so similar to human beings' that even experts are fooled
- Optical scanners are available
- Cheap capacitive scanners are now available on many laptops
- The image of the fingerprint is usually not stored
  - Instead, specific, differentiable features are recorded

- Voice recognition systems must be trained on your voice
- They can be defeated with recording devices
- If you have a cold, it throws off the characteristics of your voice
- As a consequence, these systems are particularly susceptible to both false positives and false negatives

- As the technology matures and hardware becomes cheaper, eye recognition is becoming more common
- Iris recognition looks at the patterns of light and dark areas in your iris (the colored part of your eye)
  - For simplicity, the image is converted to grayscale for comparison
- Newer iris scanners can make successful identifications at 10 feet away or more, even correcting for glasses!
- Retina scans exist but are unpopular
  - The retina is the tissue lining the inside of your eye; scanning it requires pupil dilation to get an accurate picture, blinding you for several minutes
- There are even systems for recognizing the patterns of discolorations on the whites of your eyes!

- The shape of your face, the distance between your eyes and nose, and other facial features are relatively distinctive
  - Although they can be nearly the same for identical twins
- Computer vision techniques must be used to locate the face and deal with changes in haircut, glasses, etc.
- Participants must have a neutral facial expression, or results can be thrown off
- The US Department of State uses facial recognition and fingerprinting to document foreigners entering the country
  - Their database has over 75 million photographs

- Hand geometry readers measure the shape of your hand
- Keystroke dynamics are the patterns that you use when typing
  - Users are quite distinctive, but distractions and injuries can vary patterns a lot
- Combinations of different biometrics are sometimes used
- DNA sequencing is not (yet) fast enough to be used for authentication
- Researchers are always coming up with new biometrics to use

- People assume biometrics are more secure than they actually are
- Attacks:
  - Fingerprints can be lifted off a champagne glass
  - Voices can be recorded
  - Iris recognition can be faked with special contact lenses
- Both false positives and false negatives are possible
- It is possible to tamper with the transmission from the biometric reader
- Biometric characteristics can change
- Identical twins sometimes pose a problem

- To trust a program, we are looking for four things:
  - Functional correctness: the program does what it should
  - Enforcement of integrity: the program's data is still correct even if given bad or unauthorized commands
  - Limited privilege: if the program accesses secure data, it only accesses what it needs, and it doesn't leak rights or data to untrusted parties
  - Appropriate confidence level: the program has been examined carefully and given trust appropriate for its job

- A security policy is a statement of the security we expect a system to enforce
- A mechanism is a tool or protocol used to enforce the policy
- It is possible to have good policies but bad mechanisms, or vice versa
- A trusted system has:
  - Enforcement of a security policy
  - Sufficiency of measures and mechanisms
  - Evaluation

- A confidentiality access control system
- Military-style classifications
- Uses a linear clearance hierarchy
- All information is on a need-to-know basis
- It uses clearance (or sensitivity) levels as well as project-specific compartments
- The hierarchy, from lowest to highest: Unclassified, Restricted, Confidential, Secret, Top Secret

- Both subjects (users) and objects (files) have security clearances
- Below are the clearances arranged in a hierarchy:

  Clearance Level   | Sample Subjects  | Sample Objects
  ------------------|------------------|-------------------------
  Top Secret (TS)   | Tamara, Thomas   | Personnel Files
  Secret (S)        | Sally, Samuel    | E-mail Files
  Confidential (C)  | Claire, Clarence | Activity Log Files
  Restricted (R)    | Rachel, Riley    | Telephone List Files
  Unclassified (UC) | Ulaley, Ursula   | Address of Headquarters

- Let level(O) be the clearance level of object O
- Let level(S) be the clearance level of subject S
- The simple security condition states that S can read O if and only if level(O) ≤ level(S) and S has discretionary read access to O
- In short, you can only read down
- Example?
- In a few slides, we will expand the simple security condition by making the concept of level more precise

- The *-property states that S can write O if and only if level(S) ≤ level(O) and S has discretionary write access to O
- In short, you can only write up
- Example?
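The two rules can be sketched as code for the linear hierarchy above. This is a minimal illustration only; the discretionary-access half of each rule is omitted, and the level names simply follow the clearance table.

```python
# Linear clearance hierarchy, lowest to highest.
LEVELS = {"Unclassified": 0, "Restricted": 1, "Confidential": 2,
          "Secret": 3, "Top Secret": 4}

def can_read(subject_level: str, object_level: str) -> bool:
    """Simple security condition: read down only, level(O) <= level(S)."""
    return LEVELS[object_level] <= LEVELS[subject_level]

def can_write(subject_level: str, object_level: str) -> bool:
    """*-property: write up only, level(S) <= level(O)."""
    return LEVELS[subject_level] <= LEVELS[object_level]
```

For example, a Secret subject may read a Confidential activity log but not write to it, which prevents high information from leaking down into low objects.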

- Assume your system starts in a secure initial state
- Let T be the set of all possible state transformations
- If every element of T preserves the simple security condition and the *-property, every reachable state is secure
- This is sort of a stupid theorem, because we define "secure" to mean a state that preserves the simple security condition and the *-property

- We add compartments such as NUC = Non-Union Countries, EUR = Europe, and US = United States
- The possible sets of compartments are:
  - ∅
  - {NUC}
  - {EUR}
  - {US}
  - {NUC, EUR}
  - {NUC, US}
  - {EUR, US}
  - {NUC, EUR, US}
- Put a clearance level together with a compartment set and you get a security level
  - The literature does not always agree on terminology

- The subset relationship induces a lattice on the eight compartment sets:
  - ∅ is at the bottom and {NUC, EUR, US} is at the top
  - Each singleton ({NUC}, {EUR}, {US}) sits below the two two-element sets that contain it (e.g., {NUC} is below {NUC, EUR} and {NUC, US})

- Let L be a clearance level and C be a set of categories
- Instead of talking about level(O) ≤ level(S), we say that security level (L, C) dominates security level (L', C') if and only if L' ≤ L and C' ⊆ C
- Simple security now requires (LS, CS) to dominate (LO, CO) and S to have read access
- The *-property now requires (LO, CO) to dominate (LS, CS) and S to have write access
- Problems?
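The dominates relation extends the earlier linear check with a subset test on compartments. A minimal sketch, with security levels modeled as (clearance, compartment-set) pairs; the discretionary-access checks are again omitted for brevity.

```python
LEVELS = {"Unclassified": 0, "Restricted": 1, "Confidential": 2,
          "Secret": 3, "Top Secret": 4}

def dominates(level_a: str, cats_a: set, level_b: str, cats_b: set) -> bool:
    """(L, C) dominates (L', C') iff L' <= L and C' is a subset of C."""
    return LEVELS[level_b] <= LEVELS[level_a] and cats_b <= cats_a

def can_read(subj: tuple, obj: tuple) -> bool:
    """Simple security: the subject's security level must dominate the object's."""
    return dominates(subj[0], subj[1], obj[0], obj[1])

def can_write(subj: tuple, obj: tuple) -> bool:
    """*-property: the object's security level must dominate the subject's."""
    return dominates(obj[0], obj[1], subj[0], subj[1])
```

Note that dominance is only a partial order: a subject cleared for ("Secret", {US}) can neither read nor write an object at ("Confidential", {NUC}), because neither level dominates the other.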

- A commercial model that focuses on transactions
- Just like a bank, we want certain conditions to hold before a transaction and the same conditions to hold after
- If the conditions hold in both cases, we call the system consistent
- Example:
  - D is the amount of money deposited today
  - W is the amount of money withdrawn today
  - YB is the amount of money in all accounts at the end of business yesterday
  - TB is the amount of money currently in all accounts
  - Thus, D + YB - W = TB

- Data items that have to follow integrity controls are called constrained data items, or CDIs
- The rest of the data items are unconstrained data items, or UDIs
- Integrity constraints (like the bank transaction rule) constrain the values of the CDIs
- Two kinds of procedures:
  - Integrity verification procedures (IVPs) test that the CDIs conform to the integrity constraints
  - Transformation procedures (TPs) change the data in the system from one valid state to another

- Clark-Wilson has a system of nine rules designed to protect the integrity of the system
- There are five certification rules that test whether the system is in a valid state
- There are four enforcement rules that give requirements for the system

- CR1: When any IVP is run, it must ensure that all CDIs are in a valid state
- CR2: For some associated set of CDIs, a TP must transform those CDIs from a valid state into a (possibly different) valid state
  - By inference, a TP is only certified to work on a particular set of CDIs

- ER1: The system must maintain the certified relations and must ensure that only TPs certified to run on a CDI manipulate that CDI
- ER2: The system must associate a user with each TP and set of CDIs. The TP may access those CDIs on behalf of the associated user. If the user is not associated with a particular TP and CDI, then the TP cannot access that CDI on behalf of that user.
- Thus, a user is only allowed to use certain TPs on certain CDIs
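ER1 and ER2 amount to a lookup in a table of certified (user, TP, CDI) triples, and an IVP is just a check of the integrity constraint. A minimal sketch using the bank example; the users, TP names, and CDI names here are hypothetical.

```python
# Hypothetical certified relations: which user may run which TP on which CDI.
CERTIFIED = {
    ("teller", "deposit", "accounts"),
    ("teller", "withdraw", "accounts"),
    ("auditor", "run_ivp", "accounts"),
}

def may_execute(user: str, tp: str, cdi: str) -> bool:
    """ER1/ER2: only certified (user, TP, CDI) triples may proceed."""
    return (user, tp, cdi) in CERTIFIED

def ivp_consistent(d: int, yb: int, w: int, tb: int) -> bool:
    """An IVP for the bank constraint: D + YB - W must equal TB."""
    return d + yb - w == tb
```

Separation of duty falls out of the table: the auditor can run the IVP but cannot move money, and the teller can move money but cannot certify the result.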

- CR3: The allowed relations must meet the requirements imposed by the principle of separation of duty
- ER3: The system must authenticate each user attempting to execute a TP
  - In theory, this means that users don't necessarily have to log on if they are not going to interact with CDIs

- CR4: All TPs must append enough information to reconstruct the operation to an append-only CDI
  - This is a logging requirement
- CR5: Any TP that takes a UDI as input may perform only valid transformations, or no transformations, for all possible values of the UDI. The transformation either rejects the UDI or transforms it into a CDI.
  - This gives a rule for bringing new information into the integrity system

- ER4: Only the certifier of a TP may change the list of entities associated with that TP. No certifier of a TP, or of any entity associated with that TP, may ever have execute permission with respect to that entity.
  - This enforces separation of duty

- Designed to be close to real commercial situations
- No rigid multilevel scheme
- Enforces separation of duty
- Certification and enforcement are separated
  - Enforcement in a system depends simply on following the given rules
  - Certification of a system is difficult to determine

- Chinese Wall and Biba models
- Theoretical limitations (the HRU result)
- Trusted system design elements
- Yuki Gage presents

- Read Sections 5.1 – 5.3
- Keep working on Project 2