Access Control Models
Sandro Etalle (slides by Daniel Trivellato)
Outline

- DAC and the access matrix
- Mandatory access control and the lattice
- Multi-level security: the Bell-LaPadula (BLP) model
- Applications of BLP
Information security objectives

Information security has three objectives:
- confidentiality (or secrecy): preventing unauthorized disclosure of information
- integrity: preventing unauthorized modification of information
- availability: preventing denial of access to information
Access Control (AC)

- Goal: protect data and resources from unauthorized use
- Policy: high-level rules describing which accesses the system should authorize
- Model: formally defines the specification and enforcement of access control
- Mechanism: implements the policies via low-level (software and hardware) functions
AC requirements

- Correctness of AC relies on proper user identification/authentication
- The mechanism is based on a reference monitor, which must be:
  - tamper-proof
  - non-bypassable
  - confined to a limited part of the system (the security kernel)
  - verifiable
Discretionary Access Control (DAC) policies

- explicit access rules establish who can execute which actions on which resources
- users can pass on their rights to other users
- granting and revocation of rights are regulated by an administrative policy (centralized vs. ownership-based)
Access matrix model (1/2)

- the authorization state is represented as a matrix
- the system state is a triple (S, O, A), where
  - S is a set of subjects
  - O is a set of objects
  - A is an access matrix, where rows correspond to subjects and columns correspond to objects
  - A[s,o] describes the privileges of s on o
Access matrix - Example

           File 1             File 2        File 3   Program 1
Alice      own, read, write   read, write
Bob        read                             write    execute
Charlie                       read                   execute, read
Access matrix model (2/2)

State changes occur via commands calling primitive operations:
- enter r into A[s,o]
- delete r from A[s,o]
- create subject s'
- destroy subject s'
- create object o'
- destroy object o'
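The model above can be sketched in a few lines of Python. This is a hypothetical illustration, not part of the formal model: the class name, method names, and dictionary representation are my own choices.

```python
# Minimal sketch of the access matrix model (hypothetical names).
# The state is (S, O, A): subjects, objects, and a matrix of privilege sets.

class AccessMatrix:
    def __init__(self):
        self.subjects = set()
        self.objects = set()
        self.A = {}  # (subject, object) -> set of rights

    # --- primitive operations from the slide ---
    def create_subject(self, s):
        self.subjects.add(s)

    def create_object(self, o):
        self.objects.add(o)

    def enter(self, r, s, o):
        # enter r into A[s,o]
        self.A.setdefault((s, o), set()).add(r)

    def delete(self, r, s, o):
        # delete r from A[s,o]
        self.A.get((s, o), set()).discard(r)

    def destroy_subject(self, s):
        self.subjects.discard(s)
        self.A = {k: v for k, v in self.A.items() if k[0] != s}

    def destroy_object(self, o):
        self.objects.discard(o)
        self.A = {k: v for k, v in self.A.items() if k[1] != o}

    # --- access check against the current state ---
    def allowed(self, s, r, o):
        return r in self.A.get((s, o), set())

m = AccessMatrix()
m.create_subject("Alice")
m.create_object("File 1")
m.enter("own", "Alice", "File 1")
m.enter("read", "Alice", "File 1")
print(m.allowed("Alice", "read", "File 1"))   # True
print(m.allowed("Alice", "write", "File 1"))  # False
```

Note that commands only manipulate the state; the access decision itself is a plain lookup in A, which is what makes DAC cheap to enforce but blind to what happens to the data afterwards.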
DAC weaknesses (1/2)

DAC is very flexible, but…
- Alice owns a file; she wants Bob to read it, but not Charlie
- under DAC, Bob can leak the information to Charlie (even without being aware of it)
- how? Trojan horse: software containing hidden code that performs (illegitimate) functions unknown to the caller
Trojan horse - Example

Bob invokes an application (e.g. a calendar) that contains hidden malicious code. Legitimately, the application reads Bob's file "contacts" (owner: Bob); covertly, the malicious code also writes the contacts into a file "stolen" owned by Daniel, who has granted Bob write access via the authorization (Bob, write, stolen). Bob's contacts thus leak to Daniel without Bob being aware of it.
DAC weaknesses (2/2)

- DAC constrains only direct access; there is no control over what happens to information after release
- Trojan horses exploit the access privileges of the calling subject
- covert channels (later)
Outline

- Review: DAC and the access matrix
- Mandatory Access Control and the lattice
- The Bell-LaPadula (BLP) model
- Applications of BLP
Mandatory Access Control (MAC) policies

- goal: prevent the illegitimate flow (leakage) of information
- idea: attach security labels to subjects and objects
- MAC distinguishes between users and the subjects acting on their behalf: we can trust the users, but not the subjects
Military security (1/3)

- initially ('70s), most research in information security was applied to the military domain
- need to protect information that, if known by an enemy, might damage national security
- protecting information is costly
Military security (2/3)

- different (hierarchical) sensitivity levels are assigned to information: unclassified (= public) < (restricted) < confidential < secret < top secret
- a clearance is assigned to individuals (reflecting their trustworthiness)
- example: in the USA, a 'secret' clearance involves checking FBI fingerprint files; 'top secret' also involves background checks covering the previous 5-15 years of employment
Military security (3/3)

- an individual need not be aware of all information at a given sensitivity level
- finer-grained classification on the basis of the need to know
- information about different areas is divided into separate compartments (e.g. nuclear, chemical), possibly overlapping
- the classification (security class) of an object is a tuple (sensitivity level, {compartments})
Information flow policies (1/2)

- defined by Denning ('76)
- concerned with the flow of information from one security class to another
- information flow is modeled as an ordering relation
- instead of a list of axioms governing users' accesses, simply require that information transfers obey the ordering relation
Information flow policies (2/2)

An information flow policy is a triple (SC, may_flow, +), where
- SC is a set of security classes
- may_flow is a relation on SC x SC
- '+' is a function from SC x SC to SC

Example:
- SC = {(S, {nuclear, chemical}), (S, {nuclear}), (S, {chemical})}
- (S, {nuclear}) may_flow (S, {nuclear, chemical})
- (S, {nuclear}) + (S, {chemical}) = (S, {nuclear, chemical})
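For the single-level example above, may_flow reduces to subset inclusion on compartment sets and '+' to set union. A minimal sketch under that assumption (the tuple representation and function names are mine, not Denning's notation):

```python
# Security classes from the example: sensitivity level S with compartment sets.
# may_flow holds when the source's compartments are contained in the target's;
# '+' (the join) takes the union of the compartment sets.

S = "S"

def may_flow(a, b):
    (_, ca), (_, cb) = a, b
    return ca <= cb          # subset inclusion on compartments (same level here)

def join(a, b):
    (lvl, ca), (_, cb) = a, b
    return (lvl, ca | cb)    # union of compartments

nuclear  = (S, frozenset({"nuclear"}))
chemical = (S, frozenset({"chemical"}))
both     = (S, frozenset({"nuclear", "chemical"}))

print(may_flow(nuclear, both))          # True
print(join(nuclear, chemical) == both)  # True
```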
Orders and lattice (1/2)

- partial order on a set: a binary relation that is
  - transitive: if a ≥ b and b ≥ c then a ≥ c
  - reflexive: a ≥ a
  - anti-symmetric: if a ≥ b and b ≥ a then a = b
- total order: like a chain (for any a, b, either a ≥ b or b ≥ a)
- lattice: a partially ordered set in which every pair of elements has a least upper bound and a greatest lower bound
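For a finite set, the partial-order axioms can be checked mechanically by brute force. A small sketch (function names and the example relations are mine):

```python
from itertools import product

def is_partial_order(elems, leq):
    """Check reflexivity, anti-symmetry and transitivity of leq on elems."""
    for a in elems:
        if not leq(a, a):                               # reflexive
            return False
    for a, b in product(elems, repeat=2):
        if leq(a, b) and leq(b, a) and a != b:          # anti-symmetric
            return False
    for a, b, c in product(elems, repeat=3):
        if leq(a, b) and leq(b, c) and not leq(a, c):   # transitive
            return False
    return True

# Subset inclusion on the powerset of {1, 2} is a partial order...
sets = [frozenset(), frozenset({1}), frozenset({2}), frozenset({1, 2})]
print(is_partial_order(sets, lambda a, b: a <= b))            # True
# ...but comparing sets by size is not: {1} and {2} have equal
# size without being equal, so anti-symmetry fails.
print(is_partial_order(sets, lambda a, b: len(a) <= len(b)))  # False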
Orders and lattice (2/2)

Hasse diagrams depict a partial order. [Figure: Hasse diagrams (a)-(e); diagram (e) is not a lattice.]
Information flow - Example

Consider a company in which line managers report income to two different superiors: a business manager and an auditor. The auditor and the business manager are independent. Information may_flow from the workers to the line managers, and from the line managers to the business manager and the auditor. This may_flow ordering is not a lattice: the business manager and the auditor have no common upper bound, hence no least upper bound.
Classification lattice - Example

Levels: TS, S with TS > S; compartments: Nuclear, Chemical.

The partial order on security classes is called dominates:
(L1, C1) ≥ (L2, C2) iff L1 ≥ L2 and C2 ⊆ C1

[Figure: Hasse diagram of the eight classes, from (S, {}) at the bottom to (TS, {Nuclear, Chemical}) at the top.]

Examples:
- lub((TS, {Nuclear}), (S, {Nuclear, Chemical})) = (TS, {Nuclear, Chemical})
- glb((TS, {Nuclear}), (S, {Nuclear, Chemical})) = (S, {Nuclear})
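The dominates order and its lub/glb follow directly from the definition: take the maximum (resp. minimum) of the levels and the union (resp. intersection) of the compartments. A sketch (the RANK dictionary and function names are my own encoding):

```python
# Security classes are (level, frozenset of compartments); TS > S.
RANK = {"S": 0, "TS": 1}

def dominates(a, b):
    """(L1,C1) >= (L2,C2) iff L1 >= L2 and C2 is a subset of C1."""
    (l1, c1), (l2, c2) = a, b
    return RANK[l1] >= RANK[l2] and c2 <= c1

def lub(a, b):
    (l1, c1), (l2, c2) = a, b
    lvl = l1 if RANK[l1] >= RANK[l2] else l2
    return (lvl, c1 | c2)    # max level, union of compartments

def glb(a, b):
    (l1, c1), (l2, c2) = a, b
    lvl = l1 if RANK[l1] <= RANK[l2] else l2
    return (lvl, c1 & c2)    # min level, intersection of compartments

x = ("TS", frozenset({"Nuclear"}))
y = ("S", frozenset({"Nuclear", "Chemical"}))
print(lub(x, y) == ("TS", frozenset({"Nuclear", "Chemical"})))  # True
print(glb(x, y) == ("S", frozenset({"Nuclear"})))               # True
```

Note that x and y are incomparable (neither dominates the other), yet their lub and glb exist: that is exactly what makes the classification structure a lattice.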
Denning's axioms

Under the following assumptions, an information flow policy forms a finite lattice:
1. the set of security classes SC is finite
2. the may_flow relation is a partial order on SC
3. SC has a lower bound with respect to may_flow
4. the operator + is a totally defined least upper bound operator
Outline

- Review: DAC and the access matrix
- Mandatory Access Control and the lattice
- The Bell-LaPadula (BLP) model
- Applications of BLP
The BLP model

- formalizes the mandatory policy for secrecy
- goal: prevent information flow to lower or incomparable security classes (multi-level security)
- idea: augment DAC with MAC (security labels) to enforce information flow policies
- two-step approach:
  1. discretionary access matrix D
  2. operations authorized by the MAC policy, over which users have no control
BLP mandatory access rules

- object o has a security label (class) SL(o); subject s has a security label (clearance) SL(s)
- simple security property: subject s can read object o only if SL(s) ≥ SL(o) (NO READ UP)
- *-property: subject s can write object o only if SL(o) ≥ SL(s) (NO WRITE DOWN)
- Trojan horses leaking information are blocked
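With a dominance order on labels, the two BLP rules become a short reference-monitor check. This is only a sketch: the label assignment SL is invented example data, and for simplicity labels are plain levels without compartments.

```python
# Hierarchical levels from the lecture: U < C < S < TS.
RANK = {"U": 0, "C": 1, "S": 2, "TS": 3}

# Example labels (assumed data, purely for illustration).
SL = {"alice": "S", "report": "C", "battle_plan": "TS"}

def dominates(x, y):
    return RANK[x] >= RANK[y]

def can_read(subject, obj):
    # simple security property: no read up
    return dominates(SL[subject], SL[obj])

def can_write(subject, obj):
    # *-property: no write down
    return dominates(SL[obj], SL[subject])

print(can_read("alice", "report"))        # True  (S >= C)
print(can_read("alice", "battle_plan"))   # False (no read up)
print(can_write("alice", "battle_plan"))  # True  (TS >= S)
print(can_write("alice", "report"))       # False (no write down)
```

The last check is the one that defeats the Trojan horse of the earlier slide: even if malicious code runs with Alice's privileges, it cannot write what it read at level S into a lower-level object.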
BLP information flow

[Figure: subjects and objects at the levels TS > S > C > U. Each subject reads objects at its own or lower levels and writes objects at its own or higher levels, so information flows only upward, from U towards TS.]
BLP - Secure states

- the system is modeled as a finite state machine (a set of states and transitions between states)
- a state contains information about the current authorizations and security labels
- a transition transforms one state into another (adding or removing authorizations)
- a state is secure if the current authorizations satisfy the simple and * security properties
- a system is secure if every state reachable by executing a finite sequence of transitions is secure
BLP + tranquility

- suppose that, when a subject requests access to a resource, the security labels of all subjects and objects are downgraded to the lowest level and access is granted: this is secure by BLP… but not secure in any meaningful sense!
- tranquility property:
  - strong: security labels never change during system operation (TOO STRONG!)
  - weak: labels never change in such a way as to violate a defined security policy (e.g. dynamic upgrade of labels, principle of least privilege)
Exceptions to properties

- data association and aggregation: a set of values seen together must be classified higher than the individual values (e.g. name and salary)
- sanitization and downgrading: a process may produce data less sensitive than those it has read; data may need to be downgraded after some time (embargo)
- these exceptions require TRUSTED SUBJECTS
MAC weaknesses

MAC policies remain vulnerable to covert channels. Examples:
- a low-level subject requests a resource (e.g. the CPU) that is held busy by a high-level subject
- a high-level process can lock shared resources and modify the response times of processes at lower levels (timing channels)

Non-interference: the activity of a high-level process must have no detectable effect on processes at lower or incomparable levels.
Outline

- Review: DAC and the access matrix
- Mandatory Access Control and the lattice
- The Bell-LaPadula (BLP) model
- Applications of BLP
Real world examples

- MULTICS for the Air Force Data Services Centre (time-sharing OS)
- MITRE brassboard kernel
- SIGMA message system
- KSOS (Kernelized Secure Operating System)
- SCOMP (Secure Communications Processor)
- PSOS (Provably Secure Operating System)
- SELinux
- multi-level Database Management Systems
SELinux (NSA)

- a security context contains all the security attributes associated with subjects and objects
- policy decisions are made by a security server residing in-kernel (no kernel-userspace calls for security decisions)
- the server provides a security API to the rest of the kernel, with the security model hidden behind this API
- mechanisms to isolate different services running on the same machine
Summary

- DAC: users can pass on their rights to other users; based on the access matrix model; no control over information after release, hence vulnerable to information leakage (e.g. Trojan horses)
- MAC: prevents illegitimate flows of information by attaching security labels to subjects and objects; distinguishes between users (trusted) and subjects (i.e. processes, not trusted)
Summary

- Lattice: the partial order on security classes forms a lattice structure
- BLP: augments DAC with MAC to prevent information flow to lower or incomparable classes (NO READ UP, NO WRITE DOWN)
Exercises

- Does DAC prevent information leakage? Why? And what about MAC: does it always prevent information leakage?
- Construct a lattice of security classes for the security levels public < secret < top-secret and the compartments {army, politics, business}.
- How many security classes can be constructed with n security levels and m compartments?
Exercises

Can a user cleared for (S, {dog, cat, pig}) access documents with the following classifications under the military information flow model?
- (TS, {dog})
- (S, {dog})
- (S, {dog, cow})
- (S, {monkey})
- (C, {dog, pig, cat})
- (C, { })
Exercises

- What policy is adopted in standard Windows implementations?
- Consider two subjects s1 and s2, with security classifications TS and S respectively. Are the following operations authorized according to BLP, and if not, why?
  1. s1 requests to write an S-object o1 (refer to the tranquility principle)
  2. s2 requests to read o1
  3. s1 requests to send a TS-object o2 to s2
  4. s1 requests to copy the content of o2 to o1
Exercises

- What is the security class of an object obtained by aggregating the content of an S-object and a TS-object?
- Describe a situation in which you may want to allow the security kernel to violate one of the security properties of BLP.
References

- Ravi S. Sandhu, Lattice-Based Access Control Models (strongly recommended)
- Carl E. Landwehr, Formal Models for Computer Security (strongly recommended)
- Pierangela Samarati and Sabrina De Capitani di Vimercati, Access Control: Policies, Models, and Mechanisms (recommended)
- Ross Anderson, Security Engineering (2nd edition) (suggested)