CMSC 414 Computer (and Network) Security
Lecture 12
Jonathan Katz
Midterm?
Will be held Oct 21, in class
Will cover everything up to and including the preceding lecture (Oct 16)
Includes all reading posted on the class syllabus!
Homework review?
Questions on HWs 1 or 2?
Integrity policies (Chapter 6)
Some requirements/assumptions
Users will not write their own programs
–Will use existing programs and databases
Programs will be written/tested on a non-production system
A special process must be followed to install a new program on the production system
Requirements, continued…
The special installation process is controlled and audited
Auditors must have access to both system state and system logs
Some corollaries…
“Separation of duty”
–Basically, have multiple people check any critical function (e.g., software installation)
“Separation of function”
–Develop new programs on a separate system
Auditing
–Recovery/accountability
Biba integrity model
Ordered integrity levels
–The higher the level, the more confidence: that a program will act correctly, that a subject will act appropriately, and that data is trustworthy
–Note that integrity levels may be independent of security labels: confidentiality vs. trustworthiness, and information flow vs. information modification
Information transfer
An information transfer path is a sequence of objects o1, …, on and subjects s1, …, sn-1 such that, for all i, si can read oi and write to oi+1
Information can be transferred from o1 to on via a sequence of read-write operations
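The definition can be made concrete with a short Python sketch (the read/write relations below are hypothetical, not from the lecture): information flows o → o' whenever some subject can read o and write o', so checking for a transfer path is just graph reachability.

```python
# Sketch: decide whether information can flow from object "src" to
# object "dst", given which objects each subject can read/write.
# The relations below are hypothetical examples.

def has_transfer_path(src, dst, can_read, can_write):
    frontier, seen = [src], {src}
    while frontier:
        o = frontier.pop()
        if o == dst:
            return True
        for s, readable in can_read.items():
            if o in readable:
                for o2 in can_write.get(s, set()):
                    if o2 not in seen:
                        seen.add(o2)
                        frontier.append(o2)
    return False

can_read  = {"s1": {"o1"}, "s2": {"o2"}}
can_write = {"s1": {"o2"}, "s2": {"o3"}}
print(has_transfer_path("o1", "o3", can_read, can_write))  # True
```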
“Low-water-mark” policy
s can write to o if and only if the integrity level of s is at least that of o, i.e., i(o) ≤ i(s)
–The information obtained from a subject cannot be more trustworthy than the subject itself
If s reads o, then the integrity level of s is changed to min(i(o), i(s))
–The subject may be relying on data less trustworthy than itself
Continued…
s1 can execute s2 iff the integrity level of s1 is at least that of s2, i.e., i(s2) ≤ i(s1)
–Note that, e.g., s1 provides inputs to s2, so s2 cannot be more trustworthy than s1
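A minimal Python sketch of all three low-water-mark rules (integrity levels as plain integers, higher = more trusted; the names are illustrative):

```python
# Sketch of the low-water-mark policy. Integrity levels are
# integers (higher = more trusted); subject levels are mutable.

class Subject:
    def __init__(self, level):
        self.level = level

def can_write(s, obj_level):
    # The subject must be at least as trusted as the object.
    return s.level >= obj_level

def read(s, obj_level):
    # Reads always succeed, but drag the subject's level down to
    # min(i(o), i(s)): it may now depend on less trustworthy data.
    s.level = min(s.level, obj_level)

def can_execute(s1, s2):
    # s1 may invoke s2 only if s2 is no more trusted than s1.
    return s2.level <= s1.level

s = Subject(level=3)
read(s, obj_level=1)              # s.level drops to 1
print(can_write(s, obj_level=2))  # False: s is now below the object
```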
Security theorem
If there is an information transfer path from o1 to on, then i(on) ≤ i(o1)
–Informally: information transfer does not increase the trustworthiness of the data
–Why: after si reads oi its level is at most i(oi), and writing oi+1 then requires i(oi+1) ≤ i(si) ≤ i(oi); induction along the path gives the claim
Drawbacks of this approach
The integrity level of a subject is non-increasing
–A subject will soon be unable to access objects at high integrity levels
Does not help if integrity levels of objects are lowered instead
–That downgrades the integrity level of trustworthy information
Ring policy
Only deals with direct modification
–Any subject may read any object
–s can write to o iff i(o) ≤ i(s)
–s1 can execute s2 iff i(s2) ≤ i(s1)
The difference is that integrity levels of subjects do not change…
The security theorem holds here as well
Strict integrity policy
“Biba’s model”
–s can read o iff i(s) ≤ i(o)
–s can write o iff i(o) ≤ i(s)
–s1 can execute s2 iff i(s2) ≤ i(s1)
Note that read and write are both allowed only if i(s) = i(o)
The security theorem holds here as well
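For contrast, here is a small sketch of the ring and strict-integrity access predicates side by side (levels as plain integers; under both policies, subject levels never change):

```python
# Sketch: access predicates for the ring and strict integrity
# policies; unlike low-water-mark, levels are immutable.

def ring_read(i_s, i_o):
    return True               # any subject may read any object

def ring_write(i_s, i_o):
    return i_o <= i_s         # may only write at or below own level

def strict_read(i_s, i_o):
    return i_s <= i_o         # may only read at or above own level

def strict_write(i_s, i_o):
    return i_o <= i_s         # may only write at or below own level

def can_execute(i_s1, i_s2):  # same rule under both policies
    return i_s2 <= i_s1

# Under the strict policy, read and write on the same object are
# both allowed only when i(s) == i(o):
print(strict_read(2, 2) and strict_write(2, 2))  # True
print(strict_read(1, 2) and strict_write(1, 2))  # False
```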
Lipner’s basic model
Based loosely on Bell-LaPadula
–Two security levels
 Audit manager (AM)
 System low (SL)
–Five categories
 Development (D) - production programs under development
 Production code (PC) - production processes/programs
 Production data (PD)
 System development (SD) - system programs under development
 Software tools (T) - programs unrelated to protected data
Lipner’s model, continued
Assign users to levels/categories; e.g.:
–Regular users: (SL, {PC, PD})
–Developers: (SL, {D, T})
–System auditors: (AM, {D, PC, PD, SD, T})
–Etc.
Lipner’s model, continued
Objects are assigned levels/categories based on who should access them; e.g.:
–Ordinary users should be able to read production code, so it is labeled (SL, {PC})
–Ordinary users should be able to write production data, so it is labeled (SL, {PC, PD})
–Follows the Bell-LaPadula methodology…
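The labeling relies on Bell-LaPadula’s “dominates” relation; a minimal Python sketch of that check over Lipner’s labels (level/category names as above, helper names hypothetical):

```python
# Sketch: BLP-style dominance over Lipner labels. A label is a
# (level, categories) pair, with AM above SL; l1 dominates l2 iff
# l1's level is at least l2's and l1's categories contain l2's.

LEVELS = {"SL": 0, "AM": 1}

def dominates(l1, l2):
    (lvl1, cats1), (lvl2, cats2) = l1, l2
    return LEVELS[lvl1] >= LEVELS[lvl2] and cats1 >= cats2

user  = ("SL", {"PC", "PD"})   # regular user
prod  = ("SL", {"PC"})         # production code object
dev   = ("SL", {"D", "T"})     # developer
pdata = ("SL", {"PC", "PD"})   # production data object

# BLP read rule: a subject may read only objects it dominates.
print(dominates(user, prod))   # True: users can read production code
print(dominates(dev, pdata))   # False: developers can't read prod data
```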
Properties
This satisfies the initial requirements:
–Users cannot execute category T, so they cannot write their own programs
–Developers do not have read/write access to PD, so they cannot access production data
 If they need production data, the data must first be downgraded to D (this requires sys admins)
–Etc.
Lipner’s full model
Augment security classifications with integrity classifications
Now, a subject’s access rights to an object depend on both its security classification and its integrity classification
–E.g., a subject can read an object only if the subject’s security class is at least as high as the object’s and its integrity class is no higher than the object’s
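Continuing the earlier sketch, the combined read check is just the conjunction of the BLP rule on security labels and Biba’s strict rule on integrity levels (a sketch under those assumptions; dominates() repeated here for self-containment):

```python
# Sketch: read check in Lipner's full model -- BLP on security
# labels, Biba's strict policy on integrity levels.

LEVELS = {"SL": 0, "AM": 1}

def dominates(l1, l2):
    (lvl1, cats1), (lvl2, cats2) = l1, l2
    return LEVELS[lvl1] >= LEVELS[lvl2] and cats1 >= cats2

def can_read(sec_subj, sec_obj, int_subj, int_obj):
    # BLP: the subject's security label dominates the object's;
    # Biba: the subject's integrity level is no higher than the object's.
    return dominates(sec_subj, sec_obj) and int_subj <= int_obj
```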
Clark-Wilson model (highlights)
Transactions are the basic operation
–Not subjects/objects
The system should always remain in a “consistent state”
–A well-formed transaction leaves the system in a consistent state
Must also verify the integrity of the transactions themselves
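As a toy illustration (not Clark-Wilson’s formal CDI/TP machinery), a well-formed transaction can be modeled as an operation that commits only if a consistency check still holds afterwards:

```python
# Toy sketch: a "well-formed transaction" on a ledger. Consistency
# means no negative balances and the total is preserved; the
# transaction commits only if the check passes.

def consistent(ledger, expected_total):
    return all(v >= 0 for v in ledger.values()) \
        and sum(ledger.values()) == expected_total

def transfer(ledger, src, dst, amount):
    total = sum(ledger.values())
    new = dict(ledger)
    new[src] -= amount
    new[dst] += amount
    if not consistent(new, total):
        raise ValueError("transaction would break consistency")
    return new  # commit

ledger = {"alice": 100, "bob": 50}
ledger = transfer(ledger, "alice", "bob", 30)  # ok: stays consistent
# transfer(ledger, "bob", "alice", 500)        # would raise
```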
Access control mechanisms (Chapter 15)
The problem
Drawbacks of access control matrices…
–In practice, the number of subjects/objects is large
–Most entries are blank/default
–The matrix is modified every time subjects/objects are created/deleted
Access control lists (ACLs)
Instead of storing a central matrix, store each column with the object it represents
–Stored as pairs (s, r)
Subjects not in the list have no rights
–Can use wildcards to give default rights
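A minimal sketch of an ACL as a per-object list of (subject, rights) pairs (Python; the file names and subjects are illustrative):

```python
# Sketch: each object carries its own list of (subject, rights)
# pairs -- effectively one column of the access control matrix.

acls = {
    "payroll.db": [("alice", {"read", "write"}), ("bob", {"read"})],
    "notes.txt":  [("alice", {"read"})],
}

def check(subject, right, obj):
    # Subjects not in the list have no rights.
    for s, rights in acls.get(obj, []):
        if s == subject and right in rights:
            return True
    return False

print(check("bob", "read",  "payroll.db"))  # True
print(check("bob", "write", "payroll.db"))  # False
```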
Example: Unix
Unix divides users into three classes:
–Owner of the file
–Group owner of the file
–All other users
Note that this leaves little flexibility…
Some systems have been extended to allow for more flexibility
–Abbreviated ACLs are overridden by explicit ACLs
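To see the three classes concretely, a short standard-library Python sketch that decodes a file’s abbreviated ACL from its mode bits (applied here to the script itself so it runs anywhere):

```python
# Sketch: the three Unix permission classes as mode bits. The
# kernel picks exactly one class per access -- owner, else group,
# else other -- based on the process's uid/gid.

import os, stat

mode = os.stat(__file__).st_mode

def rwx(r, w, x):
    return (bool(mode & r), bool(mode & w), bool(mode & x))

print("owner rwx:", rwx(stat.S_IRUSR, stat.S_IWUSR, stat.S_IXUSR))
print("group rwx:", rwx(stat.S_IRGRP, stat.S_IWGRP, stat.S_IXGRP))
print("other rwx:", rwx(stat.S_IROTH, stat.S_IWOTH, stat.S_IXOTH))
```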
Modifying ACLs
Only processes that “own” the object can modify its ACL
–Sometimes there is a special “grant” right (possibly per right)
Privileged users?
How do ACLs apply to a privileged user?
–E.g., Solaris uses both abbreviated ACLs and “full” ACLs
 Abbreviated ACLs are ignored for root, but full ACLs apply even to root
Groups/wildcards?
Groups and wildcards reduce the size and complexity of ACLs
–E.g.:
 user : group : r
 * : group : r
 user : * : r
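A small sketch of how such wildcard entries might be matched, any matching entry granting the right (entry format follows the user : group : rights pattern above; the names are made up):

```python
# Sketch: ACL entries (user, group, rights), where "*" matches any
# user or any group. A right is granted if any entry matches.

entries = [
    ("holly", "faculty", {"r"}),
    ("*",     "student", {"r"}),       # default right for students
    ("holly", "*",       {"r", "w"}),
]

def allowed(user, group, right):
    for u, g, rights in entries:
        if u in ("*", user) and g in ("*", group) and right in rights:
            return True
    return False

print(allowed("holly", "staff",   "w"))  # True, via ("holly", "*", ...)
print(allowed("sage",  "student", "w"))  # False: wildcard grants only "r"
```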