Computer Security: Art and Science, 2nd Edition

Presentation transcript:

Design Principles
Chapter 14

Overview
– Simplicity, restriction
– Principles
  – Least Privilege
  – Fail-Safe Defaults
  – Economy of Mechanism
  – Complete Mediation
  – Open Design
  – Separation of Privilege
  – Least Common Mechanism
  – Least Astonishment

Overview
– Simplicity
  – Less to go wrong
  – Fewer possible inconsistencies
  – Easy to understand
– Restriction
  – Minimize access
  – Inhibit communication

The design principles are rooted in simplicity and restrictiveness. Simplicity applies at many levels. The basic idea is that simpler things have fewer components, so less can go wrong. Further, there are fewer interfaces, so there are fewer entities communicating through those interfaces that can become inconsistent. Finally, simple mechanisms are easier to check, because they are easier to understand and there is less to check. Restriction minimizes the number and types of interactions between the entity and other entities. In some circles an example is the "need to know" principle: give the entity only the information it needs to complete its task. The entity should also release information only when its goals require it to. Note that this includes writing (integrity), because by altering other entities the writer can communicate information.

Chapter 14: Design Principles
– Overview
– Principles
  – Least Privilege
  – Fail-Safe Defaults
  – Economy of Mechanism
  – Complete Mediation
  – Open Design
  – Separation of Privilege
  – Least Common Mechanism
  – Least Astonishment

Least Privilege
– A subject should be given only those privileges necessary to complete its task
– Function, not identity, controls
– Rights added as needed, discarded after use
– Minimal protection domain

This is an example of restriction. Key concepts:
– Function: what is the task, and what is the minimal set of rights needed? "Minimal" here means that if the right is not present, the task cannot be performed. A good example is a UNIX network server that needs access to a port below 1024 (this access requires root).
– Rights added and discarded: if the task requires privileges for only one action, then the privileges should be added before the action and removed afterward. Returning to the UNIX network server, if the server need not act as root (for example, an SMTP server), then drop the root privileges immediately after the port is opened.
– Minimal protection domain: this restates and reinforces the other two.
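As a concrete sketch of the UNIX server example in the notes (the port number and the uid/gid value 65534, conventionally "nobody", are illustrative assumptions, not from the text), a process started as root might bind the privileged port and then permanently drop root before doing anything else:

```c
/* Illustrative sketch only: bind a privileged port, then drop root. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main(void) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) { perror("socket"); return 1; }

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(25);               /* port below 1024: binding requires root */

    if (bind(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("bind");
        return 1;
    }

    /* Root is no longer needed; drop the group first, then the user. */
    if (setgid(65534) != 0 || setuid(65534) != 0) {   /* 65534: "nobody" on many systems */
        perror("drop privileges");
        return 1;                                     /* fail closed rather than run as root */
    }

    /* From here on the server runs in a minimal protection domain. */
    if (listen(fd, 16) < 0) { perror("listen"); return 1; }
    /* ... accept() and handle connections ... */
    return 0;
}
```

Dropping the group before the user matters: once setuid() succeeds, the process usually no longer has the privilege needed to change its group.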

Related: Least Authority
– Principle of Least Authority (POLA)
– Often considered the same as the Principle of Least Privilege
– Some make a distinction:
  – Permissions control what a subject can do to an object directly
  – Authority controls what influence a subject has over an object (directly or indirectly, through other subjects)

Fail-Safe Defaults
– Default action is to deny access
– If an action fails, the system is as secure as when the action began

The first point is well known: set everything to deny, then add rights explicitly. This follows the Principle of Least Privilege. A variation appears when writing security-sensitive code that takes untrusted data (such as input) that may contain meta-characters: the rule of thumb is to specify the legal characters and discard all others, rather than to specify the illegal characters and discard those. More on this in Chapter 29. The second point is often overlooked, but it goes to the meaning of "fail safe": if something fails, the system is still safe. Failure should never change the security state of the system. Hence, if an action fails, the system should be as secure as if the action had never taken place.
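The allowlist rule of thumb in the notes can be sketched as follows; the particular set of permitted characters is only an example of such a policy:

```c
/* Illustrative sketch: accept only characters known to be legal (deny by default). */
#include <stdbool.h>
#include <string.h>

/* Example policy: letters, digits, dot, underscore, and dash are legal. */
static const char ALLOWED[] =
    "abcdefghijklmnopqrstuvwxyz"
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
    "0123456789._-";

bool input_is_safe(const char *s) {
    for (; *s != '\0'; s++) {
        if (strchr(ALLOWED, *s) == NULL)
            return false;          /* anything not explicitly allowed is rejected */
    }
    return true;
}
```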

Economy of Mechanism
– Keep it as simple as possible
  – KISS Principle
– Simpler means less can go wrong
  – And when errors occur, they are easier to understand and fix
– Interfaces and interactions

KISS principle: "Keep It Simple, Silly" (often stronger pejoratives are substituted for "silly"). Simplicity refers to all dimensions: design, implementation, operation, interaction with other components, even specification. The toolkit philosophy of the UNIX system is an excellent example: each tool is designed and implemented to perform a single task, and the tools are then put together. This allows each component to be checked, and then the interfaces between them; it is conceptually much less complex than examining the unit as a whole. The key, though, is to define all interfaces completely (for example, environment variables and global variables as well as parameter lists).

Complete Mediation
– Check every access
– Usually done once, on first action
  – UNIX: access checked on open, not checked thereafter
– If permissions change afterward, unauthorized access may result

The reason for relaxing this principle is efficiency: if you perform many accesses, the checks will slow you down substantially. It is not clear whether that is really true, though. Exercise: have a process open a UNIX file for reading. From the shell, delete the read permission that allows the process to read the file. Then have the process read from the open file. The process can do so; this shows the check is done at the open. To confirm, have the process close the file and then try to reopen it for reading. This open will fail. Note that UNIX systems fail to enforce this principle to any degree for a superuser process, where access permissions are not even checked on an open! This is why people create management accounts (more properly, role accounts) like bin or mail: by restricting processes to those accounts, access control checking applies to them. This is also an application of the principle of least privilege.
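The exercise in the notes can be run as a small program. The sketch below assumes a file named testfile that the caller can initially read; while the program waits, remove read permission from a shell (for example, chmod a-r testfile) and press Enter:

```c
/* Illustrative sketch of the exercise: UNIX checks access at open(), not at read(). */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/types.h>

int main(void) {
    char buf[64];
    int fd = open("testfile", O_RDONLY);    /* the access check happens here */
    if (fd < 0) { perror("first open"); return 1; }

    printf("Now run: chmod a-r testfile, then press Enter\n");
    getchar();

    /* This read succeeds even though read permission has been revoked. */
    ssize_t n = read(fd, buf, sizeof(buf));
    printf("read() on the already-open descriptor returned %zd\n", n);
    close(fd);

    /* A fresh open re-checks permissions and fails. */
    if (open("testfile", O_RDONLY) < 0)
        perror("second open");

    return 0;
}
```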

Open Design
– Security should not depend on secrecy of design or implementation
  – Popularly misunderstood to mean that source code should be public
  – "Security through obscurity"
– Does not apply to information such as passwords or cryptographic keys

Note that source code need not be available to meet this principle. It simply says that your security cannot depend on your design being a secret. Secrecy can enhance security, but if the design becomes exposed, the security of the mechanism must not be compromised. The problem is that people are very good at finding out the secrets that protect you. They may figure them out from the way the system works, by reverse engineering the interface or system, or by more prosaic techniques such as dumpster diving. This principle does not speak to secrets that do not involve design or implementation; for example, you can keep cryptographic keys and passwords secret.

Separation of Privilege
– Require multiple conditions to grant privilege
  – Separation of duty
  – Defense in depth

You need to meet more than one condition to gain access. Separation of duty says that the one who signs the checks cannot be the one who prints the checks, because otherwise a single person could steal money; to make theft more difficult, the thief must compromise two people, not one. This also provides finer-grained control over a resource than a single condition. The analogy with non-computer security mechanisms is "defense in depth": to get into a castle, you must cross the moat, scale the walls, and drop down inside before you can get in. That is three barriers (conditions) that must be overcome (met).
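As an illustrative sketch (the structure and function names are hypothetical, not from the text), the check-signing example could be enforced by requiring approvals from two distinct people before the privileged action proceeds:

```c
/* Illustrative sketch: two separate conditions must hold before granting the privilege. */
#include <stdbool.h>
#include <string.h>

/* Hypothetical record of who printed and who signed a check. */
struct approval {
    char printed_by[32];
    char signed_by[32];
};

bool may_issue_check(const struct approval *a) {
    if (a->printed_by[0] == '\0' || a->signed_by[0] == '\0')
        return false;                                /* a condition is missing: deny */
    if (strcmp(a->printed_by, a->signed_by) == 0)
        return false;                                /* same person did both: deny */
    return true;                                     /* two distinct approvers: allow */
}
```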

Least Common Mechanism
– Mechanisms should not be shared
  – Information can flow along shared channels
  – Covert channels
– Isolation
  – Virtual machines
  – Sandboxes

Isolation prevents communication, and communication with something (another process or a resource) is necessary for a breach of security. Limit the communication, and you limit the damage. If two processes share a resource, then by coordinating access they can communicate by modulating use of that resource. Example: the percentage of the CPU used. To send a 1 bit, the first process uses 75% of the CPU; to send a 0 bit, it uses 25%. The other process sees how much of the CPU it can get, and from that can tell what the first process used, and hence what it is sending. Variations include filling disks, creating files with fixed names, and so forth. Approaches to implementing this principle: isolate each process, via virtual machines or sandboxes (a sandbox is like a VM, but the isolation is not complete).
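The CPU covert channel described in the notes can be sketched from the sender's side; the one-second interval and the 75%/25% split are the notes' example values, and a cooperating receiver would time how much CPU it can obtain during each interval:

```c
/* Illustrative sketch: sender side of the CPU-usage covert channel. */
#include <time.h>
#include <unistd.h>

/* Spin (stay busy) for roughly 'ms' milliseconds of wall-clock time. */
static void busy_for_ms(long ms) {
    struct timespec start, now;
    clock_gettime(CLOCK_MONOTONIC, &start);
    do {
        clock_gettime(CLOCK_MONOTONIC, &now);
    } while ((now.tv_sec - start.tv_sec) * 1000L +
             (now.tv_nsec - start.tv_nsec) / 1000000L < ms);
}

/* Send one bit per one-second interval by modulating CPU usage. */
void covert_send(const int *bits, int nbits) {
    for (int i = 0; i < nbits; i++) {
        long busy = bits[i] ? 750 : 250;            /* 1 bit: ~75% busy; 0 bit: ~25% busy */
        busy_for_ms(busy);
        usleep((useconds_t)((1000 - busy) * 1000)); /* idle for the rest of the second */
    }
}
```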

Least Astonishment
– Security mechanisms should be designed so users understand why the mechanism works the way it does, and so that using the mechanism is simple
– Hide complexity introduced by security mechanisms
– Ease of installation, configuration, use
– Human factors are critical here

This recognizes the human element. General rules:
– Be clear in error messages. You need not be detailed (e.g., whether the user mistyped the password or the login name), but you do need to state the rules for using the mechanism (in the example, that the user must supply both a login name and a password).
– Use (and compilation, installation, and so on) must be straightforward. It is fine to have scripts or other aids to help here, but, for example, data types in a configuration file should either be obvious or explicitly stated. A "duration" field does not indicate whether the period of time is to be expressed in hours, minutes, seconds, or something else, nor whether fractional values are allowed (a float) or not (an integer). The latter is actually a common bug: if the duration is in minutes, does "0.5" mean 30 seconds (as a float) or 0 seconds (as an integer)? If the latter, an error message should be given ("invalid type").
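The "duration" example in the notes suggests making units and types explicit and reporting a clear error otherwise. A sketch of such a strict parser (the field name and message wording are illustrative) might look like this:

```c
/* Illustrative sketch: parse "duration" strictly as a whole number of seconds. */
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>

/* Returns 0 on success; on failure, prints a message that states the rule. */
int parse_duration_seconds(const char *value, long *out) {
    char *end;
    errno = 0;
    long secs = strtol(value, &end, 10);
    if (errno != 0 || end == value || *end != '\0' || secs < 0) {
        fprintf(stderr,
                "invalid type: duration must be a non-negative integer number of seconds"
                " (got \"%s\")\n", value);
        return -1;
    }
    *out = secs;
    return 0;
}
```

With such a check, an entry like "0.5" is rejected with an explanation instead of being silently truncated to 0.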

Related: Psychological Acceptability
– Security mechanisms should not add to the difficulty of accessing a resource
– Idealistic, as most mechanisms add some difficulty
  – Even if only that of remembering a password
– The Principle of Least Astonishment accepts this
  – It asks whether the difficulty is unexpected, or too much for the relevant population of users

Psychological acceptability is usually interpreted as meaning the mechanism must not impose an onerous burden. Strictly speaking, passwords violate this rule (it is not as easy to access a resource by supplying a password as it is to access it without one), but a password is considered a minimal burden.

Key Points
– Principles of secure design underlie all security-related mechanisms
– They require:
  – A good understanding of the goal of the mechanism and the environment in which it is to be used
  – Careful analysis and design
  – Careful implementation

To sum up, these principles require simplicity (and, when that is not possible, minimal complexity); restrictiveness; understanding the goals of the proposed security; and understanding the environment in which the mechanism will be developed and deployed.