1
EE515/IS523 Think Like an Adversary Lecture 6 Access Control/UI in a Nutshell Yongdae Kim
2
Recap ^ http://syssec.kaist.ac.kr/courses/ee515 ^ E-mail policy: include [ee515] or [is523] in the subject of your e-mail ^ Class presentation choices (say “yes” to at least 3): https://docs.google.com/spreadsheet/viewform?formkey=dEdzMTRWRk8zRHFaZjdIS3F%202TE44ekE6MQ#gid=0 ^ Text-only posting, e-mail! ^ Pre-proposal meeting this week: group leader sends me three 30-min time windows between Wednesday and Friday (evening is OK)
3
Recap ^ Cryptography: Challenge-Response Protocols ^ Key Management: Kerberos vs. PKI vs. IBE
4
OS Security ^ OS Security is essentially concerned with four problems: User authentication links users to processes. Access control is about deciding whether a process may access a resource. Protection is the task of enforcing these decisions: ensuring a process does not access resources improperly. Isolation is keeping one process's resources separate from those of other processes.
5
Access Control ^ The OS mediates access requests between subjects and objects. ^ This mediation should (ideally) be impossible to avoid or circumvent. [Diagram: a Subject's access request passes through the Reference monitor, which decides whether it reaches the Object]
6
Definitions ^ Subjects make access requests on objects. ^ Subjects are the active entities in the system, like users, processes, and programs. ^ Objects are system resources, like memory, data structures, instructions, code, programs, files, sockets, and devices. ^ The access type specifies what is done to the object, for example execute, read, write, allocate, insert, append, list, lock, administer, delete, or transfer.
7
Access Control ^ Discretionary Access Control: Access to objects (files, directories, devices, etc.) is granted based on user identity. Each object is owned by a user. Owners can specify freely (at their discretion) how they want to share their objects with other users, by specifying which other users may have which forms of access. Discretionary access control is implemented on essentially every multi-user OS (Unix, Windows NT, etc.). ^ Mandatory Access Control: Access to objects is controlled by a system-wide policy, for example to prevent certain flows of information. In some forms, the system maintains security labels for both objects and subjects, on the basis of which access is granted or denied. Labels can change as the result of an access. Security policies are enforced without the cooperation of users or application programs. Mandatory access control for Linux: http://www.nsa.gov/research/selinux/
8
Access Control Matrix [Table: an example access control matrix with subjects Subj 1 … Subj m as rows, objects Obj 1 … Obj n as columns, and each cell listing the rights that subject holds on that object (combinations of r, w, l, x, or -- for none)]
9
Representations ^ An access control matrix can be represented internally in different ways: ^ Access Control Lists (ACLs) store the columns with the objects. ^ Capability lists store the rows with the subjects. ^ Role-based systems group rights according to the “role” of a subject. [Table: the example matrix again, annotated to show the column (ACL) and row (capability) views; a short code sketch of the two views follows]
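A minimal sketch (not from the lecture; the subjects, objects, and rights are illustrative) of how one access control matrix can be stored either as per-object ACLs (its columns) or as per-subject capability lists (its rows):

# Sparse access control matrix: (subject, object) -> set of rights
matrix = {
    ("alice", "file1"): {"r", "w"},
    ("alice", "file2"): {"r"},
    ("bob",   "file1"): {"r"},
}

def to_acls(matrix):
    """Column view: for each object, which subjects hold which rights."""
    acls = {}
    for (subj, obj), rights in matrix.items():
        acls.setdefault(obj, {})[subj] = rights
    return acls

def to_capabilities(matrix):
    """Row view: for each subject, which objects it may access and how."""
    caps = {}
    for (subj, obj), rights in matrix.items():
        caps.setdefault(subj, {})[obj] = rights
    return caps

print(to_acls(matrix))          # {'file1': {'alice': {'r', 'w'}, 'bob': {'r'}}, 'file2': {'alice': {'r'}}}
print(to_capabilities(matrix))  # {'alice': {'file1': {'r', 'w'}, 'file2': {'r'}}, 'bob': {'file1': {'r'}}}

Both views hold the same information; they differ in which question is cheap to answer: who can access this object (ACL) versus what can this subject access (capabilities).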
10
Access Control Lists ^ The ACL for an object lists the access rights of each subject (usually users). ^ To check a request, look in the object’s ACL. ^ ACLs are used by most OSes and network file systems, e.g. NT, Unix, and AFS.
11
ACL Problems ^ To be secure, the OS must authenticate that the user is who (s)he claims to be. ^ To revoke a user’s access, we must check every object in the system. ^ There is often no good way to restrict a process to a subset of the user’s rights.
12
Capabilities ^ Capabilities store the allowed list of object accesses with each subject. ^ When the subject requests access to object O, it must provide a “ticket” granting access to O. ^ These tickets are stored in an OS-protected table associated with each process. ^ No widely-used OS uses pure capabilities. ^ Some systems have “capability-like” features: e.g. Kerberos, NT, OLPC, Android
13
ACL vs. Capabilities ^ Capabilities do not require authentication: the OS just checks each ticket on access requests. ^ Capabilities can be passed, or delegated, from one process to another. ^ We can limit the privileges of a process by removing unnecessary tickets from its table (see the sketch below).
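A minimal sketch (illustrative only, not a real OS API) of the two properties above: a capability can be delegated to another process, and it can be weakened first so the recipient gets only a subset of the rights:

class Capability:
    """An unforgeable ticket naming an object and the rights it grants."""
    def __init__(self, obj, rights):
        self.obj = obj
        self.rights = frozenset(rights)

    def restrict(self, allowed):
        """Return a weaker ticket, e.g. drop 'w' before delegating it."""
        return Capability(self.obj, self.rights & set(allowed))

def access(cap, obj, right):
    # The reference monitor only checks the presented ticket; it does not
    # need to authenticate the subject or walk a per-object ACL.
    return cap.obj == obj and right in cap.rights

full = Capability("logfile", {"r", "w"})
read_only = full.restrict({"r"})              # handed to a less trusted helper process
print(access(full, "logfile", "w"))           # True
print(access(read_only, "logfile", "w"))      # False: the delegated ticket cannot write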
14
Roles [Diagram: on the left, each subject S1 … Sm is granted rights directly on objects O1 … On; on the right, roles R1 and R2 sit in between, so subjects are assigned to roles and roles are granted rights on the objects]
15
Unix/POSIX Access Control
kyd@dio (~) % id
uid=3259(kyd) gid=717(faculty) groups=717(faculty),1686(mess),1847(S07C8271),1910(F07C5471),2038(S08C8271)
kyd@dio (~) % ls -l News_and_Recent_Events.zip
-rw-rw-rw- 1 kyd faculty 714904 Feb 22 10:00 News_and_Recent_Events.zip
kyd@dio (/web/classes02/Spring-2011/csci5471) % ls -al
drwxrwsr-x 4 kyd S11C5471 512 Jan 19 10:23 ./
drwxr-xr-x 46 root daemon 1024 Feb 17 23:04 ../
drwxrwsr-x 3 kyd S11C5471 512 Feb 16 00:36 Assignment/
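A small sketch of how the permission bits behind an "ls -l" line like the ones above can be inspected programmatically; the temporary file is only a stand-in for News_and_Recent_Events.zip:

import os, stat, tempfile

fd, path = tempfile.mkstemp()                # any existing file will do
os.close(fd)
st = os.stat(path)
print(stat.filemode(st.st_mode))             # e.g. "-rw-------", cf. "-rw-rw-rw-" above
print(bool(st.st_mode & stat.S_IWOTH))       # world-writable?
print(bool(st.st_mode & stat.S_ISGID))       # setgid bit, the "s" in "drwxrwsr-x"
print(st.st_uid, st.st_gid)                  # owner and group, the basis of Unix DAC
os.remove(path)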
16
Mandatory Access Control policies ^ Restrictions to allowed information flows are not decided at the user's discretion (as with Unix chmod), but instead enforced by system policies. ^ Mandatory access control mechanisms are aimed in particular at preventing policy violations by untrusted application software, which typically has at least the same access privileges as the invoking user.
17
Data Pump/Data Diode ^ Like “air gap” security, but with a one-way communication link that allows users to transfer data from the low-confidentiality to the high-confidentiality environment, but not vice versa. ^ Example: Workstations with highly confidential material are configured to have read-only access to low-confidentiality file servers.
18
The covert channel problem ^ Reference monitors see only intentional communication channels, such as files, sockets, and memory. ^ However, there are many more “covert channels”, which were neither designed nor intended to transfer information at all. ^ A malicious high-level program can use these to transmit high-level data to a low-level receiving process, which can then leak it to the outside world. ^ Examples of covert channels: Resource conflicts – If a high-level process has already created a file F, a low-level process will fail when trying to create a file of the same name → 1 bit of information (a toy sketch follows below). Timing channels – Processes can use the system clock to monitor their own progress and infer the current load, into which other processes can modulate information. Resource state – High-level processes can leave shared resources (disk head position, cache memory content, etc.) in states that influence the service response times for the next process. Hidden information in downgraded documents – Steganographic embedding techniques can be used to get confidential information past a human downgrader (least-significant bits in digital photos, variations of punctuation/spelling/whitespace in plaintext, etc.).
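A toy sketch (illustrative only) of the resource-conflict example: the high-level sender leaks one bit per agreed time slot by creating, or not creating, a file with an agreed-upon name, and the low-level receiver reads the bit off whether its own create attempt fails. The path and the reset step are simplifying assumptions:

import os

SHARED_NAME = "/tmp/covert_slot"

def send_bit(bit):
    if bit:
        open(SHARED_NAME, "x").close()       # occupy the shared name for this slot

def receive_bit():
    try:
        open(SHARED_NAME, "x").close()       # succeeds only if the sender left the name free
        os.remove(SHARED_NAME)
        return 0
    except FileExistsError:
        return 1                             # creation conflict leaks the bit

send_bit(1)
print(receive_bit())                         # 1
os.remove(SHARED_NAME)                       # reset the channel before the next slot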
19
User Interface Failures
20
Humans “Humans are incapable of securely storing high-quality cryptographic keys, and they have unacceptable speed and accuracy when performing cryptographic operations. (They are also large, expensive to maintain, difficult to manage, and they pollute the environment. It is astonishing that these devices continue to be manufactured and deployed. But they are sufficiently pervasive that we must design our protocols around their limitations.)” −− C. Kaufman, R. Perlman, and M. Speciner. Network Security: PRIVATE Communication in a PUBLIC World. 2nd edition. Prentice Hall, page 237, 2002.
21
Humans are the weakest link ^ Most security breaches are attributed to “human error” ^ Social engineering attacks proliferate ^ Frequent security policy compliance failures ^ Automated systems are generally more predictable and accurate than humans
22
Why are humans in the loop at all? ^ Don’t know how or too expensive to automate ^ Human judgments or policy decisions needed ^ Need to authenticate humans
23
The human threat ^ Malicious humans who will attack system ^ Humans who are unmotivated to perform security-critical tasks properly or comply with policies ^ Humans who don’t know when or how to perform security-critical tasks ^ Humans who are incapable of performing security-critical tasks
24
Need to better understand humans in the loop ^ Do they know they are supposed to be doing something? ^ Do they understand what they are supposed to do? ^ Do they know how to do it? ^ Are they motivated to do it? ^ Are they capable of doing it? ^ Will they actually do it?
26
SSL Warnings
27
False Alarm Effect ^ “Detection system” ≈ “System” ^ If the risk is not immediate, warning the user will decrease her trust in the system
28
Patco Construction vs. Ocean Bank ^ Hackers stole ~$600K from Patco via the Zeus malware ^ The transfers triggered the bank's alarms, but the alarms were ignored ^ “substantially increase the risk of fraud by asking for security answers for every $1 transaction” ^ “neither monitored that transaction nor provided notice before it was completed” ^ “commercially unreasonable” – Out-of-Band Authentication – User-Selected Picture – Tokens – Monitoring of Risk-Scoring Reports
29
Password Authentication
30
Definitions ^ Identification - a claim about identity Who or what I am (global or local) ^ Authentication - confirming that claims are true I am who I say I am I have a valid credential ^ Authorization - granting permission based on a valid claim Now that I have been validated, I am allowed to access certain resources or take certain actions ^ Access control system - a system that authenticates users and gives them access to resources based on their authorizations Includes or relies upon an authentication mechanism May include the ability to grant coarse- or fine-grained authorizations, revoke or delegate authorizations Also includes an interface for policy configuration and management
31
Building blocks of authentication ^ Factors Something you know (or recognize) Something you have Something you are ^ Two factors are better than one Especially two factors from different categories ^ What are some examples of each of these factors? ^ What are some examples of two-factor authentication?
32
Authentication mechanisms ^ Text-based passwords ^ Graphical passwords ^ Hardware tokens ^ Public key crypto protocols ^ Biometrics
33
Evaluation ^ Accessibility ^ Memorability ^ Security ^ Cost ^ Environmental considerations
34
Typical password advice
35
^ Pick a hard-to-guess password ^ Don't use it anywhere else ^ Change it often ^ Don't write it down So what do you do when every web site you visit asks for a password?
36
Bank = b3aYZ Amazon = aa66x! Phonebill = p$2$ta1
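A hedged sketch (not from the slides) of how per-site passwords like these can be generated rather than memorized: derive each one from a single master secret plus the site name, the idea behind tools such as PwdHash. The parameters below are illustrative, not a vetted design:

import hashlib, base64

def site_password(master, site, length=12):
    # Slow, salted derivation so one leaked site password doesn't reveal the master
    digest = hashlib.pbkdf2_hmac("sha256", master.encode(), site.encode(), 100_000)
    return base64.b85encode(digest).decode()[:length]

print(site_password("correct horse battery staple", "bank.example"))    # distinct per site,
print(site_password("correct horse battery staple", "amazon.example"))  # reproducible from the master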
38
Problems with Passwords ^ Selection – Difficult to think of a good password – Passwords people think of first are easy to guess ^ Memorability – Easy to forget passwords that aren’t frequently used – Difficult to remember “secure” passwords with a mix of upper & lower case letters, numbers, and special characters ^ Reuse – Too many passwords to remember – A previously used password is memorable ^ Sharing – Often unintentional through reuse – Systems aren’t designed to support the way people work together and share information
39
Mnemonic Passwords
Phrase: “Four score and seven years ago, our Fathers”
First letter of each word (with punctuation): fsasya,oF
Substitute numbers for words or similar-looking letters: 4sa7ya,oF
Substitute symbols for words or similar-looking letters: 4s&7ya,oF
Source: Cynthia Kuo, SOUPS 2006
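A small sketch of the construction above: take the first letter of each word (keeping punctuation) and substitute digits/symbols for some words; the substitution table is illustrative:

SUBS = {"four": "4", "seven": "7", "and": "&"}

def mnemonic(phrase):
    out = []
    for word in phrase.split():
        trailing = "".join(ch for ch in word if ch in ",.!?")
        core = word.strip(",.!?").lower()
        out.append(SUBS.get(core, word[0]) + trailing)
    return "".join(out)

print(mnemonic("Four score and seven years ago, our Fathers"))  # 4s&7ya,oF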
40
The Promise? ^ Phrases help users incorporate different character classes in passwords Easier to think of character-for-word substitutions ^ Virtually infinite number of phrases ^ Dictionaries do not contain mnemonics Source: Cynthia Kuo, SOUPS 2006
41
Mnemonic password evaluation ^ Mnemonic passwords are not a panacea for password creation ^ No comprehensive dictionary today ^ May become more vulnerable in the future – as many people start to use them – attackers become incentivized to build dictionaries ^ Publicly available phrases should be avoided! Source: Cynthia Kuo, SOUPS 2006
42
Password keeper software ^ Run on PC or handheld ^ Only remember one password
43
Single sign-on ^ Log in once to get access to all your accounts
44
Biometrics
45
Fingerprint Spoofing ^ Devices – Microsoft Fingerprint Reader – APC Biometric Security device ^ Success! – A very soft piece of wax is flattened against a hard surface – Press the finger to be molded for 5 minutes – Transfer the wax to a freezer for 10-15 minutes – Firmly press modeling material into the cast – Press it against the fingerprint reader ^ Replicated several times
46
Retina/Iris Scan ^ Retinal scan – Must be close to the camera (IR) – Scanning can be invasive – Not user-friendly – Expensive ^ Iris scan – Late to the game – Requires advanced technology to properly capture the iris – Users do not have to consent to have their identity tested
47
Graphical passwords
48
“Forgotten password” mechanism ^ Email password or magic URL to address on file ^ Challenge questions ^ Why not make this the normal way to access infrequently used sites?
49
Convenient SecureID 1 ^ What problems does this approach solve? ^ What problems does it create? Source: http://worsethanfailure.com/Articles/Security_by_Oblivity.aspx
50
Convenient SecureID 2 ^ What problems does this approach solve? ^ What problems does it create? Previously available at: http://fob.webhop.net/
51
Browser-based mutual authentication ^ Chris Drake's “Magic Bullet” proposal ^ http://lists.w3.org/Archives/Public/public-usable-authentication/2007Mar/0004.html – User gets ID, password (or alternative), image, and hotspot at enrollment – Before users are allowed to log in, they are asked to confirm the URL and SSL cert and click buttons – Then the login box appears and the user enters username and password (or alternative) – Server displays a set of images, including the user's image (or, if the user entered an incorrect password, a random set of images) – User finds their image and clicks on the hotspot – Image manipulation can help prevent replay attacks ^ What problems does this solve? ^ What problems doesn't it solve? ^ What kind of testing is needed?
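A rough sketch (my reading of the proposal above, not Drake's actual design) of the image step: the enrolled image is mixed into the displayed set only when the password was correct, so a phishing site that does not know the password cannot show the right image or hotspot:

import random

def images_to_display(password_ok, enrolled_image, decoy_pool, k=9):
    shown = random.sample(decoy_pool, k)
    if password_ok:
        shown[random.randrange(k)] = enrolled_image   # hide the real image among decoys
    return shown

decoys = [f"decoy_{i}.png" for i in range(20)]
print("cat.png" in images_to_display(True, "cat.png", decoys))   # True
print("cat.png" in images_to_display(False, "cat.png", decoys))  # False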
52
Phishing
53
Spear Phishing (Targeted Phishing) ^ Personalized mail for a (small) group of targeted users – Employees, Facebook friends, alumni, e-commerce customers – These groups can be obtained through identity theft! ^ Content of the e-mail is personalized – Different from Viagra phishing/spam ^ Combined with other attacks – Zero-day vulnerability: unpatched – Rootkit: below the OS kernel, impossible to detect with AV software – Key logger: further obtains IDs/passwords – APT (Advanced Persistent Threat): long-term surveillance
54
Examples of Spear Phishing
55
Good Phishing example
56
Policy and Usability
58
Cost of Reading Policies (Cranor et al.) ^ T_R = p × R × n – p is the population of all Internet users – R is the average time to read one policy – n is the average number of unique sites Internet users visit annually ^ p = 221 million Americans online (Nielsen, May 2008) ^ R = average time to read a policy = (# words in policy) / (reading rate) – To estimate words per policy: measured the policy length of the 75 most visited websites – Reflects the policies people are most likely to visit ^ Reading rate = 250 WPM; mid estimate: 2,514 words / 250 WPM ≈ 10 minutes
59
^ n = number of unique sites per year – Nielsen estimates Americans visit 185 unique sites in a month, but that doesn't quite scale ×12, so roughly 1,462 unique sites per year ^ T_R = p × R × n = 221 million × 10 minutes × 1,462 sites ^ R × n ≈ 244 hours per year per person
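A back-of-the-envelope check of the figures above (values taken from the slides; the national total is simply their product and is not quoted on the slide):

R = 2514 / 250                          # words per policy / reading rate ≈ 10 minutes
n = 1462                                # unique sites visited per year
hours_per_person = round(R) * n / 60    # ≈ 244 hours per person per year
p = 221_000_000                         # Americans online (Nielsen, May 2008)
total_person_hours = p * hours_per_person
print(round(R, 1), round(hours_per_person), f"{total_person_hours:.2e}")   # 10.1 244 5.39e+10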
60
P3P: Platform for Privacy Preferences ^ A framework for automated privacy discussions Web sites disclose their privacy practices in standard machine-readable formats Web browsers automatically retrieve P3P privacy policies and compare them to users’ privacy preferences Sites and browsers can then negotiate about privacy terms