Privacy and Anonymity
CS 236 Advanced Computer Security (Spring 2008), Lecture 6
Peter Reiher
May 6, 2008

Groups for This Week
1. Golita Benoodi, Darrell Carbajal, Sean MacIntyre
2. Andrew Castner, Chien-Chia Chen, Chia-Wei Chang
3. Hootan Nikbakht, Ionnis Pefkianakis, Peter Peterson
4. Yu Yuan Chen, Michael Cohen, Zhen Huang
5. Jih-Chung Fan, Vishwa Goudar, Michael Hall
6. Abishek Jain, Nikolay Laptev, Chen-Kuei Lee
7. Chieh-Ning Lien, Jason Liu, Kuo-Yen Lo
8. Min-Hsieh Tsai, Peter Wu, Faraz Zahabian

Outline
- Privacy vs. security?
- Data privacy issues
- Network privacy issues
- Some privacy solutions

What Is Privacy?
- The ability to keep certain information secret
- Usually one's own information
- But also information that is "in your custody"
- Includes ongoing information about what you're doing

Privacy and Computers
- Much sensitive information is currently kept on computers
  – Which are increasingly networked
- Often stored in large databases
  – Huge repositories of privacy time bombs
- We don't know where our information is

Privacy and Our Network Operations
- Lots of stuff goes on over the Internet
  – Banking and other commerce
  – Health care
  – Romance and sex
  – Family issues
  – Personal identity information
- We used to regard this stuff as private
  – Is it private any more?

Threats to Computer Privacy
- Cleartext transmission of data
- Poor security allows remote users to access our data
- Sites we visit can save information on us
  – Multiple sites can combine information
- Governmental snooping
- Loss of location privacy
- Insider threats in various places

Privacy vs. Security
- Best friends?
- Or deadly rivals?
- Does good security kill all privacy?
- Does good privacy invalidate security?
- Are they orthogonal?
- Or can they just get along?

Conflicts Between Privacy and Security
- Many security issues are based on strong authentication
  – Which means being very sure who someone is
- If we're very sure who's doing what, we can track everything
- Many privacy approaches are based on obscuring what you're doing

Some Specific Privacy Problems
- Poorly secured databases that are remotely accessible
  – Or are stored on hackable computers
- Data mining by companies we interact with
- Eavesdropping on network communications by governments
- Insiders improperly accessing information
- Cell phone/mobile computer-based location tracking

Data Privacy Issues
- My data is stored somewhere
  – Can I control who can use it or see it?
- Can I even know who's got it?
- How do I protect a set of private data?
  – While still allowing some use?
- Will data mining divulge data "through the back door"?

Personal Data
- Who owns data about you?
- What if it's truly personal data?
  – Your Social Security number, date of birth, your DNA record?
- What if it's data someone gathered about you?
  – Your Google history or shopping records
  – Does it matter how they got it?

Controlling Personal Data
- Currently impossible
- Once someone has your data, they do what they want with it
- Is there some way to change that?
  – E.g., to wrap data in a way that prevents its misuse?
  – Could TPM help?

Tracking Your Data
- How could I find out who knows my passport number?
- Can I do anything that prevents my data from going certain places?
  – With the cooperation of those who have it?
  – Without their cooperation?
  – Via legal means?

Protecting Data Sets
- If my company legitimately holds a bunch of personal data, what can and should I do to protect it?
  – Given that I probably also need to use it?
- If I fail, how do I know that?
  – And what remedies do I have?

Options for Protecting Data
- Careful system design
- Limited access to the database
  – Networked or otherwise
- Full logging and careful auditing
- Using only encrypted data
  – Must it be decrypted?
  – If so, how do we protect the data and the keys?

Data Mining and Privacy
- Data mining allows users to extract models from databases
  – Based on aggregated information
- Data mining is often allowed when direct extraction isn't
- Unless handled carefully, attackers can use mining to deduce individual record values

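The deduction risk above can be shown with a toy differencing attack: two individually harmless aggregate queries, subtracted, expose one person's record (the names and salaries are invented for illustration):

```python
# Toy differencing attack: each query alone returns only an aggregate,
# but subtracting the two answers reveals Bob's exact (invented) salary.
salaries = {"alice": 50_000, "bob": 62_000, "carol": 58_000}

total = sum(salaries.values())                                    # 170,000
total_without_bob = sum(v for k, v in salaries.items() if k != "bob")
bobs_salary = total - total_without_bob                           # 62,000
```

This is why "aggregate only" access is not automatically safe; query auditing or noise (see the perturbation approach later) is needed to block differencing.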
Insider Threats and Privacy
- Often insiders need access to private data
  – Under some circumstances
- But they might abuse that access
- How can we determine when they misbehave?
- What can we do about it?

Network Privacy
- Mostly issues of preserving the privacy of data flowing through the network
- Start with encryption
  – With good encryption, data values are not readable
- So what's the problem?

Traffic Analysis Problems
- It is sometimes desirable to hide that you're talking to someone else
- That can be deduced even if the data itself cannot be read
- How can you hide that?
  – In the Internet of today?

Location Privacy
- Mobile devices often communicate while on the move
- Often providing information about their location
  – Perhaps detailed information
  – Maybe just hints
- This can be used to track our movements

Implications of Location Privacy Problems
- Anyone with access to location data can know where we go
- Allowing government surveillance
- Or a private detective following your moves
- Or a maniac stalker figuring out where to ambush you...

Some Privacy Solutions
- The Scott McNealy solution
  – "Get over it."
- Anonymizers
- Onion routing
- Privacy-preserving data mining
- Preserving location privacy
- Handling insider threats via optimistic security

Anonymizers
- Network sites that accept requests of various kinds from outsiders
- They then submit those requests
  – Under their own or a fake identity
- Responses are returned to the original requestor
- A NAT box is a poor man's anonymizer

The Problem With Anonymizers
- The entity running the anonymizer knows who's who
- It can either use that information itself
- Or be fooled, compelled, or hacked into divulging it to others
- Generally not a reliable source of real anonymity

Onion Routing
- Meant to handle the issue of people knowing who you're talking to
- Basic idea is to conceal sources and destinations
- By sending lots of crypto-protected packets between lots of places
- Each packet goes through multiple hops

A Little More Detail
- A group of nodes agree to be onion routers
- Users obtain crypto keys for those nodes
- The plan is that many users send many packets through the onion routers
  – Concealing who's really talking

Sending an Onion-Routed Packet
- Encrypt the packet using the destination's key
- Wrap that with another packet to another router
  – Encrypted with that router's key
- Iterate a bunch of times

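The wrap-and-iterate procedure above can be sketched in a few lines. This is a toy: XOR with one-time pads stands in for real per-hop encryption (deployed onion routers use public-key crypto and per-hop routing headers), but the layering and peeling idea is the same.

```python
# Toy onion layering: one-time-pad XOR stands in for each hop's encryption.
import os

def xor(data: bytes, pad: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, pad))

def wrap_onion(message: bytes, pads):
    """Innermost layer uses the destination's pad; each router adds a layer."""
    packet = message
    for pad in pads:
        packet = xor(packet, pad)
    return packet

def peel_onion(packet: bytes, pads):
    """Each hop strips its own layer until the destination reads the message."""
    for pad in reversed(pads):
        packet = xor(packet, pad)
    return packet

msg = b"meet at noon"
pads = [os.urandom(len(msg)) for _ in range(3)]  # destination + 2 routers
onion = wrap_onion(msg, pads)
```

Note that no single router's pad alone recovers the message; only peeling every layer does, which is what hides the source-destination pairing from any one hop.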
In Diagram Form
[Diagram: a packet travels from the source through several onion routers to the destination]

What's Really in the Packet
[Diagram omitted]

Delivering the Message
[Diagram omitted]

What's Been Achieved?
- No improper party read the message
- Nobody knows who sent the message
  – Except the receiver
- Nobody knows who received the message
  – Except the sender
- Assuming you got it all right

Issues for Onion Routing
- Proper use of keys
- Traffic analysis
- Overheads
  – Multiple hops
  – Multiple encryptions

Privacy-Preserving Data Mining
- Allow users access to aggregate statistics
- But don't allow them to deduce individual statistics
- How do we stop that?

Approaches to Privacy for Data Mining
- Perturbation
  – Add noise to sensitive values
- Blocking
  – Don't let the aggregate query see sensitive values
- Sampling
  – Randomly sample only part of the data

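The perturbation approach can be illustrated with a toy query interface that adds Laplace-distributed noise to each aggregate answer. The noise scale here is arbitrary, not a calibrated privacy budget, and the dataset is invented:

```python
# Perturbation sketch: answer aggregate queries with Laplace noise so the
# exact sum (and hence any differenced individual value) is never exposed.
import random

def laplace_noise(scale: float) -> float:
    # The difference of two exponentials with the same rate is Laplace-distributed.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def noisy_sum(values, scale: float = 10.0) -> float:
    """Return the true sum perturbed by Laplace noise of the given scale."""
    return sum(values) + laplace_noise(scale)

salaries = [50_000, 62_000, 58_000, 71_000]
answer = noisy_sum(salaries)   # near the true sum of 241,000, but rarely exact
```

With noisy answers, the differencing trick shown earlier yields only an estimate of an individual's value, and the error can be made larger by raising the scale (at the cost of less accurate aggregates).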
Preserving Location Privacy
- Can we prevent people from knowing where we are?
- Given that we carry mobile communications devices
- And that we might want location-specific services ourselves

Location-Tracking Services
- Services that get reports on our mobile device's position
  – Probably sent from that device
- Often useful
  – But sometimes we don't want them turned on
- So, turn them off then

But...
- What if we turn it off just before entering a "sensitive area"?
- And turn it back on right after we leave?
- Might someone deduce that we spent the time in that area?
- Very probably

Handling Location Inferencing
- Need to obscure that a user probably entered a particular area
- Can reduce the update rate
  – Reducing certainty about travel
- Or bundle areas together
  – Increasing uncertainty about which area was entered

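Bundling areas together can be sketched as snapping reported coordinates to a coarse grid cell, so every location inside the cell looks identical to the tracking service. The grid size and coordinates below are illustrative only:

```python
# Spatial cloaking sketch: report only the grid cell, not the exact position.
import math

def cloak(lat: float, lon: float, cell_deg: float = 0.1):
    """Report the southwest corner of the grid cell containing the point."""
    return (math.floor(lat / cell_deg) * cell_deg,
            math.floor(lon / cell_deg) * cell_deg)

# Two nearby (invented) points inside the same 0.1-degree cell
# become indistinguishable once cloaked.
a = cloak(34.0689, -118.4452)
b = cloak(34.0201, -118.4999)
```

Larger cells increase the uncertainty about where the user actually is, at the cost of degrading location-specific services.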
Privacy in the Face of Insider Threats
- A real problem
- Recent UCLA medical center case of a worker who peeked at celebrity records
  – Workers had the right to look at records
  – But only for appropriate reasons
- Generally, how do we preserve privacy when an insider needs some access?

One Approach
- A better access control model
- Don't give the insider full access
- Only allow access when he really needs it
- But how do you tell?
- Could use optimistic access control

Optimistic Access Control
- Don't normally allow access to protected stuff
  – Even if the worker sometimes needs it
- On need, require the worker to request exceptional access
  – And keep a record that it happened
- Audit the log of exceptional actions later

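A minimal sketch of this grant-now, audit-later pattern, with hypothetical names (a real system would add authentication and a tamper-evident log):

```python
# Optimistic access control sketch: exceptional access is always granted,
# but every request is recorded for a later audit pass.
from datetime import datetime, timezone

audit_log = []

def request_exceptional_access(worker: str, record: str, reason: str) -> bool:
    """Optimistically grant access, but log the request for later audit."""
    audit_log.append({
        "worker": worker,
        "record": record,
        "reason": reason,
        "time": datetime.now(timezone.utc).isoformat(),
    })
    return True  # grant now; an auditor reviews the log afterwards

granted = request_exceptional_access(
    "nurse_x", "patient_y_records", "currently treating patient Y")
```

The design choice is that availability wins at request time; accountability comes from the log, which is why the log itself must be complete and hard to tamper with.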
A Sample Scenario
- Nurse X needs to access the medical records of patient Y
- Nurse X doesn't have ordinary access
- She requests exceptional access
- Which is granted
  – And a record is made of her request

Checking the Scenario
- An auditor (human or otherwise) examines all requests for sensitive access
- If a request is "not reasonable," it is reported to an authority
- In this case, nurse X acted properly
  – Since patient Y was being treated

How About Misbehavior?
- Nurse X requests medical information on celebrity Z
- The request is logged
- The authority finds it bogus
  – Good question: how?
- Nurse X's ass is fired

How Does This Help?
- Every access to a sensitive record is logged
  – Well, it could have been, anyway
- The worker is alerted to the sensitive nature of his actions
- There's a built-in mechanism for validating access

Will It Be Too Cumbersome?
- Depends:
  – On the frequency of requests
  – And on the auditor's ability to identify illegitimate accesses
- Also requires a reasonable remediation method
- And strong authentication