CMU Usable Privacy and Security Laboratory
Hey, That’s Personal!
Lorrie Faith Cranor
28 July
Outline
- Privacy risks from personalization
- Reducing privacy risks
- Personalizing privacy
Privacy risks from personalization
Unsolicited marketing
- Desire to avoid unwanted marketing causes some people to avoid giving out personal information
My computer can “figure things out about me”
- The little people inside my computer might know it’s me…
- … and they might tell their friends
Inaccurate inferences
- “My TiVo thinks I’m gay!”
Surprisingly accurate inferences
- Everyone wants to be understood. No one wants to be known.
You thought that on the Internet nobody knew you were a dog…
- …but then you started getting personalized ads for your favorite brand of dog food
Price discrimination
- Concerns about being charged higher prices
- Concerns about being treated differently
Revealing private information to other users of a computer
- Revealing info to family members or co-workers
  - Gift recipient learns about gifts in advance
  - Co-workers learn about a medical condition
- Revealing secrets that can unlock many accounts
  - Passwords, answers to secret questions, etc.
The Cranor family’s 25 most frequent grocery purchases (sorted by nutritional value)!
Exposing secrets to criminals
- Stalkers, identity thieves, etc.
- People who break into an account may be able to access profile info
- People may be able to probe recommender systems to learn profile information associated with other users
Subpoenas
- Records are often subpoenaed in patent disputes, child custody cases, civil litigation, and criminal cases
Government surveillance
- Governments increasingly looking for personal records to mine in the name of fighting terrorism
- People may be subject to investigation even if they have done nothing wrong
Risks may be magnified in future
- Wireless location tracking
- Semantic web applications
- Ubiquitous computing
If you’re not careful, you may violate data protection laws
- Some jurisdictions have privacy laws that:
  - Restrict how data is collected and used
  - Require that you give notice, get consent, or offer privacy-protective options
  - Impose penalties if personal information is accidentally exposed
Reducing privacy risks
Axes of personalization (left side tends to be MORE privacy invasive, right side LESS)
- Data collection method: Implicit vs. Explicit
- Duration: Persistent (profile) vs. Transient (task or session)
- User involvement: System initiated vs. User initiated
- Reliance on predictions: Prediction based vs. Content based
A variety of approaches to reducing privacy risks
- No single approach will always work
- Two types of approaches:
  - Reduce data collection and storage
  - Put users in control
Collection limitation: Pseudonymous profiles
- Useful for reducing risk and complying with privacy laws when ID is not needed for personalization
- But, profile may become identifiable because of unique combinations of info, links with log data, unauthorized access to user’s computer, etc.
- Profile info should always be stored separately from web usage logs and transaction records that might contain IP addresses or PII
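A minimal sketch of the idea, not taken from the talk: the profile store is keyed by a one-way pseudonym derived from the account ID plus a server-side secret, so the profile database never holds the real identity and can be kept separate from logs. The names (SECRET_SALT, profile_store, record_interest) are illustrative assumptions.

```python
import hashlib

# Hypothetical sketch: derive a pseudonymous profile key so the profile
# store never contains the raw account ID. SECRET_SALT would live only on
# the server, outside the profile database.
SECRET_SALT = b"server-side-secret"

def pseudonym(account_id: str) -> str:
    """One-way mapping from identity to profile key."""
    return hashlib.sha256(SECRET_SALT + account_id.encode()).hexdigest()

# Profile data is keyed only by pseudonym, never by the real ID.
profile_store = {}

def record_interest(account_id: str, topic: str) -> None:
    profile_store.setdefault(pseudonym(account_id), set()).add(topic)

record_interest("alice@example.com", "gardening")
```

Because the salt never reaches the profile database, someone with only the profile store cannot trivially map a profile back to an identity; the re-identification risks the slide lists (unique attribute combinations, joins with logs) still apply.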
Collection limitation: Client-side profiles
- Useful for reducing risk and complying with laws
- Risk of exposure to other users of computer remains; storing encrypted profiles can help
- Client-side profiles may be stored in cookies replayed to server that discards them after use
- Client-side scripting may allow personalization without ever sending personal info to the server
- For some applications, no reason to send data to server
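The "replay and discard" pattern can be sketched as follows. This is an illustrative assumption, not an implementation from the talk: the profile lives only in a cookie the client sends with each request; the server parses it, uses it to rank one response, and keeps no copy.

```python
import json

# Hypothetical sketch of a client-side profile replayed to the server:
# the server personalizes one response from the cookie and stores nothing.
def personalize(page_items: list, profile_cookie: str) -> list:
    profile = json.loads(profile_cookie)      # parse, use, then forget
    liked = set(profile.get("interests", []))
    # Rank items matching the client-held interests first.
    return sorted(page_items, key=lambda item: item["topic"] not in liked)

cookie = json.dumps({"interests": ["cycling"]})
items = [{"title": "Stock tips", "topic": "finance"},
         {"title": "New bike lanes", "topic": "cycling"}]
```

Since the function holds the profile only for the duration of one call, there is nothing server-side to subpoena or breach; the remaining exposure is to other users of the same client machine, which is why the slide suggests encrypting the stored profile.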
Collection limitation: Task-based personalization
- Focus on data associated with current session or task - no user profile need be stored anywhere
- May allow for simpler (and less expensive) system architecture too!
- May eliminate problem of system making recommendations that are not relevant to current task
- Less “spooky” to users - relationship between current task and resultant personalization usually obvious
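A sketch of what task-based personalization might look like, under assumed names: recommendations are computed only from the items viewed in the current session against a static catalogue relationship table, so nothing about the user persists once the session ends.

```python
# Hypothetical sketch of task-based personalization. RELATED is static
# catalogue data (which items go together), not a per-user profile.
RELATED = {
    "tent": ["sleeping bag", "camp stove"],
    "novel": ["bookmark"],
}

def session_recommendations(session_views: list) -> list:
    """Recommend related items using only this session's views."""
    seen = set(session_views)
    recs = []
    for item in session_views:
        recs.extend(r for r in RELATED.get(item, []) if r not in seen)
    return recs
```

Note the behavior on an empty session: with no stored history to fall back on, the system simply recommends nothing, which is exactly the trade-off the slide describes.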
Putting users in control
- Users should be able to control:
  - What information is stored in their profile
  - How it may be used and disclosed
Developing a good user interface to do this is complicated
- Setting preferences can be tedious
- Creating overall rules that can be applied on the fly as new profile data is collected requires deep understanding and the ability to anticipate privacy concerns
Possible approaches
- Provide reasonable default rules with the ability to add/change rules or specify preferences for handling of specific data
  - Up front
  - With each action
  - After-the-fact
- Explicit privacy preference prompts during transaction process
- Allow multiple personae
Example: Google Search History
Amazon.com privacy makeover
Streamline menu navigation for customization
Provide a way to set up default rules
- Every time users make a new purchase that they want to rate or exclude, they have to edit profile info
- There should be a way to set up default rules:
  - Exclude all purchases
  - Exclude all purchases shipped to my work address
  - Exclude all movie purchases
  - Exclude all purchases I had gift wrapped
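The default rules above could be sketched as simple predicates evaluated as each purchase arrives, instead of asking the user to edit every item after the fact. The rule set and the purchase fields (ship_to, category, gift_wrapped) are illustrative assumptions.

```python
# Hypothetical sketch of default exclusion rules applied automatically to
# new purchases. Each rule is a predicate over an illustrative purchase record.
RULES = [
    lambda p: p["ship_to"] == "work",      # exclude purchases shipped to work
    lambda p: p["category"] == "movies",   # exclude all movie purchases
    lambda p: p["gift_wrapped"],           # exclude gift-wrapped purchases
]

def include_in_profile(purchase: dict) -> bool:
    """A purchase enters the recommendation profile only if no rule excludes it."""
    return not any(rule(purchase) for rule in RULES)

gift = {"ship_to": "home", "category": "toys", "gift_wrapped": True}
book = {"ship_to": "home", "category": "books", "gift_wrapped": False}
```

Here the gift is filtered out automatically while the ordinary book purchase flows into the profile, which is the "applied on the fly" behavior the earlier slide calls for.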
Remove excluded purchases from profile
- Users should be able to remove items from profile
- If purchase records are needed for legal reasons, users should be able to request that they not be accessible online
Better: options for controlling recent history
Use personae
- Amazon already allows users to store multiple credit cards and addresses
- Why not allow users to create personae linked to each, with the option of keeping recommendations and history separate? (This would allow an easy way to separate work/home/gift personae.)
Allow users to access all privacy-related options in one place
- Currently privacy-related options are found with relevant features
- Users have to be aware of features to find the options
- Put them all in one place
- But also leave them with relevant features
I didn’t buy it for myself
- How about an “I didn’t buy it for myself” check-off box (perhaps automatically checked if gift wrapping is requested)?
Personalizing privacy
Can we apply user modeling expertise to privacy?
- Personalized systems cause privacy concerns
- But can we use personalization to help address these concerns?
What is privacy?
- “the claim of individuals… to determine for themselves when, how, and to what extent information about them is communicated to others.” - Alan Westin, 1967
Privacy as process
- “Each individual is continually engaged in a personal adjustment process in which he balances the desire for privacy with the desire for disclosure and communication….” - Alan Westin, 1967
But individuals don’t always engage in adjustment process
- Lack of knowledge about how info is used
- Lack of knowledge about how to exercise control
- Too difficult or inconvenient to exercise control
- Data collectors should inform users
- Data collectors should provide choices and controls
- Sounds like a job for a user model!
Example: Managing privacy at web sites
- Website privacy policies
  - Many posted
  - Few read
- What if your browser could read them for you?
  - Warn you not to shop at sites with bad policies
  - Automatically block cookies at those sites
Platform for Privacy Preferences (P3P)
- 2002 W3C Recommendation
- XML format for Web privacy policies
- Protocol enables clients to locate and fetch policies from servers
Privacy Bird
- P3P user agent originally developed by AT&T
- Free download and privacy search service
- Compares user preferences with P3P policies
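The comparison step can be sketched in a few lines. This is a deliberate simplification of what an agent like Privacy Bird does, not its actual logic: a P3P policy declares the purposes for which data is used, and the agent flags the site if any declared purpose is one the user has rejected. The purpose names below follow the P3P vocabulary; the matching function is an illustrative assumption.

```python
# Hypothetical sketch of preference matching: a site is flagged when its
# declared P3P purposes intersect the purposes the user has rejected.
def evaluate(policy_purposes: set, rejected_purposes: set) -> str:
    conflicts = policy_purposes & rejected_purposes
    return "warn" if conflicts else "ok"

# User rejects telemarketing and individualized decisions about them.
user_rejects = {"telemarketing", "individual-decision"}
```

A real agent also weighs recipients, retention, and data categories, and must handle sites with no P3P policy at all; set intersection over purposes is just the core idea.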
Link to opt-out page
I would like to give the bird some feedback
- “I read this policy and actually I think it’s ok”
- “I took advantage of the opt-out on this site so there is no problem”
- “This site is a banking site and I want to be extra cautious when doing online banking”
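One way the three kinds of feedback above might feed back into the agent, sketched under assumed names (none of this is from the talk): "actually ok" and "opted out" record a per-site trust override, while "extra cautious" raises the bar for that site.

```python
# Hypothetical sketch of per-site feedback overrides for a warning agent.
overrides = {}  # site -> "trust" | "caution"

def give_feedback(site: str, kind: str) -> None:
    trusting = kind in ("actually ok", "opted out")
    overrides[site] = "trust" if trusting else "caution"

def verdict(site: str, policy_matches_prefs: bool) -> str:
    if overrides.get(site) == "trust":
        return "ok"                    # user vouched for this site
    if overrides.get(site) == "caution" and not policy_matches_prefs:
        return "strong warning"        # user asked for extra scrutiny
    return "ok" if policy_matches_prefs else "warn"

give_feedback("news.example", "actually ok")
give_feedback("bank.example", "extra cautious")
```

Per-site overrides like these are also the raw material a learning component could generalize from, which is where the next slides head.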
Especially important if bird takes automatic actions
- Not critical when bird is only informational
- But if bird blocks cookies, the wrong decision will get annoying
Can we learn user’s privacy preferences over time?
- Bad bird!
Other example applications for personalizing privacy
- Buddy lists: when to reveal presence information and to whom
- Friend finder services: when to reveal location information and at what level of detail
- Personalized ecommerce sites: when to start and stop recording my actions, which persona to use
Conclusions
- Personalization often has real privacy risks
- Address these risks by minimizing data collection and storage, and putting users in control
- Challenge: Can we make it easier for users to be in control by personalizing privacy?