Presentation on theme: "Quick Discussion – based on:"— Presentation transcript:

1 Quick Discussion – based on: http://cups.cs.cmu.edu/courses/ups-sp08/

2
 Unpatched Windows machines compromised in minutes
 Phishing web sites increasing by 28% each month
 Most PCs infected with spyware (avg. = 25)
 Users have more passwords than they can remember and practice poor password security
 Enterprises store confidential information on laptops and mobile devices that are frequently lost or stolen

3
“Give end-users security controls they can understand and privacy they can control for the dynamic, pervasive computing environments of the future.”
- Computing Research Association, 2003

4 How do users stay safe online?

5 POP!

6 After installing all that security and privacy software…

7
 Security experts are concerned about the bad guys getting in
 Users may be more concerned about locking themselves out
“Users do not want to be responsible for, nor concern themselves with, their own security.” - Blake Ross

8 Secure, but usable?

9
 Pick a hard-to-guess password
 Don’t use it anywhere else
 Change it often
 Don’t write it down

10

11
Bank = b3aYZ
Amazon = aa66x!
Phonebill = p$2$ta1
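The written-down password list above is how many users resolve the conflicting advice on slide 9. One alternative coping strategy (not something the slides propose, just a sketch for illustration) is to derive a distinct per-site password from a single memorized master secret, so nothing has to be written down:

```python
import base64
import hashlib
import hmac

def site_password(master_secret: str, site: str, length: int = 10) -> str:
    """Derive a deterministic per-site password from one master secret.

    The same (secret, site) pair always yields the same password;
    different sites yield unrelated passwords.
    """
    digest = hmac.new(master_secret.encode(), site.encode(),
                      hashlib.sha256).digest()
    # Base64 output mixes letters, digits, and symbols; trim to length.
    return base64.b64encode(digest).decode()[:length]

# One memorized secret, many site passwords, nothing written down.
print(site_password("correct horse battery", "bank.example"))
print(site_password("correct horse battery", "amazon.example"))
```

This trades the memorability problem for a different usability problem (changing one password means changing the master secret or adding a per-site counter), which is exactly the kind of tension the deck is pointing at.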

12
 Make it “just work”
◦ Invisible security
 Make security/privacy understandable
◦ Make it visible
◦ Make it intuitive
◦ Use metaphors that users can relate to
 Train the user

13
 Developers should not expect users to make decisions they themselves can’t make

14 - Chris Nodder (in charge of user experience for Windows XP SP2)

15
 Privacy is a secondary task
◦ Users of privacy tools often seek out these tools due to their awareness of or concern about privacy
◦ Even so, users still want to focus on their primary tasks
 Users have differing privacy concerns and needs
◦ One-size-fits-all interface may not work
 Most users are not privacy experts
◦ Difficult to explain current privacy state or future privacy implications
◦ Difficult to explain privacy options to them
◦ Difficult to capture privacy needs/preferences
 Many privacy tools reduce application performance, functionality, or convenience

16
 Internet anonymity system
 Allows users to send messages that cannot be traced back to them (web browsing, chat, p2p, etc.)
 UI was mostly a command-line interface until recently
 2005 Tor GUI competition
◦ CUPS team won phase 1 with its design for FoxTor!

17
 Tor is configurable and different users will want to configure it in different ways
◦ But most users won’t understand configuration options
◦ Give users choices, not dilemmas
 We began by trying to understand our users
◦ No budget, little time, limited access to users
◦ So we brainstormed about their needs, tried to imagine them, and developed personas for them

18
Jim is a current UG at CSM.
Goals:
1. Be sure he’s on track to graduate in 4 years
2. Find some courses that are interesting
3. Get together with friends to study and have fun
Other: Jim is taking a full course load and also working part time, so he’s always very busy. He also tends to be disorganized, so he keeps losing information and having to look it up again. He is a little shy and doesn’t know too many people in the department yet.

19
Susie is a parent researching schools for her son Bob, who will be graduating from HS soon.
Goals:
1. She wants to find an environment that will be welcoming and stimulating for Bob
2. She thinks Bob may ultimately want to pursue graduate work, so she wants to be sure the school has faculty doing interesting research
3. She wants to find out how expensive the school is and what type of financial aid is available
Other: Susie works full time but considers her family to be a top priority. It’s very important to her for her son to be happy, so she’s willing to devote a fair amount of time to the task of selecting a university. The family has a computer at home, so she’s spending her evenings visiting websites to collect data. She’s comfortable surfing the web, but prefers websites that are logical and not too cluttered.

20
 The process led to the realization that our users had 3 categories of privacy needs
◦ Basic, selective, critical
 Instead of asking users to figure out complicated settings, most of our configuration involves figuring out which types of privacy needs they have

21

22

23
 Privacy laws and regulations vary widely throughout the world
 US has mostly sector-specific laws, with relatively minimal protections - often referred to as a “patchwork quilt”
◦ Federal Trade Commission has jurisdiction over fraud and deceptive practices
◦ Federal Communications Commission regulates telecommunications
 European Data Protection Directive requires all European Union countries to adopt similar comprehensive privacy laws that recognize privacy as a fundamental human right
◦ Privacy commissions in each country (some countries have national and state commissions)
◦ Many European companies non-compliant with privacy laws (a 2002 study found the majority of UK web sites non-compliant)

24
 Bank Secrecy Act, 1970
 Fair Credit Reporting Act, 1970
 Privacy Act, 1974
 Right to Financial Privacy Act, 1978
 Cable TV Privacy Act, 1984
 Video Privacy Protection Act, 1988
 Family Educational Rights and Privacy Act, 1974
 Electronic Communications Privacy Act, 1986
 Freedom of Information Act, 1966, 1991, 1996

25
 HIPAA (Health Insurance Portability and Accountability Act, 1996)
◦ When implemented, will protect medical records and other individually identifiable health information
 COPPA (Children’s Online Privacy Protection Act, 1998)
◦ Web sites that target children must obtain parental consent before collecting personal information from children under the age of 13
 GLB (Gramm-Leach-Bliley Act, 1999)
◦ Requires privacy policy disclosure and opt-out mechanisms from financial service institutions

26
 Direct Marketing Association Privacy Promise http://www.thedma.org/library/privacy/privacypromise.shtml
 Network Advertising Initiative Principles http://www.networkadvertising.org/
 CTIA location-based privacy guidelines http://www.wow-com.com/news/press/body.cfm?record_id=907

27

28
 Policies let consumers know about a site’s privacy practices
 Consumers can then decide whether or not practices are acceptable, when to opt-in or opt-out, and who to do business with
 The presence of privacy policies increases consumer trust

What are some problems with privacy policies?

29
 BUT policies are often
◦ difficult to understand
◦ hard to find
◦ time-consuming to read
◦ changed without notice

30
Privacy policy components
 Identification of site, scope, contact info
 Types of information collected
◦ Including information about cookies
 How information is used
 Conditions under which information might be shared
 Information about opt-in/opt-out
 Information about access
 Information about data retention policies
 Information about seal programs
 Security assurances
 Children’s privacy

There is lots of information to convey -- but the policy should be brief and easy to read too!
What is opt-in? What is opt-out?

31
 Project organized by Hunton & Williams law firm
◦ Create a short version (short notice) of a human-readable privacy notice for both web sites and paper handouts
◦ Sometimes called a “layered notice,” as the short version would advise people to refer to the long notice for more detail
◦ Now being called a “highlights notice”
◦ Focus on reducing the privacy policy to at most 7 boxes
◦ Standardized format but only limited standardization of language
◦ Proponents believe the highlights format may eventually be mandated by law
 Alternative proposals from privacy advocates focus on check boxes
 International interest
◦ http://www.privacyconference2003.org/resolution.asp
 Interest in the US for financial privacy notices
◦ http://www.ftc.gov/privacy/privacyinitiatives/ftcfinalreport060228.pdf

32

33

34

35
Checkbox proposal

WE SHARE [DO NOT SHARE] PERSONAL INFORMATION WITH OTHER WEBSITES OR COMPANIES.

Collection:                                            YES  NO
We collect personal information directly from you        
We collect information about you from other sources      
We use cookies on our website                            
We use web bugs or other invisible collection methods    
We install monitoring programs on your computer          

Uses: We use information about you to:    With Your Consent  Without Your Consent
Send you advertising mail                        
Send you electronic mail                         
Call you on the telephone                        

Sharing: We allow others to use your information to:  With Your Consent  Without Your Consent
Maintain shared databases about you              
Send you advertising mail                        
Send you electronic mail                         
Call you on the telephone                       N/A  N/A

Access: You can see and correct {ALL, SOME, NONE} of the information we have about you.

Choices: You can opt-out of receiving from:    Us  Affiliates  Third Parties
Advertising mail                                 
Electronic mail                                  
Telemarketing                                    N/A

Retention: We keep your personal data for: {Six Months, Three Years, Forever}

Change: We can change our data use policy {AT ANY TIME, WITH NOTICE TO YOU, ONLY FOR DATA COLLECTED IN THE FUTURE}

36
 Developed by the World Wide Web Consortium (W3C) http://www.w3.org/p3p/
◦ Final P3P1.0 Recommendation issued 16 April 2002
 Offers an easy way for web sites to communicate about their privacy policies in a standard machine-readable format
◦ Can be deployed using existing web servers
 Enables the development of tools (built into browsers or separate applications) that
◦ Summarize privacy policies
◦ Compare policies with user preferences
◦ Alert and advise users
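Because a P3P policy is ordinary XML, the "summarize" tools mentioned above can be quite small. The sketch below parses a toy policy fragment (the element names follow the P3P 1.0 vocabulary, but the policy content itself is invented for illustration) and lists the declared purposes and data elements:

```python
import xml.etree.ElementTree as ET

# P3P 1.0 XML namespace, per the W3C Recommendation.
P3P_NS = "{http://www.w3.org/2002/01/P3Pv1}"

# A simplified, made-up policy fragment in the spirit of the spec.
SAMPLE_POLICY = """
<POLICY xmlns="http://www.w3.org/2002/01/P3Pv1" name="sample">
  <STATEMENT>
    <PURPOSE><admin/><develop/></PURPOSE>
    <DATA-GROUP>
      <DATA ref="#dynamic.clickstream"/>
      <DATA ref="#dynamic.http"/>
    </DATA-GROUP>
  </STATEMENT>
</POLICY>
"""

def summarize(policy_xml: str) -> dict:
    """Collect the purposes and data references declared in each statement."""
    root = ET.fromstring(policy_xml)
    purposes, data_refs = [], []
    for stmt in root.iter(P3P_NS + "STATEMENT"):
        for purpose in stmt.iter(P3P_NS + "PURPOSE"):
            # Child element names (admin, develop, ...) are the purposes.
            purposes += [child.tag.replace(P3P_NS, "") for child in purpose]
        for data in stmt.iter(P3P_NS + "DATA"):
            data_refs.append(data.get("ref"))
    return {"purposes": purposes, "data": data_refs}

print(summarize(SAMPLE_POLICY))
```

A user agent would then compare such a summary against stored user preferences before alerting or advising the user; real deployments fetch the policy reference file from the site rather than hard-coding it as done here.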

37

38
 Laboratory study of 28 non-expert computer users
 Asked to evaluate 10 web sites, take a 15-minute break, then evaluate 10 more web sites
 Experimental group read web-based training materials during the break; control group played solitaire
 Experimental group performed significantly better at identifying phish after training
 People can learn from web-based training materials, if only we could get them to read them!

39
 Most people don’t proactively look for training materials on the web
 Many companies send “security notice” emails to their employees and/or customers
 But these tend to be ignored
◦ Too much to read
◦ People don’t consider them relevant

40
 Can we “train” people during their normal use of email to avoid phishing attacks?
◦ Periodically, people get sent a training email
◦ Training email looks like a phishing attack
◦ If person falls for it, intervention warns and highlights what cues to look for in succinct and engaging format

P. Kumaraguru, Y. Rhee, A. Acquisti, L. Cranor, J. Hong, and E. Nunge. Protecting People from Phishing: The Design and Evaluation of an Embedded Training Email System. CyLab Technical Report CMU-CyLab-06-017, 2006. http://www.cylab.cmu.edu/default.aspx?id=2253

41

42
 Lab study compared two prototype interventions to standard security notice emails from eBay and PayPal
◦ Existing practice of security notices is ineffective
◦ Diagram intervention somewhat better
◦ Comic strip intervention worked best
◦ Interventions most effective when based on real brands

43

44

45
 Ecommerce personalization systems
◦ Concerns about use of user profiles
 Software that “phones home” to fetch software updates or refresh content, report bugs, relay usage data, verify authorization keys, etc.
◦ Concerns that software will track and profile users
 Communications software (email, IM, chat)
◦ Concerns about traffic monitoring, eavesdroppers
 Presence systems (buddy lists, shared spaces, friend finders)
◦ Concerns about limiting when info is shared and with whom

46
 Similar to issues to consider for privacy tools, PLUS
 Users may not be aware of privacy issues up front
◦ When they find out about privacy issues they may be angry or confused, especially if they view notice as inadequate or defaults as unreasonable
 Users may have to give up functionality or convenience, or spend more time configuring the system for better privacy
 Failure to address privacy issues adequately may lead to bad press and legal action

47 Amazon.com privacy makeover

48 Streamline menu navigation for customization

49
 Every time a user makes a new purchase that they want to rate or exclude, they have to edit profile info
◦ There should be a way to set up default rules
 Exclude all purchases
 Exclude all purchases shipped to my work address
 Exclude all movie purchases
 Exclude all purchases I had gift wrapped
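The default rules suggested above amount to a small set of predicates evaluated against each purchase. A minimal sketch (the field names and rules are hypothetical, not Amazon's actual data model) of how such exclusion rules could work:

```python
from dataclasses import dataclass

@dataclass
class Purchase:
    title: str
    category: str
    ship_to: str        # e.g. "home" or "work"
    gift_wrapped: bool

# Hypothetical default rules: each is a predicate over a purchase.
EXCLUSION_RULES = [
    lambda p: p.ship_to == "work",     # shipped to my work address
    lambda p: p.category == "movies",  # all movie purchases
    lambda p: p.gift_wrapped,          # purchases I had gift wrapped
]

def include_in_profile(purchase: Purchase) -> bool:
    """A purchase feeds recommendations only if no exclusion rule matches."""
    return not any(rule(purchase) for rule in EXCLUSION_RULES)

print(include_in_profile(Purchase("DVD", "movies", "home", False)))   # excluded
print(include_in_profile(Purchase("Novel", "books", "home", False)))  # included
```

The point of the makeover is that users would set these rules once, instead of editing profile info after every purchase.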

50
 Users should be able to remove items from profile
 If purchase records are needed for legal reasons, users should be able to request that they not be accessible online

51 Better: options for controlling recent history

52
 Currently privacy-related options are found with relevant features
 Users have to be aware of features to find the options
 Put them all in one place
 But also leave them with relevant features

53
How about an “I didn’t buy it for myself” check-off box (perhaps automatically checked if gift wrapping is requested)?

54

55 Desire to avoid unwanted marketing causes some people to avoid giving out personal information

56
The little people inside my computer might know it’s me…
… and they might tell their friends

57 “My TiVo thinks I’m a psychopath!”

58
Surprisingly accurate inferences
Everyone wants to be understood. No one wants to be known.

59
You thought that on the Internet nobody knew you were a dog…
…but then you started getting personalized ads for your favorite brand of dog food

60
 Concerns about being charged higher prices
 Concerns about being treated differently

61
 Revealing info to family members or co-workers
◦ Gift recipient learns about gifts in advance
◦ Co-workers learn about a medical condition
 Revealing secrets that can unlock many accounts
◦ Passwords, answers to secret questions, etc.

62 The Cranor family’s 25 most frequent grocery purchases (sorted by nutritional value)!

63
 Stalkers, identity thieves, etc.
 People who break into an account may be able to access profile info
 People may be able to probe recommender systems to learn profile information associated with other users

64
 Records are often subpoenaed in patent disputes, child custody cases, civil litigation, criminal cases

65
 Governments increasingly looking for personal records to mine in the name of fighting terrorism
 People may be subject to investigation even if they have done nothing wrong

66 Little Brother as Big Brother

67

68
 Wireless location tracking
 Semantic web applications
 Ubiquitous computing

