Security and Usability
Rachel Greenstadt
February 15, 2016
Slide credits: Lorrie Cranor
Announcements
* Midterm available: mean 72, median 75, std dev 15; very bimodal distribution
* Project 1 due next Thursday
* I'm away next week
  * Tues: guest lecture from Russell Handorf, FBI
  * Thurs: discussion led by Kevin
A security professional's view of humans...
"Humans are incapable of securely storing high-quality cryptographic keys, and they have unacceptable speed and accuracy when performing cryptographic operations. (They are also large, expensive to maintain, difficult to manage, and they pollute the environment. It is astonishing that these devices continue to be manufactured and deployed. But they are sufficiently pervasive that we must design our protocols around their limitations.)"
- Network Security: PRIVATE Communication in a PUBLIC World, Charlie Kaufman, Radia Perlman, and Mike Speciner, Prentice Hall, 1995
On the other hand…
The Attacker’s Perspective
Security/privacy researchers and system developers vs. human-computer interaction researchers and usability professionals
But until recently, security and privacy researchers weren't paying a whole lot of attention to what users wanted. Security folks worked in one department, while HCI folks worked in a different department, and they didn't really talk much. But that is starting to change.
How do you stay safe online?
Unusable security & privacy
* Unpatched Windows machines compromised in minutes
* Phishing web sites increasing by 28% each month
* Most PCs infected with spyware (avg. = 25)
* Users have more passwords than they can remember and practice poor password security
* Enterprises store confidential information on laptops and mobile devices that are frequently lost or stolen
"Give end-users security controls they can understand and privacy they can control for the dynamic, pervasive computing environments of the future."
- Computing Research Association, 2003 Grand Challenge
Experts recommend…
* Firewalls
* Antivirus software
* Spam filters
* Pop-up blockers
* Cookie managers
* Spyware removers
* Encryption software
* If you have kids, adult content filters
* If you want to cover your tracks, anonymity tools
After installing all that security and privacy software
Do you have any time left to get any work done?
Secondary tasks
Security and privacy are secondary tasks. Nobody buys a computer so they can spend time securing it. Time we spend configuring security and privacy tools is time we are not spending doing what we really want to be doing with our computers.
Approaches to usable security
* Make it "just work": invisible security
* Make security/privacy understandable
  * Make it visible
  * Make it intuitive
  * Use metaphors that users can relate to
* Train the user
Make decisions
Developers should not expect users to make decisions they themselves can't make. Many pieces of software contain prompts in which the developers ask users to make a security-related decision. Blake Ross suggests that before coding up such questions, software developers should ask themselves a question: if I can't figure it out, how are users supposed to? What additional qualifications do users possess that will help them arrive at a better conclusion? (From Blake Ross's book chapter.)
"Present choices, not dilemmas"
- Chris Nodder (in charge of user experience for XP SP2)
If the user doesn't have any insight into how to answer a question, it is a dilemma, not a decision. There are some situations where users possess additional contextual knowledge or personal preferences that may allow them to make a better decision than the implementer could make in advance. In those cases, UIs are needed that help users understand what bit of knowledge they bring to the decision and how they can best apply it. But if users don't have anything useful to contribute, they should not be making the decision.
XP SP1 - no question mark
XP SP2
Security Decisions
* Run this script?
* Trust this site?
* Make a firewall exception?
* Share this piece of personal information?
* Allow user Bob access?
* Choose a password
* Buy from Alice?
* Write about my diagnosis on the forum?
* Open this email?
* Install this software?
* Plug Carol's USB key into my laptop?
* Drop this packet?
Hard for Machines and Humans
* Context-dependent
* Require specialized knowledge
* Dynamic: sophisticated adversaries and emerging threats
* Complex risk analysis requiring a large knowledge base and rationality
Contextual Mismatch
Alice:
* Knows she wants to visit her bank
* Doesn't know she's not at her bank
Alice's device:
* Knows Alice is not visiting her bank
* Doesn't know that Alice believes she is at her bank
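The mismatch can be made concrete with a toy sketch: the device can mechanically compare the hostname it actually sees against the bank host Alice bookmarked, but nothing in that check captures what Alice believes she is looking at. The function name, hostnames, and allow-list here are all hypothetical illustrations, not part of any real browser.

```python
from urllib.parse import urlparse

# Hypothetical allow-list: the host Alice actually bookmarked for her bank.
BOOKMARKED_BANK_HOSTS = {"www.bank.example"}

def device_view(url: str) -> str:
    """What Alice's device can know about a visited URL (toy sketch)."""
    host = urlparse(url).hostname or ""
    if host in BOOKMARKED_BANK_HOSTS:
        return "host matches Alice's bookmarked bank"
    # The device can see the real hostname, but it has no way to know
    # that Alice *believes* the lookalike page is her bank.
    return "host does not match Alice's bookmarked bank"
```

A lookalike such as `https://www.bank-secure.example/login` fails the device's check even though it may pass Alice's visual one, which is exactly the mismatch the slide describes.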
Passwords
Now let's talk about a specific usable security problem that is becoming more and more problematic as people do an increasing number of transactions online.
Typical advice
* Pick a hard-to-guess password
* Don't use it anywhere else
* Change it often
* Don't write it down
It is common knowledge that this advice doesn't work.
What do users do when every web site wants a password?
Bank = b3aYZ
Amazon = aa66x!
Phonebill = p$2$ta1
From http://www.interactivetools.com/staff/dave/damons_office/
By the way, use a password manager
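One illustrative alternative to memorizing per-site variants like those above is PwdHash-style derivation: hash a single master secret together with the site name, so every site gets a different password without anything to remember per site. This is a minimal Python sketch; the function name, iteration count, and output length are my assumptions, and real password managers typically store random per-site passwords in an encrypted vault instead of deriving them.

```python
import base64
import hashlib

def site_password(master: str, site: str, length: int = 12) -> str:
    """Derive a distinct per-site password from one master secret (sketch)."""
    # Slow key derivation, so a password leaked from one site is hard to
    # brute-force back to the master secret. Parameters are illustrative.
    raw = hashlib.pbkdf2_hmac("sha256", master.encode(), site.encode(), 100_000)
    # Encode to printable characters and truncate to the desired length.
    return base64.b64encode(raw).decode()[:length]
```

The same master secret yields unrelated-looking passwords for `bank.example` and `amazon.example`, and re-running the function reproduces them deterministically.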
Symbols & Metaphors
One way to make security and privacy tools more usable is to use symbols and metaphors. Unfortunately, many of the symbols and metaphors used to date have been a complete failure.
Cookie flags and SSL icons
[Slide shows browser security indicators: Netscape's open-lock and closed-lock SSL icons and cookie flag, the IE6 cookie flag, and Firefox's closed-lock SSL icon. Note that Firefox puts the name of the site next to the closed lock and has no open lock.]
Wireless privacy
Another privacy issue that my students have been looking at is wireless privacy. Many users are unaware that communications over wireless computer networks are not private.
Wall of sheep
Defcon 2001 Photo credit: Kyoorius @ techfreakz.org http://www.techfreakz.org/defcon10/?slide=38
Defcon 2004 Photo credit: http://www.timekiller.org/gallery/DefconXII/photo0003
Peripheral display
* Help users form more accurate expectations of privacy
* Without making the problem worse
Experimental trial
* 11 subjects in student workspace
* Data collected by survey and traffic analysis
* Did they refine their expectations of privacy?
Results
* Peripheral display raised privacy awareness in student workspace
* But no change in behavior
* They didn't really get it
Privacy awareness increased
"I feel like my information / activity / privacy are not being protected... seems like someone can monitor or get my information from my computer, or even publish them."
But only while the display was on
"Now that words [projected on the wall] are gone, I'll go back to the same."
Students associated privacy risk with our peripheral display rather than with their use of a wireless network. So it looks like we still have our work cut out for us.
Questions to ask about a security or privacy cue
* Do users notice it?
* Do they know what it means?
* Do they know what they are supposed to do when they see it?
* Will they actually do it?
* Will they keep doing it?
I would like to generalize a bit from this experience. Our words on the wall were a privacy cue. Other people are developing other types of privacy and security cues, for example to warn people about potential phishing attacks; this is something I have started working on myself. A lot of security researchers are focusing on how to identify potentially dangerous situations and provide unspoofable cues to users. This is a good first step. But we need to go beyond that and examine whether these cues are actually helpful to users, by asking the questions above. We can speculate on the answers, but really the only way to know is to do user studies. And to do that generally is going to require cooperation between security and usability folks.