Agenda  Tuesday, June 28 th  Psychology and Security  Thursday, June 30 th  Usable Security.

Slides:



Advertisements
Similar presentations
Part I: Making Good Online Choices
Advertisements

Microsoft ® Office 2007 Training Security II: Turn off the Message Bar and run code safely P J Human Resources Pte Ltd presents:
User Interfaces 4 BTECH: IT WIKI PAGE:
HIPAA. What Why Who How When What Is HIPAA? Health Insurance Portability & Accountability Act of 1996.
The Third International Forum on Financial Consumer Protection & Education “Fostering Greater Consumer Protection & Education” Preventing Identity Theft.
INTERNET SAFETY FOR EVERYONE A QUICK AND EASY CRASH COURSE.
Social Engineering J Nivethan. Social Engineering The process of deceiving people into giving away access or confidential information Onlinne Phone Offline.
Social Engineering Networks Reid Chapman Ciaran Hannigan.
Lecture 2 Page 1 CS 236, Spring 2008 Security Principles and Policies CS 236 On-Line MS Program Networks and Systems Security Peter Reiher Spring, 2008.
Users Are Not The Enemy A. Adams and M. A. Sasse Presenter: Jonathan McCune Security Reading Group February 6, 2004.
Lesson 13-Intrusion Detection. Overview Define the types of Intrusion Detection Systems (IDS). Set up an IDS. Manage an IDS. Understand intrusion prevention.
Inspection Methods. Inspection methods Heuristic evaluation Guidelines review Consistency inspections Standards inspections Features inspection Cognitive.
Social Engineering PA Turnpike Commission. “Social Engineering is the practice of obtaining confidential information by manipulation of legitimate users”
Understanding Task Orientation Guidelines for a Successful Manual & Help System.
Software Dependability CIS 376 Bruce R. Maxim UM-Dearborn.
Obtaining, Storing and Using Confidential Data October 2, 2014 Georgia Department of Audits and Accounts.
Internet Safety Basics Being responsible -- and safer -- online Visit age-appropriate sites Minimize chatting with strangers. Think critically about.
Malicious Code Brian E. Brzezicki. Malicious Code (from Chapter 13 and 11)
Class Activity: User Education on SNS Phishing. Contextual Training Users are sent simulated phishing s by the experimenter to test user’s vulnerability.
Chapter 4.  Can technology alone provide the best security for your organization?
Information Security 2013 Roadshow. Roadshow Outline  Why We Care About Information Security  Safe Computing Recognize a Secure Web Site (HTTPS) How.
GOLD UNIT 4 - IT SECURITY FOR USERS (2 CREDITS) Thomas Jenkins.
CS5714 Usability Engineering Web Introduction Copyright © 2003 H. Rex Hartson and Deborah Hix.
References  Cranor & Garfinkel, Security and Usability, O’Reilly  Sasse & Flechais, “Usable Security: Why Do We Need It? How Do We Get It?”  McCracken.
Gary MarsdenSlide 1University of Cape Town Human-Computer Interaction - 7 Design Guidelines & Standards Gary Marsden ( ) July 2002.
Understanding Human Behavior Helps Us Understand Investor Behavior MA2N0246 Tsatsral Dorjsuren.
1 User-Centric The Human Factor in Design Susanne M. Furman, PhD Usability Engineer Web Communication and New Media Division U.S. Department of Health.
Security Policies and Procedures. cs490ns-cotter2 Objectives Define the security policy cycle Explain risk identification Design a security policy –Define.
BTT12OI.  Do you know someone who has been scammed online? What happened?  Been tricked into sending someone else money (not who they thought they were)
Chapter 1 Overview The NIST Computer Security Handbook defines the term Computer Security as:
Week 10-11c Attacks and Malware III. Remote Control Facility distinguishes a bot from a worm distinguishes a bot from a worm worm propagates itself and.
The Role of Decision Making in Management Chapter 1.
Knowing What You Missed Forensic Techniques for Investigating Network Traffic.
Lecture slides prepared for “Computer Security: Principles and Practice”, 3/e, by William Stallings and Lawrie Brown, Chapter 1 “Overview”. © 2016 Pearson.
Topic 5: Basic Security.
Chapter 11: Policies and Procedures Security+ Guide to Network Security Fundamentals Second Edition.
Lecture 1 Page 1 CS 236 Online What Are Our Security Goals? CIA Confidentiality –If it’s supposed to be a secret, be careful who hears it Integrity –Don’t.
Policy 2 Dr.Talal Alkharobi. 2 Create Appropriate Policy Each organization may need different policies. Policy templates are useful to examine and to.
BEHAVIORAL FINANCE.
Develop your Legal Practice using “Cloud” applications, but … Make sure your data is safe! Tuesday 17 November 2015 The Law Society, London Allan Carton,
ETHICS in the WORKPLACE © 2012 Cengage Learning. All Rights Reserved. Chapter 1 Welcome to Ethics.
CHAPTER 2 Laws of Security. Introduction Laws of security enable user make the judgment about the security of a system. Some of the “laws” are not really.
Human-Computer Interaction Design process Task and User Characteristics Guidelines Evaluation ISE
Basic Security Concepts University of Sunderland CSEM02 Harry R Erwin, PhD.
Basic Security Concepts University of Sunderland CIT304 Harry R Erwin, PhD.
LOOKOUT GPS TRACKER BY : PENYU NELANG. WHAT IS LOOKOUT ? We proposed this device to cope the kidnapped problems that become a rampant issues these days.
1 Saltzer [1974] and later Saltzer and Schroeder [1975] list the following principles of the design of secure protection systems, which are still valid:
1 Design Principles CS461 / ECE422 Spring Overview Simplicity  Less to go wrong  Fewer possible inconsistencies  Easy to understand Restriction.
F8: Audit and Assurance. 2 Designed to give you knowledge and application of: Section A: Audit Framework and Regulation Section B: Internal audit Section.
6. (supplemental) User Interface Design. User Interface Design System users often judge a system by its interface rather than its functionality A poorly.
Outline of this module By the end of this module, you will be able to: Understand the benefits that internet banking provides; Name the different dangers.
PCS Technology for Students: Acceptable Use, Privacy, and Safety.
Governance, Risk and Ethics. 2 Section A: Governance and responsibility Section B: Internal control and review Section C: Identifying and assessing risk.
Safety and Security Management Fundamental Concepts
Outline Basic concepts in computer security
Issues and Protections
Social Engineering Brock’s Cyber Security Awareness Committee
Common Methods Used to Commit Computer Crimes
PCS Technology for Staff: Acceptable Use, Privacy, and Safety
Social Engineering Charniece Craven COSC 316.
Lesson Objectives Aims You should be able to:
Cybersecurity Awareness
Robert Leonard Information Security Manager Hamilton
Explaining Bitcoins will be the easy part: Borne Attacks and How You Can Defend Against Them Matthew Gardiner Product Marketing.
Network Security Best Practices
Motivation Chapter Four.
Usability Techniques Lecture 13.
The Psychology of Security
Security Principles and Policies CS 236 On-Line MS Program Networks and Systems Security Peter Reiher.
Quattrone and Tversky 1998, Slovic 1987
Presentation transcript:

Agenda  Tuesday, June 28 th  Psychology and Security  Thursday, June 30 th  Usable Security

References  Ross Anderson, Security Engineering  Chapter 2 “Usability and Psychology”  Ryan West, “The Psychology of Security”, Communications of the ACM, April 2008, p

People  Only amateurs attack machines; professionals target people. — Bruce Schneier  Many real attacks exploit psychology at least as much as technology.  Kevin Mitnick, Art of Deception

Phishing  it is much easier for crooks to build a bogus bank website that passes casual inspection than it is for them to create a bogus bank in a shopping mall.

Phishing Examples  US Bank US Bank  Amazon Amazon  Twitter Twitter

Pretexting & Social Engineering  The most common way for private investigators to steal personal information is pretexting — phoning someone who has the information under a false pretext, usually by pretending to be someone authorized to be told it. Such attacks are sometimes known collectively as social engineering.

Trusting people  Many frauds work by appealing to our atavistic instincts to trust people more in certain situations.

Psychological manipulation  As designers learn how to forestall the easier techie attacks, psychological manipulation of system users or operators becomes ever more attractive.  The security engineer simply must understand basic psychology and ‘security usability’.

IRS Social Engineering  Fixing the problem is hard. Despite continuing publicity about pretexting, in a 2007 audit of the IRS the Treasury Inspector General for Tax Administration had staff call 102 IRS employees at all levels, ask for their user IDs, and tell them to change their passwords to a known value; 62 complied.

Policies & Training  It’s not enough for rules to exist; you have to train all the staff who have access to the confidential material, and explain to them the reasons behind the rules.

Research Areas  Information security and psychology  Human-computer interaction (HCI)  Poorly understood by systems developers  Information security and economics

Perception of Risk  Terrorism is largely about manipulating perceptions of risk.  Many protection mechanisms are sold using scaremongering.

Cognitive psychology  How we think, remember, and make decisions.  What makes security harder than safety is that we have a sentient attacker who will try to provoke exploitable errors.

Practiced actions  People are trained to click ‘OK’ to pop-up boxes as that’s often the only way to get the work done.

Risk Evaluation  Risk and uncertainty are extremely difficult concepts for people to evaluate.  For designers of security systems, it is important to understand how users evaluate and make decisions regarding security.  The most elegant and intuitively designed interface does not improve security if users ignore warnings, choose poor settings, or unintentionally subvert corporate policies.

Risk Evaluation  The user problem in security systems is not just about user interfaces or system interaction. Fundamentally, it is about how people think of risk that guides their behavior.

Following rules  Starting URLs with the impersonated bank’s name, as looking for the name being for many people a stronger rule than parsing its position.

Mental Model  Attackers exploit dissonances between users’ mental models of a system and its actual logic.  A cognitive walkthrough can be aimed at identifying attack points, just as a code walkthrough can be used to search for software vulnerabilities.

Behavioral economics  People’s decision processes depart from rational behavior.  The heuristics we use in everyday judgment and decision making lie somewhere between rational thought and the unmediated input from the senses.

Calculating Probabilities  We’re also bad at calculating probabilities, and use all sorts of heuristics to help us make decisions.  We also worry too much about unlikely events.  Many people perceive terrorism to be a much worse threat than food poisoning or road traffic accidents.

Problem 1  Read “Users do not think they are at risk” on page 36 of Ryan West, “The Psychology of Security”.  Complete Problem 1

Users aren’t stupid, they’re unmotivated  To conserve mental resources, we generally tend to favor quick decisions based on learned rules and heuristics.  This is efficient in the sense that it is quick, it minimizes effort, and the outcome is good enough most of the time (the "cognitive miser" model).  This partially accounts for why users do not reliably read all the relevant text in a display or consider all the consequences of their actions.

Problem 2  Safety is an abstract concept.  Chose a partner.  Complete Problem #2

Evaluating the security/cost trade-off  While the gains of security are generally abstract, the cost is real and immediate: it usually comes with a price paid in time, effort, and convenience.  Users weigh the cost of the effort against the perceived value of the gain (safety/security) and the perceived chance that nothing bad would happen either way.
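
A rough sketch of that weighing, with all numbers chosen purely for illustration (they are assumptions, not measurements):

```python
# Sketch of the trade-off above: a user takes the security step only when its
# perceived expected benefit outweighs its immediate cost. All values are
# hypothetical illustrations.

effort_cost = 2.0        # perceived cost of the step: time, effort, inconvenience
perceived_loss = 50.0    # perceived harm if something bad happens
p_bad_without = 0.03     # perceived chance of harm if the step is skipped
p_bad_with = 0.01        # perceived chance of harm if the step is taken

expected_benefit = (p_bad_without - p_bad_with) * perceived_loss  # 1.0 here

if expected_benefit > effort_cost:
    print("Comply: perceived benefit", expected_benefit, "> cost", effort_cost)
else:
    print("Skip it: perceived benefit", expected_benefit, "<= cost", effort_cost)
```

Because the perceived chance of harm is small, the expected benefit (1.0) loses to the immediate cost (2.0), and the user skips the step.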

Risk aversion  People dislike losing $100 they already have more than they value winning $100.  Marketers talk in terms of ‘discount’ and ‘saving’ — by framing an action as a gain rather than as a loss makes people more likely to take it.

Problem 3  Security as a secondary task.  Losses perceived disproportionately to gains  With your partner, complete Problem #3.

Principle of Psychological Acceptability  Security Mechanisms should not make the resource more difficult to access than if the security mechanisms were not present.  Salzer & Schroeder 1975

Principle of Psychological Acceptability  The security mechanism may add some extra burden, but that burden must be both minimal and reasonable.  Every file access requires the user enter his password?

Password Policies  Many users want to use a simple easy to remember password. They do not want to change their password. They write down their password. They want to use the same password for all their accounts.  It is a challenge to write a password policy that is psychologically acceptable and still provides security.

Airport Security  Is it psychologically acceptable?  How about full body scans and pat downs?

IMPROVING SECURITY COMPLIANCE AND DECISION MAKING  Reward pro-security behavior.  Users must be motivated to take pro-security actions.  There must be a tangible reward for making good security decisions.  One form of reward is to see that the security mechanisms are working and that the action the user chose is, in fact, making them safer.

IMPROVING SECURITY COMPLIANCE AND DECISION MAKING  When an antivirus or antispyware product finds and removes malicious code. The security application often issues a notification that it has found and mitigated a threat.

Improve the awareness of risk  People often believe they are at less risk than others.  Increase user awareness of the risks they face.  Security messages should be instantly distinguishable from other message dialogs: they should look and sound very different.

Catch corporate security policy violators  Having a corporate security policy that is not monitored or enforced is tantamount to having laws but no police.  Security systems should have good auditing capabilities.  The best deterrent to breaking the rules is not the severity of consequences but the likelihood of being caught.
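
A minimal sketch of what such an auditing hook might look like, with event names and the log file chosen purely for illustration:

```python
# Sketch: send security-relevant events to a dedicated audit log so that
# policy violations have a realistic chance of being noticed. The event
# names and the file path are illustrative assumptions.
import logging

audit_log = logging.getLogger("audit")
_handler = logging.FileHandler("security_audit.log")
_handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
audit_log.addHandler(_handler)
audit_log.setLevel(logging.INFO)

def record_login(user: str, success: bool) -> None:
    audit_log.info("login user=%s success=%s", user, success)

def record_policy_violation(user: str, rule: str) -> None:
    audit_log.warning("policy_violation user=%s rule=%s", user, rule)

record_login("alice", True)
record_policy_violation("bob", "shared_account_password")
```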

Reduce the cost of implementing security  To accomplish a task, users often seek the path of least resistance that satisfies the primary goal.  Making the secure choice the easiest for the user to implement, one takes advantage of normal user behavior and gains compliance.

Reduce the cost of implementing security  To reduce the cost of security is to employ secure default settings.  Most users never change the default settings of their applications.  “Secure by Default” principle.  While good default settings can increase security, system designers must be careful that users do not find an easier way to slip around them.

CONCLUSION  We can increase compliance if we work with the psychological principles that drive behavior.

Problem #4
1. Consider some software product that you regularly use, some website that you regularly visit, or some software product that you develop as part of your job. Briefly describe this product.
2. Discuss how well it meets the Principle of Psychological Acceptability for users of this product or website.
3. Discuss how this product or website could be improved from the psychological viewpoint.