Privacy Risk Models for Designing Privacy-Sensitive Ubiquitous Computing Systems Jason Hong Carnegie Mellon Jennifer Ng Carnegie Mellon Scott Lederer University of California, Berkeley James Landay University of Washington

Motivation
Ubiquitous Computing is Coming
Advances in wireless networking, sensors, and devices
–Greater awareness of and interaction with the physical world
–Ex. E911, Find Friends
"But what about my privacy?"

Motivation
But Hard to Design Privacy-Sensitive Ubicomp Apps
Discussions on privacy generate lots of heat but not light
–Big brother, overprotective parents, telemarketers, genetics…
–Many conflicting values
–Often end up talking over each other
–Hard to have reasoned debates and create designs that address the issues
Need a design method that helps design teams:
–Identify
–Prioritize
–Manage privacy risks for specific applications
We propose privacy risk models for doing this

Privacy Risk Model
Analogy: Security Threat Model
"[T]he first rule of security analysis is this: understand your threat model. Experience teaches that if you don't have a clear threat model – a clear idea of what you are trying to prevent and what technical capabilities your adversaries have – then you won't be able to think analytically about how to proceed. The threat model is the starting point of any security analysis." – Ed Felten

Privacy Risk Model
Two Parts: Risk Analysis and Risk Management
Privacy Risk Analysis
–Common questions to help design teams identify potential risks
–Like a task analysis
Privacy Risk Management
–Helps teams prioritize and manage risks
–Like severity rankings in heuristic evaluation
Will present a specific privacy risk model for ubicomp
–Draws on previous work, plus surveys and interviews
–Provide a reasonable level of protection for foreseeable risks

Outline
Motivation
Privacy Risk Analysis
Privacy Risk Management
Case Study: Location-enhanced Instant Messenger

Privacy Risk Analysis
Common Questions to Help Design Teams Identify Risks
Social and Organizational Context
–Who are the users?
–What kinds of personal info are shared?
–Relationships between sharers and observers?
–Value proposition for sharing?
–…

Social and Organizational Context
Who are the users? Who shares info? Who sees it?
Different communities have different needs and norms
–An app appropriate for families might not be for work settings
Affects the conditions and types of info people are willing to share
–Location information with spouse vs co-workers
–Real-time monitoring of one's health
Start with the most likely users
–Ex. Find Friends
–Likely sharers are people using mobile phones
–Likely observers are friends, family, co-workers

Social and Organizational Context
What kinds of personal info are shared?
Different kinds of info have different risks and norms
–Current location vs home phone number vs hobbies
Some information is already known between people
–Ex. Don't need to protect your identity from your friends and family
Different ways of protecting different kinds of info
–Ex. Can revoke access to location; cannot revoke a birthday or name

Social and Organizational Context
Relationships between sharers and observers?
Kinds of risks and concerns
–Ex. Risks with friends are unwanted intrusions, embarrassment
–Ex. Risks with paid services are spam, secondary use, hackers
Incentives for protecting personal information
–Ex. Most friends don't have a reason to intentionally cause harm
–Ex. Neither do paid services, but they want to make more money
Mechanisms for recourse
–Ex. Kindly ask friends and family to stop being nosy
–Ex. Recourse for paid services includes formally complaining, switching services, suing

Social and Organizational Context
Value proposition for sharing personal information?
What incentive do users have for sharing?
Quotes from nurses using locator badges:
–"I think this is disrespectful, demeaning and degrading"
–"At first, we hated it for various reasons, but mostly we felt we couldn't take a bathroom break without someone knowing where we were… [but now] requests for medications go right to the nurse and bedpans etc go to the techs first... I just love [the locator system]."
When those who share personal info do not benefit in proportion to the perceived risks, the technology is likely to fail

Privacy Risk Analysis
Common Questions to Help Design Teams Identify Risks
Social and Organizational Context
–Who are the users?
–What kinds of personal info are shared?
–Relationships between sharers and observers?
–Value proposition for sharing?
–…
Technology
–How is personal info collected?
–Push or pull?
–One-time or continuous?
–Granularity of info?
–…
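The questions above can be captured as a simple worksheet that a design team fills in per application. This is only an illustrative sketch, not part of the published method; all field names and the example answers are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyRiskAnalysis:
    """Worksheet for the common analysis questions.

    Field names are illustrative assumptions; extend the worksheet
    with whatever questions your design team needs."""
    # Social and organizational context
    users: str = ""
    info_shared: list = field(default_factory=list)
    relationships: str = ""
    value_proposition: str = ""
    # Technology
    collection: str = ""      # "network-based" or "client-based"
    push_or_pull: str = ""    # "push" or "pull"
    disclosure: str = ""      # "one-time" or "continuous"
    granularity: str = ""     # e.g. "city", "street", "room"

# Example: a Find Friends-style application
find_friends = PrivacyRiskAnalysis(
    users="mobile phone owners; observers are friends, family, co-workers",
    info_shared=["current location"],
    relationships="friends and family",
    value_proposition="coordinate meetups, reassurance",
    collection="client-based",
    push_or_pull="pull",
    disclosure="one-time",
    granularity="city",
)
```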

Technology
How is personal info collected?
Different technologies have different tradeoffs for privacy
Network-based approach
–Info captured and processed by external computers that users have no practical control over
–Ex. Locator badges, video cameras
Client-based approach
–Info captured and processed on the end-user's device
–Ex. GPS, beacons
–Stronger privacy guarantees, since all info starts with you first

Technology
Push or pull?
Push is when the user sends info first
–Ex. You send your location info on an E911 call
–Few people seem to have problems with push
Pull is when another person requests info first
–Ex. A friend requests your current location
–Design space is much harder here:
need to make people aware of requests
want to provide an understandable level of control
don't want to overwhelm users

Technology
One-time or continuous disclosures?
One-time disclosure
–Ex. Observer gets a snapshot (as in Find Friends, Active Campus)
–Fewer privacy concerns
Continuous disclosure
–Ex. Observer repeatedly gets info
–Greater privacy concerns
–"It's stalking, man."

Technology
Granularity of info shared?
Different granularities have different utility and risks
Spatial granularity
–Ex. City? Neighborhood? Street? Room?
Temporal granularity
–Ex. "at Boston last month" vs "at Boston, August …"
Identification granularity
–Ex. "a person" vs "a woman" vs …
Keep and use the coarsest granularity needed
–Least specific data, fewer inferences, fewer risks
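The "coarsest granularity needed" guideline can be sketched in a few lines of code. A minimal illustration; the function names and the precision levels chosen are assumptions, not from the talk:

```python
from datetime import datetime

def coarsen_location(lat, lon, decimals=0):
    """Round coordinates to reduce spatial precision.
    0 decimals is roughly 100 km (city scale); 2 is roughly 1 km."""
    return (round(lat, decimals), round(lon, decimals))

def coarsen_time(t, granularity="month"):
    """Report only as much of a timestamp as the observer needs."""
    fmt = {"month": "%Y-%m", "day": "%Y-%m-%d", "minute": "%Y-%m-%d %H:%M"}
    return t.strftime(fmt[granularity])

# A precise GPS fix becomes a city-scale, month-scale disclosure
print(coarsen_location(42.3601, -71.0589))         # (42.0, -71.0)
print(coarsen_time(datetime(2004, 8, 15, 9, 30)))  # 2004-08
```

Coarsening at the source, before disclosure, means the finer-grained data never leaves the user's device at all.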

Outline
Motivation
Privacy Risk Analysis
Privacy Risk Management
Case Study: Location-enhanced Instant Messenger

Privacy Risk Management
Helps teams prioritize and manage risks
First step is to prioritize risks by estimating:
–Likelihood that an unwanted disclosure occurs
–Damage that will happen on such a disclosure
–Cost of adequate privacy protection
Focus on high-likelihood, high-damage, low-cost risks first
–Like heuristic evaluation: fix high-severity and/or low-cost problems first
–Difficult to get exact numbers; the process matters more than the numbers
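The prioritization step can be sketched as a sort over rough low/medium/high estimates. This is an illustrative sketch; the numeric levels and ordering scheme are assumptions, and the example risks are drawn from the case study later in the talk:

```python
LEVEL = {"low": 1, "medium": 2, "high": 3}

def prioritize(risks):
    """Order risks so high-likelihood, high-damage, low-cost ones come first.

    Each risk is (name, likelihood, damage, cost of protection) using
    rough low/medium/high estimates; as the slides note, exact numbers
    matter less than going through the process."""
    return sorted(risks, key=lambda r: (-LEVEL[r[1]], -LEVEL[r[2]], LEVEL[r[3]]))

risks = [
    ("found by a stalker",        "low",    "high",   "high"),
    ("over-monitoring by family", "high",   "medium", "low"),
    ("over-monitoring at work",   "medium", "medium", "low"),
]
for name, *_ in prioritize(risks):
    print(name)
```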

Privacy Risk Management
Helps teams prioritize and manage risks
Next step is to help manage those risks
How does the disclosure happen?
–Accident? Bad user interface? Poor conceptual model?
–Malicious? Inside job? Scammers?
What kinds of choice, control, and awareness are there?
–Opt-in? Opt-out?
–What mechanisms? Ex. Buddy list, invisible mode
–What are the default settings?
Better to prevent or to detect abuses?
–Ex. "Bob has asked for your location five times in the past hour"
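The detection approach behind the "Bob has asked…" notification can be sketched as a sliding-window counter over incoming requests. A minimal sketch; the five-per-hour threshold and message wording are assumptions modeled on the example above:

```python
from collections import deque
from datetime import datetime, timedelta

class RequestMonitor:
    """Detect possible over-monitoring by counting a requester's
    recent location requests within a sliding time window."""

    def __init__(self, threshold=5, window=timedelta(hours=1)):
        self.threshold = threshold
        self.window = window
        self.history = {}  # requester -> deque of request timestamps

    def record(self, requester, now=None):
        """Log a request; return a warning string once the threshold is hit."""
        now = now or datetime.now()
        q = self.history.setdefault(requester, deque())
        q.append(now)
        while now - q[0] > self.window:  # drop requests outside the window
            q.popleft()
        if len(q) >= self.threshold:
            return (f"{requester} has asked for your location "
                    f"{len(q)} times in the past hour")
        return None
```

Detection like this complements prevention: rather than blocking friends outright, it makes the user aware of patterns so they can decide whether to intervene.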

Case Study
Location-enhanced Instant Messenger
New features
–Request a friend's current location
–Automatically show your location
–Invisible mode, reject requests
–Default location is "unknown"
Who are the users?
–Typical IM users
Relationships?
–Friends, family, classmates, …
One-time or continuous?
–One-time with notifications
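The feature list above can be combined into a single disclosure decision. A minimal sketch; the function and parameter names are assumptions, as is the choice to answer rejected requests with the default "unknown" so a rejection is indistinguishable from the location simply being unavailable:

```python
def respond_to_location_request(requester, buddy_list, invisible, location):
    """Decide what a location request sees.

    Invisible mode and non-buddies get the default "unknown" -- the
    same answer as an undisclosed location, preserving plausible
    deniability for the person being asked."""
    if invisible or requester not in buddy_list:
        return "unknown"
    return location if location is not None else "unknown"
```

For example, a request from someone not on the buddy list, a request while invisible, and a request when no location has been disclosed all produce the same "unknown" answer.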

Case Study
Location-enhanced Instant Messenger
Identifying potential privacy risks
–Over-monitoring by friends and family
–Over-monitoring at the workplace
–Being found by a malicious person (ex. stalker, mugger)
Assessing the first risk, over-monitoring by family
–Likelihood depends on the family; conservatively assign "high"
–Damage might be embarrassing but not life-threatening; assign "medium"
Managing the first risk
–Buddy list, notifications for awareness, invisible mode, "unknown" if location is not disclosed
–All easy to implement, so cost is "low"

Discussion
Privacy risk models are only a starting point
–Like task analysis, should try to verify assumptions and answers
–Can be combined with field studies, interviews, low-fi prototypes

Summary
Privacy risk models help design teams identify, prioritize, and manage risks
Privacy risk analysis for identifying risks
–Series of common questions, like a task analysis
Privacy risk management for prioritizing and managing risks
–Like severity ratings in heuristic evaluation
Described our first iteration of a privacy risk model
–Help us evolve and advance it!