Fall, 2011 - Privacy & Security - Virginia Tech – Computer Science: Privacy Overview and Issues

Presentation transcript:

Privacy Overview and Issues

Concern
"Recent inventions and business methods call attention to the next step which must be taken for the protection of the person, and for securing to the individual what Judge Cooley calls the right 'to be let alone'. ... modern enterprise and invention have, through invasions upon his privacy, subjected him to mental pain and distress, far greater than could be inflicted by mere bodily injury." (Warren and Brandeis, "The Right to Privacy," 1890)

Concern
(image-only slide; no transcript text)

Promise and Peril

Service: Web (e-commerce, social networking, news and entertainment, search, electronic medical records, recommendations)
Threats: identity theft, spam, phishing, unwanted correlation, privacy incursion, denial of service, viruses, worms, ...

Service: Ubiquitous systems (context awareness, location awareness, pervasive services, smart objects)
Threats: loss of privacy and anonymity, electronic stalking, invasive monitoring, loss of control

A view of the future?
(image-only slide; no transcript text)

Privacy
- Motivations for privacy protection
  - empowerment: control the dissemination of information about oneself (identity theft)
  - utility: protection against nuisance (spam)
  - dignity: freedom from unsubstantiated suspicion (surveillance of public spaces)
  - regulating agent: checks and balances on power (unauthorized wiretaps)

Undermining privacy
- Trespass of presumed "personal borders"
  - natural (walls, doors, ...)
  - social (confidentiality within social groups)
  - spatial/temporal (isolation of activities in different places or times)
  - ephemeral (expectation of forgetting/disposal)
- "the potential to create an invisible and comprehensive surveillance network"
- Privacy impacted by
  - ability to monitor
  - ability to search

Privacy vs. Security
- Security
  - Traditionally about confidentiality, integrity, availability (CIA) of information
  - Threat vs. risk assessment
  - Focus on system artifacts (access control policies, cryptography)
- Privacy
  - "The right of the individual to decide what information about himself should be communicated to others and under what circumstances" (Westin, Privacy and Freedom, 1970)
  - About context, purpose/intention, and obligation related to disclosed information
  - Traditional focus on personally identifying information (PII)
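To make the contrast above concrete, here is a minimal Python sketch (hypothetical names and policy fields, not from the slides): a security check asks only whether the requester may access the data, while a privacy check also asks about the purpose of the disclosure and the obligations attached to it.

    from dataclasses import dataclass

    # Security view: confidentiality via access control -- may this requester read it?
    ACCESS_LIST = {"medical_record": {"dr_adams", "nurse_lee"}}   # hypothetical ACL

    def security_check(requester: str, resource: str) -> bool:
        """Classic CIA-style check: is the requester authorized at all?"""
        return requester in ACCESS_LIST.get(resource, set())

    # Privacy view: the data subject's terms travel with the disclosed information.
    @dataclass
    class DisclosureTerms:
        allowed_purposes: set          # e.g., {"treatment"} -- context/intention
        retention_days: int            # obligation: delete after this period
        notify_subject: bool = True    # obligation: tell the person it was used

    def privacy_check(requester: str, resource: str,
                      purpose: str, terms: DisclosureTerms) -> bool:
        """Authorization is necessary but not sufficient; the purpose must match too."""
        return security_check(requester, resource) and purpose in terms.allowed_purposes

    terms = DisclosureTerms(allowed_purposes={"treatment"}, retention_days=30)
    print(security_check("dr_adams", "medical_record"))                      # True
    print(privacy_check("dr_adams", "medical_record", "marketing", terms))   # False

The point of the sketch is only the asymmetry: the same authorized requester passes the security check but fails the privacy check when the purpose falls outside the data subject's terms.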

Unique challenges of privacy/security
- Security is not the user's primary goal
- Must be usable by a wide range of individuals with differing skill sets
- Higher risk associated with failure of security applications than for other application types
- Need for updates to account for changes in law, organizational practices, or personal preferences

Karat, C.-M., J. Karat, and C. Brodie, "Editorial: why HCI research in privacy and security is critical now," International Journal of Human-Computer Studies, (1-2): pp. 1-4.

Nature of Privacy
- A "boundary regulation process" of accessibility depending on "context" (Altman)
- A "personal adjustment process" (Westin) balancing the desire for privacy against the desire to interact, in the context of social norms and their environment
- A distinction (Solove) between access control (regulating access to information about oneself) and risk management (reducing the likelihood of unintended/undesired usage)
- Preferences (Westin's classifications)
  - Fundamentalists (15-25%)
  - Pragmatists (40-60%)
  - Unconcerned (15-25%)

Katie Shilton, "Four Billion Little Brothers? Privacy, mobile phones, and ubiquitous computing," CACM, November 2009.

Boundary Regulation (Irwin Altman)
- Altman's view: privacy is "the selective control of access to the self"
  - a dynamic, dialectic process (boundary regulation)
  - optimization
  - multi-mechanism
- Varies by culture and social relationships

In a networked world
- Boundaries (not independent)
  - Disclosure (what is revealed)
    - Disclosure is required to participate in a networked world
    - Increased access to third-party disclosures and aggregation complicates control
  - Identity (self vs. others)
    - Mediation (technology complicates recipient design and reflexive interpretability of action)
    - Information persistence (loss of ability to control representations of self)
  - Temporality (orientation toward past/future events)
- Genres of disclosure
  - "regularly reproduced arrangements of people, technology and practice that yield identifiable and socially meaningful styles of interaction"
- Conclusion
  - Dynamic: "privacy management is a dynamic response to circumstance rather than a static enforcement of rules"
  - Dialectic: "privacy management is ... a resolution of tensions not just between people but between their internal conflicting requirements"
  - Situated: "when considering privacy concerns ... the whole of the social and institutional setting in which technologies are deployed should be considered"

New media
- New media affords new communication possibilities and new privacy concerns
- IM/SMS
  - Teens showed varying privacy behaviors (a caution against assuming standard preferences)
  - The unobtrusive nature of text messaging supports "environmental privacy" (limited interruption of the activity in the physical space)
  - Sharing of information
    - Greater with closer acquaintances
    - Depends on the purpose of disclosure
- Shared displays
  - Accidental disclosure
  - Concern is magnified by
    - Sensitivity of the information
    - Relation to onlookers
    - Onlookers' control of the display

New media
- Media spaces
  - Physical spaces (offices, work areas) enhanced with multimedia or video recording technology
    - Videoconferencing
    - Always-on audio/video between/among locations
  - Important privacy design considerations
    - Symmetry
    - Opt-out control
    - Purposefulness: acceptance of privacy risks based on perceived value (a value-proposition judgment)

New media
- Sensors, RFID
  - Concerns
    - Loss of control of collected data
    - Uncertainty about the technology's utility
  - Trust (elderly interviewees regarding home-based monitoring)
    - Accept potential privacy invasion based on trust in those controlling the technology
    - Judgment of the value proposition for increased safety
- Location disclosure
  - Affected more by who was asking than by the current location
  - Tracking/disclosure seen as more invasive than location-based configuration (e.g., ringtone volume control)
  - Concerns affected by
    - Trust in the service provider
    - Oversight by regulatory agencies
  - Precision: "blurring" of the current location was used less than anticipated; instead, users either did not respond or provided the information they believed was most useful to the recipient
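The precision "blurring" mentioned above can be illustrated with a small Python sketch (hypothetical function and values): rather than disclosing exact coordinates, the device snaps them to a coarser grid, so the recipient learns the neighborhood or city rather than the exact spot.

    def blur_location(lat: float, lon: float, grid_deg: float = 0.1) -> tuple:
        """Snap coordinates to a grid of `grid_deg` degrees.

        At mid-latitudes, 0.1 degree is roughly 10 km, i.e., neighborhood/city scale.
        """
        def snap(x: float) -> float:
            return round(x / grid_deg) * grid_deg
        return (snap(lat), snap(lon))

    exact = (37.2296, -80.4139)            # example coordinates (Blacksburg, VA)
    print(blur_location(*exact))           # roughly (37.2, -80.4)  -- city scale
    print(blur_location(*exact, 1.0))      # roughly (37.0, -80.0)  -- regional scale

The studies cited in the slide suggest users rarely chose such intermediate precision in practice; the sketch only shows what the option looks like technically.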

Smart objects
- Enabling technologies
  - low-power processors with integrated sensors and wireless communication
  - remote identification of objects
  - precise localization of objects
- Smart everyday objects
  - attached processing gives an "introspection" capability
  - ability to respond in a context-sensitive manner
  - creating "ambient intelligence" (smart without actually being intelligent)
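As a toy illustration of "smart without actually being intelligent" (hypothetical object and rules, Python): an everyday object with a sensor, a little attached processing, and fixed rules can respond to context without any learning or reasoning.

    class SmartMedicineCabinet:
        """An everyday object with 'introspection': it knows how often it was
        opened today and reacts to context (time of day) with fixed rules."""

        def __init__(self, doses_per_day: int):
            self.doses_per_day = doses_per_day
            self.openings_today = 0

        def on_door_opened(self, hour: int) -> str:
            self.openings_today += 1
            if self.openings_today > self.doses_per_day:
                return "reminder: the scheduled doses were already taken today"
            if hour < 6 or hour > 22:
                return "note: opened outside usual hours"
            return "ok"

    cabinet = SmartMedicineCabinet(doses_per_day=2)
    print(cabinet.on_door_opened(hour=8))    # "ok"
    print(cabinet.on_door_opened(hour=23))   # "note: opened outside usual hours"
    print(cabinet.on_door_opened(hour=13))   # "reminder: ..." -- third opening today

Note that the same fixed rule is also the privacy concern raised earlier: to be useful the object necessarily records when and how often it was used.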

Other risks
- Reliability
  - manageability at such a scale of interacting devices; will they continue to meet requirements?
  - predictability (unanticipated consequences?)
  - dependability in the face of service interruptions
- Delegation of control
  - content: who attests to the veracity of information conveyed by a smart object?
  - system control: will our cars drive the way the insurance company prefers?
  - accountability: who is responsible for economically or legally significant actions taken by a smart object?