Slide 1: Privacy Overview and Issues
Fall 2011, Privacy & Security, Virginia Tech, Computer Science
Slide 2: Concern

"Recent inventions and business methods call attention to the next step which must be taken for the protection of the person, and for securing to the individual what Judge Cooley calls the right 'to be let alone'. ... modern enterprise and invention have, through invasions upon his privacy, subjected him to mental pain and distress, far greater than could be inflicted by mere bodily injury." (Warren and Brandeis, "The Right to Privacy," 1890)
Slide 3: Concern (slide content not captured in the transcript)
Slide 4: Promise and Peril

Web
- Services: e-commerce, email, social networking, news and entertainment, search, electronic medical records, recommendations
- Threats: identity theft, spam, phishing, unwanted correlation, privacy incursion, denial of service, viruses and worms

Ubiquitous systems
- Services: context awareness, location awareness, pervasive services, smart objects
- Threats: loss of privacy and anonymity, electronic stalking, invasive monitoring, loss of control
Slide 5: A view of the future?

http://www.aclu.org/pizza/images/screen.swf
Slide 6: Privacy

Motivations for privacy protection:
- Empowerment: control the dissemination of information about oneself (identity theft)
- Utility: protection against nuisance (spam)
- Dignity: freedom from unsubstantiated suspicion (surveillance of public spaces)
- Regulating agent: checks and balances on power (unauthorized wiretaps)
Slide 7: Undermining privacy

Trespass of presumed "personal borders":
- Natural (walls, doors, ...)
- Social (confidentiality within social groups)
- Spatial/temporal (isolation of activities in different places or times)
- Ephemeral (expectation of forgetting/disposal)

"the potential to create an invisible and comprehensive surveillance network"

Privacy is impacted by:
- the ability to monitor
- the ability to search
Slide 8: Privacy vs. Security

Security
- Traditionally about confidentiality, integrity, and availability (CIA) of information
- Threat vs. risk assessment
- Focus on system artifacts (access control policies, cryptography); see the sketch after this slide

Privacy
- "The right of the individual to decide what information about himself should be communicated to others and under what circumstances" (Westin, Privacy and Freedom, 1970)
- About context, purpose/intention, and obligation related to disclosed information
- Traditional focus on personally identifying information (PII)
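To make the contrast concrete, here is a minimal sketch, not taken from the course material: the subjects, resources, policies, and obligations are all hypothetical. It checks a request first against a classic access-control list (the security view), then against a permitted purpose with an attached obligation (the privacy view).

```python
from dataclasses import dataclass

@dataclass
class Request:
    subject: str    # who is asking
    resource: str   # what they want
    action: str     # read, write, ...
    purpose: str    # why they want it (the privacy-relevant part)

# Security view: a classic access-control list of allowed (subject, resource, action) triples.
ACL = {("dr_smith", "medical_record", "read")}

def security_check(req):
    """Is this subject allowed to perform this action on this resource?"""
    return (req.subject, req.resource, req.action) in ACL

# Privacy view: even an authorized access must match a permitted purpose,
# and a permitted access can carry an obligation on the recipient.
PERMITTED_PURPOSES = {"medical_record": {"treatment"}}
OBLIGATIONS = {"medical_record": "log the access and notify the patient"}

def privacy_check(req):
    """Return (allowed, obligation) after weighing purpose as well as access rights."""
    allowed = security_check(req) and req.purpose in PERMITTED_PURPOSES.get(req.resource, set())
    return allowed, (OBLIGATIONS.get(req.resource) if allowed else None)

# An authorized subject asking for an impermissible purpose is still refused.
print(privacy_check(Request("dr_smith", "medical_record", "read", "marketing")))  # (False, None)
print(privacy_check(Request("dr_smith", "medical_record", "read", "treatment")))  # (True, obligation text)
```

The point of the sketch is only that an access can be perfectly authorized in the security sense and still violate privacy if purpose and obligation are ignored.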
Slide 9: Unique challenges of privacy/security

- Security is not the user's primary goal
- Must be usable by a wide range of individuals with differing skill sets
- Higher risk associated with failure of security applications than with other application types
- Need for updates to account for changes in law, organizational practices, or personal preferences

Karat, C.-M., J. Karat, and C. Brodie, "Editorial: why HCI research in privacy and security is critical now." International Journal of Human-Computer Studies, 2005, 63(1-2), pp. 1-4.
Slide 10: Nature of Privacy

- A "boundary regulation process" of accessibility depending on "context" (Altman)
- A "personal adjustment process" (Westin): balancing the desire for privacy against the desire to interact, within social norms and one's environment
- A distinction (Solove) between access control (regulating access to information about oneself) and risk management (reducing the likelihood of unintended/undesired usage)
- Preferences (Westin's classifications):
  - Fundamentalists (15-25%)
  - Pragmatists (40-60%)
  - Unconcerned (15-25%)

Katie Shilton, "Four Billion Little Brothers? Privacy, mobile phones, and ubiquitous computing," CACM, November 2009.
Slide 11: Boundary Regulation (Irwin Altman)

Altman's view: privacy is "the selective control of access to the self"
- A dynamic, dialectic process (boundary regulation)
- Optimization
- Multi-mechanism
- Varies by culture and social relationships
Slide 12: In a networked world

Boundaries (not independent):
- Disclosure (what is revealed): disclosure is required to participate in a networked world; increased access to third-party disclosures and aggregation complicates control
- Identity (self vs. others)
- Mediation: technology complicates recipient design and the reflexive interpretability of action
- Information persistence: loss of the ability to control representations of self
- Temporality: orientation toward past/future events

Genres of disclosure: "regularly reproduced arrangements of people, technology and practice that yield identifiable and socially meaningful styles of interaction"

Conclusion
- Dynamic: "privacy management is a dynamic response to circumstance rather than a static enforcement of rules"
- Dialectic: "privacy management is...a resolution of tensions not just between people but between their internal conflicting requirements"
- Situated: "when considering privacy concerns...the whole of the social and institutional setting in which technologies are deployed should be considered"
Slide 13: New media

New media affords new communication possibilities and new privacy concerns.

IM/SMS
- Teens showed varying privacy behaviors (a caution against assuming standard preferences)
- The unobtrusive nature of text messaging supports "environmental privacy" (limited interruption of the activity in the physical space)

Sharing of information
- Greater with closer acquaintances
- Depends on the purpose of disclosure

Shared displays
- Accidental disclosure
- Concern is magnified by:
  - sensitivity of the information
  - relation to onlookers
  - onlookers' control of the display
Slide 14: New media

Media spaces
- Physical spaces (offices, work areas) enhanced with multimedia or video recording technology

Videoconferencing
- Always-on audio/video between/among locations

Important privacy design considerations
- Symmetry
- Opt-out control
- Purposefulness: acceptance of privacy risks based on perceived value (a value-proposition judgment)
Slide 15: New media

Sensors, RFID
- Concerns: loss of control of collected data; uncertainty about the technology's utility
- Trust (elderly interviewees regarding home-based monitoring): acceptance of potential privacy invasion based on trust in those controlling the technology, and on a judgment of the value proposition for increased safety

Location disclosure
- Affected more by who was asking than by the current location
- Tracking/disclosure seen as more invasive than location-based configuration (e.g., ringtone volume control)
- Concerns affected by trust in the service provider, oversight by regulatory agencies, and precision
- "Blurring" of the current location (see the sketch after this slide) was used less than anticipated; instead, users either did not respond or provided the information they believed was most useful to the recipient
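For readers unfamiliar with the term, the following minimal sketch shows one common meaning of location "blurring": reporting a coarsened position instead of the exact one. It is an illustration only; the function name, grid size, and coordinates are hypothetical and not from the course material.

```python
import math

def blur_location(lat, lon, cell_km=5.0):
    """Snap a coordinate to the center of a grid cell roughly cell_km wide.

    Uses the rough approximation of ~111 km per degree, which is adequate
    for an illustration of precision reduction, not for real geodesy.
    """
    cell = cell_km / 111.0                    # cell size in degrees

    def snap(x):
        return (math.floor(x / cell) + 0.5) * cell

    return snap(lat), snap(lon)

exact = (37.2296, -80.4139)                   # approximate coordinates of Blacksburg, VA
print(blur_location(*exact))                  # the recipient sees only the cell center
```

Coarser cells reveal less, but the finding on the slide stands: users tended to prefer either not answering or giving a deliberately chosen answer over mechanical blurring.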
Slide 16: Smart objects

Enabling technologies
- Low-power processors with integrated sensors and wireless communication
- Remote identification of objects
- Precise localization of objects

Smart everyday objects
- Attached processing
- "Introspection" capability
- Ability to respond in a context-sensitive manner (a toy sketch follows this slide)
- Creating "ambient intelligence" (smart without actually being intelligent)
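As a toy illustration of attached processing, "introspection," and context-sensitive response, here is a sketch of one hypothetical smart everyday object. The object, its sensor values, and its behavior are invented for illustration and do not come from the slides.

```python
class SmartMedicineBottle:
    """A hypothetical everyday object with attached processing."""

    def __init__(self, dose_interval_hours=8.0):
        self.dose_interval_hours = dose_interval_hours
        self.hours_since_opened = 0.0         # updated from an attached sensor

    def sense(self, hours_elapsed):
        """Accumulate sensed time since the bottle was last opened."""
        self.hours_since_opened += hours_elapsed

    def introspect(self):
        """Report the object's own state (the 'introspection' capability)."""
        return {"hours_since_opened": self.hours_since_opened}

    def respond(self, owner_is_home):
        """Context-sensitive behavior: remind only when a dose is due and the owner is present."""
        if self.hours_since_opened >= self.dose_interval_hours and owner_is_home:
            return "blink reminder light"
        return "stay idle"

bottle = SmartMedicineBottle()
bottle.sense(9.0)
print(bottle.introspect(), bottle.respond(owner_is_home=True))
```

Even a trivial object like this collects and retains behavioral data, which is exactly where the privacy concerns on the preceding slides arise.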
Slide 17: Other risks

Reliability
- Manageability at such a scale of interacting devices; will they continue to meet requirements?
- Predictability (unanticipated consequences?)
- Dependability in the face of service interruptions

Delegation of control
- Content: who attests to the veracity of information conveyed by a smart object?
- System control: will our cars drive the way the insurance company prefers?
- Accountability: who is responsible for economic or legally significant actions taken by a smart object?