1
On Protecting Private Information in Social Networks: A Proposal
Bo Luo (1) and Dongwon Lee (2)
(1) The University of Kansas, bluo@ku.edu
(2) The Pennsylvania State University, dongwon@psu.edu
2
Motivation
Online social networks are getting very popular (e.g. Facebook: 68M unique visitors, 1.2B visits).
Various types of communities:
– General (e.g. Facebook, MySpace)
– Business/professional (e.g. LinkedIn)
– Alumni
– Leisure
– Healthcare (e.g. SoberCircle, PatientsLikeMe)
People socialize with friends, but also with adversaries!
3
Motivation
Privacy vulnerabilities in online social networks:
A huge amount of personal information is available over various types of social network sites.
Users are not fully aware of the risks.
Adversaries use various techniques to collect such information.
– E.g. information retrieval and search engines
News stories:
– Facebook Stalkers [Dubow, USA Today, 2007]
– Gadgets and add-ons read user profiles [Irvin, USA Today, 2008]
– How Not to Lose Face on Facebook, for Professors [Young, Chronicle, 2009]
4
Privacy vulnerabilities
Threat 1: out-of-context information disclosure
Users present information to a "context" (e.g. targeted readers).
Implicit assumption: the information stays in that context. This is wrong!
Out-of-context information disclosure happens through:
– Wrong configuration
– Malfunctioning code
– Users' misunderstanding
Examples:
– Adversaries can simply register for forums to access much of this information.
– Messages in a "closed" email-based community are archived and accessible to everyone.
– Gadgets and add-ons read user profiles.
5
Privacy vulnerabilities
Threat 2: in-network information aggregation
Users share information in social networks.
Implicit assumption: "a small piece of personal information is not a big deal."
Adversaries collect all the pieces of information associated with a user and aggregate them.
The aggregate reveals a significant amount of private information: the in-network information aggregation attack (a small sketch follows below).
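The following is a minimal sketch (not the authors' implementation) of why in-network aggregation matters: each posted piece of information is small, but merging everything tied to one account inside a single network yields a much richer profile. The observation tuples and attribute names are purely illustrative.

```python
from collections import defaultdict

# Hypothetical pieces an adversary could scrape from ONE network:
# (account, attribute, value)
observations = [
    ("alice_k", "employer", "Acme Corp"),
    ("alice_k", "hometown", "Lawrence, KS"),
    ("alice_k", "condition", "diabetes"),   # mentioned in a health-forum post
    ("bob_77",  "employer", "Beta LLC"),
]

def aggregate(obs):
    """Merge all pieces associated with the same account into one profile."""
    profiles = defaultdict(dict)
    for account, attr, value in obs:
        profiles[account][attr] = value
    return profiles

print(aggregate(observations)["alice_k"])
# {'employer': 'Acme Corp', 'hometown': 'Lawrence, KS', 'condition': 'diabetes'}
```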
6
Privacy vulnerabilities
Threat 3: cross-network information aggregation
Users participate in multiple networks, with different levels of privacy concern in each.
Adversaries use several kinds of evidence to link profiles from different SN sites (a sketch follows below):
– Attributes
– Neighborhood
– Similar posts
– Propagation
Adversaries then collect all the private information across multiple SN sites: the cross-network information aggregation attack.
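Below is a hedged sketch of how an adversary might combine the evidence types listed above (attribute match, neighborhood overlap, similar posts) into a single linking score. The weights, threshold, and profile fields are illustrative assumptions, not taken from the paper.

```python
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def link_score(p1, p2):
    """Combine attribute, neighborhood, and post similarity into one score."""
    attr_sim = sum(p1["attrs"].get(k) == v for k, v in p2["attrs"].items()) / max(len(p2["attrs"]), 1)
    neigh_sim = jaccard(p1["friends"], p2["friends"])
    post_sim = jaccard(p1["post_words"], p2["post_words"])
    return 0.4 * attr_sim + 0.3 * neigh_sim + 0.3 * post_sim

profile_on_site_a = {"attrs": {"name": "Alice K.", "city": "Lawrence"},
                     "friends": {"bob", "carol", "dave"},
                     "post_words": {"marathon", "kansas", "python"}}
profile_on_site_b = {"attrs": {"name": "Alice K.", "city": "Lawrence"},
                     "friends": {"bob", "carol", "erin"},
                     "post_words": {"marathon", "kansas", "gardening"}}

if link_score(profile_on_site_a, profile_on_site_b) > 0.6:
    print("Likely the same person: merge the two profiles.")
```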
7
Goals and solutions at a glance
Goal: protect users from unwanted information disclosure, especially from the three threats above, while still letting them socialize. We cannot prevent users from sharing information.
Adversary model: the honest-but-curious observer.
– Honest: no phishing, no spam, no hacking
– Curious: very aggressive in seeking information: registers for social networks, uses search engines, manipulates information
Our goal: protect users from honest-but-curious observers.
8
Design goals
Enable users to describe a privacy plan, i.e. how they allow their private information items to be disclosed. Solution: privacy models.
Alert users when they share information over social networks. Solution: passive monitor.
Monitor private information over various social networks to make sure the plans are not violated. Solution: active monitor.
9
Online social networks
We define two properties to describe online social networks (a sketch follows below):
Openness level (OL): how information in a social network can be accessed.
– E.g. OL = public: everyone can access
– E.g. OL = registration-required: all registered users can access, but not search engines
Access equivalency group: social networks with an identical openness level belong to the same group.
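A minimal sketch of these two properties, with names and levels of my own choosing (the friends-only level is an extra illustration, not from the slide): each network gets an openness level, and networks sharing an identical level fall into the same access equivalency group.

```python
from collections import defaultdict
from enum import IntEnum

class OpennessLevel(IntEnum):
    PUBLIC = 0                 # everyone, including search engines, can access
    REGISTRATION_REQUIRED = 1  # all registered users, but not search engines
    FRIENDS_ONLY = 2           # hypothetical stricter level for illustration

networks = {
    "PublicForum": OpennessLevel.PUBLIC,
    "Facebook": OpennessLevel.REGISTRATION_REQUIRED,
    "LinkedIn": OpennessLevel.REGISTRATION_REQUIRED,
}

def equivalency_groups(nets):
    """Group networks that share an identical openness level."""
    groups = defaultdict(set)
    for name, level in nets.items():
        groups[level].add(name)
    return dict(groups)

# PublicForum forms one group; Facebook and LinkedIn share another.
print(equivalency_groups(networks))
```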
10
Private information model
We define two private information models. The first is the multi-level model (a sketch follows below):
Private information items are managed in hierarchically organized categories.
Information flows from lower levels (less private) to higher levels (more private).
– E.g. when a user trusts an SN with level-3 items, s/he also trusts it with levels 1 and 2.
A simple model: easy for users to understand, but less descriptive.
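The following is a minimal sketch of the multi-level model under my reading of the slide: each item carries a privacy level, each network carries a trust level, and trusting a network at level 3 implies trusting it with level-1 and level-2 items as well. Item names and level numbers are illustrative.

```python
item_level = {"hobby": 1, "employer": 2, "health_record": 3}
network_trust = {"LinkedIn": 2, "PatientsLikeMe": 3, "PublicForum": 1}

def may_disclose(item, network):
    """An item may go to a network trusted at the item's level or higher."""
    return network_trust[network] >= item_level[item]

print(may_disclose("employer", "LinkedIn"))        # True  (level 2 <= trust 2)
print(may_disclose("health_record", "LinkedIn"))   # False (level 3 >  trust 2)
```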
11
Private information model
The second is the discretionary model, a set-based model (a sketch follows below):
Private information items are organized into sets.
Private information items in one set may be released together.
A private information item may belong to multiple sets.
Private information disclosure model: formally describes out-of-context information disclosure and information aggregation attacks under the discretionary model. Details: please refer to the paper.
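Below is a hedged sketch of the discretionary model: the user declares sets of items that may be released together, an item may appear in several sets, and a release is allowed only if its items fit inside some declared set. The "subset of one declared set" check is my reading of the slide, and the set contents are illustrative.

```python
privacy_sets = [
    {"name", "employer", "work_email"},      # professional set
    {"name", "hobby", "hometown"},           # leisure set
    {"nickname", "health_condition"},        # health set (no real name)
]

def release_allowed(items):
    """Allow a release only if the items fit together inside one declared set."""
    return any(set(items) <= s for s in privacy_sets)

print(release_allowed({"name", "hobby"}))             # True
print(release_allowed({"name", "health_condition"}))  # False: would aggregate
                                                      # identity with health data
```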
12
Privacy monitor: the proposal
13
Privacy sandbox
Picks a privacy model.
Allows users to describe their privacy plan in that model, i.e. how they want to arrange their private information items:
– E.g. define private information sets under the discretionary model
– Define how sets may be released to social networks with different openness levels
Keeps the privacy plans (a sketch follows below).
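A minimal sketch of the sandbox as a plan store: the user picks a model (here the discretionary one), defines sets, and maps each set to the most open openness level it may be released to. The class name, plan format, and level labels are assumptions for illustration.

```python
class PrivacySandbox:
    def __init__(self):
        self.plans = {}   # user -> {"model": ..., "sets": ..., "max_openness": ...}

    def save_plan(self, user, sets, max_openness_per_set):
        self.plans[user] = {"model": "discretionary",
                            "sets": sets,
                            "max_openness": max_openness_per_set}

    def plan_for(self, user):
        return self.plans[user]

sandbox = PrivacySandbox()
sandbox.save_plan(
    "alice",
    sets={"professional": {"name", "employer"}, "health": {"nickname", "condition"}},
    max_openness_per_set={"professional": "registration_required", "health": "friends_only"},
)
print(sandbox.plan_for("alice")["max_openness"]["health"])   # friends_only
```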
14
Passive monitor
The passive monitor is triggered when users send information to social networks (a sketch follows below).
Alerts users about who can access the submitted information:
– Openness level
– Access equivalency group
Checks the submission against the privacy plan.
Keeps a local log of private information disclosure, for future use.
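The following is a hedged sketch of the passive monitor: it intercepts an outgoing post, tells the user how open the target network is, checks the items against the plan, and appends the event to a local disclosure log. All data structures and the warning text are illustrative.

```python
disclosure_log = []   # local log of what was disclosed where, for the active monitor

def passive_monitor(user, network, openness, items, plan):
    """Warn before submission and record the disclosure locally."""
    print(f"{network} has openness level '{openness}': everyone in that "
          f"access equivalency group can read this.")
    allowed = any(set(items) <= s for s in plan["sets"].values())
    if not allowed:
        print("Warning: this post releases items outside any declared privacy set.")
    disclosure_log.append({"user": user, "network": network, "items": set(items)})
    return allowed

plan = {"sets": {"professional": {"name", "employer"}}}
passive_monitor("alice", "PublicForum", "public", {"name", "health_condition"}, plan)
```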
15
Remote component and active monitor
Remote component:
– Actively collects personal information from various social networks
– Simulates in-network and cross-network information aggregation
– Stores the information in a data repository
Active monitor (a sketch follows below):
– Compares users' privacy plans with the local log, the remote data repository, and search engine results
– Checks for discrepancies and warns users about unwanted information disclosure
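A minimal sketch of the active monitor's discrepancy check: anything the remote component (or a search engine) finds about the user that is not explained by the local disclosure log is flagged as unwanted disclosure. The data shapes and function name are assumptions.

```python
def active_monitor(user, local_log, remote_findings):
    """Flag information observed in the wild that the user never knowingly posted."""
    disclosed = set()
    for entry in local_log:
        if entry["user"] == user:
            disclosed |= entry["items"]
    leaked = remote_findings - disclosed
    for item in leaked:
        print(f"Warning: '{item}' about {user} is visible online but is not in the local log.")
    return leaked

local_log = [{"user": "alice", "items": {"name", "employer"}}]
remote_findings = {"name", "employer", "hometown"}   # e.g. aggregated by the remote component
active_monitor("alice", local_log, remote_findings)  # warns about 'hometown'
```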
16
Conclusion
In this paper, we:
– present privacy vulnerabilities over social networks, especially information aggregation attacks
– model social networks and private information disclosure from an access control perspective
– describe information aggregation attacks in that model
– propose our initial design of a privacy monitor
This is our preliminary proposal; further analysis and implementation are ongoing.
Thanks a lot!