Feeling-based Location Privacy Protection for Location-based Services
CS587x Lecture, Department of Computer Science, Iowa State University, Ames, IA 50011

Location-based Services

Dilemma
- Users have to report their locations to LBS providers
- LBS providers may abuse the collected location data

Location Exposure Presents Significant Threats
- Threat 1: anonymity of service use
  - A user may not want to be identified as the subscriber
  - E.g., "where is the nearest ..."
- Threat 2: location privacy
  - A user may not want to reveal where she is
  - E.g., a query is sent from ...

Restricted Space Identification
- A user's location can be correlated to her identity
  - E.g., a location inside a private property indicates the user is most likely the property owner
- A single location sample may not be linkable to an individual, but a time-series sequence of samples can identify her
- Once the user is identified, all her visits may be disclosed

Location Depersonalization
- To protect anonymous use of service: cloak the service user with her neighbors
  - This can still leak location privacy
- To protect location privacy: cloak the service user with nearby footprints
  - The adversary cannot know who is there when the service is requested

Motivation
- Privacy modeling
  - Existing schemes let users specify their desired privacy with a number K
  - Privacy is a matter of personal feeling, and it is difficult for users to choose a K value
- Robustness
  - Merely ensuring that each cloaking region has been visited by K people may NOT provide protection at level K
  - Protection also depends on the distribution of footprints

Our Solution
- Feeling-based modeling: a user specifies a public region
  - A spatial region the user feels comfortable having reported as her location, should she request a service inside it
- The public region becomes her privacy requirement
  - Every location reported on her behalf will be at least as popular as the public region she identifies

Challenge
- How do we measure the privacy level of a region?
- The privacy level is determined by
  - the number of visitors
  - the distribution of their footprints
- A good measure should involve both factors

Entropy
- We borrow the concept of entropy from information theory
- The entropy of a region R is computed from the numbers of footprints in R belonging to different users
- E(R) = -Σ_i p_i log₂ p_i, where p_i is the fraction of the footprints in R that belong to user i
- Its value denotes the amount of information the adversary needs to identify the client

Popularity
- The popularity of R is P(R) = 2^E(R)
- Its value denotes the effective number of users among whom the client is indistinguishable
- Popularity is a good measure of privacy (see the sketch below):
  - More visitors means higher popularity
  - A more even footprint distribution means higher popularity
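A minimal computational sketch (not from the slides) of the entropy and popularity measures; the function name and the footprint representation, a list with one user id per footprint in R, are illustrative assumptions.

```python
import math
from collections import Counter

def popularity(footprint_owners):
    """Entropy-based popularity of a region.  `footprint_owners` lists the
    owner (user id) of every footprint found inside the region."""
    counts = Counter(footprint_owners)
    total = sum(counts.values())
    entropy = -sum((n / total) * math.log2(n / total) for n in counts.values())
    return 2 ** entropy   # P(R) = 2^E(R)

# Four footprints spread evenly over four users -> popularity 4.0;
# four footprints dominated by one user -> popularity ~1.75.
print(popularity(["u1", "u2", "u3", "u4"]))
print(popularity(["u1", "u1", "u1", "u2"]))
```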

Location Cloaking with Our Privacy Model
- Sporadic LBSs: each location update is independent
  - Cloaking strategy: ensure each reported location is a region whose popularity is no less than P(R)
- Continuous LBSs: a sequence of location updates that forms a trajectory
  - The strategy for sporadic LBSs may not work
  - The adversary may intersect the cloaking regions' visitor sets and identify the common set of visitors

P-Populous Trajectory
- Compute the popularity of the cloaking boxes with respect to a common user set, called the cloaking set
- Only the footprints of users in the cloaking set are considered in the entropy computation
- The entropy of R w.r.t. cloaking set U, E_U(R), is computed as above but counts only footprints of users in U
- The popularity w.r.t. U is P_U(R) = 2^E_U(R)
- P-Populous Trajectory (PPT): the popularity of each cloaking box in the trajectory, w.r.t. one cloaking set, is no less than P(R) (see the check sketched below)
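Assuming the popularity() helper from the earlier sketch, a PPT check could look as follows; representing each cloaking box by the list of owners of the footprints it contains is an assumption made for illustration.

```python
def popularity_wrt(footprint_owners, cloaking_set):
    """Popularity restricted to a cloaking set U: footprints of users
    outside U are ignored."""
    return popularity([u for u in footprint_owners if u in cloaking_set])

def is_p_populous(trajectory_boxes, cloaking_set, p_required):
    """True if every cloaking box (given as the list of footprint owners it
    contains) has popularity w.r.t. the common cloaking set >= P(R)."""
    return all(popularity_wrt(box, cloaking_set) >= p_required
               for box in trajectory_boxes)
```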

System Structure

Footprint Indexing
- Grid-based pyramid structure: 4^(i-1) cells at level i
- Cells at the bottom level keep the footprint index (a sketch follows)
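A rough sketch of such a pyramid over a square service area; the class, method, and attribute names are illustrative, not from the slides.

```python
class PyramidIndex:
    """Grid-based pyramid: level i has 4**(i-1) cells (level 1 is a single
    cell); only the bottom level stores footprints."""

    def __init__(self, area_size, levels):
        self.area_size = area_size
        self.levels = levels
        # (row, col) at the bottom level -> list of (user_id, x, y)
        self.footprints = {}

    def cell_at(self, level, x, y):
        """(row, col) of the level-`level` cell containing point (x, y)."""
        cells_per_side = 2 ** (level - 1)      # 4**(level-1) cells in total
        size = self.area_size / cells_per_side
        return int(y // size), int(x // size)

    def add_footprint(self, user_id, x, y):
        cell = self.cell_at(self.levels, x, y)  # index only at the bottom level
        self.footprints.setdefault(cell, []).append((user_id, x, y))
```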

Trajectory Cloaking
- To receive an LBS, a client needs to submit
  - a public region R
  - a travel bound B
  - location updates, repeatedly, during her travel
- In response, the server will
  - generate a cloaking box for each location update
  - ensure the sequence of cloaking boxes forms a PPT
(A rough interface sketch follows.)
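The exchange could be modeled as below; the message and field names are assumptions rather than anything specified on the slides.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Box = Tuple[float, float, float, float]   # (min_x, min_y, max_x, max_y)

@dataclass
class ServiceRequest:
    """What the client submits when starting an LBS session."""
    public_region: Box    # region R whose popularity P(R) is the privacy requirement
    travel_bound: Box     # bound B on where the client will travel

@dataclass
class CloakingSession:
    """Server-side state for one session: one cloaking box per location
    update, which together must form a P-populous trajectory."""
    request: ServiceRequest
    cloaking_set: List[str] = field(default_factory=list)
    cloaking_boxes: List[Box] = field(default_factory=list)
```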

Challenge
- How do we find the cloaking set?
- Basic solution: pick the users whose footprints are closest to the service user
  - The cloaking resolution may become worse as she travels
  - There may exist another cloaking set that leads to a finer average resolution

Selecting the Cloaking Set
- Observation
  - Popular users: those whose footprints span the entire travel bound B
  - Cloaking with popular users tends to give a fine cloaking resolution, since their footprints can be found close to the service user no matter where she moves
- Idea: use the most popular users as the cloaking set

Finding the Most Popular Users
- l-popular: the user has visited all cells at level l that overlap with B
- A larger l means a more popular user
  - E.g., u1, u2, u3 are 2-popular; u2, u3 are 3-popular; u3 is 4-popular
- Strategy: sort users by their level l and choose the most popular ones as the cloaking set (sketched below)
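A sketch of this ranking, assuming both the user's visited cells and the cells overlapping B are available as per-level sets of (row, col) indices; all names are illustrative.

```python
def popularity_level(visited_cells, bound_cells_by_level):
    """Largest l such that the user has visited every level-l cell that
    overlaps the travel bound B (l-popularity is monotone: covering all
    finer cells implies covering all coarser ones)."""
    level = 0
    for l in sorted(bound_cells_by_level):
        if bound_cells_by_level[l] <= visited_cells.get(l, set()):
            level = l
        else:
            break
    return level

def select_cloaking_set(users, bound_cells_by_level, size):
    """Sort users by popularity level (most popular first) and take the
    top `size` of them as the cloaking set."""
    ranked = sorted(users.items(),
                    key=lambda kv: popularity_level(kv[1], bound_cells_by_level),
                    reverse=True)
    return [user_id for user_id, _ in ranked[:size]]
```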

Cloaking the Client's Location
- Let S be the cloaking set and p the client's location; p is cloaked in three steps (see the sketch below):
  1. Find the closest footprint to p for each user in S
  2. Compute the minimal bounding box b of these footprints
  3. Calculate P_S(b)
     - If P_S(b) < P(R): for each user, find her closest footprint to p among her footprints outside b, and go to step 2
     - If P_S(b) ≥ P(R): report b as the client's location
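A sketch of this loop, reusing popularity() from the earlier sketch; the data layout (per-user footprint lists) and the early exit when no footprints remain outside b are assumptions.

```python
def cloak_location(p, cloaking_set, footprints_by_user, p_required):
    """`p` is the client's (x, y) position, `footprints_by_user[u]` lists
    the (x, y) footprints of user u, and `p_required` is P(R)."""
    # Step 1: the closest footprint to p for each user in the cloaking set S.
    selected = [min(footprints_by_user[u], key=lambda f: dist(f, p))
                for u in cloaking_set]
    while True:
        box = bounding_box(selected)           # Step 2: minimal bounding box b
        # Step 3: P_S(b), counting every footprint of S's users inside b.
        owners = [u for u in cloaking_set
                    for f in footprints_by_user[u] if inside(box, f)]
        if popularity(owners) >= p_required:
            return box                         # b is reported as the client's location
        # Otherwise pull in each user's closest footprint outside b and retry.
        grew = False
        for u in cloaking_set:
            outside = [f for f in footprints_by_user[u] if not inside(box, f)]
            if outside:
                selected.append(min(outside, key=lambda f: dist(f, p)))
                grew = True
        if not grew:
            return None                        # required popularity is unreachable

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def bounding_box(points):
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return min(xs), min(ys), max(xs), max(ys)

def inside(box, f):
    x0, y0, x1, y1 = box
    return x0 <= f[0] <= x1 and y0 <= f[1] <= y1
```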

Simulation
- Two other strategies are implemented for comparison
  - Naive: cloaks each location independently
  - Plain: selects the cloaking set from the footprints closest to the service user's start position
- Performance metrics: cloaking area and protection level

Experiment
- Location privacy aware gateway (LPAG): a prototype that integrates location privacy protection into a real LBS system
- Two software components; the LBS system is a spatial messaging application

Conclusion
- Feeling-based privacy modeling for location privacy protection in LBSs: a public region instead of a K value
- Trajectory cloaking: algorithm, simulation, and experiment
- Future work
  - Investigate attacks other than restricted space identification, e.g., the observation implication attack