Privacy Preserving in LBS
Evaluating the Privacy of LBS Algorithms in a Dynamic Context
Outline
- Introduction
- Design Model & Workflow System
- Design Specification: General Approach
- Build Privacy Case-Based Database
- Conclusion & Future Work
Introduction (1): What is Context? [1]
Context is any information that can be used to characterize the situation of an entity. An entity is a person, place, or object that is considered relevant to the interaction between a user and an application, including the user and the application themselves.
Introduction (2): The Problem of Privacy Preservation in a Dynamic Context
Different services require different privacy algorithms, and even a single service may need different algorithms as its context changes. How can we evaluate privacy algorithms in a dynamic context?
Outline
- Introduction
- Design Model & Workflow System
  - Design Model System
  - Workflow System
- Design Specification: General Approach
- Build Privacy Case-Based Database
- Conclusion & Future Work
Design Model System (1)
Evaluation Module [2][3]
Workflow System
Outline
- Introduction
- Design Model & Workflow System
- Design Specification: General Approach
- Build Privacy Case-Based Database
- Conclusion & Future Work
Design Specification: General Approach
- Introduction to Privacy Attack Models
- Location Distribution Attack
- Maximum Movement Boundary Attack
- Query Tracking Attack
- Message attributes summary
Introduction to Privacy Attack Models
Privacy attacks are categorized by their privacy attack model (adversary model). Attack models differ in:
- The target information that is collected.
- The attacker's ability to capture messages during service provisioning.
- The attacker's background knowledge.
Privacy Attack Models Introduction (cont.)

Location Distribution Attack
- Conditions / when it takes place: user locations are known; some users have outlier locations; the employed spatial cloaking tends to generate minimum areas.
- Appropriate techniques & algorithms: add the k-sharing region property. Algorithm: CliqueCloak, …

Maximum Movement Boundary (MMB) Attack
- Conditions / when it takes place: continuously captured queries; the same pseudonym is used for two consecutive updates; the maximum speed is known.
- Appropriate techniques & algorithms: add the safe update property. Techniques: patching, delaying, …

Query Tracking Attack
- Appropriate techniques & algorithms: add the memorization property.
Privacy Attack Models Content
- Introduction
- Location Distribution Attack
- Maximum Movement Boundary Attack
- Query Tracking Attack
- Message attributes summary
Privacy Attack Models: Location Distribution Attack
The location distribution attack takes place when:
- User locations are known.
- Some users have outlier locations.
- The employed spatial cloaking algorithm tends to generate minimum areas.
Given a cloaked spatial region covering a sparse area (user A) and a partially dense area (users B, C, and D), an adversary can easily figure out that the query issuer is the outlier.
Solution to Location Distribution Attack: the k-Sharing Property
k-sharing region property: a cloaked spatial region not only contains at least k users, it is also shared by at least k of these users. Since the same cloaked spatial region is produced for all k users, an adversary cannot link the region to an outlier. This results in an overall more privacy-aware environment. An example of a technique that is free from this attack is CliqueCloak.
Solution to Location Distribution Attack: the CliqueCloak Algorithm
Each user requests:
- A level of anonymity k.
- A constraint area.
Build an undirected constraint graph: two nodes are linked if each one's position lies within the other's constraint area. For a new user m, add m to the graph, then find the set of nodes that are linked to m and whose anonymity level is at most m.k. The cloaked region is the MBR (cloaking box) that includes the user and these neighboring nodes; all users within an MBR use that MBR as their cloaked region.
Solution to Location Distribution Attack: CliqueCloak Pseudo-code

while TRUE do
    pick a message m from S
    N ← all messages in range B(m)              // build the constraint graph G
    for each n in N do
        if P(m) is in B(n) then
            add the edge (m, n) into G
    M ← local_k_search(m.k, m, G)               // find a subset M of S s.t. m ∈ M, m.k = |M|,
                                                //   n.k ≤ |M| for each n in M, and M forms a clique in G
    if M ≠ Ø then
        Bcl(M) ← the minimal area that contains M
        for each n in M do                      // build transformed messages from all messages in M
            remove n from S
            remove n from G
            nT ← <n.uid, n.rno, Bcl(M), n.C>
            output transformed message nT
    remove expired messages from S
Solution to Location Distribution Attack: CliqueCloak Pseudo-code (cont.)

local_k_search(k, m, G)
    U ← { n | (m, n) is an edge in G and n.k ≤ k }    // group of neighbors of m in G whose
                                                       //   anonymity value does not exceed k
    if |U| < k-1 then return Ø
    l ← 0
    while l ≠ |U| do                                   // remove members of U with fewer than
        l ← |U|                                        //   k-2 neighbors in U, which cannot be
        for each u in U do                             //   part of a (k-1)-clique
            if |{neighbors of u in U}| < k-2 then
                U ← U \ {u}
    find any subset M ⊆ U s.t. |M| = k-1 and M ∪ {m} forms a clique    // look for a k-clique
    return M ∪ {m}
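To make the pseudo-code concrete, here is a minimal runnable sketch of local_k_search in Python; the adjacency-set graph representation and the msg_k map are illustrative assumptions, not part of the original paper:

from itertools import combinations

def local_k_search(k, m, adj, msg_k):
    """Find a k-clique containing m among neighbors with anonymity level <= k.

    adj   : dict mapping message id -> set of neighbor ids (constraint graph G)
    msg_k : dict mapping message id -> requested anonymity level
    Returns the clique as a set, or None if no k-clique exists.
    """
    # Neighbors of m whose own anonymity requirement does not exceed k.
    U = {n for n in adj[m] if msg_k[n] <= k}
    if len(U) < k - 1:
        return None
    # Iteratively prune nodes with fewer than k-2 neighbors inside U;
    # such nodes can never belong to a (k-1)-clique within U.
    changed = True
    while changed:
        changed = False
        for u in list(U):
            if len(adj[u] & U) < k - 2:
                U.discard(u)
                changed = True
    # Any (k-1)-subset of U that is itself a clique forms, with m, a k-clique
    # (m is adjacent to all of U by construction).
    for M in combinations(U, k - 1):
        if all(b in adj[a] for a, b in combinations(M, 2)):
            return set(M) | {m}
    return None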
Solution to Location Distribution Attack: CliqueCloak Message Specification
A plain message (from client to server) m consists of:
- m.uid = unique identifier of the sender
- m.rno = message's reference number
- P(m) = message's spatial point (e.g. the client's current location)
- B(m) = message's spatial constraint area
- m.t = message's temporal constraint (expiration time)
- m.C = message's content
- m.k = message's anonymity level
Solution to Location Distribution Attack: CliqueCloak Message Specification (cont.)
A transformed message (from server to database) mT consists of:
- m.uid, m.rno
- Bcl(m) = message's spatial cloaking box
- m.C
Solution to Location Distribution Attack: Evaluation of CliqueCloak
Pros:
- Free from the location distribution attack (query sampling attack).
Cons:
- High computational cost: searching for a clique in a graph is expensive, so it can only support k-anonymity up to about k = 10.
- Requests that cannot be anonymized are dropped when their lifetimes expire.
Privacy Attack Models Content
- Introduction
- Location Distribution Attack
- Maximum Movement Boundary Attack
- Query Tracking Attack
- Message attributes summary
Privacy Attack Models: Maximum Movement Boundary Attack
The maximum movement boundary attack takes place when:
- Continuous location updates or continuous queries are considered.
- The same pseudonym is used for two consecutive updates.
- The maximum possible speed is known.
The maximum speed is used to derive a maximum movement boundary (MMB) around the previous cloaked region Ri. The user must lie in the intersection of the MMB with the new cloaked region Ri+1, so the adversary can say "I know you are here!"
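A minimal sketch of the adversary's computation, assuming axis-aligned rectangular regions and an MMB approximated by expanding the previous region by v_max * dt in every direction (the Rect type and all names are illustrative):

from dataclasses import dataclass

@dataclass
class Rect:
    x1: float; y1: float; x2: float; y2: float  # lower-left / upper-right corners

def mmb(prev: Rect, v_max: float, dt: float) -> Rect:
    """Maximum movement boundary: everywhere reachable from prev within dt."""
    r = v_max * dt
    return Rect(prev.x1 - r, prev.y1 - r, prev.x2 + r, prev.y2 + r)

def intersect(a: Rect, b: Rect) -> Rect | None:
    """Intersection of two rectangles, or None if they are disjoint."""
    x1, y1 = max(a.x1, b.x1), max(a.y1, b.y1)
    x2, y2 = min(a.x2, b.x2), min(a.y2, b.y2)
    return Rect(x1, y1, x2, y2) if x1 < x2 and y1 < y2 else None

# The adversary shrinks the new cloaked region Ri+1 to the part reachable
# from Ri, which may be much smaller than Ri+1 itself.
R_i, R_ip1 = Rect(0, 0, 2, 2), Rect(5, 0, 9, 4)
print(intersect(mmb(R_i, v_max=1.0, dt=4.0), R_ip1))  # roughly Rect(5, 0, 6, 4)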
Solution to Maximum Movement Boundary Attack: the Safe Update Property [4]
Two consecutive cloaked regions Ri and Ri+1 from the same user are free from the maximum movement boundary attack if one of these three conditions holds:
- The overlapping area of Ri and Ri+1 satisfies the user's requirements.
- Ri totally covers Ri+1.
- The MMB of Ri totally covers Ri+1.
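Continuing the rectangle sketch above, a hedged check of the three safe-update conditions; the min_overlap threshold stands in for "the overlapping area satisfies user requirements":

def covers(outer: Rect, inner: Rect) -> bool:
    """True if outer completely contains inner."""
    return (outer.x1 <= inner.x1 and outer.y1 <= inner.y1 and
            outer.x2 >= inner.x2 and outer.y2 >= inner.y2)

def area(r: Rect | None) -> float:
    return 0.0 if r is None else (r.x2 - r.x1) * (r.y2 - r.y1)

def safe_update(r_i: Rect, r_ip1: Rect, v_max: float, dt: float,
                min_overlap: float) -> bool:
    """Safe update property: any one of the three conditions suffices."""
    return (area(intersect(r_i, r_ip1)) >= min_overlap   # condition 1: enough overlap
            or covers(r_i, r_ip1)                        # condition 2: Ri covers Ri+1
            or covers(mmb(r_i, v_max, dt), r_ip1))       # condition 3: MMB of Ri covers Ri+1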
Solution to Maximum Movement Boundary Attack: Patching and Delaying [4][9]
- Patching: combine the current cloaked spatial region with the previous one.
- Delaying: postpone the update until the MMB covers the current cloaked spatial region.
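The same sketch extended with the two techniques: patching reports the bounding box of both regions, and delaying waits until the growing MMB covers the new region. This illustrates the idea, not the papers' exact procedures:

def patch(r_i: Rect, r_ip1: Rect) -> Rect:
    """Patching: combine the current cloaked region with the previous one."""
    return Rect(min(r_i.x1, r_ip1.x1), min(r_i.y1, r_ip1.y1),
                max(r_i.x2, r_ip1.x2), max(r_i.y2, r_ip1.y2))

def extra_delay(r_i: Rect, r_ip1: Rect, v_max: float, dt: float) -> float:
    """Delaying: smallest extra wait until MMB(Ri) covers Ri+1 (0 if already safe)."""
    t = dt
    while not covers(mmb(r_i, v_max, t), r_ip1):
        t += 0.1  # coarse time step, good enough for a sketch
    return t - dt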
Solution to Maximum Movement Boundary Attack: Using the ICliqueCloak Algorithm [10]
Main idea: incrementally maintain maximal cliques for location cloaking in an undirected graph that takes into account the effect of continuous location updates.
Use a graph model to formulate the problem:
- Each mobile user is represented by a node in the graph.
- An edge exists between two nodes/users only if they are within the MMB of each other and can potentially be cloaked together.
Solution to Maximum Movement Boundary Attack: Using the ICliqueCloak Algorithm [10]
Graph modeling: let G(V, E) be an undirected graph where V is the set of nodes/users who submitted location-based query requests and E is the set of edges. There exists an edge e_vw between two nodes/users v and w if and only if:
- v ≠ w
- v is covered by MMB(w, t_{i-1}, t_i)
- w is covered by MMB(v, t_{i-1}, t_i)
- Area(MBR(v, w)) < A_max
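The edge test as a sketch, reusing the Rect helpers from the earlier sketches; user positions are modeled as (possibly degenerate) rectangles, which is an assumption of this sketch rather than the paper's formulation:

def edge_exists(v_reg: Rect, w_reg: Rect, v_mmb: Rect, w_mmb: Rect,
                a_max: float) -> bool:
    """Edge e_vw exists iff each user lies inside the other's MMB and their
    joint bounding box stays below the tolerated maximum area A_max."""
    joint = patch(v_reg, w_reg)   # MBR(v, w): bounding box of both users
    return (covers(w_mmb, v_reg) and
            covers(v_mmb, w_reg) and
            area(joint) < a_max)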
Solution to Maximum Movement Boundary Attack: Using the ICliqueCloak Algorithm [10]
A maximal clique is a clique that is not contained in any other clique.
- Start with a graph without any edges; all nodes by themselves constitute a set of 1-node cliques.
- Add the edges to the graph one by one and incrementally update the set of maximal cliques.
- The cliques involving the user of the new request are candidate cloaking sets, classified into three classes: positive candidates, negative candidates, and non-candidates.
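A non-incremental simplification in Python: the real algorithm maintains maximal cliques incrementally, while this sketch lets networkx re-enumerate the maximal cliques containing the new request. The positive/negative split below is a rough reading of the candidate classes, not the paper's exact definitions:

import networkx as nx

def candidate_cloaking_sets(G: nx.Graph, new_user, k: int):
    """Maximal cliques containing new_user. A clique of size >= k can cloak
    the request now ('positive'); smaller cliques may grow later ('negative')."""
    cliques = [set(c) for c in nx.find_cliques(G) if new_user in c]
    positive = [c for c in cliques if len(c) >= k]
    negative = [c for c in cliques if len(c) < k]
    return positive, negative

G = nx.Graph()
G.add_edges_from([("u1", "u2"), ("u2", "u3"), ("u1", "u3"), ("u3", "u4")])
print(candidate_cloaking_sets(G, "u3", k=3))  # ([{u1, u2, u3}], [{u3, u4}])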
Solution to Maximum Movement Boundary Attack: Using the ICliqueCloak Algorithm [10]
Performance evaluation (charts omitted).
Maximum Movement Boundary Attack Attributes
- Location
- Time
- Maximum velocity
- Privacy level k
- User-tolerated maximum area Amax
Privacy Attack Models Content
- Introduction
- Location Distribution Attack
- Maximum Movement Boundary Attack
- Query Tracking Attack
- Message attributes summary
Query Attacks
- Location k-anonymity: Interval Cloak, CliqueCloak, Uncertainty Cloaking, …
- Query attacks: query sampling attacks, query homogeneity attacks, query tracking attacks.
Most existing research has focused on designing an efficient cloaking algorithm that achieves location k-anonymity. The idea of location k-anonymity is to hide the real user among other users, so that an attacker cannot tell which of the k users sent the service request. To achieve this, cloaking regions are used (Interval Cloak, CliqueCloak, Uncertainty Cloaking, …). However, privacy is still affected by the following attacks.
Query Homogeneity Attacks [12]
This is the homogeneity attack known from database anonymization: when all records in an anonymity group share the same sensitive value, k-anonymity hides the identity but not the value. The same situation arises in LBS when all users in a cloaked region are associated with the same query content.
Query Tracking Attacks [4]
This attack takes place when:
- Continuous location updates or continuous queries are considered.
- The same pseudonym is used for several consecutive updates.
- User locations are known.
Once a query is issued, all users in the query region are candidates to be the query issuer. If the query is reported again, intersecting the candidate sets between the query instances reduces the user's privacy.
Example: the candidates are {A, B, C, D, E} at time ti, {A, B, F, G, H} at time ti+1, and {A, F, G, H, I} at time ti+2; the intersection identifies A as the issuer.
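The attack itself is a set intersection; using the candidate sets from the slide:

from functools import reduce

snapshots = [{"A", "B", "C", "D", "E"},   # candidates at time ti
             {"A", "B", "F", "G", "H"},   # candidates at time ti+1
             {"A", "F", "G", "H", "I"}]   # candidates at time ti+2

# Intersecting the candidate sets across query instances isolates the issuer.
print(reduce(set.intersection, snapshots))  # {'A'}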
Solutions
- Memorizing
- m-Invariance
- Historical k-Anonymity
- …
Memorizing [4]
Remember the set of users S contained in the cloaked spatial region when the query is initially registered with the database server, and adjust the subsequent cloaked spatial regions to contain at least k of these users. If a user s is no longer contained in a subsequent cloaked spatial region, that user is immediately removed from S. This may result in a very large cloaked spatial region; at some point, the server may decide to disconnect the query and restart it with a new identity.
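A hedged sketch of the memorization bookkeeping; cover, expand, and too_large are assumed helpers (they depend on the cloaking algorithm and are not specified on the slide):

def memorized_update(S, region, k, cover, expand, too_large):
    """Adjust the next cloaked region so it keeps at least k memorized users.

    S         : users memorized when the query was first registered
    cover     : cover(region) -> set of users inside the region
    expand    : expand(region) -> a larger cloaked region
    too_large : predicate that triggers a restart with a new identity
    """
    while len(S & cover(region)) < k:
        if too_large(region):
            return None                      # disconnect and restart the query
        region = expand(region)
    return S & cover(region), region         # memorized users who left are dropped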
Query m-Invariance [11][13]
Query ℓ-diversity ensures that a user cannot be linked to fewer than ℓ distinct service attribute values. Trade-off: privacy vs. QoS.
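A one-function sketch of the ℓ-diversity check on a cloaked set, where queries is a list of (user, service attribute value) pairs (names are illustrative):

def is_l_diverse(queries, l: int) -> bool:
    """A cloaked set is query l-diverse if it mixes at least l distinct
    service attribute values, so no user links to fewer than l values."""
    return len({attr for _user, attr in queries}) >= l

print(is_l_diverse([("A", "bar"), ("B", "hospital"), ("C", "bar")], 3))  # False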
Query m-Invariance (cont.)
Satisfying location k-anonymity and query ℓ-diversity: an example of generated cloaked regions that are query 3-diverse and location 3-anonymous. However, they are still vulnerable to the query tracking attack.
Query m-Invariance (cont.)
Query m-invariance: a user's consecutive queries are associated with the same (invariant) set of at least m distinct service attribute values. Since intersections across query instances can no longer eliminate values, associating a user with more service attribute values increases the attacker's uncertainty about the actual query.
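A companion sketch to the ℓ-diversity check: verifying that the set of service attribute values stays invariant, and large enough, across the query instances of one user (again with illustrative names):

def is_m_invariant(snapshots, m: int) -> bool:
    """snapshots: one set of service attribute values per cloaked query
    instance. Invariance: identical sets of size >= m throughout."""
    first = snapshots[0]
    return len(first) >= m and all(s == first for s in snapshots[1:])

print(is_m_invariant([{"bar", "cafe", "atm"}] * 3, 3))       # True
print(is_m_invariant([{"bar", "cafe"}, {"bar", "atm"}], 2))  # False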
Attributes
A plain message sent from a user consists of:
- Id: unique identifier of the sender.
- Ref: message's reference number.
- P: message's spatial point (e.g. the user's current location).
- C: message's content.
- k: message's anonymity level.
- ℓ: message's diversity level.
- m: message's invariance level.
Privacy Attack Models Content
- Introduction
- Location Distribution Attack
- Maximum Movement Boundary Attack
- Query Tracking Attack
- Message attributes summary
Attack Model Privacy: Message Attributes Summary
A plain message sent from a user must consist of 11 attributes:
- Id: unique identifier of the sender.
- Ref: message's reference number.
- P: message's spatial point (e.g. the user's current location).
- B: message's spatial constraint area.
- t: message's temporal constraint (expiration time).
- v: velocity / maximum speed.
- QoS: quality of service.
- C: message's content.
- k: message's anonymity level.
- ℓ: message's diversity level.
- m: message's invariance level.
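The eleven attributes as a Python dataclass, a minimal sketch of the message record; the field types are assumptions (the slides do not fix them), and Rect is reused from the earlier sketches:

from dataclasses import dataclass
from typing import Tuple

@dataclass
class PlainMessage:
    uid: str                 # Id: unique identifier of the sender
    ref: int                 # Ref: message's reference number
    p: Tuple[float, float]   # P: spatial point (user's current location)
    b: Rect                  # B: spatial constraint area
    t: float                 # t: temporal constraint (expiration time)
    v: float                 # v: velocity / maximum speed
    qos: float               # QoS: required quality of service
    c: str                   # C: message content
    k: int                   # k: anonymity level
    l: int                   # l: diversity level
    m: int                   # m: invariance level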
Outline
- Introduction
- Design Model & Workflow System
- Design Specification: General Approach
- Build Privacy Case-Based Database
- Conclusion & Future Work
Build Privacy Case-Based Database
From the attack models and attributes we found, a case will include:
- Input attributes
- Graph
- The algorithm used to protect privacy
Specification:
- Define an interval for each attribute.
- Define properties that the input must satisfy.
Notes:
- To reduce computation, we calculate only on the subgraph related to the query issuer.
- The database deletes queries that have expired.
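One possible shape for a case record, sketched under these assumptions (intervals as (low, high) pairs, properties as predicates; all names are illustrative):

from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class PrivacyCase:
    intervals: Dict[str, Tuple[float, float]]   # valid interval per input attribute
    properties: List[Callable] = field(default_factory=list)  # predicates the input must satisfy
    algorithm: str = ""                          # privacy algorithm the case prescribes

    def matches(self, attrs: Dict[str, float]) -> bool:
        """A query matches the case if every attribute falls in its interval
        and all required properties hold."""
        in_range = all(lo <= attrs[name] <= hi
                       for name, (lo, hi) in self.intervals.items())
        return in_range and all(p(attrs) for p in self.properties)

case = PrivacyCase(intervals={"k": (2, 10), "v": (0, 30)},
                   properties=[lambda a: a["k"] <= 10],
                   algorithm="CliqueCloak")
print(case.matches({"k": 5, "v": 12}))  # True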
Outline
- Introduction
- Design Model & Workflow System
- Design Specification: General Approach
- Build Privacy Case-Based Database
- Conclusion & Future Work
Conclusion
Evaluating privacy algorithms in a dynamic context needs a flexible technique:
- Case-based calculation
- Ontology reasoner
Attack models are the core component of the case-based calculation.
Future Work
- Continue the case-base specification: research other attack models; study the User, CP, and SP.
- Select an appropriate structure for the case-base data (e.g. a tree structure in which a parent node represents a more general case).
- Specify the ontology reasoner.
References
[1] Anind K. Dey and Gregory D. Abowd. Towards a Better Understanding of Context and Context-Awareness. Graphics, Visualization and Usability Center and College of Computing, Georgia Tech, Atlanta, GA, USA, 2000.
[2] Yonnim Lee and Ohbyung Kwon. An index-based privacy preserving service trigger in context-aware computing environments. Expert Systems with Applications 37, pp. 5192–5200, 2010.
[3] Claudio Bettini, Linda Pareschi, and Daniele Riboni. Efficient profile aggregation and policy evaluation in a middleware for adaptive mobile applications. Pervasive and Mobile Computing, Oct. 2008.
[4] Mohamed F. Mokbel. Privacy in Location-based Services: State-of-the-art and Research Directions. International Conference on Mobile Data Management.
[5] B. Gedik and L. Liu. A Customizable k-Anonymity Model for Protecting Location Privacy. Proc. IEEE Int'l Conf. Distributed Computing Systems (ICDCS '05), 2005.
[6] B. Gedik and L. Liu. Location Privacy in Mobile Systems: A Personalized Anonymization Model. ICDCS, 2005.
References (cont.)
[7] Z. Xiao, X. Meng, and J. Xu. Quality Aware Privacy Protection for Location-based Services. Proc. 12th Int. Conf. on Database Systems for Advanced Applications (DASFAA '07), Bangkok, Thailand, April 2007.
[8] Chi-Yin Chow and Mohamed F. Mokbel. Enabling Private Continuous Queries for Revealed User Locations. Proc. Int'l Symp. on Spatial and Temporal Databases (SSTD), 2007.
[9] Reynold Cheng, Yu Zhang, Elisa Bertino, and Sunil Prabhakar. Preserving User Location Privacy in Mobile Data Management Infrastructures. Proceedings of the Privacy Enhancing Technologies Workshop (PET), 2006.
[10] X. Pan, J. Xu, and X. Meng. Protecting Location Privacy against Location-Dependent Attack in Mobile Services. Proceedings of the 17th ACM Conference on Information and Knowledge Management (CIKM 2008), Napa Valley, California, USA, October 2008.
[11] Rinku Dewri, Indrakshi Ray, Indrajit Ray, and Darrell Whitley. Query m-Invariance: Preventing Query Disclosures in Continuous Location-Based Services. 11th International Conference on Mobile Data Management (MDM 2010), Kansas City, Missouri, USA, May 23-26, 2010.
References (cont.)
[12] Fuyu Liu, Kien A. Hua, and Ying Cai. Query l-diversity in Location-Based Services. Proceedings of the 10th International Conference on Mobile Data Management: Systems, Services and Middleware (MDM 2009), pp. 436–442, 2009.
[13] Panos Kalnis, Gabriel Ghinita, Kyriakos Mouratidis, and Dimitris Papadias. Preventing Location-Based Identity Inference in Anonymous Spatial Queries. IEEE Transactions on Knowledge and Data Engineering, vol. 19, no. 12, 2007.