
EVALUATING LBS PRIVACY IN DYNAMIC CONTEXT


1 EVALUATING LBS PRIVACY IN DYNAMIC CONTEXT

2 Outline • Overview of the Attack Model • Classification of Defense Models • Evaluation Module • Conclusion

3 Outline • Overview of the Attack Model • Classification of Defense Models • Evaluation Module • Conclusion

4 Overview of the Attack Model [1] • What is a privacy threat? – Whenever an adversary can associate the identity of a user with information that the user considers private

5 Overview of the Attack Model (1) • What is a privacy attack? – A specific method used by an adversary to obtain the sensitive association • How are privacy attacks classified? – By the parameters of an adversary model • What are the main components of an adversary model? – The target private information – The ability to obtain the transmitted messages – Background knowledge and inference abilities

6 How can the adversary model be used? • The target private information – Explicit in the message (i.e., the real id): eavesdropping on the channel – Implicit (using a pseudo-id): inference with external knowledge, e.g., joining the pseudo-id with location data • Attacks exploiting quasi-identifiers in requests

7 How can the adversary model be used? • The ability to obtain the transmitted messages – Message: snapshot vs. chain – Issuer: single vs. multiple • Single- versus multiple-issuer attacks

8 How can the adversary model be used? • Background knowledge and inference abilities – Unavailable: depends on the sensitive information in the message (implicit or explicit) – Completely available: a privacy violation occurs independently of the service request • Attacks exploiting knowledge of the defense

9 Outline • Overview of the Attack Model • Classification of Defense Models • Evaluation Module • Conclusion

10 Classification of Defense Models • Our target – Architecture: centralized – Techniques: anonymity-based and obfuscation • Defense models against – Snapshot, Single-Issuer, Def-Unaware Attacks – Snapshot, Single-Issuer, Def-Aware Attacks – Historical, Single-Issuer Attacks – Multiple-Issuer Attacks

11 Outline • Overview of the Attack Model • Classification of Defense Models – Snapshot, Single-Issuer, Def-Unaware Attacks – Snapshot, Single-Issuer, Def-Aware Attacks – Historical, Single-Issuer Attacks – Multiple-Issuer Attacks • Evaluation Module • Conclusion

12 Single-issuer, def-unaware attacks • Assumptions – The attacker can acquire knowledge of the exact location of each user – The attacker knows that the generalized region g(r).Sdata always includes the point r.Sdata – The attacker cannot reason over more than one request • The uniform attack

13 Single-issuer, def-unaware attacks • A uniform attack (figure: users u1, u2, u3 inside the generalized region). Not safe if the user requires k = 4 (with threshold h = 1/4).
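The uniform attack on the slide can be sketched directly: the adversary treats every user in the generalized region as equally likely to be the issuer, and the request is unsafe when that per-user probability exceeds the threshold. A minimal sketch (function names are illustrative):

```python
def uniform_attack_probability(anonymity_set):
    """Under a uniform attack the adversary considers every user in the
    generalized region equally likely to be the issuer."""
    return 1.0 / len(anonymity_set)

def is_safe(anonymity_set, h):
    """The request is safe if the per-user probability does not exceed
    the privacy threshold h (h = 1/k for a requested anonymity level k)."""
    return uniform_attack_probability(anonymity_set) <= h

# The slide's scenario: only u1, u2, u3 fall in the region,
# but the issuer asked for k = 4, i.e. threshold h = 1/4.
print(is_safe({"u1", "u2", "u3"}, h=1/4))  # 1/3 > 1/4, so False
```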

14 Single-issuer, def-aware attacks • Assumptions: – Same as the def-unaware case – The attacker may also know the generalization function g • The uniform attack and the outlier problem (figure: users u1, u2, u3)

15 Example attack: the outlier problem

16 C_st+g-unsafe generalization algorithms • The following algorithms: – IntervalCloaking – Casper – Nearest Neighbor Cloak • Why are they not safe?

17 C_st+g-unsafe generalization algorithms • Why these algorithms are not safe: – Not every user in the anonymizing set (AS) generates the same AS for a given k – Uniform attack • A property that C_st+g-safe generalization algorithms must satisfy: – The AS contains the issuer U and at least k-1 additional users – Every user in the AS generates the same AS for the given k – The reciprocity property

18 C_st+g-safe generalization algorithms • hilbASR • dichotomicPoints • Grid • All of the above algorithms satisfy the reciprocity property, so they remain safe even against an attacker who knows the generalization function.

19 Grid algorithm

20 Centralized defenses against snapshot, single-issuer, def-aware attacks • To defend against snapshot, single-issuer, def-aware attacks, the generalization must satisfy the reciprocity property • How can we tell whether an algorithm satisfies that property?

21 Deciding whether an algorithm satisfies reciprocity • For a request r with anonymity level k: – Run the algorithm to get AS – For each id u_i in AS, run the algorithm to get AS_i – If AS = AS_i for every i, the algorithm satisfies reciprocity; otherwise it is not safe.
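The direct check above can be sketched as follows; `generalize(user, k)` is a placeholder for the cloaking algorithm under test and is assumed to return the anonymizing set as a set of user ids:

```python
def satisfies_reciprocity(generalize, request_user, k):
    """Direct reciprocity check: run the cloaking algorithm for the
    issuer, then re-run it for every user in the resulting AS and
    verify that all of them produce the same AS for the same k."""
    anonymity_set = generalize(request_user, k)
    return all(generalize(u, k) == anonymity_set for u in anonymity_set)

# Toy Grid-style algorithm: users of the same fixed cell always share one AS.
cell = {"u1": frozenset({"u1", "u2", "u3"}),
        "u2": frozenset({"u1", "u2", "u3"}),
        "u3": frozenset({"u1", "u2", "u3"})}
print(satisfies_reciprocity(lambda u, k: cell[u], "u1", 3))  # True

# Nearest-neighbor-style cloaking typically breaks reciprocity:
nn = {"u1": frozenset({"u1", "u2", "u3"}),
      "u2": frozenset({"u2", "u3", "u4"}),
      "u3": frozenset({"u1", "u2", "u3"}),
      "u4": frozenset({"u2", "u3", "u4"})}
print(satisfies_reciprocity(lambda u, k: nn[u], "u1", 3))  # False
```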

22 Checking reciprocity from cached results • After checking reciprocity directly, save the result to a database • For a new request r, look for a similar case: – the result of a previous request by the same issuer (if the movement is small) – the result of another request with the same issuer location and the same surrounding user locations

23 Case-based module • Run the algorithm to generate AS • Look for a similar case in the database; if found, return its result • If not found, check the reciprocity property directly • Change the parameter k if necessary • Update the database with the result • Send the result to the next step
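The caching loop of the case-based module can be sketched as below; `case_key` stands in for the similarity lookup (same issuer location, same surrounding users), and all names are illustrative since the slides do not fix an API:

```python
def case_based_check(case_key, direct_check, case_db):
    """Sketch of the case-based module: reuse the cached verdict of a
    'similar' case if one exists, otherwise run the direct (expensive)
    reciprocity check and cache the result."""
    if case_key in case_db:        # a similar, already-solved case
        return case_db[case_key]
    verdict = direct_check()       # direct reciprocity check
    case_db[case_key] = verdict    # update the result database
    return verdict

db = {}
print(case_based_check("r1", lambda: True, db))   # computed: True
print(case_based_check("r1", lambda: False, db))  # cached:   True
```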

24 Outline • Overview of the Attack Model • Classification of Defense Models – Snapshot, Single-Issuer, Def-Unaware Attacks – Snapshot, Single-Issuer, Def-Aware Attacks – Historical, Single-Issuer Attacks – Multiple-Issuer Attacks • Evaluation Module • Conclusion

25 Memorization Property • Definition • Single-Issuer Historical Attacks – Query Tracking Attack – Maximum Movement Boundary Attack • Multiple-Issuer Historical Attacks – Notion of Historical k-Anonymity

26 Memorization Property: Definition • k-anonymity property: the spatial cloaking algorithm generates a cloaked area that covers k different users, including the real issuer. (Figure: issuer A and users B-E inside the cloaked area; the privacy middleware forwards the generalized request r' to the service provider.)
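The k-anonymity property just stated can be checked mechanically; the sketch below uses axis-aligned rectangles as an illustrative simplification of the cloaked area:

```python
def is_location_k_anonymous(region, user_locations, issuer, k):
    """The cloaked rectangle region = (x1, y1, x2, y2) satisfies
    location k-anonymity if it covers at least k users, including
    the real issuer."""
    x1, y1, x2, y2 = region
    covered = {u for u, (x, y) in user_locations.items()
               if x1 <= x <= x2 and y1 <= y <= y2}
    return issuer in covered and len(covered) >= k

# Issuer A plus B, C, D inside the rectangle; E is far away.
users = {"A": (1, 1), "B": (2, 1), "C": (1, 2), "D": (2, 2), "E": (9, 9)}
print(is_location_k_anonymous((0, 0, 3, 3), users, "A", 4))  # True
```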

27 Memorization Property: Definition • The k users in the cloaked area can easily move to different places. An attacker with knowledge of the exact locations of users then has a chance to infer the real issuer from the anonymity set. (Figure: users B-E drift away from A. RISK!)

28 Memorization Property: Definition • Memorization property [5]: the spatial cloaking algorithm memorizes the movement history of each user and utilizes this information when building the cloaked area. (Figure: the spatial cloaking algorithm combines the movement patterns of users A-E to produce the cloaked region.)

29 Memorization Property: Definition • Without the memorization property, the issuer may suffer from the following attacks: – Single-Issuer Historical Attacks: the attacker exploits the movement history of a single issuer (Query Tracking Attack, Maximum Movement Boundary Attack) – Multiple-Issuer Historical Attacks: the attacker uses the movement histories of multiple users (Notion of Historical k-Anonymity)

30 Memorization Property: Query Tracking Attack [6] • Setting: – The user's query is re-issued multiple times, at t_i, t_i+1, etc. – The attacker knows the exact location of each user. • Attack: – The attacker reveals the real issuer by intersecting the candidate sets of the query instances: at time t_i {A,B,C,D,E}, at time t_i+1 {A,B,F,G,H}, at time t_i+2 {A,F,G,H,I}, which reveals A.
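The intersection step of the query tracking attack is a one-liner:

```python
from functools import reduce

def query_tracking_attack(candidate_sets):
    """Intersects the anonymity sets observed for the same continuing
    query at times t_i, t_{i+1}, ...; the users that survive every
    intersection are the attacker's remaining candidates."""
    return reduce(set.intersection, map(set, candidate_sets))

# The slide's example: the intersection leaves only A, revealing the issuer.
print(query_tracking_attack([{"A", "B", "C", "D", "E"},
                             {"A", "B", "F", "G", "H"},
                             {"A", "F", "G", "H", "I"}]))  # {'A'}
```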

31 Memorization Property: Query Tracking Attack [6] • Possible immediate solutions: – Delay the request until most of the candidates have returned to the cloaked area – Build a new cloaked area using the users' location histories – Etc. (Figure: at time t_i+k delaying is still risky; at time t_i+k+m it is safe to forward.)

32 Memorization Property: Maximum Movement Boundary Attack [6] • Setting: – Consider the movement rate (speed) of users. – The attacker knows the exact location and speed of each user. • Attack: – The attacker restricts the real issuer to the overlap between the maximum movement boundary of R_i and the new region R_i+1 ("I know you are here!").

33 Memorization Property: Maximum Movement Boundary Attack [6] • A solution must satisfy one of three cases: – The overlapping area satisfies the user's requirements – R_i totally covers R_i+1 – The MMB of R_i totally covers R_i+1 • Possible solutions are Patching and Delaying

34 Memorization Property: Maximum Movement Boundary Attack [6] • Patching: combine the current cloaked spatial region with the previous one • Delaying: postpone the update until the MMB covers the current cloaked spatial region
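Patching and delaying can be sketched with rectangular regions; the rectangular expansion of the MMB is an illustrative approximation of the reachable area:

```python
def mmb(region, max_speed, dt):
    """Maximum movement boundary of R_i: expand the rectangle by the
    distance max_speed * dt a user could have travelled."""
    x1, y1, x2, y2 = region
    d = max_speed * dt
    return (x1 - d, y1 - d, x2 + d, y2 + d)

def covers(outer, inner):
    """True if rectangle `outer` totally covers rectangle `inner`."""
    return (outer[0] <= inner[0] and outer[1] <= inner[1]
            and outer[2] >= inner[2] and outer[3] >= inner[3])

def patched(r_prev, r_curr):
    """Patching: combine the current cloaked region with the previous one."""
    return (min(r_prev[0], r_curr[0]), min(r_prev[1], r_curr[1]),
            max(r_prev[2], r_curr[2]), max(r_prev[3], r_curr[3]))

def safe_to_release(r_prev, r_curr, max_speed, dt):
    """Delaying: release the update once the MMB of R_i covers R_{i+1}."""
    return covers(mmb(r_prev, max_speed, dt), r_curr)

r_i, r_next = (0, 0, 2, 2), (3, 3, 5, 5)
print(safe_to_release(r_i, r_next, max_speed=1, dt=1))  # False: MMB too small
print(safe_to_release(r_i, r_next, max_speed=1, dt=3))  # True after delaying
print(patched(r_i, r_next))                             # (0, 0, 5, 5)
```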

35 Memorization Property: Historical k-Anonymity [7] • If the attacker also considers users' frequent movement patterns, he has a better chance of distinguishing the real issuer from the other candidates.

36 Memorization Property: Historical k-Anonymity Terminology • Quasi-identifier (QID): a set of attributes that can be used to identify an individual. • Location-Based QIDs (LBQIDs): – Spatio-temporal movement patterns consisting of a set of elements and a recurrence formula rec_1.G_1, ..., rec_n.G_n – They depict frequent user movement patterns, e.g. ⟨…⟩, ⟨…⟩, ⟨…⟩, 1.day, 5.week • Personal History of Locations (PHL): – A sequence of elements (x, y, t) indicating the location (x, y) of a user U at time t.

37 Memorization Property: Historical k-Anonymity Terminology • … • Historical k-anonymity: – A set of requests R of user U is historically k-anonymous if there exist k-1 PHLs P_1, ..., P_k-1 for k-1 users other than U, such that each P_i is LT-consistent with R.

38 Memorization Property: Historical k-Anonymity Terminology • Request: – A tuple R = (x, y, t, S), where S is service-specific data. • Element matching: – A user request R_i = (x, y, t, S) matches an element E of an LBQID if (x, y) ∈ E.coord and t ∈ E.timestamp, e.g. R = (park, 8:30am) matches an element E covering the park at that time. • Request-LBQID matching: – A set of user requests R matches the user's LBQID iff each request matches an element E, and all requests satisfy the recurrence formula.
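Element matching can be sketched directly from the definition above; rectangular areas and numeric times are illustrative encodings, not the paper's exact representation:

```python
def matches_element(request, element):
    """A request (x, y, t) matches an LBQID element if its coordinates
    fall in the element's area and its timestamp falls in the
    element's time interval."""
    (x, y, t) = request
    (x1, y1, x2, y2), (t1, t2) = element
    return x1 <= x <= x2 and y1 <= y <= y2 and t1 <= t <= t2

# A "park at 8:30am" style request against an element covering the park 8-9am.
park_element = ((0, 0, 10, 10), (8.0, 9.0))
print(matches_element((4, 5, 8.5), park_element))   # True
print(matches_element((4, 5, 10.0), park_element))  # False: outside the interval
```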

39 Memorization Property: Historical k-Anonymity Terminology • LT-consistency: a PHL is Location- and Time-consistent with a set of requests R if: – each request r_i matches an element in the PHL, or – the request was sent at a time/location that can be interpolated from consecutive elements of the PHL. • When a user U sends a set of requests R, (historical) k-anonymity is preserved if at least k-1 users, other than U, have PHLs that are LT-consistent with R.
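The preservation condition above reduces to counting LT-consistent PHLs. In the sketch below the LT-consistency test is passed in as a parameter; a full implementation would also interpolate between consecutive PHL samples as the slide describes:

```python
def historical_k_anonymity(requests, phls, issuer, k, lt_consistent):
    """A set of requests R of user `issuer` preserves historical
    k-anonymity if at least k-1 other users have PHLs that are
    LT-consistent with R."""
    consistent = sum(1 for u, phl in phls.items()
                     if u != issuer and lt_consistent(phl, requests))
    return consistent >= k - 1

# Simplistic LT-consistency stand-in: every request appears verbatim in the PHL.
simple_lt = lambda phl, reqs: all(r in phl for r in reqs)
phls = {"U": [(1, 1, 1), (2, 2, 2)],
        "B": [(1, 1, 1), (3, 3, 2)],
        "C": [(1, 1, 1), (4, 4, 2)]}
print(historical_k_anonymity([(1, 1, 1)], phls, "U", 3, simple_lt))  # True
```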

40 Memorization Property: Historical k-Anonymity Algorithm [7]

41 Memorization Property: Historical k-Anonymity Algorithm [7]

42 Memorization Property: Historical k-Anonymity Algorithm [7] • Input: – The spatio-temporal information (x, y, t) of the request R – The desired level of anonymity (k) – The spatial and temporal constraints • Output: – The generalized 3D area – A boolean value b denoting success or failure – A list N of the k-1 neighbors (after execution of the first-element matching phase)

43 Memorization Property: Historical k-Anonymity Algorithm • Problems to consider: – The LTS has to generalize each request when it is issued, without knowledge of users' future locations and future requests. – Longer PHL traces require more computation. • Our approach: – Users' PHLs are predefined (for testing only), not updated in real time. – Only short PHL traces are considered.

44 Memorization Property: Summary & Work Flow • Memorization is the 2nd property we consider. • Memorization-property checking comes after reciprocity-property checking. • Memorization-property checking covers 3 phases: – Check against the Maximum Movement Boundary Attack – Check against the Query Tracking Attack – Check against the Frequent Pattern Attack

45 Memorization Property: Summary & Work Flow • Memorization is the 2nd property we consider. • Memorization-property checking comes after reciprocity-property checking. • Memorization-property checking initially covers 3 phases: – P1: check against the Maximum Movement Boundary Attack – P2: check against the Query Tracking Attack – P3: check against the Frequent Pattern Attack • If the request fails in any phase, the algorithm stops and reports the result to the next property check.

46 Outline • Overview of the Attack Model • Classification of Defense Models – Snapshot, Single-Issuer, Def-Unaware Attacks – Snapshot, Single-Issuer, Def-Aware Attacks – Historical, Single-Issuer Attacks – Multiple-Issuer Attacks • Evaluation Module • Conclusion

47 Multiple-issuer attacks • Problem: – The attacker can acquire multiple requests from multiple users – The attacker can infer the sensitive association for a user from the sensitive attribute(s) involved in the user's request => query association

48 • Two methods to prevent query association: – Many-to-one queries: a k-anonymous cloaking region is associated with a single service attribute value => k potential users who may be the owner of the service attribute – Many-to-many queries: a cloaking region is associated with a set of service attribute values

49 Example

50 l-diversity [2] • Query l-diversity: ensures that a user cannot be linked to fewer than ℓ distinct service attribute values.
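The query ℓ-diversity condition just stated is a simple cardinality check on the values linked to a cloaking region:

```python
def query_l_diverse(service_values, l):
    """Query l-diversity: the service attribute values associated with a
    cloaking region must contain at least l distinct values, so a user
    cannot be linked to fewer than l possible queries."""
    return len(set(service_values)) >= l

print(query_l_diverse(["bar", "hospital", "cinema"], 3))    # True
print(query_l_diverse(["hospital", "hospital", "bar"], 3))  # False: 2 distinct
```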

51 t-closeness [3] • (Figure: every disease in the anonymity set is a stomach problem, so the attacker still learns the issuer's condition despite diversity.)
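t-closeness bounds the distance between the group's distribution of sensitive values and the overall distribution by t. The paper uses Earth Mover's Distance; the sketch below substitutes total variation distance as a simpler stand-in for categorical attributes:

```python
from collections import Counter

def t_close(group_values, population_values, t):
    """Checks whether the group's value distribution is within distance t
    of the overall distribution, using total variation distance (a
    simplified stand-in for the paper's Earth Mover's Distance)."""
    g, p = Counter(group_values), Counter(population_values)
    gn, pn = sum(g.values()), sum(p.values())
    tv = 0.5 * sum(abs(g[v] / gn - p[v] / pn) for v in set(g) | set(p))
    return tv <= t

# The slide's failure case: every disease in the group is a stomach problem,
# while the overall population is evenly split.
group = ["stomach"] * 3
population = ["stomach", "flu"] * 5
print(t_close(group, population, t=0.3))       # False: distance is 0.5
print(t_close(population, population, t=0.0))  # True: identical distributions
```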

52 Problems in continuous LBS • Satisfying location k-anonymity • Satisfying query ℓ-diversity • Satisfying query t-closeness • (Figure: a region that is query 3-diverse and location 3-anonymous.)

53 m-Invariance [4] • Query m-invariance: in continuous LBS, the number of possible query-association attacks increases when the set of service attribute values a user is associated with changes across snapshots, because the attacker can intersect those sets; m-invariance requires the user to stay associated with the same set of (at least m) values.
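The invariance condition can be sketched as follows, under the reading above (the same value set, of size at least m, in every snapshot):

```python
def m_invariant(snapshots, m):
    """Query m-invariance sketch for continuous requests: every snapshot
    must associate the user with the same set of at least m service
    attribute values; if the sets change, the attacker can intersect
    them across snapshots to narrow down the actual query."""
    first = set(snapshots[0])
    return len(first) >= m and all(set(s) == first for s in snapshots[1:])

print(m_invariant([{"q1", "q2", "q3"}, {"q1", "q2", "q3"}], m=3))  # True
print(m_invariant([{"q1", "q2", "q3"}, {"q1", "q4", "q5"}], m=3))  # False
```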

54 Outline • Overview of the Attack Model • Classification of Defense Models – Snapshot, Single-Issuer, Def-Unaware Attacks – Snapshot, Single-Issuer, Def-Aware Attacks – Historical, Single-Issuer Attacks – Multiple-Issuer Attacks • Evaluation Module • Conclusion


56 Case-based Reasoner

57 Outline • Overview of the Attack Model • Classification of Defense Models – Snapshot, Single-Issuer, Def-Unaware Attacks – Snapshot, Single-Issuer, Def-Aware Attacks – Historical, Single-Issuer Attacks – Multiple-Issuer Attacks • Evaluation Module • Conclusion

58 Conclusion • Refine the algorithm for checking reciprocity – Use a cache – Use a grid structure • Define the case-base specification – Define what counts as a "similar" case

59 References • [1] C. Bettini, S. Jajodia, X. S. Wang, and D. Wijesekera. "Provisions and Obligations in Policy Rule Management". Journal of Network and Systems Management 11(3):351-372, September 2003. • [2] A. Machanavajjhala, J. Gehrke, D. Kifer, and M. Venkitasubramaniam. "l-Diversity: Privacy Beyond k-Anonymity". Proc. 22nd IEEE International Conference on Data Engineering (ICDE), Atlanta, GA, 2006. • [3] N. Li, T. Li, and S. Venkatasubramanian. "t-Closeness: Privacy Beyond k-Anonymity and l-Diversity". Proc. 23rd IEEE International Conference on Data Engineering (ICDE), 2007. • [4] R. Dewri, I. Ray, and D. Whitley. "Query m-Invariance: Preventing Query Disclosures in Continuous Location-Based Services". Proc. 11th International Conference on Mobile Data Management (MDM), 2010.

60 References • [5] C.-Y. Chow and M. F. Mokbel. "Enabling Private Continuous Queries for Revealed User Locations". Proc. 10th International Symposium on Spatial and Temporal Databases (SSTD), 2007. • [6] M. F. Mokbel. "Privacy in Location-Based Services: State-of-the-Art and Research Directions". Proc. International Conference on Mobile Data Management (MDM), p. 228, 2007. • [7] P. Zacharouli, A. Gkoulalas-Divanis, and V. S. Verykios. "A k-Anonymity Model for Spatio-Temporal Data". Proc. IEEE 23rd International Conference on Data Engineering Workshops, 2007. • [8] C. Bettini, S. Mascetti, X. S. Wang, D. Freni, and S. Jajodia. "Anonymity and Historical-Anonymity in Location-Based Services". European Symposium on Research in Computer Security (ESORICS).

61 References • [9] S. Mascetti and C. Bettini. "Spatial Generalization Algorithms for LBS Privacy Preservation". Proc. International Conference on Mobile Data Management (MDM), pp. 258-262, 2007. • [10] P. Kalnis, G. Ghinita, K. Mouratidis, and D. Papadias. "Preventing Location-Based Identity Inference in Anonymous Spatial Queries". IEEE Transactions on Knowledge and Data Engineering 19(12):1719-1733, 2007. • [11] B. Gedik and L. Liu. "Protecting Location Privacy with Personalized k-Anonymity: Architecture and Algorithms". IEEE Transactions on Mobile Computing 7(1):1-18, 2008.

