On Protection in Federated Social Computing Systems
Ebrahim Tarameshloo, Philip W. L. Fong, Payman Mohassel
University of Calgary, Calgary, Alberta, Canada
{etarames, pwlfong, pmohasse}@ucalgary.ca
1 March 2014
Federated Social Computing Systems
- Example: a user's access policy is (share with my friends)@Foursquare vs. (share with public)@Twitter
- Privacy challenge: the access control policy of the originating SCS may not be honored by the destination SCS
Outline
- Privacy in Federated Social Computing Systems
- Formal model
- Privacy via Private Function Evaluation (PFE)
- Privacy via safe function evaluation
Closer Look at Protection Challenges
- Policy fidelity: ambiguity over which policy should be used to protect shared contents
- Mechanism fidelity: the challenge for the destination site of tracking the protection model of the origin site
- State fidelity: the user information needed for policy enforcement may not be available at the destination SCS
Assumptions
- User identity: the manual identity-mapping process is consistent and applied whenever needed
- Authorization service: each SCS in the confederation offers a secure, queryable PDP (Policy Decision Point)
Feature Overview of Our Protection Model
1. Protection of shared resources
- Native access: not the focus of this work
- Shared access: the goal of this work
Feature Overview of Our Protection Model
2. Shared access policies
- Policies for controlling shared accesses are defined by the resource owner
- Addresses policy fidelity
Feature Overview of Our Protection Model
3. Distributed evaluation of situated queries
- Shared access policies take the form of situated queries, e.g., "friend@Facebook", "co-located@Foursquare"
- Distributed evaluation ensures mechanism and state fidelity
Feature Overview of Our Protection Model
4. Policy composition
- A more flexible protection model: policies are Boolean combinations of situated queries
- Example: (friend@Facebook ∨ follower@Twitter) ∧ nearby@Foursquare
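The composition above can be sketched as a small policy syntax tree whose atomic queries are answered by per-SCS PDPs. This is a minimal illustration, not the paper's implementation: the `Policy` classes, the `evaluate` interface, and the hard-coded PDP stubs are all assumptions made for the example.

```python
# Illustrative sketch: shared access policies as Boolean combinations of
# situated queries. Each atomic query is interpreted at the PDP of its
# own SCS; the PDPs below are hard-coded stubs, not real services.

from dataclasses import dataclass

class Policy:
    def evaluate(self, pdps, owner, requester):
        raise NotImplementedError

@dataclass
class Atom(Policy):
    query: str  # e.g. "friend"
    scs: str    # the SCS at which the query is situated, e.g. "Facebook"

    def evaluate(self, pdps, owner, requester):
        # An atomic query is answered by its own SCS's PDP.
        return pdps[self.scs](self.query, owner, requester)

@dataclass
class Or(Policy):
    left: Policy
    right: Policy

    def evaluate(self, pdps, owner, requester):
        return (self.left.evaluate(pdps, owner, requester)
                or self.right.evaluate(pdps, owner, requester))

@dataclass
class And(Policy):
    left: Policy
    right: Policy

    def evaluate(self, pdps, owner, requester):
        return (self.left.evaluate(pdps, owner, requester)
                and self.right.evaluate(pdps, owner, requester))

# Stub PDPs standing in for each SCS's authorization service.
pdps = {
    "Facebook":   lambda q, o, r: q == "friend",  # requester is a friend
    "Twitter":    lambda q, o, r: False,          # requester is not a follower
    "Foursquare": lambda q, o, r: q == "nearby",  # requester is nearby
}

# (friend@Facebook ∨ follower@Twitter) ∧ nearby@Foursquare
policy = And(Or(Atom("friend", "Facebook"), Atom("follower", "Twitter")),
             Atom("nearby", "Foursquare"))
print(policy.evaluate(pdps, "alice", "bob"))  # True
```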
Formal Model of Federated SCSs
- Confederation schema: specifies the constant entities in the federation
- Privacy configuration: specifies the current privacy settings of the confederation
- Protection state: tracks the current protection state of member SCSs and the whereabouts of shared resources
Policy Language
- Distinctive features: atomic queries are interpreted at a specific SCS; composite policies are built by composing atomic queries
- Syntax and semantics: the resource owner and requester must satisfy the policy formula in a given protection state
Privacy via Secure Multiparty Computation
- Distributed evaluation of shared access policies has a privacy effect: disclosure of SCSs' protection states
- Example: evaluating friend@Facebook ∧ nearby@Foursquare may disclose user location claims in Foursquare to Facebook
- Privacy goal: preserve the privacy of SCSs' protection states during the evaluation of shared access policies
- Possible approach: Secure Multiparty Computation (SMC)
SMC and Output Privacy
- SMC allows a group of parties to jointly compute a function of their inputs while keeping those inputs private
- SMC does not guarantee output privacy: it does not try to determine which functions are "safe" to compute
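A toy illustration of the point: even an ideal SMC that reveals only the output of a conjunction lets any observer deduce both private inputs whenever that output is 1. The helper below is an illustrative sketch, not part of any SMC library.

```python
# Why SMC alone does not give output privacy: for f = AND, the output
# value 1 uniquely determines both private inputs, so an ideal SMC that
# reveals only f(x1, x2) still leaks them. (Illustrative sketch only.)

from itertools import product

def consistent_inputs(f, output, n):
    """All n-bit input vectors that could have produced `output`."""
    return [xs for xs in product([0, 1], repeat=n) if f(*xs) == output]

conj = lambda a, b: a & b
print(consistent_inputs(conj, 1, 2))  # [(1, 1)] -> both inputs deduced
print(consistent_inputs(conj, 0, 2))  # three candidates -> some uncertainty left
```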
SMC and Output Privacy (cont.)
- Privacy challenge in our scheme: evaluating a policy formula at Instagram may leak users' location and friendship claims
- Possible approaches: hide policy formulas from the federated SCSs, or evaluate only safe, public policy formulas
Approach 1: PFE-based Architectures
- Hide the policy formula from the SCSs involved; advantage: no restriction on what the formula can be
- Core challenge: hiding the policy while running the SMC protocol, i.e., Private Function Evaluation (PFE)
- Three PFE-based architectures: Origin arch. (the origin SCS tracks the policy), User arch. (the user tracks the policy), TP arch. (a third party tracks all policies)
Origin Arch. (Origin SCS Tracks Policy)
- Each SCS tracks the shared access policies of its own resources
- [Diagram: the current SCS asks the origin SCS to initiate PFE, which yields the authorization decision]
User Arch. (User Tracks Policy)
- Each user stores shared access policies on user-owned storage
- [Diagram: the current SCS asks for PFE to be initiated; the PFE yields the authorization decision]
TP Arch. (Third Party Tracks Policy)
- A centralized policy storage service is run by a trusted third party (TP)
- [Diagram: the current SCS asks the TP to initiate PFE, which yields the authorization decision]
Challenge of Policy Administration
- Every user must define a shared access policy for every resource: tedious for users
- Remedy: default policies for various categories of resources
Assessment of the Three Architectures
Privacy:
- Origin arch.: the authorization decision must be hidden from the origin SCS if it contributes an input to the policy formula
- User arch.: there must be no collusion between the storage service and any SCS (e.g., Google+ and Google Drive)
- TP arch.: the TP must remain trusted
Knowledge of query vocabulary:
- Origin arch.: every SCS must understand the full query vocabulary of all other SCSs in the confederation
- User arch.: same as the origin arch.
- TP arch.: only the TP must understand the full query vocabulary
Fault tolerance:
- Origin arch.: failure of one SCS affects policy lookup for all resources originating from that SCS
- User arch.: failure of a user's storage affects only that user's shared resources
- TP arch.: a single point of failure that affects the entire confederation
Approach 2: Privacy via Safe Functions
- All shared access policies are allowed to be public (e.g., default policies)
- The confederation evaluates only "safe" policies
- Privacy goal: no inference of inputs from output values
- An SCS can refrain from providing input if a policy is detected to be unsafe
- "Safe" is defined via Sutherland's definition of information flow, based on the notion of deducibility
Input Nondeducibility
- [Truth table of a Boolean function f over inputs x1, ..., xn, illustrating which inputs can be deduced from an output value]
- Example: if the policy evaluated at Google+ is false, can one deduce whether the requester is a family member? What if the policy is evaluated at LinkedIn?
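One plausible reading of the definition can be checked by brute force: the output of f reveals nothing about a set I of inputs if every achievable output value is consistent with every possible assignment to those inputs. This enumeration is only an illustration; the paper works with Sutherland's deducibility and, for the decision procedure, a QBF encoding.

```python
# Brute-force sketch of input nondeducibility (IND), under the reading
# that f is I-input nondeducible when each achievable output value is
# consistent with every assignment to the inputs indexed by I.
# Exponential in n; for illustration only.

from itertools import product

def nondeducible(f, n, I):
    vectors = list(product([0, 1], repeat=n))
    outputs = {f(*xs) for xs in vectors}
    for v in outputs:
        for a in product([0, 1], repeat=len(I)):
            # Some full input vector must agree with `a` on I and yield v.
            if not any(f(*xs) == v for xs in vectors
                       if all(xs[i] == ai for i, ai in zip(I, a))):
                return False
    return True

xor = lambda *xs: sum(xs) % 2
conj = lambda *xs: int(all(xs))
print(nondeducible(xor, 3, [0]))   # True: XOR's output hides each input
print(nondeducible(conj, 3, [0]))  # False: output 1 forces x0 = 1
```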
Application and Complexity of IND
- Each SCS tests whether the policy function is I-input nondeducible, where I is the set of inputs contributed by that SCS
- Deciding input nondeducibility (to implement the static analysis): encode the IND instance as a Quantified Boolean Formula (QBF) and use a QBF solver to test satisfiability
IND Functions
- Input-nondeducible functions are rare and have limited composability
- Useful IND functions:
  - Threshold function: returns 1 if at least m of the n inputs are 1; a replacement for conjunction
  - Conditional function: a replacement for disjunction
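The threshold idiom can be contrasted with plain conjunction in a few lines. This comparison is a sketch, not the paper's analysis: with conjunction, a positive decision reveals every input, while with a 2-of-3 threshold a positive decision leaves each individual input uncertain.

```python
# Threshold vs. conjunction: input vectors consistent with a positive
# decision. Conjunction pins down all inputs; a 2-of-3 threshold leaves
# every individual input uncertain. (Illustrative sketch.)

from itertools import product

def threshold(m, xs):
    # Returns 1 if at least m of the inputs are 1.
    return int(sum(xs) >= m)

thresh_pos = [xs for xs in product([0, 1], repeat=3) if threshold(2, xs)]
conj_pos = [xs for xs in product([0, 1], repeat=3) if all(xs)]

print(conj_pos)    # [(1, 1, 1)]: conjunction reveals every input
print(thresh_pos)  # 4 vectors: each input is 0 in at least one of them
```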
Policy Idioms
- It is unwise to leave it to the user to formulate "safe" policies
- Users can instead be provided with templates of "safe" policies
- Safe policy templates: threshold policy, conditional policy
Related Work
[1] M. N. Ko, G. P. Cheek, M. Shehab, and R. Sandhu. "Social-Networks Connect Services." Computer 43(8): 37–43, 2010.
[2] M. Shehab, M. N. Ko, and H. Touati. "Enabling Cross-Site Interactions in Social Networks." Social Network Analysis and Mining 3(1): 93–106, 2013.
[3] A. C. Squicciarini, G. Petracca, and E. Bertino. "Adaptive Data Protection in Distributed Systems." In Proceedings of the Third ACM Conference on Data and Application Security and Privacy (CODASPY), 2013.
[Photo: Calgary]
[Photo: the ICT Building at the University of Calgary]