1 International Technology Alliance in Network & Information Sciences
Knowledge Inference for Securing and Optimizing Secure Computation
Piotr (Peter) Mardziel, Michael Hicks, Aseem Rastogi, Matthew Hammer, Jonathan Katz (UMD); Mudhakar Srivatsa (IBM TJ Watson); with Towsley et al. (UMass) and Kasturi Rangan (UCLA)
Annual Meeting of the ITA, October 2013

2 Sharing between coalition domains is critical for mission success
[Diagram: information items X, Y, and Z flowing among coalition elements: Scout (Coalition A), Supporting force (Coalition B), Unmanned Air Vehicle (UAV) (Coalition A), Back-office Data Analyst (Coalition A), Satellite Communications backhaul (Coalition A), and Mixed force (Coalitions A, C)]

3 ITA Technologies facilitate sharing
• ITA has developed many excellent technologies for sharing information
  – Gaian DB
  – Information Fabric
  – Controlled English Store
• All harness information and make it available to coalition partners
  – Provide a query or pub/sub interface
• But: there may be risk in sharing all information
  – Might like to allow some queries but not others:
    if the query would reveal too much information about the raw data, or
    if a sequence of queries would do so, even if one would not

4 Our research: knowledge inference
• Key idea: use program analysis (of the query)
  – to understand what the answer reveals about sensitive information to a (rational) recipient
• We call this analysis knowledge inference
• We have used knowledge inference in a variety of applications

5 Summary of Results (outline)
• Knowledge-based security [CSF’11, NIPSPP’12, JCS’13, HOTNETS’13]
  – Enforce a security policy based on the adversary’s (accumulated) knowledge
  – Implementation and experimental evaluation
  – Proof of soundness: will never underestimate adversary knowledge
• Knowledge-based security for SMC [PLAS’12]
  – Adapt knowledge inference to consider multiple parties’ secrets
  – Proof of soundness
• Optimizing SMC [PLAS’13]
  – Identify inferrable values by knowledge inference; do not bother to compute these using SMC
  – Leads to a 30x speedup
  – Proof of correctness of the technique

6 Papers on ITACS
• [JCS’13] Piotr Mardziel, Stephen Magill, Michael Hicks, and Mudhakar Srivatsa. Dynamic Enforcement of Knowledge-based Security Policies. Journal of Computer Security, Feb’12. https://www.usukitacs.com/node/1900
• [NIPSPP’12] Piotr Mardziel and Kasturi Rangan. Probabilistic Computation for Information Security. NIPS Probabilistic Programming Workshop, Dec’12. https://www.usukitacs.com/node/2234
• [PLAS’12] Piotr Mardziel, Michael Hicks, Jonathan Katz, and Mudhakar Srivatsa. Knowledge-Oriented Secure Multiparty Computation. Programming Languages and Analysis for Security, June’12. https://www.usukitacs.com/node/2003
• [PLAS’13] Aseem Rastogi, Piotr Mardziel, Michael Hicks, and Matthew Hammer. Knowledge Inference for Optimizing Secure Multi-party Computation. Programming Languages and Analysis for Security, June’13. https://www.usukitacs.com/node/2310
• [HOTNETS’13] Z. Shafiq, F. Le, M. Srivatsa, and D. Towsley. Cross-Path Inference Attacks on Multipath TCP. ACM HotNets, July’13. https://www.usukitacs.com/node/2491

7 Knowledge about the world
• Learning about the world from observations.
Prior (weather): 0.5 : Today = not-raining; 0.5 : Today = raining
Observation: Outlook = sunny
Posterior (after inference): 0.82 : Today = not-raining; 0.18 : Today = raining
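The posterior on this slide can be reproduced with Bayes’ rule. A minimal Python sketch, assuming the likelihoods Pr[Outlook = sunny | Today] come from the weather model shown on slide 11 (0.9 for not-raining, 0.2 for raining):

```python
# Bayesian update from the prior (0.5, 0.5) to the posterior on the slide.
prior = {"not-raining": 0.5, "raining": 0.5}
likelihood_sunny = {"not-raining": 0.9, "raining": 0.2}

# Pr[today | Outlook = sunny] is proportional to Pr[sunny | today] * Pr[today]
joint = {t: likelihood_sunny[t] * p for t, p in prior.items()}
evidence = sum(joint.values())            # Pr[Outlook = sunny] = 0.55
posterior = {t: j / evidence for t, j in joint.items()}

print(round(posterior["not-raining"], 2))  # 0.82
print(round(posterior["raining"], 2))      # 0.18
```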

8 Knowledge about secrets
• Characterize adversary knowledge.
A system holding a Secret produces a Public Output. From the observation Public Output = “login failed”, the adversary infers a distribution over the secret, e.g.
… 0.01 : Secret = 41; 0.90 : Secret = 42; 0.01 : Secret = 43 …

9 Levels of knowledge?
• Characterize a system as safe vs. unsafe.
Inference moves the adversary from a near-uniform belief
  … 0.05 : Secret = 41; 0.05 : Secret = 42; 0.05 : Secret = 43 …  (safe)
through increasingly concentrated beliefs
  … 0.02 : Secret = 41; 0.40 : Secret = 42; 0.02 : Secret = 43 …
  … 0.01 : Secret = 41; 0.90 : Secret = 42; 0.01 : Secret = 43 …
toward certainty
  1.00 : Secret = 42  (unsafe)

10 Soundness of knowledge
• Soundly approximate the level of knowledge.
Starting from a near-uniform belief (… 0.05 : Secret = 41; 0.05 : Secret = 42; 0.05 : Secret = 43 …), actual inference yields … 0.01 : Secret = 41; 0.90 : Secret = 42; 0.01 : Secret = 43 …, while a sound approximate inference may report even more knowledge (up to 1.00 : Secret = 42): it errs toward “unsafe”, never toward “safe”.

11 Technology: probabilistic programming
• Programs
  – whose inputs and outputs may be distributions rather than values
  – which may contain uses of probabilistic choice
• Effectively represent an algorithmic description of a probabilistic model
  – a conditional probability distribution relating inputs and outputs
Pr[ Outlook = sunny | Today = not-raining ] = 0.9

CODE:
weather(today) {
  if (today == “not-raining”) {
    if (flip 0.9) return “sunny” else return “overcast”
  } else if (today == “raining”) {
    if (flip 0.8) return “overcast” else return “sunny”
  }
}
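The slide’s weather program translates directly into ordinary code once `flip p` becomes a comparison against a uniform random draw. A minimal Python sketch, estimating the stated conditional probability by sampling:

```python
import random

def weather(today):
    # Direct transcription of the slide's probabilistic program:
    # "flip p" returns True with probability p.
    if today == "not-raining":
        return "sunny" if random.random() < 0.9 else "overcast"
    else:  # raining
        return "overcast" if random.random() < 0.8 else "sunny"

# Estimate Pr[Outlook = sunny | Today = not-raining] by sampling.
n = 100_000
sunny = sum(weather("not-raining") == "sunny" for _ in range(n))
print(sunny / n)  # close to 0.9
```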

12 Knowledge-based security
Belief ≜ probability distribution; Bayesian reasoning is used to revise the belief.
• Maintain a representation of each querier’s belief about the secret’s possible values
• Each query result revises the belief; reject if the actual secret becomes too likely
• Cannot let rejection defeat our protection.
[Timeline: queries Q1, Q2, Q3, … arrive over time; some are answered (OK), some rejected]

13 Policy = knowledge threshold
• Answer a query if, for the querier’s revised belief, Pr[my secret] < t
  – Call t the knowledge threshold
• Choice of t depends on the risk of revelation
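A minimal Python sketch of the threshold policy, under simplifying assumptions (a small discrete secret, exact Bayesian revision as a dictionary). To honor the previous slide’s caveat that rejection must not defeat the protection, the sketch makes the accept/reject decision depend only on the query and the current belief, never on the actual secret:

```python
def revise(belief, query, out):
    """Bayesian revision: condition the belief on query(secret) == out."""
    post = {s: p for s, p in belief.items() if query(s) == out}
    z = sum(post.values())
    return {s: p / z for s, p in post.items()}

def policy_allows(belief, query, t):
    """Secret-independent threshold check: allow the query only if EVERY
    possible output keeps every candidate secret below probability t, so
    the accept/reject decision itself leaks nothing."""
    outputs = {query(s) for s in belief}
    return all(max(revise(belief, query, o).values()) < t
               for o in outputs)

secret = 42
belief = {s: 1 / 256 for s in range(256)}   # querier's initial uniform belief

q1 = lambda s: s >= 128   # a coarse query
q2 = lambda s: s == 42    # a pinpoint query

allowed1 = policy_allows(belief, q1, t=0.1)   # True: worst case is 1/128
belief = revise(belief, q1, q1(secret))       # answer released: False
allowed2 = policy_allows(belief, q2, t=0.1)   # False: output True would pin the secret
print(allowed1, allowed2)
```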

14 αProb: implementation (CSF’11, JCS’13)
• Queries are simple imperative programs
• Approach: abstract interpretation for implementing probabilistic operations. Building blocks:
  – lattice point enumeration
  – integer programming
• Key idea: abstract interpretation is sound
  – It never underestimates the knowledge
  – But it may overestimate it, which improves audit time but may reject some legal queries
• Applications to sensor networks and location [NIPSPP’12]
  – Gave a demo earlier in the week
• Application to MPTCP [HOTNETS’13]

15 Current activity: modeling time/change
• Secrets can change over time.
• In progress: a formal model and theorems about knowledge of both the stream of secrets and the delta function
Pr[ Secret_1 = 42 ] = 1.0
Pr[ Secret_2 = 42 | Secret_1 = 42 ] = 0.900392

CODE:
delta(secret_1) {
  if (flip 0.9) return secret_1
  else return (uniform 0,255)
}
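A minimal sketch of pushing a belief through the slide’s delta function. One small assumption: the slide’s figure 0.900392 corresponds to a uniform redraw over 255 values (0.9 + 0.1/255), so the sketch uses a 255-value domain:

```python
def step(belief, domain):
    """One time step of the slide's delta: with probability 0.9 the secret
    is unchanged; otherwise it is redrawn uniformly from the domain."""
    fresh = 0.1 / len(domain)
    return {s: 0.9 * belief.get(s, 0.0) + fresh for s in domain}

domain = range(255)   # assumption: 255 values reproduces 0.9 + 0.1/255
belief = {42: 1.0}    # Pr[Secret_1 = 42] = 1.0
belief = step(belief, domain)
print(round(belief[42], 6))  # 0.900392
```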

16 Other activities
• Expand expressiveness, improve performance
  – Model continuous distributions, not just discrete ones
  – Employ other forms of approximation
• More applications
  – Multipath TCP flows
  – Sensor networks
  – Mobility

17 Joint computations over secrets
• Rather than asymmetric queries, may want to compute joint results
  – Coalitions each have sensor networks; use them to answer queries while hiding details
  – Coalitions perform joint mission planning; staff a mission without knowing total resources
[Diagram: parties holding secrets x and y jointly compute Q(x, y) for some function Q, e.g. “attack at dawn”]

18 Secure multiparty computation
• Multiple parties have secrets to protect.
• Want to compute some function over their secrets without revealing them.
Q = if x ≥ y then out := True else out := False
[Parties holding x and y compute Q(x, y) = True/False]

19 Secure multiparty computation
• Use a trusted third party.
[The parties send x and y to a trusted third party T, which returns Q(x, y) = True]
Q = if x ≥ y then out := True else out := False

20 Secure multiparty computation
• SMC lets the participants compute this without a trusted third party.
[The trusted party T is eliminated; the parties holding x and y compute Q(x, y) = True directly]
Q = if x ≥ y then out := True else out := False

21 Secure multiparty computation
• Nothing is learned beyond what is implied* by the query output.
[Parties holding x and y compute Q(x, y) = True/False]
Q = if x ≥ y then out := True else out := False

22 Secure multiparty computation
• Nothing is learned beyond what is implied* by the query output.
  – * what is implied can be a lot
Example: A holds x (x = ?); B holds y = 2; Q(x, 2) returns False.
Q = if x ≥ y then out := True else out := False

23 Secure multiparty computation
• Nothing is learned beyond what is implied* by the query output.
  – * what is implied can be a lot
With y = 2 and output False, B concludes x = 1.
Q = if x ≥ y then out := True else out := False

24 Secure multiparty computation
• Nothing is learned beyond what is implied* by the query output.
  – * what is implied can be a lot
Now B holds y = 3; Q(x, 3) returns False; x = ?
Q = if x ≥ y then out := True else out := False

25 Secure multiparty computation
• Nothing is learned beyond what is implied* by the query output.
  – * what is implied can be a lot
With y = 3 and output False, B concludes x ∈ {1, 2}.
Q = if x ≥ y then out := True else out := False

26 Secure multiparty computation
• Nothing is learned beyond what is implied* by the query output.
  – * what is implied can be a lot
Now B holds y = ∞; Q(x, ∞) returns False; x = ?
Q = if x ≥ y then out := True else out := False

27 Secure multiparty computation
• Nothing is learned beyond what is implied* by the query output.
  – * what is implied can be a lot
With y = ∞ the output is always False, so B learns only the trivial fact x ≥ 1.
Q = if x ≥ y then out := True else out := False
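The refinement B performs across these examples is just intersection with the set of inputs consistent with the output. A minimal sketch over a hypothetical domain x ∈ {1, …, 10}:

```python
def consistent_x(y, out, domain=range(1, 11)):
    """All values of x that could have produced Q(x, y) = out,
    where Q(x, y) = (x >= y). This set is exactly what B learns
    about A's secret from the output."""
    return {x for x in domain if (x >= y) == out}

print(sorted(consistent_x(3, False)))  # [1, 2]: the slide's example
print(sorted(consistent_x(3, True)))   # [3, 4, 5, 6, 7, 8, 9, 10]
```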

28 Knowledge-based security for SMC (PLAS’12)
• Results (details in paper):
  – Adapt knowledge inference to the SMC setting
  – Enforce threshold-based policies; two techniques: belief sets and SMC-based belief tracking
  – Proof that our methods are sound (they never underapproximate adversary knowledge)
• Implementation not sufficiently performant for use on-line

29 Goal: make SMC more performant (PLAS’13)
• SMC is an appealing technology, but it is very slow
  – Implementations are based on “garbled circuits”
  – Several orders of magnitude slower than normal computation
• Recent work has developed general methods to improve SMC performance
  – Circuit-level optimizations
  – Pipelining circuit generation and execution (increases parallelism and decreases memory use)
  – But ultimately SMC is always going to be much slower than normal computation
• Idea: use knowledge inference to find opportunities to replace SMC with normal computation in particular programs, with no loss of security

30 Example: joint median computation
Alice holds { A1, A2 }; Bob holds { B1, B2 }.
Assume: A1 < A2, B1 < B2, and Distinct(A1, A2, B1, B2)

a = A1 ≤ B1;
b = a ? A2 : A1;
c = a ? B1 : B2;
d = b ≤ c;
output = d ? b : c;

We can show that Alice and Bob can each infer a and d.
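For reference, the five statements run directly as ordinary code; a minimal Python transcription (the `?:` conditionals become Python conditional expressions):

```python
def joint_median(a1, a2, b1, b2):
    """Direct transcription of the slide's five statements.
    Preconditions: a1 < a2, b1 < b2, all four values distinct."""
    a = a1 <= b1
    b = a2 if a else a1
    c = b1 if a else b2
    d = b <= c
    output = b if d else c
    return a, b, c, d, output

a, b, c, d, out = joint_median(1, 5, 3, 8)
print(out)  # 3: the lower median of {1, 3, 5, 8}
```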

31 Secure Computation
a = A1 ≤ B1;  b = a ? A2 : A1;  c = a ? B1 : B2;  d = b ≤ c;  output = d ? b : c;
Knowledge leads to an optimized protocol.

32 Median example: analysis from Bob’s perspective
a = A1 ≤ B1;  b = a ? A2 : A1;  c = a ? B1 : B2;  d = b ≤ c;  output = d ? b : c;
Cases: A1 ≤ B1 ∧ A2 ≤ B1;  A1 ≤ B1 ∧ A2 > B1;  A1 > B1 ∧ A2 ≤ B1;  A1 > B1 ∧ A2 > B1
From the cases, Bob can infer:
  d = (output ≠ B1 ∧ output ≠ B2)   (recall: Distinct(A1, A2, B1, B2))
  a = (output ≤ B1)                  (recall: B1 < B2)
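The two identities Bob derives can be checked exhaustively over a small domain. A minimal sketch (the domain {0, …, 7} is an arbitrary choice for the check):

```python
import itertools

def run(a1, a2, b1, b2):
    # The median protocol; returns the values Bob reasons about.
    a = a1 <= b1
    b = a2 if a else a1
    c = b1 if a else b2
    d = b <= c
    return a, d, (b if d else c)

# Check both identities on every input satisfying the preconditions
# (each party's pair sorted, all four values distinct).
for a1, a2 in itertools.combinations(range(8), 2):
    for b1, b2 in itertools.combinations(range(8), 2):
        if {a1, a2} & {b1, b2}:
            continue  # inputs must be distinct
        a, d, out = run(a1, a2, b1, b2)
        assert d == (out != b1 and out != b2)   # d = (output != B1 and output != B2)
        assert a == (out <= b1)                 # a = (output <= B1)
print("identities hold")
```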

33 Formalization of knowledge
Party p knows x if x can be uniquely determined from p’s inputs I and outputs O; that is, any two program executions that agree on I and O also agree on x.

34 Knowledge in the median example
Let states σ map program variables to values.
a = A1 ≤ B1;  b = a ? A2 : A1;  c = a ? B1 : B2;  d = b ≤ c;  output = d ? b : c;
Bob knows a if, for all final states σ1 and σ2 such that σ1[B1] = σ2[B1], σ1[B2] = σ2[B2], and σ1[output] = σ2[output], we have σ1[a] = σ2[a].
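The definition above (two executions that agree on Bob’s view must agree on the variable) can be implemented directly by enumeration over a small domain. A minimal sketch (the 6-value domain is an arbitrary choice):

```python
import itertools

def runs():
    """All final states of the median protocol over a small domain,
    restricted to inputs satisfying the preconditions."""
    for a1, a2 in itertools.combinations(range(6), 2):
        for b1, b2 in itertools.combinations(range(6), 2):
            if {a1, a2} & {b1, b2}:
                continue
            a = a1 <= b1
            b = a2 if a else a1
            c = b1 if a else b2
            d = b <= c
            yield {"A1": a1, "A2": a2, "B1": b1, "B2": b2,
                   "a": a, "b": b, "c": c, "d": d,
                   "output": b if d else c}

def knows(view, x):
    """Party knows x iff any two executions that agree on the party's
    view (its inputs and the outputs) also agree on x."""
    seen = {}
    for s in runs():
        key = tuple(s[v] for v in view)
        if key in seen and seen[key] != s[x]:
            return False
        seen[key] = s[x]
    return True

bob_view = ("B1", "B2", "output")
print(knows(bob_view, "a"))    # True
print(knows(bob_view, "d"))    # True
print(knows(bob_view, "A1"))   # False: Bob cannot pin down Alice's input
```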

35 Results (details in paper)
• Make the previous definition into an algorithm using an idea called self-composition
  – Allows us to create a formula whose satisfiability determines whether a variable is known
  – We can give this formula to an SMT solver
  – Result: implementation and proof of correctness (sound and relatively complete)
• We have also developed a constructive algorithm
  – It computes a formula that witnesses knowledge of the variable

36 Ongoing work
• Building an SMC compiler
  – A novel programming language for expressing mixed-mode multiparty computation (M3PC), a combination of joint and local computations
  – Will employ knowledge-inference optimization to transform SMC programs into M3PC programs
  – Developing a novel back end based on garbled circuits (the standard mechanism) and oblivious RAM

37 Summary
• A research agenda based on knowledge inference
  – Determining what a party can learn about a secret given a run of a program
  – Can be used for enforcing security and for optimizing computation
• Ongoing work continues this agenda
  – Time-varying secrets
  – New applications (greater expressiveness)
  – A new computational platform

38 BACKUP

39 Expressibility
• Prior work [CSF’11, JCS’13] supported limited language features
  – distributions: piecewise bounds over discrete domains
  – possible but inconvenient to express other distributions
[Figure: a discrete distribution with upper and lower bounds]

40 Expressibility: continuous distributions
• Continuous distributions for modeling real-world processes.

41 Polynomial approximation
• Improve precision by polynomial bounds (as opposed to constant bounds).

42 Scales better than enumeration
[Figure: two birthday-query secret domains, each value equally likely. Small: 0 ≤ bday ≤ 364, 1956 ≤ byear ≤ 1992. Large: 0 ≤ bday ≤ 364, 1910 ≤ byear ≤ 2010. Compared using 1 probabilistic polyhedron (pp) vs. more than 1 pp.]

43 Performance/precision tradeoff
[Figure: performance vs. precision for the birthday query 1+2+special]

44 Intervals very fast generally
Times in seconds:

Query             Intervals  Octagons  Polyhedra
Bday1 (small)          0.01      1.87       2.81
Bday1+2 (small)        0.01      2.9        5.25
Bday1+2+spec           0.47     17.8       23.0
Bday1 (large)          0.01      2.1        2.48
Bday1+2 (large)        0.02      3.02       4.52
Bday1+2+spec           0.58     33.6       46.5
Pizza                  0.26     92.7      125.5
Photo                  0.02      5.47       7.98
Travel                 0.48    126.9      154.5

All achieve maximum precision when given unlimited polyhedra.

45 LattE is the performance bottleneck

46 Merging order matters for precision
[Figure: each point represents a different merging order for the given bound; the median precision point is depicted as a box, with the semi-interquartile range in gray. The best precision possible is at the very bottom (about 3.8 × 10⁻⁴).]

47 Knowledge-based security for SMC (PLAS’12)
• Approach:
  – Adapt knowledge inference to the SMC setting
  – Develop means to enforce threshold-based policies
• Knowledge inference:
  – Each party A knows his own secret, estimates what others know about it, and estimates something about each other party’s secret
  – Goal: define how a query result revises each party’s belief, and each party’s estimate of the others’ beliefs about his secret
• Threshold security:
  – Using knowledge inference, accept/reject a query based on the inferred knowledge (of others)

48 THE MINOR INCONVENIENCE
What you learn depends on your secret.

49 Secure multiparty computation
• Knowledge depends on the secret.
Example: Peter holds x ∈ {1, …, 42}; the audience holds y (= ?); Q(x, y) returns False.
Q = if x ≥ y then out := True else out := False

50 Secure multiparty computation
• Knowledge depends on the secret.
• Knowledge leaks information about the secret.
Same example: Peter holds x ∈ {1, …, 42}; the audience holds y (= ?); Q(x, y) returns False.
Q = if x ≥ y then out := True else out := False

51 Secure multiparty computation
• Knowledge depends on the secret.
• Knowledge leaks information about the secret.
Now concretely: Peter holds x ∈ {1, …, 42}; the audience holds y = 43; Q(x, y) returns False.
Q = if x ≥ y then out := True else out := False

52 THE MEDIOCRE INCONVENIENCE
I am not allowed to know what knowledge you attain.

53 It gets worse…
• Your knowledge depends on your secret.
• Your knowledge leaks information about your secret.
• My knowledge-based policy depends on your knowledge.
• My policy decision leaks information about your knowledge.
• Therefore: my policy decision leaks information about your secret.

54 THE BIG INCONVENIENCE
I am not allowed to know whether my knowledge-based policy permits you to see the output of a query.

55 [Peter, to the audience: “I give up”]

56 THE MINOR IDEA
Option 1: Be (very) conservative

57 Approach 1: belief sets
• Generalize the asymmetric case
  – Party A estimates B’s knowledge as an enumeration: B knows his own secret perfectly and knows A’s secret imperfectly, so A enumerates all possible values of B’s secret according to A’s estimation (a belief set: one belief δ per candidate y = 1, 2, 3, …)
  – Perform knowledge inference for each element of the enumeration

58 Belief sets: assessment
• Pros
  – A straightforward generalization; can reuse existing technology
• Cons
  – Conservative: threshold security rejects in the worst (rare) case
  – Expensive: for N parties with size-M beliefs, we must try N×M possibilities
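A minimal sketch of Approach 1 for the running comparison query Q(x, y) = (x ≥ y), with a uniform prior over a hypothetical 10-value domain: A tries every candidate y in the belief set and every possible output, and allows the query only if the threshold holds in the worst case:

```python
def revise(prior, pred):
    """Condition a belief (dict of value -> probability) on a predicate."""
    post = {x: p for x, p in prior.items() if pred(x)}
    z = sum(post.values())
    return {x: p / z for x, p in post.items()}

def belief_set_allows(possible_ys, x_domain, t):
    """A enumerates every y that B might hold (the belief set), computes
    B's revised belief about x for each candidate and each possible
    output of Q(x, y) = (x >= y), and allows only in the worst case."""
    prior = {x: 1 / len(x_domain) for x in x_domain}
    worst = 0.0
    for y in possible_ys:
        for out in (True, False):
            post = revise(prior, lambda x: (x >= y) == out)
            if post:  # skip outputs impossible for this candidate y
                worst = max(worst, max(post.values()))
    return worst < t

x_domain = range(1, 11)
print(belief_set_allows({4, 5, 6}, x_domain, t=0.5))  # True: worst case is 1/3
print(belief_set_allows({2}, x_domain, t=0.5))        # False: output False reveals x = 1
```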

59 Approach 2: knowledge inference in SMC
[Timeline diagram: Peter (holding x) and the audience (holding y) issue queries over time. Q1(x, y) is answered (“okay”, Q1(x, y) = true), revising beliefs δ1 to δ1’; a later Q2(x, y) is rejected (“NOPE”) as beliefs δ2, δ2’, δ2’’ accumulate.]

60 SMC like a trusted third party
[Diagram: as before, but a trusted third party T mediates, tracking the beliefs; Q1(x, y) is answered (True) while Q2(x, y) is rejected (“NOPE”).]

61 Answer may depend on the secret
[Diagram: the same setup; whether Q2(x, y) is answered or rejected can itself depend on the parties’ secrets.]

62 SMC belief tracking: assessment
• Pros
  – More precise: the SMC can know each party’s secrets exactly
• Cons
  – A party is not allowed to know the others’ outcomes, so its knowledge estimate is incomplete (it is kept by the pseudo trusted party)
  – Expensive: SMC is bad enough without adding probabilistic programming; supporting it is still a matter of research

