1
Seminar in Foundations of Privacy: 1. Adding Consistency to Differential Privacy; 2. Attacks on Anonymized Social Networks. Inbal Talgam, March 2008
2
1. Adding Consistency to Differential Privacy
3
Differential Privacy 1977 Dalenius – the risk to one's privacy is the same with or without access to the DB. 2006 Dwork & Naor – impossible (auxiliary info). 2006 Dwork et al. – the risk is the same with or without participating in the DB. Plus: a strong mechanism of calibrated noise to achieve DP while maintaining accuracy. 2007 Barak et al. – adding consistency.
4
Setting – Contingency Table and Marginals. The DB has n participants, each described by k binary attributes. Terminology: the contingency table (private) has one count per attribute setting, 2^k entries in total; marginals (public) are its projections onto small attribute subsets, with 2^j or 2^i entries for j or i attributes, where j << k.
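To make the setting concrete, here is a minimal Python sketch (a made-up toy DB with n = 5 participants and k = 3 attributes; all names are illustrative) of building the 2^k-entry contingency table and projecting it onto a marginal over two attributes.

```python
import itertools
import numpy as np

k = 3
db = np.array([          # one row per participant, k binary attributes each
    [0, 1, 0],
    [0, 1, 1],
    [1, 0, 0],
    [0, 1, 0],
    [1, 0, 1],
])

# Contingency table: one count per attribute setting, 2^k entries in total.
settings = list(itertools.product([0, 1], repeat=k))
x = np.array([np.sum(np.all(db == s, axis=1)) for s in settings])

def marginal(x, attrs):
    """Project the contingency table onto a subset of attributes (2^j entries)."""
    out = {}
    for s, count in zip(settings, x):
        key = tuple(s[i] for i in attrs)
        out[key] = out.get(key, 0) + count
    return out

print(x.sum())              # = n = 5
print(marginal(x, [0, 2]))  # the 2-attribute marginal on attributes 0 and 2
```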
5
Main Contribution: solving the following consistency problem at low accuracy cost – adding noise directly to the marginals can produce values (fractional or negative entries, e.g. −0.5) that are not the marginals of any contingency table.
6
Outline: Discussion of (1) Privacy, (2) Accuracy & Consistency. Key method – the Fourier basis. The algorithm – Part I, Part II.
7
Privacy – Definition. Intuition: the risk is the same with or without participating in the DB. Definition: a randomized function K gives ε-differential privacy if for all DB1, DB2 differing on at most one element and for every set of outputs S, Pr[K(DB1) ∈ S] ≤ exp(ε) · Pr[K(DB2) ∈ S].
8
Privacy – Mechanism. Goal: release K(DB) = f(DB) + Noise instead of f(DB) itself. Laplace noise: Pr[K(DB) = a] ∝ exp(−||f(DB) − a||₁ / σ).
9
The Calibrated Noise Mechanism for DP. Main idea: the amount of noise added to f(DB) is calibrated to the sensitivity of f, denoted Δf. Definition: for f : D → R^d, the L1-sensitivity of f is Δf = max ||f(DB1) − f(DB2)||₁ over all DB1, DB2 differing on at most one element. All useful functions should be insensitive (e.g. marginals).
10
The Calibrated Noise Mechanism – How Much Noise. Main result: to ensure ε-differential privacy for a query of sensitivity Δf, add Laplace noise with σ = Δf/ε. Why does it work? Recall that Pr[K(DB) = a] ∝ exp(−||f(DB) − a||₁ / σ); for DB1, DB2 differing on one element, the probabilities of any output a differ by a factor of at most exp(||f(DB1) − f(DB2)||₁ / σ) ≤ exp(Δf/σ) = exp(ε).
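As a concrete illustration of this recipe, here is a minimal Python sketch of the Laplace mechanism; the function name and the counting-query example below are my own illustrations, not from the slides.

```python
import numpy as np

def laplace_mechanism(true_answer, delta_f, epsilon, rng=None):
    """Release f(DB) + Laplace noise with scale sigma = delta_f / epsilon."""
    rng = rng or np.random.default_rng()
    sigma = delta_f / epsilon
    noise = rng.laplace(loc=0.0, scale=sigma, size=np.shape(true_answer))
    return np.asarray(true_answer, dtype=float) + noise

# Example: a counting query ("how many rows satisfy a predicate?") changes by at most
# 1 when one row changes, so its L1-sensitivity is 1.
noisy_count = laplace_mechanism(true_answer=42, delta_f=1.0, epsilon=0.5)
```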
11
Accuracy & Consistency – two naive options. (a) Contingency table → marginals → add noise: compromises consistency – the noisy marginals (with entries like −0.5) may not be the marginals of any table, which leads to technical problems and confusion ("So smoking is one of the leading causes of statistics?"). (b) Contingency table + noise → marginals: consistent, but compromises accuracy – the non-calibrated, binomial noise in the marginals has Var = Θ(2^k).
12
Key Approach: a non-redundant representation, specific to the required marginals – write the contingency table and marginals using a small number of coefficients of the Fourier basis, then add noise, solve a linear program, and round. Consistency: any set of Fourier coefficients corresponds to a (fractional and possibly negative) contingency table. Accuracy: few Fourier coefficients are needed for low-order marginals, so low sensitivity and small error.
13
Accuracy – What is Guaranteed. Let C be the set of original marginals, each on ≤ j attributes, and let C' be the released marginals. With probability 1−δ, every marginal in C' is close to its original in L1 distance; the bound grows with the number of Fourier coefficients needed (hence with the order j and |C|) and with 1/ε, but not with 2^k. Remark: there is an advantage to working in the interactive model.
14
Outline: Discussion of (1) Privacy, (2) Accuracy & Consistency. Key method – the Fourier basis. The algorithm – Part I, Part II.
15
Notation & Preliminaries. ||x||₁ = Σ_α x_α, which for a contingency table equals n, the number of participants. We say α ≤ β if β has all of α's attributes (and possibly more), e.g. 0110 ≤ 0111 but not 0110 ≤ 0101. Introduce the linear marginal operator C^β, where β determines which attributes are kept: for each setting γ of the attributes in β, (C^β(x))_γ = Σ x_α over all α that agree with γ on the attributes of β. Notation to remember: x_α, α ≤ β, C^β(x), (C^β(x))_γ.
16
The Fourier Basis – an orthonormal basis for the space of contingency tables x ∈ R^{2^k}: f^α has entries (f^α)_γ = 2^{−k/2} (−1)^{⟨α,γ⟩}. Motivation: any marginal C^β(x) can be written as a combination of few f^α's. How few? Depends on the order of the marginal.
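A minimal Python sketch of these basis vectors for a toy k = 3 (the names fourier_vector and settings are illustrative), including a check of orthonormality:

```python
import itertools
import numpy as np

k = 3
settings = list(itertools.product([0, 1], repeat=k))

def fourier_vector(alpha):
    """The Fourier (Walsh) vector f^alpha with entries 2^(-k/2) * (-1)^<alpha, gamma>."""
    return np.array([(-1) ** int(np.dot(alpha, g)) for g in settings]) / 2 ** (k / 2)

# Orthonormality: <f^alpha, f^beta> is 1 when alpha == beta and 0 otherwise.
F = np.array([fourier_vector(a) for a in settings])   # 2^k x 2^k matrix
assert np.allclose(F @ F.T, np.eye(2 ** k))
```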
17
Writing Marginals in the Fourier Basis. Theorem: the marginal of x on attributes β is determined by the Fourier coefficients with α ≤ β: C^β(x) = Σ_{α ≤ β} ⟨x, f^α⟩ · C^β(f^α). Proof sketch: write x in the Fourier basis, x = Σ_α ⟨x, f^α⟩ f^α, and apply C^β using linearity; for any coordinate of C^β(f^α), the definitions of the marginal operator and the Fourier vectors show that summing over an attribute outside β on which α is 1 cancels, so C^β(f^α) = 0 whenever α ≰ β.
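A small numeric check of the theorem, continuing the Fourier-basis sketch above (it reuses k, settings and fourier_vector from that block; leq and marginal_op are illustrative helpers):

```python
def leq(alpha, beta):
    """alpha <= beta: beta has all of alpha's attributes."""
    return all(a <= b for a, b in zip(alpha, beta))

def marginal_op(v, beta):
    """Apply C^beta to a vector v over all 2^k settings (sum out attributes not in beta)."""
    attrs = [i for i, b in enumerate(beta) if b]
    out = {}
    for s, val in zip(settings, v):
        key = tuple(s[i] for i in attrs)
        out[key] = out.get(key, 0.0) + val
    return np.array([out[key] for key in sorted(out)])

x = np.random.default_rng(0).integers(0, 5, size=2 ** k).astype(float)  # toy table
beta = (1, 0, 1)
lhs = marginal_op(x, beta)
rhs = sum(np.dot(x, fourier_vector(a)) * marginal_op(fourier_vector(a), beta)
          for a in settings if leq(a, beta))
assert np.allclose(lhs, rhs)   # C^beta(x) = sum_{alpha <= beta} <x, f^alpha> C^beta(f^alpha)
```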
18
Outline: Discussion of (1) Privacy, (2) Accuracy & Consistency. Key method – the Fourier basis. The algorithm – Part I: adding calibrated noise; Part II: non-negativity by linear programming.
19
Algorithm – Part I. INPUT: the required marginals {C^β}. {f^α} = the Fourier vectors needed to write those marginals; releasing the marginals {C^β(x)} is equivalent to releasing the coefficients {⟨x, f^α⟩}. OUTPUT: noisy coefficients {Φ_α}. METHOD: add calibrated noise; the sensitivity depends on |{α}|, the number of released coefficients, which depends on the order of the C^β's.
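A minimal, self-contained Python sketch of this step: compute only the Fourier coefficients needed for the requested marginals and perturb them with Laplace noise. The sensitivity calibration below (each coefficient moves by at most 2·2^(−k/2) when one row changes, times the number of released coefficients) is my own back-of-the-envelope reading; the exact constants in the paper may differ.

```python
import itertools
import numpy as np

k = 3
settings = list(itertools.product([0, 1], repeat=k))

def fourier_vector(alpha):
    return np.array([(-1) ** int(np.dot(alpha, g)) for g in settings]) / 2 ** (k / 2)

def leq(alpha, beta):
    return all(a <= b for a, b in zip(alpha, beta))

def noisy_fourier_coefficients(x, requested_betas, epsilon, rng=None):
    """Part I: noisy coefficients Phi_alpha for every alpha below some requested beta."""
    rng = rng or np.random.default_rng()
    needed = sorted({a for a in settings for b in requested_betas if leq(a, b)})
    # One DB row changing moves one unit between two table cells, so each coefficient
    # changes by at most 2 * 2^(-k/2); releasing |needed| coefficients scales this.
    sensitivity = 2 * len(needed) / 2 ** (k / 2)
    scale = sensitivity / epsilon
    return {a: float(np.dot(x, fourier_vector(a))) + rng.laplace(0.0, scale) for a in needed}
```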
20
Part II – Non-negativity by LP. INPUT: noisy coefficients {Φ_α}. OUTPUT: a non-negative contingency table x'. METHOD: minimize the largest deviation from the noisy Fourier coefficients: minimize b subject to x'_γ ≥ 0 for all γ and |Φ_α − ⟨x', f^α⟩| ≤ b for all α. Most entries x'_γ in a vertex solution are 0, so rounding x' to an integer table adds only a small error.
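A sketch of this LP using scipy.optimize.linprog (the solver choice and the name fit_nonnegative_table are my own; the slides only specify the LP itself). The variables are the 2^k table entries plus the slack b, and the objective is to minimize b.

```python
import numpy as np
from scipy.optimize import linprog

def fit_nonnegative_table(phi, fourier_vector, num_cells):
    """Part II: phi maps alpha -> noisy coefficient; returns (rounded table, slack b)."""
    alphas = list(phi)
    F = np.array([fourier_vector(a) for a in alphas])            # |A| x 2^k
    c = np.concatenate([np.zeros(num_cells), [1.0]])             # minimize b
    # |phi_alpha - <x', f^alpha>| <= b, split into two one-sided constraints:
    #   <f^alpha, x'> - b <= phi_alpha   and   -<f^alpha, x'> - b <= -phi_alpha
    A_ub = np.block([[F, -np.ones((len(alphas), 1))],
                     [-F, -np.ones((len(alphas), 1))]])
    b_ub = np.concatenate([[phi[a] for a in alphas],
                           [-phi[a] for a in alphas]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (num_cells + 1))          # x' >= 0, b >= 0
    x_prime = res.x[:num_cells]
    return np.round(x_prime), res.x[-1]
```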
21
Algorithm Summary. Input: contingency table x, required marginals {C^β}. Output: the marginals {C^β(x'')} of a new contingency table x''. Part I: compute {f^α}, the Fourier vectors needed to write the marginals, and release noisy Fourier coefficients {Φ_α}. Part II: find a non-negative x' with nearly the correct Fourier coefficients, then round to x''.
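For completeness, a hypothetical end-to-end run chaining the two sketches above (it reuses k, fourier_vector, noisy_fourier_coefficients and fit_nonnegative_table from those blocks; the toy table and requested marginals are made up):

```python
rng = np.random.default_rng(1)
x = np.array([3, 1, 0, 2, 4, 0, 1, 2], dtype=float)   # a toy 2^3-cell table, n = 13
requested = [(1, 1, 0), (0, 1, 1)]                    # two 2-attribute marginals
phi = noisy_fourier_coefficients(x, requested, epsilon=1.0, rng=rng)
x_rounded, slack = fit_nonnegative_table(phi, fourier_vector, num_cells=2 ** k)
# x_rounded is a non-negative integer table; its marginals on the requested attribute
# sets approximate those of x and are mutually consistent by construction.
```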
22
Accuracy Guarantee – Revisited. With probability 1−δ, the L1 error of each released marginal is bounded in terms of the number of Fourier coefficients used (#Coefficients), ε, and δ.
23
Summary & Open Questions. An algorithm for marginals release that guarantees privacy, accuracy & consistency. Consistency: one can reconstruct a synthetic, consistent table. Accuracy: the error increases smoothly with the order of the marginals. Open questions: improving efficiency; the effect of the noise on the marginals' statistical properties.
24
Any Questions?