Differential Privacy: A Primer for the Perplexed

Presentation transcript:

Differential Privacy: A Primer for the Perplexed. Cynthia Dwork, Frank McSherry, Adam Smith, Kobbi Nissim.

Differential Privacy  
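This slide carries only its heading in the transcript; the body was presumably an image. For reference, the standard definition of ε-differential privacy (Dwork, McSherry, Nissim, and Smith, 2006) is:

```latex
% A randomized mechanism M is \varepsilon-differentially private if, for all
% neighboring datasets D, D' (differing in one record) and all output sets S:
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S]
```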

Differential Privacy is a criterion, not a specific algorithm. Many randomized algorithms provide ε-DP. Discovering that one of them works poorly for your problem says only that; others may work better. On the other hand, using just noisy counts we can produce a cumulative distribution function:
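As a concrete illustration of the noisy-counts idea (not code from the original talk), the sketch below adds Laplace noise to each bucket of a histogram and accumulates the noisy counts into an approximate CDF. The bucket edges, the value of ε, and the function name are illustrative assumptions.

```python
import numpy as np

def noisy_cdf(values, bucket_edges, epsilon, rng=None):
    """Release an approximate CDF from Laplace-noised bucket counts.

    Under add/remove adjacency, one record changes a single bucket count
    by one, so the histogram has L1 sensitivity 1 and adding
    Laplace(1/epsilon) noise to every count is epsilon-DP.  Everything
    after that (cumulative sums, normalization, clipping) is post-processing.
    """
    rng = np.random.default_rng() if rng is None else rng
    counts, _ = np.histogram(values, bins=bucket_edges)
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    running = np.maximum.accumulate(np.cumsum(noisy))  # enforce monotonicity
    total = max(running[-1], 1e-9)                     # avoid dividing by <= 0
    return np.clip(running / total, 0.0, 1.0)

# Illustrative use: ages of survey respondents, epsilon = 0.5
ages = np.array([23, 31, 35, 42, 42, 57, 61, 68])
edges = np.arange(0, 101, 10)
print(noisy_cdf(ages, edges, epsilon=0.5))
```

The released curve is noisy but still usable for coarse questions (medians, quantiles), which is the point of the slide: simple noisy counts already support useful statistics.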

Differential Privacy can't "make" privacy. Imagine that a DP analysis teaches us that smokers are at risk for cancer, and that you smoke in public. DP has not violated your privacy: every conclusion about you could have been reached without your data. DP masks the nature of an individual's participation in a survey and limits what a release reveals about any single record. It does not manufacture privacy where none exists.

Differential Privacy is good research.