I have a DREAM! (DiffeRentially privatE smArt Metering) Gergely Acs and Claude Castelluccia, INRIA, 2011


Smart Metering
Electricity suppliers are deploying smart meters that report energy consumption periodically (every few minutes). Smart metering should improve energy management (for suppliers and customers) and is part of the Smart Grid (a critical infrastructure).

Privacy?

[Figure: household load curve with labeled appliance signatures: hoover, microwave, kettle, fridge, lighting.]

Motivation: Privacy/Security
Potential threats:
– Profiling: increase in the granular collection, use and disclosure of personal energy information; data linkage of personally identifiable information with energy use; creation of an entirely new "library" of personal information
– Security: is someone at home?
We want to prevent:
– Suppliers from profiling customers
– Attackers from getting private information

Contributions
First provably private scheme for smart metering:
– No need for a trusted aggregator
– No assumptions about the adversary's power (knowledge)
– Remains useful for the supplier
– Robust against node failures!
– Secure against colluding malicious users
Validated by simulations, using a new simulator to generate synthetic consumption data.

Overview
– Model: adversary model, network model, privacy model
– Our scheme: distributed aggregation with encryption
– Performance and privacy analysis
– Conclusions

Model
Dishonest-but-non-intrusive adversary:
– may not follow the protocol correctly
– may collude with malicious users
– BUT: cannot access the distribution network (e.g., to install wiretapping devices)
Network model:
– No communication between meters!
– Each meter has a public/private key pair
Privacy model:
– Differential privacy

Why Differential Privacy?
There are several possible models (k-anonymity, l-diversity, …). We use the differential privacy model because it is:
– The only model that makes no assumptions about the attacker
– Accompanied by a simple off-the-shelf sanitization technique
– Strong (too strong?) and provides provable privacy!

The Differential Privacy Model
Informally, a sanitization algorithm A is differentially private if its output is insensitive to changes in any individual value.
Definition: A is ε-differentially private if, for any two datasets (sets of traces) I and I' differing in only one user, and any output x:
Pr[A(I) = x] ≤ e^ε · Pr[A(I') = x]
First model that provides provable privacy, and it makes no assumptions about the adversary! Very strong (too strong?).

Sanitization
It was shown that a simple solution is to add noise to each sample in each slot. It can be shown that if:
1. the noise follows a Laplace distribution Lap(λ), and
2. λ = Δ/ε, where λ is the scale parameter of the Laplace distribution and Δ is the sensitivity (i.e., the maximum value a sample can take),
then publishing sample + Lap(Δ/ε) is ε-differentially private in each slot.
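A minimal sketch of this per-slot sanitization, assuming numpy; the 1350 Wh sensitivity and ε = 0.1 are the values from the example slide below, and the readings are hypothetical:

```python
import numpy as np

def sanitize(sample_wh, sensitivity_wh=1350.0, epsilon=0.1):
    """Add Laplace noise calibrated to the sensitivity (Delta) and epsilon."""
    scale = sensitivity_wh / epsilon                 # lambda = Delta / epsilon
    return sample_wh + np.random.laplace(loc=0.0, scale=scale)

# One noised reading per slot; each slot is independently epsilon-DP.
readings = [320.0, 540.0, 110.0, 975.0]              # hypothetical per-slot Wh values
noised = [sanitize(r) for r in readings]
```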

Sanitization: Example
[Table: four time slots with columns Time slot, Original (Wh), Max user (Wh) and Noised data (Wh); each slot's reading is noised with Lap(1350/0.1), and a final row gives the sums over the 4 slots.]

Aggregating Data
[Diagram: meters send their samples to an aggregator, which forwards the aggregate to the Electricity Supplier.]
The supplier gets the (noisy) aggregated value but can't recover individual samples!

Error/utility The larger the cluster, the better the utility…but the smaller the granularity

Noised Aggregated Data
[Plots: sum of N samples plus Laplace noise, for N = 200 and N = 600.]

Aggregating Data: Pros/Cons
Pros:
– Great solution to reduce noise/error
– … and still generate useful (aggregated) data for the supplier
– … with strict privacy guarantees
Cons:
– Aggregators have to be trusted!
– Who can be the aggregator? The supplier? The network?
Can we get rid of the aggregator and still perform aggregation??

Distributed Aggregation
[Diagram: meters report directly to the Electricity Supplier, with no intermediate aggregator.]

Our Approach: Distributed Aggregation
Step 1: Distributed noise generation
– We use the fact that Laplace noise can be generated as a sum of gamma noises: Lap(λ) = Σ_{i=1..N} (G_i − G'_i), where the G_i and G'_i are i.i.d. Gamma(1/N, λ)
– Each node adds its own share of gamma noise, G_i − G'_i, to its sample and sends the result to the supplier
– When the noised samples are aggregated by the supplier, the noise shares add up to Laplace noise…
– No more aggregator needed! (See the sketch below.)
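A small sketch of this divisibility property, assuming numpy; the cluster size and readings are illustrative. Each of the N meters draws G − G' with G, G' ~ Gamma(1/N, λ), and the sum of the N shares is distributed as Lap(λ):

```python
import numpy as np

N = 200                          # number of meters in the cluster
sensitivity, epsilon = 1350.0, 1.0
lam = sensitivity / epsilon      # Laplace scale parameter

def noise_share(n=N, scale=lam):
    """One meter's share: difference of two Gamma(1/n, scale) draws.
    Summed over n meters, the shares follow Lap(scale)."""
    return np.random.gamma(1.0 / n, scale) - np.random.gamma(1.0 / n, scale)

samples = np.random.uniform(0, sensitivity, size=N)   # hypothetical readings
noised = samples + np.array([noise_share() for _ in range(N)])
aggregate = noised.sum()         # distributed as samples.sum() + Lap(lam)
```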

Problem
[Plots: original data vs. gamma-noised data.]
The gamma noise added by a single node is too small to guarantee privacy of its individual measurements: the supplier can possibly retrieve a sample value from its noised version!

Step 2: Encrypting noised samples
[Diagram: each meter encrypts its noised sample before sending it to the Electricity Supplier.]

Performance and privacy analysis
– A new trace generator
– Error depending on the number of users
– Privacy over multiple slots: privacy of appliance usage and of different activities (cooking, watching TV, …), and privacy of being home

Trace generation

Error and the number of users
[Plot: error vs. number of users; ε guaranteed over a single slot.]

Privacy of appliances
[Plot.] Noise is added to guarantee ε = 1 per slot; the error is 0.17 with 100 users.

Privacy of the simultaneous usage of active appliances (Are you at home?)
[Plot: ε over time.] 0.17 error for 100 users (ε = 1 per slot).

Privacy of the simultaneous usage of all appliances
[Plot: ε over time.] 0.17 error for 100 users (ε = 1 per slot).

Conclusion
– First practical scheme that provides formal privacy and utility guarantees
– Our scheme uses aggregation + noise
– Validation based on realistic datasets (generated by a simulator)
– We can guarantee meaningful privacy for some activities (or appliances) but cannot hide everything!
– Privacy can be increased by adding more noise, but then more users are needed to keep the error low!

Encryption
Modulo-addition based: Enc(x_i) = x_i + k_i mod m, where the key k_i is not known to the supplier and the keys are generated such that Σ_i k_i ≡ 0 (mod m), so they cancel out when the ciphertexts are added.

Key generation
Each node pair shares a symmetric key. Each node randomly picks x other nodes, such that if v selects w then w also selects v. Example for two nodes:
1. v selects w (and w selects v)
2. v and w generate opposite encryption keys from their shared key: k_v = K_{v,w} and k_w = −K_{v,w}
3. v → supplier: x_v + k_v mod m
4. w → supplier: x_w + k_w mod m
The supplier decrypts by adding the ciphertexts: the keys cancel, leaving x_v + x_w mod m.
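A toy sketch of modulo-addition encryption with pairwise canceling keys, assuming only Python's standard library; the modulus, the pairing, and the sample values are illustrative, not the paper's exact parameters:

```python
import secrets

M = 2**32                          # modulus; must exceed the largest possible sum

def pairwise_keys(pairs, num_nodes):
    """For each selected pair (v, w), draw a random K; v adds +K and w adds -K,
    so all keys cancel in the supplier's sum."""
    keys = [0] * num_nodes
    for v, w in pairs:
        K = secrets.randbelow(M)
        keys[v] = (keys[v] + K) % M
        keys[w] = (keys[w] - K) % M
    return keys

def encrypt(sample, key):
    return (sample + key) % M      # modulo-addition "one-time pad"

# Example: 4 nodes, mutual selections (0,1) and (2,3).
samples = [320, 540, 110, 975]     # hypothetical noised readings
keys = pairwise_keys([(0, 1), (2, 3)], len(samples))
ciphertexts = [encrypt(s, k) for s, k in zip(samples, keys)]

aggregate = sum(ciphertexts) % M   # keys cancel: equals sum(samples) % M
assert aggregate == sum(samples) % M
```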

Security analysis
Misbehaving users:
– The supplier can deploy fake meters (an α fraction of the N nodes), or some users may collude with the supplier and omit adding noise
– Each user adds extra noise to tolerate this attack… (a sketch of one way to do this follows)
– The supplier may also lie about the cluster size
… see the report for proofs/details.
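One plausible reading of the "extra noise" countermeasure (my assumption about the mechanics, not the report's exact construction): honest nodes size their gamma shares as if only the (1 − α)·N honest nodes contribute, so the honest shares alone still sum to full Laplace noise:

```python
import numpy as np

def robust_noise_share(n_total, alpha, scale):
    """Gamma share sized for the (1 - alpha) fraction of honest nodes, so that
    privacy holds even if alpha * n_total nodes add no noise at all."""
    n_honest = max(1, int((1 - alpha) * n_total))
    return np.random.gamma(1.0 / n_honest, scale) - np.random.gamma(1.0 / n_honest, scale)
```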

Error and the number of misbehaving users
[Plot: error vs. number of misbehaving users (ε = 1 per slot).]

Why is aggregation not enough? Why does noise have to be added?
Because we make no assumptions about the adversary's knowledge…
– E.g., if the adversary knows N−1 of the values, it can derive the N-th value, even with aggregation and encryption
– But it can't get any info about the N-th value if noise is added ;-)
– Very strong guarantee!

Laplace Distribution

Privacy over multiple slots
Composition property of differential privacy: if we have ε₁-privacy and ε₂-privacy in two different slots, then we have (ε₁+ε₂)-privacy over the two slots.
Note that ε = 1 is only an upper bound (holding for all users) in each slot! Since the noise is calibrated to the maximum consumption Δ, a user with actual consumption c(t) in slot t gets the tighter per-slot bound ε·c(t)/Δ. Over multiple slots, the exact bound is obtained by adding these values: Σ_t ε·c(t)/Δ.
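A worked instance of this composition under assumed values (Δ = 1350 Wh and ε = 1 per slot, as on the earlier slides; c(t) = 270 Wh is a hypothetical appliance consumption over 4 slots):

```latex
% Per-slot privacy loss, then composition over 4 slots
\varepsilon(t) = \varepsilon \cdot \frac{c(t)}{\Delta}
              = 1 \cdot \frac{270}{1350} = 0.2
\qquad
\varepsilon_{\mathrm{total}} = \sum_{t=1}^{4} \varepsilon(t) = 4 \cdot 0.2 = 0.8
```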

Example

Differential Privacy Model: Interpretation
Was the input I or I'??? Similar idea to indistinguishability in crypto…
The output shifts the adversary's belief by a factor of at most e^ε:
– If ε = 1: the probability ratio is bounded by e^1 ≈ 2.72
– If ε = 0.5: bounded by e^0.5 ≈ 1.65
– If ε = 0.1: bounded by e^0.1 ≈ 1.11