Quantification of Integrity
Michael Clarkson and Fred B. Schneider
Cornell University
RADICAL, May 10, 2010

Goal
An information-theoretic quantification of programs' impact on the integrity of information [Denning 1982], and its relationship to database privacy.

What is Integrity?
Databases: constraints that relations must satisfy; provenance of data; utility of anonymized data.
Common Criteria: protection of assets from unauthorized modification.
Biba (1977): guarantee that a subsystem will perform as it was intended; isolation necessary for protection from subversion; dual to confidentiality.
…no universal definition.

Our Notions of Integrity

  Starting point        Corruption measure
  Taint analysis        Contamination
  Program correctness   Suppression

Corruption: damage to integrity.
Contamination: bad information present in the output.
Suppression: good information lost from the output.
…distinct, but they interact.

Contamination
Goal: model taint analysis, in which untrusted input contaminates trusted output.
[Diagram: a program with trusted input and output (user) and untrusted input and output (attacker).]

Contamination
In the program o := (t, u), the untrusted input u contaminates the trusted output o. (Can't u be filtered from o?)

Quantification of Contamination
Use information theory: information is surprise.
X, Y, Z: distributions.
I(X, Y): mutual information between X and Y (in bits).
I(X, Y | Z): conditional mutual information.
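These quantities are standard; a minimal sketch of computing them for finite joint distributions follows (not from the slides; representing a joint distribution as a dict mapping outcome tuples to probabilities is an assumption of this sketch).

import math
from collections import defaultdict

def entropy(dist):
    # Shannon entropy (in bits) of a distribution {outcome: probability}.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, idx):
    # Marginalize a joint distribution {(x, y, ...): p} onto the given indices.
    m = defaultdict(float)
    for outcome, p in joint.items():
        m[tuple(outcome[i] for i in idx)] += p
    return m

def mutual_information(joint):
    # I(X, Y) for {(x, y): p}, via the identity H(X) + H(Y) - H(X, Y).
    return (entropy(marginal(joint, (0,))) + entropy(marginal(joint, (1,)))
            - entropy(joint))

def conditional_mutual_information(joint):
    # I(X, Y | Z) for {(x, y, z): p},
    # via H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z).
    return (entropy(marginal(joint, (0, 2))) + entropy(marginal(joint, (1, 2)))
            - entropy(joint) - entropy(marginal(joint, (2,))))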

Quantification of Contamination
Contamination = I(U_in, T_out | T_in)
[Diagram: the program takes trusted input T_in (user) and untrusted input U_in (attacker), and produces trusted output T_out.]
[Newsome et al. 2009]; the dual of [Clark et al. 2005, 2007].

Example of Contamination
For the program o := (t, u):
Contamination = I(U, O | T) = k bits, if U is uniform on [0, 2^k - 1].
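As a check, a toy instantiation of this example using the helpers above (the choice k = 3 and a uniform 2-bit T are illustrative assumptions, not from the slides).

# o := (t, u): the output is the pair (t, u).
# With U uniform on [0, 2^k - 1], contamination should be k bits.
k = 3
us = range(2 ** k)
ts = range(4)  # T uniform on a small range (illustrative)
joint = {(u, (t, u), t): 1.0 / (len(us) * len(ts)) for u in us for t in ts}
print(conditional_mutual_information(joint))  # prints 3.0, i.e., k bits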

Our Notions of Integrity (recap)

  Starting point        Corruption measure
  Taint analysis        Contamination
  Program correctness   Suppression

Corruption: damage to integrity.
Contamination: bad information present in the output.
Suppression: good information lost from the output.

Program Suppression
Goal: model program (in)correctness.
[Diagram: a specification maps the trusted sender's input to the correct output; an implementation, which additionally has untrusted attacker input and output, produces the real output. Information about the correct output is suppressed from the real output.]

Example of Program Suppression
The array a[0..m-1] is trusted.

Spec.:   for (i = 0; i < m; i++) { s := s + a[i]; }

Impl. 1: for (i = 1; i < m; i++) { s := s + a[i]; }
         Suppression: a[0] is missing. No contamination.

Impl. 2: for (i = 0; i <= m; i++) { s := s + a[i]; }
         Suppression: a[m] is added. Contamination.

Suppression vs. Contamination
[Diagram: the echo program output := input, with attacker influence; it illustrates contamination (bad information added to the output) and suppression (good information lost from the output).]

Quantification of Program Suppression
[Diagram: the sender's trusted input T_in feeds both the specification (output Spec) and the implementation (output Impl); the implementation additionally takes untrusted attacker input U_in.]
Program transmission = I(Spec, Impl)

Quantification of Program Suppression
H(X): entropy (uncertainty) of X.
H(X | Y): conditional entropy of X given Y.
Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)
Here H(Spec) is the total information to learn about Spec; I(Spec, Impl) is the information actually learned about Spec by observing Impl; and H(Spec | Impl) is the information NOT learned about Spec by observing Impl.

Quantification of Program Suppression
Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)
Program suppression = H(Spec | Impl)
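These can be computed directly from a joint distribution over (spec_out, impl_out) pairs; a sketch reusing the helpers above (the dict representation is again an assumption of the sketch).

def conditional_entropy(joint):
    # H(X | Y) for {(x, y): p}, via H(X, Y) - H(Y).
    return entropy(joint) - entropy(marginal(joint, (1,)))

def program_suppression(joint):
    # Program suppression = H(Spec | Impl) for {(spec_out, impl_out): p}.
    return conditional_entropy(joint)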

Example of Program Suppression
Let A be the distribution of an individual array element.

Spec.:   for (i = 0; i < m; i++) { s := s + a[i]; }

Impl. 1: for (i = 1; i < m; i++) { s := s + a[i]; }
         Suppression = H(A)

Impl. 2: for (i = 0; i <= m; i++) { s := s + a[i]; }
         Suppression ≤ H(A)
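A toy check of the Impl. 1 row (the choices m = 2 and independent uniform bits for the array elements are assumptions made for illustration, giving H(A) = 1 bit).

from itertools import product

m = 2
joint = defaultdict(float)
for a in product((0, 1), repeat=m):
    spec_out = sum(a)        # Spec sums a[0..m-1]
    impl1_out = sum(a[1:])   # Impl. 1 misses a[0]
    joint[(spec_out, impl1_out)] += 1.0 / 2 ** m
print(program_suppression(joint))  # prints 1.0, i.e., H(A) for a uniform bit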

Suppression and Confidentiality
Declassifier: a program that reveals (leaks) some information and suppresses the rest.
Leakage: [Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009]
Thm. Leakage + Suppression is a constant: what isn't leaked is suppressed.
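A sketch of why the theorem holds in the simplest setting, under the assumption that leakage is I(S, O) and suppression is H(S | O), for secret input S and observable output O:

Leakage + Suppression = I(S, O) + H(S | O) = (H(S) − H(S | O)) + H(S | O) = H(S),

which depends only on the distribution of the secret input, not on the declassifier.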

Database Privacy
A statistical database anonymizes query results:
…sacrifices utility for privacy's sake;
…suppresses to avoid leakage;
…sacrifices integrity for confidentiality's sake.
[Diagram: the user's query passes through an anonymizer to the database; the database's response returns through the anonymizer as an anonymized response.]

k-anonymity
DB: every individual must be anonymous within a set of size k [Sweeney 2002].
Programs: every output corresponds to k inputs.
But what about background knowledge?
…no bound on leakage;
…no bound on suppression.
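A minimal sketch of the program analogue (the function name and the enumerable input set are assumptions of this sketch; real programs need not be pure functions over small domains).

def is_k_anonymous(program, inputs, k):
    # Every observed output must be producible by at least k distinct inputs.
    preimages = defaultdict(set)
    for x in inputs:
        preimages[program(x)].add(x)
    return all(len(pre) >= k for pre in preimages.values())

# Truncating a 4-bit value to its top two bits maps 4 inputs to each output:
print(is_k_anonymous(lambda x: x >> 2, range(16), 4))  # prints True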

L-diversity
DB: every individual's sensitive information should appear to have L (roughly) equally likely values [Machanavajjhala et al. 2007].
Entropy L-diversity: H(anon. block) ≥ log L [Øhrn and Ohno-Machado 1999, Machanavajjhala et al. 2007].
Programs: H(T_in | t_out) ≥ log L (if T_in is uniform).
…implies suppression ≥ log L.
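The program-level condition can be checked in the same style (a sketch under the same assumptions as above, reusing the entropy helper and imports; for uniform T_in it reduces to requiring every preimage to have at least L elements).

def entropy_l_diversity_holds(program, input_dist, L):
    # Check H(T_in | t_out) >= log2(L) for every output t_out,
    # given an input distribution {t_in: p}.
    by_out = defaultdict(dict)
    for t_in, p in input_dist.items():
        by_out[program(t_in)][t_in] = p
    for cond in by_out.values():
        total = sum(cond.values())
        if entropy({t: p / total for t, p in cond.items()}) < math.log2(L):
            return False
    return True

# With T_in uniform on 16 values and 4-to-1 truncation, H(T_in | t_out) = 2,
# so entropy 4-diversity holds (log2 4 = 2):
uniform = {x: 1 / 16 for x in range(16)}
print(entropy_l_diversity_holds(lambda x: x >> 2, uniform, 4))  # prints True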

Summary
Measures of information corruption:
Contamination (generalizes taint analysis).
Suppression (generalizes program correctness).
Application: database privacy (model anonymizers; relate utility and privacy).

More Integrity Measures
Channel suppression: the same as the channel model from information theory, but with an attacker.
Attacker- and program-controlled suppression.
Belief-based measures [Clarkson et al. 2005], which generalize the information-theoretic measures.
Granularity: average over all executions; single executions; sequences of executions.