TRUST Area 3 Overview: Privacy, Usability, & Social Impact


1 TRUST Area 3 Overview: Privacy, Usability, & Social Impact
Doug Tygar, UC Berkeley
NSF STC Review, September 13, 2004

2 Security cannot be understood in isolation
Computer security arises from human needs for information. If we view it purely as a mathematical science, we miss important aspects of the problem.
Example: problems using encryption (“Why Johnny Can’t Encrypt”)
Our project fully incorporates these aspects:
- Economics, Public Policy, and Societal Challenges
- Digital Forensics and Privacy
- Human-Computer Interfaces and Security
We integrate these issues in all aspects of our study.

3 Economics, Public Policy & Societal Challenges
Team members: McFadden, Samuelson, Varian, Weber
Insurance is often a way of enforcing desirable norms: e.g., business fire insurance requires fire safety measures.
Requirement: the party with control bears the liability. Example: ATMs in the UK and the US.
Economic analysis changes:
- Attacks can be deliberate, not simply accidents
- Weakest-link model
- Transaction costs associated with security
A toy illustration of the weakest-link model follows.
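In the weakest-link model, overall security is set by the least-protected participant, so extra spending by an already-strong party buys nothing. A minimal sketch with hypothetical effort values (the numbers and party names are our own, not from the slides):

```python
# Weakest-link model: system security equals the minimum protection
# level across all participants, so the least-protected party
# determines the security of the whole system.
efforts = {"bank": 0.9, "merchant": 0.7, "customer": 0.3}  # hypothetical

system_security = min(efforts.values())
weakest = min(efforts, key=efforts.get)

print(f"system security = {system_security}")  # 0.3
print(f"weakest link    = {weakest}")          # customer
# Raising the bank's effort from 0.9 to 1.0 changes nothing;
# only improving the customer's protection raises the minimum.
```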

4 Human-Computer Interfaces & Security
Team members: Garcia-Molina, Perrig, Reiter, Song, Tygar
Most common source of security problems (by far): people can’t figure out how to configure the software.
Problems:
- System complexity
- Software complexity
- People have trouble generating random values such as passwords (a task machines handle well; see the sketch below)
- People have trouble remembering long strings
- Low tolerance for noticing small changes in repetitive tasks
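Because humans are poor sources of randomness, password generation is usually delegated to the machine. A minimal sketch using Python's standard `secrets` module (the word list and lengths are illustrative assumptions):

```python
import secrets
import string

def random_password(length: int = 16) -> str:
    """Generate a password from a cryptographically secure RNG."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def random_passphrase(words: list[str], count: int = 5) -> str:
    """Passphrases trade extra length for memorability."""
    return "-".join(secrets.choice(words) for _ in range(count))

print(random_password())
# Illustrative word list; a real one would be much larger.
print(random_passphrase(["ocean", "maple", "rocket", "velvet", "cedar", "prism"]))
```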

5 Digital Forensics & Privacy
Team members: Birman, Boneh, Mitchell, Reiter, Samuelson, Tygar, Weber
Example challenge problems:
- Privacy-preserving data mining (law enforcement)
- Peer-to-peer privacy and security
- Privacy in sensor networks
- Identity theft
Mechanisms:
- Strong audit
- Selective revelation of information
- Rule-processing technologies
The next few slides explore this area more deeply.

6 Strategy: Selective revelation
Architecture based on selective revelation.
Goal: minimize revelation of personal data while supporting analysis.
Approach: partial, incremental revelation of personal data.
Procedure:
- Initial revelation by statistics & categories
- Subsequent revelation as justified by earlier results
Supports both “standing” and real-time queries. A sketch of the staged procedure follows.
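One way to read the procedure is as a ladder of disclosure levels, where each step up must be justified by what the previous level showed. A minimal sketch (the class, level names, and justification flow are our own, not from the slides):

```python
# Staged disclosure: aggregate statistics first, categories next,
# identified records only with explicit justification.
LEVELS = ["statistics", "categories", "records"]

class SelectiveRevelation:
    def __init__(self, records):
        self.records = records
        self.granted = {"statistics"}   # initial revelation is free

    def escalate(self, level: str, justification: str) -> None:
        """Unlock the next level; in a real system an authority
        would review the justification before granting it."""
        assert LEVELS.index(level) == len(self.granted), "no skipping levels"
        print(f"Escalating to {level!r}: {justification}")
        self.granted.add(level)

    def query(self, level: str):
        if level not in self.granted:
            raise PermissionError(f"{level!r} not yet justified")
        if level == "statistics":
            return {"count": len(self.records)}
        if level == "categories":
            return sorted({r["category"] for r in self.records})
        return self.records   # full, identified records

db = SelectiveRevelation([{"name": "A", "category": "wire-transfer"},
                          {"name": "B", "category": "wire-transfer"}])
print(db.query("statistics"))                       # {'count': 2}
db.escalate("categories", "count exceeds threshold")
print(db.query("categories"))                       # ['wire-transfer']
```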

7 Idealized architecture
Core idea:
(1) Analyze data behind the privacy/security barrier, inside the data repositories, to find critical relationships.
(2) Reveal relationships selectively, only through a guarded interface.
Initial revelation is of sanitized data; discovery proceeds via standing queries or real-time search. A sketch of the guarded interface follows.
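The barrier can be read as an API boundary: raw records never cross it, and only vetted relationship summaries do. A minimal sketch of such a guarded interface (the class, records, and sanitization rule are illustrative assumptions):

```python
class GuardedRepository:
    """Analysis runs inside; only sanitized relationship
    summaries ever cross the privacy/security barrier."""

    def __init__(self, records):
        self._records = records            # never exposed directly

    def _find_links(self):                 # runs behind the barrier
        by_account = {}
        for r in self._records:
            by_account.setdefault(r["account"], []).append(r["name"])
        return {a: ns for a, ns in by_account.items() if len(ns) > 1}

    def query_links(self):                 # the guarded interface
        # Sanitize: report that a shared account exists and how many
        # parties share it, but not who those parties are.
        return [{"account": a, "parties": len(ns)}
                for a, ns in self._find_links().items()]

repo = GuardedRepository([{"name": "A", "account": "X1"},
                          {"name": "B", "account": "X1"}])
print(repo.query_links())   # [{'account': 'X1', 'parties': 2}]
```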

8 Distributed architecture
Multiple repositories → multiple privacy/security barriers.

9 Audit
Protect against abuse by “watching the watchers”.
Design goals:
- Distributed audit: everyone is subject to audit
- Cross-organizational audit: measure the accuracy of auditors by cross-validation
- Usage records are tamper-evident (a hash-chain sketch follows this slide)
Hall of mirrors: audit itself has a privacy problem
- Data sets are voluminous
- Usage records are sensitive
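Tamper evidence for logs is commonly achieved by hash-chaining: each entry commits to its predecessor, so any retroactive edit breaks every later link. A minimal sketch of the standard construction (not necessarily the project's actual mechanism):

```python
import hashlib
import json

def _digest(prev_hash: str, entry: dict) -> str:
    payload = prev_hash + json.dumps(entry, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

class TamperEvidentLog:
    def __init__(self):
        self.entries = []        # (entry, chained hash) pairs

    def append(self, entry: dict) -> None:
        prev = self.entries[-1][1] if self.entries else "genesis"
        self.entries.append((entry, _digest(prev, entry)))

    def verify(self) -> bool:
        """Recompute the chain; an edited entry breaks all later hashes."""
        prev = "genesis"
        for entry, h in self.entries:
            if _digest(prev, entry) != h:
                return False
            prev = h
        return True

log = TamperEvidentLog()
log.append({"who": "analyst1", "query": "statistics"})
log.append({"who": "analyst2", "query": "categories"})
print(log.verify())                       # True
log.entries[0][0]["who"] = "nobody"       # retroactive tampering
print(log.verify())                       # False
```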

10 Example technology: encrypted search
Work by Song (CMU), Perrig (CMU), Wagner (Berkeley).
- Queries are sent encrypted.
- Queries are processed, but not decrypted, by the repository.
- The repository prepares a response but does not know what the search was, or whether it was successful.
Setting: limited trust between parties. The analyst sends encrypted queries to a repository of private data and receives an encrypted response. A toy sketch of the idea follows.
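The Song-Wagner-Perrig scheme itself is more involved; the core idea can be gestured at with keyed search tokens: the repository stores keyed digests of words, and the analyst's token lets it test for a match without learning the word. A deliberately simplified sketch (this is not their actual construction):

```python
import hashlib
import hmac

KEY = b"analyst-secret-key"   # held by the analyst, not the repository

def keyword_tag(word: str) -> str:
    """Keyed digest stored in place of the plaintext keyword."""
    return hmac.new(KEY, word.encode(), hashlib.sha256).hexdigest()

# Analyst-side: documents are indexed under keyed tags before upload.
repository = {
    "doc1": {keyword_tag("wire"), keyword_tag("transfer")},
    "doc2": {keyword_tag("invoice")},
}

def search(token: str) -> list[str]:
    """Repository-side: matches the opaque token against stored tags
    without learning which word the token represents."""
    return [doc for doc, tags in repository.items() if token in tags]

# The analyst derives a token for the word of interest and sends only that.
print(search(keyword_tag("wire")))     # ['doc1']
print(search(keyword_tag("fraud")))    # []
```

Note that real searchable encryption also hides repeated-query and stored-tag patterns, which this deterministic sketch does not.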

11 Labeling derived data
Example: derived restrictions. When inputs labeled [R1], [R2], and [R3] are combined, which restrictions [?] should the derived output carry?
Conservative approach: the output inherits all restrictions of its inputs
- Often too restrictive
- Sometimes too liberal
This is a hard problem. We seek a semi-automated solution that minimizes human overhead, building on recent work on program semantics. A label-propagation sketch follows.
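The conservative rule is essentially taint propagation: a derived value's label is the union of its inputs' labels. A minimal sketch (the types, values, and restriction names are invented for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class Labeled:
    value: object
    restrictions: frozenset = field(default_factory=frozenset)

def derive(fn, *inputs: Labeled) -> Labeled:
    """Conservative rule: the output inherits the union of all
    input restrictions (may over- or under-restrict in practice)."""
    combined = frozenset().union(*(i.restrictions for i in inputs))
    return Labeled(fn(*(i.value for i in inputs)), combined)

a = Labeled(120_000, frozenset({"R1"}))      # e.g. a salary record
b = Labeled(3, frozenset({"R2", "R3"}))      # e.g. a household size

avg = derive(lambda s, n: s / n, a, b)
print(avg.value)          # 40000.0
print(avg.restrictions)   # frozenset({'R1', 'R2', 'R3'})
```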

12 Privacy rules
Need a language for expressing rules.
Related technology: Digital Rights Management.
Translate English → an agent-based language.
Rules differ based on the data:
- Type of data (third-party vs. self-generated, video vs. textual)
- Contents of data
Need tools for compliance checking, both automated and human-in-the-loop. A sketch of machine-checkable rules follows.
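Once English policies are translated into a machine-readable form, compliance checks can be automated, with unmatched cases routed to a human reviewer. A minimal sketch of such a rule representation (the schema and the policies themselves are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Rule:
    data_type: str       # e.g. "video", "textual"
    origin: str          # "third-party" or "self-generated"
    allowed_uses: set

RULES = [
    Rule("textual", "self-generated", {"analysis", "audit"}),
    Rule("video", "third-party", {"audit"}),
]

def check_compliance(data_type: str, origin: str, use: str) -> str:
    """Automated check; requests no rule covers go to a human."""
    for rule in RULES:
        if rule.data_type == data_type and rule.origin == origin:
            return "allow" if use in rule.allowed_uses else "deny"
    return "escalate-to-human"   # keeps a human in the loop

print(check_compliance("textual", "self-generated", "analysis"))  # allow
print(check_compliance("video", "third-party", "analysis"))       # deny
print(check_compliance("audio", "third-party", "analysis"))       # escalate-to-human
```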
