1 Introduction to Biometrics Dr. Bhavani Thuraisingham The University of Texas at Dallas Lecture #19 Biometrics and Privacy - I October 31, 2005

2 Outline
- Overview of Privacy
- Biometrics and Privacy

3 Some Privacy Concerns
- Medical and Healthcare
  - Employers, marketers, or others knowing of private medical concerns
- Security
  - Allowing access to an individual's travel and spending data
  - Allowing access to web surfing behavior
- Marketing, Sales, and Finance
  - Allowing access to an individual's purchases
- Biometrics
  - Biometric technologies used to violate privacy

4 Data Mining as a Threat to Privacy
- Data mining yields "facts" that are not obvious to human analysts of the data
- Can general trends across individuals be determined without revealing information about individuals?
- Possible threats
  - Combine collections of data and infer information that is private
  - Disease information from prescription data
  - Military action from pizza deliveries to the Pentagon
- Need to protect the sensitive or private associations and correlations between the data
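The inference threat above can be made concrete with a tiny sketch (all names and records are invented for illustration): joining two individually innocuous collections reveals a sensitive fact about an individual, mirroring the prescription-data example.

```python
# Hypothetical datasets: neither alone reveals the patient's condition,
# but joining them does -- the "combine collections and infer" threat.
prescriptions = {"patient_17": "drug_X"}   # pharmacy records
drug_treats = {"drug_X": "condition_Y"}    # public drug reference

def infer_condition(patient_id):
    """Join the two collections to deduce private medical information."""
    drug = prescriptions.get(patient_id)
    return drug_treats.get(drug) if drug else None

print(infer_condition("patient_17"))  # condition_Y
```

This is why the slide argues that the associations between data items, not just the items themselves, need protection.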

5 Some Privacy Problems and Potential Solutions
- Problem: privacy violations that result from data mining
  - Potential solution: privacy-preserving data mining
- Problem: privacy violations that result from inference
  - Inference is the process of deducing sensitive information from the legitimate responses received to user queries
  - Potential solution: privacy constraint processing
- Problem: privacy violations due to unencrypted data
  - Potential solution: encryption at different levels
- Problem: privacy violations due to poor system design
  - Potential solution: develop a methodology for designing privacy-enhanced systems
- Problem: privacy violations due to biometric systems
  - Potential solution: privacy-sympathetic biometrics
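Privacy constraint processing can be sketched as follows (a minimal illustration; the constraint set and attribute names are hypothetical): the query processor checks each response against declared constraints and withholds an attribute whenever a sensitive combination would otherwise be released together.

```python
# Hypothetical privacy constraint: these attributes must never be
# released together, because their association is what is sensitive.
PRIVACY_CONSTRAINTS = [{"name", "disease"}]

def answer_query(record, requested):
    """Filter a query response so no privacy constraint is violated."""
    requested = set(requested)
    for constraint in PRIVACY_CONSTRAINTS:
        if constraint <= requested:
            # Drop one attribute (here, the identifying one, chosen
            # deterministically) to break the sensitive association.
            requested -= {max(constraint)}
    return {k: record[k] for k in requested if k in record}

record = {"name": "John", "disease": "influenza", "city": "Dallas"}
print(answer_query(record, ["name", "disease", "city"]))
```

The released record keeps the medical value but not its link to the identity, which is the point of protecting associations rather than individual fields.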

6 Privacy-Preserving Data Mining
- Prevent useful results from mining
  - Introduce "cover stories" to give "false" results
  - Make only a sample of the data available, so that an adversary is unable to derive useful rules and predictive functions
- Randomization
  - Introduce random values into the data and/or results
  - The challenge is to introduce random values without significantly affecting the data mining results
  - Give a range of values for results instead of exact values
- Secure multi-party computation
  - Each party knows only its own inputs; encryption techniques are used to compute the final results
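The randomization idea can be sketched in a few lines (the salary figures and noise range are invented for illustration): each individual value is perturbed with zero-mean noise, obscuring single records while keeping aggregates close to the truth.

```python
# Sketch of randomization-based privacy: perturb individual values
# with bounded zero-mean noise so single records are hidden, while
# aggregate statistics (here, the mean) remain approximately correct.
import random

random.seed(0)  # deterministic for the example
true_salaries = [50_000, 60_000, 55_000, 70_000, 65_000]
noisy = [s + random.uniform(-5_000, 5_000) for s in true_salaries]

true_mean = sum(true_salaries) / len(true_salaries)
noisy_mean = sum(noisy) / len(noisy)
print(round(true_mean), round(noisy_mean))
```

Individual noisy values no longer match the originals, but the mean stays within the noise bound, which is exactly the trade-off the slide describes.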

7 Privacy Controller (architecture diagram)
- User Interface Manager
- Constraint Manager, with the privacy constraints
- Query Processor: enforces constraints during query and release operations
- Update Processor: enforces constraints during update operations
- Database Design Tool: enforces constraints during database design
- DBMS and Database

8 Semantic Model for Privacy Control (diagram)
- Entity "Patient John" is linked by "has disease" to Cancer and Influenza, by "travels frequently" to England, and by "address" to John's address
- Dark lines/boxes contain private information

9 Platform for Privacy Preferences (P3P): What Is It?
- P3P is an emerging industry standard that enables web sites to express their privacy practices in a standard format
- Policies in this format can be automatically retrieved and understood by user agents
- It is a product of the W3C, the World Wide Web Consortium (www.w3c.org)
- Main difference between privacy and security
  - The user is informed of the privacy policies
  - The user is not informed of the security policies

10 Platform for Privacy Preferences (P3P): Key Points
- When a user enters a web site, the privacy policies of the web site are conveyed to the user
- If the privacy policies differ from the user's preferences, the user is notified
- The user can then decide how to proceed
- The user/client maintains the privacy controller
  - That is, the privacy controller determines whether an untrusted web site could give out public information to a third party from which that third party could infer private information
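The user-agent check described above can be sketched as a simple comparison (the policy vocabulary here is invented, not actual P3P element names): the agent matches each declared data practice against the user's stated preference and flags any conflicts so the user can be notified.

```python
# Hypothetical P3P-style check: does the site's declared handling of
# each data item match what the user is willing to accept?
site_policy = {"name": "not_shared", "purchases": "shared_with_third_party"}
user_prefs = {"name": "not_shared", "purchases": "not_shared"}

def policy_conflicts(policy, prefs):
    """Return data items whose declared practice differs from the
    user's preference, so the user agent can notify the user."""
    return [item for item, practice in policy.items()
            if prefs.get(item) not in (None, practice)]

print(policy_conflicts(site_policy, user_prefs))  # ['purchases']
```

This mirrors the catalog-shopping example in the deck: the name is handled acceptably, but the sharing of purchases triggers a notification.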

11 Platform for Privacy Preferences (P3P): Organizations
- Several major corporations are working on P3P standards, including:
  - Microsoft
  - IBM
  - HP
  - NEC
  - Nokia
  - NCR
- Web sites have also implemented P3P
- The Semantic Web group has adopted P3P

12 Platform for Privacy Preferences (P3P): Specifications
- The initial version of P3P used RDF to specify policies
- Recent versions have migrated to XML
- P3P policies use XML with namespaces for encoding policies
- Example: catalog shopping
  - Your name will not be given to a third party, but your purchases will be given to a third party
  - http://www.w3.org/2002/01/P3Pv1
  - <POLICY name = - - - -

13 Platform for Privacy Preferences (P3P): Specifications (Concluded)
- P3P has its own statements and data types expressed in XML
- P3P schemas utilize XML schemas
- XML is a prerequisite to understanding P3P
- The P3P specification released in January 2005 uses the catalog shopping example to explain concepts
- P3P is an international standard and an ongoing project

14 P3P and Legal Issues
- P3P does not replace laws
- P3P works together with the law
- What happens if web sites do not honor their P3P policies?
  - Then appropriate legal actions will have to be taken
- XML is the technology used to specify P3P policies
- Policy experts will have to specify the policies
- Technologists will have to develop the specifications
- Legal experts will have to take action if the policies are violated

15 Challenges and Discussion
- Technology alone is not sufficient for privacy
- We need technologists, policy experts, legal experts, and social scientists to work on privacy
- Some well-known people have said "Forget about privacy"
- Should we pursue working on privacy?
  - Interesting research problems
  - Interdisciplinary research
  - Something is better than nothing
  - Try to prevent privacy violations
  - If violations occur, then prosecute
- Privacy is a major concern for biometrics

16 Biometrics and Privacy
- How are biometrics and privacy related?
- What are the major privacy concerns associated with biometrics usage?
- What types of biometric deployments require stronger protections against privacy invasiveness?
- Which biometric technologies are more susceptible to privacy-invasive usage?
- What types of protections are necessary to ensure that biometrics are not used in a privacy-invasive fashion?

17 Relationship: Biometrics and Privacy
- Biometric technology can be used, without an individual's knowledge or consent, to link personal information from various sources, creating individual profiles
- These profiles may be used for privacy-invasive purposes such as tracking movement
- Biometric systems capable of being used in a privacy-compromising way are called privacy-invasive systems
- Privacy-neutral means that the technology can neither be used to protect information nor to undermine privacy
- Privacy-sympathetic deployments include special designs to ensure that biometric data cannot be used in a privacy-invasive fashion
- Privacy-protective means using biometric authentication to protect other personal information (e.g., bank accounts)

18 HIPAA and Biometrics
- HIPAA (Health Insurance Portability and Accountability Act) refers to biometrics
- Biometric data could serve as a unique identifier and thereby raise privacy concerns, so it must be disassociated from medical information
- Biometrics can be used for authentication and for ensuring security
- HIPAA and P3P relationships
  - Implementing HIPAA rules in P3P

19 Privacy Concerns Associated with Biometric Deployments
- Informational privacy
  - Unauthorized collection, storage, and usage of biometric information
- Personal privacy
  - Discomfort of people when encountering biometric technology
- Privacy-sympathetic qualities of biometric technology
  - E.g., not storing raw data

20 Informational Privacy
- Usage of biometric data is not usually the problem; the potential linkage, aggregation, and misuse of personal information associated with biometric data is the problem
- Unauthorized use of biometric technology
  - Conducting criminal forensic searches on driver's license databases
  - Using biometric data as a unique identifier
  - Whether biometric data is personal information is debated in the industry
- Unauthorized collection of biometric data
  - E.g., surveillance
- Unnecessary collection of biometric data
- Unauthorized disclosure
  - Sharing biometric data

21 Personal Privacy
- Many biometric technologies are offensive to certain individuals, especially when they are first introduced
  - Smartcards, surveillance
- Unlike informational privacy, technology in general cannot help with personal privacy
- Psychologists and social scientists are needed to work with individuals to ensure comfort
- Legal procedures should also be in place in case privacy is violated, so that individuals are comfortable with the technology
- "Please excuse us for intruding on your privacy"

22 Privacy-Sympathetic Qualities of Biometric Systems
- Most biometric systems (except forensic systems) do not store raw data such as fingerprints or images
- Biometric data is stored in templates; templates consist of numbers, and the original biometric data cannot be reconstructed from them
- The idea of a universal biometric identifier does not work, as different applications require different biometric technologies
- Distinct enrollments, such as different samples per application, also enhance privacy
- Non-interoperable biometric technologies also help with privacy; however, without standards it is difficult for different systems to interact
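The template idea can be sketched as follows (the feature extraction here is a toy stand-in, not a real biometric algorithm): the raw sample is reduced to a small vector of numbers by a lossy, one-way transformation, and matching compares templates by distance rather than reconstructing the original biometric.

```python
# Toy sketch of template-based storage: only a numeric template derived
# from the raw sample is kept; the reduction is lossy, so the raw
# biometric cannot be recovered from the stored numbers.
def extract_template(sample: bytes, size: int = 8):
    """Reduce a raw sample to a small vector of numbers (lossy, one-way)."""
    return [sum(sample[i::size]) % 256 for i in range(size)]

def match(t1, t2, threshold=10):
    """Compare two templates by total absolute distance."""
    distance = sum(abs(a - b) for a, b in zip(t1, t2))
    return distance <= threshold

enrolled = extract_template(b"raw fingerprint scan data")
probe = extract_template(b"raw fingerprint scan data")
print(match(enrolled, probe))  # True for identical samples
```

Real systems use far richer feature extraction, but the privacy-sympathetic property is the same: the database holds only numbers from which the fingerprint or face cannot be rebuilt.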

23 Application-Specific Privacy Risks
- Each deployment should address privacy concerns; the risk also depends on the technology used and how it is used, the steps taken, and the consequences of privacy violations
- The BioPrivacy framework was developed in 2001 to help deployers come up with risk ratings for their deployments
- Risk ratings depend on several factors, such as verification vs. identification

24 BioPrivacy Framework
- Overt vs. covert
  - Users being aware that biometric data is being collected carries less risk
- Opt-in vs. mandatory
  - Mandatory enrollment, such as in a public-sector program, carries higher risk
- Verification vs. identification
  - Searching a database to match a biometric (i.e., identification) carries higher risk, as an individual's biometric data may be collected
- Fixed duration vs. indefinite duration
  - Indefinite retention has a negative impact (higher risk)
- Public sector vs. private sector
  - Public-sector deployments are riskier

25 BioPrivacy Framework (Concluded)
- User role
  - Citizen, employee, traveler, student, customer, individual
  - E.g., a citizen may face more penalties for noncompliance
- User ownership vs. institutional ownership
  - A user maintaining ownership of his/her biometric data is less risky
- Personal storage vs. storage in a template database
  - Is the data stored in a central database or on the user's PC? A central database is riskier
- Behavioral vs. physiological
  - Physiological biometrics may be compromised more
- Template storage vs. identifiable storage
  - Template storage is less risky

26 Risk Ratings
- For each biometric technology, rate the risk with respect to the BioPrivacy framework
- Example: the overt/covert risk is
  - Moderate for finger scan
  - High for face scan
  - Low for iris scan
  - Low for retina scan
  - High for voice scan
  - Low for signature scan
  - Moderate for keystroke scan
  - Low for hand scan
- Based on the individual risk ratings, compute an overall risk rating: for example, High for face scan, Moderate for iris scan, and Low for hand scan
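One way to combine per-factor ratings into an overall rating can be sketched as follows (the numeric scale, thresholds, and equal weighting are illustrative assumptions, not taken from the BioPrivacy framework itself):

```python
# Illustrative aggregation of per-factor risk ratings into an overall
# level; the Low/Moderate/High scale values and cutoffs are assumptions.
LEVELS = {"Low": 1, "Moderate": 2, "High": 3}

def overall_risk(factor_ratings):
    """Average the factor ratings and map back to a risk level."""
    avg = sum(LEVELS[r] for r in factor_ratings) / len(factor_ratings)
    if avg < 1.5:
        return "Low"
    if avg < 2.5:
        return "Moderate"
    return "High"

# e.g., a face-scan deployment rated High on most factors
print(overall_risk(["High", "High", "Moderate", "High"]))  # High
```

A real assessment might weight the factors differently (e.g., covert collection counting more than storage location), but the structure is the same: score each BioPrivacy factor, then aggregate.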

27 Biometrics for Private Data Sharing? (diagram)
- A federation architecture in which each agency (A, B, and C) has its own data/policy component plus an export data/policy component, feeding a shared data/policy for the federation
