Introduction to Biometrics Dr. Bhavani Thuraisingham The University of Texas at Dallas Lecture #19 Biometrics and Privacy - I October 31, 2005

Outline
- Overview of Privacy
- Biometrics and Privacy

Some Privacy Concerns
- Medical and Healthcare
  - Employers, marketers, or others knowing of private medical concerns
- Security
  - Allowing access to an individual's travel and spending data
  - Allowing access to web surfing behavior
- Marketing, Sales, and Finance
  - Allowing access to an individual's purchases
- Biometrics
  - Biometric technologies used to violate privacy

Data Mining as a Threat to Privacy
- Data mining gives us "facts" that are not obvious to human analysts of the data
- Can general trends across individuals be determined without revealing information about individuals?
- Possible threats
  - Combine collections of data and infer information that is private (illustrated in the sketch below)
    - Disease information from prescription data
    - Military action inferred from pizza deliveries to the Pentagon
- Need to protect the associations and correlations between the data that are sensitive or private
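As a concrete illustration of the inference threat above, here is a minimal Python sketch with entirely hypothetical data and record layouts: joining an innocuous prescription table with a public drug-to-condition reference reveals disease information about named individuals, even though neither dataset exposes it on its own.

```python
# Hypothetical data: linking a prescription dataset with a public
# drug-to-condition table reveals a sensitive attribute (disease).
prescriptions = [
    {"patient": "alice", "drug": "metformin"},
    {"patient": "bob", "drug": "tamoxifen"},
]

# Publicly available reference data
drug_to_condition = {
    "metformin": "diabetes",
    "tamoxifen": "breast cancer",
}

# The join infers private information from two individually harmless sources
inferred = {p["patient"]: drug_to_condition.get(p["drug"], "unknown")
            for p in prescriptions}
print(inferred)  # {'alice': 'diabetes', 'bob': 'breast cancer'}
```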

Some Privacy Problems and Potential Solutions
- Problem: Privacy violations that result from data mining
  - Potential solution: Privacy-preserving data mining
- Problem: Privacy violations that result from inference
  - Inference is the process of deducing sensitive information from the legitimate responses received to user queries
  - Potential solution: Privacy constraint processing
- Problem: Privacy violations due to unencrypted data
  - Potential solution: Encryption at different levels
- Problem: Privacy violations due to poor system design
  - Potential solution: Develop a methodology for designing privacy-enhanced systems
- Problem: Privacy violations due to biometric systems
  - Potential solution: Privacy-sympathetic biometrics

Privacy-Preserving Data Mining
- Prevent useful results from mining
  - Introduce "cover stories" to give "false" results
  - Only make a sample of the data available so that an adversary is unable to come up with useful rules and predictive functions
- Randomization (see the sketch below)
  - Introduce random values into the data and/or results
  - The challenge is to introduce random values without significantly affecting the data mining results
  - Give a range of values for results instead of exact values
- Secure multi-party computation
  - Each party knows only its own inputs; encryption techniques are used to compute the final results
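A minimal sketch of the randomization idea, assuming hypothetical salary data and a uniform zero-mean noise model: each individual record is perturbed, but aggregate statistics such as the mean remain approximately correct for mining.

```python
import random

# Randomization sketch: add zero-mean noise to each value so individual
# records are perturbed while aggregates stay approximately intact.
# The data and the noise spread below are hypothetical.
salaries = [52_000, 61_500, 48_200, 75_000, 69_300]

def randomize(values, spread=10_000):
    """Perturb each value with uniform zero-mean noise of the given spread."""
    return [v + random.uniform(-spread, spread) for v in values]

perturbed = randomize(salaries)
true_mean = sum(salaries) / len(salaries)
noisy_mean = sum(perturbed) / len(perturbed)
print(f"true mean: {true_mean:.0f}, mean of randomized data: {noisy_mean:.0f}")
```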

Privacy Controller
[Architecture diagram with the following components:]
- User Interface Manager
- Constraint Manager with Privacy Constraints
- Query Processor: enforces constraints during query and release operations (see the sketch below)
- Update Processor: enforces constraints during update operations
- Database Design Tool: enforces constraints during database design
- DBMS and Database
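To make the query-processor role concrete, here is a minimal sketch assuming a hypothetical table schema and a simple attribute-suppression constraint; a real privacy controller would enforce richer constraints during query, update, and design operations, as the architecture above indicates.

```python
# Hypothetical schema and constraints: a query processor that applies
# privacy constraints during release, suppressing protected attributes
# before results are returned to the user.
PRIVACY_CONSTRAINTS = {
    "patients": {"disease", "ssn"},   # attributes that must not be released
}

def process_query(table, rows, requested_attrs):
    """Return only the requested attributes that the constraints allow."""
    blocked = PRIVACY_CONSTRAINTS.get(table, set())
    allowed = [a for a in requested_attrs if a not in blocked]
    return [{a: row[a] for a in allowed} for row in rows]

rows = [{"name": "John", "disease": "influenza", "ssn": "123-45-6789"}]
print(process_query("patients", rows, ["name", "disease"]))  # disease withheld
```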

Semantic Model for Privacy Control
[Diagram: a semantic net for patient John, with links such as "has disease" (Cancer, Influenza), "travels frequently" (England), and "address" (John's address); dark lines/boxes contain private information]

Platform for Privacy Preferences (P3P): What Is It?
- P3P is an emerging industry standard that enables web sites to express their privacy practices in a standard format
- The policies can be automatically retrieved and understood by user agents
- It is a product of the W3C (World Wide Web Consortium)
- Main difference between privacy and security
  - The user is informed of the privacy policies
  - The user is not informed of the security policies

Platform for Privacy Preferences (P3P): Key Points
- When a user enters a web site, the privacy policies of the web site are conveyed to the user
- If the privacy policies differ from the user's preferences, the user is notified (see the sketch below)
- The user can then decide how to proceed
- The user/client maintains the privacy controller
  - That is, the privacy controller determines whether an untrusted web site could give out public information to a third party such that the third party infers private information
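A minimal sketch of this key-point behavior, using a simplified dictionary representation rather than the actual P3P vocabulary: the user agent retrieves the site's declared practices, compares them with the user's preferences, and notifies the user of any mismatch.

```python
# Hypothetical policy representation (not the P3P vocabulary): a user agent
# compares a site's declared data practices against the user's preferences
# and flags conflicts so the user can decide how to proceed.
user_preferences = {
    "share_purchases_with_third_parties": False,
    "share_name_with_third_parties": False,
}

site_policy = {
    "share_purchases_with_third_parties": True,   # declared by the site
    "share_name_with_third_parties": False,
}

def check_policy(policy, preferences):
    """Return the practices where the site's policy conflicts with preferences."""
    return [k for k, allowed in preferences.items()
            if not allowed and policy.get(k, False)]

conflicts = check_policy(site_policy, user_preferences)
if conflicts:
    print("Notify user; policy differs from preferences:", conflicts)
```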

Platform for Privacy Preferences (P3P): Organizations
- Several major corporations are working on P3P standards, including:
  - Microsoft
  - IBM
  - HP
  - NEC
  - Nokia
  - NCR
- Web sites have also implemented P3P
- The Semantic Web group has adopted P3P

Platform for Privacy Preferences (P3P): Specifications
- The initial version of P3P used RDF to specify policies
- Recent versions have migrated to XML
- P3P policies use XML with namespaces for encoding policies
- Example: catalog shopping (see the sketch below)
  - Your name will not be given to a third party, but your purchases will be given to a third party
  - <POLICY name =
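The following sketch is illustrative only: the element names and namespace are simplified stand-ins rather than the normative P3P vocabulary, but it shows the general shape of encoding the catalog-shopping example as XML with namespaces using Python's standard library.

```python
import xml.etree.ElementTree as ET

# Illustrative only: element and attribute names below are simplified
# stand-ins, not the normative P3P vocabulary. Purchases (but not the
# customer's name) may go to a third party.
NS = "http://example.org/privacy-policy"   # hypothetical namespace
ET.register_namespace("", NS)

policy = ET.Element(f"{{{NS}}}POLICY", {"name": "catalog-shopping"})
for data, recipient in [("user.name", "ours"), ("user.purchases", "third-party")]:
    stmt = ET.SubElement(policy, f"{{{NS}}}STATEMENT")
    ET.SubElement(stmt, f"{{{NS}}}DATA").set("ref", data)
    ET.SubElement(stmt, f"{{{NS}}}RECIPIENT").text = recipient

print(ET.tostring(policy, encoding="unicode"))
```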

Platform for Privacy Preferences (P3P): Specifications (Concluded)
- P3P has its own statements and data types expressed in XML
- P3P schemas utilize XML schemas
- XML is a prerequisite to understanding P3P
- The P3P specification released in January 2005 uses the catalog-shopping example to explain concepts
- P3P is an international standard and an ongoing project

P3P and Legal Issues
- P3P does not replace laws
- P3P works together with the law
- What happens if web sites do not honor their P3P policies?
  - Then appropriate legal actions will have to be taken
- XML is the technology used to specify P3P policies
- Policy experts will have to specify the policies
- Technologists will have to develop the specifications
- Legal experts will have to take action if the policies are violated

Challenges and Discussion
- Technology alone is not sufficient for privacy
- We need technologists, policy experts, legal experts, and social scientists to work on privacy
- Some well-known people have said "forget about privacy"
- Should we pursue working on privacy?
  - Interesting research problems
  - Interdisciplinary research
  - Something is better than nothing
  - Try to prevent privacy violations
  - If violations occur, then prosecute
- Privacy is a major concern for biometrics

Biometrics and Privacy
- How are biometrics and privacy related?
- What are the major privacy concerns associated with biometrics usage?
- What types of biometric deployments require stronger protections against privacy invasiveness?
- Which biometric technologies are more susceptible to privacy-invasive usage?
- What types of protections are necessary to ensure that biometrics are not used in a privacy-invasive fashion?

Relationship: Biometrics and Privacy
- Biometric technology can be used without an individual's knowledge or consent to link personal information from various sources, creating individual profiles
- These profiles may be used for privacy-invasive purposes such as tracking movement
- Biometric systems capable of being used in a privacy-compromising way are called privacy-invasive systems
- Privacy-neutral means that the technology can neither be used to protect information nor to undermine privacy
- Privacy-sympathetic deployments include special designs to ensure that biometric data cannot be used in a privacy-invasive fashion
- Privacy protection is about using biometric authentication to protect other personal information (e.g., bank accounts)

HIPAA and Biometrics
- HIPAA (Health Insurance Portability and Accountability Act) refers to biometrics
- Biometrics could be a potential identifier and, as a result, cause privacy concerns; biometric data must be disassociated from medical information
- Biometrics can be used for authentication and for ensuring security
- HIPAA and P3P relationships
  - Implementing HIPAA rules in P3P

Privacy Concerns Associated with Biometric Deployments
- Informational privacy
  - Unauthorized collection, storage, and usage of biometric information
- Personal privacy
  - Discomfort of people when encountering biometric technology
- Privacy-sympathetic qualities of biometric technology
  - E.g., not storing raw data

Informational Privacy
- Usage of biometric data is not usually the problem; the potential linkage, aggregation, and misuse of personal information associated with biometric data is the problem
- Unauthorized use of biometric technology
  - Conducting criminal forensic searches on driver's license databases
  - Using biometric data as a unique identifier
  - Is biometric data personal information? A debate in the industry
- Unauthorized collection of biometric data
  - E.g., surveillance
- Unnecessary collection of biometric data
- Unauthorized disclosure
  - Sharing biometric data

Personal Privacy
- Many biometric technologies are offensive to certain individuals, especially when they are first introduced
  - Smart cards, surveillance
- Unlike informational privacy, technology in general cannot help with personal privacy
- Need psychologists and social scientists to work with individuals to ensure comfort
- Legal procedures should also be in place in case privacy is violated, so that individuals are comfortable with the technology
- "Please excuse us for intruding on your privacy"

Privacy-Sympathetic Qualities of Biometric Systems
- Most biometric systems (except forensic systems) do not store raw data such as fingerprints or images
- Biometric data is stored in templates; templates consist of numbers, and the original biometric data cannot be reconstructed from them (see the sketch below)
- The idea of a universal biometric identifier does not work, as different applications require different biometric technologies
- Different enrollments, such as different samples, also enhance privacy
- Non-interoperable biometric technologies also help with privacy; however, it is difficult for different systems to interact without standards
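A minimal sketch of template-based storage and matching, with an entirely hypothetical feature extractor and threshold: only a small numeric vector is retained, and verification compares vectors rather than raw images.

```python
import math

# Hypothetical features and threshold: template-based storage keeps only a
# small numeric feature vector, never the raw fingerprint image, and matching
# compares vectors against a distance threshold.
def extract_template(raw_sample):
    """Stand-in for a real feature extractor; returns a few numbers."""
    return [round(sum(raw_sample[i::3]) % 7, 2) for i in range(3)]

def matches(template_a, template_b, threshold=1.0):
    distance = math.dist(template_a, template_b)
    return distance <= threshold

enrolled = extract_template([0.2, 0.9, 0.4, 0.7, 0.1, 0.5])   # stored template
probe    = extract_template([0.2, 0.9, 0.4, 0.7, 0.1, 0.5])   # new capture
print(matches(enrolled, probe))  # True; the raw image is never stored
```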

Application-Specific Privacy Risks
- Each deployment should address privacy concerns; the risk also depends on the technology used and how it is used, what steps are taken, and what the consequences of privacy violations are
- The BioPrivacy framework was developed in 2001 to help deployers come up with risk ratings for their deployments
- Risk ratings depend on several factors, such as verification vs. identification

BioPrivacy Framework
- Overt vs. covert
  - Users being aware that biometric data is being collected carries less risk
- Opt-in vs. mandatory
  - Mandatory enrollment, such as a public-sector program, carries higher risk
- Verification vs. identification
  - Searching a database to match a biometric (i.e., identification) carries higher risk, as an individual's biometric data may be collected
- Fixed duration vs. indefinite duration
  - Fixed-duration deployments carry lower risk than indefinite ones
- Public sector vs. private sector
  - Public-sector deployments are more risky

BioPrivacy Framework (Concluded)
- User role
  - Citizen, employee, traveler, student, customer, individual
  - E.g., a citizen may face more penalties for noncompliance
- User ownership vs. institutional ownership
  - The user maintaining ownership of his/her biometric data is less risky
- Personal storage vs. storage in a template database
  - Is the data stored in a central database or on the user's PC? A central database is more risky
- Behavioral vs. physiological biometrics
  - Physiological biometrics may be compromised more
- Template storage vs. identifiable storage
  - Template storage is less risky

Risk Ratings
- For each biometric technology, rate the risk with respect to the BioPrivacy framework
- Example: the overt/covert risk is
  - Moderate for finger scan
  - High for face scan
  - Low for iris scan
  - Low for retina scan
  - High for voice scan
  - Low for signature scan
  - Moderate for keystroke scan
  - Low for hand scan
- Based on the individual risk ratings, compute an overall risk rating: for example, high for face scan, moderate for iris scan, and low for hand scan (see the sketch below)
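One simple way to combine per-factor ratings into an overall rating is sketched below; the numeric scale, the example factor ratings, and the averaging rule are illustrative assumptions, not part of the published BioPrivacy framework.

```python
# Hypothetical ratings and weighting: combine per-factor BioPrivacy risk
# ratings into an overall rating for a technology by averaging a simple
# Low/Moderate/High scale.
LEVELS = {"Low": 1, "Moderate": 2, "High": 3}
NAMES = {1: "Low", 2: "Moderate", 3: "High"}

def overall_rating(factor_ratings):
    """Average the per-factor ratings and map back to Low/Moderate/High."""
    score = sum(LEVELS[r] for r in factor_ratings.values()) / len(factor_ratings)
    return NAMES[round(score)]

face_scan = {
    "overt_vs_covert": "High",
    "verification_vs_identification": "High",
    "template_vs_identifiable_storage": "Moderate",
}
print(overall_rating(face_scan))  # High
```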

Biometrics for Private Data Sharing?
[Diagram: a federation architecture in which Agencies A, B, and C each maintain component data/policies and export data/policies to a combined data/policy for the federation]