Interviewing and Deception Detection Techniques for Rapid Screening and Credibility Assessment
Dr. Jay F. Nunamaker, Jr., and Dr. Judee K. Burgoon

Agenda
- Introduction
- Project Phases
- Project Plan
  - Year One: Unique Datasets
  - Year One: Sensors and Tools
  - Year One: Tasks
    1. Analysis of Psycho-Physiological Datasets
    2. Lexical Analysis
    3. Interoperable Video Database
    4. Collaborative Credibility Assessment Tools
- Conclusion

Introduction
- Identify verbal/non-verbal behaviors and physiological cues that indicate deception and hostile intentions in rapid screening environments
- Conduct experimental research to evaluate and develop automated deception detection technology
- Develop questioning and information elicitation strategies for border screeners

Project Phases
1. Experimentation and analysis of credibility assessment tools
   - Test new deception detection technologies that incorporate verbal/non-verbal behavior and physiological measures
   - Prototype automated systems and enabling technologies for detecting deception
   - Replicate knowledge learned in experiments in the field
2. Analysis of interviewing techniques in screening scenarios
   - Techniques for information elicitation
   - Behavioral analysis and questioning strategies
3. Screening- and border-specific analysis of detection methods
   - Unobtrusive methods for behavior monitoring and deception detection
   - Interview and screening techniques

Project Plan
Phase schedule, by year:
1. Experimentation and analysis of credibility assessment tools
2. Analysis of interviewing techniques in screening scenarios
3. Screening- and border-specific analysis of detection methods

Project Plan – Year 1: Unique Datasets
Datasets for original analysis:
- Cultural Benchmarks
  - 220 international participants
  - Professionally interviewed (25 questions)
  - Lie or truth instructions
- Mock Crime
  - 134 participants
  - Realistic mock-theft scenario
- New proposed experiments

Project Plan – Year 1: Mock Crime Experiment Example
- Stage 1: Subject arrives at a separate building
- Stage 2: Subject receives instructions by recording
- Stage 3: Subject arrives at the secretary's office to steal a ring
- Stage 4: Subject completes a credibility interview about involvement in the theft

Project Plan – Year 1: Sensors and Tools
- LDV (laser Doppler vibrometer)
- Thermal
- Blink
- Eye tracking
- Pupilometry
- Kinesic
Equipment supplied by:

Project Plan – Year 1, Task 1: Analysis of Psycho-Physiological Datasets
Phase 1: LDV data analysis (Year 1)
- Unintentionally leaked psycho-physiological cues may be indicative of deceptive behavior
- The "Cultural Benchmarks" experiment captured pulse and respiration data via LDV
- Determine whether the LDV can provide cues that are indicative of deception
Phase 2: Data from multiple sensors (Year 2)
- In the Cultural Benchmarks experiment, sensor data was analyzed individually, not collectively
- Pulse, respiration, kinesics (blob, ASM, gestures, blinking, pose), and pupilometry
Phase 3: Data-fusion techniques (Year 2)
- How do we fuse data from distinct sources?
- Do fusion techniques provide greater accuracy in detecting deception?
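The Phase 3 fusion question can be pictured with a minimal feature-level fusion sketch: each sensor channel is standardized, then the channels are concatenated into one vector that a single classifier could score. This is only one of several fusion strategies the project might evaluate; the channel names and readings below are illustrative assumptions, not project data.

```python
# Minimal sketch of feature-level data fusion: standardize each
# sensor channel, then concatenate into one combined feature vector.
# All values here are invented for illustration.

from statistics import mean, pstdev

def zscore(values):
    """Standardize one channel so sensors with different units are comparable."""
    mu, sigma = mean(values), pstdev(values)
    if sigma == 0:
        return [0.0 for _ in values]
    return [(v - mu) / sigma for v in values]

def fuse(sensor_features):
    """Concatenate standardized features from each sensor channel."""
    fused = []
    for name in sorted(sensor_features):  # fixed channel order
        fused.extend(zscore(sensor_features[name]))
    return fused

# Illustrative readings from three channels for one interview segment.
segment = {
    "pulse":       [72.0, 75.0, 81.0],   # beats per minute (LDV)
    "respiration": [14.0, 15.0, 13.0],   # breaths per minute (LDV)
    "pupil":       [3.1, 3.4, 3.9],      # pupil diameter, mm
}

fused = fuse(segment)
print(len(fused))  # 9 features: 3 channels x 3 readings each
```

Whether such early (feature-level) fusion beats late (decision-level) fusion for deception cues is exactly the accuracy question Phase 3 poses.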

Milestones and Deliverables
(1) Phase 1 (6 months): Perform analysis of LDV data collected during a previous deception experiment. Determine if any psycho-physiological cues emerge that indicate deceptive behavior. Deliverable: the psycho-physiological cues provided by LDV that are indicative of deception.
(2) Phase 2 (6 months): Gather additional experimental data collected from various sensors during the deception detection experiment. Format and prepare for processing. Deliverable: a formatted data set ready for analysis.
(3) Phase 3 (6 months): Explore methods to fuse data into one robust algorithm. Conduct statistical analysis to determine if the combined sensor output is collectively a greater predictor of deceptive behavior.
*Year one deliverables in green

Project Plan – Year 1, Task 2: Lexical Analysis
Hedges: "words whose job it is to make things more or less fuzzy" (Lakoff, 1972), e.g., perhaps, might, maybe, approximately
- Communicate the speaker's degree of confidence (Hyland, 1998; Coates, 1987)
- Reduce the strength of a statement (Zucker & Zucker, 1986)
- Express tentativeness and probability
- Theoretically linked to the use of deception
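The feature-generation step described here might be sketched as a simple lexical extractor that counts hedges per answer. The hedge set below is a tiny illustrative stand-in for the full hedging dictionaries mentioned in the milestones, and the sample answer is invented.

```python
# Minimal sketch of hedging feature extraction for one transcribed answer.
# HEDGES is a toy stand-in for the project's hedging dictionaries.

import re

HEDGES = {"perhaps", "might", "maybe", "approximately", "possibly", "somewhat"}

def hedge_features(transcript):
    """Return hedge count and hedge rate per 100 words for one answer."""
    words = re.findall(r"[a-z']+", transcript.lower())
    hits = sum(1 for w in words if w in HEDGES)
    rate = 100.0 * hits / len(words) if words else 0.0
    return {"hedges": hits, "rate_per_100": rate}

answer = "I might have walked past the office, maybe around noon."
print(hedge_features(answer))  # {'hedges': 2, 'rate_per_100': 20.0}
```

Per-answer features like these would then feed the classification and statistical analyses planned for Phase 2b.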

Milestones and Deliverables
(1) Phase 1 (1 month): Prepare transcriptions of Cultural Benchmarks interviews for analysis
(2) Phase 2 (1 month): Refine hedging dictionaries and lexical-bundle software
(3) Phase 2b (2 months): Parse text, generate features, run classification algorithms, run statistical analyses
(4) Phase 3 (2 months): Write results and discuss implications for rapid screening; articulate future investigations
*Year one deliverables in green

Project Plan – Year 1, Task 3: Interoperable Video Database
We currently have many large, disparate data sources (e.g., the Mock Crime interviews and the Cultural Benchmarks interviews). Challenges in managing large and diverse data sets include:
- Integrating the datasets efficiently
- Querying integrated data sets intelligently (example: retrieve all data associated with a specific gesture, such as a shrug)
- Capturing tacit information
- Determining the elementary/composite data elements
- Creating semantic interoperability across the datasets
To begin addressing these challenges, we need to create a framework that helps us understand the datasets and manage them holistically.
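One way to picture the "retrieve all data associated with shrugs" query is a single annotation table spanning both experiments, so one query covers every dataset. The schema and rows below are illustrative assumptions, not the project's actual data model.

```python
# Hypothetical sketch of an integrated video-annotation store:
# one table holds gesture annotations from both experiments, so a
# single query spans the datasets. Schema and rows are invented.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE annotations (
        experiment TEXT,   -- e.g. 'mock_crime' or 'cultural_benchmarks'
        video_id   TEXT,
        start_sec  REAL,
        end_sec    REAL,
        gesture    TEXT
    )""")
conn.executemany(
    "INSERT INTO annotations VALUES (?, ?, ?, ?, ?)",
    [
        ("mock_crime",          "mc_017", 12.0, 13.5, "shrug"),
        ("cultural_benchmarks", "cb_104", 44.2, 45.0, "shrug"),
        ("mock_crime",          "mc_017", 20.1, 21.0, "blink"),
    ])

# "Retrieve all data associated with shrugs" across both datasets.
shrugs = conn.execute(
    "SELECT experiment, video_id, start_sec FROM annotations "
    "WHERE gesture = ? ORDER BY experiment", ("shrug",)).fetchall()
print(shrugs)
```

A relational sketch like this handles the explicit annotations; the harder challenges the slide lists (tacit information, semantic interoperability) are what the proposed framework would have to add on top.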

Milestones and Deliverables
(1) Phase 1 (6 months): Initial data framework and recommendations: an initial data framework that integrates the video datasets from the two experiments. This framework will serve as the basis for discussion of future data management directions.
(2) Phase 2 (through Year 6): Prototype implementation: implement a prototype to demonstrate solutions to various data challenges.
*Year one deliverables in green

Project Plan – Year 1, Task 4: Collaborative Credibility Assessment Tools
- Problem: Individuals are not as accurate as machines in credibility assessment
- Proposal: Use group collaboration to discover more cues to deception
- First-phase deliverable: Determine the feasibility of, and requirements for, collaboration tools for credibility assessment; create prototype(s)
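A minimal sketch of what pooling independent judgments could look like, assuming each screener rates a statement's credibility on a [0, 1] scale: the group estimate is the mean, and outlying judgments are flagged for discussion. The screener names, ratings, and the disagreement threshold are all invented for illustration.

```python
# Illustrative sketch of pooling independent credibility judgments.
# Ratings are on [0, 1]; judgments far from the group mean are
# flagged so the group can discuss the cues behind the disagreement.

from statistics import mean

def pool_judgments(ratings, flag_distance=0.3):
    """Combine individual ratings into a group estimate; flag outliers."""
    group = mean(ratings.values())
    flagged = [who for who, r in ratings.items()
               if abs(r - group) > flag_distance]
    return group, flagged

# Hypothetical ratings from three screeners for one statement.
ratings = {"screener_a": 0.2, "screener_b": 0.3, "screener_c": 0.9}
group, flagged = pool_judgments(ratings)
print(round(group, 2), flagged)  # 0.47 ['screener_c']
```

Surfacing disagreements like this is one plausible way group collaboration could expose cues to deception that a single assessor would miss, which is the feasibility question the first-phase deliverable asks.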

Milestones and Deliverables
(1) Phase 1 (3 months): Methodically identify and adapt the most useful collaboration tools
(2) Phase 2 (2 months): Design software
(3) Phase 2b (1 month): Create initial prototype(s)
*Year one deliverables in green

Conclusion
- Identify verbal/non-verbal behaviors and physiological cues that indicate deception and hostile intentions in rapid screening environments
- Conduct experimental research using unique datasets to evaluate and develop automated deception detection technology
- Four proposed tasks with deliverables in Year One