Approaches to Physical Evidence Research In the Forensic Sciences

Approaches to Physical Evidence Research in the Forensic Sciences
Petraco Group, City University of New York, John Jay College

Outline
- Introduction
- Lessons learned from research and training in forensic science
- Recent John Jay alumni careers
- A bit about our research products: statistical tools implemented
- Directions for the future from lessons learned

Raising Standards with Data and Statistics
- DNA profiling is the most successful application of statistics in forensic science. It is responsible for the current interest in "raising standards" in other branches of forensics.
- There are no protocols for the application of statistics to physical evidence.
- Our goal: application of objective, numerical, computational pattern comparison to physical evidence.

Learned Elements of Successful Research in Forensic Science
My group has been involved with research and training in quantitative forensic science for about a decade. There must be a synergy between academics and practitioners!
Practitioners have specialized knowledge of the technical problems in their fields:
- Don't dismiss their opinions for improving practice!
- Successful research products will have been borne out by treating practitioners as equal partners: as co-PIs and as "test subjects" for proposed research products.

Learned Elements of Successful Research in Forensic Science
Practitioners have specialized knowledge of:
- Intricacies of working within the criminal justice system. Academics are largely self-managed; practitioners must deal with attorneys, judges, and court systems.
- Intricacies of implementing new technologies and S.O.P.s. Labs have tight budgets: consider equipment costs for purchase, maintenance, and training, and realistic expectations for their staff's knowledge, training, and capacities. Will users need a PhD to do this?? User interfaces are important!!

Learned Elements of Successful Research in Forensic Science
Successful university research programs will ultimately involve:
- "Well oiled" research teams of academics and practitioners with a proven dedication to research in forensic science.
- Track records of producing research products related to what was proposed to win the grant.

Learned Elements of Successful Research in Forensic Science
Funding sources should consider:
- Significant PhD/D-Fos program start-up money for: curriculum development; teaching time and classroom space; research space; travel for dissemination of research products and training; equipment purchase and maintenance costs.
- There should be close monitoring of a nascent PhD/D-Fos program by the funding source.
- Top and middle administration must be involved and supportive for the long term (10-15 years).

Learned Elements of Successful Research in Forensic Science
Funding sources should consider:
- Accessibility/outreach of the PhD/D-Fos program. Research in forensic science must be disseminated!! Is the nearest airport to campus small and two hours away? Does campus have access to short-term living facilities, for specialists to come and teach and practitioners to come and take non-matriculated classes? What are the dedicated facilities for training on campus?
- Proximity to federal/state/local forensic laboratories and large metropolitan areas: sources of collaboration and external opportunities.

Some of the Recent Career Tracks of John Jay Alumni
Federal, state and local public crime laboratories:
- Armed Forces DNA Identification Laboratory, Dover, DE
- Centre of Forensic Sciences, Toronto, Canada
- Contra Costa County Sheriff's Department Forensic Division, CA
- Federal Bureau of Investigation Laboratory Division, Quantico, VA
- Los Angeles Police Department, Scientific Investigation Division, Los Angeles, CA
- Los Angeles Sheriff's Department Laboratory, CA
- Nassau County Crime Lab, East Meadow, NY
- New Jersey State Police, Office of Forensic Sciences, West Trenton, NJ
- New York City Office of Chief Medical Examiner, NY
- New York City Police Department Laboratory, Jamaica, NY
- Office of the Medical Examiner, San Francisco, CA
- Orange County Sheriff's Department, Santa Ana, CA
- San Diego Police Department Laboratory, San Diego, CA
- Suffolk County Crime Lab, Hauppauge, NY
- United States Drug Enforcement Administration, Laboratory Division, New York, NY
- United States Drug Enforcement Administration, Laboratory Division, San Francisco, CA
- U.S. Postal Inspection Service, Forensic Laboratory, Dulles, VA
- Washington DC Department of Forensic Sciences, DC
Private companies:
- Antech Diagnostics, New Hyde Park, NY
- Bode Technology, Lorton, VA
- Purdue Pharma, Totowa, NJ
- Microtrace, Elgin, IL
- Quest Diagnostics, Madison, NJ
- Smiths Detection, Edgewood, MD
- NMS Labs, Willow Grove, PA
- Core Pharma, Middlesex, NJ
Academic and research institutions:
- Central Police University, Taiwan
- California State University, Los Angeles, CA
- Cold Spring Harbor Laboratory, Cold Spring Harbor, NY
- John Jay College, New York, NY
- Chaminade University of Honolulu, HI
- McMaster University, Hamilton, ON
- St. Xavier's College, Mumbai, India
- Penn State University, University Park, PA
- University of New Haven, New Haven, CT
- West Virginia University, Morgantown, WV
- University of Toronto, Canada
- Weill Cornell Medical College, New York, NY

Our Research Products: Quantitative Criminalistics
- All forms of physical evidence can be represented as numerical patterns: toolmark surfaces; dust and soil categories and spectra; hair/fiber categories; any instrumentation data (e.g. spectra); triangulated fingerprint minutiae.
- Machine learning trains a computer to recognize patterns. It can give "…the quantitative difference between an identification and non-identification" (Moran), can yield average identification error rate estimates, and maybe even confidence measures for I.D.s.

Bullets
3D scan of a bullet base, 9mm Ruger barrel. The land engraved area, flattened: this is just a table of numbers!

Aggregate Trace: Dust/Soils
- 9/11 dust storm (photo taken by Det. H. Sherman, NYPD CSU, Sept. 2001)
- Photomicrograph of a 9/11 dust sample (taken by Det. N. Petraco, NYPD, Ret.)
- Record dust constituents on a data sheet: count data!!

The Identification Task
- Modern algorithms that "make comparisons" and "ID unknowns" are called machine learning.
- The idea is to measure features of the physical evidence that characterize it, e.g. points in a toolmark data space representing striation patterns A and B.
- Train the algorithm to recognize "major" differences between groups of features while taking into account natural variation and measurement error.

Visually explore: a 3D PCA of 760 real and simulated mean profiles of primer shears from 24 Glocks gives a real 3D data space of toolmarks. A machine learning algorithm, e.g. a support vector machine "learning" algorithm, is then trained to I.D. the toolmark. But…
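
A minimal sketch of this kind of pipeline, assuming synthetic stand-in "profiles" and scikit-learn rather than the group's actual data or code: reduce dimension with PCA, then classify with an SVM.

```python
# Illustrative only: random "tool signatures" plus noise stand in for
# real mean profiles of primer shears.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_tools, reps, length = 5, 30, 200
signatures = rng.normal(size=(n_tools, length))        # one fixed mark per tool
X = np.vstack([sig + 0.3 * rng.normal(size=(reps, length)) for sig in signatures])
y = np.repeat(np.arange(n_tools), reps)

# Project to a 3-D PCA space (as on the slide), then train a linear SVM.
model = make_pipeline(PCA(n_components=3), SVC(kernel="linear"))
acc = cross_val_score(model, X, y, cv=5).mean()
print("cross-validated accuracy:", acc)
```

With well-separated synthetic signatures the cross-validated accuracy is near 1; real toolmark data is of course far noisier.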

How good of a "match" is it? Conformal Prediction (Vovk)
- Can give a judge or jury an easy-to-understand measure of reliability of a classification result: confidence on a scale of 0%-100%.
- Testable claim: the long-run I.D. error rate should be the chosen significance level. Plotting cumulative number of errors against the sequence of unknown observation vectors: 80% confidence gives 20% error (slope = 0.2); 95% confidence gives 5% error (slope = 0.05); 99% confidence gives 1% error (slope = 0.01).
- This is an orthodox "frequentist" approach with roots in algorithmic information theory. Data should be IID, but that's it.

Conformal Prediction
- For a 95%-CPT (PCA-SVM), confidence intervals will not contain the correct I.D. 5% of the time in the long run: a straightforward validation/explanation picture for court.
- 14D PCA-SVM decision model for screwdriver striation patterns: empirical error rate 5.3%; theoretical (long-run) error rate 5%.
- Anyone can do this now with cptID (Petraco) for R!!
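
The testable claim can be demonstrated with a toy split-conformal classifier; this Python sketch (invented one-dimensional scores, not the cptID package or the toolmark pipeline) shows prediction sets missing the true label at roughly the chosen significance level.

```python
# Minimal label-conditional split-conformal classifier on toy data:
# the empirical error rate should hover near the chosen significance eps.
import numpy as np

rng = np.random.default_rng(1)
eps = 0.05                              # chosen significance level
means = {0: 0.0, 1: 3.0}                # two "tools" with 1-D score distributions

def nonconformity(x, label):            # distance to the class center
    return abs(x - means[label])

# Calibration scores from known-source examples, per class.
calib = {c: np.sort([nonconformity(rng.normal(means[c]), c) for _ in range(500)])
         for c in means}

errors, trials = 0, 4000
for _ in range(trials):
    true = int(rng.integers(0, 2))
    x = rng.normal(means[true])
    pset = []
    for c in means:
        a = nonconformity(x, c)
        p = (np.sum(calib[c] >= a) + 1) / (len(calib[c]) + 1)   # conformal p-value
        if p > eps:
            pset.append(c)
    if true not in pset:
        errors += 1
rate = errors / trials
print("empirical error rate:", rate)    # close to eps = 0.05 in the long run
```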

How good of a "match" is it? Empirical Bayes
- An I.D. is output for each questioned toolmark: this is a computer "match". What's the probability the tool is truly the source of the toolmark?
- There is a similar problem in genomics, detecting disease from microarray data (Efron). They use data and Bayes' theorem to get an estimate.

Empirical Bayes model's use with crime scene "unknowns":
- The computer outputs a "match" for unknown crime scene toolmarks compared with knowns from "Bob the burglar's" tools.
- The estimated posterior probability of no association = 0.00027 = 0.027%, with an uncertainty in the estimate.
- So it is 99.97% "believable" that "Bob the burglar's" tool made the crime scene toolmark (est.).
- You can do this now with fdrID (Petraco) for R.
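
The arithmetic behind a "probability of no association" can be sketched with Bayes' theorem and a two-group model in the spirit of Efron's local false discovery rate. Everything below (the prior fraction p0, the score, both densities) is invented for illustration, not the group's fitted values:

```python
# Posterior probability of "no association" given a similarity score:
#   P(null | score) = p0 * f0(score) / f(score)
from math import exp, pi, sqrt

def normal_pdf(x, mu, sd):
    return exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * sqrt(2 * pi))

p0 = 0.9                        # assumed prior fraction of non-matches
score = 4.0                     # observed similarity score for the "match"
f0 = normal_pdf(score, 0, 1)    # null (known non-match) score density
f1 = normal_pdf(score, 4, 1)    # alternative (known match) score density
f = p0 * f0 + (1 - p0) * f1     # mixture density of all scores

post_null = p0 * f0 / f
print("est. post. prob. of no association:", post_null)
print("'believability' of same source: %.2f%%" % (100 * (1 - post_null)))
```

In the empirical-Bayes setting, f0, f, and p0 are estimated from the observed score data rather than assumed.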

WTC Dust Source Probability (Posterior)
Another example, from a civil case:
- Questioned samples (from a flag) with human remains, compared to WTC dust: ~99.3% same source.
- Questioned samples (from the flag) without human remains, compared to WTC dust: ~98.6% same source.

Likelihood Ratios from Empirical Bayes
Using the fit posteriors and priors we can obtain the likelihood ratios, for both known match and known non-match LR values.
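
The relation being used is just Bayes' theorem rearranged: a likelihood ratio is the posterior odds divided by the prior odds. A tiny numeric sketch, with illustrative numbers only:

```python
# LR = posterior odds / prior odds
prior = 0.5          # assumed prior probability of same source
post = 0.9997        # fitted posterior probability of same source

prior_odds = prior / (1 - prior)
post_odds = post / (1 - post)
LR = post_odds / prior_odds
print("likelihood ratio:", LR)
```

With these numbers the prior odds are 1, so the LR equals the posterior odds, about 3332.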

Typical "Data Acquisition" for Toolmarks
Comparison microscope; photomicrographs of a known match and a known non-match (Jerry Petillo).

Bayesian Networks
A "prior" network based on historical/available count data and a multinomial-Dirichlet model for run length probabilities: a Bayesian network for consecutive matching striae (built in GeNIe).
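
The multinomial-Dirichlet piece can be sketched in a few lines: historical counts of run lengths update a Dirichlet prior, and the posterior means become the network's run-length probabilities. The prior and the counts below are invented for illustration:

```python
# Conjugate multinomial-Dirichlet update for run-length probabilities.
alpha_prior = [1.0] * 5            # symmetric Dirichlet over 5 run-length bins
counts = [2, 11, 30, 9, 3]         # historical counts per bin (made up)

alpha_post = [a + n for a, n in zip(alpha_prior, counts)]
total = sum(alpha_post)
probs = [a / total for a in alpha_post]   # posterior mean probabilities
print([round(p, 3) for p in probs])
```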

Run Your "Algorithm"
Compare the 3D striation pattern of the known source with the 3D striation patterns of unknown source, counting runs of matching lines: 6 matching lines, 5 matching lines, 6 matching lines, 4 matching lines. Result: 1 set of 4X, 1 set of 5X, 2 sets of 6X.
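
The run-counting step can be sketched as follows: given which striation lines were judged to match along the patterns (a boolean sequence), tally the lengths of consecutive-matching-striae (CMS) runs. The toy sequence is constructed to reproduce the counts on the slide:

```python
# Tally lengths of maximal runs of True in a match sequence.
from itertools import groupby

def cms_runs(matches):
    """Lengths of maximal runs of consecutive matching striae."""
    return [sum(1 for _ in grp) for val, grp in groupby(matches) if val]

# Toy comparison with one 4X, one 5X and two 6X runs.
seq = [True]*6 + [False] + [True]*5 + [False, False] + [True]*6 + [False] + [True]*4
print(cms_runs(seq))   # [6, 5, 6, 4]
```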

Enter the observed run length data for the comparison into the network and update the "match" (same source) odds (Buckleton, Wevers, Neel; Petraco, Neel):
- Observed runs: 0-2X, 0-3X, 1-4X, 1-5X, 2-6X, 0-7X, 0-8X, 0-9X, 0-10X, 0-≤10X.
- The source probability updates in light of the data.
- Now compute the "weight of evidence": LR = 96/3.8 ≈ 25.
- The evidence "strongly supports" (Kass-Raftery) that the striation patterns were made by the same tool.
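
The verbal conclusion comes from Kass and Raftery's scale for Bayes factors; a small helper mapping an LR onto their categories (cutoffs 3, 20 and 150, from their 1995 paper) shows where the slide's value lands:

```python
# Kass & Raftery (1995) verbal scale for Bayes factor / LR strength.
def kass_raftery(bf):
    if bf < 1:
        return "supports the other hypothesis"
    if bf < 3:
        return "barely worth mentioning"
    if bf < 20:
        return "positive"
    if bf < 150:
        return "strong"
    return "very strong"

print(kass_raftery(96 / 3.8))   # LR of about 25 -> "strong"
```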

Where to Get the Model and Software
- CMS network model: https://github.com/npetraco/CMS-Network
- Bayes net software (no cost for non-commercial/demo use):
  BayesFusion: http://www.bayesfusion.com/
  SamIam: http://reasoning.cs.ucla.edu/samiam/
  Hugin: http://www.hugin.com/
  gR packages: http://people.math.aau.dk/~sorenh/software/gR/

Directions for the Future
Need more data! We are working on a wavelet-based simulator for 2D toolmarks: simulate stochastic detail in the fine wavelet subbands (LH4, HL4, HH4) and combine it with the coarse structure.
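
The simulation idea can be sketched with a single-level 2D Haar transform (a stand-in for the group's actual wavelet family and fitted detail model): keep the coarse approximation, resample the detail subbands, and reconstruct a new surface. The "surface" here is random stand-in data.

```python
# Wavelet-style simulation sketch: keep coarse structure, shuffle detail.
import numpy as np

rng = np.random.default_rng(2)

def haar2d(x):
    """One-level 2-D Haar transform: returns (LL, LH, HL, HH) subbands."""
    a = (x[0::2] + x[1::2]) / 2            # row averages
    d = (x[0::2] - x[1::2]) / 2            # row differences
    return ((a[:, 0::2] + a[:, 1::2]) / 2, (a[:, 0::2] - a[:, 1::2]) / 2,
            (d[:, 0::2] + d[:, 1::2]) / 2, (d[:, 0::2] - d[:, 1::2]) / 2)

def ihaar2d(LL, LH, HL, HH):
    """Inverse of haar2d."""
    a = np.empty((LL.shape[0], LL.shape[1] * 2))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = LL + LH, LL - LH
    d[:, 0::2], d[:, 1::2] = HL + HH, HL - HH
    x = np.empty((a.shape[0] * 2, a.shape[1]))
    x[0::2], x[1::2] = a + d, a - d
    return x

surface = rng.normal(size=(64, 64))        # stand-in for a toolmark scan
LL, LH, HL, HH = haar2d(surface)
# Shuffle the detail subbands to simulate new stochastic detail.
sim = ihaar2d(LL,
              rng.permutation(LH.ravel()).reshape(LH.shape),
              rng.permutation(HL.ravel()).reshape(HL.shape),
              rng.permutation(HH.ravel()).reshape(HH.shape))
print(sim.shape)
```

A real simulator would fit a statistical model to the detail coefficients of many scans rather than simply permuting one scan's coefficients.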

Need More Data!: Model Aggregate Evidence Dependencies
- Use Ising- and Potts-like "spin" models to capture dependences between components of materials/trace evidence (e.g., the entries of a dust constituent data sheet).
- Simulate data using the model with standard techniques from statistical mechanics.
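
The "standard techniques from statistical mechanics" refers to samplers such as Metropolis Monte Carlo. A minimal sketch for a 2D Ising model follows; the lattice size, coupling, and temperature are generic choices, not fit to any dust or trace data:

```python
# Minimal Metropolis sampler for a 2-D Ising model with periodic boundaries.
import numpy as np

rng = np.random.default_rng(3)
n, beta, sweeps = 16, 0.3, 100
spins = rng.choice([-1, 1], size=(n, n))

for _ in range(sweeps * n * n):
    i, j = rng.integers(0, n, size=2)
    nb = (spins[(i + 1) % n, j] + spins[(i - 1) % n, j] +
          spins[i, (j + 1) % n] + spins[i, (j - 1) % n])
    dE = 2 * spins[i, j] * nb              # energy change of flipping (i, j)
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        spins[i, j] *= -1

print("mean magnetization:", abs(spins.mean()))
```

For evidence modeling, the spin states would encode constituent categories and the couplings would be estimated from observed co-occurrence counts.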

Compute the Weight of Evidence for Any Pair of Hypotheses
- The weight of evidence was canonically defined by Jeffreys: "weight of evidence" = Bayes factor.
- A "thermodynamic integration" trick computes the hard part, the marginal likelihood. With the power posterior p_β(θ|x) ∝ p(θ) p(x|θ)^β, the standard identity is: log p(x) = ∫₀¹ E_{p_β}[log p(x|θ)] dβ.
- We can (in principle…) get these averages from MCMC for fixed β from 0 to 1 and numerically integrate the above (Lartillot, Friel, Oates, Calderhead, Gelman).
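
The thermodynamic integration identity can be checked on a toy conjugate normal model where the marginal likelihood is known in closed form; grid sums stand in here for the MCMC averages mentioned on the slide.

```python
# Toy thermodynamic integration:
#   log p(x) = integral over beta in [0,1] of E_{p_beta}[log p(x|theta)],
# where p_beta(theta) is proportional to p(theta) * p(x|theta)**beta.
import numpy as np

x = 1.2
theta = np.linspace(-10.0, 10.0, 4001)
log_prior = -0.5 * theta**2 - 0.5 * np.log(2 * np.pi)         # theta ~ N(0, 1)
log_like = -0.5 * (x - theta)**2 - 0.5 * np.log(2 * np.pi)    # x | theta ~ N(theta, 1)

betas = np.linspace(0.0, 1.0, 101)
expectations = []
for b in betas:
    w = np.exp(log_prior + b * log_like)   # unnormalized power posterior on the grid
    w /= w.sum()
    expectations.append(float(np.sum(w * log_like)))          # E_beta[log likelihood]

# Trapezoid rule over beta gives the estimated log marginal likelihood.
e = np.array(expectations)
ti_estimate = float(np.sum((e[1:] + e[:-1]) / 2 * np.diff(betas)))

exact = -0.5 * np.log(2 * np.pi * 2.0) - x**2 / 4.0           # log N(x; 0, 2)
print(ti_estimate, exact)
```

The two numbers agree closely, which is the point of the trick: the marginal likelihood falls out of a one-dimensional integral of expectations that MCMC can estimate.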

More Future Directions
- Clean up cptID and fdrID (Petraco).
- GUI modules for common toolmark comparison tasks/calculations using 3D microscope data.
- 2D features for toolmark impressions: feature2 (Petraco).
- Parallel/GPU/FPGA implementation of computationally intensive routines (e.g. as in the ALMA Correlator for astronomy data), especially for retrieving "relevant population/best match" reference sets.
- Uncertainty for Bayesian networks: models, parameters…

References
- Petraco: https://github.com/npetraco/x3pr ; https://github.com/npetraco/feature2 ; https://github.com/npetraco/cptID ; https://github.com/npetraco/fdrID
- Whitcher: https://cran.r-project.org/web/packages/waveslim/index.html
- R-Core: https://www.r-project.org/
- Vovk: http://www.alrw.net/
- Efron: Efron B. "Large-Scale Inference: Empirical Bayes Methods for Estimation, Testing, and Prediction", Cambridge, 2013.
- Neel: Neel M and Wells M. "A Comprehensive Analysis of Striated Toolmark Examinations. Part 1: Comparing Known Matches to Known Non-Matches", AFTE J 39(3):176-198, 2007.
- Wevers: Wevers G, Neel M and Buckleton J. "A Comprehensive Statistical Analysis of Striated Tool Mark Examinations Part 2: Comparing Known Matches and Known Non-Matches using Likelihood Ratios", AFTE J 43(2):1-9, 2011.
- Buckleton: Buckleton J, Nichols R, Triggs C and Wevers G. "An Exploratory Bayesian Model for Firearm and Tool Mark Interpretation", AFTE J 37(4):352-359, 2005.
- BayesFusion: http://www.bayesfusion.com/

References
- Lartillot: Lartillot N and Philippe H. "Computing Bayes factors using thermodynamic integration", Syst Biol 55(2):195-207, 2006.
- Friel: https://arxiv.org/pdf/1209.3198v3.pdf
- Oates: https://arxiv.org/pdf/1404.5053.pdf
- Calderhead: http://eprints.gla.ac.uk/34589/1/34589.pdf
- Gelman: https://projecteuclid.org/download/pdf_1/euclid.ss/1028905934

Acknowledgements
Professor Chris Saunders (SDSU); Professor Christophe Champod (Lausanne); Alan Zheng (NIST); Ryan Lillien (Cadre); Scott Chumbley (Iowa State); Robert Thompson (NIST)
Research Team:
Mr. Daniel Azevedo, Dr. Martin Baiker, Dr. Peter Diaczuk, Mr. Antonio Del Valle, Ms. Carol Gambino, Dr. James Hamby, Ms. Alison Hartwell, Esq., Dr. Brooke Kammrath, Ms. Lily Lin, Mr. Chris Lucky, Off. Patrick McLaughlin, Dr. Linton Mohammed, Mr. Kevin Moses, Mr. Mike Neel, Mr. Nicholas Petraco, Ms. Stephanie Pollut, Dr. Mecki Prinz, Dr. Peter Shenkin, Dr. Jacqueline Speir, Mr. Peter Tytell, Dr. Peter Zoon
Collaborations, Reprints/Preprints: npetraco@gmail.com, http://jjcweb.jjay.cuny.edu/npetraco/