Algorithmic Fairness Risk assessments for pre-trial detention
Sam Corbett-Davies with Emma Pierson, Avi Feller, and Sharad Goel
Pre-trial risk assessment
- Used in most states and big cities, including SF and NYC
- Attempts to predict whether a defendant will commit a new violent crime
- 1-10 scale; high-risk defendants are more likely to be detained
Possible advantages and drawbacks
Advantages:
- Better than judges at identifying risky defendants
- Can detain fewer people and improve public safety
- Less randomness than judges
- Less bias than judges

Drawbacks:
- Opaque (no due process)
- Impersonal; cannot take into account unique mitigating circumstances
- "Scientific rationalization of discrimination"1

1Sonja Starr, "Evidence-Based Sentencing and the Scientific Rationalization of Discrimination," Stanford Law Review
Evidence in favor: Lucas County, OH1
- Defendants released without bail increased from 14% to 28%
- Pre-trial crime fell from 20% to 10%; violent crime fell from 5% to 3%
- Defendants skipping court fell from 41% to 29%

Virginia2: pre-trial agencies randomly chosen to introduce risk assessment
- 2.3 times more likely to recommend release; 1.9 times more likely to release
- Those released were 1.3 times less likely to fail to appear or commit a crime

1Lucas County: Laura and John Arnold Foundation (non-experimental design); http://
2http://luminosity-solutions.com/site/wp-content/uploads/2014/02/Risk-Based-Pretrial-Guidelines-August-2015.pdf
What features are acceptable to include?
- "Fairness through blindness": remove protected characteristics (especially race) from the classifier. But what about features that correlate very closely with race (red-lining)?
- "Fairness through awareness": explicitly use race to achieve some measure of fairness
What features are acceptable to include?
Input features: criminal history, age, charge type, family situation, drug use
Machine learning classifier: logistic regression, random forests, etc.
Outcome variable: will the defendant commit a violent crime before their trial?
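The pipeline on this slide can be sketched end-to-end. Everything below is a synthetic assumption for illustration (the data-generating model, coefficients, and feature scaling are all made up); the point is only the shape of the pipeline: features in, a fitted logistic regression, a 1-10 risk score out.

```python
import math
import random

random.seed(0)

# Hypothetical synthetic data standing in for the slide's feature list:
# [prior arrests, age / 10, felony charge]. The generative model is assumed.
def make_defendant():
    priors = random.randint(0, 10)
    age = random.randint(18, 60)
    felony = random.randint(0, 1)
    logit = 0.4 * priors - 0.05 * age + 0.8 * felony - 1.0
    y = 1 if random.random() < 1 / (1 + math.exp(-logit)) else 0
    return [priors, age / 10.0, felony], y

data = [make_defendant() for _ in range(2000)]

# Plain logistic regression fit by batch gradient descent (no libraries).
w, b = [0.0, 0.0, 0.0], 0.0
for _ in range(300):
    gw, gb = [0.0, 0.0, 0.0], 0.0
    for x, y in data:
        p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
        for i in range(3):
            gw[i] += (p - y) * x[i]
        gb += p - y
    for i in range(3):
        w[i] -= 0.05 * gw[i] / len(data)
    b -= 0.05 * gb / len(data)

def risk_score(x):
    """Map the predicted probability onto the 1-10 scale these tools report."""
    p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
    return min(10, 1 + int(p * 10))
```

In practice a real tool would use a library classifier and validated features; the hand-rolled fit here just keeps the sketch self-contained.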
Bias in the input data? Real example from Virginia
- Whites with "unstable housing" are 17% more likely to recidivate (p < 0.01)
- "Unstable housing" has no significant association with recidivism for POC (p = 0.1)
- Remove the feature: underestimate risk for whites with unstable housing; overestimate risk for whites with stable housing
- Include the feature for all defendants: overestimate risk for POC with unstable housing; underestimate risk for POC with stable housing
- Include the feature only when assessing whites: no over/underestimation of risk, but the model now differs by race
Bias in the input data? Popular concern, but one that can be addressed
- Bias in the input data is easy to identify
- It can also be corrected with appropriate interaction terms
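The interaction-term fix can be sketched in a few lines. This is a hypothetical illustration, not the Virginia model: the feature names and weights are made up, loosely mirroring the numbers above. Adding a race-by-housing interaction lets the housing coefficient differ by group instead of being dropped or forced to be shared.

```python
# Hypothetical sketch: an interaction term lets one feature's effect differ
# by group, instead of dropping the feature or forcing a shared coefficient.
def expand_features(priors, unstable_housing, is_white):
    return [
        priors,                       # shared main effect
        unstable_housing,             # main effect (the POC association)
        unstable_housing * is_white,  # interaction: extra effect for whites
    ]

# Assumed illustrative weights: zero main housing effect (matching the
# non-significant p = 0.1 finding for POC) plus a positive interaction for
# white defendants (matching the 17% finding).
WEIGHTS = [0.35, 0.0, 0.17]

def linear_score(priors, unstable_housing, is_white):
    x = expand_features(priors, unstable_housing, is_white)
    return sum(w * xi for w, xi in zip(WEIGHTS, x))
```

With these weights, unstable housing raises the score for white defendants but leaves it unchanged for POC, which is exactly the "include only when assessing whites" behavior without maintaining two separate models.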
Bias in the outcome data?
- We want to predict crime, but we only observe arrests
- What if blacks are more likely to be arrested than whites who've committed the same crime?
- Very hard to correct for (what is the true rate of crime?)
- Violent crime is less biased: it is more likely to be reported, excludes victimless crimes, and surveys about crime victimization match arrest rates
- Not an issue if "failure to appear in court" is the outcome of concern: no chance of biased enforcement or recording, and law in NYC states FTA is the only consideration when granting bail
What does it mean for an algorithm to be fair?
COMPAS risk assessment
- Developed by Northpointe, a private company
- Predicts "will this defendant commit a crime within their next two years of freedom?"
- Evaluated by ProPublica on data from Broward County, FL
ProPublica’s evidence of bias
                                                     White defendants  Black defendants
Proportion of those who didn't reoffend
labeled as high risk                                       24%               45%
Proportion of those who did reoffend
labeled as low risk                                        48%               28%

False positive and false negative rates differ by race.
Northpointe’s evidence of fairness
                                                     White defendants  Black defendants
Proportion of those labeled as high risk
who did reoffend                                           59%               63%
Proportion of those labeled as low risk
who didn't reoffend                                        71%               65%

The algorithm is calibrated by race (positive predictive values are approximately equal).
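Both sides' metrics come from the same confusion counts, so it is worth seeing them computed together. A minimal sketch, using hypothetical toy counts (not the Broward County data): ProPublica's numbers are error rates conditioned on the outcome, Northpointe's are predictive values conditioned on the label.

```python
from collections import Counter

def fairness_metrics(records):
    """records: (group, labeled_high_risk, reoffended) triples.
    Returns {group: {"fpr", "fnr", "ppv", "npv"}}."""
    c = Counter(records)
    out = {}
    for g in {g for g, _, _ in records}:
        tp, fp = c[(g, True, True)], c[(g, True, False)]
        fn, tn = c[(g, False, True)], c[(g, False, False)]
        out[g] = {
            "fpr": fp / (fp + tn),  # ProPublica's metric: non-reoffenders labeled high risk
            "fnr": fn / (fn + tp),  # ProPublica's metric: reoffenders labeled low risk
            "ppv": tp / (tp + fp),  # Northpointe's metric: high-risk labels that reoffend
            "npv": tn / (tn + fn),  # Northpointe's metric: low-risk labels that don't
        }
    return out

# Hypothetical toy counts for one group: 20 non-reoffenders (5 labeled high
# risk) and 10 reoffenders (6 labeled high risk).
records = ([("a", True, False)] * 5 + [("a", False, False)] * 15
           + [("a", True, True)] * 6 + [("a", False, True)] * 4)
m = fairness_metrics(records)["a"]
```

With these counts, fpr = 5/20 and ppv = 6/11; on real data the two parties' metrics can disagree about fairness even though they are computed from the same table.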
Definitions of fairness
- Calibration: white and black defendants with equal scores reoffend at equal rates
- Predictive equality: equal false positive rates by race
- Statistical parity: equal detention rates by race
Definitions of fairness – bad news
Who should be detained?
- Single threshold
- Race-based thresholds (to achieve predictive equality or statistical parity)

S. Corbett-Davies et al., "Algorithmic Decision Making and the Cost of Fairness"
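The threshold mechanics can be sketched with synthetic data. Everything below is an assumption for illustration: two hypothetical groups "a" and "b" whose scores are calibrated (reoffense probability depends only on the score) but distributed differently. A single threshold then produces unequal false positive rates, and raising the threshold for the higher-scoring group narrows the gap, which is exactly the race-based-thresholds trade-off on this slide.

```python
import random

random.seed(1)

# Hypothetical synthetic population: scores 1-10; reoffense probability rises
# with score identically in both groups (a calibrated score), but group "b"
# is assumed to skew toward higher scores.
def sample(group, n):
    out = []
    for _ in range(n):
        s = min(10, max(1, int(random.gauss(6 if group == "b" else 4, 2))))
        y = random.random() < s / 12  # reoffense prob depends only on score
        out.append((s, y))
    return out

pop = {g: sample(g, 5000) for g in ("a", "b")}

def fpr(scored, threshold):
    """False positive rate: non-reoffenders at or above the threshold."""
    neg = [(s, y) for s, y in scored if not y]
    return sum(s >= threshold for s, _ in neg) / len(neg)

# A single threshold gives the higher-scoring group a higher FPR...
single = {g: fpr(pop[g], 7) for g in pop}
# ...while a higher threshold for that group narrows the FPR gap.
adjusted = {"a": fpr(pop["a"], 7), "b": fpr(pop["b"], 8)}
```

The group-specific threshold buys a smaller FPR gap, but it detains some group-"b" defendants at score 7 less often than identically risky group-"a" defendants, which is the cost the next slide quantifies.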
Definitions of fairness – more bad news
Fairness definition    White det. rate  Black det. rate  White FPR  Black FPR  % low-risk detainees  Increase in violent crime
Calibration                 18%              39%            14%        31%            —                       —
Equal FPR                   28%              32%            24%        24%            —                       7%
Equal detention rate        30%              30%            26%        22%            17%                     9%

- Equalizing FPR or detention rates results in more crime committed by released defendants
- Most crime is within-race, so the extra crime will disproportionately occur in the communities we're trying to help
- The detainees subject to the lower threshold will probably sue under the 14th Amendment; race-based decisions require strict scrutiny
The news isn’t all bad The courts have permitted race-based decision making in the past (eg Fisher v. University of Texas, affirmative action) There are alternatives – raise the threshold or use alternatives to detention Judges’ decisions have all these problems and more In many cases using algorithms will reduce crime, reduce detention and reduce racial disparities!
Further reading
- ProPublica article alleging racial bias in risk assessments
- Cathy O'Neil, Weapons of Math Destruction
- "Are criminal risk assessment scores racist?"
- "Inherent Trade-Offs in the Fair Determination of Risk Scores"
- "Equality of Opportunity in Supervised Learning"
- "Big Data's Disparate Impact"
- "Evidence-Based Sentencing and the Scientific Rationalization of Discrimination"

Our work:
- Our "rebuttal" to ProPublica
- Our paper proving multiple thresholds optimally satisfy other notions of fairness