Algorithms Fail to Make Bail Less Biased

A leading argument behind the current movement to limit or eliminate bail is that the traditional cash bail process discriminates against minority arrestees. Tom Simonite of Wired reports that to fix this disparity, many jurisdictions are using computer software that weighs nine factors about an arrestee, including age, past convictions and current charges; no data on an arrestee's race are considered. The Oakland, CA nonprofit MediaJustice found that these algorithms are used in 46 states to guide judicial decisions on bail, sentencing, parole and probation. Law enforcement representatives have been skeptical, noting that using software to decide which arrestees are held in jail and which are set free removes responsibility from those charged with making those decisions, who should be encouraged to err on the side of caution. Critics add that the software does not consider public safety.
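To make the reporting concrete, here is a minimal sketch of how a point-style, race-blind risk score of this kind might be computed. The factor names, point weights, and the risk_score function below are illustrative assumptions, not the actual instrument used in any jurisdiction, and only a few of the nine factors are shown.

```python
# Hypothetical point-based pretrial risk score, loosely modeled on public
# descriptions of nine-factor tools. All weights here are assumed for
# illustration; race is deliberately absent from the inputs.

def risk_score(arrestee: dict) -> int:
    """Sum points over race-blind factors; a higher total means higher assessed risk."""
    points = 0
    if arrestee["age"] < 23:                         # youth weighted as a risk factor
        points += 2
    points += min(arrestee["prior_convictions"], 3)  # cap the contribution of priors
    if arrestee["current_charge_violent"]:
        points += 2
    if arrestee["prior_failure_to_appear"]:
        points += 1
    return points

example = {
    "age": 21,
    "prior_convictions": 2,
    "current_charge_violent": False,
    "prior_failure_to_appear": True,
}
print(risk_score(example))  # -> 5; a judge would compare this total to a release threshold
```

In a scheme like this, the score is advisory: it maps an arrestee's record to a number, and the jurisdiction sets the threshold at which detention is recommended.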

But in recent months, even proponents of the algorithms have begun saying they should be abandoned. The Pretrial Justice Institute (PJI), an early advocate, now claims that these tools have not reduced racial disparities, and in some jurisdictions have increased them. New Jersey figures show that while eliminating cash bail and using an algorithm to determine pretrial release reduced the jail population, the racial disparity among those denied release remained the same, with blacks more likely to be held in jail than whites. The PJI and other groups that sold these risk assessment tools to the public as a way to create a more colorblind system are now calling for their abandonment.

To be clear, no process intended to create proportional racial outcomes in bail and sentencing will ever work, because the members of different racial and ethnic groups do not commit crimes proportionally. Blacks and Hispanics commit crimes at higher rates than whites and Japanese. While it is certainly politically incorrect to say this, any process that ignores this reality is guaranteed to fail.
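The arithmetic behind this point is simple. In the hypothetical numbers below (assumed purely for illustration, not drawn from any real dataset), a race-blind rule treats identical records identically, yet whichever group has the higher base rate of a scored factor is detained at a proportionally higher rate.

```python
# Illustrative arithmetic only: the groups and base rates are hypothetical.
# Shows how a race-blind detention rule still produces disparate outcome
# rates whenever a scored factor is distributed differently across groups.

group_a_prior_felony_rate = 0.30  # assumed share of group A arrestees with a prior felony
group_b_prior_felony_rate = 0.15  # assumed share of group B arrestees with a prior felony

# Race-blind rule: detain anyone with a prior felony conviction.
detained_share_a = group_a_prior_felony_rate
detained_share_b = group_b_prior_felony_rate

print(f"Group A detained: {detained_share_a:.0%}")  # 30%
print(f"Group B detained: {detained_share_b:.0%}")  # 15%
# Identical treatment of identical records, yet a 2:1 disparity in outcomes.
```

Under these assumptions, the only way to equalize detention rates across the groups would be to apply different thresholds to identical records, which is precisely what a colorblind tool is designed not to do.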