
What’s Wrong with Using Algorithmic Tools to Determine Who Gets to Make Bail

The United States incarcerates more people than any other nation in the world. In 2016, the country’s jails and prisons held around 2.2 million inmates, with another 4.5 million under community supervision such as probation or parole. In other words, 1 out of every 38 American adults was under some form of correctional supervision.

Faced with these nightmarish statistics, courtrooms across the United States have turned to automation in an effort to bring efficiency and safety to the legal system.

How Automated Tools Are Being Used in the Legal System

The use of automated tools in the legal system is nothing new. Law enforcement agencies, for instance, have long used predictive algorithms to decide where to deploy their officers, and police departments have been using facial recognition systems to identify and apprehend suspects. The latter has been the source of great controversy, especially given how inaccurate facial recognition systems can be at identifying dark-skinned individuals.

Surprisingly, however, facial recognition isn’t the most controversial automated tool being used in the justice system. The real problem starts after the police make an arrest. Yes, we’re talking about criminal risk assessment algorithms.

The Problem with Using Algorithmic Tools to Determine Who Gets to Make Bail

In some states, judges now use an algorithm to decide whether to detain a defendant in jail before trial. The algorithm takes in details of the defendant’s profile, such as their criminal history and the severity of the offense, and spits out a score estimating the likelihood that the defendant is a flight risk.
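To make that concrete, here is a minimal sketch in Python of how such a score might be computed. The feature names and weights are invented for illustration and do not correspond to any real assessment tool; many real tools are, however, based on logistic-regression-style models like this one.

```python
# Hypothetical illustration of a pretrial risk score -- the features
# and weights below are invented, not taken from any real tool.
import math

def risk_score(prior_arrests: int, prior_convictions: int,
               offense_severity: int, failed_appearances: int) -> float:
    """Return a 0-10 'flight risk' score from a defendant's profile.

    Each input is weighted and passed through a logistic squashing
    function, mimicking the regression models many risk assessment
    tools are built on.
    """
    # Invented weights: a real tool would fit these to historical data.
    z = (0.30 * prior_arrests
         + 0.45 * prior_convictions
         + 0.25 * offense_severity
         + 0.60 * failed_appearances
         - 2.0)                             # intercept
    probability = 1 / (1 + math.exp(-z))    # estimated P(failure to appear)
    return round(probability * 10, 1)       # rescale to a 0-10 score

# Example: two prior arrests, one conviction, a mid-severity charge,
# and one previous failure to appear.
print(risk_score(2, 1, 3, 1))  # -> 6.0
```

The judge never sees the weights, only the final number, which is part of why these scores carry so much unexamined authority in the courtroom.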


The logic behind using algorithmic tools is that they can predict an individual’s behavior accurately. In theory, algorithms also keep the process from being influenced by bias, since judges rely on data-driven recommendations instead of their gut feeling.

However, there’s a problem. The algorithmic tools being used in the justice system are driven by historical crime data. They use statistics to identify patterns in that data: feed one crime data from previous years, and it will pick out whatever patterns are associated with crime.

But these patterns are merely statistical correlations, not causal relationships. The United States has a long history of minorities being disproportionately targeted by law enforcement, particularly communities of color and low-income communities. Because of this history, individuals from these communities are at a higher risk of being given a high recidivism score by an algorithmic tool. Consequently, these tools can perpetuate, or even amplify, historical biases.
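A toy simulation, again with entirely invented numbers, shows how this feedback can play out: if one group is policed twice as heavily, its members accumulate more recorded arrests for the same underlying behavior, and any score trained on arrest records will rate them as riskier.

```python
# Toy simulation of feedback bias -- all rates here are invented.
import random

random.seed(0)

TRUE_OFFENSE_RATE = 0.20                # identical for both groups
POLICING_RATE = {"A": 0.9, "B": 0.45}   # group A is policed twice as heavily

def simulated_record(group: str) -> int:
    """Return recorded arrests over 5 years for one hypothetical person."""
    arrests = 0
    for _ in range(5):
        offended = random.random() < TRUE_OFFENSE_RATE
        # An offense only enters the record if police are there to see it.
        if offended and random.random() < POLICING_RATE[group]:
            arrests += 1
    return arrests

# Average recorded arrests per group: same behavior, different records.
for group in ("A", "B"):
    avg = sum(simulated_record(group) for _ in range(10_000)) / 10_000
    print(group, round(avg, 2))
# Group A ends up with roughly 0.90 recorded arrests per person versus
# roughly 0.45 for group B, so a score fed on arrest counts labels
# group A as higher risk even though the underlying behavior is equal.
```

The data never records who offended, only who got caught, and the algorithm has no way to tell the difference.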

DeLaughter Bail Bonds has been serving the state of Indiana for around a decade, arranging surety and transfer bail bonds for all charges in every county in the jurisdiction, including LaPorte County, Newton County, and Starke County. Get in touch with us today for more information.
