Software that Predicts Crime “No More Accurate or Fair” than Human Judgment

Researchers Julia Dressel and Hany Farid of Dartmouth College in the US conducted a study showing that widely used commercial software is “no more accurate or fair” than human judgment at predicting recidivism, the tendency of a convicted criminal to reoffend.

The two researchers found that the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), a forecasting algorithm also used in some American courts, is virtually identical in accuracy to small groups of untrained people asked to predict whether a defendant would reoffend. Dressel further emphasized that “an algorithm’s accuracy can’t be taken for granted.”
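The headline comparison can be reproduced on ProPublica’s public release of the Broward County data. Below is a minimal sketch, assuming the dataset’s published column names (`decile_score`, `two_year_recid`) and the conventional reading that a decile score of 5 or higher counts as a prediction of reoffending; the file path is a placeholder, not the authors’ code.

```python
import pandas as pd

# Placeholder path to ProPublica's published Broward County dataset
df = pd.read_csv("compas-scores-two-years.csv")

# Conventional reading of a COMPAS score: a decile of 5 or more
# ("medium"/"high" risk) counts as a prediction of reoffending
predicted = df["decile_score"] >= 5
actual = df["two_year_recid"] == 1  # re-arrested within two years

accuracy = (predicted == actual).mean()
print(f"COMPAS overall accuracy: {accuracy:.3f}")  # roughly 0.65 in the study
```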

The researchers noted that COMPAS needs six variables to assess the risk of recidivism, while a human can reach comparable judgments using only two pieces of information: the defendant’s age and number of past convictions. Dressel emphasized that the cost of a wrong judgment is very high, which raises a serious question about whether the software should play a part in such life-changing decisions.
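To see how far two variables can go, here is a hedged sketch of the kind of two-feature classifier the study describes: a logistic regression on age and prior-conviction count. Column names again follow ProPublica’s dataset; this is an illustration under those assumptions, not the authors’ exact method.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("compas-scores-two-years.csv")  # placeholder path

# Only two inputs: the defendant's age and number of prior convictions
X = df[["age", "priors_count"]]
y = df["two_year_recid"]  # 1 if re-arrested within two years

# 10-fold cross-validated accuracy of a simple linear classifier;
# the study reports such a model performing on par with COMPAS
clf = LogisticRegression()
print(f"mean accuracy: {cross_val_score(clf, X, y, cv=10).mean():.3f}")
```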

The authors wrote that “the widely used commercial risk assessment software COMPAS is no more accurate or fair than predictions made by people with little or no criminal justice expertise.”

The study drew on a database of more than 7,000 pretrial defendants from Broward County, Florida, covering 2013 to 2014.

Dressel and Farid’s study is not the only one to raise questions about the role of such software in judicial decision-making. In a high-profile case, a Wisconsin man charged with eluding police argued that his right to due process was violated when a judge used the software in sentencing him.