Updated: Nov 7
According to a report by The Bureau of Investigative Journalism, a new algorithmic tool for categorizing prisoners in UK jails risks automating and embedding racism in the system. The tool draws on data from the prison service, police, and the National Crime Agency to assess what type of prison a person should be put in and how strictly they should be controlled during their sentence.
Critics have warned that the new system could result in ethnic minority prisoners being unfairly placed in higher security conditions than white prisoners, exacerbating long-standing problems of discrimination exposed in an excoriating review two years ago by the Labour MP David Lammy.
A preliminary evaluation of the new system by the Ministry of Justice (MoJ) concluded that there was no indication of discrimination. However, experts have questioned whether the MoJ’s findings can support this conclusion.
The evaluation assessed an initial trial on 269 prisoners, of whom 32 were Black, Asian, or minority ethnic (BAME). Although some numbers were redacted from the report, the Bureau's analysis indicates that 5 of the 32 non-white prisoners (16%) had their risk category raised under the new system, compared with 16 of 230 white prisoners (7%).
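The percentages quoted above can be recomputed directly from the trial counts. A quick sketch (note that the reported subgroup counts, 32 and 230, do not sum to the 269 total, presumably because of the redactions mentioned in the report):

```python
# Counts as reported by the Bureau's analysis of the MoJ trial:
# 32 BAME prisoners, of whom 5 had their risk category raised;
# 230 white prisoners, of whom 16 did.
bame_total, bame_raised = 32, 5
white_total, white_raised = 230, 16

bame_rate = bame_raised / bame_total     # ~0.156
white_rate = white_raised / white_total  # ~0.070

print(f"BAME:  {bame_rate:.0%}")                       # 16%
print(f"White: {white_rate:.0%}")                      # 7%
print(f"Relative rate: {bame_rate / white_rate:.1f}x") # 2.2x
```

On these figures, BAME prisoners were upgraded at a little over twice the rate of white prisoners, though with only 5 affected BAME prisoners the sample is too small to draw firm statistical conclusions, which is one reason experts questioned the MoJ's "no indication of discrimination" finding.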
The evaluation highlights an inherent risk of the new tool – that potential racial bias already present in the prison system “may translate into more reporting on BAME prisoners through our intelligence systems.” Because the tool relies on these systems, “this may result in a greater proportion of BAME prisoners having an increase in their security category”.
The use of algorithms in the criminal justice system has been debated for some time: while they can be useful in predicting recidivism rates, they can also perpetuate existing biases and inequalities.
The UK is not alone here. In a report issued days before Christmas in 2021, the US Department of Justice said its algorithmic tool for assessing the risk that a person in prison would return to crime produced uneven results. The algorithm, known as Pattern, overpredicted the risk that many Black, Hispanic and Asian people would commit new crimes or violate rules after leaving prison.