
Prisoner risk algorithm could program in racism

Updated: Nov 7, 2023

According to a report by The Bureau of Investigative Journalism, a new algorithmic tool for categorizing prisoners in UK jails risks automating and embedding racism in the system. The tool draws on data from the prison service, police, and the National Crime Agency to assess what type of prison a person should be put in and how strictly they should be controlled during their sentence.


Critics have warned that the new system could result in ethnic minority prisoners being unfairly placed in higher security conditions than white prisoners, exacerbating long-standing problems of discrimination exposed in an excoriating review two years ago by the Labour MP David Lammy.


A preliminary evaluation of the new system by the Ministry of Justice (MoJ) concluded that there was no indication of discrimination. However, experts have questioned whether the MoJ’s findings can support this conclusion.


The evaluation assessed an initial trial on 269 prisoners, of whom 32 were black, Asian, or from an ethnic minority (BAME). Although some numbers were redacted from the report, the Bureau’s analysis indicates that 5 of the 32 BAME prisoners (16%) had their risk category raised under the new system, compared with 16 of the 230 white prisoners (7%).
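The gap between those two rates is easy to verify from the figures the Bureau reports. A minimal sketch, using only the counts quoted above:

```python
# Figures quoted from the Bureau's analysis of the MoJ trial:
# 269 prisoners in total; 32 BAME, 230 white (per the article's counts).
bame_raised, bame_total = 5, 32
white_raised, white_total = 16, 230

bame_rate = bame_raised / bame_total     # 5/32  ≈ 0.156 → 16%
white_rate = white_raised / white_total  # 16/230 ≈ 0.070 → 7%

print(f"BAME rate raised:  {bame_rate:.0%}")
print(f"White rate raised: {white_rate:.0%}")
print(f"Relative rate:     {bame_rate / white_rate:.1f}x")
```

On these numbers, BAME prisoners were recategorized upward at roughly 2.2 times the rate of white prisoners, though with only 32 BAME prisoners in the trial the sample is small.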


The evaluation highlights an inherent risk of the new tool – that potential racial bias already present in the prison system “may translate into more reporting on BAME prisoners through our intelligence systems.” Because the tool relies on these systems, “this may result in a greater proportion of BAME prisoners having an increase in their security category”.
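The mechanism the evaluation warns about is a feedback loop: if one group is reported on more intensively, a tool that scores prisoners by the volume of intelligence reports will escalate that group more often, even when underlying behaviour is identical. A hypothetical sketch, with all numbers invented for illustration:

```python
# Toy simulation of reporting-driven bias (all parameters are invented).
# Both groups have the same behaviour (same incidents per person); they
# differ only in how often an incident is logged as an intelligence report.
import random

random.seed(0)

def count_escalated(reporting_rate, n_people=1000,
                    incidents_per_person=5, report_threshold=2):
    """Count how many people accumulate enough logged reports to have
    their security category raised, given a per-incident logging rate."""
    escalated = 0
    for _ in range(n_people):
        reports = sum(random.random() < reporting_rate
                      for _ in range(incidents_per_person))
        if reports >= report_threshold:
            escalated += 1
    return escalated

# Identical behaviour, different reporting intensity.
print("lightly reported group: ", count_escalated(reporting_rate=0.3))
print("heavily reported group: ", count_escalated(reporting_rate=0.5))
```

The more heavily reported group crosses the escalation threshold far more often despite identical behaviour, which is exactly the risk of feeding a categorization tool from intelligence systems that may already reflect bias.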


The use of algorithms in the criminal justice system has been debated for some time. While algorithms can be useful in predicting recidivism rates, they can also perpetuate existing biases and inequalities.

