The UK Ministry of Justice (MoJ) is using data profiling tools to predict who might commit crimes, and it’s raising serious concerns about bias. Statewatch, a pressure group, argues that relying on historically biased data will deepen existing discrimination.
Through Freedom of Information requests, Statewatch revealed that the MoJ is using a flawed algorithm to gauge the risk of reoffending and is developing another tool aimed at predicting murders. Proponents claim these tools help direct resources more effectively, but critics contend they target marginalized communities that have historically faced over-policing, creating a harmful cycle.
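The "harmful cycle" critics describe can be made concrete with a toy simulation. This is a minimal sketch with invented numbers, not the MoJ's actual system: two areas offend at the same true rate, but patrols are sent where the records say risk is highest, and only patrolled crime gets recorded.

```python
# Toy model of the feedback loop critics describe (invented numbers,
# not the MoJ's system): both areas offend at the same true rate, but
# patrols go where records say risk is highest, and only patrolled
# crime gets recorded.
import random

random.seed(0)

TRUE_CRIME_RATE = {"A": 0.10, "B": 0.10}  # identical underlying rates
records = {"A": 60, "B": 40}              # area A starts with more history

for year in range(10):
    # "prediction": send every patrol to the area with the most recorded crime
    target = max(records, key=records.get)
    # recorded crime = offences a patrol happens to observe
    observed = sum(random.random() < TRUE_CRIME_RATE[target] for _ in range(100))
    records[target] += observed

print(records)  # the gap between A and B only grows, purely from where police looked
```

In this sketch, area A's head start in the records means it attracts all the patrols, so only area A's crime is ever recorded, and the disparity compounds year after year despite identical underlying behaviour.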
In their 2018 book, “Police: A Field Guide,” David Correia and Tyler Wall argue that these systems give law enforcement a guise of objectivity while perpetuating discrimination. They suggest it’s no surprise these predictions often label the poor as future criminals.
The first of these tools, the Offender Assessment System (OASys), was developed by the Home Office and rolled out between 2001 and 2005. It uses machine learning to assess offenders' needs and risks. Critics, including Sobanan Narenthiran of Breakthrough Social Enterprise, point out that bias can be embedded in the risk scores OASys generates: systemic problems such as biased policing lead to certain groups being flagged as "higher risk" on the basis of skewed data rather than actual risk.
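Narenthiran's point about skewed data can be illustrated with a small worked example, using hypothetical rates chosen purely for illustration: if reoffending only enters the training data when it is detected, and detection rates differ by group, identical true risk produces unequal risk scores.

```python
# A worked example of Narenthiran's claim, with hypothetical rates chosen
# for illustration: reoffending only enters the data when it is detected,
# so a group that is policed more heavily "learns" a higher risk score.
TRUE_REOFFEND_RATE = 0.30  # assume both groups reoffend at the same true rate
DETECTION_RATE = {"heavily policed": 0.80, "lightly policed": 0.40}

for group, detection in DETECTION_RATE.items():
    # what the training data records: reoffended AND the reoffence was detected
    recorded = TRUE_REOFFEND_RATE * detection
    print(f"{group}: true risk {TRUE_REOFFEND_RATE:.0%}, "
          f"score learned from records {recorded:.0%}")

# heavily policed: 24% vs lightly policed: 12%, a 2x "risk" gap
# produced entirely by policing intensity, not by behaviour
```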
OASys informs critical decisions, including bail, sentencing, and access to rehabilitation. Statewatch found that it processes thousands of assessments weekly: in a single week in January 2025, it completed more than 9,400. With over seven million risk scores now held in its database, concerns about accuracy are mounting.
Narenthiran highlighted how frustrating it can be to challenge inaccurate data. Many people are not even aware of what has been reported about them, or lack any effective way to contest it. One prisoner described feeling dehumanized by an OASys report filled with inaccuracies.
The MoJ says it checks these systems for quality and fairness, but many question whether those checks genuinely address the potential for bias. The new murder prediction tool is designed to pull together data from multiple sources, including sensitive personal information such as mental health records, to identify people who may go on to commit serious violence. Critics worry this could further entrench systemic inequalities.
Sofia Lyall of Statewatch argues that the existing systems are fundamentally flawed, coding bias into their predictions, and says significant reform is needed.
Despite these concerns, the MoJ continues to rely on OASys and is developing a new system, ARNS, set to roll out by 2026. Yet many, including Statewatch, argue that rather than investing in flawed algorithms, the government should fund welfare services that genuinely support communities.