
    UK MoJ crime prediction algorithms raise serious concerns

    By Team · April 26, 2025


    Data-based profiling tools are being used by the UK Ministry of Justice (MoJ) to algorithmically “predict” people’s risk of committing criminal offences, but pressure group Statewatch says the use of historically biased data will further entrench structural discrimination.

    Documents obtained by Statewatch via a Freedom of Information (FoI) campaign reveal the MoJ is already using one flawed algorithm to “predict” people’s risk of reoffending, and is actively developing another system to “predict” who will commit murder.

    While authorities deploying predictive policing tools say they can be used to more efficiently direct resources, critics argue that, in practice, they are used to repeatedly target poor and racialised communities, as these groups have historically been “over-policed” and are therefore over-represented in police datasets.

    This then creates a negative feedback loop, where these “so-called predictions” lead to further over-policing of certain groups and areas, thereby reinforcing and exacerbating the pre-existing discrimination as increasing amounts of data are collected.
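
    This feedback dynamic can be illustrated with a deliberately simplified simulation. The sketch below is purely hypothetical: it assumes two areas with identical underlying offence rates and a fixed patrol resource allocated in proportion to previously recorded incidents, and shows how the area that starts out more heavily recorded keeps accumulating more records.

```python
import random

# Purely illustrative simulation of the over-policing feedback loop.
# Two areas have the SAME underlying offence rate, but recorded incidents
# depend on where patrols are sent. All numbers here are invented.

TRUE_OFFENCE_RATE = 0.05   # identical in both areas
PATROLS_PER_ROUND = 100    # fixed resource to allocate each round
ROUNDS = 20

random.seed(42)
recorded = {"area_a": 60, "area_b": 40}  # area_a starts out "over-policed"

for _ in range(ROUNDS):
    total = sum(recorded.values())
    for area, count in list(recorded.items()):
        # Patrols are allocated in proportion to historical records...
        patrols = round(PATROLS_PER_ROUND * count / total)
        # ...and offences can only be recorded where patrols are looking.
        new_records = sum(random.random() < TRUE_OFFENCE_RATE for _ in range(patrols))
        recorded[area] += new_records

# The recorded disparity persists and grows in absolute terms,
# even though the true offence rates never differed.
print(recorded)
```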

    Tracing the historical proliferation of predictive policing systems in their 2018 book Police: A Field Guide, authors David Correia and Tyler Wall argue that such tools provide “seemingly objective data” for law enforcement authorities to continue engaging in discriminatory policing practices, “but in a manner that appears free from racial profiling”.

    They added that it therefore “shouldn’t be a surprise that predictive policing locates the violence of the future in the poor of the present”.

    Computer Weekly contacted the MoJ about how it is dealing with the propensity of predictive policing systems to further entrench structural discrimination, but received no response on this point.

    MoJ systems

    Known as the Offender Assessment System (OASys), the first crime prediction tool was initially developed by the Home Office over three pilot studies before being rolled out across the prison and probation system of England and Wales between 2001 and 2005.

    According to His Majesty’s Prison and Probation Service (HMPPS), OASys “identifies and classifies offending-related needs” and assesses “the risk of harm offenders pose to themselves and others”, using machine learning techniques so the system “learns” from the data inputs to adapt the way it functions.


    The risk scores generated by the algorithms are then used to make a wide range of decisions that can severely affect people’s lives. This includes decisions about their bail and sentencing, the type of prison they’ll be sent to, and whether they’ll be able to access education or rehabilitation programmes while incarcerated.

    The documents obtained by Statewatch show the OASys tool is being used to profile thousands of prisoners in England and Wales every week. In just one week, between 6 and 12 January 2025, for example, the tool was used to complete a total of 9,420 reoffending risk assessments – a rate of more than 1,300 per day.

    As of January this year, the system’s database holds over seven million risk scores setting out people’s alleged risk of reoffending, which includes completed assessments and those in progress.

    Commenting on OASys, Sobanan Narenthiran – a former prisoner and now co-CEO of Breakthrough Social Enterprise, an organisation that “supports people at risk or with experience of the criminal justice system to enter the world of technology” – told Statewatch that “structural racism and other forms of systemic bias may be coded into OASys risk scores – both directly and indirectly”.

    He further argued that information entered in OASys is likely to be “heavily influenced by systemic issues like biased policing and over-surveillance of certain communities”, noting, for example, that: “Black and other racialised individuals may be more frequently stopped, searched, arrested and charged due to structural inequalities in law enforcement. 

    “As a result, they may appear ‘higher risk’ in the system, not because of any greater actual risk, but because the data reflects these inequalities. This is a classic case of ‘garbage in, garbage out’.”

    Computer Weekly contacted the MoJ about how the department is ensuring accuracy in its decision-making, given the sheer volume of algorithmic assessments it is making every day, but received no direct response on this point.

    A spokesperson said that practitioners verify information and follow detailed scoring guidance for consistency.

    The second crime prediction tool is still in development, but the intention is to algorithmically identify those most at risk of committing murder by pulling together a wide variety of data about them from different sources, such as the probation service and the specific police forces involved in the project.

    Statewatch says the types of information processed could include names, dates of birth, gender and ethnicity, and a number that identifies people on the Police National Computer (PNC).

    Originally called the “homicide prediction project”, the initiative has since been renamed to “sharing data to improve risk assessment”, and could be used to profile convicted and non-convicted people alike.

    According to a data sharing agreement between the MoJ and Greater Manchester Police (GMP) obtained by Statewatch, for example, the types of data being shared can include the age a person had their first contact with the police, and the age they were first the victim of a crime, including for domestic violence.

    Listed under “special categories of personal data”, the agreement also envisages the sharing of “health markers which are expected to have significant predictive power”.

    This can include data related to mental health, addiction, suicide, vulnerability, self-harm and disability. Statewatch highlighted how data from people not convicted of any criminal offence will be used as part of the project.

    In both cases, Statewatch says using data from “institutionally racist” organisations like police forces and the MoJ will only work to “reinforce and magnify” the structural discrimination that underpins the UK’s criminal justice system.


    “The Ministry of Justice’s attempt to build this murder prediction system is the latest chilling and dystopian example of the government’s intent to develop so-called crime ‘prediction’ systems,” said Statewatch researcher Sofia Lyall.

    “Like other systems of its kind, it will code in bias towards racialised and low-income communities. Building an automated tool to profile people as violent criminals is deeply wrong, and using such sensitive data on mental health, addiction and disability is highly intrusive and alarming.”

    Lyall added: “Time and again, research shows that algorithmic systems for ‘predicting’ crime are inherently flawed.”

    Statewatch also noted that Black people in particular are significantly over-represented in the data held by the MoJ, as are people of all ethnicities from more deprived areas.

    Challenging inaccuracies

    According to an official 2015 evaluation of the risk scores produced by OASys, the system shows discrepancies in accuracy based on gender, age and ethnicity, with the risk scores generated being disproportionately less accurate for racialised people than for white people, and especially so for Black and mixed-race people.

    “Relative predictive validity was greater for female than male offenders, for White offenders than offenders of Asian, Black and Mixed ethnicity, and for older than younger offenders,” it said. “After controlling for differences in risk profiles, lower validity for all Black, Asian and Minority Ethnic (BME) groups (non-violent reoffending) and Black and Mixed ethnicity offenders (violent reoffending) was the greatest concern.”
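
    “Relative predictive validity” here refers to how well the risk scores separate people who went on to reoffend from those who did not, within each demographic group; this is commonly summarised with a discrimination measure such as the area under the ROC curve (AUC). The sketch below shows how such a subgroup comparison might be computed in principle; the toy data, column names and use of scikit-learn are illustrative assumptions, not the evaluation’s actual methodology.

```python
import pandas as pd
from sklearn.metrics import roc_auc_score

# Illustrative only: a toy table of risk scores and observed outcomes.
# Column names and values are invented; the official OASys evaluation
# used its own data and methodology.
df = pd.DataFrame({
    "risk_score": [0.8, 0.3, 0.6, 0.9, 0.2, 0.7, 0.4, 0.5, 0.85, 0.35],
    "reoffended": [1,   0,   1,   1,   0,   0,   1,   0,   1,    0],
    "ethnicity":  ["White", "White", "Black", "White", "Black",
                   "Black", "Mixed", "Mixed", "White", "Mixed"],
})

# Compute discrimination (AUC) separately for each group: a lower AUC for a
# group means the scores are less able to distinguish people who reoffended
# from those who did not within that group.
for group, sub in df.groupby("ethnicity"):
    auc = roc_auc_score(sub["reoffended"], sub["risk_score"])
    print(f"{group}: AUC = {auc:.2f}")
```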

    A number of prisoners affected by the OASys algorithm have also told Statewatch about the impacts of biased or inaccurate data. Several minoritised ethnic prisoners, for example, said their assessors entered a discriminatory and false “gangs” label in their OASys reports without evidence, a decision they say was based on racist assumptions.

    Speaking with a researcher from the University of Birmingham about the impact of inaccurate data in OASys, another man serving a life sentence likened it to “a small snowball running downhill”.

    The prisoner said: “Each turn it picks up more and more snow (inaccurate entries) until eventually you are left with this massive snowball which bears no semblance to the original small ball of snow. In other words, I no longer exist. I have become a construct of their imagination. It is the ultimate act of dehumanisation.”

    Narenthiran also described how, despite known issues with the system’s accuracy, it is difficult to challenge any incorrect data contained in OASys reports: “To do this, I needed to modify information recorded in an OASys assessment, and it’s a frustrating and often opaque process.

    “In many cases, individuals are either unaware of what’s been written about them or are not given meaningful opportunities to review and respond to the assessment before it’s finalised. Even when concerns are raised, they’re frequently dismissed or ignored unless there is strong legal advocacy involved.”

    MoJ responds

    While the murder prediction tool is still in development, Computer Weekly contacted the MoJ for further information about both systems – including what means of redress the department envisages people being able to use to challenge decisions made about them when, for example, information is inaccurate.

    A spokesperson for the department said that continuous improvement, research and validation ensure the integrity and quality of these tools, and that ethical implications such as fairness and potential data bias are considered whenever new tools or research projects are developed.

    They added that neither the murder prediction tool nor OASys uses ethnicity as a direct predictor, and that if individuals are not satisfied with the outcome of a formal complaint to HMPPS, they can write to the Prison and Probation Ombudsman.

    Regarding OASys, they added that the system is made up of five risk predictor tools, which are revalidated to ensure they continue to predict reoffending risk effectively.

    Commenting on the murder prediction tool specifically, the MoJ said: “This project is being conducted for research purposes only. It has been designed using existing data held by HM Prison and Probation Service and police forces on convicted offenders to help us better understand the risk of people on probation going on to commit serious violence. A report will be published in due course.”

    It added the project aims to improve risk assessment of serious crime and keep the public safe through better analysis of existing crime and risk assessment data, and that while a specific predictive tool will not be developed for operational use, the findings of the project may inform future work on other tools.

    The MoJ also insisted that only data about people with at least one criminal conviction has been used so far.

    New digital tools

    Despite serious concerns around the system, the MoJ continues to use OASys assessments across the prison and probation services. In response to Statewatch’s FoI campaign, the MoJ confirmed that “the HMPPS Assess Risks, Needs and Strengths (ARNS) project is developing a new digital tool to replace the OASys tool”.

    An early prototype of the new system has been in the pilot phase since December 2024, “with a view to a national roll-out in 2026”. ARNS is “being built in-house by a team from [Ministry of] Justice Digital who are liaising with Capita, who currently provide technical support for OASys”.

    The government has also launched an “independent sentencing review” looking at how to “harness new technology to manage offenders outside prison”, including the use of “predictive” and profiling risk assessment tools, as well as electronic tagging.

    Statewatch has also called for a halt to the development of the murder prediction tool.

    “Instead of throwing money towards developing dodgy and racist AI and algorithms, the government must invest in genuinely supportive welfare services. Making welfare cuts while investing in techno-solutionist ‘quick fixes’ will only further undermine people’s safety and well-being,” said Lyall.
