PULSE POINTS:
❓ What Happened: An algorithm is being developed to predict which criminals might later commit murder.
👥 Who’s Involved: Britain’s Ministry of Justice, Greater Manchester Police and other British police forces, and civil liberties group Statewatch.
📍 Where & When: The United Kingdom; the project started in January 2023, is believed to have been completed in December 2024, and has yet to be deployed.
💬 Key Quote: “Time and again, research shows that algorithmic systems for ‘predicting’ crime are inherently flawed,” noted Sofia Lyall from Statewatch.
⚠️ Impact: Concerns about bias and the ethical implications of applying such predictive models to vast datasets, including data from communities already facing structural discrimination.
IN FULL:
The British Ministry of Justice is pressing ahead with an initiative to create an algorithmic tool aimed at predicting which individuals convicted of crimes might go on to commit homicide. Known internally as the Homicide Prediction Project, the undertaking came to light through Freedom of Information requests filed by the civil liberties group Statewatch, which has flagged the project as concerning.
The project builds on risk-prediction systems already in place, such as the Offender Assessment System (OASys), which has been used since 2001 to forecast recidivism and inform legal decisions. However, the breadth of data feeding the new model has raised red flags: sourced from various police and government bodies, it potentially includes information on up to half a million people, some of whom have no criminal history.
Despite officials’ assertions that the project remains in a research phase, the uncovered documents allude to future deployment. Sources report growing collaboration across government agencies and police forces, including Greater Manchester Police and the Metropolitan Police, to expand the dataset driving these predictions.
Statewatch has raised ethical concerns about the predictive model’s potential for systemic bias. The British state has previously attempted to introduce explicitly two-tier guidelines that would have seen ethnic minorities prioritized over white men for bail.
Statewatch’s Sofia Lyall described the algorithm project as “chilling and dystopian,” calling for an immediate halt to its development. “Time and again, research shows that algorithmic systems for ‘predicting’ crime are inherently flawed,” she said, highlighting the risk that such algorithms profile people as potential criminals before any crime has been committed.