The impact of algorithms in criminal sentencing on due process rights

  • Caoimhe Anderson

Student thesis: Doctoral Thesis › JD (Juris Doctor)

Abstract

There is a growing trend towards the adoption of algorithms and automated data processing systems in areas of life that were previously entirely within the remit of human beings. The criminal justice system, and criminal sentencing in particular, is no exception. The use of algorithmic tools in criminal sentencing has various implications for due process rights because of their opacity and their reliance on particular input variables.

Currently, the majority of academic literature focuses on the use of algorithms in the criminal justice system more broadly, but largely ignores their use in criminal sentencing. This project takes a socio-legal approach and applies a human rights lens to analyse the impact of algorithms in criminal sentencing on due process rights. It addresses four sub-questions: How are algorithms and automated data processing systems used in the context of criminal sentencing? How does the use of algorithms in criminal sentencing affect due process rights? What safeguards are available to those subject to automated data processing at criminal sentencing, and how effective are they? What amendments, if any, could be made to enhance the protection of due process rights from algorithmic technology?

To date, courts have demonstrated little restraint regarding the use of algorithmic tools in criminal sentencing, despite the limited understanding of how these tools work and the fact that they can produce biased and discriminatory outputs. The use of algorithms in criminal sentencing can have serious implications for due process rights, including the right to review and verify the sentence, the right to an individualised sentence, and the right to equality before and under the law. The current safeguards, in the form of data protection legislation, do not effectively protect the due process rights of defendants who are subject to algorithmic tools at criminal sentencing. This project recommends that various safeguards be put in place should other countries wish to follow the trend of adopting algorithms at criminal sentencing. These safeguards include the adoption of a human rights framework, expert testimony, regular audits, enhanced technological transparency through open-source code, and mandatory judicial training on assessing algorithmic evidence.
Date of Award: Dec 2019
Original language: English
Awarding Institution
  • Queen's University Belfast
Supervisors: Daithi Mac Sithigh (Supervisor) & Cheryl Lawther (Supervisor)
