Would you put your country’s criminal justice system in the hands of AI?

By Shira Jeczmien

Updated May 16, 2020 at 11:27 AM

Reading time: 2 minutes

When it comes to determining the course of someone’s life, machine learning isn’t the first tool that comes to mind. So I would like to think, at least. The U.S., the country with the highest number of incarcerated people, is leaning more and more on AI to relieve its civil systems of the sheer volume of admin that comes with 1 in 38 adults being under some form of correctional supervision. To be exact, 2.2 million adults were behind bars in 2016, with a further 4.5 million under community supervision such as probation or parole. But when a nation historically ridden with racial inequality and mass incarceration of people of colour turns to machine learning to make its most human and moral decisions, further injustice is only to be expected.

Over the past few years, since these tools were introduced into the justice system, numerous prison reform and criminal justice advocates have protested the use of AI in courtrooms to, supposedly, help judges make fair decisions about the reshuffling of incarcerated people across state or federal prisons, or to identify criminals through facial recognition. However, these tools, though claimed to be unbiased, have been shown time and time again to disproportionately pick out people of colour and, what’s more, to be wildly inaccurate, “even mistaking members of Congress for convicted criminals,” as reported by MIT Technology Review’s Karen Hao.

But while this type of AI ‘aid’ is perpetuating injustice and racial bias on a criminal scale, facial recognition tools are, sadly, not the worst of it. Blandly titled ‘criminal risk assessment algorithms’, these machine learning tools are used by law enforcement when a suspect is first arrested. What such an algorithm does is digest the defendant’s profile—information such as background, family history, ethnicity, geolocation and so forth—and just as quickly regurgitate what’s called a ‘recidivism score’: a number estimating how likely the defendant is to commit another crime in the future.
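The vendors’ actual models are proprietary, but the mechanics described above can be sketched in a few lines. The following is a purely illustrative toy, not any real system: the feature names, weights and records are all invented. It shows how a score trained on decades of arrest data can penalise a defendant through proxies like prior arrests and neighbourhood arrest rates, without ever needing an explicit ethnicity field.

```python
import math

def recidivism_score(profile, weights, bias=0.0):
    """Toy 'risk score': weighted sum of profile features squashed to [0, 1]."""
    z = bias + sum(weights.get(k, 0.0) * v for k, v in profile.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic function

# Hypothetical weights 'learned' from decades of historical arrest records.
# Because policing was concentrated in certain neighbourhoods, features like
# prior_arrests and neighbourhood_arrest_rate act as proxies for race.
weights = {
    "prior_arrests": 0.8,
    "neighbourhood_arrest_rate": 1.2,
    "age_under_25": 0.5,
}

# Two defendants arrested for the same offence; only their histories differ.
defendant_a = {"prior_arrests": 0, "neighbourhood_arrest_rate": 0.2, "age_under_25": 1}
defendant_b = {"prior_arrests": 2, "neighbourhood_arrest_rate": 0.9, "age_under_25": 1}

print(round(recidivism_score(defendant_a, weights), 2))  # lower score
print(round(recidivism_score(defendant_b, weights), 2))  # higher score
```

The same offence yields very different scores purely because of where the defendants live and how often they have been arrested before, which is exactly how historical over-policing gets laundered into a ‘neutral’ number.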

What’s done with the score is exactly what you’d imagine. Judges use it to determine what type of correctional facility the defendant will be sent to, how severe their sentence will be, and whether they will await trial behind bars, be released on bail, or face an extortionate bail fee. The idea behind the tool is that it both reduces or eliminates bias in judges and helps predict criminal behaviour, thus matching defendants with the right services and facilities for rehabilitation. The only issue is that the algorithm has been built on millions of racially biased and often inaccurate files, spanning decades, from a country that has long been throwing people of colour into prison for minor offences or outright wrongful convictions.

The real problem with this algorithm is that it tries to solve a rotten system of injustice and mass incarceration by tackling only the tip of it, adding fresh ingredients to the toxic rot already at the bottom. These AI tools are anchored in an administrative foundation: one that seeks to relieve federal and state justice workers of the sheer mass of admin associated with a country that refuses to reform its justice system.

“We are not risks, we are needs,” said Marbre Stahly-Butts, executive director of Law for Black Lives, “a national network of radical lawyers, law students, and legal workers of color committed to building the power of the Black Lives Matter movement,” as described on its website. Marching straight ahead with such AI tools is not only dangerous but utterly irresponsible. The U.S. in particular, but other countries too, cannot rely on data gathered over the past 30 or 50 years to build supposedly unbiased algorithms, because neither in history nor today have we ever been unbiased, especially when it comes to criminal justice.
