
Questions Loom About Role of Big Data and Artificial Intelligence in the Criminal Justice System


The use of artificial intelligence in the criminal justice system has moved past the realm of fanciful science fiction and into everyday use, raising a host of thorny ethical issues that society is only beginning to work through.

Franz Borghardt, a Baton Rouge criminal defense lawyer who is an LSU Law School instructor, led a discussion about the implications of utilizing AI for decision making in the criminal justice system during a recent Tech Park Academy workshop at the Louisiana Technology Park.

The discussion comes in the wake of a media report that New Orleans police secretly used a predictive policing tool from Silicon Valley data-mining company Palantir to help identify gang members involved in drug trafficking and violent crimes. The software reportedly predicted the chance that people would commit a violent crime or become a victim of one by tracking their ties to known gang members, analyzing criminal histories and examining social media habits.
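The report does not describe how Palantir's tool actually works, and none of its features or weights are public. But the general shape of such a system can be sketched in a few lines: a handful of per-person signals are combined into a single risk score. Every feature name, weight and number below is an illustrative assumption, not a description of any real product.

```python
# A minimal, hypothetical sketch of how a predictive policing tool might
# fold individual-level signals into one risk score. All weights and
# features here are invented for illustration only.

from dataclasses import dataclass

@dataclass
class PersonRecord:
    known_gang_associates: int   # documented ties to gang members
    prior_violent_arrests: int   # entries pulled from criminal history
    social_media_flags: int      # posts flagged by keyword analysis

def risk_score(p: PersonRecord) -> float:
    """Combine the features into a rough 0-to-1 score using assumed weights."""
    raw = (0.5 * p.known_gang_associates
           + 0.3 * p.prior_violent_arrests
           + 0.2 * p.social_media_flags)
    return min(raw / 10.0, 1.0)  # crude normalization, purely illustrative

example = PersonRecord(known_gang_associates=3,
                       prior_violent_arrests=1,
                       social_media_flags=4)
print(f"Estimated risk: {risk_score(example):.2f}")  # prints 0.26
```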

Beyond crime forecasting, proponents say AI offers the possibility of overcoming problems of balance, fairness and bias by removing the emotional and political pressures humans often face when making important decisions. Borghardt, though, says it’s far from clear whether this is a good thing. “I don’t know how I feel about certain decisions being taken away from human beings and given to software programs,” he says.

Here are the key takeaways from the Tech Park Academy presentation.

‘AI Is Not Magic’

Also participating in the discussion was Josh Parnell of software firm Procedural Reality, a Baton Rouge game developer that uses complex algorithms to create vast virtual landscapes. Although so-called AI programs are capable of processing large data sets and even of learning on their own, Parnell argues that the capability of this software is often exaggerated in popular media. Truly sentient programs capable of supplanting humans in making complex decisions are far down the road. “AI is not magic,” Parnell says. “It’s a bunch of math. It’s not conscious either.”

Still, as in New Orleans, law enforcement and other government agencies around the world are adopting increasingly sophisticated software systems designed to assist in several areas of the criminal justice system. “We know nationally that courts are using these mechanisms,” Borghardt says. “Law enforcement agencies are using them.”

Weighing Different Perspectives

Borghardt says these types of data-mining and artificial intelligence tools are most commonly used in pretrial decision making, often to assess the risk of releasing an accused person from jail on bond or to set the amount of bail required for release.

For example, a judge can rely on a data tool that predicts how likely a suspect is to commit another crime, to skip out on the trial or to use a dangerous substance while out on bail. More controversial applications include using similar data tools to recommend sentences for convicted criminals, although judges typically have the final say in these cases, Borghardt says.
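Borghardt did not describe the internals of any specific tool, but the kind of pretrial prediction he mentions is often framed as a separate probability for each adverse outcome. The sketch below assumes a simple logistic model; the features and coefficients are made up for illustration, whereas real risk-assessment instruments are built from validated statistical studies rather than hand-picked numbers like these.

```python
# A hedged sketch of a pretrial risk assessment: given a few assumed facts
# about a defendant, estimate the probability of each outcome a judge might
# weigh when setting bail. Features and coefficients are illustrative only.

import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical coefficients, one set per outcome the tool is asked to predict.
OUTCOMES = {
    "new_offense":       {"intercept": -2.0, "prior_arrests": 0.4, "age_under_25": 0.6, "prior_fta": 0.3},
    "failure_to_appear": {"intercept": -1.5, "prior_arrests": 0.1, "age_under_25": 0.2, "prior_fta": 1.2},
    "substance_use":     {"intercept": -1.8, "prior_arrests": 0.2, "age_under_25": 0.3, "prior_fta": 0.1},
}

def assess(defendant: dict) -> dict:
    """Return an estimated probability for each adverse outcome."""
    results = {}
    for outcome, coefs in OUTCOMES.items():
        z = coefs["intercept"] + sum(coefs[f] * defendant[f] for f in defendant)
        results[outcome] = round(sigmoid(z), 2)
    return results

# A defendant with two prior arrests, under 25, no prior failures to appear.
print(assess({"prior_arrests": 2, "age_under_25": 1, "prior_fta": 0}))
# {'new_offense': 0.35, 'failure_to_appear': 0.25, 'substance_use': 0.25}
```

In practice a judge would see something like the three probabilities printed above, one per outcome, rather than the underlying math.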

These types of predictive tools are attractive, Borghardt says, because they can help municipalities stem the rising costs of incarcerating people awaiting a court date. But they also raise questions about accountability, transparency and bias, he says, noting that a hyper-accurate data-mining tool that relies on information gathered by biased humans will itself be far from impartial.
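That last point can be shown with a toy simulation: if the records a tool learns from are arrests, and one neighborhood is policed more heavily than another, the data will show a large gap in apparent risk even when the underlying behavior is identical. Everything in the example below is invented for the sake of the illustration.

```python
# Toy demonstration of how biased enforcement produces biased training data.
import random
random.seed(0)

def simulate(n=10_000):
    """Simulate arrest records where true offending is identical across neighborhoods."""
    rows = []
    for _ in range(n):
        neighborhood = random.choice(["A", "B"])
        offended = random.random() < 0.10                    # same 10% rate everywhere
        detection = 0.60 if neighborhood == "A" else 0.20    # A is policed more heavily
        arrested = offended and random.random() < detection
        rows.append((neighborhood, arrested))
    return rows

data = simulate()
for hood in ("A", "B"):
    arrests = [was_arrested for nbhd, was_arrested in data if nbhd == hood]
    print(f"Neighborhood {hood}: apparent risk from arrest data = {sum(arrests) / len(arrests):.1%}")

# Roughly 6% for A and 2% for B: a model trained on these records would score
# residents of A as about three times riskier, even though offending was
# identical by construction.
```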

Borghardt argues that when weighing the effectiveness and impartiality of any AI tool for criminal justice, the motivations and needs of multiple parties must be taken into account for the system to work fairly. Those stakeholders include judges, prosecutors, jurors, defendants, law enforcement officials and attorneys.

“There are all these different stakeholders that are going to have different interests in play,” he says. “Those points of view are important to remember when we’re talking about integrating in artificial intelligence.”

On the Horizon

Perhaps the most provocative application of artificial intelligence in criminal justice is the idea of training it to serve as a jury and render binding decisions in criminal cases.

Borghardt says any system that would allow a judgment to be determined by a machine or software program would be dependent upon the consent of a defendant — similar to the way someone facing a criminal charge can waive the right to a jury trial and have the case decided by a judge. “Otherwise you’ll be trampling on a major state and federal constitutional right,” he says.

Another necessary condition, Borghardt says, would be a technology that is simultaneously transparent and unhackable. Transparency is key, he argues, because defendants are entitled to due process under the law — but that openness makes a system inherently less secure. “If you make something transparent, it invites hacking,” he says. “It creates a cybersecurity risk.”

Borghardt says the idea of a computer jury that is truly fair and impartial is appealing in theory, but that reality is likely far off and would require wholesale adjustments to the laws governing the criminal justice system.

“Can you train your toy to decide truth, credibility and, most importantly, truth beyond a reasonable doubt?” he says. “It’s a very complicated thing. I don’t know how close we are to doing that, but I imagine it’s theoretically possible.”

Tech Park Academy is the Louisiana Technology Park’s skill-based workshop series designed to provide entrepreneurs with the training and resources required to move their businesses forward and to address the critical business and organizational issues they face daily. The next event will take place on April 5.

Stephen Loy