Broken Tools: Vincent Southerland illuminates the problem of bias in the criminal legal system's use of algorithmic tools


Assistant Professor of Clinical Law Vincent Southerland recalls that as a staff attorney with the Bronx Defenders early in his legal career, he often dealt with risk assessment instruments, algorithm-based tools employed by the court to help determine whether his clients should be released pretrial. Southerland used the assessments to argue on behalf of his clients when they were favorable, and when they weren't, he'd suggest potential problems with the assessment calculation. With all the other elements at work in the courtroom, he says, he didn't think deeply about the broader role that risk assessment instruments played.

But Southerland, who teaches the Criminal Defense and Reentry Clinic, has since come to recognize that the very use of those instruments has an outsize influence on criminal justice. "What I also realized," he says, "is that the tentacles of the algorithmic ecosystem reach into all these other stages of the criminal system: everything from policing all the way through to sentencing, parole, probation, supervision, reentry, where you're classified when you're incarcerated. These tools are ubiquitous across the system, and I feel like they're just humming along without much of a challenge to them."

In his article "The Intersection of Race and Algorithmic Tools in the Criminal Legal System," published last fall in the Maryland Law Review, Southerland takes a hard look at such tools and offers multiple reasons to question them. Arguing that the criminal legal system is plagued by racism and inequity, he writes that to transform the system, advocates need to adopt a lens centered on racial justice to inform technology-based efforts, rather than simply layering tools onto the system in its current state. While algorithmic tools are often characterized as helping to eradicate bias in decision-making, Southerland asserts that they are infected by inevitable systemic bias.

The article begins with an overview of algorithmic tools across the criminal legal system, focusing on predictive tools. They are used by police to forecast where criminal activity is likely to occur; they are used by courts to determine risk of rearrest and failure to appear when setting bail, and to render sentencing decisions.

The evidence, Southerland writes, casts doubt on the efficacy of this algorithmic approach. In a 2016 study, the Human Rights Data Analysis Group (HRDAG) examined the algorithm behind the predictive policing software PredPol. Inputting crime data from Oakland, California, to predict potential drug crime, HRDAG found that the algorithm suggested targeting low-income neighborhoods of color, despite concurrent evidence from public health data that drug use is more evenly dispersed throughout the city. HRDAG argued that when informed by discriminatory data, the algorithm will work to encourage similarly discriminatory police behavior.

Southerland points to existing scholarship indicating that this data is not merely something that police use; they create it as well, meaning that bias reflected in past police activity is embedded in the statistics that algorithmic tools utilize. Thus, writes Southerland, police decision-making plays an outsized role in shaping our perceptions of crime and criminal behavior. The data contain other flaws, too, he suggests. For example, arrest statistics do not indicate how an arrest is ultimately resolved, including dismissal of charges: "What is reflected and read in the data is a community that appears to be dramatically more dangerous than it actually is," he writes.

Such initial distortions, he argues, can be self-perpetuating: increased law enforcement in an area, deployed on the basis of previous patterns of policing, produces more arrests (as does the mere presence of police), and those arrests in turn invite even greater targeting by law enforcement.
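That feedback dynamic can be made concrete with a toy simulation. The sketch below is purely illustrative; the neighborhoods, numbers, and allocation rule are invented, not drawn from the article or the HRDAG study. Two neighborhoods have identical underlying offense rates, but patrols are allocated in proportion to previously recorded arrests, so an initial disparity in the data reproduces itself year after year.

```python
import random

random.seed(0)

# Hypothetical setup: both neighborhoods have the SAME true rate at which
# a patrol encounters and records an offense; only the historical arrest
# counts differ.
TRUE_RATE = 0.3          # per-patrol chance of recording an arrest (invented)
arrests = [60, 40]       # invented initial arrest counts per neighborhood

for year in range(1, 11):
    total = sum(arrests)
    # Allocate 100 patrols in proportion to past recorded arrests --
    # the data-driven deployment rule described above.
    patrols = [round(100 * a / total) for a in arrests]
    for hood in (0, 1):
        # More patrols mechanically yield more recorded arrests, even though
        # underlying behavior is identical in both neighborhoods.
        arrests[hood] += sum(
            1 for _ in range(patrols[hood]) if random.random() < TRUE_RATE
        )
    print(f"year {year:2d}: patrols={patrols} cumulative arrests={arrests}")

# The 60/40 disparity in the data persists (and can drift further apart)
# even though the true offense rates are equal: the record validates the
# deployment, and the deployment regenerates the record.
```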

Algorithmically based pretrial risk assessments used in bail decisions, such as those Southerland encountered as a Bronx Defenders attorney, vary by jurisdiction and are created by a variety of entities. Many use data about prior convictions and pending charges. The factors used to compute a risk score, and how they are weighted, are not always revealed, and most tools produce a single score encompassing the risk of both rearrest and failure to appear, even though the two risks are distinct from each other.
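The objection to a single blended score can be seen in a small example. This sketch is hypothetical: the 50/50 weighting and the probabilities are invented, and real tools vary by jurisdiction and often do not disclose their weights. Two defendants with opposite risk profiles receive identical scores, even though sensible responses to their situations differ.

```python
from dataclasses import dataclass

@dataclass
class Defendant:
    name: str
    p_rearrest: float  # estimated probability of rearrest (invented values)
    p_fta: float       # estimated probability of failure to appear

def combined_score(d: Defendant) -> float:
    # A blended number of the kind many tools report; the equal weights
    # here are an assumption for illustration.
    return 0.5 * d.p_rearrest + 0.5 * d.p_fta

a = Defendant("A", p_rearrest=0.40, p_fta=0.10)  # rearrest risk dominates
b = Defendant("B", p_rearrest=0.10, p_fta=0.40)  # missed-court-date risk dominates

for d in (a, b):
    print(d.name, combined_score(d))   # both print 0.25

# Identical scores, distinct problems: B's risk might be met with court-date
# reminders or transportation help, while a single score invites the same
# response (for example, detention) for both defendants.
```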

Southerland also critiques the algorithmic tools applied to sentencing decisions. The tools, which calculate recidivism risk, typically utilize four categories of risk factors: criminal history, antisocial attitude, demographics, and socioeconomic status. Such actuarial risk assessments "operate as a form of digital profiling, prescribing the treatment of an individual based on their similarity to, or membership in, a group," he notes. He cites a recent study that found Virginia's use of nonviolent risk assessment tools did not reduce incarceration, recidivism, or racial disparities; at the same time, it disadvantaged young defendants.
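A toy version of such a score shows what "digital profiling" means in practice. The weights and factor names below are invented for illustration; real instruments use many more items, and their weights are often undisclosed. Two people with identical criminal histories receive different scores solely because of group-level attributes such as age and employment.

```python
# Invented weights over the four factor categories named above.
WEIGHTS = {
    "prior_convictions": 2.0,    # criminal history
    "antisocial_attitude": 1.5,  # attitude assessment
    "age_under_25": 1.0,         # demographic attribute
    "unemployed": 1.0,           # socioeconomic attribute
}

def risk_score(features: dict) -> float:
    # Linear weighted sum, the typical form of an actuarial instrument.
    return sum(WEIGHTS[name] * value for name, value in features.items())

history = {"prior_convictions": 1, "antisocial_attitude": 0}
young_unemployed = {**history, "age_under_25": 1, "unemployed": 1}
older_employed = {**history, "age_under_25": 0, "unemployed": 0}

print(risk_score(young_unemployed))  # 4.0
print(risk_score(older_employed))    # 2.0

# Same conduct, double the score: the gap comes entirely from group
# membership, the kind of mechanism behind the Virginia finding that
# such tools disadvantaged young defendants.
```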

The immediate abolition of algorithmic tools in the criminal legal system is unlikely, Southerland acknowledges, but he sees an opportunity to use them, through a racial justice lens, to shape the system for the better. Algorithmic data sets could be adjusted to account for racially disparate impacts in policing and other areas. Applying a public health analysis, "hot spots" could attract support and investment rather than increased policing. Algorithmic tool vendors and users could be required to eliminate the discriminatory impact of their tools; algorithmic impact assessments, modeled on environmental impact assessments, could be required as well; and algorithmic tools could be used to detect bias in the decision-making of those who run the system.
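One concrete form the proposed impact assessments could take is a pre-deployment audit of a tool's outputs across groups. The sketch below is only an assumption about what such an audit might check; the metric, threshold, and data are invented, not prescribed by the article. It compares high-risk classification rates by group and flags large gaps for review.

```python
from collections import defaultdict

def high_risk_rates(records):
    """records: iterable of (group, flagged_high_risk) pairs."""
    flagged = defaultdict(int)
    total = defaultdict(int)
    for group, is_flagged in records:
        total[group] += 1
        flagged[group] += int(is_flagged)
    return {g: flagged[g] / total[g] for g in total}

# Invented audit data: which defendants the tool labeled "high risk."
audit = (
    [("group_a", True)] * 30 + [("group_a", False)] * 70
    + [("group_b", True)] * 55 + [("group_b", False)] * 45
)

rates = high_risk_rates(audit)
gap = abs(rates["group_a"] - rates["group_b"])
print(rates)               # {'group_a': 0.3, 'group_b': 0.55}
print(f"gap = {gap:.2f}")  # 0.25

# Under a disparate-impact requirement of the kind described above, a
# 25-point gap in high-risk labeling would trigger review before the
# tool could be deployed or renewed.
```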

Southerland stresses that it is not the tools themselves, but how they are crafted and used, that matters. "These tools reflect back to us the world that we live in. If we are honest about it, what we see in that reflection is a criminal legal system riddled with racism and injustice," he writes. "A racial justice lens helps us to understand that and demands that we adjust our responses to what we see to create the type of world that we want to inhabit."

Posted March 14, 2022
