In the hands of court or correctional officers, risk and/or needs assessment tools can have a significant impact on the lives of those in the criminal justice system. But whether the impact—such as a harsher or more lenient sentence—was intended or not can sometimes come down to whether an assessment tool was properly used.

For some states, ensuring these instruments are appropriately deployed takes minimal effort. But for others, it can mean extensive training, and constant monitoring, of the court and correctional officers tasked with collecting, interpreting and managing the data required by assessment tools. The difference comes down not only to a state's preferences, but also to the resources it has at its disposal and the demands of the particular risk assessment tool in use.

Most of the assessment instruments Legaltech News found in use across the U.S., save for the pretrial Public Safety Assessment (PSA), require interviews with a defendant or convicted offender.

But at least one tool—the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS)—has built-in "controls" that flag instances where an interviewee is potentially being untruthful. "There are two built-in schemes in COMPAS to aid in detecting reliability issues," notes Chris Kamin, interim general manager at Equivant, creator of the tool. "First, there are several [questions] that are flagged if responded to in a certain way. A hypothetical example might be someone responding, 'Strongly agree' to the statement 'I can't stand any type of delicious food.' … Second, there are many item pairs whose responses should be consistent."
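To make that concrete, here is a rough Python sketch of how such validity checks might work in principle: one pass flags individual answers that are implausible on their face, and another flags item pairs whose answers should line up. The item names, pairings and thresholds below are invented for illustration and are not drawn from COMPAS itself.

```python
# Hypothetical illustration of the two kinds of validity checks described above.
# Items, flagged responses and pairings are invented; they are not COMPAS items.

SCALE = {"strongly_disagree": 1, "disagree": 2, "agree": 3, "strongly_agree": 4}

# Check 1: single items where a particular answer is implausible on its face.
IMPLAUSIBLE_ANSWERS = {
    "cant_stand_delicious_food": "strongly_agree",
}

# Check 2: item pairs whose answers should point in the same direction
# (here, agreeing strongly with both would be contradictory).
CONSISTENT_PAIRS = [
    ("friends_often_in_trouble", "peers_never_break_rules"),
]

def validity_flags(responses: dict) -> list:
    """Return human-readable flags for a set of interview responses."""
    flags = []
    for item, answer in IMPLAUSIBLE_ANSWERS.items():
        if responses.get(item) == answer:
            flags.append(f"Implausible response on '{item}'")
    for a, b in CONSISTENT_PAIRS:
        if a in responses and b in responses:
            # Endorsing both statements is inconsistent, since one is the
            # reverse of the other.
            if SCALE[responses[a]] >= 3 and SCALE[responses[b]] >= 3:
                flags.append(f"Inconsistent pair: '{a}' vs. '{b}'")
    return flags

print(validity_flags({
    "cant_stand_delicious_food": "strongly_agree",
    "friends_often_in_trouble": "agree",
    "peers_never_break_rules": "strongly_agree",
}))
```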

Still, jurisdictions rely on more than just their tools to uncover inaccuracies. Most, even those using COMPAS, check interviewees' answers against police, correctional and court records, with some undertaking more comprehensive investigations.

Javed Syed, director of Dallas County Community Supervision and Corrections, says that where possible and applicable, his department will check "official records, police reports, prior probation notes, incarceration notes, therapy/education class progress notes, clinical assessments and recommendations [and contact] collateral sources such as family, neighbors, employers and the victims." Texas uses the Ohio Risk Assessment System (ORAS) for pretrial hearings, and probation and parole services.

To be sure, many jurisdictions use multiple criminal databases to collect and verify data for their assessment tools. Robert Reburn, public information officer at Tennessee Department of Correction, notes that for the department's Static Risk and Offender Needs Guide—Revised (STRONG-R) instrument, the state uses the "National Crime Information Center (NCIC) to check arrest records, as well as conviction records from the counties/districts the offenders have convictions in."

Some states will ensure the integrity of their own criminal and court databases by deploying certain IT controls. Maine Pretrial Services executive director Elizabeth Simoni notes the state implemented a three-tier access management system for its criminal records. Maine Pretrial Services, which enters data into the state's Virginia Pretrial Risk Assessment Instrument (VPRAI), has midtier access, meaning it can see, but not edit, criminal records.

For its PSA deployment, Utah's Administrative Office of the Courts (AOC) went a step further—it took (most) manual data entry out of the equation. Geoff Fattah, the state AOC's communications director, explains that defendants in Utah's court systems are assigned a unique state identification number by law enforcement, which is tied to their fingerprints. A computer uses that number to pull criminal history data from state and national databases, and the system then auto-calculates a PSA score.
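The sketch below shows, in simplified Python, the general shape of that kind of pipeline: a fingerprint-linked identification number keys a lookup of criminal-history records, and a score is calculated from the result without manual data entry. The record fields, the lookup and the point values are hypothetical and do not reproduce the actual PSA scoring rules or Utah's systems.

```python
from dataclasses import dataclass

@dataclass
class CriminalHistory:
    prior_convictions: int
    prior_failures_to_appear: int
    pending_charge: bool

def fetch_history(state_id: str) -> CriminalHistory:
    """Stand-in for a lookup against state and national repositories keyed to a
    fingerprint-linked state identification number."""
    fake_records = {"UT-0001234": CriminalHistory(2, 1, True)}  # invented record
    return fake_records[state_id]

def auto_score(history: CriminalHistory) -> int:
    """Turn a pulled criminal-history record into a single illustrative score."""
    score = 0
    score += min(history.prior_convictions, 3)      # cap the weight of priors
    score += 2 * history.prior_failures_to_appear   # weight failures to appear more
    score += 1 if history.pending_charge else 0
    return score

print(auto_score(fetch_history("UT-0001234")))  # -> 5
```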

Utah's automation, however, has its limits. Keisa Williams, associate general counsel at Utah AOC, notes that "99% of the time" NCIC data cannot be processed by their electronic systems. "Right now, when we get an NCIC hit back that is unintelligible, we do not calculate a PSA. As a result, we are unable to provide judges with a PSA in approximately 30% of cases on any given week statewide."

Still, she adds the state is currently "working on adjusting the system to send PSAs with an NCIC hit to a queue that humans will have to monitor, review the hits manually, and recalculate a PSA."

There's a reason NCIC data needs more manual attention—after all, it's made up of information from local, state and federal entities, all of which have their own laws and standards.

Tara Blair, executive officer for the Kentucky Administrative Office of the Courts, which also uses the PSA, explains that what constitutes a felony or misdemeanor can vary from state to state. What's more, "sometimes there will be a charge on an NCIC report that doesn't have a disposition. You don't know if [people] were found guilty or [had their charges] dismissed, so you have to actually investigate that."

But those investigations can be difficult. Blair says that in 2015, Kentucky "launched a quality control database where supervisors were reviewing work and keeping up with mistakes employees were making. … What we found was there was a lot of mistakes happening."

Because of the time and effort it took to go through such records, Kentucky created a "risk assessment specialists" unit consisting of officials specifically trained to decipher NCIC data.

"After we initiated the new unit, the error rate did improve—they were only 60% accurate… [then] it jumped up into the 90s," Blair says.


Training Day(s)

To ensure court and correction officials know how to properly use an assessment tool, many states will also implement broad training programs, including those provided by assessment tool developers themselves.

New York's county probation departments, for instance, have Equivant train staff on COMPAS, while Missouri's Department of Corrections turns to University of Cincinnati trainers to certify its officials on the ORAS. Some states will use vendor training in a more limited fashion, such as having developers train a select group of officials who go on to become the ongoing trainers for the rest of the staff.

Outside consultants are also brought into some jurisdictions to spearhead training programs. New Jersey, for instance, turned to pretrial justice consulting firm Luminosity to help its judges implement the PSA tool.

Patricia Costello, a former Superior Court and Assignment Judge in New Jersey, who retired right as the state started using the PSA, says the training went over legal, social, cultural and mathematical topics related to pretrial risk assessment. She notes it also explained why the state had adopted the PSA, how the tool performed in other jurisdictions, and the "history of how the algorithms were developed, [including] the kind of legal history and social history of why we are moving towards not requiring bail."

Costello, who is now of counsel at Chiesa Shahinian & Giantomasi, notes the training is mandatory in New Jersey for those new to the bench or for judges who are coming back into the criminal courts after more than five years' absence. "That's done before they take the bench, and they are usually assigned a mentor judge that can walk them through anything."

How frequently training occurs in other states, however, often depends on the jurisdiction's preferences and the complexity of the tool. On one end of the spectrum are state-developed assessment tools lauded for their simplicity.

Mike Schmidt, executive director of the Oregon Criminal Justice Commission, says training is not required for Oregon's Public Safety Checklist (PSC) tool, which helps determine what probation or parole services offenders receive, and if they need to be screened by a more comprehensive tool. "That's the beauty of the PSC, literally a 3-year-old could use it." Indeed, the PSC only requires a convicted offender's State Identification Number (SID) and the state law that the offender broke to automatically calculate a risk score.
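As a rough illustration of that simplicity, the toy Python function below takes only those two inputs and does the rest by lookup. The tables, statute code and weights are invented for the example and are not the actual PSC formula.

```python
# Invented lookup tables standing in for the offender's prior record and a
# severity weight for the statute of conviction.
PRIOR_RECORD = {"SID-445566": {"prior_felonies": 1, "prior_revocations": 0}}
STATUTE_SEVERITY = {"ORS 164.055": 2}  # example statute code; weight is invented

def psc_style_score(sid: str, statute: str) -> int:
    """Compute an illustrative risk score from just a SID and a statute."""
    priors = PRIOR_RECORD[sid]
    return (STATUTE_SEVERITY.get(statute, 1)
            + priors["prior_felonies"]
            + 2 * priors["prior_revocations"])

print(psc_style_score("SID-445566", "ORS 164.055"))  # -> 3
```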

Similarly, training on the Colorado Pretrial Assessment Tool is not mandated, though it is advised, while Nevada requires a one-off training for its Nevada Pretrial Risk Assessment.

At the other end of the spectrum, jurisdictions with more advanced assessment tools often require ongoing training for court officials and/or correctional staff. While most do an initial training for new hires, what "ongoing" entails after that differs from place to place.

Some states will adhere to a set schedule. Texas and Illinois, for instance, mandate their correctional staff be recertified every three years to use the ORAS.

But many states will offer refresher or booster courses on an as-needed basis. Kathy Waters, director at Arizona's Adult Probation Services Division, notes the division has "basically trained every time that there has been a change" or update to the state's proprietary Offender Screening Tool (OST) and Field Reassessment Offender Screening Tool (FROST), which Arizona uses for probation.

In Hawaii, the state's Interagency Council on Intermediate Sanctions (ICIS) likewise provides refresher courses and "Tips of the Month" to reinforce interview and assessment standards.

A few states will also take a more hands-on approach by continuously monitoring their staff to ensure certain standards are upheld, and deploying training when they're not.

"We routinely require assessors to tape their interviews, and we send them out to a third-party reviewer to evaluate accuracy of scoring, overall atmosphere during the interview, [and] use of techniques such as motivational interviewing skills," says Bree Derrick, chief of staff at the Idaho Department of Correction, which uses the Level of Service Inventory Revised (LSI-R) tool for presentence reports and probation and parole services. "Feedback is provided to each assessor for each tape that's submitted, and people are required to hit proficiency markers or attend remedial training."

There are other ways besides recordings, however, to confirm whether staff is implementing an assessment tool as intended. Inter-rater reliability studies, for instance, determine whether assessments are being conducted consistently by different court or correctional officials. For the Alaska Department of Corrections' Pretrial Enforcement Division, these studies involve having two officials unknowingly rate the same person using the state's proprietary pretrial risk assessment tool, the Alaska 2-Scale (AK-2S), and then comparing both scores to see if they match.
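The comparison step of such a study can be boiled down to something like the short Python sketch below, which measures how often two assessors produced identical scores for the same cases. The scores and the simple percent-agreement metric are illustrative; real studies typically also report more formal statistics.

```python
def percent_agreement(rater_a, rater_b):
    """Share of cases on which two assessors produced identical scores."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical scores from two officials who each rated the same ten people.
officer_1 = [2, 4, 3, 1, 5, 2, 3, 4, 2, 1]
officer_2 = [2, 4, 3, 2, 5, 2, 3, 3, 2, 1]

print(f"Exact agreement: {percent_agreement(officer_1, officer_2):.0%}")  # 80%
```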

When inter-rater reliability studies show room for improvement, states will look to enhance their training programs. For Connecticut, that meant turning to an outside consultant. "It's something we've been talking about for a long time—it's not an inexpensive proposition," says Gary Roberge, executive director of the Court Support Services Division of the Connecticut Judicial Branch.

Still, "we just felt that we're going to be able to increase our proficiency with [the LSI-R] … and make sure all our supervisors and staff have the latest training and are up to date with the assessment tool," he says.

Roberge adds, "If we do a good assessment and accurately identify risk and needs, we have a better shot at reducing recidivism, which is what we're all about [and why] we're here."