How objective are expert witnesses when they are retained by one side in adversarial legal proceedings? The social sciences have offered almost no data to address "adversarial allegiance," the presumed tendency for experts to reach conclusions that support the party who retained them. Recently, the principal investigators found strong evidence of adversarial allegiance among some forensic evaluators who scored risk assessment instruments for sex offenders facing trial. But only carefully controlled experimental research can identify why adversarial allegiance exists.

This study uses an experimental design to examine (a) whether allegiance effects in risk assessment scores occur when evaluators in adversarial proceedings do not choose which side of the case they work for, and (b) the extent to which evaluators' conclusions appear to be the product of decision-making processes known to lead to biased conclusions in other contexts. Participants are forensic evaluators and graduate students who score offender risk measures for one side in a sex offender trial. Participants will complete questionnaires before and after they score offender data, allowing researchers to assess the pre-existing attitudes and decision-making processes associated with any allegiance effect.

Results will help courts better scrutinize expert scientific testimony. Results will also inform training curricula to reduce bias in forensic psychological evaluation, and perhaps in other forensic science disciplines. Regardless of their discipline, experts who are retained by one party in adversarial legal proceedings are probably vulnerable to allegiance for similar reasons. Eventually, a research program that identifies the processes underlying adversarial allegiance can inform interventions to minimize these processes.

Project Report

How objective are expert witnesses when they are retained by one side in adversarial legal proceedings? Do they remain objective, as their ethical codes require? Or are they inevitably biased by the adversarial system, as the public and many legal professionals tend to assume? Although these questions have important implications for the justice system, the social sciences have offered almost no data on what we call "adversarial allegiance," the presumed tendency for forensic evaluators to reach conclusions that support the party who retained them.

We therefore began by studying real-world trials of sex offenders, in which experts hired by opposing sides each scored the same risk assessment instruments for the same offenders. Experts hired by the prosecution tended to assign scores indicating higher risk, and experts hired by the defense tended to assign scores indicating lower risk to the same offenders, even though experts using these reliable risk instruments outside of adversarial legal proceedings usually assign very similar scores to the same offenders. These pilot studies strongly suggested, but could not prove, that adversarial allegiance was a problem. Was the apparent bias simply a result of attorneys choosing experts whose pre-existing attitudes favored their side (selection effects)? Or do evaluators, once retained by a particular side, tend to form opinions that favor that side (allegiance effects)?

To answer this question, we conducted a true experiment. We recruited over 100 forensic psychologists and psychiatrists, gave them rigorous training on popular risk assessment instruments, and then led them to believe they had been hired for a large forensic consultation in which they would be paid to score risk assessment instruments for up to four offender case files. Each participant met with an attorney, who pretended to hire them and hinted at the possibility of future paid work for his agency.
Participants scored offender files, believing it was a paid forensic consultation. Unbeknownst to them, every participant received exactly the same four offender files, and each was randomly assigned to believe they were working for either the prosecution or the defense. Experts who believed they were working for the prosecution tended to assign higher risk scores to offenders, whereas those who believed they were working for the defense tended to assign lower risk scores, even though all had access to exactly the same data from the same offenders. These results provide strong evidence of adversarial allegiance, a form of bias, among some forensic experts.

Two follow-up surveys of forensic experts who participated in this study revealed that all agreed adversarial allegiance was a problem in the field, yet most perceived themselves as relatively free from this form of bias. In general, participants perceived adversarial allegiance as most problematic for the clinicians least like themselves: clinicians who worked in the public sector believed clinicians in private practice were most vulnerable to allegiance, older clinicians believed younger clinicians were more vulnerable, and so forth.

These results underscore recent concerns about the forensic sciences, and raise concerns specific to forensic psychology, by demonstrating that some experts who score ostensibly objective assessment instruments assign scores biased toward the side that retained them. Although this study addressed only one kind of evaluation, there is little reason to believe it is the only kind of forensic evaluation or forensic science procedure vulnerable to allegiance effects.
Indeed, the evidence of allegiance effects on structured, ostensibly objective instruments leaves us even more concerned about possible allegiance effects on expert procedures that are less structured or less guided by scoring rules. Our findings underscore the need for research on the cognitive and procedural biases that may facilitate adversarial allegiance, and on potential interventions to reduce it.
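The logic of the experimental design described above can be illustrated with a small simulation. This is purely a sketch: the score distribution, sample sizes, and the size of the hypothetical allegiance shift are invented for illustration and are not the study's data. It shows why random assignment isolates allegiance effects from selection effects: every simulated expert scores the same case, and only the randomly assigned side differs between groups.

```python
import random
import statistics

def simulate_allegiance_design(n_experts=100, allegiance_shift=1.5, seed=42):
    """Sketch of the randomized design: each expert scores the same
    case file but is randomly assigned to believe they work for the
    prosecution or the defense. `allegiance_shift` is a hypothetical
    bias added to prosecution-retained scores (not an empirical value)."""
    rng = random.Random(seed)
    prosecution, defense = [], []
    for _ in range(n_experts):
        # Hypothetical instrument score an unbiased evaluator would assign.
        true_score = rng.gauss(20, 3)
        if rng.random() < 0.5:  # random assignment to a side
            prosecution.append(true_score + allegiance_shift)
        else:
            defense.append(true_score)
    return prosecution, defense

def standardized_mean_difference(a, b):
    """Rough effect size: mean difference over the combined-sample SD."""
    combined_sd = statistics.pstdev(a + b)
    return (statistics.mean(a) - statistics.mean(b)) / combined_sd

pros, dfns = simulate_allegiance_design()
print(f"prosecution mean = {statistics.mean(pros):.2f}")
print(f"defense mean     = {statistics.mean(dfns):.2f}")
print(f"effect size      = {standardized_mean_difference(pros, dfns):.2f}")
```

Because assignment to side is random, any systematic gap between the two group means can only come from the side participants believed they were working for, which is the inference the actual experiment relies on.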

Agency: National Science Foundation (NSF)
Institute: Division of Social and Economic Sciences (SES)
Type: Standard Grant (Standard)
Application #: 0961289
Program Officer: Marjorie Zatz
Budget Start: 2010-05-01
Budget End: 2013-04-30
Fiscal Year: 2009
Total Cost: $125,622
Name: Sam Houston State University
City: Huntsville
State: TX
Country: United States
Zip Code: 77341