All humans, no matter how intelligent, egalitarian, or well-intentioned, are susceptible to cognitive biases in the way they make decisions and judge others. Because these biases can operate unintentionally, in opposition to one's conscious intentions, personal beliefs, and objective data, they may unwittingly perpetuate social inequalities. Unexplained disparities in R01 funding outcomes by race and gender have raised concern about bias in NIH peer review. This Transformative R01 will examine whether and how implicit (i.e., unintentional) bias might occur in R01 peer review through the following three Specific Aims:
Specific Aim #1. Identify the extent to which investigator characteristics influence the words and descriptors chosen by R01 peer reviewers and how this text relates to assigned scores. We will validate positive and negative grant evaluation word categories, analyze the text of a national sample of R01 reviews, and compare the grant review text across investigator characteristics. We hypothesize that categories of words and descriptors will differ in ways that suggest implicitly different evaluation standards by applicant race and gender, even when application scores and funding outcomes are similar. A minimal sketch of this word-category approach follows the Specific Aims below.
Specific Aim #2. Determine whether investigator race, gender, or institution causally influences the review of identical proposals. We will conduct a randomized, controlled study in which we manipulate characteristics of a grant principal investigator (PI) to assess their influence on grant review outcomes. We will request donations of actual funded R01s and, within each grant, manipulate the PI's gender, race, or home institution. We will then invite reviewers in the appropriate discipline to review the proposals, and we will analyze written reviews and scores. We hypothesize that investigator variables will significantly influence scores and review text, such that grants attributed to higher-status groups (male, White, prestigious institution) will obtain better scores and review text will support implicitly different standards of excellence.
Specific Aim #3. Examine how interactional patterns among study section members promote receptivity and resistance to discussion topics and associated grant applicants. In audio- and videotapes of constructed study sections, we will investigate the real-time social interactional processes in the discussions of R01 proposals. We will employ conversation analysis to examine the delivery of initial rankings and their rationales, topic development, and the processes through which final rankings are negotiated.

This research is innovative because it examines for the first time the complexities of potential bias in NIH peer review. The potential impact is threefold: this research will 1) discover whether certain forms of cognitive bias are or are not consequential in R01 peer review, 2) determine whether quantitative text analysis is a useful measure of implicit bias, and 3) describe and label real-time grant reviewer interactional patterns. Taken together, the results of our research could set the stage for transformation in peer review throughout NIH.
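As a rough illustration of the word-category text analysis described in Specific Aim #1 (not the project's validated instrument), the sketch below counts occurrences of hypothetical positive and negative evaluation terms in a critique. The word lists, category names, and function name are assumptions for illustration only.

```python
# Minimal sketch of category-based scoring of review text, assuming
# hypothetical word lists; the project's validated categories are not
# reproduced here.
from collections import Counter
import re

# Hypothetical evaluation word categories (illustration only).
POSITIVE = {"outstanding", "innovative", "rigorous", "excellent", "strong"}
NEGATIVE = {"weak", "unclear", "limited", "incremental", "flawed"}

def category_counts(critique: str) -> dict:
    """Count positive and negative evaluation words in one critique."""
    tokens = re.findall(r"[a-z']+", critique.lower())
    counts = Counter(tokens)
    return {
        "positive": sum(counts[w] for w in POSITIVE),
        "negative": sum(counts[w] for w in NEGATIVE),
        "total_words": len(tokens),
    }

if __name__ == "__main__":
    example = "The approach is innovative and rigorous, but the aims are unclear."
    print(category_counts(example))
```

In an analysis like the one proposed, such per-critique counts could then be compared across applicant groups while controlling for scores and funding outcomes.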

Public Health Relevance

NIH research grants are the engine for innovation and discovery with the goal of improving human health at the individual patient and population level. Peer review is fundamental to the grant award process, ultimately determining what research will be conducted. This multi-method research project will explore whether and how unintentional cognitive bias triggered by characteristics of the applicant could influence R01 grant review outcomes.

Agency: National Institutes of Health (NIH)
Institute: National Institute of General Medical Sciences (NIGMS)
Type: Research Project (R01)
Project #: 4R01GM111002-04
Application #: 9086374
Study Section: Special Emphasis Panel (ZRG1)
Program Officer: Marcus, Stephen
Project Start: 2013-09-27
Project End: 2018-06-30
Budget Start: 2016-07-01
Budget End: 2017-06-30
Support Year: 4
Fiscal Year: 2016
Total Cost:
Indirect Cost:
Name: University of Wisconsin Madison
Department: Miscellaneous
Type: Schools of Medicine
DUNS #: 161202122
City: Madison
State: WI
Country: United States
Zip Code: 53715
Kolehmainen, Christine; Carnes, Molly (2018) Who Resembles a Scientific Leader-Jack or Jill? How Implicit Bias Could Influence Research Grant Funding. Circulation 137:769-770
Pier, Elizabeth L; Brauer, Markus; Filut, Amarette et al. (2018) Low agreement among reviewers evaluating the same NIH grant applications. Proc Natl Acad Sci U S A 115:2952-2957
Mitamura, Chelsea; Erickson, Lynnsey; Devine, Patricia G (2017) Value-Based Standards Guide Sexism Inferences for Self and Others. J Exp Soc Psychol 72:101-117
Devine, Patricia G; Forscher, Patrick S; Cox, William T L et al. (2017) A Gender Bias Habit-Breaking Intervention Led to Increased Hiring of Female Faculty in STEMM Departments. J Exp Soc Psychol 73:211-215
Pier, Elizabeth L; Raclaw, Joshua; Kaatz, Anna et al. (2017) 'Your comments are meaner than your score': score calibration talk influences intra- and inter-panel variability during scientific grant peer review. Res Eval 26:1-14
Forscher, Patrick S; Mitamura, Chelsea; Dix, Emily L et al. (2017) Breaking the prejudice habit: Mechanisms, timecourse, and longevity. J Exp Soc Psychol 72:133-146
Raclaw, Joshua; Ford, Cecilia E (2017) Laughter and the Management of Divergent Positions in Peer Review Interactions. J Pragmat 113:1-15
Sheridan, Jennifer; Savoy, Julia N; Kaatz, Anna et al. (2017) Write More Articles, Get More Grants: The Impact of Department Climate on Faculty Research Productivity. J Womens Health (Larchmt) 26:587-596
Carnes, Molly; Johnson, Paula; Klein, Wendy et al. (2017) Advancing Women's Health and Women's Leadership With Endowed Chairs in Women's Health. Acad Med 92:167-174
Magua, Wairimu; Zhu, Xiaojin; Bhattacharya, Anupama et al. (2017) Are Female Applicants Disadvantaged in National Institutes of Health Peer Review? Combining Algorithmic Text Mining and Qualitative Methods to Detect Evaluative Differences in R01 Reviewers' Critiques. J Womens Health (Larchmt) 26:560-570

Showing the most recent 10 out of 21 publications