Systematic reviews (SRs) synthesize and critically assess bodies of evidence to produce a comprehensive, unbiased assessment of what is known. As such, SRs are vital for evidence-based decision-making. However, given the pace of new research, the process of developing an SR remains too slow. One particularly time-consuming step is citation screening, which requires manual review of thousands of abstracts to identify only a small number of relevant studies. Screening such large numbers of studies is necessary because systematic reviewers place a high priority on identifying all relevant studies to avoid bias. Innovative citation screening tools, which utilize text mining and sophisticated machine learning methods, represent one potential solution. Abstrackr (Brown University) and EPPI-Reviewer (University College London) are off-the-shelf, web-based citation screening tools designed to improve screening efficiency. Both programs use machine learning techniques to semi-automate the screening process by modeling the probability that each citation will meet criteria for inclusion. This allows efficiency gains through screening prioritization and screening truncation. With screening prioritization, citations are ordered for screening from highest to lowest likelihood of inclusion, which allows earlier retrieval of full-text articles and facilitates workflow planning. Ordering citations by likelihood of inclusion also gives reviewers the option of truncating the screening process when the predicted likelihood of the remaining citations falls below a certain threshold. While promising, existing studies have predominantly been performed by computer scientists testing individual tools or comparing different modeling algorithms (e.g., various classifiers). To date, no studies have directly compared citation screening tools.
Similarly, although automatically excluding citations that fall below particular thresholds could substantively improve efficiency, adoption has been low due to concerns that relevant studies could be missed. However, how often studies would be missed and how important such omissions would be remains unknown. To address these knowledge gaps, this project will (1) compare screening efficiency for two citation screening tools, Abstrackr and EPPI-Reviewer, and (2) characterize the potential impact of using thresholds to automatically exclude low-probability studies. For Aim 1, using citations from 3 large and 6 small completed evidence reports, we will compare Abstrackr to EPPI-Reviewer for citation screening. Using screening prioritization, we will assess what proportion of citations must be screened to identify all included studies (i.e., to achieve 100% sensitivity).
For Aim 2, we will explore the potential impact of automatically excluding all citations that fall below particular thresholds during screening, and we will assess to what extent missing these studies would alter report conclusions. By characterizing potential efficiency gains from new, innovative, and widely accessible tools, this project can facilitate wider adoption by Evidence-based Practice Centers seeking to speed systematic review production.
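The screening-prioritization and threshold-truncation workflow studied in the two aims can be illustrated with a minimal sketch. Note this is not the actual implementation used by Abstrackr or EPPI-Reviewer; the citation pool, the inclusion probabilities, and the 0.10 threshold below are purely hypothetical values chosen for illustration.

```python
# Sketch of screening prioritization (Aim 1) and threshold truncation (Aim 2).
# Each citation carries a model-estimated probability of inclusion; all
# values here are illustrative, not output from any real screening tool.

def prioritize(citations):
    """Order citations from highest to lowest predicted inclusion probability."""
    return sorted(citations, key=lambda c: c["p_include"], reverse=True)

def burden_for_full_recall(ranked, relevant_ids):
    """Proportion of the ranked list that must be screened to find every
    relevant study (i.e., to reach 100% sensitivity)."""
    remaining = set(relevant_ids)
    for i, c in enumerate(ranked, start=1):
        remaining.discard(c["id"])
        if not remaining:
            return i / len(ranked)
    return 1.0

def truncate(ranked, threshold):
    """Split the ranked list into citations screened manually and citations
    auto-excluded because they fall below the probability threshold."""
    screened = [c for c in ranked if c["p_include"] >= threshold]
    excluded = [c for c in ranked if c["p_include"] < threshold]
    return screened, excluded

# Toy citation pool: ids 1-3 are the truly relevant studies.
pool = [
    {"id": 1, "p_include": 0.92},
    {"id": 2, "p_include": 0.75},
    {"id": 3, "p_include": 0.08},  # a relevant study the model under-scores
    {"id": 4, "p_include": 0.40},
    {"id": 5, "p_include": 0.05},
    {"id": 6, "p_include": 0.02},
]

ranked = prioritize(pool)
# Aim 1 quantity: fraction of citations screened to reach 100% sensitivity.
print(burden_for_full_recall(ranked, {1, 2, 3}))  # prints 0.6666666666666666

# Aim 2 quantity: relevant studies lost if everything below 0.10 is excluded.
screened, excluded = truncate(ranked, threshold=0.10)
missed = [c["id"] for c in excluded if c["id"] in {1, 2, 3}]
print(missed)  # prints [3]
```

The toy example makes the trade-off at the heart of Aim 2 concrete: the under-scored relevant study (id 3) is found under prioritized screening, but is lost once a truncation threshold above its predicted probability is applied.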

Public Health Relevance

Systematic reviews provide a comprehensive, unbiased assessment of a body of literature and are vital for timely, evidence-based decision-making. However, development of systematic reviews remains too slow, with one particularly time-consuming step being citation screening, in which thousands of research abstracts are manually reviewed to identify a small number of relevant studies. By testing the potential gains in citation screening efficiency offered by two innovative, widely accessible machine-learning tools (Abstrackr and EPPI-Reviewer), and by determining whether automatically excluding studies to improve speed compromises report conclusions, we hope to enable more rapid production of systematic reviews to inform clinicians and policy-makers and promote high-quality, evidence-based patient care.

Agency
National Institutes of Health (NIH)
Institute
Agency for Healthcare Research and Quality (AHRQ)
Type
Small Research Grants (R03)
Project #
1R03HS025859-01
Application #
9435231
Study Section
Healthcare Systems and Values Research (HSVR)
Program Officer
Banez, Lionel L
Project Start
2017-10-15
Project End
2018-06-14
Budget Start
2017-10-15
Budget End
2018-10-14
Support Year
1
Fiscal Year
2018
Total Cost
Indirect Cost
Name
ECRI Institute
Department
Type
DUNS #
069042950
City
Plymouth Meeting
State
PA
Country
United States
Zip Code
19462