As part of the Research Domain Criteria (RDoC) initiative, the NIMH seeks to improve measures of neuronal and psychological targets for use in intervention research. RDoC tools must precisely measure cognitive and neuronal systems to produce reliable findings. Unfortunately, many RDoC tasks have not been psychometrically evaluated or refined, and they are liable to confounds that can lead to inaccurate claims of differential deficit, weakened effect size, and contradictory brain imaging findings. As highlighted by FOA PAR-18-930, there is a critical need for modern psychometric methods and tools designed to support cognitive and clinical neuroscience research. This project responds to that FOA by evaluating and refining a methodology for administering statistically robust variants of RDoC tasks, especially in the context of brain imaging. We have created a quantitative methodology for administering computerized adaptive tests (CATs) in cognitive and clinical neuroscience research. CATs manipulate stimulus properties in real time to improve measurement precision, avoid ceiling and floor effects, and maximize effect size, even for individuals and groups with highly discrepant levels of cognitive functioning. CATs also perform psychometric adjustments to cognitive tasks so that brain functioning abnormalities can be interpreted independent of performance deficits. In pilot work, we have used this approach with an RDoC working memory task, the N-back, to show that the methodology improves the reliability of both cognitive and brain imaging data. We will evaluate the generalizability and impact of adaptive testing, beyond the N-back task, for use in translational and experimental testing. Patients with schizophrenia and controls will be administered both adaptive and non-adaptive versions of four RDoC paradigms used to assess working memory: delayed match-to-sample, Sternberg, self-ordered pointing, and N-back.
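To make the adaptive idea concrete, the real-time adjustment described above can be sketched as a simple up/down staircase that keeps each participant within a target accuracy band, avoiding ceiling and floor effects. This is a minimal illustration only: the function name, accuracy thresholds, and load bounds below are hypothetical, and the project's actual CAT methodology (and the catCog package) is not specified here.

```python
# Minimal staircase sketch of adaptive difficulty adjustment for an
# N-back task. Names and thresholds are illustrative assumptions, not
# taken from the catCog package.

def next_nback_load(current_load, accuracy, lower=0.70, upper=0.90,
                    min_load=1, max_load=5):
    """Return the N-back load for the next block.

    current_load -- N-back level of the block just completed
    accuracy     -- proportion correct in that block (0.0 to 1.0)
    lower/upper  -- target accuracy band; outside it, load is adjusted
    """
    if accuracy > upper and current_load < max_load:
        return current_load + 1   # too easy: increase memory load
    if accuracy < lower and current_load > min_load:
        return current_load - 1   # too hard: decrease memory load
    return current_load           # within the target band: hold steady
```

A full CAT would typically replace this heuristic with an item response theory model that updates an ability estimate after each trial, but the staircase captures the core behavior: stimulus difficulty tracks performance in real time.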
Additionally, participants will be administered the 5-Choice Continuous Performance Task and Probabilistic Learning Task, translational measures of control and learning respectively. Both groups will undergo functional neuroimaging and respond to adaptive versions of these tasks.
The specific aims are to: (1) determine whether adaptive testing improves the precision and effect size estimates of performance differences produced by RDoC tasks; and (2) determine whether adaptive testing improves the reliability and effect size estimates of brain activation differences produced by RDoC tasks. While these aims are designed to evaluate a methodology, and to address critical concerns related to the use of RDoC working memory tasks as neural probes, mediators, and outcomes in psychosis research, the results also have broad implications across populations, brain regions and networks, and cognitive domains. By addressing concerns of poor reliability, weak effect size, and brain activation confounds, this project will show that adaptive testing broadly improves cognitive neuroscience tasks. To facilitate rapid deployment of adaptive RDoC tasks, we will develop freely available versions of the paradigms used and a companion R package, "catCog".

Public Health Relevance

Experimental cognitive tests developed as part of the NIMH Research Domain Criteria (RDoC) initiative are liable to psychometric challenges that lead to inconsistent measurement precision, weakened effect size, and confounds in brain imaging research. This project evaluates a novel adaptive testing methodology designed to address these challenges. Products will include computerized adaptive measures of RDoC learning, control, and memory domains for use in experimental and translational studies, as well as a free and open-source programming (R) package, "catCog", designed to implement the methodology.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Mental Health (NIMH)
Type
Research Project (R01)
Project #
1R01MH121546-01A1
Application #
10051743
Study Section
Cognition and Perception Study Section (CP)
Program Officer
Morris, Sarah E
Project Start
2020-08-14
Project End
2025-05-31
Budget Start
2020-08-14
Budget End
2021-05-31
Support Year
1
Fiscal Year
2020
Total Cost
Indirect Cost
Name
Colorado State University-Fort Collins
Department
Psychology
Type
Schools of Arts and Sciences
DUNS #
785979618
City
Fort Collins
State
CO
Country
United States
Zip Code
80523