There are several measures that are regularly used to assess cognitive and behavioral dysfunction in basic Alzheimer's disease (AD) research and in clinical trials of AD medications. All of these measures rely on total raw scores to assess dysfunction; that is, a person receives a score that reflects the sum of the individual items. This standard approach, however, has major problems that limit how precisely one can assess true levels of dysfunction and the changes in dysfunction that accompany AD. The chief problem is that the raw score approach weights each item on a measure equally; that is, each item is assumed to contribute equally to the total score. In actuality, test items often differ in their difficulty and/or in the strength of their relationship to dysfunction, which means that individual test items differ in their ability to measure dysfunction. Item Response Theory (IRT)-based analyses are specifically designed to account for these item differences and should allow us to assess dysfunction more precisely.
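To illustrate the distinction, the sketch below contrasts a raw sum score with a two-parameter-logistic (2PL) IRT ability estimate for two hypothetical response patterns. The item parameters and responses are invented for demonstration only and are not drawn from the ADAS-cog, MMSE, or CDR; the point is simply that identical raw scores can correspond to different IRT estimates once item difficulty and discrimination are taken into account.

```python
# Minimal sketch: raw sum scoring vs. a 2PL IRT ability estimate.
# All item parameters and responses below are hypothetical.
import numpy as np

def raw_score(responses):
    """Raw score: every item counts equally."""
    return int(np.sum(responses))

def irt_theta(responses, a, b, grid=np.linspace(-4, 4, 801)):
    """Maximum-likelihood 2PL ability estimate via a simple grid search.
    a = item discriminations, b = item difficulties."""
    # P(correct | theta) for each item at every grid point
    p = 1.0 / (1.0 + np.exp(-a * (grid[:, None] - b)))
    loglik = np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p), axis=1)
    return grid[np.argmax(loglik)]

a = np.array([0.5, 1.0, 1.5, 2.0])   # items differ in discrimination
b = np.array([-1.0, 0.0, 0.5, 1.5])  # items differ in difficulty
r1 = np.array([1, 1, 0, 0])          # passes only the easier, weakly discriminating items
r2 = np.array([0, 0, 1, 1])          # passes only the harder, strongly discriminating items

for r in (r1, r2):
    print(raw_score(r), round(irt_theta(r, a, b), 2))
# Both respondents have the same raw score (2), but their IRT estimates differ
# because the items carry different amounts of information about the trait.
```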
Aim #1 of the proposed research is to use IRT-based scoring to more precisely assess change in AD-related dysfunction. To do this, we will extend our preliminary cross-sectional findings for the ADAS-cog, the MMSE, and the CDR by examining data longitudinally in Baylor College of Medicine's Alzheimer's Disease and Memory Disorder Center comprehensive dementia database. We anticipate that these data will confirm longitudinally that IRT analyses provide more precise information about change in AD dysfunction than raw scores. If so, IRT analyses have the potential not only to detect change in AD dysfunction more precisely than the standard raw score approach but also to yield significant cost savings for AD basic research and clinical trials of AD medications.
Aim #2 is to quantify the gain in efficiency, in terms of the number of participants needed to detect change, from using an IRT-based scoring approach rather than a raw score approach. To fulfill this aim, we will create a table for each measure (ADAS-cog, MMSE, CDR) and each possible raw score change on that measure (1-point change, 2-point change, 3-point change, etc.). We will then determine the effect sizes associated with these changes under a raw score versus an IRT scoring approach. Because of its greater sensitivity, the IRT scoring approach should yield larger effect sizes and hence require fewer participants than the raw score approach. The findings from Aim #2 would be useful to many researchers, including those running clinical drug trials, who might, for instance, want to determine the number of participants needed to detect a change of a given size (e.g., a 3-point change) on one of these measures. These researchers could use the tables to easily compare the number of participants needed to detect different changes under IRT versus raw scoring.
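As a purely hypothetical illustration of how such a table could be used, the sketch below converts two assumed effect sizes, one for a raw-score change and a somewhat larger one for the same change under IRT scoring, into the approximate number of participants per group needed for 80% power in a two-group comparison. The effect sizes are placeholders, not results from the proposed study, and the power calculation uses the standard routines in the statsmodels package.

```python
# Hypothetical sample-size comparison: a larger effect size means fewer
# participants per group at the same alpha and power. Effect sizes are
# illustrative placeholders, not findings from the proposed research.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower()
for label, d in [("raw-score change", 0.30), ("IRT-scored change", 0.40)]:
    n = power.solve_power(effect_size=d, alpha=0.05, power=0.80,
                          alternative="two-sided")
    print(f"{label}: about {int(round(n))} participants per group")
```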

Public Health Relevance

While it is standard practice to assess Alzheimer's disease (AD) dysfunction using raw scores, there are major problems with this practice that limit how precisely one can measure dysfunction or the changes in dysfunction that accompany AD. The overarching goal of the proposed research is to use a new scoring approach to measure AD-related dysfunction more precisely and efficiently and to improve our ability to detect both individual and group change in AD basic research and clinical trials of AD medications. Because this new scoring approach can detect change more precisely than a raw score approach, fewer participants would be needed to run a clinical trial, for example, meaning that we could not only detect disease-related change in dysfunction more precisely but also potentially save money when conducting research.

Agency
National Institutes of Health (NIH)
Institute
National Institute on Aging (NIA)
Type
Small Research Grants (R03)
Project #
1R03AG039663-01A1
Application #
8302649
Study Section
Adult Psychopathology and Disorders of Aging Study Section (APDA)
Program Officer
Silverberg, Nina B
Project Start
2012-04-01
Project End
2014-03-31
Budget Start
2012-04-01
Budget End
2014-03-31
Support Year
1
Fiscal Year
2012
Total Cost
$73,104
Indirect Cost
$23,104
Name
Texas A&M University
Department
Psychology
Type
Schools of Arts and Sciences
DUNS #
078592789
City
College Station
State
TX
Country
United States
Zip Code
77845