To ensure that the best services are being used to improve public health, we need to know that those services are effective across broad populations. Increased emphasis on services that have been evaluated using randomized trials carried out in real-world settings ("effectiveness" trials) is one important step. But the subjects in trials are rarely representative of the target population of interest. A next step is to determine whether the results seen in a trial generalize to the target population, as highlighted in recent government reports (NIMH, 1999; IOM, 2006). The Mentored Quantitative Research Career Development Award (K25) will allow Dr. Elizabeth Stuart, a statistician, to develop a program of research to determine when and how results of effectiveness trials can be generalized. The initial focus will be on universal preventive interventions, with later application to other areas. The proposed training focuses on learning about 1) the theory underlying universal preventive interventions, particularly moderators of their effects, 2) the design and implementation of randomized trials, especially how subjects and contexts are selected and how they may differ from the target population, and 3) existing methods for combining results from multiple studies, such as meta-analysis and research synthesis. This training will be done at the Johns Hopkins Bloomberg School of Public Health, an excellent research environment with two prevention research centers: the NIMH/NIDA-funded Center for Prevention and Early Intervention (PI: Dr. Nicholas Ialongo) and the CDC-funded Center for the Prevention of Youth Violence (PI: Dr. Philip Leaf). Ongoing evaluations at these two centers will be used to develop the methods. Dr. Ialongo is the primary mentor for this work, and Dr. Leaf is a consultant. Dr. C. Hendricks Brown, an expert in randomized trials of preventive interventions, is a co-mentor.
The research aims are to develop methods to 1) assess when generalization from a randomized trial to a target population is possible, and 2) estimate the intervention's effect in that population. This K25 will prepare the candidate to lead future work in this area, including handling variation in program implementation and developing guidelines for the design of future trials. The training will be integral to helping the candidate develop appropriate methods. The resulting methods and results will have broad implications for public health, helping to ensure that the most effective mental health services are widely implemented.
Stuart, Elizabeth A; Bradshaw, Catherine P; Leaf, Philip J (2015) Assessing the generalizability of randomized trial results to target populations. Prev Sci 16:475-85
Stuart, Elizabeth A; Jo, Booil (2015) Assessing the sensitivity of methods for estimating principal causal effects. Stat Methods Med Res 24:657-74
Green, Kerry M; Stuart, Elizabeth A (2014) Examining moderation analyses in propensity score methods: application to depression and substance use. J Consult Clin Psychol 82:773-83
Leacy, Finbarr P; Stuart, Elizabeth A (2014) On the joint use of propensity and prognostic scores in estimation of the average treatment effect on the treated: a simulation study. Stat Med 33:3488-508
DuGoff, Eva H; Bekelman, Justin E; Stuart, Elizabeth A et al. (2014) Surgical quality is more than volume: the association between changing urologists and complications for patients with localized prostate cancer. Health Serv Res 49:1165-83
DuGoff, Eva H; Schuler, Megan; Stuart, Elizabeth A (2014) Generalizing observational study results: applying propensity score methods to complex surveys. Health Serv Res 49:284-303
Rudolph, Kara E; Díaz, Iván; Rosenblum, Michael et al. (2014) Estimating population treatment effects from a survey subsample. Am J Epidemiol 180:737-48
Olsen, Robert B; Orr, Larry L; Bell, Stephen H et al. (2013) External Validity in Policy Evaluations that Choose Sites Purposively. J Policy Anal Manage 32:107-121
Stuart, Elizabeth A; DuGoff, Eva; Abrams, Michael et al. (2013) Estimating causal effects in observational studies using Electronic Health Data: Challenges and (some) solutions. EGEMS (Wash DC) 1:
Liu, Weiwei; Kuramoto, S Janet; Stuart, Elizabeth A (2013) An introduction to sensitivity analysis for unobserved confounding in nonexperimental prevention research. Prev Sci 14:570-80
Showing the most recent 10 out of 30 publications