Comparative effectiveness research (CER) relies fundamentally on accurate and timely assessment of the benefits and risks of different treatment options. Empirical evidence suggests that a median of 35% of efficacy outcomes and 50% of safety outcomes per parallel-group trial were incompletely reported, and that statistically significant outcomes were more likely to be fully reported than non-significant outcomes, for both efficacy and safety. Such bias is referred to as outcome reporting bias (ORB), i.e., "the selective reporting of some outcomes but not others, depending on the nature and direction of the results (i.e., missing certain outcomes)." Selective reporting can invalidate the results of meta-analyses. As acknowledged in the Cochrane Handbook, "Statistical methods to detect within-study selective reporting (i.e., outcome-reporting bias) are, as yet, not well developed" (chapter 8.14.2, version 5.0.2); there is therefore a critical need to develop methods that specifically account for ORB. In response to PA-16-160, the overall goal of this proposal is to develop, test, and evaluate new statistical methods and user-friendly software to account for ORB in multivariate and network meta-analyses. In this proposal, we will: (1) propose and evaluate new methods for quantifying the evidence of ORB and adjusting for it, and develop a sensitivity-analysis procedure under ORB in multivariate meta-analysis; (2) generalize the methods in Aim 1 to network meta-analyses (where more than two treatments are compared simultaneously) and propose methods to evaluate evidence consistency; and (3) develop publicly available, user-friendly, and well-documented software and apply the proposed methods to research data sets. We will use carefully designed simulation studies to investigate the performance of the proposed methods, apply them to multiple existing databases, and develop statistical software for the wider research community.
We propose to perform empirical assessment of the strengths and weaknesses of these methods through carefully designed simulation studies and, more importantly, through applications to (network) meta-analyses of clinical trials with multivariate outcomes. Completion of the three aims in this proposal will directly benefit the CER program by providing state-of-the-art methods implemented in a user-friendly R package that will be made freely available to the public. This has the potential to catalyze the development of many new methods, amplifying the impact of our project.
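The mechanism by which ORB distorts pooled estimates can be sketched with a small simulation. The following is a minimal illustrative sketch, not the proposal's method: all function names, parameter values, and reporting probabilities are assumptions chosen for illustration. Each simulated trial's outcome is reported with higher probability when it is statistically significant, so a naive fixed-effect pooled estimate over reported outcomes overshoots the truth.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_orb(n_trials=2000, true_effect=0.2, n_per_arm=50,
                 p_report_sig=0.95, p_report_nonsig=0.4):
    """Illustrative ORB simulation (hypothetical parameters):
    significant outcomes are reported more often than non-significant ones.
    Returns (naive, full) fixed-effect pooled estimates, where `naive`
    uses only reported outcomes and `full` uses all outcomes."""
    effects, ses, reported = [], [], []
    for _ in range(n_trials):
        # Two-arm trial with a standardized continuous outcome.
        y1 = rng.normal(true_effect, 1.0, n_per_arm)
        y0 = rng.normal(0.0, 1.0, n_per_arm)
        est = y1.mean() - y0.mean()
        se = np.sqrt(y1.var(ddof=1) / n_per_arm + y0.var(ddof=1) / n_per_arm)
        sig = abs(est / se) > 1.96
        # Selective reporting: significance raises the reporting probability.
        rep = rng.random() < (p_report_sig if sig else p_report_nonsig)
        effects.append(est); ses.append(se); reported.append(rep)
    effects, ses, reported = map(np.array, (effects, ses, reported))
    w = 1.0 / ses**2  # inverse-variance weights
    full = np.sum(w * effects) / np.sum(w)
    naive = np.sum(w[reported] * effects[reported]) / np.sum(w[reported])
    return naive, full

naive, full = simulate_orb()
# `naive`, based only on reported outcomes, is biased away from the null
# relative to `full`, which recovers the true effect.
```

Under these assumed settings the naive estimate is inflated by roughly 0.05 on the standardized scale, which is the kind of distortion the proposed quantification and adjustment methods are designed to detect and correct.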
Systematic reviews of randomized controlled trials are key to the practice of evidence-based medicine, and properly conducted meta-analyses are critical to the quality of systematic reviews. However, it is increasingly recognized that selective reporting of outcomes is highly prevalent in randomized controlled trials and can invalidate the results of meta-analyses. This issue remains understudied in the literature. In this application, we aim to tackle this important and much-needed research area by proposing a general framework to quantify, adjust for, and evaluate (through sensitivity analysis) the impact of outcome reporting bias in both multiple-outcome meta-analysis and multiple-outcome network meta-analysis.