The overall goal of this project is to develop novel and efficient statistical methods for simultaneous inference of group- and subject-level functional connectivity networks among adolescents at high risk of suicide. Functional magnetic resonance imaging (fMRI) has been used extensively to detect neural activity, which involves integrated networks consisting of multiple brain regions. To understand how normal brains work and how diseases disrupt normal functioning, it is essential to characterize how different parts of the brain are coordinated, in so-called "functional connectivity networks". In particular, this proposal seeks to investigate functional connectivity patterns among adolescents suffering from depression using a real-time fMRI dataset recently collected at the University of Minnesota. Current studies of functional connectivity networks primarily focus on single-subject analyses, which suffer from issues regarding the reliability of fMRI readings, do not borrow strength from fMRI data of other subjects who share the same clinical characteristics, and fail to provide group-level information that might shed light on diagnostic or treatment strategies for patients with a given disease. Other studies make group-level inferences about functional connectivity networks by assuming a common network structure shared by all patients in the group, ignoring differences between subjects and variability between fMRI sessions. In this application, we propose to develop novel and efficient statistical methods that can simultaneously learn the group- and subject-level network structures of functional connectivity from multi-subject fMRI data.
In Specific Aim 1, we will develop a novel and efficient method, the random covariance model, for simultaneous inference of group- and subject-level functional connectivity networks.
In Specific Aim 2, we will extend the random covariance model to a Bayesian model that incorporates a mean structure with fixed and random effects and is expected to be robust to outliers with disparate network structures. Both methods will be evaluated using extensive simulation studies and applied to an existing fMRI dataset recently collected at the University of Minnesota from depressed adolescents and a group of healthy controls. Our proposed methods will account for the similarity in interregional neural interactions between subjects sharing the same clinical characteristics, identify the group-level functional connectivity network shared by all subjects in the tested group, and at the same time infer multiple subject-level functional connectivity networks by allowing for differences between subjects. The method achieves group- and subject-level network inference simultaneously by utilizing a penalization term on the Kullback-Leibler (KL) divergence between the subject- and group-level covariance matrices, which makes it computationally efficient for multi-subject fMRI data analysis. More advantageously, the model can be viewed as a random covariance model that mimics the random-effects model for mean structures in linear regression.
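To make the penalization idea concrete, the sketch below shows one way such an objective could look: each subject's Gaussian negative log-likelihood plus a KL-divergence term shrinking the subject covariance toward the group covariance. This is an illustrative sketch under assumed conventions (zero-mean Gaussians, a single tuning parameter `lam`, and the function names shown), not the proposal's actual estimator or software.

```python
import numpy as np

def gaussian_kl(sigma_k, sigma_0):
    """KL divergence KL(N(0, sigma_k) || N(0, sigma_0)) between
    two zero-mean multivariate Gaussians (an assumed parameterization)."""
    p = sigma_k.shape[0]
    inv0 = np.linalg.inv(sigma_0)
    _, logdet_k = np.linalg.slogdet(sigma_k)
    _, logdet_0 = np.linalg.slogdet(sigma_0)
    return 0.5 * (np.trace(inv0 @ sigma_k) - p + logdet_0 - logdet_k)

def penalized_objective(sample_covs, subject_covs, group_cov, n_obs, lam):
    """Illustrative bi-level objective: sum of per-subject Gaussian
    negative log-likelihoods (up to additive constants) plus a KL
    penalty pulling each subject-level covariance toward the shared
    group-level covariance. `lam` controls the amount of shrinkage."""
    total = 0.0
    for S, sigma, n in zip(sample_covs, subject_covs, n_obs):
        inv = np.linalg.inv(sigma)
        _, logdet = np.linalg.slogdet(sigma)
        total += 0.5 * n * (logdet + np.trace(inv @ S))  # Gaussian NLL term
        total += lam * gaussian_kl(sigma, group_cov)     # KL shrinkage penalty
    return total
```

When every subject covariance equals the group covariance, the KL penalty vanishes, so larger `lam` drives the subject-level networks toward the common group-level network, mirroring how a random-effects model shrinks subject means toward a group mean.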

Public Health Relevance

The overall goal of this project is to develop novel and efficient statistical methods for bi-level graphical model inference and apply them to multi-subject fMRI data for simultaneous inference of group- and subject-level functional connectivity networks among adolescents who suffer from depression.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Mental Health (NIMH)
Type
Small Research Grants (R03)
Project #
1R03MH115300-01
Application #
9432386
Study Section
Biostatistical Methods and Research Design Study Section (BMRD)
Program Officer
Murphy, Eric Rousseau
Project Start
2017-09-27
Project End
2019-08-31
Budget Start
2017-09-27
Budget End
2018-08-31
Support Year
1
Fiscal Year
2017
Total Cost
Indirect Cost
Name
University of Minnesota Twin Cities
Department
Biostatistics & Other Math Sci
Type
Schools of Public Health
DUNS #
555917996
City
Minneapolis
State
MN
Country
United States
Zip Code
55455