Modern neuroimaging technology has brought developmental neuroscience to the threshold of an era of unprecedented breakthroughs. With high-resolution brain scans acquired in increasingly large samples of children and adults, investigators are mapping both the normal development of the brain over the lifespan and the developmental abnormalities associated with psychiatric disorders. These studies typically entail fitting models at tens of thousands of brain locations, and due in part to this high computational load, investigators have tended to settle for suboptimal methods. A prominent example is fitting a polynomial model for the development of cortical thickness with age. Nonparametric smoothing offers well-known advantages over polynomial fits when a single model is estimated, but to date, smoothing methodology has not been applied in settings in which many thousands of models are fitted concurrently. More broadly, there is a critical need for state-of-the-art statistical methods to tackle the massive neuroimaging data sets generated by studies of the developing brain. The objective of this proposal is to provide a comprehensive toolkit for statistical analyses of normal and abnormal brain development. The investigators have begun to develop a number of innovative techniques toward this end, and have forged a strong multi-institution collaboration ideally suited to meeting the many challenges that lie ahead. The first specific aim focuses on computationally feasible estimation of large numbers of curves representing the mean, or a given percentile, of the distribution of a quantity of interest, conditional on a predictor such as age.
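As a rough illustration of the mass-univariate setting described above, the sketch below fits a cubic polynomial and a nonparametric smoother to simulated thickness-versus-age trajectories at many brain locations. The simulated data, variable names, and smoother choice are assumptions for illustration only, not the methods proposed here; percentile curves could be handled analogously by substituting quantile regression for least squares.

```python
# Minimal sketch (assumed setup, not the investigators' method): mass-univariate
# curve estimation, fitting a cubic polynomial at every location and a
# nonparametric smoother at one illustrative location.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
n_subjects, n_locations = 200, 5000             # subjects x brain locations
age = np.sort(rng.uniform(5, 25, n_subjects))   # ages in years (simulated)

# Simulated cortical thickness: a smooth decline with age plus noise.
true_curve = 3.0 - 0.4 * np.log1p(age - 4)
thickness = (true_curve[:, None]
             + 0.1 * rng.standard_normal((n_subjects, n_locations)))

# Polynomial fits at all locations at once: coefficient matrix of shape (4, n_locations).
poly_coefs = np.polynomial.polynomial.polyfit(age, thickness, deg=3)

# Nonparametric smoother at a single location; repeating this at tens of
# thousands of locations is what makes computational feasibility a central concern.
spline = UnivariateSpline(age, thickness[:, 0], s=n_subjects * 0.01)
print(spline(age)[:5])
```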
The second aim encompasses several hypothesis testing methods that are particularly relevant to neuroimaging, including tests of polynomial null hypotheses against smooth alternatives, as well as tests for group differences in developmental trajectories and other complex outcomes.
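For concreteness, the following sketch shows one simple way to test a polynomial null hypothesis against a smooth alternative: an F-test comparing a cubic polynomial fit with a richer fit that adds truncated-power spline terms at fixed knots. This is an illustrative assumption, not the specific tests to be developed (which may instead use, e.g., restricted likelihood ratio tests for penalized splines); the function name and knot placement are hypothetical.

```python
# Illustrative only: F-test of a cubic-polynomial null against a smooth
# alternative built from a fixed-knot truncated-power spline basis.
import numpy as np
from scipy import stats

def poly_vs_smooth_ftest(x, y, null_degree=3, knots=None):
    """Compare a polynomial fit (null) with polynomial + spline terms (alternative)."""
    if knots is None:
        knots = np.quantile(x, np.linspace(0.1, 0.9, 8))
    X_null = np.vander(x, null_degree + 1)                       # polynomial design
    spline_terms = np.clip(x[:, None] - knots[None, :], 0, None) ** 3
    X_full = np.hstack([X_null, spline_terms])                   # smooth alternative

    rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
    rss0, rss1 = rss(X_null), rss(X_full)
    df1, df2 = len(knots), len(y) - X_full.shape[1]
    f = ((rss0 - rss1) / df1) / (rss1 / df2)
    return f, stats.f.sf(f, df1, df2)

rng = np.random.default_rng(1)
age = np.sort(rng.uniform(5, 25, 300))
y = np.sin(age / 3) + 0.2 * rng.standard_normal(age.size)        # clearly non-polynomial
print(poly_vs_smooth_ftest(age, y))                              # small p-value expected
```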
The third aim, originally motivated by the need for succinct visual representations of spline fits at each point in a grid of brain locations, is to develop novel methods for clustering large amounts of functional data. The proposed methods will be applied to data acquired by multiple imaging modalities, including resting-state functional magnetic resonance imaging, diffusion tensor imaging, and cortical thickness measurement. Most of the methods proposed here are applicable to any imaging modality, and many can be applied outside the field of neuroimaging. Thus the proposed research will have a significant impact both on statistical methodology and on neuroscience, psychiatry, and other disciplines.
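The sketch below gives a basic, assumed illustration of clustering curve fits across brain locations: each location's fitted trajectory is evaluated on a common age grid and the resulting vectors are grouped by k-means. This is not the novel functional-data clustering methodology proposed in the third aim; the simulated trajectory shapes and parameter choices are hypothetical.

```python
# Rough illustration (assumed setup): cluster per-location developmental
# trajectories by evaluating each fitted curve on a common age grid and
# applying k-means to the resulting vectors.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(2)
age_grid = np.linspace(5, 25, 40)
n_locations = 2000

# Simulated fitted curves: two latent trajectory shapes plus noise.
shape_a = 3.0 - 0.03 * age_grid
shape_b = 2.5 + 0.4 * np.sin(age_grid / 4)
labels_true = rng.integers(0, 2, n_locations)
curves = np.where(labels_true[:, None] == 0, shape_a, shape_b)
curves = curves + 0.05 * rng.standard_normal(curves.shape)

# k-means on the curve evaluations gives a succinct summary of thousands of fits.
centroids, labels = kmeans2(curves, k=2, seed=3, minit="++")
print(centroids.shape, np.bincount(labels))
```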
Brain imaging has emerged as a critical tool for studying how the human brain normally develops, and how some psychiatric disorders may reflect abnormalities of development. Increasingly massive quantities of data are being generated by brain imaging studies, and standard data analysis techniques are not equipped to extract scientifically relevant information from these data. The proposed work will develop new statistical methods for analyzing such data sets, which will advance scientific understanding of the brain and may ultimately lead to improved treatments for mental illness.