The application's overall goal is twofold: (1) to investigate the representation of animacy in the human object recognition pathway, and (2) to develop improved methods for analyzing functional brain images using multivariate measures of similarity among cognitive states. The following hypothesis will be tested: Pattern-based measurements of neural similarity among cognitive states associated with viewing animate objects will reveal the structure of representation in different parts of the brain with respect to various types of information.
Specific Aim 1: To identify the underlying dimensions that form the basis of neural similarity structure for patterns of response to a wide variety of animate stimuli, focusing on three representational domains: visual form (both within-species and between-species), biological motion, and threat.
Specific Aim 2: To test whether the similarity structure of the neural representations of animate entities varies across brain areas. Specifically, to evaluate the relative contributions of anatomically distinct regions hypothesized to represent three types of information: inferior occipital and ventral temporal cortex (visual form), superior temporal sulcus (biological motion), and the amygdala and ventromedial prefrontal cortex (threat).
Specific Aim 3: To correlate neural similarity (calculated from various brain regions) with behavioral similarity judged by participants along various dimensions, and thereby to establish direct links between neural representation and psychological intuitions about similarity structure with respect to the three representational domains: visual form, biological motion, and threat.
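The standard way to link neural and behavioral similarity structure, as in representational similarity analysis, is to rank-correlate a neural dissimilarity matrix with a behavioral one. The sketch below uses simulated data throughout; the array sizes, region, and behavioral judgments are placeholders, not the grant's actual design.

```python
# Minimal RSA-style sketch with simulated data (all values hypothetical).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_stimuli, n_voxels = 12, 50

# Simulated voxel response patterns (stimuli x voxels) from one region.
neural_patterns = rng.standard_normal((n_stimuli, n_voxels))

# Simulated behavioral dissimilarity judgments (condensed upper triangle).
behavioral_rdm = rng.random(n_stimuli * (n_stimuli - 1) // 2)

# Neural representational dissimilarity: 1 - Pearson r between patterns.
neural_rdm = pdist(neural_patterns, metric="correlation")

# Rank correlation links the two similarity structures.
rho, p = spearmanr(neural_rdm, behavioral_rdm)
print(f"neural-behavioral Spearman rho = {rho:.3f}")
```

Spearman (rather than Pearson) correlation is typical here because it assumes only a monotonic, not linear, relation between the two dissimilarity structures.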
Specific Aim 4: To develop a method for calculating neural similarity that exploits the spatial information among voxels.

fMRI and behavioral data will be collected in three experiments tailored to meet these specific aims. Data analysis involves a combination of state-of-the-art techniques for image preprocessing, pattern classification, information-based brain mapping, multivariate measures of neural similarity, cluster analysis, and multidimensional scaling.

Relevance to public health: This research will shed light on a major pathway in the brain that supports the representation of animacy, a core component of the social brain. Understanding this system will provide a cornerstone for research in social neuroscience and for translational research relevant to a broad range of disorders in which impaired social cognition is a major feature. The growth of multivariate techniques in human brain imaging has greatly increased its power to reveal the structure of neural representation; in particular, the development of similarity-based approaches promises to provide a powerful tool for linking the structure of neural representation with models of cognitive representation.
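Of the analysis techniques listed above, multidimensional scaling (MDS) is the step that turns a neural dissimilarity matrix into an interpretable low-dimensional representational geometry. The following is a minimal sketch with simulated data; the stimulus count, voxel count, and two-dimensional embedding are illustrative assumptions, not the grant's actual parameters.

```python
# Hypothetical sketch: embedding a simulated neural dissimilarity matrix
# in two dimensions with metric MDS.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(1)
patterns = rng.standard_normal((12, 50))  # simulated stimuli x voxels

# Full symmetric representational dissimilarity matrix (1 - Pearson r).
rdm = squareform(pdist(patterns, metric="correlation"))

# MDS places stimuli in a plane so inter-point distances approximate
# the neural dissimilarities; clusters reveal representational structure.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(rdm)
```

Each row of `coords` is one stimulus; plotting these points is the usual way to visualize, for example, whether animate stimuli organize along a continuum.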
Van Uden, Cara E; Nastase, Samuel A; Connolly, Andrew C et al. (2018) Modeling Semantic Encoding in a Common Neural Representational Space. Front Neurosci 12:437
Nastase, Samuel A; Connolly, Andrew C; Oosterhof, Nikolaas N et al. (2017) Attention Selectively Reshapes the Geometry of Distributed Semantic Representation. Cereb Cortex 27:4277-4291
Connolly, Andrew C; Sha, Long; Guntupalli, J Swaroop et al. (2016) How the Human Brain Represents Perceived Dangerousness or "Predacity" of Animals. J Neurosci 36:5373-84
Sha, Long; Haxby, James V; Abdi, Herve et al. (2015) The animacy continuum in the human ventral vision pathway. J Cogn Neurosci 27:665-78
Raizada, Rajeev D S; Connolly, Andrew C (2012) What makes different people's representations alike: neural similarity space solves the problem of across-subject fMRI decoding. J Cogn Neurosci 24:868-77
Vollrath, Margarete E; Tonstad, Serena; Rothbart, Mary K et al. (2011) Infant temperament is associated with potentially obesogenic diet at 18 months. Int J Pediatr Obes 6:e408-14
Haxby, James V; Guntupalli, J Swaroop; Connolly, Andrew C et al. (2011) A common, high-dimensional model of the representational space in human ventral temporal cortex. Neuron 72:404-16