To use our hands to manipulate objects, the nervous system must coordinate and interpret a large amount of disparate sensory information about the position of our hands, the arrangement and positioning of the fingers, and what each is contacting. These computations are complicated by the fact that information from the two hands must be interpreted relative to their spatial positions. For example, the left and right index fingers occupy adjacent spaces when the hands are held together, but they can occupy vastly different spaces when the arms are spread apart. In these different situations, we can conceivably rely on our sense of vision to help interpret our sense of touch. Vision can provide information about where our hands are in space as well as about the objects we touch and manipulate. Despite the obvious importance of vision for interpreting touch information from the hands, the specific ways in which this visual influence occurs remain poorly understood. This research investigates the effects of simple and complex visual cues on touch information from the two hands in bimanual tasks. Understanding bimanual touch processing will facilitate effective nonverbal physical communication between people and intelligent machines, and it will support the development of highly dexterous robotic systems, advanced neuroprosthetics, and sensorimotor rehabilitation strategies. Establishing how vision influences our perception of our body in real and virtual environments has clear implications for society's use of immersive and interactive technologies. The project will also promote the participation and education of young women in neuroscience and related STEM fields.

This research program uses rigorous psychophysics, virtual reality technologies, and computational modeling to elucidate the effects of simple and complex visuospatial cues on the perception of bimanual touch. Behavioral experiments will be performed to quantify the effects of simple light flashes on tactile perception and learning. Neural network models will be fitted to the behavioral data to make inferences about how the nervous system mediates visuotactile and spatial interactions and how neural circuits may be modified through learning. The effects of complex visual cues on proprioception and bimanual touch will also be examined by performing psychophysics experiments in virtual reality. Specifically, visual proprioception cues and virtual object information will be altered in a virtual reality design, allowing the investigator to test hypotheses regarding multisensory integration for proprioception and causal inference processing. Collectively, the research program will yield novel insight into how visual information modulates the perception of tactile cues experienced over the two hands.
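As an illustration of the kind of computational framework the causal-inference portion of the modeling draws on, the sketch below implements a standard Bayesian causal-inference account of visual-proprioceptive hand localization (in the spirit of Kording et al., 2007). It is a minimal sketch under assumed Gaussian noise: the parameters sigma_v, sigma_p, sigma_0, and p_common are illustrative placeholders, not values from this project, and the project's actual models will be fitted to the behavioral data.

import numpy as np

# Minimal sketch of Bayesian causal inference for visual-proprioceptive
# hand localization. All parameter values are illustrative placeholders.
sigma_v = 1.0      # std. dev. of visual noise about hand position
sigma_p = 3.0      # std. dev. of proprioceptive noise
sigma_0 = 10.0     # std. dev. of the spatial prior (centered at 0)
p_common = 0.5     # prior probability that the two cues share one cause


def posterior_common(x_v, x_p):
    """Posterior probability that visual and proprioceptive samples
    x_v, x_p arose from a single hand position (common cause)."""
    var_v, var_p, var_0 = sigma_v**2, sigma_p**2, sigma_0**2
    # Likelihood of the pair under a common cause (position integrated out).
    denom1 = var_v * var_p + var_v * var_0 + var_p * var_0
    like_c1 = np.exp(-0.5 * ((x_v - x_p)**2 * var_0
                             + x_v**2 * var_p + x_p**2 * var_v) / denom1) \
        / (2 * np.pi * np.sqrt(denom1))
    # Likelihood under independent causes (two separate positions).
    like_c2 = np.exp(-0.5 * (x_v**2 / (var_v + var_0)
                             + x_p**2 / (var_p + var_0))) \
        / (2 * np.pi * np.sqrt((var_v + var_0) * (var_p + var_0)))
    return p_common * like_c1 / (p_common * like_c1 + (1 - p_common) * like_c2)


def estimate_hand_position(x_v, x_p):
    """Model-averaged estimate of where the hand feels to be."""
    var_v, var_p, var_0 = sigma_v**2, sigma_p**2, sigma_0**2
    # Reliability-weighted fusion of vision, proprioception, and the prior.
    w_v, w_p, w_0 = 1 / var_v, 1 / var_p, 1 / var_0
    s_fused = (w_v * x_v + w_p * x_p) / (w_v + w_p + w_0)
    # Proprioception combined with the prior alone (independent causes).
    s_alone = (w_p * x_p) / (w_p + w_0)
    p_c1 = posterior_common(x_v, x_p)
    return p_c1 * s_fused + (1 - p_c1) * s_alone


if __name__ == "__main__":
    # Small visual-proprioceptive conflicts pull the felt hand toward vision;
    # large conflicts are attributed to separate causes and largely ignored.
    for visual_offset in (1.0, 5.0, 15.0):
        print(visual_offset, round(estimate_hand_position(visual_offset, 0.0), 2))

In this kind of framework, fitting the noise terms and the common-cause prior to psychophysical judgments is one way to quantify how strongly visual cues are bound to proprioceptive and tactile signals under the different viewing conditions.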

Co-funded by the M3X (Engineering) and PAC (Social, Behavioral, and Economic Sciences) Programs

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Budget Start: 2020-08-01
Budget End: 2023-07-31
Fiscal Year: 2020
Total Cost: $379,229
Institution: Baylor College of Medicine
City: Houston
State: TX
Country: United States
Zip Code: 77030