Humans interact with their environment in countless ways and can switch seamlessly between activities. Even for seemingly simple tasks, a variety of sensory inputs are integrated to create a motor plan. Take the example of picking up a glass. Visual, tactile, and proprioceptive inputs provide cues about the position and weight of the object as well as the state of the limb. Additional sensory and contextual inputs can also influence the movement. For example, a person might pick up a glass differently if she intends to take a drink versus clear a table. She may grasp a glass more carefully if it is very full, to avoid spilling it, which requires a change in the motor control scheme.

Our central hypothesis is that sensory and motor aspects of task context systematically modulate the neural representation of action as well as conscious sensory perception. We will manipulate aspects of task context to study sensorimotor processing during grasping and object exploration tasks. Intracortical recordings from human motor cortex will be used to study the neural representation of action under various manipulations of task goals, motor control schemes, and expected and actual sensory inputs. Sensory input will be provided both through intact perception (vision) and through intracortical microstimulation of somatosensory cortex to impart cutaneous sensations. Tasks involving object grasping and exploration will be performed with a brain-computer interface (BCI) and a robotic arm and hand, which allow us to precisely control the mapping between neural activity and movement as well as the somatosensory inputs. Previous work supports the idea that neural activity generated during attempted movements in people with spinal cord injury exhibits the same relationship to movement variables as it does in able-bodied individuals. Similarly, stimulation of somatosensory cortex in a person with chronic tetraplegia generates localized sensations that follow the expected spatial organization.
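To make the idea of a controllable mapping between neural activity and movement concrete, a common simplification (not the project's actual decoding pipeline) is a linear readout of population firing rates; the unit counts, weights, and data below are hypothetical, synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical recording: firing rates of 96 units over 50 time bins.
n_units, n_bins = 96, 50
firing_rates = rng.poisson(lam=5.0, size=(n_bins, n_units)).astype(float)

# Hypothetical linear decoder: a weight matrix mapping unit firing rates
# to 3-D effector velocity. In a real BCI these weights are calibrated,
# and the experimenter can manipulate this mapping directly.
weights = rng.normal(scale=0.01, size=(n_units, 3))
baseline = firing_rates.mean(axis=0)

# Decoded velocity in each time bin, from rate deviations around baseline.
velocity = (firing_rates - baseline) @ weights
print(velocity.shape)  # (50, 3)
```

Because the decoder is an explicit, experimenter-defined function, the motor control scheme can be altered at will, which is what allows task context to be manipulated independently of the participant's intent.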
Population-level analyses will be applied to the neuronal recordings to reveal the patterns of neural activity generated by each facet of context, giving a better understanding of how goals, motor control schemes, and expected and actual sensory inputs influence the neural representation of action. We will also study how vision and tactile sensation are integrated to drive conscious perception during object exploration; conscious perception will be documented through verbal report as well as psychophysical experiments. This project will address basic-science questions about the influence of context on sensorimotor processing. The findings of this study could broadly impact neurorehabilitation approaches, while also improving the generalizability and performance of BCIs for restoring upper limb function.
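As a hedged illustration of one common population-level approach (the proposal does not commit to a specific method), principal component analysis can summarize population firing rates in a low-dimensional space where context-dependent patterns may separate; all data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic trial-averaged firing rates: 200 time/condition samples across
# 96 units, standing in for activity recorded under different task contexts.
X = rng.normal(size=(200, 96))
X -= X.mean(axis=0)  # center each unit's rate

# PCA via SVD: the rows of Vt are directions of maximal variance across
# the neural population, ordered by variance explained.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
variance_explained = S**2 / np.sum(S**2)

# Project activity onto the top 3 components for visualization and for
# comparing trajectories across task contexts.
X_low = X @ Vt[:3].T
print(X_low.shape)  # (200, 3)
```

Comparing such low-dimensional trajectories across goals, control schemes, and sensory conditions is one way the contextual structure of the neural representation could be quantified.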
Human intracortical recording and stimulation technology enables the study of sensorimotor processing during complex and varied behavior. We will manipulate task context, specifically the task goal, motor control scheme, and expected and actual sensory feedback, during object manipulation tasks using an intracortical brain-computer interface that allows control over both sensory inputs and action. A greater understanding of sensorimotor processing could lead to new technology or therapies to assist people with motor or sensory deficits.