The overall goal is to determine how sensory information that evolves over time and space is integrated to inform perception and to control motor behavior. There are two principal aims.
The first aim is to develop a comprehensive model of how somatosensory information is used in haptic sensing. The second aim is to determine how visually sensed motion is analyzed to predict a target's trajectory in order to control interception.

Haptic sensing will be assessed using robotic arms that create virtual contours by generating elastic force fields. Subjects will explore these contours with movements involving the proximal arm and/or the fingers. Some experiments will use standard psychophysical procedures (two-alternative forced choice); in others, subjects will report what they sensed by adjusting a visual display. The information available during limb movements in this task evolves serially over time and must be stored in working memory. The perception of complex shapes may be built up from simpler primitives, such as straight lines or elliptical arcs. Positional (proprioceptive) and force (tactile) cues are known to be important in this task, and when vision is available, these cues are combined with visual information as well. The model derived from the planned experiments will incorporate all of these factors.

Interception tasks require hand-eye coordination and extrapolation of a target's trajectory. The planned experiments will involve quasi-random target motions in two dimensions and measurements of hand and eye movements. Target motion will be displayed on a touch-sensitive monitor, and subjects will quickly move their index finger along the surface of the monitor, starting from a stationary initial location, to intercept the target. The experiments will define the strategies used to intercept the target, the information extracted from target motion, and the influence of online corrections of the hand's trajectory.

Neurologically normal subjects will participate in this study. The results will provide benchmarks for quantitatively assessing deficits in patient populations and for assessing the effectiveness of rehabilitative strategies. Furthermore, since haptic and visual information informs perception and is used in motor control, the results will define the sensory information that is available for controlling limb trajectories. Providing such sensory feedback would clearly enhance the efficacy of brain-machine interfaces, and these results will provide new information about the normal use of sensory inflow in the control of movements.
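As a point of reference for the haptic modeling aim, the standard reliability-weighted (maximum-likelihood) rule is often used as a starting point when proprioceptive, tactile, and visual estimates are combined. The sketch below is a minimal illustration of that generic rule, not the model to be derived from these experiments; the cue values and noise variances are assumptions for illustration only.

```python
import numpy as np

def combine_cues(estimates, variances):
    """Reliability-weighted (maximum-likelihood) combination of independent cues.

    Each cue's weight is inversely proportional to its noise variance,
    and the combined estimate has lower variance than any single cue.
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = (1.0 / variances) / np.sum(1.0 / variances)
    combined = np.sum(weights * estimates)
    combined_variance = 1.0 / np.sum(1.0 / variances)
    return combined, combined_variance

# Hypothetical example: proprioceptive, tactile, and visual estimates of
# contour orientation (degrees) with assumed noise variances.
orientation, var = combine_cues([32.0, 28.0, 30.0], [16.0, 9.0, 4.0])
print(orientation, var)
```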
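For the interception aim, a minimal illustration of trajectory extrapolation is a constant-velocity prediction of target position from its recent motion. The sketch below is only an assumed baseline model (the function name, sample window, and prediction interval are illustrative choices); the planned experiments are designed to determine what information subjects actually extract from target motion.

```python
import numpy as np

def predict_interception(positions, times, t_future):
    """Extrapolate a 2-D target trajectory with a constant-velocity model.

    A least-squares line fit to recent samples gives the current velocity;
    the predicted position is projected forward by t_future seconds.
    """
    positions = np.asarray(positions, dtype=float)  # shape (n, 2): x, y on the screen
    times = np.asarray(times, dtype=float)
    coeffs = np.polyfit(times, positions, deg=1)    # rows: slope and intercept per axis
    t_pred = times[-1] + t_future
    return coeffs[0] * t_pred + coeffs[1]

# Hypothetical example: five samples of quasi-random target motion at ~60 Hz,
# extrapolated 300 ms ahead to plan a finger movement to the touch screen.
t = np.arange(5) / 60.0
xy = np.column_stack([0.20 * t + 0.01, 0.05 * t - 0.02])
print(predict_interception(xy, t, 0.3))
```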

Public Health Relevance

The overall goal is to determine how sensory information that evolves over time and space is integrated to inform perception and to control motor behavior. The results will provide benchmarks for quantitatively assessing deficits in patient populations and for assessing the effectiveness of rehabilitative strategies, and they will define the sensory information that is available for controlling limb trajectories. Providing such sensory feedback would clearly enhance the efficacy of brain-machine interfaces.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Neurological Disorders and Stroke (NINDS)
Type
Research Project (R01)
Project #
5R01NS015018-29
Application #
8091221
Study Section
Motor Function, Speech and Rehabilitation Study Section (MFSR)
Program Officer
Chen, Daofen
Project Start
1979-01-01
Project End
2013-06-30
Budget Start
2011-07-01
Budget End
2013-06-30
Support Year
29
Fiscal Year
2011
Total Cost
$258,965
Indirect Cost
Name
University of Minnesota Twin Cities
Department
Neurosciences
Type
Schools of Medicine
DUNS #
555917996
City
Minneapolis
State
MN
Country
United States
Zip Code
55455
Tramper, Julian J; Flanders, Martha (2013) Predictive mechanisms in the control of contour following. Exp Brain Res 227:535-46
Mrotek, Leigh A (2013) Following and intercepting scribbles: interactions between eye and hand control. Exp Brain Res 227:161-74
Furuya, Shinichi; Soechting, John F (2012) Speed invariance of independent control of finger movements in pianists. J Neurophysiol 108:2060-8
Furuya, Shinichi; Flanders, Martha; Soechting, John F (2011) Hand kinematics of piano playing. J Neurophysiol 106:2849-64
Soechting, John F; Flanders, Martha (2011) Multiple Factors Underlying Haptic Perception of Length and Orientation. IEEE Trans Haptics :263-272
Winges, Sara A; Soechting, John F (2011) Spatial and temporal aspects of cognitive influences on smooth pursuit. Exp Brain Res 211:27-36
Winges, Sara A; Eonta, Stephanie E; Soechting, John F (2010) Does temporal asynchrony affect multimodal curvature detection? Exp Brain Res 203:1-9
Furuya, Shinichi; Soechting, John F (2010) Role of auditory feedback in the control of successive keystrokes during piano playing. Exp Brain Res 204:223-37
Soechting, John F; Rao, Hrishikesh M; Juveli, John Z (2010) Incorporating prediction in models for two-dimensional smooth pursuit. PLoS One 5:e12574
Soechting, John F; Juveli, John Z; Rao, Hrishikesh M (2009) Models for the extrapolation of target motion for manual interception. J Neurophysiol 102:1491-502

Showing the most recent 10 out of 33 publications