The overall goal is to determine how sensory information that evolves over time and space is integrated to inform perception and to control motor behavior. There are two principal aims.
The first aim is to develop a comprehensive model of how somatosensory information is utilized in haptic sensing.
The second aim is to determine how visually sensed motion is analyzed to predict a target's trajectory, in order to control interception.

Haptic sensing will be assessed using robotic arms that create virtual contours by generating elastic force fields. Subjects will explore these contours with movements involving the proximal arm and/or the fingers. Some experiments will use standard psychophysical procedures (two-alternative forced choice); in other experiments, subjects will report what they sensed by adjusting a visual display. The information available during limb movements in this task evolves serially over time and must be stored in working memory. The perception of complex shapes may be generated from simpler primitives, such as straight lines or elliptical arcs. Positional (proprioceptive) and force (tactile) cues are known to be important in this task, and the model derived from the planned experiments will incorporate all of these factors.

Interception tasks require hand-eye coordination and extrapolation of a target's trajectory. The planned experiments will involve quasi-random target motions in two dimensions and measurements of hand and eye movements. Target motion will be displayed on a touch-sensitive monitor, and subjects will quickly move their index finger along the surface of the monitor, starting from a stationary initial location, to intercept the target. The experiments will define the strategies used to intercept the target, the information extracted from target motion, and the influence of online corrections to the hand's trajectory. Neurologically normal subjects will participate in this study. The results will provide benchmarks for quantitatively assessing deficits in patient populations and for evaluating the effectiveness of rehabilitative strategies.
Furthermore, because haptic and visual information both inform perception and are used in motor control, the results will define the sensory information available for controlling limb trajectories. Sensory feedback would clearly enhance the efficacy of brain-machine interfaces, and these results will provide new information about the normal use of sensory inflow in the control of movement.