While there have been rapid advances in medical imaging devices to guide surgery, relatively little attention has been paid to the basic perceptual-motor and cognitive processes that mediate between the image and the surgical intervention. Achieving such an understanding, and applying it to develop instruments and simulators that improve patient safety and facilitate training, are the long-term goals of my research. Under ongoing NIBIB funding, my current psychophysical research on the guidance of surgical procedures by ultrasound images has shown how visualization techniques influence the localization of targets and direct the surgeon's actions. The K99 will allow me to transition to a program that studies multimodal perceptually guided action by incorporating haptics (active touch) in addition to vision. Under this support I will develop technical skills in simulating soft tissue with both force feedback, as realized in a state-of-the-art haptic device, and a virtual ultrasound display. My enhanced research capabilities will be developed under the supervision of Drs. Roberta Klatzky, Ralph Hollis, and George Stetten (mentors) and Kenji Shimada (consultant).
My specific aims are: (1) to build a testbed for simulating the behavior of virtual materials that have both elastic and viscous properties, using both haptic and visual interfaces; and (2) to use the simulations to determine how viscoelastic properties are perceived from haptic and visual cues, alone and in combination. The haptic simulation will be based on tensor-mass models, which have been found highly effective in simulating viscoelastic behaviors. The visual simulation will show the visible effects of internal tissue deformation. Experiments will be conducted to (1) determine how force/torque cues arise from exploratory interactions and give rise to the perception of viscoelasticity; (2) test the hypothesis that this percept is also derived visually, from the spatiotemporal pattern of deformation in ultrasound-like displays; and (3) measure how haptic and visual cues are combined and weighted in a multimodal context to co-determine perceived viscoelasticity. The results of my research will lead to an understanding of how physicians perceive and interact with soft tissue using their sense of touch together with visual information from medical imaging. The research can foster the development of technologies for surgical training and assist the surgeon toward effective interaction.
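To make the viscoelastic behavior concrete: tensor-mass models couple nodal masses through elements that combine elastic and viscous responses, and the simplest such element is a Kelvin-Voigt unit (spring and damper in parallel). The sketch below, a one-dimensional lumped illustration rather than the project's actual 3D tensor-mass implementation, shows the characteristic force signature under a ramp-and-hold indentation: force overshoots while the probe is moving (viscous contribution) and then relaxes to the purely elastic level once motion stops. All parameter values are illustrative assumptions.

```python
def kelvin_voigt_force(x, v, k=200.0, c=5.0):
    """Reaction force (N) of a Kelvin-Voigt element: spring stiffness k (N/m)
    in parallel with damping c (N*s/m), for displacement x (m), velocity v (m/s).
    Values are illustrative, not drawn from the project."""
    return k * x + c * v

def indentation_forces(depth=0.01, ramp=0.1, hold=0.9, dt=1e-3):
    """Prescribed ramp-and-hold indentation: press to `depth` over `ramp`
    seconds, then hold for `hold` seconds. Returns the force time series."""
    forces = []
    x_prev = 0.0
    n = int((ramp + hold) / dt)
    for i in range(n):
        t = i * dt
        x = depth * min(t / ramp, 1.0)   # linear ramp, then constant hold
        v = (x - x_prev) / dt            # finite-difference velocity
        x_prev = x
        forces.append(kelvin_voigt_force(x, v))
    return forces
```

With these numbers the force peaks near 2.5 N at the end of the ramp (2.0 N elastic + 0.5 N viscous, since v = 0.1 m/s) and settles to 2.0 N during the hold; it is precisely this relaxation pattern, in force cues and in the spatiotemporal deformation visible in ultrasound-like displays, that distinguishes viscoelastic from purely elastic material.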

Agency
National Institutes of Health (NIH)
Institute
National Institute of Biomedical Imaging and Bioengineering (NIBIB)
Type
Career Transition Award (K99)
Project #
1K99EB008710-01A1
Application #
7661947
Study Section
Special Emphasis Panel (ZEB1-OSR-C (J1))
Program Officer
Peng, Grace
Project Start
2009-07-01
Project End
2011-06-30
Budget Start
2009-07-01
Budget End
2010-06-30
Support Year
1
Fiscal Year
2009
Total Cost
$84,186
Indirect Cost
Name
Carnegie-Mellon University
Department
Psychology
Type
Schools of Arts and Sciences
DUNS #
052184116
City
Pittsburgh
State
PA
Country
United States
Zip Code
15213
Wu, Bing; Klatzky, Roberta L; Stetten, George D (2012) Mental visualization of objects from cross-sectional images. Cognition 123:33-49
Wu, Bing; Klatzky, Roberta L; Hollis, Ralph L (2011) Force, Torque and Stiffness: Interactions in Perceptual Discrimination. IEEE Trans Haptics PP:1