This project will advance understanding of the mechanics and controls that will enable automated tube thoracostomy, a simple definitive treatment known to be effective in relieving tension pneumothorax. In this procedure, a tube is manually inserted into the chest cavity by a trained physician. Effective placement of the tube limits air pressure in the chest, preventing the buildup of life-threatening pressure on major organs. Development of an automated tube thoracostomy system would enable field deployment for civilian and military applications. However, automating the procedure requires advancing our understanding of how to estimate and control the forces induced during tube thoracostomy.

This project will develop an understanding of the forces induced by rigid and flexible chest tubes interacting with the chest wall environment during automated insertion procedures. This understanding is key to advancing technologies that will enable autonomous insertion by an integrated, self-contained system that incorporates visualization (by ultrasound) for process monitoring and mechanical insertion by a controlled robotic system. Methods for feature extraction from ultrasound images will enable guiding and placing rigid and flexible tubes safely into the chest cavity using visual servoing techniques. A control system that coordinates visualization and actuation to maximize the safety and effectiveness of the procedure will be developed.
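To illustrate the kind of visual servoing computation involved, the following is a minimal sketch of the classic image-based visual servoing (IBVS) law, in which extracted image features drive the robot's velocity command. The function names, feature parameterization (normalized image points with known depths), and gain value are illustrative assumptions, not details of the proposed system.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix for one normalized image
    point (x, y) at depth Z, relating the feature's image velocity to
    the 6-DOF camera velocity [vx, vy, vz, wx, wy, wz]."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(features, depths, error, gain=0.5):
    """Classic IBVS control law v = -gain * pinv(L) @ e, where
    e = s - s* is the stacked feature error and L stacks the
    per-feature interaction matrices."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    return -gain * np.linalg.pinv(L) @ error
```

Iterating this law (updating the feature error as the camera moves) drives the observed features toward their desired image positions, which is the mechanism by which ultrasound-derived features could guide tube placement.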

This project is part of an ongoing collaboration with trauma surgeons at the University of Texas Health Science Center (Houston) who are working to advance field-deployable systems for emergency medicine. These systems would be applicable to both civilian and military environments, and would require no special training on the part of the operator/medic. It is expected that insight gained in this project will impact development of other devices/systems that can accomplish related medical procedures of higher complexity.

Solving the problems posed here showcases how mechatronics and controls engineering can build bridges to the field of medicine. To advance this cause, a demonstrative teaching module will be developed to illustrate how low-cost vision and software/control solutions can be used by K-12 and undergraduate students to accomplish robotic tasks, particularly to encourage hands-on creativity and experimentation. This module will be used in undergraduate teaching and disseminated through visits with students, at meetings of professional societies, and via an online teaching website.

Project Start:
Project End:
Budget Start: 2007-09-01
Budget End: 2010-08-31
Support Year:
Fiscal Year: 2007
Total Cost: $139,767
Indirect Cost:
Name: University of Texas Austin
Department:
Type:
DUNS #:
City: Austin
State: TX
Country: United States
Zip Code: 78712