This is the first-year funding of a three-year continuing award, IRI-9210423. This project seeks to contribute to automatic systems capable of reasoning about a robotic environment from imperfect sensory data and of recovering from errors in robot motions caused by intrinsic and unavoidable system uncertainties. Specifically, the project studies how to acquire sufficiently accurate information about the spatial state of an object in the presence of sensing and modeling uncertainties, and it develops practical strategies for automatic assembly motions based on the spatial states of the parts involved. In particular, the project focuses on two problems under uncertainty: (1) the automatic recognition of unintended collisions/contacts between a part held by a manipulator and the environment, by combining position/orientation sensing, vision sensing, and force/moment sensing; and (2) the automatic generation of remedy motion plans, based on the obtained contact information, to lead the part from the unintended collision/contact to its intended goal. The project is expected to result in a prototype system capable of such automatic recognition of contacts, generation of contact-based plans, and proper execution of those plans, all in the presence of uncertainties.
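The contact-recognition idea described above, fusing force/moment readings with position sensing to flag an unintended contact, can be sketched roughly as follows. All names, thresholds, and sensor-reading formats here are illustrative assumptions for exposition, not details of the award's actual system:

```python
import math

# Illustrative thresholds (assumptions); real values would depend on the
# manipulator, the payload, and measured sensor noise.
FORCE_THRESHOLD = 2.0       # N
MOMENT_THRESHOLD = 0.5      # N·m
POSITION_TOLERANCE = 0.005  # m, allowed deviation from the commanded pose

def detect_unintended_contact(force, moment, commanded_pos, sensed_pos):
    """Flag a suspected unintended contact by fusing a force/moment reading
    with a position-discrepancy check -- a simplified stand-in for the
    combined force/moment, position/orientation, and vision sensing
    discussed in the abstract."""
    f_mag = math.sqrt(sum(c * c for c in force))
    m_mag = math.sqrt(sum(c * c for c in moment))
    pos_err = math.sqrt(sum((a - b) ** 2
                            for a, b in zip(commanded_pos, sensed_pos)))
    # A contact is suspected when the sensed wrench exceeds its noise
    # threshold while the held part has also deviated from the commanded pose.
    wrench_high = f_mag > FORCE_THRESHOLD or m_mag > MOMENT_THRESHOLD
    return wrench_high and pos_err > POSITION_TOLERANCE

# Example: a large reaction force together with a blocked/deflected motion
# suggests the held part has collided with the environment.
hit = detect_unintended_contact(
    force=(0.0, 0.0, -8.0), moment=(0.0, 0.1, 0.0),
    commanded_pos=(0.30, 0.10, 0.05), sensed_pos=(0.30, 0.10, 0.08),
)
```

In a real system the thresholding would be replaced by statistical tests against calibrated uncertainty models, and the contact hypothesis would then seed the remedy motion planner rather than simply returning a boolean.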