This research program is focused on basic theoretical aspects of visual motion. As an ego-moving observer (e.g., a camera) travels through a scene, a large amount of visual information is captured. The aim of this project is to extract the relevant visual information from a sequence of images and to use it in feedback control loops: that is, to characterize the changing visual data with only a few variables that are essential to the dynamic response of the observer relative to the scene, and to use these variables to control the observer's motion. The vision-based algorithms will be integrated into a real-time closed-loop control system that has been developed at FAU. In addition, the Robot Systems Division at the National Institute of Standards and Technology (NIST) has agreed to provide FAU with all the available hardware and software necessary for the successful completion of this project. Based on previous and ongoing research, there is reason to believe that a new and promising direction for visual motion, with a sound theoretical basis, is emerging.
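The core idea above, reducing each image to a few control-relevant variables and feeding them back to a motion command, can be sketched in a minimal form. The following Python example is purely illustrative and is not the project's actual algorithm: it assumes the visual variables are the intensity-weighted centroid of a grayscale image and uses a simple proportional law to steer the observer toward a desired image location; the function names and gain are hypothetical.

```python
def image_centroid(image):
    """Reduce a 2-D grayscale image (list of rows) to two variables:
    the intensity-weighted centroid (row, col), or None if the image is dark."""
    total = sum_r = sum_c = 0.0
    for r, row in enumerate(image):
        for c, val in enumerate(row):
            total += val
            sum_r += r * val
            sum_c += c * val
    if total == 0:
        return None
    return (sum_r / total, sum_c / total)

def control_step(image, target, gain=0.5):
    """One cycle of the visual feedback loop: measure the few visual
    variables, compare with the target, and emit a motion command."""
    measured = image_centroid(image)
    if measured is None:
        return (0.0, 0.0)  # no visual signal: hold position
    # Proportional command driven by the image-space error.
    return (gain * (target[0] - measured[0]),
            gain * (target[1] - measured[1]))

# Example: a bright spot at (1, 2) in a 4x4 frame; steer it toward
# the image center (1.5, 1.5).
frame = [[0, 0, 0, 0],
         [0, 0, 1, 0],
         [0, 0, 0, 0],
         [0, 0, 0, 0]]
vy, vx = control_step(frame, target=(1.5, 1.5))
```

In a real-time loop this step would run once per captured frame, with the command sent to the observer's actuators; the point of the sketch is only that the entire image is collapsed to two numbers before control is applied.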