This project develops a novel extension to a computational theory of visual motion perception. The overall goal of the theory is to understand how humans perceive motion in their natural environment; in other words, to understand what goes on inside a person's brain when he or she sees birds flying, snowflakes falling, or other complex patterns of motion that occur in the natural visual world. Building on recent work modeling the appearance of a limited set of motion flow patterns, the present project explores a probabilistic approach, based on Bayesian Ideal Observers, to the representation, learning, and modeling of natural visual motion, and the use of the learned probabilistic models, in turn, to synthesize pseudo-realistic stimuli. Pseudo-realistic stimuli are a novel class of visual stimuli that have the appearance of natural visual stimuli but can be quantified and varied in a precisely controlled manner. Stimuli of this type have never been used before and offer the exciting prospect of experimentally probing the behavior of visual systems under realistic yet precisely controlled conditions. It is anticipated that understanding how the human visual system processes motion will enable the development of more robust and powerful computer vision algorithms with many technological applications.
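To make the learn-then-synthesize idea concrete, the following is a minimal sketch, not the project's actual model: it fits a simple multivariate Gaussian (by maximum likelihood) to a set of simulated 2-D motion vectors standing in for a natural flow field, then samples the learned model to generate a statistically matched but fully controlled "pseudo-realistic" flow field. All variable names, the choice of a Gaussian, and the drift/noise parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "natural" flow field: 2-D motion vectors at N image
# locations, simulated here as a dominant drift plus noise.
N = 500
true_drift = np.array([1.0, 0.2])              # assumed mean motion
observed = rng.normal(true_drift, [0.3, 0.1],  # assumed noise scales
                      size=(N, 2))

# Learn a simple probabilistic model of the flow: a multivariate
# Gaussian fit by maximum likelihood (sample mean and covariance).
mu = observed.mean(axis=0)
cov = np.cov(observed, rowvar=False)

# Synthesize a pseudo-realistic flow field by sampling the learned
# model: same first- and second-order statistics as the data, but
# every parameter is known and can be varied experimentally.
synthetic = rng.multivariate_normal(mu, cov, size=N)
print(synthetic.shape)
```

The point of the sketch is the workflow, not the Gaussian: a richer learned model of natural motion could be substituted while keeping the same learn-from-data, sample-for-experiments loop.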