This research adopts a Bayesian formulation of optimization and search problems that is distinguished by very lax assumptions. The optimality criterion for algorithms is minimization of the dispersion of the conditional probabilities given the observations. The research addresses the theoretical issues of consistency of optimal myopic algorithms, characterizations of algorithms, and the development of efficient algorithms that are insensitive to the probabilistic assumptions.

Problems of optimization and search with partial information abound. A common situation is to start with minimal information about the function to be optimized or the object to be found and then to make observations that improve that information. Choosing observations efficiently is a difficult and important problem, since some choices may be far more informative than others. Although problems fitting this description are commonplace, there is very little theory to guide the planning of sensible strategies, because the strong assumptions required by most optimization or search methods typically cannot be justified.
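As an illustrative sketch only (not taken from the research itself), the myopic criterion above can be made concrete in a toy discrete search model: a target sits in one of a few cells, a prior distribution encodes our partial information, and each observation of a cell reports a detection with some probability. The detection probability `q`, false-alarm probability `f`, the cell prior, and the use of Shannon entropy as the dispersion measure are all assumptions introduced here for illustration. A myopic algorithm then chooses the observation that minimizes the expected dispersion of the posterior:

```python
import math

def entropy(p):
    """Shannon entropy, used here as one possible measure of the
    dispersion of a discrete conditional (posterior) distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

def update(prior, cell, detected, q=0.8, f=0.1):
    """Bayes-update the prior on the target's location after observing
    `cell`. Assumed sensor model: detection probability q if the target
    is in the observed cell, false-alarm probability f otherwise."""
    lik = [(q if j == cell else f) if detected
           else ((1 - q) if j == cell else (1 - f))
           for j in range(len(prior))]
    joint = [l * p for l, p in zip(lik, prior)]
    z = sum(joint)
    return [x / z for x in joint]

def myopic_choice(prior, q=0.8, f=0.1):
    """Myopic (one-step-lookahead) rule: pick the observation whose
    expected posterior entropy is smallest."""
    best_score, best_cell = float("inf"), None
    for i in range(len(prior)):
        # Predictive probability of a detection when observing cell i.
        p_det = sum((q if j == i else f) * prior[j]
                    for j in range(len(prior)))
        # Expected dispersion, averaged over the two possible outcomes.
        exp_h = (p_det * entropy(update(prior, i, True, q, f))
                 + (1 - p_det) * entropy(update(prior, i, False, q, f)))
        if exp_h < best_score:
            best_score, best_cell = exp_h, i
    return best_cell

prior = [0.5, 0.3, 0.2]
print(myopic_choice(prior))  # cell chosen for the first observation
```

In this example the rule inspects the most probable cell first, but with other priors or sensor parameters the most informative observation need not be the most probable location, which is exactly why the choice of observations matters.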