Recently there has been a great deal of work on formal models of machine learning, such as the probably approximately correct (PAC) learning model. This model is a precise framework that attempts to capture the notion of learning from examples. Recent progress in machine learning and statistical inference on paradigms such as the PAC model has produced fundamental results on the amount of data needed for function approximation in a completely nonparametric setting. The applicability of these paradigms is limited, however, by their assumptions on the data-gathering mechanisms and the performance criteria. Some of these assumptions will be relaxed so that the extended learning paradigm can be applied to areas such as signal/image processing and geometric reconstruction. The approach is to place mild assumptions on the function classes while allowing more flexibility in the sampling and error criteria. Specifically, the proposed extensions allow deterministic sampling strategies, sampling over noncompact domains, and learning with respect to general performance criteria. The extended model will be applied to a variety of problems in signal processing and geometric reconstruction to provide information-complexity results for some classical and new reconstruction/estimation problems. In the area of signal processing, the framework will be applied to problems in tomographic image reconstruction, multiresolution signal processing, and classical sampling theorems. In the area of geometric reconstruction, it will be applied to stochastic geometry and shape-from-probing problems. The approach will yield results on the fundamental capabilities and limitations of reconstruction, as well as sample-size bounds for these applications.
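For context, the PAC model quantifies "the amount of data needed" through sample-complexity bounds. The following is the standard textbook bound, not a result of this project; here $H$ is a finite hypothesis class, $\epsilon$ the accuracy parameter, and $\delta$ the confidence parameter:

```latex
% Classical PAC sample-complexity bound (finite, realizable
% hypothesis class H): with probability at least 1 - \delta,
% any hypothesis consistent with the sample has error at most
% \epsilon once the sample size m satisfies
m \;\ge\; \frac{1}{\epsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right).
% For an infinite class of VC dimension d, the analogous bound is
m \;=\; O\!\left(\frac{1}{\epsilon}\left(d\,\ln\frac{1}{\epsilon} + \ln\frac{1}{\delta}\right)\right).
```

Bounds of this form assume i.i.d. sampling from a fixed distribution over a compact domain; it is precisely these assumptions that the proposed extensions (deterministic sampling, noncompact domains, general performance criteria) relax.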