Problems with only partial or contaminated information arise in many disciplines: computer science, physics and chemistry, control theory, statistics, prediction and estimation, scientific and engineering computation, geophysics, decision theory, and finance. Furthermore, this information is often expensive to obtain. The goal of information-based complexity is to create a computational complexity theory for problems with partial, contaminated, and priced information, and to apply the results to solving specific problems in varied disciplines. The prime foci of this project are the following:

(1) Breaking Intractability: Many problems in science, engineering, and even finance deal with functions of a large number, d, of variables. Typical multivariate problems, however, are intractable in the worst-case deterministic setting. The only way to break intractability is to weaken the assurance by switching to another setting; the randomized, average case, and probabilistic settings continue to be studied (the Monte Carlo sketch below illustrates how randomization sidesteps worst-case intractability for integration). In particular, it has been proven that multivariate integration and approximation are both strongly tractable on the average; a standard formalization of strong tractability is recalled below. The proof is non-constructive; the problem of how to sample in d dimensions to achieve strong tractability is being addressed.

(2) Software Development and Testing: Theoretical investigations have led to algorithms for important problems such as high-dimensional integration. To make these methods widely available, software is being developed and tested. Since these methods contain a large amount of inherent parallelism, the implementation is being done on a workstation network; testing is also planned on parallel computers. Real-world problems are used for testing.

(3) Probabilistic Complexity of Piecewise Smooth Functions: Piecewise smooth functions arise in diverse areas of science and technology such as computer vision, image and signal processing, scientific computation, meteorology, seismology, and tomography. Because of negative results in the worst-case setting, these problems are being studied in the probabilistic setting. New algorithms for integration, approximation, and detection of singular points are being studied and implemented; a toy illustration of singularity detection appears below.
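
For context, "strong tractability" has a standard meaning in information-based complexity. The following formalization is one common way to state it; the notation n(ε, d), C, p, q is ours, not the abstract's.

```latex
% n(\varepsilon, d): minimal number of information operations
% (e.g., function evaluations) needed to compute an
% \varepsilon-accurate solution of a d-variate problem in the
% chosen setting (worst case, average case, randomized, ...).
\[
  \text{tractable:}\qquad n(\varepsilon, d) \;\le\; C\,\varepsilon^{-p}\, d^{\,q},
\]
\[
  \text{strongly tractable:}\qquad n(\varepsilon, d) \;\le\; C\,\varepsilon^{-p},
\]
% with C, p, q > 0 independent of \varepsilon and d; in the strong
% case the cost bound does not depend on the dimension d at all.
```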
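The following minimal sketch (not from the project; the function name mc_integrate and the test integrand are ours) shows why the randomized setting can break worst-case intractability for integration: the Monte Carlo error decays like n^{-1/2} with a constant that does not depend on the dimension d.

```python
import numpy as np

def mc_integrate(f, d, n, seed=0):
    """Plain Monte Carlo estimate of the integral of f over [0, 1]^d.

    The root-mean-square error decays like O(n**-0.5) with a constant
    depending on Var(f) but not on d, which is the sense in which
    randomization sidesteps worst-case intractability.
    """
    rng = np.random.default_rng(seed)
    x = rng.random((n, d))        # n points drawn uniformly from [0, 1]^d
    return float(np.mean(f(x)))   # sample mean estimates the integral

if __name__ == "__main__":
    # Toy integrand: the mean of sin(pi * x_j) over coordinates;
    # its exact integral is 2/pi ~ 0.6366 for every dimension d.
    f = lambda x: np.mean(np.sin(np.pi * x), axis=1)
    print(mc_integrate(f, d=100, n=100_000))
```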
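As a toy illustration of what "detection of singular points" of a piecewise smooth function can mean, the sketch below flags jump discontinuities by thresholding divided differences on a grid. It is a heuristic of our own for illustration, not one of the project's algorithms; detect_jumps and tau are hypothetical names.

```python
import numpy as np

def detect_jumps(f, a, b, n, tau):
    """Toy detector for jump-type singular points of f on [a, b].

    Scans first-order divided differences on a uniform grid of n cells
    and flags cells whose difference quotient exceeds the threshold tau;
    near a jump the quotient grows like (jump size) / (cell width),
    while on smooth pieces it stays near |f'|.
    """
    x = np.linspace(a, b, n + 1)
    fx = f(x)
    quotients = np.abs(np.diff(fx)) / np.diff(x)
    return x[:-1][quotients > tau]   # left endpoints of flagged cells

if __name__ == "__main__":
    # Piecewise smooth test function with a unit jump at x = 0.3.
    f = lambda x: np.sin(x) + (x >= 0.3)
    print(detect_jumps(f, 0.0, 1.0, n=1000, tau=50.0))
```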