An important recent advance in prediction is the discovery that generating multiple versions of a predictor and then combining them can lead to dramatic reductions in error rates. Examples are bagging (Breiman) and AdaBoost (Freund and Schapire). The mechanism behind these reductions is not yet understood, and uncovering it is a central problem in this research. Work by Shang and Breiman shows that it is possible to grow binary trees with considerably lower error rates than trees grown the standard way, e.g., by CART or C4.5. The method uses the training set to estimate the input-output distribution and then grows the tree from this estimated distribution; further research is needed to make it robust and efficient. Currently, complex medical studies are analyzed using a class of simple models that depend on estimating the parameters of a single linear combination of the input variables. A further research direction is the adaptation of the more general predictive methods developed in machine learning to the analysis of medical data. This research will produce improved methods for combining many predictors, for growing a single binary tree, and for the analysis of medical outcomes.
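To make the first idea concrete, here is a minimal sketch of bagging: train each predictor on a bootstrap resample of the training set and combine by majority vote. The base predictor, dataset, and ensemble size (scikit-learn's DecisionTreeClassifier, a synthetic classification problem, 50 trees) are assumptions chosen for illustration, not details from the work cited above.

```python
# A minimal bagging sketch, assuming scikit-learn trees and synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

n_predictors = 50  # assumed ensemble size
trees = []
for _ in range(n_predictors):
    # Draw a bootstrap sample: n points sampled with replacement.
    idx = rng.integers(0, len(X_train), size=len(X_train))
    trees.append(DecisionTreeClassifier().fit(X_train[idx], y_train[idx]))

# Combine the predictors by majority vote over their predictions.
votes = np.stack([t.predict(X_test) for t in trees])
bagged_pred = (votes.mean(axis=0) > 0.5).astype(int)

single_pred = DecisionTreeClassifier(random_state=0).fit(X_train, y_train).predict(X_test)
print("single tree error:", np.mean(single_pred != y_test))
print("bagged error:     ", np.mean(bagged_pred != y_test))
```

On problems where a single tree is unstable, the bagged vote typically shows the error reduction the abstract refers to; why voting helps as much as it does is exactly the open mechanism question.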
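The tree-growing direction can be illustrated with one possible reading of "estimate the input-output distribution, then grow the tree from it": smooth the empirical distribution with a small Gaussian kernel on the inputs and grow a standard tree on a large sample drawn from that smoothed estimate. This is a hypothetical sketch, not the published Shang-Breiman procedure; the function name, sample size, and bandwidth h are all assumptions.

```python
# Hypothetical sketch: grow a tree from a kernel-smoothed estimate of the
# joint input-output distribution (NOT the Shang-Breiman algorithm itself).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def grow_tree_from_smoothed_distribution(X, y, n_synthetic=10000, h=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # Resample (x, y) pairs with replacement, then jitter the inputs: this
    # amounts to drawing from a kernel-smoothed estimate of the distribution.
    idx = rng.integers(0, len(X), size=n_synthetic)
    noise = h * X.std(axis=0) * rng.standard_normal((n_synthetic, X.shape[1]))
    X_syn = X[idx] + noise
    y_syn = y[idx]
    # Grow a standard (CART-style) tree on the large synthetic sample.
    return DecisionTreeClassifier().fit(X_syn, y_syn)
```

The robustness and efficiency questions raised above correspond, in this sketch, to choosing the smoothing bandwidth well and avoiding the cost of growing trees on very large synthetic samples.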
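Finally, the medical direction amounts to replacing a single-linear-combination model with a more general machine-learning predictor. The sketch below contrasts logistic regression (one linear combination of the inputs) with a tree ensemble; the breast-cancer dataset and the choice of RandomForestClassifier are illustrative assumptions, not the studies or methods referred to above.

```python
# Illustrative comparison: a single linear combination of inputs (logistic
# regression) versus a more general combined-predictor method (tree ensemble).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)  # stand-in medical outcome data
linear = LogisticRegression(max_iter=5000)       # one linear combination
forest = RandomForestClassifier(random_state=0)  # combines many trees

print("logistic regression accuracy:", cross_val_score(linear, X, y, cv=5).mean())
print("tree ensemble accuracy:      ", cross_val_score(forest, X, y, cv=5).mean())
```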