Recent advances in experimental methods, such as microarrays, fMRI imaging, and protein spectroscopy, have vastly increased the size and scope of biomedical data analyses. This has challenged traditional statistical theory, which was developed mainly for small "one-at-a-time" problems. New statistical methodology is being invented, and the purpose here is to bring these developments to bear on massively parallel data sets such as those produced in microarray experiments. Two complementary aspects of statistical inference in large data sets will be attacked: identifying individually significant cases (e.g., genes) among hundreds or thousands of possibilities, and identifying groups of cases that work together to influence disease etiology.
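The first aspect, picking out individually significant genes among thousands of candidates, is a multiple-testing problem. As a hedged illustration (not the project's specific method), the widely used Benjamini-Hochberg step-up procedure controls the false discovery rate across many simultaneous tests; the function name and toy p-values below are ours:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: return a boolean mask
    marking which hypotheses are rejected at false discovery rate alpha."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest rank k with p_(k) <= (k/m) * alpha ...
    below = ranked <= (np.arange(1, m + 1) / m) * alpha
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()
        # ... and reject all hypotheses ranked at or below k.
        reject[order[: k + 1]] = True
    return reject

# Toy example: a few small p-values (candidate genes) among many near-null ones.
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
print(benjamini_hochberg(pvals))  # only the first two survive at alpha = 0.05
```

Unlike a per-gene 0.05 cutoff, which would flag five of these ten cases, the step-up rule adapts its threshold to the number of tests, which is what makes it usable at microarray scale.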