A large class of limited dependent variable (LDV) models possesses desirable theoretical properties, yet estimation of these models is infeasible by conventional methods because it requires the evaluation of multidimensional integrals. Examples include discrete choice models with several contemporaneously correlated alternatives and LDV models with temporally correlated errors. The recently proposed Method of Simulated Moments (MSM) (McFadden (1986)) has been shown to make the estimation of certain discrete choice models feasible by avoiding the need to calculate multiple integrals numerically. Hajivassiliou and McFadden (1987) extended the method and presented limited computational experiments with it. With current mainframe computer technology, however, it is not possible to investigate the performance of this novel method against the theoretical benchmark of classical maximum likelihood estimation (MLE), which requires evaluating the intractable integrals. Given the nature of these estimation problems, the application of supercomputer vectorization techniques will permit a comprehensive set of experiments comparing the MSM against classical estimation methods. Moreover, the use of a supercomputer will make MLE feasible for several of these models, a classical method that, in view of the integrals involved, is currently beyond the capacity of mainframe computers. This award will provide access to the requisite computing capacity at the Pittsburgh Center.
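
As a concrete illustration (not part of the award text), the following Python sketch shows the central idea that makes the MSM computationally attractive: the choice probability in a multinomial probit model is a multidimensional integral over correlated errors, and a crude frequency simulator replaces that integral with an average over random draws. The parameter values and the function name simulated_choice_probs are illustrative assumptions, not quantities from the cited work.

```python
# A minimal sketch of the frequency simulator behind the MSM idea:
# the probability that alternative j is chosen in a multinomial probit
# model is a J-dimensional integral over correlated normal errors.
# Instead of evaluating that integral numerically, we draw errors and
# count how often each alternative has the highest simulated utility.
import numpy as np

rng = np.random.default_rng(0)

# Systematic utilities for J = 4 alternatives (hypothetical values).
v = np.array([1.0, 0.5, 0.0, -0.5])

# Covariance of the contemporaneously correlated error terms (assumed).
cov = np.array([[1.0, 0.5, 0.3, 0.2],
                [0.5, 1.0, 0.4, 0.3],
                [0.3, 0.4, 1.0, 0.5],
                [0.2, 0.3, 0.5, 1.0]])

def simulated_choice_probs(v, cov, n_draws=100_000):
    """Frequency simulator: P(j) is approximated by the fraction of
    draws in which alternative j has the highest simulated utility."""
    eps = rng.multivariate_normal(np.zeros(len(v)), cov, size=n_draws)
    winners = np.argmax(v + eps, axis=1)   # chosen alternative per draw
    return np.bincount(winners, minlength=len(v)) / n_draws

print(simulated_choice_probs(v, cov))  # approximates the J-dim integral
```

In the MSM, such simulated frequencies enter the moment conditions in place of the exact probabilities; because the frequency simulator is an unbiased estimator of each choice probability, the simulation noise averages out across observations and the resulting estimator remains consistent even with a modest number of draws per observation.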