Nuclear Magnetic Resonance (NMR) spectroscopy is uniquely important to almost every modern scientific discipline. It provides definitive structural data for simple molecules as well as complex, biologically important systems. Unlike other spectroscopies, however, NMR suffers from conspicuously low sensitivity, partly because of the low energy of the radiofrequency photons involved, but more importantly because of the small polarizations of nuclear spin systems. The Boltzmann factor appropriate to typical magnetic moments in a laboratory Zeeman field at room temperature leads to a fractional excess population in the absorbing state on the order of 0.0001 to 0.00001. The history of NMR is replete with technical advances that attack this problem by increasing the applied magnetic field, but the current state of the physics and engineering underlying magnet design offers little hope for further large improvements in the near future. The goal of the research in Professor Waugh's laboratory is to obtain large improvements in signal (factors of 10,000 to 100,000) by lowering the temperature instead. The potential gains are remarkable: at high temperatures the polarization, and hence the voltage sensitivity, grows as 1/T, suggesting essentially unlimited gains through arbitrarily close approaches to T = 0. This is a high-risk research program, since it is technically very difficult to operate NMR equipment in the millikelvin range, but the payoff to all of NMR justifies the effort.
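
To make the scale of the numbers concrete, the following is a minimal back-of-the-envelope sketch, assuming proton spins in a 10 tesla field; the field strength and target temperature here are illustrative choices, not parameters taken from the proposal. For a spin-1/2 ensemble the equilibrium polarization is P = tanh(hbar*gamma*B / (2*k*T)), which reduces to the 1/T behavior quoted above when the argument is small.

    # Illustrative estimate: proton spin polarization vs. temperature.
    # Assumed parameters (10 T field, 10 mK target) are hypothetical examples.
    import math

    HBAR = 1.054571817e-34      # reduced Planck constant, J*s
    K_B = 1.380649e-23          # Boltzmann constant, J/K
    GAMMA_PROTON = 2.6752219e8  # proton gyromagnetic ratio, rad/(s*T)

    def polarization(b_field_tesla: float, temp_kelvin: float) -> float:
        """Equilibrium spin-1/2 polarization: P = tanh(hbar*gamma*B / (2*k*T))."""
        x = HBAR * GAMMA_PROTON * b_field_tesla / (2 * K_B * temp_kelvin)
        return math.tanh(x)

    B = 10.0                            # tesla; an assumed laboratory field
    p_room = polarization(B, 300.0)     # room temperature
    p_cold = polarization(B, 0.010)     # 10 millikelvin
    print(f"P(300 K) = {p_room:.2e}")   # ~3e-5: the fractional excess quoted above
    print(f"P(10 mK) = {p_cold:.2f}")   # ~0.77: approaching full polarization
    print(f"gain     = {p_cold / p_room:.0f}x")

With these assumed numbers the gain is roughly 20,000, squarely in the range quoted above; note that the tanh saturates at complete polarization, so the 1/T growth eventually levels off as T approaches 0.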