This research will investigate people's perceptions of multidimensional auditory stimuli. Examples of such stimuli include simple sounds, like tones, that can be made to vary in both frequency and intensity, and speech sounds, which are inherently complex. The principal research question is whether the components of such sounds are perceived independently, and if not, what kinds of interactions take place. Experiments will use speech and nonspeech sounds that are difficult to distinguish. The presence of perceptual interaction can be diagnosed by comparing people's ability to discriminate different combinations of sounds with the predictions of a quantitative theory. Several series of experiments will be conducted. The first series will test whether observed interactions are sensory effects or arise from observers' use of less-than-ideal decision processes. The second will measure whether interactions change with experience and whether any such change is due to a change in the decision rules the observers use. The third will investigate whether listeners find some auditory dimensions more available than others in analyzing complex sounds. The fourth will compare interactions obtained with hard-to-discriminate stimuli to those obtained with clearly audible sounds. The results of these experiments will shed light on the problem of designing informative auditory displays. Multidimensional displays are known to be optimal for conveying complex information, but only if the dimensions are independent; determining whether dimensions are independent, or become independent with training, is a goal of this research.
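
To illustrate the diagnostic logic, one common form such a quantitative prediction can take (the proposal does not name a specific model, so this is an assumption) comes from signal detection theory: if two dimensions are perceived independently, sensitivity to a stimulus that differs on both dimensions should equal the Euclidean combination of the single-dimension sensitivities. The following minimal Python sketch, using purely hypothetical hit and false-alarm rates, shows how an observed d' could be compared against that independence prediction.

```python
from math import sqrt
from statistics import NormalDist

def dprime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity index d' under the equal-variance Gaussian model."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical discrimination data for tones differing in frequency only,
# intensity only, or both (values are illustrative, not from the proposal).
d_freq = dprime(0.80, 0.20)           # frequency-only discrimination
d_intensity = dprime(0.75, 0.25)      # intensity-only discrimination
d_both_observed = dprime(0.84, 0.16)  # discrimination when both dimensions differ

# Under perceptual independence the two cues combine orthogonally, so the
# predicted sensitivity is the Euclidean sum of the single-cue d' values.
d_both_predicted = sqrt(d_freq**2 + d_intensity**2)

# Observed performance that departs from this prediction would suggest that
# the dimensions interact rather than being processed independently.
print(f"predicted d' under independence: {d_both_predicted:.2f}")
print(f"observed d': {d_both_observed:.2f}")
```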