Abstract Kovera 9711225

Although expert evidence is common in civil litigation, little is known about how jurors evaluate social scientific evidence. Even less is known about the ability of legal professionals to make competent judgments about the scientific evidence they are likely to confront. Methodological reasoning research suggests that people can be trained to reason in a more sophisticated manner, but this research has not investigated which factors decision makers use when they find themselves needing to reason about the quality of scientific evidence.

Two experiments are conducted to examine how judges and attorneys evaluate scientific evidence. In the first, judges read a detailed description of the direct testimony and cross-examination of a plaintiff's expert in a sexual harassment case. The expert testifies about a study she conducted to address a material fact at issue. Two characteristics of the evidence are manipulated: the scientific community's general acceptance of the research (a heuristic cue) and the quality of the social scientific evidence (e.g., whether the study included appropriate control groups, whether there is a confound in the experimental design, or whether experimenter bias is present in the expert testimony). Judges evaluate the scientific validity and admissibility of this social fact evidence. In the second experiment, attorneys read the description of the expert's direct testimony. Attorneys are asked to evaluate the scientific validity of the research and to indicate whether they would file a motion to bar the expert's testimony at trial and, if so, on what grounds.

The results of this research will advance cognitive psychological theory about the processes used to evaluate scientific evidence in a forensic context. They may also suggest the types of training that would assist legal professionals in evaluating such evidence.