Experiments at the Large Hadron Collider (LHC) in the CERN Laboratory will produce data of volume and complexity far exceeding that from previous high energy physics (HEP) experiments. A world-wide GRID of computing resources is currently being assembled to meet the computational challenge. To realize the discovery potential of the LHC data, physicists in large collaborations must have efficient and widely-distributed access to the data subsets that they need.
Workflow is the operational aspect of a work procedure: how jobs are structured, who performs them, how they are ordered and synchronized, how information flows to support the tasks, and how task progress is tracked.
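The ordering and synchronization aspects above are commonly modeled as a directed acyclic graph of task dependencies. As a minimal, hypothetical sketch (the task names and the `topological_order` helper are illustrative, not part of any CMS software), a valid execution order can be derived from such a graph with Kahn's algorithm:

```python
from collections import deque

def topological_order(tasks):
    """Return task names in an order respecting dependencies (Kahn's algorithm).

    `tasks` maps each task name to the set of tasks it depends on.
    """
    indegree = {t: len(deps) for t, deps in tasks.items()}
    dependents = {t: [] for t in tasks}
    for t, deps in tasks.items():
        for d in deps:
            dependents[d].append(t)
    ready = deque(t for t, n in indegree.items() if n == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for nxt in dependents[t]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(tasks):
        raise ValueError("dependency cycle detected")
    return order

# A toy analysis chain (illustrative names only):
chain = {
    "simulate": set(),
    "reconstruct": {"simulate"},
    "analyze": {"reconstruct"},
    "catalog": {"simulate"},
}
print(topological_order(chain))
```

Any order produced this way guarantees that a task runs only after all of its inputs are available, which is the core scheduling constraint a workflow manager must enforce.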
The field of scientific workflow management is an active area of research and development in the computer sciences (CS). The immediate application of the software discussed here will be the Compact Muon Solenoid (CMS) experiment at the LHC. But by addressing the workflow requirements of CMS analysis, the group will confront problems with broad applicability to data-intensive science in a complex distributed computational environment.
The proposal includes participation in two extensive programs of outreach to local K-12 schools, comprising in-school programs for students; out-of-school workshops for teachers, educators, high school students, and the local public; as well as programs for undergraduates.