The Data Integration Core will support RCE projects and other cores by providing computational resources to acquire, process, archive, integrate, analyze, query and share experimental data. Most of the effort will be directed toward data generated by high-throughput technologies, including whole-genome sequencing, expression microarrays, mass spectrometry proteomics, large-scale phenotyping arrays, transposon mutant libraries and small-molecule screening assays. Research projects and cores will be supported by the development of database and interface tools. What distinguishes this core is its close collaboration between computational experts and laboratory experimenters, which allows computational support for biomedical research to be provided in a focused, timely and creative manner as experimental needs arise. The core also fosters collaboration among researchers within and outside the RCE by building web-based data-sharing tools.
The specific aims of this core are to provide computational resources to experimenters, to provide individual repositories and access tools for each experimental data type, and to add value to these experiments by integrating results across heterogeneous data types and enabling hypothesis-driven metadata analyses, as sketched below.
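As an illustration of the integration aim, the following sketch (hypothetical table names, locus tags and values; SQLite used only for brevity) shows how records from two of the core's data types, expression microarrays and transposon mutant libraries, might be joined on a shared gene identifier to answer a cross-data-type question. It is not the core's actual schema, only a minimal example of the idea.

    # Minimal sketch: per-data-type repositories joined on a shared gene key.
    # All identifiers and values below are invented for illustration.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # One table per data type (illustrative columns only).
    cur.execute("CREATE TABLE expression (locus_tag TEXT, condition TEXT, log2_fold_change REAL)")
    cur.execute("CREATE TABLE tn_mutants (locus_tag TEXT, phenotype TEXT, fitness_score REAL)")

    cur.executemany("INSERT INTO expression VALUES (?, ?, ?)",
                    [("GENE0001", "macrophage", 2.4), ("GENE0002", "macrophage", -0.3)])
    cur.executemany("INSERT INTO tn_mutants VALUES (?, ?, ?)",
                    [("GENE0001", "attenuated", -1.8), ("GENE0002", "neutral", 0.1)])

    # Integration step: genes up-regulated during infection whose disruption
    # attenuates the mutant -- a hypothesis-driven, cross-data-type query.
    cur.execute("""
        SELECT e.locus_tag, e.log2_fold_change, t.fitness_score
        FROM expression e JOIN tn_mutants t ON e.locus_tag = t.locus_tag
        WHERE e.log2_fold_change > 1.0 AND t.phenotype = 'attenuated'
    """)
    print(cur.fetchall())   # [('GENE0001', 2.4, -1.8)]

The point of the sketch is only that per-data-type repositories plus a common gene identifier are sufficient to support the kind of integrated, hypothesis-driven queries described above; the core's actual repositories, schemas and controlled vocabularies would be developed with the experimental projects.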
Analysis of pathogenesis and host-pathogen interactions in complex biological systems now relies on large-scale experimental methods that require computational technologies to store, display and analyze vast amounts of information. Modern computational technologies also permit Internet-accessible data sharing and research collaboration. Supporting these activities is the purpose of the Data Integration Core.