This Project focuses on (1) the development of methodology for incorporating cutting-edge research findings into future risk assessments; (2) the development of methods for designing studies to improve risk estimates, especially when mechanistic data are involved; (3) the development of methods for the evaluation of exposure, dose-response shape and potency; (4) the development of methods for evaluating mixtures when multiple mechanisms are involved; (5) the development of methods that harmonize cancer and non-cancer health risk assessments; (6) direct engagement of the regulatory community through expert panels, peer review and collaborative research; (7) practical improvement of stochastic processes through careful linkage of theoretical developments with computational methods that are accurate and convenient; (8) collaboration with research groups within the NIEHS and research groups doing similar work to improve the biological understanding of disease incidence; (9) iterative improvement of the biological basis for disease incidence models through a process of hypothesis testing and laboratory research; (10) linkage of disease incidence models to toxicokinetic models in a scientifically credible manner; (11) use of the broadest array of data in both the development of the model and its application; and (12) support of the National Toxicology Program.

We developed a new approach that uses microarray data to clarify the dynamics of gene networks governing cell function. A quantitative, statistically sound methodology for analyzing gene regulatory networks from gene expression data sets was developed. The method is based on Bayesian networks, which formalize the direct quantification of gene-expression networks. Through computer simulations, we show how to determine the probability of identifying the correct quantitative relationships in a gene expression network and the probability of identifying the correct network among multiple possibilities. The method was extended with Bayesian regression model selection to find the gene network structures best supported by a given dataset. By introducing rigorous statistical methods into the estimation of gene interaction networks, important issues such as the statistical power for identifying the correct model and the uncertainty in model parameters can be readily assessed. Computer simulations demonstrate that the proposed method can extract knowledge from datasets with only a small number of replicates, a common limitation of the microarray studies available today.

These methods have been applied to data on cell cycle control in mammalian cells and to data on TCDD-induced changes in cellular response. Using GenMAPP, a gene microarray pathway profiler based on a gene ontology database, we chose a dataset of cell-cycle-control mRNAs including the Rb1/2, p53, mdm2 and myc genes. We then generated expression-association maps that predict the relationships among pathways within the cell cycle network. The maps were developed with a mathematical algorithm based on Bayesian networks, assuming a log-linear relationship between the genes. The maps predict that p53, cdk7 and Cdkn2a, in addition to the well-known E2F1, play important roles during the cell cycle, and they provide information defining the critical genes and pathways within the network.
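As a rough illustration of this kind of structure scoring, the sketch below compares candidate gene-network structures by regressing each gene's log expression on its candidate parents (a log-linear, Gaussian model) and scoring each structure with BIC, a standard large-sample approximation to a Bayesian model-selection criterion. The data, gene indices and candidate structures are hypothetical, and the code is a minimal sketch rather than the project's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_bic_score(log_expr, network):
    """BIC score of a candidate gene network.

    log_expr : (n_samples, n_genes) array of log-expression values.
    network  : dict mapping each gene index to a list of parent gene indices,
               encoding child = intercept + sum(beta_p * parent_p) + noise.
    Lower scores indicate better-supported structures.
    """
    n, _ = log_expr.shape
    score = 0.0
    for child, parents in network.items():
        X = np.column_stack([np.ones(n)] + [log_expr[:, p] for p in parents])
        y = log_expr[:, child]
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        sigma2 = max(resid @ resid / n, 1e-12)        # MLE of residual variance
        loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
        k = X.shape[1] + 1                            # coefficients + variance
        score += -2.0 * loglik + k * np.log(n)        # BIC for this local model
    return score

# --- Hypothetical example: 3 genes, 8 replicate arrays (simulated) ---
n_samples = 8
g0 = rng.normal(0, 1, n_samples)
g1 = 0.8 * g0 + rng.normal(0, 0.3, n_samples)   # gene 1 regulated by gene 0
g2 = -0.5 * g1 + rng.normal(0, 0.3, n_samples)  # gene 2 regulated by gene 1
log_expr = np.column_stack([g0, g1, g2])

candidates = {
    "chain 0->1->2": {0: [], 1: [0], 2: [1]},
    "fork 1<-0->2":  {0: [], 1: [0], 2: [0]},
    "independent":   {0: [], 1: [],  2: []},
}
scores = {name: gaussian_bic_score(log_expr, net) for name, net in candidates.items()}

# Relative support (BIC weights), analogous to posterior model probabilities
delta = {k: v - min(scores.values()) for k, v in scores.items()}
weights = {k: np.exp(-0.5 * d) for k, d in delta.items()}
total = sum(weights.values())
for name in candidates:
    print(f"{name}: BIC={scores[name]:.1f}, weight={weights[name] / total:.2f}")
```

Repeating such a comparison over simulated datasets of varying size is one way to estimate the probability of recovering the correct network, in the spirit of the power calculations described above.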
Such an approach may prove useful for assessing transcriptional regulatory networks in more complex mammalian cells. For TCDD, we identified a linkage between TCDD-induced genes and the retinoic acid signaling pathway.

Tissue structure, function and differentiation state all play important roles in carcinogenesis. A new mathematical model of colon carcinogenesis was developed to take into account the structure and differentiation state of the cells making up the colon. A five-stage model with controlled replication and differentiation was fit to colon cancer incidence data for males in Burlington, UK. The normal colonic crypts were assumed to undergo no more than 8 replication phases, each moving cells closer to terminal differentiation; the first five replicative steps were assumed to be at risk of mutation, as were the basal-layer stem cells. Cells with one or more mutations were assumed to have 16 replication phases, of which the first 13 were susceptible to mutation. The same model was applied to data on familial adenomatous polyposis patients; the best fit was obtained with a four-stage model in which the initial stage was normal but the remaining stages had one additional replication phase, of which the first 15 were susceptible to mutation. The replication, death and mutation rates were effectively the same in both populations.

The Toxic Equivalency Factor (TEF) method has been used to characterize the toxicity of mixtures of dioxin-like compounds to which humans are exposed and is being considered for use with other classes of potentially toxic agents. A method was developed for formal testing of the TEF concept and applied to data from a mixture experiment on dioxin-like compounds. The estimated parameters indicate that congener-specific dose-response shapes differ significantly, that additivity fails for these congeners, and that the ratios of ED50s do not predict the response seen for the mixture. These results call into question the utility of relative potency factors for multiple endpoints in determining TEFs.

Non-monotonic (U-shaped) dose-response patterns occur when a toxic substance acts as a stimulant at low doses but as an inhibitor at high doses. A mechanistically based mathematical model was developed and used to demonstrate when non-monotonic dose-response might occur. A mathematical method was also developed to extend the use of controlled birth-death processes to non-cancer endpoints; a biologically based dose-response model for developmental toxicology was built on this method and applied to spermatogenesis. Toxicokinetic models were developed for anthraquinone, methyl eugenol, TCDD and related compounds, vanadium pentoxide, 2-MI and 4-MI, chromium and decalin in support of the NTP. In addition, a number of consultations with scientists at NIEHS, NTP and elsewhere were completed.
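To make the TEF additivity test described above concrete, the following sketch fits a Hill dose-response model to each congener separately, derives relative potencies from the ratio of ED50s, and computes the response that dose additivity would predict for a mixture; a formal test would then compare this prediction against observed mixture data. The doses, congener names, parameter values and simulated responses are hypothetical, and the code illustrates the general idea rather than the specific models fit in the project.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

def hill(d, bg, vmax, ed50, n):
    """Hill dose-response: background plus a saturable increase."""
    return bg + vmax * d**n / (ed50**n + d**n)

# --- Hypothetical single-congener data (simulated here for illustration) ---
doses = np.array([0.0, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])  # illustrative units
true_params = {
    "TCDD":   (1.0, 10.0, 3.0, 1.2),   # reference congener
    "PCB126": (1.0, 6.0, 30.0, 0.8),   # different shape AND different maximum
}
data = {name: hill(doses, *p) * rng.normal(1.0, 0.05, doses.size)
        for name, p in true_params.items()}

# --- Fit each congener separately ---
fits = {}
for name, y in data.items():
    popt, _ = curve_fit(hill, doses, y, p0=[1, 5, 10, 1],
                        bounds=([0, 0, 1e-3, 0.1], [np.inf, np.inf, np.inf, 10]))
    fits[name] = popt

# --- TEF-style prediction for a 1:1 mixture ---
ed50_ref = fits["TCDD"][2]
tef = {name: ed50_ref / p[2] for name, p in fits.items()}   # relative potency
mix_doses = doses / 2.0                                     # each congener at half dose
teq = sum(tef[name] * mix_doses for name in fits)           # toxic-equivalent dose
pred_mix = hill(teq, *fits["TCDD"])                         # dose-additivity prediction

# A formal test would compare this prediction with observed mixture responses,
# e.g. via a likelihood-ratio test of the additive model against a model with
# congener-specific parameters, as described in the project summary.
print("Relative potencies:", {k: round(v, 3) for k, v in tef.items()})
print("Predicted mixture response under TEF additivity:", np.round(pred_mix, 2))
```

When congeners differ in slope or maximum response, as in this hypothetical example, the ED50 ratio alone cannot capture the mixture response, which is the kind of failure of additivity the project's analysis detected.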
Daily measures of maximum temperature, particulate matter ≤ 10 μm in aerodynamic diameter (PM10), and gaseous pollutants (ozone, nitrogen dioxide, sulfur dioxide, and carbon monoxide) were collected in Denver, Colorado, in July and August of 1993 through 1997. These exposures were compared with concurrent data on the number of daily hospital admissions for cardiovascular diseases in men and women older than 65 years of age. The results suggest that O3 is associated with an increased risk of hospitalization for acute myocardial infarction, coronary atherosclerosis, and pulmonary heart disease; SO2 appears to be related to increased hospital stays for cardiac dysrhythmias; and CO is significantly associated with congestive heart failure. No association was found between particulate matter or NO2 and any of the health outcomes. This project is ending; it is a legacy effort inherited from a postdoctoral fellow whose work Dr. Warren Piver (deceased) had been directing.
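The summary does not state which statistical model was used for the Denver analysis; a common choice for daily admission counts of this kind is a Poisson regression adjusted for weather and day of week. The sketch below illustrates that approach on simulated data; all values, variable names and effect sizes are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)

# --- Hypothetical daily data for one summer (simulated for illustration) ---
n_days = 62  # July-August
df = pd.DataFrame({
    "o3":   rng.normal(55, 15, n_days),   # ppb, illustrative
    "pm10": rng.normal(30, 10, n_days),   # ug/m3, illustrative
    "temp": rng.normal(30, 4, n_days),    # daily maximum temperature, C
    "dow":  np.arange(n_days) % 7,        # day of week
})
# Simulated admission counts among adults over 65 (Poisson, weak O3 effect)
mu = np.exp(1.5 + 0.004 * df["o3"] + 0.01 * (df["temp"] - 30))
df["admissions"] = rng.poisson(mu)

# Poisson regression of daily admissions on same-day pollution,
# adjusting for temperature and day of week
X = pd.get_dummies(df["dow"], prefix="dow", drop_first=True).astype(float)
X = pd.concat([df[["o3", "pm10", "temp"]], X], axis=1)
X = sm.add_constant(X)
model = sm.GLM(df["admissions"], X, family=sm.families.Poisson()).fit()

# Rate ratio per 10-ppb increase in ozone, with 95% confidence interval
rr = np.exp(10 * model.params["o3"])
lo, hi = np.exp(10 * model.conf_int().loc["o3"])
print(f"RR per 10 ppb O3: {rr:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

In practice the published analysis would also handle lagged exposures and multi-year data; this sketch shows only the basic structure of such a count regression.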