National Research Council (US) Committee on Applications of Toxicogenomic Technologies to Predictive Toxicology. Applications of Toxicogenomic Technologies to Predictive Toxicology and Risk Assessment. Washington (DC): National Academies Press (US); 2007.
Scientists, regulators, and the public all desire efficient and accurate approaches to assess the toxicologic effects of chemical, physical, and biologic agents on living systems. Yet, no single approach exists to analyze toxicologic responses, a difficult task given the complexity of human and animal physiology and individual variations. The genomic knowledge and new technologies that have emerged in the post-genomic era promise to inform the understanding of many risks as well as enlighten current approaches and lead to novel predictive approaches for studying disease risk. As biologic knowledge progresses with the science of toxicology, “toxicogenomics” (see Box 1-1 for definition) has the potential to improve risk assessment and hazard screening.
BACKGROUND
Approaches to Assessing Toxicity
Detection of toxicity requires a means to observe (or measure) specific effects of exposures. Toxicology traditionally has focused on phenotypic changes in an organism that result from exposure to chemical, physical, or biologic agents. Such changes range from reversible effects, such as transient skin reactions, to chronic diseases, such as cancer, to the extreme end point of death. Typical whole-animal toxicology studies may range from single-dose acute to chronic lifetime exposures, and (after assessment of absorption, distribution, metabolism, and excretion properties) they include assessments of end points such as clinical signs of toxicity, body and organ weight changes, clinical chemistry, and histopathologic responses.
Toxicology studies generally use multiple doses that span the expected range from doses at which no effects would be observed to doses at which clinical or histopathologic changes would be evident. The highest dose at which no overt toxicity occurs in a 90-day study (the maximum tolerated dose) is generally used to establish animal dosing levels for chronic assays that provide insight into potential latent effects, including cancer, reproductive or developmental toxicity, and immunotoxicity. These studies constitute the mainstays of toxicologic practice.3
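To make this multiple-dose design concrete, the following sketch shows how the highest dose without a detected effect on a single end point might be identified by comparing each dose group with controls. The dose levels, body-weight values, and use of simple pairwise t-tests are illustrative assumptions only; an actual study would use a multiple-comparison procedure such as Dunnett's test and would weigh many end points beyond body weight.

```python
# Illustrative sketch only: finding the highest dose with no detected effect
# on one end point (body weight), using pairwise t-tests against the control
# group. All data below are simulated, not drawn from any real study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(250, 10, size=10)       # body weights (g), vehicle group
dose_groups = {                              # dose (mg/kg-day) -> simulated weights
    10: rng.normal(249, 10, size=10),
    50: rng.normal(246, 10, size=10),
    250: rng.normal(230, 10, size=10),       # overt effect built in at high dose
}

highest_no_effect_dose = None
for dose in sorted(dose_groups):
    t_stat, p_value = stats.ttest_ind(dose_groups[dose], control)
    print(f"{dose:>4} mg/kg-day: mean {dose_groups[dose].mean():.1f} g, p = {p_value:.3f}")
    if p_value < 0.05:
        break                                # first dose with a detected effect
    highest_no_effect_dose = dose

print("Highest dose without a detected effect:", highest_no_effect_dose)
```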
In addition to animal studies, efforts to identify and understand the effects of environmental chemicals, drugs, and other agents on human populations have used epidemiologic studies to examine the relationship between dose and response to exposures. In contrast to animal studies, in which exposures are experimentally controlled, epidemiologic studies describe exposure with an estimate of error and assess the relationship between exposure and disease distribution in human populations. These studies operate under the assumption that many years of chemical exposure, or the simple passage of time, may be required before disease expression can be detected.
As medical science has progressed, so have the tools used to assess animal toxicity. For example, more sensitive diagnostic and monitoring tools have been used to assess organ function, including tools to detect altered heart rhythms, brain activity, and changes in hormone levels as well as to analyze changes visible by electron microscopy. Most notable, however, are the contributions of chemistry, cell and molecular biology, and genetics in detecting adverse effects and identifying cellular and molecular targets of toxicants. It is now possible to observe potential adverse effects on molecules, subcellular structures, and organelles before they manifest at the organismal level. This ability has enhanced etiologic understanding of toxicity and made it possible to assess the relevance of molecular changes to toxicity.
These molecular and cellular changes have been assessed in studies of animals but have also been applied to study human populations (“molecular epidemiology”) with some success. For example, our understanding of gene-environment interactions has benefited greatly from studies of lung, head, and neck cancer among tobacco users—studies that examined differences in genes (polymorphisms) that are related to carcinogen metabolism and DNA repair. Similarly, studies of UV sunlight exposure and human differences in DNA repair genes have clarified gene-environment interactions in skin cancer risk. Current technology now enables the roles of multiple genes in cell signaling pathways to be examined in human population studies aimed at assessing the interplay between environmental exposures and cancer risk.
Although current practice in toxicology continues to strongly emphasize changes observable at the level of the whole organism as well as at the level of the organ, the use of cellular and molecular end points sets the stage for applying toxicogenomic technologies to a more robust examination of how complex molecular and cellular systems contribute to the expression of toxicity.
Predictive Toxicology
Predictive toxicology describes the study of how toxic effects observed in humans or model systems can be used to predict pathogenesis, assess risk, and prevent human disease. Predictive toxicology includes, but is not limited to, risk assessment, the practical facilitation of decision making with scientific information. Many of the concepts described in this report relate to approaches to risk assessment; key risk assessment concepts are reviewed in Appendix C. Typical information gaps and inconsistencies that limit conventional risk assessment are listed in Box 1-2. These gaps and inconsistencies present opportunities for toxicogenomics to provide useful information.
Although toxicogenomics includes effects on wildlife and other environmental effects, this report is limited to a discussion of toxicogenomics as it applies to the study of human health.
Overview of Toxicogenomic Technologies
Toxicogenomic technologies comprise several technology platforms for analyzing genomes, transcripts, proteins, and metabolites. These technologies are described briefly here and in more detail in Chapter 2. Two additional issues associated with the use of toxicogenomic technologies are important to recognize. First, a single experiment generates a far larger quantity of information, and far more comprehensive information, than a traditional experiment. Second, advances in computing power and techniques enable these large amounts of information to be synthesized from different sources and experiments and to be analyzed in novel ways.
Genomic technologies encompass both genome sequencing technologies, which derive DNA sequences from genes and other regions of DNA, and genotype analysis, which detects sequence variation among individuals at particular genes. Whereas sequencing a genome was once an extraordinary undertaking, rapid evolution of sequencing technology has dramatically increased throughput and decreased cost, and now outperforms the benchmark technology standard used for the Human Genome Project. The convergence of genome sequencing and genotyping technologies will eventually enable whole-genome sequences of individuals to be analyzed. Advances in genotyping technologies (see Chapter 2) allow simultaneous assessment of variants across the whole genome in large populations, rather than of just one or a few gene polymorphisms.
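The elementary unit of such genotyping studies can be sketched simply: a single genetic variant is tested for association with disease status. In the hedged example below, the genotype counts are invented for illustration; a genome-wide study repeats a test of this kind for hundreds of thousands of variants and must correct for multiple testing.

```python
# Illustrative sketch: testing one single-nucleotide polymorphism (SNP) for
# association with disease status. All genotype counts are hypothetical.
from scipy.stats import chi2_contingency

#             AA    Aa    aa   (genotype counts at a hypothetical SNP)
cases    = [  90,  160,   50]
controls = [ 120,  140,   40]

chi2, p_value, dof, expected = chi2_contingency([cases, controls])
print(f"chi-square = {chi2:.2f} (dof = {dof}), p = {p_value:.4f}")
```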
Transcriptomic technologies (or gene expression profiling) measure mRNA expression in a highly parallel assay system, usually using microarrays. As the first widely available method for global analysis of gene expression, DNA microarrays are the emblematic technology of the post-genomic era. Microarray technology for transcriptomics has enabled the analysis of complex, multigene systems and their responses to environmental perturbations.
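The core computation behind such expression profiling can be outlined in a few lines: test each gene for a difference between treated and control samples, then control the false-discovery rate across thousands of simultaneous tests. The sketch below uses simulated log-intensity data and a Benjamini-Hochberg adjustment as illustrative assumptions; real microarray analyses add normalization and quality-control steps first.

```python
# Illustrative sketch of the core transcriptomic computation: per-gene
# t-tests with a Benjamini-Hochberg false-discovery-rate adjustment.
# The expression matrix is simulated, not real microarray data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_genes, n_per_group = 5000, 10
control = rng.normal(8.0, 1.0, size=(n_genes, n_per_group))  # log2 intensities
treated = rng.normal(8.0, 1.0, size=(n_genes, n_per_group))
treated[:100] += 3.0            # simulate 100 genes strongly induced by exposure

t_stats, p_values = stats.ttest_ind(treated, control, axis=1)

# Benjamini-Hochberg adjustment: q_i = p_(i) * m / i, made monotonic
order = np.argsort(p_values)
ranked = p_values[order] * n_genes / (np.arange(n_genes) + 1)
q_values = np.minimum.accumulate(ranked[::-1])[::-1]
significant = order[q_values < 0.05]
print(f"{significant.size} genes called differentially expressed at FDR < 0.05")
```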
Proteomics is the study of collections of proteins in living systems. Because the same proteins may exist in multiple modified and variant forms, proteomes are more complex than the genomes and transcriptomes that encode them. Proteomic technologies use mass spectrometry (MS) and microarray technologies to resolve and identify the components of complex protein mixtures, to identify and map protein modifications, to characterize protein functional associations, and to compare proteomic changes quantitatively in different biologic states.
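As one simple illustration of quantitative proteomic comparison, the sketch below contrasts spectral counts for a single protein between two biologic states using Fisher's exact test. The counts are hypothetical, and spectral counting is only one of several quantitation strategies used in MS-based proteomics.

```python
# Illustrative sketch: comparing one protein's abundance between two states
# via spectral counts from shotgun MS runs. All counts are hypothetical.
from scipy.stats import fisher_exact

protein_spectra = (45, 12)          # spectra matching the protein: treated, control
total_spectra   = (20000, 21000)    # total identified spectra per run

table = [
    [protein_spectra[0], total_spectra[0] - protein_spectra[0]],
    [protein_spectra[1], total_spectra[1] - protein_spectra[1]],
]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2g}")
```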
Metabolomics is the study of the small-molecule components of biologic systems, which are the products of metabolic processes. Because metabolites reflect the activities of RNAs, proteins, and the genes that encode them, metabolomics allows functional assessment of diseases and of drug and chemical toxicity. Metabolomic technologies, employing nuclear magnetic resonance spectroscopy and MS, are directed at simultaneously measuring dozens to thousands of compounds in biofluids (for example, urine) or in cell or tissue extracts. A key strength of metabolomic approaches is that they can noninvasively and repeatedly measure changes in living tissues and living animals and that they measure changes in the actual metabolic flow. As with proteomics, the major limitation of metabolomics is the difficulty of comprehensively measuring diverse metabolites in complex biologic systems.
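A common first look at such multivariate metabolite data is an unsupervised projection. The sketch below applies principal component analysis to a simulated table of metabolite intensities to ask whether treated and control samples separate; the sample sizes, metabolite counts, and effect sizes are invented for illustration.

```python
# Illustrative sketch: principal component analysis of a simulated table of
# metabolite intensities (rows = samples, columns = metabolites), a common
# first step in exploring metabolomic profiles. Data are simulated.
import numpy as np

rng = np.random.default_rng(2)
n_metabolites = 200
control = rng.normal(0, 1, size=(8, n_metabolites))   # 8 control samples
treated = rng.normal(0, 1, size=(8, n_metabolites))   # 8 treated samples
treated[:, :20] += 1.5          # exposure shifts 20 metabolites upward

X = np.vstack([control, treated])
X -= X.mean(axis=0)             # mean-center each metabolite
U, S, Vt = np.linalg.svd(X, full_matrices=False)      # PCA via SVD
scores = U * S                  # sample coordinates on principal components
for i, label in enumerate(["control"] * 8 + ["treated"] * 8):
    print(f"{label:>8}: PC1 = {scores[i, 0]:6.2f}")
```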
Bioinformatics is a branch of computational biology focused on applying advanced computational techniques to the collection, management, and analysis of numerical biologic data. Elements of bioinformatics are essential to the practice of all genomic technologies. Bioinformatics also encompasses the integration of data across genomic technologies, the integration of genomic data with data from other observations and measurements, and the integration of all these data in databases and related information resources. It is helpful to think of bioinformatics not as a separate discipline but as the universal means of analyzing and integrating information in biology.
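In practice, much of this integrative work reduces to joining heterogeneous data tables on shared identifiers. The minimal sketch below merges a hypothetical table of differentially expressed genes with a hypothetical pathway annotation table; all column names and values are assumptions for illustration only.

```python
# Illustrative sketch of data integration: joining expression results with
# pathway annotation on a shared gene identifier. All values are invented.
import pandas as pd

expression = pd.DataFrame({
    "gene_id": ["g1", "g2", "g3"],
    "log2_fold_change": [2.1, -1.4, 0.2],
    "q_value": [0.001, 0.03, 0.6],
})
annotation = pd.DataFrame({
    "gene_id": ["g1", "g2", "g3"],
    "pathway": ["oxidative stress", "DNA repair", "lipid metabolism"],
})

merged = expression.merge(annotation, on="gene_id")
hits = merged[merged["q_value"] < 0.05]    # significant, annotated genes
print(hits[["gene_id", "log2_fold_change", "pathway"]])
```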
Policy Context
Regulatory agencies with the greatest stake in predictive toxicology include the Environmental Protection Agency (EPA), the Occupational Safety and Health Administration (OSHA), and the Food and Drug Administration (FDA). The EPA and OSHA are concerned with potentially toxic exposures in the community and in the workplace. The mission of the EPA is to protect human health and the environment and to safeguard the nation’s air, water, and land. OSHA’s mission is to ensure the safety and health of America’s workers by setting and enforcing standards; providing training, outreach, and education; establishing partnerships; and encouraging continual improvement in workplace safety and health. The FDA is responsible for protecting the public health by ensuring the safety, efficacy, and security of human and veterinary drugs, biologic products, medical devices, the nation’s food supply, cosmetics, and products that emit radiation. The FDA is also responsible for advancing public health by facilitating innovations that make medicines and foods more effective, safer, and more affordable, and for helping the public receive the accurate, science-based information it needs to use these regulated products to improve health.
Working in parallel with these agencies and providing additional scientific underpinning to regulatory agency efforts is the Department of Health and Human Services (DHHS), National Institutes of Health (NIH). The NIH (2007) mission is “science in pursuit of fundamental knowledge about the nature and behavior of living systems and the application of that knowledge to extend healthy life and reduce the burdens of illness and disability,” including research on “causes, diagnosis, prevention, and cure of human disease.”
The NIH National Institute of Environmental Health Sciences (NIEHS) strives to use environmental sciences to understand human disease and improve human health, including “how environmental exposures fundamentally alter human biology” and why some people develop disease in response to toxicant exposure and others do not.4
In sum, NIH, regulatory agencies, the chemical and pharmaceutical industries, health professionals, attorneys, the media, and the general public are all interested in knowing how new genomic technologies developed in the aftermath of the Human Genome Project can improve our understanding of toxicity and ultimately protect public health and the environment. Although the FDA and the EPA have developed planning documents on toxicogenomic policies (see Chapter 9), specific policies have not yet emerged, and it is clear that stakeholders are grappling with similar questions:
- Where is the science of toxicogenomics going?
- What are the potential benefits and drawbacks of using toxicogenomics information for regulatory agencies, industry, and the public?
- What are the challenges in implementing toxicogenomic technologies, collecting and using the data, and communicating the results?
- Can genomic technologies predict health effects?
- How will government agencies, industry, academics, and others know when a particular technology is ready to be used for regulatory purposes?
- Will regulatory requirements have to be changed to reap the benefits of the new technologies to protect public health and the environment?
COMMITTEE CHARGE AND RESPONSE
In April 2004, NIEHS asked the National Academies to direct its investigative arm, the National Research Council (NRC), to examine the impact of toxicogenomic technologies on predictive toxicology (see Box 1-3 for the complete statement of task).
In response, the NRC formed the Committee on Applications of Toxicogenomic Technologies to Predictive Toxicology, a panel of 16 members that included experts in toxicology, molecular and cellular biology, epidemiology, law and ethics, bioinformatics (including database development and maintenance), statistics, public health, risk communication, and risk assessment (see Appendix A for committee details). The committee held two public meetings in Washington, DC, to collect information, meet with researchers and decision makers, and accept testimony from the public. The committee met five additional times, in executive session, to deliberate on findings and complete its report. The remaining chapters of this report constitute the findings of the NRC committee.
The committee approached its charge by focusing on potential uses of toxicogenomics often discussed in the broad toxicology community, with a focus on human health issues and not environmental impact. The committee determined that the applications described in Chapters 4-8 capture much of the often-cited potential value of toxicogenomics. After identifying potential toxicogenomic applications, the committee searched the scientific literature for case examples that demonstrate useful implementations of toxicogenomic technologies in these areas.
This report is not intended to be a compendium of all studies that used toxicogenomic technologies and does not attempt to highlight the full range of papers published. Peer-reviewed and published papers in the public domain were selected to illustrate applications the committee identified as worthy of consideration. For example, Box 1-4 contains brief summaries of selected studies where toxicogenomic technologies have shown promise in predictive toxicology. New studies using toxicogenomic technologies are published almost daily. Likewise, approaches to analyzing and interpreting data are rapidly evolving, resulting in changes in attitudes toward various approaches.5 Even while this report was being prepared, such changes were observed, and therefore the committee has attempted to provide a snapshot of this rapidly evolving field.
This report is the product of the efforts of the entire NRC committee. The report underwent extensive, independent external review overseen by the NRC Report Review Committee. It specifically addresses, and is limited to, the statement of task as agreed upon by the NRC and the DHHS.
This report consists of chapters on existing or potential “applications” and chapters that deal with broader issues. The technologies encompassed in toxicogenomics are described in Chapter 2. This is followed by a discussion of experimental design and data analysis in Chapter 3. Chapter 4 discusses the applications of toxicogenomic technologies to assess exposure: “Can toxicogenomic technologies determine whether an individual has been exposed to a substance and, if so, to how much?” Chapter 5 asks “Can toxicogenomic data be used to detect potential toxicity of an unknown compound quickly, reliably, and at a reasonable cost?” Chapter 6 addresses the assessment of individual variability in humans. The question in this context is “Can toxicogenomic technologies detect variability in response to exposures and provide a means to explain variability between individuals?” Chapter 7 addresses the question “What can toxicogenomic technologies teach us about the mechanisms by which toxicants produce adverse effects in biologic systems?” Considerations for risk assessment not covered in these four application chapters are discussed in Chapter 8. Chapter 9 focuses on validation issues that are relevant to most of the applications. In Chapter 10, sample and data collection and analysis are discussed, as are database needs. The ethical, legal, and social issues raised by use of toxicogenomics are considered in Chapter 11. Finally, Chapter 12 summarizes the recommendations from the other chapters and identifies several overarching recommendations.
Footnotes
- 3. For more information, see the National Research Council report Toxicity Testing for Assessment of Environmental Agents: Interim Report (NRC 2006a).
- 4. See http://www.niehs.nih.gov/od/fromdir.htm (accessed April 2, 2007).
- 5. The value of platforms and software is also evolving rapidly and any discussion of platforms or software should not be considered an endorsement by the committee.