Press Release, Feb. 14, 2008
Memorandum of Understanding
On High Throughput Screening, Toxicity Pathway Profiling, and Biological Interpretation of Findings
Transforming Environmental Health Protection
Science, Feb. 15, 2008
Finding What's Toxic Fast
NCGC Lends Its Technology to Modernizing Toxicity Screening, Refining Animal Testing
An effort now underway by three collaborating federal research groups seeks to rapidly evaluate larger numbers of chemicals for risks to humans while reducing the role of laboratory animals in regulatory testing.
Innovative toxicity tests — using high-throughput screening technologies — can now identify genes and pathways in cells affected by toxic compounds. The National Institutes of Health (NIH) Chemical Genomics Center (NCGC), administered by the Division of Intramural Research of the National Human Genome Research Institute (NHGRI), recently joined its expertise in these technologies with the experience of the National Toxicology Program (NTP) in animal testing and the computational capacities of the Office of Research and Development in the Environmental Protection Agency (EPA).
Leaders from the three organizations this month signed a Memorandum of Understanding, entering into a five-year collaboration to undertake "the research, development, validation, and translation of new and innovative test methods that characterize key steps in toxicity pathways." The collective budget is yet to be determined.
The aim of the collaboration among NCGC, NTP and EPA is to generate a comprehensive map of the genes, proteins and biological pathways that are affected by each of the chemicals submitted for testing. "We want to find out if and how potential chemical toxicants affect various types of cells and individual genetic targets within them," said NCGC Director Christopher P. Austin, M.D. "We hope to refine many of the toxicity tests done with animals and eventually supplant them with in vitro testing and computational prediction. Animal testing has contributed enormously to our understanding of human health risks, but it's slow, and it's expensive."
Until now, toxicity testing has relied almost exclusively on animal testing. Researchers expose an animal, such as a mouse, to a compound. They observe the general effects of the compound on the animal, such as causing weight loss or shortening life, and then examine the animal's tissues for evidence of damage. This process has been extremely useful in preventing compounds that might be toxic to humans from being approved or used.
However, the increasing number of new chemicals produced every year, combined with the expense of animal testing and growing public uneasiness with it, has led NTP and EPA to study alternatives intensively. Adding to this interest is the realization that the presence or absence of toxicity in an animal does not necessarily correspond to toxicity in a human. Methods that could more accurately predict a compound's human toxicity would greatly improve chemical development for both industrial and pharmaceutical applications. The approach being taken by the three federal research groups received an important boost recently from the publication of a National Research Council report, Toxicity Testing in the 21st Century: A Vision and a Strategy, which advocated just this approach.
NCGC is part of the NIH Roadmap Molecular Libraries Screening Center Network, a nationwide research consortium. NCGC's mission is to discover small molecule probes of biology and understand their effects on biological targets. Scientists call organic chemical compounds 'small molecules' because they are smaller than biologically active macromolecules, such as proteins. This class of chemicals, which makes up most pharmaceutical drugs, can also be used as tools to modulate gene function and improve understanding of biological pathways involved in human health and disease.
In 2005, Rockville-based NCGC began partnering with the NTP, an interagency program located at the National Institute of Environmental Health Sciences. The partnership aims to upgrade NTP's ability to detect the toxic effects of chemicals from industry and the environment. "They were pursuing the possibility of doing cell- and protein-based high-throughput screening, but didn't have the expertise to implement some important new technologies," Dr. Austin recalled. "At NCGC, we had developed a new screening method that tests all compounds at a wide range of concentrations, which turns out to be tailor-made for toxicology testing." When the two groups began talking, it became immediately clear that this would be an ideal collaboration.
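The screening method described above measures each compound's effect across a wide range of concentrations, yielding a concentration-response curve from which a potency value can be estimated. As a rough illustration only (the actual NCGC analysis pipeline is not described in this article), the sketch below estimates an AC50, the concentration producing a half-maximal response, by log-linear interpolation over hypothetical viability data; the function name, data values, and 50%-of-control threshold are all assumptions for the example.

```python
import math

def ac50_from_curve(concs, responses, half=50.0):
    """Estimate the AC50 (concentration giving half-maximal response)
    by log-linear interpolation between the two bracketing points.
    concs must be ascending; responses fall as toxicity increases."""
    for (c1, r1), (c2, r2) in zip(zip(concs, responses),
                                  zip(concs[1:], responses[1:])):
        if r1 >= half >= r2:  # the 50% crossing lies in this interval
            frac = (r1 - half) / (r1 - r2)
            return 10 ** (math.log10(c1)
                          + frac * (math.log10(c2) - math.log10(c1)))
    return None  # curve never crosses 50%: inactive in this assay

# Hypothetical viability readings (% of control) at 8 concentrations (uM)
concs = [0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0]
resp = [99.0, 98.0, 96.0, 90.0, 70.0, 40.0, 15.0, 8.0]
print(f"Estimated AC50: {ac50_from_curve(concs, resp):.2f} uM")
```

Testing every compound at many concentrations, rather than a single dose, is what lets this kind of curve (and a potency estimate) be recovered for each compound-assay pair.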
Later that year, Robert Kavlock, Ph.D., director of EPA's National Center for Computational Toxicology, read about the Molecular Libraries Initiative and contacted Dr. Austin regarding a new research program started to prioritize chemicals for testing from among the thousands that EPA needed to assess for human risk. "He heard our presentation and it was like a marriage. Since then, it's really been a fantastic relationship," Dr. Kavlock said. The EPA's involvement is part of its ToxCast™ program, an initiative launched in 2007 to revolutionize the agency's chemical toxicity evaluation procedures. ToxCast will use advances in computers, genomics and cellular biology to speed up toxicity testing and enhance its capacity to screen large numbers of compounds.
NCGC's scientists jumped at the chance to approach their work from a different angle. Most of the time, the center tests hundreds of thousands of small molecule compounds to identify a few that would have an effect on a specific biological target, such as a gene known to play a role in predisposition to diabetes or breast cancer. In its work with NTP and EPA, the center is doing the opposite: using chemicals that are known to have a toxic effect to understand why they are toxic, what genes they affect, and the mechanism by which they have that effect. If these new techniques prove useful for better understanding chemicals already known to be toxic, then they can be applied to the large numbers of chemicals for which little is known.
The agencies collectively contributed more than 2,800 compounds for NCGC analysis. That first undertaking took about one year to complete. "The next goal of the collaborative group is to expand screening to the approximately 11,000 compounds of highest concern to the agencies, but the total number of compounds of potential toxicological interest worldwide might be upwards of 100,000," Dr. Austin said.
The compounds supplied by the EPA include many used in pesticides and industrial chemicals, while a greater variety came from the NTP's collection, including ingredients from cosmetics, plastics and even herbal supplements. Scientists already know or suspect that these compounds are toxic because of structural characteristics, but they are not sure of the precise toxic mechanisms of action. The central scientific question being explored is what these compounds do to an organism to cause toxic reactions.
As a further part of the effort, NCGC tested 1,408 known toxicants submitted by NTP with 13 human and rodent cell lines. The analyses aimed to establish which cells from different species were damaged by the toxicants. "We're looking at the spectrum of activity across various cell types to see if there is a battery of cell types that are more relevant to the observed animal toxicity than others," said Ray Tice, Ph.D., acting branch chief of the Biomolecular Screening Branch of NTP.
An article published in Environmental Health Perspectives on November 22, 2007, reported these initial findings. Some of the compounds were toxic to all the cell types at similar concentrations, whereas others showed species- or cell type-specific toxicity. Human blood cells and nerve cells were most sensitive to the toxic effects of the compounds, and multiple compounds were found that affected rodent cells but not human cells, mirroring the differences in species toxicity often seen in animal testing.
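One way to surface the species-specific hits described above is to compare each compound's activity calls across the panel of human and rodent cell lines. The sketch below is a minimal illustration of that idea, not the analysis used in the published study; the profile data, the `(species, cell_type)` keying, and the function name are all invented for the example.

```python
# Hypothetical per-compound activity calls across cell lines, keyed by
# (species, cell_type); True means the compound was cytotoxic in that line.
profiles = {
    "compound_A": {("human", "blood"): True, ("human", "nerve"): True,
                   ("rodent", "liver"): True},
    "compound_B": {("human", "blood"): False, ("human", "nerve"): False,
                   ("rodent", "liver"): True},
}

def species_specific(profile):
    """Return the single species a compound is active in, or None if
    it is active in both species or in neither."""
    active = {species for (species, _), hit in profile.items() if hit}
    return active.pop() if len(active) == 1 else None

for name, profile in profiles.items():
    print(name, "->", species_specific(profile) or "shared/inactive")
```

A compound flagged as rodent-only by this kind of comparison, like hypothetical compound_B, corresponds to the study's observation that some compounds affected rodent cells but not human ones.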
Dr. Tice said NTP is excited about the opportunity to collaborate with NCGC on this project. "What the NCGC brings to the table is the equipment no one else has. There is no other resource like that, other than those that may be tied to a pharmaceutical firm," he said, noting that NCGC's high-throughput, robotic screening offers the power to test as many as two million different chemicals in one assay in a week.
The knowledge base created by the collaboration also may lead to ways of identifying individuals who may be at particularly high risk if exposed to particular chemicals and to new approaches for treating those who may become exposed. "We want to protect humans from toxic exposures and better characterize risk," said Dr. Tice. "We think this partnership represents a major step towards achieving those goals as swiftly and efficiently as possible."
Last Updated: July 30, 2012