Of the more than 80,000 chemicals used in the U.S., only 300 or so have ever undergone health and safety testing. In fact, only five chemicals have ever been restricted or banned by the U.S. Environmental Protection Agency (EPA). But now some 10,000 agricultural and industrial chemicals—as well as food additives—will be screened for toxicity for the first time, with the help of a rapid-fire testing robot.
“We are screening 10,000 chemicals using these rapid tests to characterize the bioactivity of the chemicals to predict their hazard and to use that information to prioritize for further screening and testing,” says biologist David Dix, deputy director of EPA’s National Center for Computational Toxicology. “We can test a lot of chemicals with a lot of repetitions at a lot of different concentrations.”
The program, begun at EPA as ToxCast to assess 1,000 chemicals (and known as Tox21 in its expanded form), employs a robot to speed chemical screening. On plastic plates filled with 1,536 tiny wells, the robot drops varying amounts of different chemicals onto human cells and human proteins. Essentially, each plate has 1,536 experiments underway at the same time. “In a stack of 100, we have 150,000 combinations of chemicals and targets,” Dix says.
The robot arm and its numerous five- to 10-microliter wells replace the old standby of toxicology—animal testing. In addition to being slow and controversial, animal tests do not necessarily reveal how a chemical might affect humans, nor do they deliver much insight into the mechanisms by which a given chemical produces toxic outcomes. Simply by running the robotic tests, the EPA and its partner agencies will generate more information on chemical toxicity in the next few years than has been created in the past century. The effort has already screened more than 2,500 chemicals, including the dispersants employed to clean up BP’s 2010 oil spill in the Gulf of Mexico.
The new information may allow toxicology to evolve from a reactive science to a predictive one; models of liver toxicity based on chemical testing, for example, could predict how new chemicals would interact with the liver, based on molecular structure and other information. Already, ToxCast scientists have built such a predictive model for liver toxicity: it accurately forecast tumor formation in rats and mice that had been exposed to certain chemicals for two years. A similar effort proved accurate for reproductive toxicity, including vascular development and endocrine disruption—an area of keen interest for human exposure to chemicals such as bisphenol A (BPA).
In addition, the high-speed robotic testing will allow toxicologists to better understand mixture and low-dose effects by testing both combinations of chemicals for additive damage and how, for example, 15 different concentrations of a given chemical affect human cells. “We suspect that when we look at 10,000 chemicals we’ll see a lot of activity that we didn’t know about,” Dix says of the two-year effort, in which the EPA has partnered with a handful of federal health agencies.
“For a lot of chemicals, there’s no requirement for animal toxicity testing or any other type of testing,” Dix notes. “Tox21 is going to provide information where there is no information.”