Did you know that only a small fraction of the tens of thousands of commercially used chemicals have undergone toxicological assessment? Time and cost constraints, not to mention the ethical impossibility of studying these chemicals in human trials, hamper large-scale toxicological assessment. Physiologically-based toxicokinetic (PBTK) models can be leveraged to predict toxicokinetics from in vitro measurements and physico-chemical properties.
Dr. John Wambaugh from the National Center for Computational Toxicology (NCCT) in the Office of Research and Development of the US Environmental Protection Agency joined me to discuss how he’s using PBTK modeling to assess the toxicological risk posed by environmental chemicals.
Suzanne Minton: Can you describe your research at the National Center for Computational Toxicology?
John Wambaugh: Our center is located in Research Triangle Park, North Carolina. My job title is “physical scientist” which conjures up the image of a gym teacher. In fact, my research involves applying my training in physics and computer science to provide insights into how chemicals affect biological systems.
Suzanne Minton: What do you think are some of the biggest misconceptions about assessing the toxicity of environmental compounds, and how would you address them?
John Wambaugh: Well, we first have to clarify what audience we’re referring to. Say, I’m talking to my mom, a non-scientist. Her biggest misconception is that every chemical is thoroughly tested. Actually, the degree of testing depends on a chemical’s intended use.
If a chemical is a pesticide, its toxicological liability must be rigorously evaluated under existing US law. But if a chemical is designed for a benign use, a blue dye for example, the laws are less strict, and no testing may be conducted if its chemical structure resembles other chemicals that are known to be low risk.
So, many chemicals found in our blood in small quantities may not have been thoroughly tested: testing was not required by law, the chemical and its use were thought to be benign, and the resources for testing simply weren't there. These industrial chemicals are not drug leads with blockbuster potential. Some of these chemicals in consumer products include adhesives, stabilizers, and fragrances. What always surprises my mom is how little toxicological testing has been performed.
To address the time and resource constraints of traditional chemical testing, the US government is spending millions of dollars on a research effort to evaluate thousands of chemicals via high throughput screening. But that funding is divided across the thousands of chemicals being tested. Per chemical, we're spending only $20,000 or $30,000. That's a lot of money for an individual. Yet compared with what pharma might spend on prospectively evaluating the toxicity of a drug lead, $30,000 of in vitro testing is minimal.
Suzanne Minton: As a computational toxicologist, what lifestyle changes would you recommend to minimize exposure to potentially toxic chemicals?
John Wambaugh: Avoiding exposure to chemicals in daily use is really difficult. Despite the common use of thousands of chemicals, today's world is significantly cleaner than it was several decades ago. We have a lot less smog; the water is clearer. So, I don't want people alarmed by the minuscule amounts of the many chemicals we're exposed to. On the other hand, we don't always know the biological consequences of these compounds. That's why my team is using modeling and simulation to try to predict the toxicological liability of industrial chemicals.
Suzanne Minton: Can you discuss how you use the Simcyp Simulator to assess chemical toxicity?
John Wambaugh: To date, the EPA has publicly released data on 3,000 chemicals and is continuing to test more. For the vast majority of those chemicals, we don’t know which human enzymes metabolize them. If we had additional resources, we might be able to figure that out for a particular chemical. But it takes time and money, and we haven’t done that for all those chemicals. Thus, we generally lack the data required to use the full Simcyp Simulator. So, we use the “Sim” part of the Simcyp Simulator, but we don’t necessarily use the “cyp” part.
The “Sim” part of Simcyp simulates human population variability. People vary by height and weight. Blood flow to organs varies between people. All those factors affect how a chemical either accumulates in or clears out of your body. We have been measuring the clearance of a chemical from a pool of human hepatocytes. The livers were donated to science, and we mix hepatocytes from a group of individuals to average out inter-individual variability in metabolism. Then, we measure average in vitro hepatic clearance. Next, we plug that information into the Simcyp Simulator to simulate the variability in body weight, height, and blood flow and determine the range of blood concentrations in the US population given a known exposure rate.
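The workflow Dr. Wambaugh describes, a single measured in vitro clearance value fed into a simulation of physiological variability, can be sketched as a toy Monte Carlo model. This is not the Simcyp algorithm; the well-stirred liver formula and every parameter value below are illustrative assumptions:

```python
import random

def css_ug_per_L(dose_mg_per_kg_day, body_wt_kg, q_liver_L_per_h,
                 cl_int_L_per_h, f_ub=0.5):
    """Steady-state plasma concentration from a simple well-stirred
    liver model (hepatic clearance only; illustrative, not Simcyp)."""
    hepatic_cl = (q_liver_L_per_h * f_ub * cl_int_L_per_h /
                  (q_liver_L_per_h + f_ub * cl_int_L_per_h))  # L/h
    dose_mg_per_h = dose_mg_per_kg_day * body_wt_kg / 24.0
    return 1000.0 * dose_mg_per_h / hepatic_cl  # mg/L -> ug/L

random.seed(0)
# Sample body weight and liver blood flow to mimic the inter-individual
# variability that Simcyp simulates (means and SDs are made up).
css = sorted(css_ug_per_L(dose_mg_per_kg_day=0.001,
                          body_wt_kg=random.gauss(75, 15),
                          q_liver_L_per_h=random.gauss(90, 15),
                          cl_int_L_per_h=5.0)
             for _ in range(10_000))
median, p95 = css[len(css) // 2], css[int(0.95 * len(css))]
print(f"median Css: {median:.2f} ug/L; 95th percentile: {p95:.2f} ug/L")
```

The spread between the median and the upper percentiles is what identifies the most sensitive individuals, those attaining the highest plasma level at a given exposure rate.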
The results from these simulations reveal that while people may be exposed to the same dose of a chemical, the resulting plasma concentration of the chemical varies widely across the population. Thus, the most sensitive person would be the person who attains the highest plasma level.
Here’s a hypothetical experiment since we can’t test these chemicals in people for obvious ethical reasons. Say you have two chemicals, A and B. People use a gram of chemical A and a gram of chemical B. The blood level of chemical A that causes effects might be a million times higher than the blood level of chemical B due to differences in clearance that we can predict with Simcyp and biological potency information from high throughput screening.
Probably two orders of magnitude of that difference are due to differences in biological potency revealed in high throughput screening. Chemical A’s potency might be 10 µM whereas chemical B’s potency might be 0.1 µM. Then Simcyp is used to convert these potency values to human doses.
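The conversion Dr. Wambaugh describes is a form of reverse dosimetry: divide the in vitro potency by the predicted steady-state blood concentration per unit dose to get an "oral equivalent dose." A minimal sketch, with entirely hypothetical numbers that mirror the 100-fold potency gap above plus an assumed 10,000-fold clearance difference:

```python
def oral_equivalent_dose(ac50_uM, css_uM_per_mg_kg_day):
    """Reverse dosimetry (illustrative): the daily dose (mg/kg/day)
    expected to produce a steady-state plasma concentration equal to
    the in vitro potency (AC50)."""
    return ac50_uM / css_uM_per_mg_kg_day

# Hypothetical chemicals A and B: B is 100x more potent in vitro, and
# is also cleared so much more slowly that the same dose yields a
# 10,000x higher steady-state concentration.
dose_A = oral_equivalent_dose(ac50_uM=10.0, css_uM_per_mg_kg_day=0.01)
dose_B = oral_equivalent_dose(ac50_uM=0.1, css_uM_per_mg_kg_day=100.0)
print(dose_A / dose_B)  # the million-fold difference in effective dose
```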
To date, we have publicly released in vitro clearance data on 543 chemicals. Internally, we’ve quantified another couple hundred chemicals. And we’re able to examine those 543 chemicals, determine which are the most potent (those producing the highest blood levels for the lowest dose), and prioritize them for further toxicological assessment.
The EPA has the capacity to perform more comprehensive studies. But, we can only do them on so many chemicals at once. If our resources support studying five chemicals next year, we’re able to use Simcyp and our high throughput screening data to nominate which five chemicals would be most worth studying in 2017 and which five chemicals would be most worth studying after that. We call this process “risk-based prioritization.”
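Risk-based prioritization can be illustrated as a simple ranking by oral equivalent dose; all chemical names and numbers below are hypothetical:

```python
# Hypothetical screening results: (chemical, in vitro AC50 in uM,
# predicted steady-state conc. in uM per mg/kg/day of exposure).
screened = [
    ("chem-001", 10.0, 0.01),
    ("chem-002", 0.1, 100.0),
    ("chem-003", 5.0, 1.0),
    ("chem-004", 50.0, 0.5),
    ("chem-005", 1.0, 10.0),
    ("chem-006", 2.0, 0.05),
]

# Rank by oral equivalent dose (AC50 / Css-per-unit-dose): the smaller
# the dose needed to reach a bioactive blood level, the higher the
# priority for follow-up animal or mechanistic studies.
ranked = sorted(screened, key=lambda c: c[1] / c[2])
top_five = [name for name, *_ in ranked[:5]]
print(top_five)
```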
Suzanne Minton: That’s amazing that two chemicals equally dosed could produce orders of magnitude difference in plasma concentration! So you talk about how the Simcyp Simulator facilitates examining population variability. Some of the virtual populations you could generate are pediatric populations, pregnant women, people with organ impairment, or people of different ethnicities. Do you use these capabilities of the Simulator?
John Wambaugh: Only with respect to body weight and organ flows. Most of the variation between ethnicities is due to variability in metabolizing enzymes. We lack data on which metabolizing enzymes are involved for most chemicals.
My long-time collaborator, Dr. Barbara Wetmore, who is now EPA staff, obtained CYP-specific data for about a dozen chemicals across a dozen CYPs. Then, she used Simcyp to simulate pediatric populations and compare the difference in exposure between adults and children. Thus, the questions that we can address using modeling and simulation are largely limited by the availability of data.
Suzanne Minton: Do you or your team have any new or different ways that you plan on using Certara tools this year or next?
John Wambaugh: We’re exploring using the Simcyp in vitro Data Analysis Toolkit (SIVA). Our ToxCast project has measured around 1,100 in vitro assay-endpoint components for several thousand chemicals, and we’re always trying to figure out better ways to understand the data. For instance, say we add 1 µM of a chemical to a cell culture dish. How much of that chemical is taken up by the cells? How much sticks to the dish’s walls? And how much binds to the plasma proteins in the cell culture media?
Those calculations interest us because if 1 µM of a chemical causes significant toxicity, you might think that this concentration causes toxicological effects in vivo. But if the cells only take up 0.1% of the chemical, and you still see toxicity, then the chemical is much more potent than previously thought.
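The arithmetic behind that correction is straightforward; the 0.1% uptake fraction below is the hypothetical figure from the example:

```python
def effective_conc_uM(nominal_conc_uM, fraction_in_cells):
    """If only a fraction of the nominal concentration actually reaches
    the cells, the concentration driving the observed effect is smaller,
    so the chemical is correspondingly more potent than it appears."""
    return nominal_conc_uM * fraction_in_cells

nominal = 1.0    # uM added to the dish
f_cell = 0.001   # 0.1% taken up by cells (hypothetical)
effective = effective_conc_uM(nominal, f_cell)
print(effective)  # the effect is driven by 0.001 uM, not 1 uM,
                  # implying roughly 1000-fold greater potency
```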
Suzanne Minton: What do you find are the major advantages of using the Simcyp Simulator?
John Wambaugh: Really, it’s the ability to extrapolate toxicological exposure to other populations. After identifying high-priority chemicals, we’ll measure their CYP-specific metabolism. Then, we could perform simulations in different virtual ethnic or age groups to determine how variability in CYP expression levels and/or genotypes impacts exposure. The CYP profile in children is radically different from that in adults. In fact, CYP expression also changes after medical interventions.
For example, think about a pregnant woman who is at risk of delivering prematurely. Her doctor might prescribe steroids to help the baby’s lungs develop. Steroid exposure will give that baby a different CYP profile than normal. Not adverse, just different. So performing simulations in virtual populations to determine the impact of CYP variability on toxicokinetics will yield insights into which life stages and subpopulations are the most vulnerable to certain chemicals.
PBTK models can also be used to extrapolate from animal models to humans. The EPA has a research database of the results from roughly 5,000 animal studies from the scientific literature.
As we start to better understand how chemicals work in humans, we can potentially ground those predictions by also predicting toxicokinetics in the rat, given our in vitro data. You need a model to do that. The Simcyp Simulator lets us anchor our in vitro human studies to some of these previously performed animal toxicokinetic studies.
Suzanne Minton: I’m glad you brought up animal testing. By using PBTK modeling, could we reduce the amount of toxicological testing performed on animals?
John Wambaugh: Yes and no. There are a lot of chemicals that need toxicological assessment. Maybe some of these chemicals don’t need to be tested because they are lower priority. The real advantage of modeling and simulation is that it enables us to maximize the value of the animal tests, minimize the number of animal tests needed, and target which animal tests are most appropriate for a given chemical.
If you had models of virtual tissues that enabled predicting toxicodynamics, then you could start to eliminate animal testing. That might be years from now. Currently, we can be more confident selecting which chemicals are worth testing on animals based on the predictions from Simcyp. Our goal is to eliminate unnecessary animal testing altogether.
Suzanne Minton: That makes sense. Is there anything else that you’d like us to know?
John Wambaugh: Science is all about learning. When I started studying pharmacokinetics, I attended some Simcyp workshops, which were extremely helpful. Since then, I’ve sent many of my post-docs to Simcyp workshops as well.
Suzanne Minton: Thank you. That’s great feedback.
For a deeper look at how modeling and simulation can inform toxicological risk assessment, please watch this webinar by Dr. Barbara Wetmore: “A New Method to Quantify Population Variability for Toxicity Testing.”