How to Read a Toxicity Study in 4 Steps
Dr. Meg Christensen is the founder of Interior Medicine, a physician-created resource on non-toxic home products and household exposures. Her layer-by-layer analysis of materials and products draws on her background in medicine, biochemistry, epidemiology, and clinical research.
Published May 2026 | Updated May 2026
How do you tell if a toxicity study is actually solid? Most viral claims fall apart in five minutes if you check four things: what was studied, at what dose, under what conditions, and what the authors actually concluded. This section walks through the four steps, plus why environmental health research so often seems to contradict itself.
A Language for What You Already Know
When a claim about a chemical comes across our social media feeds or shows up pre-summarized by AI in a Google result, most of us pause and think: wait, is this real? The pause is the right instinct. What usually follows that pause is a set of quiet, invisible shortcuts we use to make the decision.
So here, we aim to replace those hazy shortcuts with tools that work more deliberately. Even experts spread misinformation or extrapolate a little too far. AI summaries collapse both careful and careless sources into the same paragraph. Search results often reward what's been clicked over what's been verified. The same toolkit applies whether the claim is in a Google search, a TikTok video, a wellness newsletter you subscribe to, a news headline, or a conversation with someone you trust.
At the end, we turn to why research on environmental toxins so often contradicts itself, and how to tell where the science is solid versus where it's still being figured out.
Articles That Make You Go 😮
By the end of this section, I hope seeing upsetting research makes you go 🤨 instead of 😮 — not just at the claim, but at your own first read of it. Questioning your own first reaction isn't a knock on your instincts. It's how you stay open to finding out something is more interesting than you first thought.
If you're on social media, you’ve probably seen the videos about avoiding a certain chemical or product, with a research paper held up in the background as the receipt. The creator makes an alarming claim, the paper looks legitimate, the reasoning seems to make sense, and the comments are full of people thanking them for sharing the truth. Sometimes they're right! But often, they're wrong, or sort of wrong. Reading research carefully is quite difficult, even for the people who do it for a living. Social media, the Land of Nuance (just kidding), makes it harder.
The same dynamic plays out in other places too: for example, the first few Google results and AI summaries on a topic. These tend to echo what the most popular blogs say, which is not always the truth. Case in point: I was recently trying to track down whether formaldehyde in toilet paper is still a problem. The first several pages of search results, including the AI summary at the top, all said yes. But a look at the source names showed they were all blogs. It took some digging to find the primary research studies that actually measure formaldehyde in finished tissue products. It turns out the practice mostly stopped a while ago, but a lot of us are still passing the old information around.
The four steps below are the ones I use on myself in those moments, and in plenty of others. The goal is to use discernment — this guy: 🤨 — never automatic acceptance. Think about reaching for these steps whenever a claim gives you a flash of alarm, but also whenever one gives you that satisfying flash of told-you-so. It takes about five minutes, and you might surprise yourself.
We'll walk through all four steps with one example: the claim that PEVA shower curtains are toxic. This is a claim you may have seen, with research appearing to back it up. Look closer, though, and you might laugh at how little the research actually supports it.
Step 1: Find the Actual Paper
Google the title and find the actual study. Look for the original article, not a press release, news piece, or someone else’s summary of it.
When you do this for the PEVA claim, the paper itself comes up in the results. Ignore the often attention-grabbing summary line (you'll see in Step 3 that the actual conclusions are usually much more careful) and click through to open it. Just by pausing, searching, and opening the article, you've put yourself, not the video, in charge of what to make of it.
- Step 1: Pause. Google the title and open the paper.
- Step 2
- Step 3
- Step 4
Step 2: Skim the paper. What did they do, and what did they find?
You don't have to read the full paper. Just answer these four questions. Start with the Abstract, which is the short summary at the top. If you need more, open the full text and look at the Methods and Conclusion sections too. What you're looking for is: what was studied, at what dose, under what conditions, and what the authors actually concluded.
What was studied: people vs. animals vs. cells in a petri dish. The PEVA study used aquatic worms that don't have lungs, a liver, or kidneys.
At what dose: 1000x typical exposure vs. realistic levels. For the PEVA study, VOC fumes were extracted from PEVA by heating it in a 150°F water bath.
Under what conditions: sealed chamber vs. a regular room. For the PEVA study, the worms were put in a sealed chamber with the VOCs for eight days.
What the authors concluded: a verdict, or something more careful? The PEVA paper's title implied certainty (PEVA is harmful). The conclusion section was much more careful: effects were observed under the specific high-temperature, sealed-chamber, multi-day conditions used in the study, and the authors called for further research, particularly in mammalian models, before any conclusions could be drawn about human health.
For the PEVA example, just getting to this step is where any worrying (or told-you-so) starts to fall apart.
- Step 1: Pause. Google the title and open the paper.
- Step 2: Skim the paper. What was studied? At what dose? Under what conditions? What did the authors conclude?
- Step 3
- Step 4
Step 3: Watch for the Leap
This is the silent step that catches almost everyone, including people with research or science backgrounds. It's often where the person in the video goes wrong. Once you see what a study found, the mind reaches for the next step almost automatically: well, if it does this in worms, it would probably do something similar in people, right? And we use shower curtains every day, so even a small effect would add up over time, right? Or, going the other direction: worm studies always overstate things, so this one probably doesn't mean anything for humans either. In short, we love to extrapolate.
Extrapolation feels right because it follows a logical chain of thought. But the chain often doesn't hold up to actual science, and this is exactly how someone turns a screening study on aquatic invertebrates into "PEVA shower curtains are toxic" or "this study means nothing, ignore it entirely." Watch for the urge in yourself, and watch for it even more carefully in whoever is making the claim, because they had to take the leap first to make the video at all.
A few specific traps to watch for, both in yourself and in the person in the video:
The species jump: assuming a chemical that affects one organism will affect another, even when the two species don't share the relevant systems. The PEVA worms don't have lungs, a liver, or kidneys, which are the three systems most relevant to how a human would actually inhale and detoxify a VOC.
The dose jump: assuming an effect at high doses scales smoothly down to low doses. Sometimes it does, but often it doesn't. Many substances have threshold effects (no harm below a certain dose) or non-linear dose-response curves, both of which we covered in Part 2. "1000x exposure causes harm" doesn't automatically mean "1x exposure causes 1/1000th of the harm." There's a short numerical sketch of this right after the list.
The conditions jump: assuming an effect under extreme conditions translates to normal ones. The PEVA study used 150°F water in a sealed chamber for eight days. A 90°F bathroom for 10 minutes is a completely different situation, not just a smaller version of the same one.
The repeated-exposure jump: assuming brief exposures repeated over years equal a single sustained exposure. Some chemicals do bioaccumulate and act this way, but most don't. The body clears most things between exposures.
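To make the dose jump concrete, here's a minimal numerical sketch in Python. Every number in it is invented for illustration, not taken from any study; it just compares naive linear scaling against a hypothetical threshold model, where nothing happens below a cutoff dose.

```python
# Toy comparison of two dose-response shapes. All numbers are hypothetical,
# chosen only to show why high-dose findings don't pin down low-dose effects.

def linear_model(dose, slope=0.001):
    """Naive extrapolation: effect scales smoothly with dose."""
    return slope * dose

def threshold_model(dose, threshold=100.0, slope=0.001):
    """Threshold model: no effect at all below the cutoff dose."""
    return 0.0 if dose < threshold else slope * (dose - threshold)

for dose in [1, 10, 100, 1000]:  # 1x up to 1000x a "typical" exposure
    print(f"dose {dose:>5}x   linear: {linear_model(dose):.4f}"
          f"   threshold: {threshold_model(dose):.4f}")

# Both models show clear harm at 1000x, so a high-dose screening study
# can't distinguish between them. At 1x, the linear model predicts
# 1/1000th of the harm; the threshold model predicts none at all.
```

A study run only at the high dose looks identical under both models, which is exactly why "harm at 1000x" leaves the 1x question wide open.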
The better answer to "couldn't this big risk in worms mean a small risk in us?" is: maybe. The only way to know for sure is more research, in better-designed studies, in species closer to our biology. A single screening study isn't a basis for extrapolating. It's a basis for asking the next questions.
This is why a lot of environmental toxin research takes so long. The gap between the studies we can do and real-life exposure is very wide, and bridging it ethically takes time.
Finally, resisting extrapolation means a willingness to sit with uncertainty, to make our best decisions with what we know, and to leave room for the mystery of how all of this actually works. It doesn't mean you can't take action. Choosing to avoid a substance because the science isn't settled isn't giving in to fear-mongering. You can do it out of precaution and respect for how much we still don't understand. The difference between that and blindly accepting someone else's extrapolation is that you've made the choice yourself, knowing exactly what's known and what isn't.
- Step 1: Pause. Google the title and open the paper.
- Step 2: Skim the paper. What was studied? At what dose? Under what conditions? What did the authors conclude?
- Step 3: Resist the urge to extrapolate. Sit with what we don't know yet.
- Step 4
Step 4: Zoom Out. Is this one weird study, or one of many?
Steps 1-3 work on any single paper. Step 4 is the one that gives you the most honest read on whether a substance is worth worrying about. The payoff of Step 4 is that you walk away closer to the truth instead of stuck with someone else's framing. A scary video lands differently when you can already tell the study was just a basic first look. A quieter video lands differently too, because you can tell when the science actually agrees. You stop bouncing between panic and dismissal, and you start landing somewhere closer to your own settled sense of what to care about. When you zoom out, weigh four things:
Quantity: ten studies pointing the same way is a stronger signal than one study.
Quality: a well-designed study with a lot of people in it counts for more than a small, sloppy one.
Study type: cell and animal studies are useful for screening but don't always translate to humans. Observational studies in people are better, but can't prove causation on their own. Stronger still are summaries of many studies put together (often called "review articles" or "meta-analyses"). Strongest of all are reviews by official agencies (like ATSDR, IARC, EPA, or NTP) that bring everything together, weigh study quality, and issue a judgment.
Consistency: do the studies agree, or do they contradict each other? Twenty studies pointing the same direction across different populations, methods, and research groups is much stronger than twenty with mixed results, even if the count is the same.
The fastest way to find this out during a scroll is to Google the substance name plus "review" or "ATSDR." If a major agency has weighed in, their summary is usually written more accessibly than individual research papers and is the most synthesized answer you'll find in one place.
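If you'd rather script that check than Google it, NCBI's public E-utilities API exposes the same PubMed search. Here's a minimal sketch; the query term is just an example, and you'd swap in whatever substance you're checking.

```python
# Sketch: count PubMed-indexed review articles that mention a substance,
# using NCBI's E-utilities (no API key needed for occasional queries).
import json
import urllib.parse
import urllib.request

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def count_reviews(substance: str) -> int:
    """Return how many review articles in PubMed match the substance."""
    term = f"{substance} AND review[Publication Type]"
    url = f"{ESEARCH}?db=pubmed&term={urllib.parse.quote(term)}&retmode=json"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return int(data["esearchresult"]["count"])

print(count_reviews("vinyl acetate"))
```

A substance with dozens of reviews and an agency profile sits in a very different evidence landscape than one with a single screening study.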
There's another kind of zooming out: backing up a step and asking what the material is made of, and what it resembles. A search for PEVA itself might come up short, but a search for its components or its close chemical relatives often turns up more. There are no other studies on PEVA shower curtains directly, but if you zoom out to vinyl acetate (one of the chemical building blocks of PEVA), there's more to look at. Most of that research is from factories, where workers handle vinyl acetate as a liquid chemical before it's been turned into a polymer. That's a totally different situation from what you're dealing with at home: by the time vinyl acetate has been turned into PEVA, it's chemically locked into the plastic and behaves differently. The European Union's risk assessment of vinyl acetate concluded that current consumer exposures don't warrant additional restrictions. That's reassuring, but be aware you're widening the question: this is adjacent evidence on a related substance, not direct evidence on PEVA shower curtains.
So right now, we just can't say definitively that PEVA shower curtains are toxic to humans. But we also can't say they're definitively safe, because the evidence base isn't there for that either. Again, even when the evidence is thin or adjacent, choosing to avoid something is still valid. You could be extra cautious. You could avoid it out of care for the workers exposed to it. We dive deeply into the difference between the facts and what you choose to do with them in upcoming sections of the course.
- Step 1: Pause. Google the title and open the paper.
- Step 2: Skim the paper. What was studied? At what dose? Under what conditions? What did the authors conclude?
- Step 3: Resist the urge to extrapolate. Sit with what we don't know yet.
- Step 4: Zoom out. Is this one weird study, or one of many that agree?
Why Does Environmental Toxin Research Contradict Itself So Much?
You've probably noticed that for almost every chemical, there's a study saying it's harmful and another saying it's fine. That's because this is what early-stage research looks like. It's normal and expected. A few things cause it:
Studies on environmental toxicants are mostly observational, because researchers can't randomly assign people to exposures and watch what happens. They have to work with populations who are already exposed, which means other variables (diet, air quality, income, stress, age, what else they're exposed to at the same time) can explain the finding instead of the chemical itself. A study might link a chemical to higher rates of asthma in one neighborhood, but if that neighborhood also has more highway traffic and, on average, worse diets, any of those could be the actual cause. Observe the same chemical somewhere else, and the pattern might disappear because the highway traffic and the diet patterns aren't the same. That's normal and expected at first. But over time, and sometimes it takes a long time, you get enough studies that the real pattern starts to emerge.
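If it helps to see that dynamic in action, here's a toy simulation. Everything in it is made up: the chemical is completely inert, but because exposure clusters near the highway, a naive comparison still "links" it to asthma.

```python
# Toy confounding demo: an inert chemical appears linked to asthma only
# because exposure and highway proximity travel together. All rates invented.
import random

random.seed(1)
people = []
for _ in range(10_000):
    near_highway = random.random() < 0.5
    # Exposure to the chemical is more common near the highway...
    exposed = random.random() < (0.8 if near_highway else 0.2)
    # ...but in this toy world, only traffic raises asthma risk.
    asthma = random.random() < (0.15 if near_highway else 0.05)
    people.append((exposed, asthma))

def asthma_rate(exposed_flag):
    group = [a for e, a in people if e == exposed_flag]
    return sum(group) / len(group)

print(f"asthma rate, exposed:   {asthma_rate(True):.3f}")
print(f"asthma rate, unexposed: {asthma_rate(False):.3f}")
# The exposed group shows roughly double the asthma rate even though the
# chemical does nothing here. The highway is doing all the work.
```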
Publication bias makes this worse: studies with alarming results are more likely to get published and shared widely than studies that find nothing, which skews the published literature toward concern.
This is exactly why a single dramatic study isn't enough to act on. The bigger picture is what tells you what's actually going on. With environmental research, it's better to pay attention to the forest than each individual tree.
The Six-Step Framework, Now with Research
Remember our risk framework from Part 2? Each box — hazard, exposure, dose, dose-response, susceptibility, and risk — can have a different amount of research behind it. This matters because risk is what the other five add up to, and knowing how solid the evidence is on each of them is very helpful in figuring out what to actually worry about. The hazard might be well-studied while the actual exposure pathway hasn't been touched. We might know a lot about typical doses but very little about who's most susceptible.
So going forward in this course, I'll show each box in the framework filled in according to how much research supports it, using four levels to keep it visually simple. Each level reflects everything we just covered: how much research exists, how good it is, what types of studies have been done, and whether the results agree.
For PEVA right now, we have a little information about its hazard profile (one study), and we don't have any information about exposure pathways relevant to humans, dose-response, or susceptibility. Compare that to PFAS, which has decades of research, multiple agency reviews, and consistent findings across the framework. Same six-box structure, very different evidence depth.
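If you like explicit bookkeeping, here's one way to picture that evidence-depth idea as a data structure. The level names and the PEVA/PFAS assignments below are my shorthand for this sketch, not the course's official notation.

```python
# Sketch: the six-box framework with an evidence level per box.
# Levels and assignments are simplified stand-ins for illustration.
EVIDENCE_LEVELS = ["none", "limited", "moderate", "strong"]

peva = {
    "hazard": "limited",          # one screening study, in aquatic worms
    "exposure": "none",           # no human-relevant exposure studies yet
    "dose": "none",
    "dose-response": "none",
    "susceptibility": "none",
    "risk": "none",               # risk is what the other five add up to
}

# PFAS, by contrast: decades of research and multiple agency reviews.
pfas = {box: "strong" for box in peva}

for box in peva:
    print(f"{box:>15}   PEVA: {peva[box]:<8} PFAS: {pfas[box]}")
```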
Next: Part 4
And now we have perfect risk frameworks, with every box perfectly filled, ready to make perfect risk assessments.
Just kidding! The boxes are almost never all filled. Part of being honest about risk is sitting with what we don't know yet, and that's what Part 4 is all about. It’s a shorter section, and mostly visual.
Enter the giveaway below, or start Part 4 now ➜
Enter the Giveaway
You finished Part 3! You're eligible to add an entry to the giveaway.
Fill out the form below to submit your feedback and add 1 entry to the giveaway. Each of the 10 sections has its own form, so you can earn up to 10 entries total. Drawing is June 8, 2026. See full giveaway details and prize list here.
Heads up: if you've been on this page a while, refresh before submitting. The spam-prevention captcha times out after a couple minutes, so refreshing prevents an error message and having to resubmit.
Part 3 FAQ
How do I know if a health study I see online is actually reliable? Four quick checks catch most bad claims: (1) find the actual paper, not the press release or social media summary; (2) skim it to see what was studied, at what dose, under what conditions, and what the authors concluded; (3) resist the urge to extrapolate (an effect in worms or at 1000x typical exposure doesn't translate cleanly to humans at normal exposure); and (4) zoom out to see if this is one study or one of many that agree. Most viral toxicity claims fall apart at step two or three.
How do I read a research paper if I'm not a scientist? You don't have to read the whole paper. Open the abstract, then if you need more detail, the Methods and Conclusion sections. Look for four things: what was studied (people vs animals vs cells in a petri dish), at what dose (1000x typical exposure vs realistic levels), under what conditions (sealed chamber vs a regular room), and what the authors actually concluded. The title and a viral summary often imply more certainty than the conclusion section does.
Why does environmental health research seem to contradict itself so much? Most environmental toxicant studies are observational because researchers can't ethically assign people to chemical exposures and watch what happens. They have to work with populations already exposed, which means other variables (diet, air quality, income, stress, age, co-exposures) can explain the finding instead of the chemical itself. A study might link a chemical to higher asthma rates in one neighborhood, but if that neighborhood also has more highway traffic, any of those factors could be the cause. Publication bias makes this worse, because studies with alarming results are more likely to get published than studies that find nothing. Over time, with enough studies, real patterns emerge, but the early literature on most chemicals looks contradictory by design.
How do I tell if a scientific journal is predatory or legitimate? Predatory journals will publish almost anything for a fee, while legitimate journals require peer review before publication. The fastest check is to Google the journal name plus the word "predatory." For a more rigorous check, search the journal at the NLM Catalog (ncbi.nlm.nih.gov/nlmcatalog) and look for "Currently indexed for MEDLINE." PubMed listing alone doesn't guarantee quality, because PubMed is a search index, not a quality filter. MEDLINE indexing is the higher bar.
Part 3 References
The four steps aren't a new system I'm proposing. They're what I personally use when I see a research-backed claim about a chemical, distilled from established frameworks for evaluating research and online sources: SIFT, GRADE, and CRABS. Those frameworks are excellent, but I wanted to honor the reality of how we see most non-toxic content and give you the honest steps I use instead of the idealized ones. Here's what I changed and why:
I focused on what catches the most claims fastest. Most of us aren't motivated to settle in and apply these frameworks carefully during a social media scroll, so the four steps above will flag the majority of bad claims on their own, in a few minutes, on your phone.
I emphasized looking at dose and exposure. This is where most misleading claims in non-toxic spaces come from. A study finds a harmful effect at 1000x typical exposure in worms, and a video turns it into "this chemical is toxic" without ever mentioning the dose or the fact that the species studied doesn't even have a liver.
I separated out the extrapolation piece. In most frameworks, this is a silent step that gets folded into "evaluate the methods" or "consider applicability." But I think extrapolation is the actual move that turns a screening study into a panic post, and naming it as its own step makes it visible enough to catch yourself doing it.
I dropped the "who is making this claim" check. Authority and credentials absolutely tell you something, but they're not everything anymore, especially on social media where the lines are blurred. There are MDs who are also wellness influencers, and I've seen them make claims that don't hold up. Some wellness influencers without credentials are incredibly careful about the claims they make. So the four steps focus more on what's being said, not who's saying it.
Two more checks from these frameworks are worth mentioning: the journal a paper was published in, and the authors and funding behind it.
The journal: PubMed is a search tool, not a quality filter. Some journals are predatory and will publish almost anything for a fee, while others are the heavy-hitters where work has cleared serious peer review. The fastest check is to Google the journal's name plus the word "predatory." For the gold standard, search the journal at the NLM Catalog (ncbi.nlm.nih.gov/nlmcatalog) and look for "Currently indexed for MEDLINE." If you'd rather script that lookup, there's a short sketch after the next item.
Authors and funding: at the top or bottom of the paper, look for who the researchers are and who paid for the work. Are any conflicts of interest declared? An industry-funded study isn't automatically wrong, and an advocacy-funded study isn't automatically right, but both are reasons to read more carefully. For vinyl acetate, there is a 2025 study (Gauthier et al.) that tested 71 consumer products and found exposures well below health thresholds. But it was funded by the Vinyl Acetate Council, the industry trade association, which is a major conflict of interest. The European Union's earlier independent assessment reached a similar conclusion, which is why I'd lean on the EU's review rather than this single study alone.
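For anyone who wants to script the journal check from above, the same E-utilities endpoint can query the NLM Catalog. A minimal sketch; "currentlyindexed" is NLM's documented search filter for journals currently indexed in MEDLINE.

```python
# Sketch: ask the NLM Catalog whether a journal is currently indexed
# for MEDLINE, via the same E-utilities endpoint as the PubMed sketch.
import json
import urllib.parse
import urllib.request

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def is_medline_indexed(journal_name: str) -> bool:
    """True if the NLM Catalog has a currently-MEDLINE-indexed match."""
    term = f'"{journal_name}" AND currentlyindexed'
    url = f"{ESEARCH}?db=nlmcatalog&term={urllib.parse.quote(term)}&retmode=json"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return int(data["esearchresult"]["count"]) > 0

print(is_medline_indexed("The Journal of Toxicological Sciences"))
```

This is a convenience, not a verdict: a True here means the journal cleared MEDLINE's bar, but you should still read the paper itself with the four steps.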
Assessment frameworks references
Guyatt, G. H.; Oxman, A. D.; Vist, G. E.; Kunz, R.; Falck-Ytter, Y.; Alonso-Coello, P.; Schünemann, H. J. GRADE: An Emerging Consensus on Rating Quality of Evidence and Strength of Recommendations. BMJ 2008, 336 (7650), 924–926.
Caulfield, M. SIFT (The Four Moves). Hapgood, June 19, 2019.
Stokes-Parish, J. Navigating Health Misinformation: Use of the CRABS Framework as a Tool for Health Professionals. J. Med. Internet Res. 2022.
Grudniewicz, A.; Moher, D.; Cobey, K. D.; Bryson, G. L.; Cress, S.; Frank, K.; Manca, M.; Lem, M.; Dobler, G.; Shamseer, L.; et al. Predatory Journals: No Definition, No Defence. Nature 2019, 576 (7786), 210–212.
PEVA study and vinyl acetate monomers
Meng, T. T. Volatile Organic Compounds of Polyethylene Vinyl Acetate Plastic Are Toxic to Living Organisms. J. Toxicol. Sci. 2014, 39 (5), 795–802.
ATSDR. Toxicological Profile for Vinyl Acetate. U.S. Department of Health and Human Services: Atlanta, GA, January 2025.
Scientific Committee on Health and Environmental Risks (SCHER). Scientific Opinion on the Risk Assessment Report on Vinyl Acetate, CAS 108-05-4: Human Health Part. European Commission: Brussels, November 17, 2008.
PEVA vs PFAS frameworks
ATSDR. Toxicological Profile for Perfluoroalkyls. U.S. Department of Health and Human Services: Atlanta, GA, 2021.
