Rumors of even the faintest trace of radiation can spark a panic in the modern world. We assume that any exposure to any type of radioactivity, no matter how small, poses a serious cancer risk.
Our leaders encourage this phobia by relying on the Linear No-Threshold model, the scientific framework regulators use to set radiation safety standards. Although such fears are understandable, especially when promoted by authoritative bodies, the warnings often exaggerate the real risks. Unnecessary alarm about radiation stands in the way of developing new CO2-free nuclear energy generation. We must confront this problem if we are serious about reducing carbon emissions.
What is the Linear No-Threshold model, and why is it central to radiation safety? This model suggests that any exposure to ionizing radiation carries a risk of cancer, with the risk increasing in direct proportion to the amount of radiation. The notion that there is no safe exposure level for radiation has significantly influenced regulations governing nuclear energy, medical imaging, and even food safety.
The flaw in this assumption is that we are exposed to radiation all the time, regardless of anything built by human beings. To understand why, it helps to know how radiation exposure is measured. When radiation passes through the body, some of it is absorbed, contributing to that person’s radiation dose. Dose is typically measured in millisieverts (mSv), a standard unit that quantifies the biological impact of the radiation the body absorbs.
It is difficult to reconcile the Linear No-Threshold model with real-world data on radiation exposure. Radiation is all around us—both natural and man-made. On average, Americans receive a total radiation dose of 6.2 millisieverts annually, according to the National Council on Radiation Protection and Measurements. About half of this dose comes from natural sources, like cosmic radiation from the sun, radioactive minerals in the ground, and even trace amounts of radioactive materials in our bodies. The other half comes from man-made sources, including medical imaging such as X-rays and CT scans.
Different regions experience widely varying levels of background radiation without corresponding increases in cancer risk. In Kerala, India, some coastal areas expose residents to radiation levels as high as 70 mSv annually, more than ten times the average exposure in the United States, yet long-term studies show no elevated cancer risk in the region. Denver, Colorado, receives over 10 mSv of background radiation each year, well above the national average, yet Denver's cancer rates are no higher than those in lower-radiation areas. These discrepancies raise a critical question: if even the smallest increase in radiation exposure raises cancer risk, how do communities living with far more radiation remain unaffected?
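The tension can be made concrete with a back-of-the-envelope calculation. The sketch below applies a strictly linear risk coefficient to the background doses cited above; the roughly 5% excess lifetime cancer risk per sievert and the 70-year lifetime are illustrative assumptions, not established parameters:

```python
# What a strictly linear (no-threshold) model predicts for the places above.
# RISK_PER_SV (~5% excess lifetime cancer risk per sievert) is an
# illustrative assumption for this sketch, not a settled figure.
RISK_PER_SV = 0.05

def lnt_excess_risk(annual_dose_msv, years=70):
    """Excess lifetime cancer risk under a strictly linear dose-response."""
    lifetime_dose_sv = annual_dose_msv * years / 1000.0  # mSv -> Sv
    return RISK_PER_SV * lifetime_dose_sv

for place, dose in [("U.S. average", 6.2), ("Denver", 10.0), ("Kerala coast", 70.0)]:
    print(f"{place}: {lnt_excess_risk(dose):.1%} predicted excess lifetime risk")
```

Under these assumptions, the model predicts roughly a 2% excess lifetime risk for the average American and over 20% for residents of Kerala's high-background coast, a gap large enough that epidemiological studies should detect it easily. The long-term studies cited above do not.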
Occupational studies also challenge the core assumptions of the LNT model. A century-long study of radiologists in the United Kingdom reveals a striking contradiction. Radiologists, who historically have faced higher radiation exposure, showed a remarkable 32% lower mortality rate compared to their peers in other medical fields.
Radiation, like medicine, operates within a threshold framework. A small dose of a medication can relieve symptoms or aid recovery, while excessive doses cause harmful side effects. Research indexed by the National Library of Medicine shows that low doses of radiation can stimulate cellular repair mechanisms, a phenomenon known as radiation hormesis, potentially reducing overall cancer risk. Unlike the Linear No-Threshold model, which insists that any exposure carries risk, the threshold approach acknowledges that minimal doses of radiation can be safe or even beneficial, while only higher doses present significant risks.
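The difference between the two dose-response assumptions fits in a few lines of code. The 100 mSv threshold and the slope below are illustrative placeholders (100 mSv is often cited as the level below which epidemiology cannot reliably detect excess risk), not established parameters:

```python
def linear_no_threshold(dose_msv, risk_per_msv=0.05 / 1000):
    """LNT: excess risk is strictly proportional to dose, starting from zero."""
    return risk_per_msv * dose_msv

def threshold_model(dose_msv, threshold_msv=100.0, risk_per_msv=0.05 / 1000):
    """Threshold: no excess risk below the threshold, linear above it."""
    return max(0.0, risk_per_msv * (dose_msv - threshold_msv))

# U.S. average annual dose, Kerala coast annual dose, and a genuinely high dose
for dose in (6.2, 70.0, 500.0):
    print(f"{dose:6.1f} mSv  LNT: {linear_no_threshold(dose):.4f}  "
          f"threshold: {threshold_model(dose):.4f}")
```

Both models agree that large doses are dangerous; they disagree only in the low-dose region, which is exactly where the background-radiation evidence above applies.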
So, how can a flawed model like Linear No-Threshold dominate radiation safety standards? The answer lies in the scientific misconduct that led to the model’s establishment.
In his Nobel Prize lecture of 1946, Hermann J. Muller, a researcher in radiation genetics, made a bold declaration. Muller was renowned for his groundbreaking discoveries of how radiation causes genetic mutations. He claimed that even the smallest doses of radiation could significantly increase the risk of genetic mutations and cancer. His assertion that we cannot “escape from the conclusion that there is no threshold” was a central feature of his Nobel Prize address.
However, recent revelations expose serious scientific misconduct in Muller’s approach. Muller intentionally ignored studies that contradicted his claims and omitted conflicting evidence from his Nobel lecture. He used his prestigious platform to advocate for the LNT model on flawed premises, suppressing critical data and presenting an absolute, unprovable statement. That statement misled policymakers and the public, cementing a contentious framework that still shapes radiation safety standards today.
The Nuclear Regulatory Commission rejected three petitions to repudiate the Linear No-Threshold model in 2021. The commission admitted that no substantial evidence supported the LNT model, but it required that the model remain in use until it was conclusively disproved.
The evacuation measures following the Fukushima accident provide a vivid example of how harmful the LNT model can be. Guided by Linear No-Threshold assumptions, Japanese authorities rushed to evacuate people from areas where radiation levels were significantly lower than the natural background radiation found in places like India, where people live safely.
The consequences extended beyond the immediate aftermath of the Fukushima accident. Exaggerated fears of nuclear energy’s radioactive dangers led to the shutdown of nuclear power plants, which caused a significant spike in electricity prices. The resulting drop in energy consumption may account for as many as 4,500 deaths from cold.
The Linear No-Threshold model is indefensibly cautious and should be scrapped in favor of a model that estimates real radiation risk and takes a more balanced approach to radiation safety.
Permission to reprint this blog post in whole or in part is hereby granted, provided that the author (or authors) and the Mackinac Center for Public Policy are properly cited.