Chernobyl, the HBO miniseries: Fact and fiction (Part III)


This is Part III of a series.

With episode four of the series, we moved even further from reality than in prior episodes.

As I wrote in Part I, I realize the need to tell a story which grabs the viewer. Surely the cow assassination scene will go down in cinematic history, although it falls short of Mongo knocking out a horse in Blazing Saddles. (I wonder how Mel Brooks might have told the Chernobyl story.)

I’m amazed the producers didn’t get technical advice from a health physicist or radiobiologist rather than basing much of their screenplay on Svetlana Alexievich’s oral history, Voices from Chernobyl. Much of episode four focused on the effects of radiation exposure on the several hundred thousand personnel involved in mitigating the accident, referred to as liquidators in Russian. (I suggested to my Soviet colleagues this does not translate well into English.)

In this editorial I focus on one of the most controversial and misunderstood aspects of the Chernobyl NPF accident: long-term consequences.

First, we need background. Exposure to ionizing radiations causes two types of medical effects: deterministic and stochastic. Deterministic effects are predictable, dose-dependent, and occur in everyone exposed to the same dose. For example, everyone exposed to an acute whole-body dose of 5 gray (5,000 millisieverts) will have a marked immediate decrease in blood granulocytes.

However, not all deterministic effects of radiation are immediate. For example, development of cataracts and of coronary artery disease are deterministic effects of high-dose radiation exposure, which occur many years later.

Stochastic effects are different. Although they are also dose-dependent (the higher the dose, the greater their likelihood), not everyone exposed to the same radiation dose will develop the effect. The most important stochastic effects of radiation exposure are genetic abnormalities, birth defects, and cancer.

As I discussed in Part II, exposing 100,000 people to 100 millisieverts of radiation will cause about 2,200 extra cancers and about 1,100 extra cancer deaths. Meanwhile, the background cancer rate in these 100,000 people will be 80,000, and cancer deaths, about 40,000.

There are several messages from these data: First, only about 2% of exposed persons will get cancer from their radiation exposure. Second, only 3% of cancers in this population of exposed persons will be caused by their radiation exposure. Namely, 97% of cancers would have occurred anyway and have nothing to do with their additional radiation exposure.
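The arithmetic behind these percentages can be checked in a few lines. This is only a sketch of the figures quoted above (2,200 excess cancers among 100,000 people exposed to 100 millisieverts, against a background of 80,000), not an epidemiological model:

```python
# Figures quoted in the text: 100,000 people exposed to 100 millisieverts.
exposed = 100_000
excess_cancers = 2_200       # radiation-attributable cancers (estimate)
background_cancers = 80_000  # cancers expected regardless of exposure

# Share of exposed people who develop a radiation-caused cancer (~2%)
risk_per_person = excess_cancers / exposed

# Share of all cancers in this group attributable to radiation (~3%)
attributable_fraction = excess_cancers / (background_cancers + excess_cancers)

print(f"{risk_per_person:.1%} of exposed persons")             # 2.2%
print(f"{attributable_fraction:.1%} of cancers attributable")  # 2.7%
```

The complement of the second figure is the "97% would have occurred anyway" point in the text.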


The obvious challenges to us in determining if a radiation exposure increases a person’s cancer risk are twofold:

First, how to detect such a small increase in cancers. For example, if the collapse of the Soviet Union caused people to smoke and drink more (which it did), the increased cancers caused by these exposures would greatly overwhelm any radiation-induced cancers caused by the Chernobyl NPF accident. One can easily imagine the liquidators, aware of the potential risks associated with their radiation exposures, might change their smoking and drinking habits. We have strong evidence of this.

(It was widely believed in the Soviet Union that I had recommended drinking alcohol to protect against radiation-induced damage. Given living conditions in the Soviet Union at that time, drinking alcohol might not have been such a bad idea, but not to prevent radiation-induced cancers. Actually, as readers know, alcohol exposure causes far more cancers than radiation.)

A second challenge is how to distinguish radiation-induced cancers from cancers that would have occurred anyway. There is nothing unique about radiation-induced cancers that would help us spot them from non-radiation-induced cases. Add to this the disintegration of the Soviet Union such that epidemiologists must now deal with three countries—Russia, Ukraine, and Belarus, with the first two at war, and none of which have a high-quality population-based cancer registry like the Surveillance, Epidemiology and End Results (SEER) registry in the U.S.

The bottom line is, it’s difficult or impossible to detect whether radiation exposures like those from the Chernobyl NPF accident increase cancers, unless something extraordinary happens (more on this below).

To estimate potential long-term consequences of the Chernobyl NPF accident, we rely on prior studies, especially data from the Japanese A-bomb survivors.

However, the circumstances of populations exposed to the A-bomb are different than the Chernobyl-exposed populations (liquidators, persons who were evacuated, and those living in contaminated areas).

The A-bomb survivors were exposed instantaneously to external high-dose gamma radiations. Although liquidators had a somewhat similar exposure, people living in areas contaminated with radionuclides released from the Chernobyl NPF accident have a rather different type of exposure. Simply put, most of their exposure occurred (and will occur) over many years.

For people in contaminated areas, exposure comes predominantly from external irradiation by 137-cesium deposited on the ground, but also internally from eating foods and drinking water containing radionuclides. This means we use unproven assumptions to get from the A-bomb-based risk coefficients to predictions of what will happen post-Chernobyl. For example, most data suggest exposing people to the same amount of radiation over a prolonged vs. a brief interval is less likely to cause cancer. (Admittedly, some recent data suggest the converse.)

Other important background information may be new to some readers. All of us are exposed to ionizing radiations all our lives. Moreover, all of us are radioactive. The average radiation dose to Americans is 6.2 millisieverts per year. About half of this dose results from physicians ordering radiological studies, especially CT scans. If a person lives 80 years, their lifetime cumulative dose will be about 500 millisieverts, or one-half a sievert. Compare this to the average dose to an A-bomb survivor, 200 millisieverts.

More importantly, let’s compare these doses to populations exposed because of the Chernobyl NPF accident. The average dose to the liquidators was 120 millisieverts, to the evacuated population, 30 millisieverts, and to the people living in contaminated lands, 10 millisieverts. You can see from these data, most of these Chernobyl-related doses are less than most of us receive in our lifetime.

There are several other ways to view these data. For example, people living in Denver (1 mile high, sitting on the Rockies) receive about 80 millisieverts more radiation over their lifetime than people living in New York (sea level, on a sandy base). Another yardstick: exposure to 50 millisieverts increases our lifetime cancer risk from 43% to 43.5%, a 0.5-percentage-point increase.

Lastly, a CT/PET scan exposes someone to about 30 millisieverts. So, one way to look at the exposure of the liquidators is to think of them getting four CT/PET scans, the evacuated population—one CT/PET scan, and the population living in contaminated areas as getting an abdominal CT scan.
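These comparisons are simple multiplication and division, and can be verified with the doses quoted above (the 30-millisievert figure per CT/PET scan is the text's approximation):

```python
# Doses quoted in the text, in millisieverts
us_annual_background = 6.2       # average annual dose to Americans
lifetime_years = 80
lifetime_dose = us_annual_background * lifetime_years  # about 500 mSv

liquidator_dose = 120
evacuee_dose = 30
ct_pet_scan = 30                 # approximate dose per CT/PET scan

print(f"Lifetime background dose: {lifetime_dose:.0f} mSv")                  # 496
print(f"Liquidator dose = {liquidator_dose / ct_pet_scan:.0f} CT/PET scans") # 4
print(f"Evacuee dose = {evacuee_dose / ct_pet_scan:.0f} CT/PET scans")       # 1
```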


With this background, we can return to the Chernobyl accident, consider what has happened, and predict what might happen in the future. First, the bad news. There were about 7,000 cases of thyroid cancer caused by exposure to 131-iodine. All these cancers occurred in children less than 16 years old at the time of the accident and were caused by inhaling 131-iodine and ingesting it in milk. Because thyroid cancer is rare in children, there is no question these cancers were caused by the Chernobyl NPF accident. But because thyroid cancer is treatable, there have been fewer than 10 deaths.

What about other cancers?


There is only one report of an increase in other cancers amongst the exposed populations: an increased incidence of chronic lymphocytic leukemia (CLL) amongst the liquidators. This is curious, because most data suggest CLL is not a cancer caused by radiation. (It was the only leukemia not increased in the A-bomb survivors.) Also, because many cases of CLL are detected by routine blood testing, we need to exclude the possibility of surveillance bias, namely more blood tests in liquidators than amongst the general population.

However, more importantly, there are no reports of an increase in the other leukemias known to be caused by radiation. This absence is critical: these other leukemias were the most increased cancers in the A-bomb survivors, and they occurred about 10 years after exposure, 20-30 years earlier than more common cancers such as lung and breast cancers. These data suggest a large wave of radiation-induced solid cancers is unlikely to occur over the next several decades.

My intent is not to minimize potential cancer consequences of the Chernobyl NPF accident. If we use standard risk estimators of radiation-induced cancers based mostly on the A-bomb data (with the caveats I discussed), one can estimate 11,000 to 25,000 cancers over 80 years (95% confidence interval).

However, this should be compared with a background incidence of about 200 million cancers over this timeframe, or about a 0.008 percent increase. Every extra death is, of course, tragic, but perspective is needed. Per terawatt-hour (TWh) of electricity produced, nuclear energy is 10 to 100 times safer than coal or gas.
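The same back-of-the-envelope check works here. Dividing the text's estimated range of excess cancers by the 200 million background cancers gives a range of relative increases that brackets the roughly 0.008 percent figure:

```python
# Figures quoted in the text
excess_low, excess_high = 11_000, 25_000  # estimated radiation-induced cancers over 80 years
background = 200_000_000                  # expected background cancers over the same period

low_pct = excess_low / background * 100
high_pct = excess_high / background * 100

print(f"Relative increase: {low_pct:.4f}% to {high_pct:.4f}%")  # 0.0055% to 0.0125%
```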

Also, as I discussed in Part II, there are no convincing data of an increase in the two other stochastic effects of radiation: genetic abnormalities or birth defects. This is not surprising, as no increases were detected in the A-bomb survivors exposed to much higher radiation doses than any of the populations we are discussing. For a list of activities associated with the same risk of death as being exposed to 1 millisievert of radiation, please see Figure 1.

Lastly, although many readers have commented favorably on this series, some have said, “What does this jackass (or worse names) know about reviewing movies?” True, I am a failed screenplay writer, but all is not lost: I have an Emmy, I’m a member of the Screen Actors Guild, and I get to vote on best actor for the Academy Awards. Does this qualify me to review movies? Not according to my wife, children, and any intelligent person.

In the final installment, I will tackle the series’ portrayal of the Soviet government and of our medical and scientific colleagues who have, so far, been shown in a most unfavorable light. Please tune in next week.

Visiting Professor of Hematology, Imperial College London
Executive director of clinical research in hematology and oncology, Celgene Corp.