Volume 24, Number 3 July 1995

ARTICLES


Symposium on Radioactivity and Health: The Cold War Legacy

Physics and Society presents here papers based on the four talks given at the invited session on Radioactivity and Health: The Cold War Legacy. The session was held at the APS meeting in Washington, DC, on 18 April 1995, with Lois Joellenbeck presiding.


Introduction to the Symposium

Lois Joellenbeck

As we approach the 50th anniversary of the military use of atomic bombs, it is appropriate to reflect on some of the issues ushered in with the start of the nuclear era. Traditionally, wars have driven technology development. They have also provided motivation and means for major research efforts and innovation. The result of one of the most famous U.S. wartime research efforts, the Manhattan Project, was a powerful weapon that led to both an intensification of old societal challenges and a host of new ones.

Both old and new challenges are addressed in the following papers. Mark Goodman's presentation on the work of the Advisory Committee on Human Radiation Experiments addresses some of the difficult ethical issues raised by human experimentation. Even under ideal circumstances, experiments carried out on human beings require vigilant attention to ethical issues. In the special circumstances of the Cold War, secrecy and the concern for national security provided additional potential for compromising the rights of study participants. Scientific and technical training alone do not equip investigators with the tools to handle these questions, and under conditions of secrecy there is little opportunity to consider other perspectives.

During the Cold War, the novelty and limited understanding of radioactivity and its health effects provided another challenge to decision-making about radiation and its uses in research and industry. Science is a process; increased understanding in a discipline requires open dialogue between researchers about the interpretation of data. This can prove frustrating to a public seeking unequivocal answers about issues of health and safety. Barton Hacker's paper considers the continuing scientific discussions about the health effects of low-level doses of radiation, the role of this controversy in setting radiation protection standards, and the evolution of public concern about radiation health effects.

Cold War research, development, and production of nuclear weapons have also left a significant legacy in the former Soviet Union. Best known are the considerable releases of radionuclides from the Mayak weapons production site near Chelyabinsk in the Southern Ural Mountains. Data have also become available about high exposures to workers at this site. Marvin Goldman's paper reviews the findings to date from studies of health effects from both the environmental and occupational exposures in Russia, considering them in the context of other findings about health effects of radiation.

The effects of the Cold War have a long half-life. The three following presentations illustrate how, in the 1990s, scientists and society at large must still strive to address and learn from past problems. The questions raised are big ones, bigger than the topic of radioactivity alone, and not readily answered. How can individual rights be balanced against national interests? How can we best make decisions about environmental and occupational safety in the face of scientific uncertainty? How can the public engage in decisions which directly affect them, when those decisions have a large technical component?


The author is at the U.S. Congressional Office of Technology Assessment, which is currently carrying out a study of impacts and future concerns from nuclear contamination in the Arctic and North Pacific.




Human Radiation Experiments

Mark Goodman

You may have been present this morning at a session celebrating the 100th anniversary of Roentgen's discovery of x-rays. And there is plenty of reason to celebrate. X-rays have become a ubiquitous medical tool for observing the body's inner workings, for diagnosing and providing the basis for treating and curing disease. Ionizing radiation in its various forms offers other medical benefits: radiation therapy using x-rays or charged particle beams; diagnostic tools using radioisotopes injected into the body to provide an externally visible trace of the body's inner workings; radioisotope tracers for research in biochemical processes that feeds into medical practice. These benefits are the fruits of an illustrious chain of biomedical research.

This year also marks the 50th anniversary of the first use of nuclear energy for military purposes, beginning a darker history of cold war human radiation research. Often shrouded in secrecy, this history has provoked many ethical questions along with calls for investigation, punishment of those responsible, and compensation of those who may have been harmed.

In December 1993, Energy Secretary Hazel O'Leary learned of a newspaper article by Eileen Welsome, an Albuquerque reporter, about people who were subjects of experiments that injected plutonium into their bodies. O'Leary was shocked. As part of her openness initiative, she called for an outside investigation of these and other experiments that had come to light. She persuaded President Clinton to establish the Advisory Committee on Human Radiation Experiments, to report on human radiation experiments performed by the Department of Energy and by five other agencies that had been implicated in similar activities: the Defense Department, the Veterans Administration, the Department of Health and Human Services, NASA, and the CIA. The final report of this Committee is due later this year.

The role of a physicist

The Advisory Committee has fourteen members, who are experts in medical science, biomedical ethics, and other relevant fields. Ruth Faden, the Chair, is a Professor at the Johns Hopkins School of Public Health and Senior Research Scholar at Georgetown's Kennedy Institute of Ethics. The Committee itself has not reached final conclusions, and I am not here to preview those conclusions.

I am not a member of the Committee, but work on its staff. The staff, like the Committee, has people with different backgrounds and different perspectives on the Committee's work. As a physicist with an avocation for public policy, I saw the Committee as an exercise in forensic science policy. Its purpose is to discover what happened, decide how to judge what was right or wrong, and recommend ways to avoid repeating past mistakes.

My work has focused on the intentional release of radioactive materials into the environment for experimental purposes. This includes experiments conducted for atomic energy intelligence, for the design of radiological and nuclear weapons, to test specialized nuclear reactors, and to understand basic environmental processes, but not atmospheric nuclear tests themselves.

What is biomedical ethics?

As a physicist, I was not familiar with the field of biomedical ethics. In my experience, research ethics meant primarily honest reporting and attribution of research results. I would also argue that it was important to consider the ends to which research might be put. Biomedical researchers also face ethical questions about what means are legitimate for gathering experimental information, particularly when those experiments involve human beings as subjects of research.

In simple terms, biomedical ethics involves two basic principles. First, researchers must weigh the anticipated benefits of an experiment against the anticipated risks. People should not even be asked to take part in experiments where the risks are too severe, or the benefits too slight. Second, people who are subjects of research must knowingly agree to take part in that research, a requirement known as informed consent.

These principles seem fairly obvious, but they have not been applied universally. Historically, the relationship between physician and patient was different, with much more authority vested in the physician; informed consent was often honored in the breach.

But the Cold War history of human experimentation raises deeper concerns about secrecy and whether national security interests overrode respect for basic human dignity. I will present a series of anecdotes from this history, describing specific experiments and some of the bureaucratic background behind the experimental programs.

The demands of nuclear production

One of the first challenges of the Manhattan Project was the safe handling of large quantities of radioactive material. The entire pre-war stockpile of radium amounted to no more than 100 grams, and had already caused significant harm in the radium watch-dial industry.

Women who painted these watch dials often licked their brushes to point them, ingesting significant quantities of radium. Radium, chemically analogous to calcium, would concentrate in the bone. Radium decays by emitting an alpha particle. Unlike beta and gamma radiation, which can penetrate to significant depths in the body, the energy of an alpha particle is entirely absorbed within about 50 microns. This causes serious local damage to the bone, with high rates of necrosis and cancer of the bone.

The Manhattan Project would produce tens of kilograms of plutonium, also a strong alpha-emitter, along with large quantities of high-level radioactive waste, consisting of a mixture of fission products. For most of these new materials, including plutonium, the biological hazard was not yet known.

The field that developed to deal with these issues was known as health physics. We are fortunate to have with us today Marvin Goldman, the President of the Health Physics Society. According to one early health physicist I interviewed, "health physics" was code for radiation protection that avoided mentioning the word "radiation." This is one example of the widespread use of code words, including "tuballoy" for uranium, "product" for plutonium, "postum" for polonium, and others. These code words were part of an overarching policy of secrecy, designed to prevent Germany or Japan from learning of the bomb project.

Researchers at Berkeley performed a number of experiments on animals to determine how and where various radionuclides, including plutonium, went within the body. These data provided a general indication of the hazard level, but could not replace data taken from humans. Plutonium was particularly difficult to monitor because its alpha radiation, unlike the gamma radiation emitted by most fission products, could not be detected outside the body. It could cause severe, chronic radiation exposures to any tissues where it might accumulate. Based on urine samples and comparison with animal experiments, it appeared in the Spring of 1945 that Los Alamos workers were approaching the maximum permissible body burdens of plutonium. Health physicists therefore felt an urgent need for human experiments on the metabolism of plutonium.

Plutonium was a secret material at the time, and radioactivity in general was a sensitive subject. This made it impossible to provide the information necessary to obtain informed consent from volunteer human subjects. Instead, researchers at Oak Ridge, the University of Rochester, and Berkeley injected plutonium into the veins of 18 unwitting patients who were in the hospital for other reasons.

The risk from these experiments is somewhat difficult to assess, and depends on how long the patients were expected to survive. Although these patients were generally described as "hopelessly sick" or "terminal," few in fact suffered from such conditions. Several survived for decades, and one lived until 1991. As far as I know, none suffered from their exposures to plutonium. However, the amounts injected into these people's blood ranged from 5 micrograms to nearly 100 micrograms of plutonium, a level believed likely to cause acute harm. The occupational standard at the time was being lowered from 5 micrograms to 1 microgram maximum permissible body burden, and the current figure is 0.4 microgram.

It remains unclear what useful result was expected from these plutonium injection experiments. The most common route of exposure to plutonium was the inhalation of small particles (a few microns), not direct blood exposure through an open wound. The metabolism of plutonium through these two pathways is quite different. In any case, data from these experiments were not used to establish exposure standards for plutonium. They were used for a more limited purpose, to provide an expected profile for the excretion of plutonium once it enters the bloodstream.

Secrecy

After the war, researchers wanted to publish an article on these plutonium injection studies in an official history of the Manhattan Project. The draft paper was classified on the basis that it might expose the government to claims of liability -- i.e. to shield public officials from the consequences of their actions rather than to protect national security. After that point, the AEC's insurance branch had a formal role in declassification, to prevent the release of information that could expose the AEC to lawsuits.

Other examples of secrecy include the radiological warfare field testing program. From 1949 to 1952, the Army's Chemical Corps undertook a program to test prototype radiological weapons -- non-explosive weapons designed to cause radiation injury -- at the Dugway Proving Ground in Utah. Although some called for releasing information about the tests, defense officials decided to keep them secret in order to avoid causing public "anxiety," "alarm," or "hysteria." There is no evidence of a genuine national security concern underlying this secrecy.

The prospect of nuclear war

I had intended to speak about some of the biomedical research motivated by the prospect of nuclear war, but time does not permit. The Advisory Committee has uncovered a fascinating history of how concerns over possible casualties from radiation led to research on radiation effects that put many people at risk, including soldiers taking part in atomic tests and patients undergoing radiation treatment for cancer.

Atomic intelligence and the Green Run

So far, I have described work mostly done by others. I'd like to explain one of the historical events that I have studied, known as the Green Run.

The Green Run was an intelligence experiment involving the deliberate release of large quantities of radioactive gases from the Hanford plutonium separation plant in December 1949, less than three months after the discovery of debris from the first Soviet nuclear test. This experiment tested equipment and techniques that could be used to monitor Soviet nuclear production.

Plutonium production normally releases iodine-131, with an 8-day half-life, into the atmosphere. The amount of I-131 can be limited by allowing the irradiated fuel elements to cool for an extended period of time, and by scrubbing it from the gases going up the exhaust stack. By 1949, it was normal to cool fuel elements for roughly three months, and scrubbers removed roughly 90% of the iodine that would otherwise have been released. The Green Run got its name from the use of "green" fuel elements, cooled for only sixteen days. The scrubbers were also turned off for the experiment.
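The arithmetic behind the name is simple. As an illustrative sketch (using only the 8-day half-life and the cooling times given above; the effect of disabling the scrubbers is ignored), one can compare the fraction of I-131 surviving each cooling period:

```python
# Illustrative decay calculation; half-life and cooling times from the text.

HALF_LIFE_DAYS = 8.0  # I-131 half-life

def fraction_remaining(cooling_days, half_life=HALF_LIFE_DAYS):
    """Fraction of initial I-131 activity left after a cooling period."""
    return 0.5 ** (cooling_days / half_life)

green = fraction_remaining(16)   # "green" fuel, sixteen-day cooling
normal = fraction_remaining(90)  # routine three-month cooling

print(f"16-day cooling leaves {green:.1%} of the iodine")
print(f"90-day cooling leaves {normal:.4%}")
print(f"green fuel carries roughly {green / normal:.0f} times more I-131")
```

Two half-lives leave a quarter of the iodine; eleven-plus half-lives leave a few hundredths of a percent, a difference of several hundredfold even before the scrubbers are considered.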

As a result, roughly 8,000 curies (a curie is 3.7 x 10^10 disintegrations per second, based on the radioactivity of a gram of radium) of I-131 were released into the atmosphere near Hanford. This was large compared to the routine emissions of roughly 1 curie/day at the time, but not when compared to Hanford's peak operations during the war and soon after, when over 700,000 curies were released. As it turns out, the risk from the Green Run was relatively small, primarily because it occurred during the winter, when people were not drinking milk from cows that grazed on contaminated pastures.

The Green Run raises several questions:

(1) Was such a large release necessary? People at Hanford were routinely monitoring contamination in the nearby environment, even at relatively low release rates. There was also an airplane tracking the radioactive plume at low altitude, but this was probably not a technique that would be used in intelligence operations; little could be detected outside Soviet air space. I searched for actual and anticipated benefits from the Green Run, and came up empty-handed.

(2) Should people have been more careful, given their ignorance about the risks? Health physicists knew enough to understand that bioconcentration was an important factor in environmental exposures to radioactive materials, even if they had not identified the dominant milk pathway.

(3) Would better procedures for weighing environmental risks have produced a better result? The available documents provide no indication that anyone ever weighed the risks and benefits of the Green Run and concluded that it was worth the risk. In fact, it remains unclear today whether environmental risks undertaken in secret in the name of national security receive adequate review.

Conclusion

The story the Advisory Committee on Human Radiation Experiments has uncovered concerns the ethical obligations of scientists, particularly government-sponsored scientists, to society. What limits should be imposed on the means scientists use to obtain new knowledge? How do we weigh the costs of research, in dollars and in human health, against the benefits of the knowledge to be gained?

Scientists are used to thinking that the benefits outweigh the costs, but the taxpayers who pay us are not so sure. Citizens have grown increasingly suspicious of science and question the credibility of scientific information, in this case reassurances about the hazards of ionizing radiation.

To some extent, scientists are not to blame for this growing suspicion. The government's record of withholding information and providing misleading reassurances about the risks of nuclear testing and nuclear power has helped create a legacy of distrust. Society places little faith in learning but relies heavily on increasingly complicated technology. But scientists also bear a significant share of the responsibility for their poor image, in large part because they interpret their obligations as scientists too narrowly. Scientists prefer to avoid thinking about the value of their work, and when they do they rely on shopworn platitudes about the benefits of scientific progress. To restore their credibility, scientists will have to take seriously both the need to explain what they are doing to those who pay for it, and the obligation to understand how their work affects people.


The author is a research analyst with the Advisory Committee on Human Radiation Experiments, 1726 M Street, NW, Suite 600, Washington, DC 20036.




Setting Radiation Protection Standards: Science, Politics, and Public Attitudes in Historical Perspective

Barton C. Hacker

Radiation safety is a peculiarly 20th century issue, dating from the discovery of x-rays and radioactivity in the late 1890s. Initially, only a few doctors and technicians risked harm from prolonged or intense contact with x-ray machines or radium. Working together they devised radiological safety codes to protect themselves and, in due course, others whose work might put them at risk. These self-imposed standards defined what informed medical judgment accepted as safe levels of exposure to external x-rays and gamma rays or to radioactive substances that somehow entered the body. By and large, they worked.

Early standards

From 1928 onward, standard setters expressed acceptable limits for external radiation in roentgens. Technically defined in terms of radiation-caused ionization of air, the roentgen strictly speaking measured exposure, not dose. Specifying dose required another unit that considered both energy absorbed in tissue and the relative biological effect of the kind of radiation. Although one such unit, the rem, was devised during World War II, many practitioners persisted in using roentgens to express exposure or dose indifferently for another decade or more. By the late 1950s, however, the rem had become the standard unit of dose, while still another unit, the rad, was coming into use to express energy absorbed. Ordinarily, shifting from one unit to another did not greatly alter the numbers, which may account for the sometimes casual usage of the several units, even among knowledgeable practitioners.
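The relationship among these units can be sketched numerically. This is an illustrative calculation, not from the text: the quality factors shown are conventional textbook values. The point is that for x-rays and gamma rays, with a quality factor near 1, the numbers scarcely change between rads and rems, which helps explain the casual interchange of units:

```python
# Illustrative sketch: dose equivalent (rem) = absorbed dose (rad) x quality factor.
# Quality factors below are conventional textbook values, assumed for illustration.
QUALITY_FACTOR = {
    "x_or_gamma": 1.0,
    "beta": 1.0,
    "alpha": 20.0,
}

def rem_from_rad(dose_rad, radiation_type):
    """Convert absorbed dose in rads to dose equivalent in rems."""
    return dose_rad * QUALITY_FACTOR[radiation_type]

print(rem_from_rad(10, "x_or_gamma"))  # 10 rad of gamma -> 10 rem
print(rem_from_rad(10, "alpha"))       # 10 rad of alpha -> 200 rem
```

For internally deposited alpha emitters like radium or plutonium, by contrast, the choice of unit matters a great deal.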

What do these units mean? According to an authoritative 1950 manual, acute whole-body exposures up to 50 roentgens produced little more than blood changes; serious injury and any likelihood of disability took more than 100. Table 1, first published in 1950, summarized the acute effects of radiation exposure. Essentially similar tables, with "r" standing for rads instead of roentgens, can be found in the latest textbooks. Controversy over health effects of radiation, however, barely touches the area defined by this table. It centers rather at the very lowest end of the exposure spectrum, where "no obvious injury" tends toward no directly observable effect of any kind.

Table 1.  Probable Early Effects of Acute Radiation Doses over Whole Body	
Acute dose		Probable effect
	
0-25 r[oentgens]	No obvious injury
25-50			Possible blood changes but no serious injury
50-100			Blood-cell changes, some injury, no disability
100-200			Injury, possible disability
200-400			Injury and disability certain, death possible
400			Fatal 50 percent
600			Fatal
	
Source:  Samuel Glasstone, ed., The Effects of Atomic Weapons (Los Alamos:  Los Alamos Scientific Laboratory, September 1950), Table 11.28.

World War II

The advent of controlled nuclear fission and then atomic bombs during World War II did not so much transform the nature of radiation hazards as vastly expand their scope. Radioactivity that had once mattered chiefly in the laboratory or clinic came to concern much of postwar society, especially after 1951 when the new Atomic Energy Commission (AEC) began regularly testing nuclear weapons. Test fallout stirred widespread fears and provoked public outcries over the hazards of radiation at any level of exposure. Undoubtedly, thousands of people who lived and worked in regions affected by fallout received varying, though almost always very low, doses of external gamma and beta radiation during the era of aboveground testing, from 1945 to 1963. Many also must have ingested radionuclides through normal eating, drinking, and breathing in fallout regions.

At issue were the hidden risks of fallout. Only in a few well documented and widely known instances, most notably the Marshall Islanders and Japanese fishermen caught by fallout from the Castle Bravo test in 1954, were doses high enough to cause radiation sickness and even threaten life. Radiation at such high doses produces well known and obvious effects, as Table 1 shows. But most doses were far lower, so low as to defy detection except by laboratory analysis of blood samples from those exposed, and sometimes not even then; they certainly produced no evident damage to health, despite the controversial retrospective studies that suggest an unusually high incidence of radiation-related disease in regions affected by fallout.

The question of low doses

Scientifically, the real question is whether or not very low levels of exposure have had disproportionately great health consequences. It remains unanswerable. Mainstream scientific opinion still judges the danger minute; that is, very low dose implies very low risk, though experts disagree about precisely what that means quantitatively. Apparently contradictory conclusions about the risks of low-level radiation in successive reports of the National Academy of Sciences-National Research Council Advisory Committee on the Biological Effects of Ionizing Radiation (BEIR III in 1980, BEIR V in 1990), partly reflected new data, but also arose from persistently divergent viewpoints about how to evaluate them.

What limits should be imposed on exposure to radiation? The question persists because it depends as much on public policy, social values and philosophy, as on science.

Ambiguity begins with the study of radiation-caused harm to living things and the concept of dose itself. For individuals, damage depends on both dose (how much radiation absorbed) and dose rate (how fast). A dose lethal if received in a day might well be survived if spread over a month and prove harmless if stretched through years. Precisely what physical processes convert absorbed radiation dose into biological damage are still unclear. Inevitable death follows only very high acute doses. Exposure at lower levels and rates produces much less clear-cut results. In many instances, no one can say for sure that a certain dose will cause a certain injury to a certain living thing.

Prognosis becomes instead an exercise in probability, as witness the so-called median lethal dose: the dose that will kill half of a large number of exposed subjects within a specified time. Statistically, one might know that half those exposed in 24 hours to 350 roentgens will die in 30 days, yet remain unable to predict which persons that half will comprise. Efforts to reconcile such statistical group effects with individual harm have been a source of constant tension in the medical-legal sphere, and a fertile source of confusion in popular discussions. Fortunately, as we shall see, setting the problem in a new frame can do much to clarify the issues.
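The statistical character of the median lethal dose can be made concrete with a toy calculation. Assuming, purely for illustration, that each exposed subject independently has a 0.5 chance of dying within the specified time, group outcomes follow a binomial distribution: the expected number of deaths is predictable, while the fate of any individual is not:

```python
from math import comb

def prob_k_deaths(n, k, p=0.5):
    """Binomial probability that exactly k of n exposed subjects die,
    assuming each dies independently with probability p (an idealization)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n = 10
expected = sum(k * prob_k_deaths(n, k) for k in range(n + 1))
print(f"expected deaths among {n}: {expected:.1f}")      # 5.0
print(f"chance of exactly 5 deaths: {prob_k_deaths(n, 5):.3f}")
```

Even with the group mean known exactly, no calculation identifies which five will die, which is precisely the tension between statistical and clinical evidence described above.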

All such questions grow harder to answer as dose and dose rate decrease. Acute or short-term radiation effects, the result of large and rapid exposures, have never seemed baffling or controversial. Acute effects were clear, how to deal with them obvious: Straightforward measures like shielding sufficed to safeguard those at risk. Chronic or long-term effects were another matter entirely. That exposure to relatively low levels of radiation could cause harmful late effects had been known since the early years of the century. Such effects might be downplayed or ignored in the midst of war, including Cold War, but everyone knew that some forms of cancer and other disease sometimes occurred many years after exposure.

Evidence of damage appears more slowly and rates of injury decline as dose and dose rate fall. Someone exposed to a massive gamma ray burst quickly shows the effects, leaving no doubt about the cause. Some lesser dose might induce leukemia years later, when the cause will seem less clear-cut. At still lower levels and longer times between exposure and injury, causal links grow fainter yet. Just how very low doses trigger biological responses remains obscure. So, too, does the full range of possible late effects, which may include metabolic or immune system disorders as well as cancers. Medical and legal questions alike multiply as ties between cause and effect loosen, and the evidence of injury becomes statistical rather than clinical.

Different approaches to low doses

Scientific opinion divides about the shape of the dose-response curve at the very lowest levels, where cause and effect become hardest to measure or even to detect (Figure 1). Scientists taking one approach in effect graph the curve as a straight line from known higher values through lower unknown values to zero. Any exposure thus implies some chance of harm, even if damage cannot always be detected. Linear extrapolation, in other words, means that only zero dose causes zero damage.

Other approaches adjust the curve for biology. Biologically active agents, such as drugs or poisons, normally must exceed some threshold before working damage. Biological systems exposed at levels below threshold can restore themselves and so suffer no lasting harm. Radiation at low enough levels, in this model, causes no cumulative damage. The dose-response curve turns sharply downward toward zero damage at some dose higher than zero.

Still a third view, much less widely held, has disproportionately greater risk at very low doses. This seeming paradox is resolved by explaining that very low dose allows damaged cells to survive and become the seeds of cancer.
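The three views can be caricatured as simple risk functions. The slope and threshold values below are arbitrary illustrative numbers, not published risk coefficients; only the shapes of the curves matter:

```python
# Illustrative sketches of the three dose-response shapes described above.
# Slope and threshold values are arbitrary, chosen only to show the shapes.

def linear_no_threshold(dose, slope=0.01):
    """Linear extrapolation: risk proportional to dose; only zero dose is risk-free."""
    return slope * dose

def threshold_model(dose, slope=0.01, threshold=10.0):
    """Threshold model: biological repair absorbs doses below the threshold."""
    return slope * (dose - threshold) if dose > threshold else 0.0

def supralinear(dose, slope=0.01):
    """Third view: risk per unit dose is disproportionately high at low doses
    (sketched here with a square-root dependence)."""
    return slope * dose ** 0.5

for d in (0.25, 1, 25):
    print(d, linear_no_threshold(d), threshold_model(d), supralinear(d))
```

At very low doses the three models give respectively a small risk, no risk at all, and a risk larger than linear extrapolation predicts, which is why the same sparse data can support such different protection philosophies.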

Since the mid-1970s scientists have begun to frame the problem in different terms, stochastic versus nonstochastic effects of radiation, although the basic issues remain the same. Stochastic effects are those for which incidence is the chief function of dose. Higher dose makes cancer more likely, for instance, not more severe, just as flicking a switch turns a lamp on or off but has no bearing on its brightness. Nonstochastic effects are precisely those for which severity is the chief function of dose. Cataracts of the eye offer a well-known instance: how great the injury depends on how high the dose. Radiation damage to blood vessels likewise increases as dose rises.

Restating the problem in these terms also helps distinguish group from individual effects. Stochastic effects are in essence population effects, the kind that can be predicted statistically but not individually; they also are low-dose effects and, as the word's Greek root implies, matters of conjecture. Nonstochastic effects, in contrast, are the predictable and unambiguous consequences of higher radiation doses. Because such effects can, in general, be observed only at fairly high levels of exposure, they also provide the strongest evidence for thresholds.

The view that radiation damage had biological thresholds largely prevailed during the first half-century of radiation protection, and finds many supporters today. Experiment cannot easily resolve the issue. Meaningful data on the rare and often minor damage inflicted by very low doses or dose rates could come only from armies of animals studied over many generations. Possible in theory, such studies simply exceed the limits of any realistic research program, all the more so because animal findings will not necessarily apply to humans; even closely-related species show markedly different effects. Practically, this seeming impasse poses no insuperable problem. Radiation safety has never relied on final answers. Pragmatic safeguards countered the everyday hazards long before science could explain either hazard or safeguard.

The concept of permissible exposure

Threshold thinking shaped early safety codes. "Tolerance" expressed the basic idea: Living things could survive, without patent ill effect, some defined level of radiation for an indefinitely long time. Inhabitants of Denver, after all, seem as healthy as New Yorkers, although Denver's altitude means they receive double the background radiation of dwellers at sea level. "Permissible exposure" first emerged as an alternate concept in the mid-1930s, and gained wide currency only in the early 1950s. The newer term added social-political views to medical-biological judgments about what might be harmless. Its adoption would, in effect, shift the thrust of radiation protection from seeking biological thresholds to weighing risks and benefits. Although the bulk of evidence in fact argued threshold, guideline writers assumed philosophically that any exposure was risky. Whatever they believed about physical realities, many experts came to prefer erring on the side of caution, acting as if any exposure could be harmful.

Steadily falling dose limits should not be construed as solely, perhaps even chiefly, a product of greater knowledge. Certainly, better data and new findings have affected standards, but no new danger needed proving to invest exposures regarded calmly in one decade with deep concern in another. Technology here has played a crucial role over the years in changing what "low" means with respect to radiation exposure. Improving instruments detect ever smaller amounts of radiation in the field as well as the laboratory. As for many other toxic hazards, detection in practice if not in theory implies danger. Technical prowess, in other words, rather than assured hazard tends to define safe limits, for radiation as for other potential hazards. Changing political climate has often affected standards far more than new data or deeper understanding.

For radiation the crucial question has scarcely varied since turn-of-the-century x-ray and radium users began to worry about the side effects of their wonderful new tools: What limits should be imposed on human exposure to ionizing radiation? The question persists because the answer depends at least as much on public policy, social values and even on philosophy, as on science. Social concerns in the widest sense have always molded safety standards, science at best setting guidelines for decision makers.

That radiation protection standards are socially constructed and politically, rather than scientifically, decided may be the most pervasively misunderstood point in the entire public controversy. Whatever the dose, more often than not a showing of exposure is simply assumed by critics to be a proof of damage, a transition so easy as hardly to be noticed. It reflects a long history of public apprehensions about nuclear matters and the subjective judgment of their unseen risks. Quantitatively assessed risk, the "real" danger, need bear little relationship to danger perceived or risk deemed acceptable; this is perhaps even more true of the atom than of other technological hazards. Unresolved questions about test veterans or downwinders have simply added to nagging doubts about long-term health effects of exposure to low levels of ionizing radiation.

Discrepancies between real and perceived risk surely owe something to public misunderstandings and confusion about the scientific bases of radiation protection standards, about distinctions between natural law and practical guidelines, about the ambiguities of cause and effect in radiation injury. But such discrepancies may owe even more to wide and growing mistrust of government motives and suspected conflicts of interest in choosing and applying standards. Unfortunately, skepticism seems all too well deserved.

When fallout from nuclear weapons testing became an issue during the 1950s, government officials mostly preferred to reassure rather than inform. Practically no one doubted that testing could be conducted safely, in other words without seriously endangering either test participants or members of the public, provided suitable precautions were observed. On the one hand convinced that trying to explain risks so small would simply confuse people and cause panic, on the other hand fearing to jeopardize the testing vital to American security, officials simply refused to admit any risk at all. When outsiders revealed that fallout might indeed be hazardous, government credibility suffered prompt and long-lasting damage.

Moderately greater openness in the 1960s with the advent of commercial nuclear power, and even more in the late 1970s after the AEC's demise, proved inadequate: What looked like efforts to downplay risks came to seem no less suspect than to deny them altogether. Each time the AEC or one of its successors faced questions about possible hazards, it tended to issue reassuring statements and discount the danger. Assuming that the public could not grasp the nuances of minor versus major risk, agency representatives preferred to claim no risk at all. No one likely thought of that as lying, but powerful officials free of much direct accountability in a secrecy-shrouded program found it all too easy to deny, dissemble, or mislead as a matter of course without a second thought. Forgetting how much their special knowledge owed to their places rather than their virtues, they too lightly dismissed the costs their high purposes may have imposed on their fellow citizens.


Copyright, 1995, by Barton C. Hacker. All rights reserved. The author is at L-451, Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94551, hacker1@llnl.gov . He will be happy to provide an annotated version of this essay upon request. Alternatively, interested readers may consult the two books, both published by the University of California Press, in which he more fully discusses and documents the issues raised here. See Barton C. Hacker, The Dragon's Tail: Radiation Safety in the Manhattan Project, 1942-1946 (1987) and Elements of Controversy: The Atomic Energy Commission and Radiation Safety in Nuclear Weapons Testing, 1947-1974 (1994).



Radiation Lessons From Russia

Marvin Goldman

Radiation exposures of workers and populations in Russia are for the most part the result of accidents and some early flaws in emerging nuclear technologies. Of principal biomedical and environmental significance are the 1986 accident at Chernobyl in Ukraine, and radiation releases associated with the operation of the MAYAK nuclear production and reprocessing facility in the South Urals. There are other nuclear weapons sites about which we now know very little. Recently, progress has been made in providing a better understanding of what went on and what it means. The decades of secrecy have left an atmosphere of confusion, distrust and misinformation about all things nuclear. This seems to be a common problem in all the nuclear powers. Hopefully, international "glasnost" will replace ignorance and suspicion with information and understanding.

Beginning in 1949, the atomic device testing site at Semipalatinsk, Kazakhstan, was used for some 468 tests: about 124 in the atmosphere, 26 as surface shots, and the rest underground. About two years ago, a large review of possible medical and environmental consequences began in Russia's Siberian Altai, 100 miles to the north, but I do not know whether a significant study has also been undertaken across the border in the nearby Kazakh communities. (These communities are also downwind of the Chinese test site at Lop Nor.) When I was at Semipalatinsk's "Polygon" several months ago, it was clear that there was much anecdotal information about past events and little in the way of full integration of the data. The first blast in 1949 released a large I-131-rich cloud which covered populated areas. My brief visit revealed elevated radiation levels only at the lips of the blast craters. There are reports of other hot spots, but no hard data as yet.

East of Chelyabinsk, on the Tom River, is the nuclear production and recycling site at Tomsk-7 (Seversk), near the large Siberian city of Tomsk. There are reports of some 24 accidents, four of which were allegedly serious; the most recent, two years ago, involved a plutonium release. The complex, whose first reactor became operational in 1953, is also known as the Siberian Chemical Complex. It is reported to have produced some 35 million cubic meters of liquid waste. There are reports of plutonium releases and aquifer contamination, but I have no information about health or environmental consequences.

Further to the east, on the Yenisei River, is the third complex, the "Mining Chemical Complex" at Krasnoyarsk-26 (Zheleznogorsk), some 80 km from the city of Krasnoyarsk. This site too is still highly classified, and little is known of any problems there. A large reprocessing plant, RT-2, is under construction, but it is behind schedule and facing local opposition. The data on contamination, exposures and possible health effects are still classified, and there is no indication at this time as to when they may become available.

It is also not clear whether there are Health Ministry branches at Tomsk and Krasnoyarsk engaged in environmental and biomedical research, as there are at Chelyabinsk. The rivers serving these three complexes ultimately drain north into the Yenisei and Ob Rivers and empty into the Arctic Ocean. That the last two complexes were constructed five or more years after the Chelyabinsk complex leads to the possibility, and the hope, that the deficiencies in early technology design and operation were not repeated in the newer facilities.

Although the Chernobyl explosion had sufficient energy to inject its plume high enough to cause fallout over most of the Northern Hemisphere, it now appears that the significant radiation exposure was to nearby residents, especially children. Because the fodder of dairy cows was contaminated, thousands of children received substantial radioiodine doses to their thyroids and are now showing a very large increase in thyroid cancer (mostly benign), almost a thousand cases to date. The disease rate is still increasing and could reach a total of 5,000 to 10,000. Although no other carcinogenic effects, including leukemia, have been noted, remember that most radiogenic cancers have a latent period of about 10 years. Further follow-up is needed.

The workers at the MAYAK complex at Chelyabinsk-65 (Ozersk) received significant radiation overdoses in the late 1940s and early 1950s. Fatalities from lung fibrosis due to large plutonium inhalation exposures, as well as increases in lung cancer, chronic radiation sickness and leukemia, have been documented. These data are currently being reviewed by an international team of experts to learn more about the consequences of these unique exposures.

Due to early MAYAK releases from 1948 to 1955, populations living along the nearby Techa River show an apparent increase in Sr-90-induced leukemia. The 1957 Kyshtym explosion of a reprocessing tank, followed a decade later by a major resuspension of highly contaminated sediments from a dried-out holding pond, exposed a quarter million people to long-lived radionuclides, and these populations are also under study. From all exposures, I estimate that the collective dose is larger than that for the Japanese survivors of the atomic bombings. Preliminary evaluations suggest that the radiation-induced leukemias show a dose-rate effect relative to the risks estimated from the Japanese atomic bomb study: the chronic doses appear to be about one-third as leukemogenic as acute doses.
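The dose-rate comparison above amounts to a simple rescaling of a linear risk model. The sketch below is illustrative only: the risk coefficient and collective dose are hypothetical placeholders, and the factor of three comes from the "one-third as leukemogenic" estimate quoted above.

```python
# Illustrative sketch of a dose-rate effectiveness factor (DREF).
# ACUTE_RISK_PER_SV and the collective dose are hypothetical numbers,
# chosen only to show the arithmetic; DREF = 3 reflects the estimate
# that chronic doses are about one-third as leukemogenic as acute ones.

ACUTE_RISK_PER_SV = 1.0e-2   # hypothetical excess leukemia risk per Sv (acute)
DREF = 3.0                   # dose-rate effectiveness factor

def excess_cases(collective_dose_person_sv, chronic=True):
    """Expected excess cases under a linear model, reduced by DREF if chronic."""
    risk = ACUTE_RISK_PER_SV / (DREF if chronic else 1.0)
    return collective_dose_person_sv * risk

acute_cases = excess_cases(1000.0, chronic=False)
chronic_cases = excess_cases(1000.0, chronic=True)
```

Under these placeholder numbers the same 1000 person-Sv collective dose yields one-third as many expected cases when delivered chronically rather than acutely.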

The main radiobiological lessons from the former Soviet Union address the role of dose rate and radionuclide exposures in workers and the general population. In addition to a better understanding of radiation risks in people, the information may demonstrate that today's assessment of risks from chronic low-level doses is more conservative than we think. The data may provide some quantitative information on the role of repair of low-dose molecular lesions and of cellular compensation at small dose rates, processes that reduce the unit carcinogenicity of low dose rates. The magnitude and significance of these events have important potential lessons for us in light of current radiological and nuclear issues and options.


The author is President of the Health Physics Society and Professor Emeritus at the University of California, 1122 Pine Lane, Davis, CA 95616-1729, mgoldman@ucdavis.edu.

The Imprudence of "Prudent Avoidance"

David Hafemeister

In 1988, shortly before he died, Andrei Sakharov commented on the fate of the earth. Interestingly enough, rather than comment on the hydrogen bombs that he co-invented, he stated: "... in fact, I am now inclined to regard the many-faceted ecological threat to our environment as our most serious long-term problem " (1). Because I agree with this very long-term assessment, it is troubling to me to see environmental funds and political capital wasted on false threats. In particular, I am concerned that the quasi-legalistic concept of "prudent avoidance" is being used to chase the phantom risk of cancer caused by extremely low frequency (ELF) electromagnetic fields (EMF) from power lines. This needless chase costs some one to three billion dollars per year (2,3) and unnecessarily frightens the public with "electrophobia." The burden of these fiscal and emotional costs placed on the American public is incommensurate with the risk, if any, being mitigated. This outcome is not a use of science for the public good.

What is prudent avoidance?

In the absence of any firm scientifically demonstrated connection between ELF/EMF and cancer, the concept of "prudent avoidance" has been invoked by many utility commissions (at least eleven by recent count) as a basis for promulgating regulations. Granger Morgan (4) defines "prudent avoidance" as "exercising sound judgment in practical matters. It means being cautious, sensible, not rash in conduct." Morgan continues, prudent avoidance "is to try to keep people out of fields when that can be done at modest cost...but not to go off the deep end with expensive controls which may not be beneficial."

Prudent avoidance, as thus defined, might seem reasonable if one understood the nature and severity of the risk, which is not the case for the alleged EMF health hazard. From there Morgan moves towards suggesting the arbitrary spending of money without measurable benefits: "Utilities and utility regulators must consider both distribution systems and transmission systems. Activities that may warrant consideration at the distribution level include: paying greater attention to population distributions around facilities; incorporating more consideration of exposure management in maintenance and facility upgrade policies...making selected use of undergrounding..." At this point "prudent avoidance" becomes imprudent because it leads to an open-ended, unbounded approach to risk mitigation. It stimulates a fearful public to use the threat of litigation to force utilities and school boards to take steps to mitigate a phantom effect. These institutions have little incentive to risk litigation, as long as the costs of compliance will be covered by rate payers or tax payers.

Morgan's approach appears to be driven by his statement that "there is some significant chance that fields pose a modest public health risk, and not much chance that the risk to any one of us will be very big." (4) Morgan seems to have placed great reliance on the very questionable work of Wertheimer and Leeper (5) when Morgan stated in 1992 that "a series of epidemiological studies, including studies of childhood leukemia by Nancy Wertheimer and Ed Leeper....have provided a growing basis for concern."

By now there have been at least 13 studies on childhood leukemia. These studies have been examined by Washburn et al., who find "no significant relation between combined relative risk estimates and 15 indicators of epidemiological quality. Assessment of EMF exposure in the primary studies was found to be imperfect and imprecise." (6)

In addition, Morgan has failed to examine the epidemiology risk factors by type of cancer, an approach that shows glaring inconsistencies. Because of the great impact of his writings, it would be useful for Morgan to update his analysis to determine what benefits have been gained by the annual spending of some $1-3 billion.

Philosophically, Morgan alludes (4) to Thomas Kuhn's The Structure of Scientific Revolutions by stating that "paradigm shifts" are affecting "scientific thinking about biological effects from electric and magnetic fields." It is premature to talk of paradigm shifts when the preponderance of the data does not demonstrate a connection between cancer and these fields. Morgan is concerned that public perceptions may drive regulations rather than scientific fact. However, I conclude that it is his own papers that have strongly pushed the EMF-risk process away from science and toward irrationality. I agree with the critics of "prudent avoidance" who have called it "the abandonment of science," "the triumph of fear of the unknown over reason," and "being so vague as to be useless." (4) Prudent avoidance is a delight for plaintiff lawyers since it is essentially a conclusion that the danger is probable.

A General Accounting Office (GAO) report (7) acknowledges this misuse of science by concluding: "Regulators in at least eleven states that we contacted have adopted practices for mitigating exposure to EMFs.... Some commercial utilities have also adopted prudent avoidance or other 'low cost/no cost' policies to address the public's concerns about EMFs. Such policies are not based on scientific knowledge about health effects of exposure to EMFs."

The cost of mitigation

One of Morgan's co-authors, Keith Florig, commented in 1992: "...it seems likely that the total economic costs of the [EMF mitigation] activities described above now exceed $1 billion annually, with the promise of growing costs in the years to come.... If we were to value the reduction of a unit of EMF risk at comparable levels, the most that we could justify spending on EMF mitigation would be something in the neighborhood of $10 billion per year.... Recent examples include a town that moved several blocks of distribution lines underground at a cost of $20,000 per exposed person; a utility that rerouted an existing line around a school at a cost of $8.6 million; a new office complex that incorporated EMF exposure in its design at a cost of $100-200 per worker; and a number of firms that have installed ferrous shielding on office walls and floors to reduce magnetic field exposures from nearby power handling equipment at costs ranging up to $400 per square meter of office space." (2)

The GAO study (7) estimates the following costs for EMF mitigation, none of which would reduce the EMF from appliances within the home:

--- $90,000/mile for delta design above-ground transmission lines to reduce magnetic fields by 45%,

--- $2 million/mile to bury transmission lines in fluid-filled steel pipes to reduce magnetic fields by 99%,

--- $1 billion to limit magnetic fields to 10 milligauss at edges of rights-of-way for planned new transmission lines,

--- $3-9 billion to reduce magnetic fields at homes where grounding systems are the dominant source,

--- $200 billion to bury transmission lines nationwide near homes with fields greater than 1 milligauss,

--- $250 billion to reduce average exposure to less than 2 milligauss from all transmission and distribution lines.

Allan Bromley, President Bush's Science Advisor, recently commented on an EMF study done by the Office of Science and Technology Policy: "It is safe, however, to conclude that the EMF risk issue will continue to be contentious and of immense potential economic importance; the current best estimate is that prior to 1993 it has cost the American public more than $23 billion to respond to public worries about EMF ...particularly in connection with the placement of high-voltage power lines." (3)

Recently a law suit was filed against Houston Light and Power and the Electric Power Research Institute (EPRI) on behalf of eleven families with children suffering from cancer. The suit charges both the power company and EPRI with "fraudulent concealment of the carcinogenic nature of the fields that secretly and silently invaded their homes." To avoid such litigation and the associated unfavorable publicity, other institutions have decided to give in, rather than fight. For example, the San Diego Gas and Electric Company canceled a power plant upgrade and compromised on a 69-kilovolt line. Hawaiian Electric Industries, Inc., spent nearly $5 million to reroute and reconfigure power lines. In the Mill Valley School District, four classrooms, a day care center, and a part of the playground located near power lines have been closed. The policy of prudent avoidance added about $500,000 to the construction costs of the World Bank Building, and this approach is now considered to be a model in this area. The California Public Utility Commission has required the utilities to spend up to 4% of the cost of electrical projects to mitigate EMF. Thus, we see that the advocates of "prudent avoidance" are willing to spend large sums for mitigation efforts with no clear assessment of any benefits to be gained.

Evidence bearing on EMF effects

The scientific literature and the reports of review panels show no consistent, significant link between cancer and the EMF from power lines (8). This literature includes epidemiological studies, research on biological systems, and the analyses of theoretical mechanisms. This negative result is consistent with the implications of arguments which have been advanced that there can be no such link. The preponderance of the epidemiological and biophysical/biological research findings have failed to substantiate those studies that have reported specific adverse health effects from the exposure to 60-Hz EMFs. It is always possible that some minor carcinogenic connection might be found, but the present data do not establish that connection. To justify expenditures on mitigation, there should be some consistent, meaningful combination of the following factors:

--- a plausible coupling mechanism at the cellular level exists,

--- evidence that the coupling produces consistent biochemical changes,

--- indications that the biochemical changes are detrimental,

--- meaningful epidemiology data that determine the degree of danger, and finally,

--- application of upper-bound mitigation costs for EMF that are comparable to the mitigation costs for other dangers in society.

Epidemiology. The scientific panels that have reviewed the EMF epidemiology data have found them inconsistent and inconclusive (8). It is necessary when comparing the data to separate the results by cancer type. For example, consider three recent studies of electrical workers and a recent study of non-electrical workers in Sweden (8). The 1993 California study reported no association of EMF with either leukemia or brain cancer. The 1993 Canadian-French study reported an association with leukemia and astrocytoma, out of the 32 cancer types studied, but it suffers from internal inconsistencies. The 1995 Savitz/Loomis study reported no association of EMF with leukemia, but it did report an association with brain cancer. The 1993 Swedish study reported an association with leukemia, but not with brain cancer. Thus these four "best studies" report very contradictory results. It is very difficult to determine statistically relative risk factors of less than two for rare modes of death, because of the many confounding factors such as economic status and chemical pollutants.

Biology and Biophysics Experiments. The scientific review panels, the review articles, and the research papers that we have reviewed (8) do not claim a causal link between EMF and cancer. In addition, the review panels and review articles have pointed out that there is a continuing problem with replicating the experimental results on cells and animals.

Theoretical Mechanisms. No plausible biophysical mechanism for the systematic initiation or promotion of cancer by these extremely weak EMFs has been identified (8). The lack of epidemiological and experimental evidence establishing a link between EMF and cancer is consistent with the biophysical calculations that rule out carcinogenic effects because the thermal noise fields are larger than the fields from EMF. Since quantum mechanics, thermal noise fluctuations, and cancer promotion are all statistical effects, it is difficult to derive a proof that precludes all cancer promotion as a necessary and sufficient condition. However, these fundamental calculations are a significant guide to the conclusion that the EMF-cancer link, if any, should be extremely difficult to detect because its magnitude is, at most, very small.
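The thermal-noise argument can be illustrated with a back-of-envelope estimate. All numbers below are assumptions chosen for illustration: a 2-milligauss residential field, a body-scale induction loop of 0.1 m, and a thermal-noise benchmark of roughly 0.01 V/m at the cellular scale (an order of magnitude commonly quoted in this literature).

```python
import math

# Faraday's-law estimate of the electric field induced in tissue by a
# 60-Hz magnetic field: E = pi * f * B * r for a loop of radius r.
f = 60.0      # Hz, power-line frequency
B = 2.0e-7    # tesla (2 milligauss, a typical residential field; assumed)
r = 0.1       # m, loop radius of order a body cross-section (assumed)

E_induced = math.pi * f * B * r   # V/m, induced field in tissue

# Assumed order-of-magnitude thermal-noise field at the cellular scale.
E_thermal = 1.0e-2                # V/m

ratio = E_thermal / E_induced
print(f"induced ~ {E_induced:.1e} V/m; thermal noise ~ {E_thermal:.0e} V/m; "
      f"noise exceeds signal by a factor of ~{ratio:.0f}")
```

On these assumptions the induced field is roughly three orders of magnitude below the thermal-noise floor, which is the sense in which any EMF-cancer link "should be extremely difficult to detect."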

Journalism. The number of newspaper stories on EMF rose from 233 in 1992 to 548 in 1993. The number of magazine stories rose from 101 in 1992 to 216 in 1993. The writings of Paul Brodeur, such as Currents of Death, have been followed by headlines such as "Is my electric blanket killing me?", "Chilling possibility that a power that has improved life could also destroy it," and "Warning: Electricity can be hazardous to your health." Even when an article is even-handed, its caption at the top may read: "Steps to Protect Yourself from Danger -- Real and Potential." It is my conclusion that science and the relative-risk methodology are often undercut by the quality of journalism in a free and fear-prone society.

The statement issued by the Council of the American Physical Society (reprinted below in the Comment section) addresses these concerns in more general terms.

1.	A. Sakharov, Memoirs (Knopf, New York, 1990), 409.
2.	H. Florig, Science Vol. 257, 468-9, 488, 490, 492 (1992).
3.	D. A. Bromley, The President's Scientists (Yale Univ. Press, 1994).
4.	G. Morgan, Physics and Society, October 1990, 10; Public Utility Fortnightly, 15 March 1992.
5.	N. Wertheimer and E. Leeper, Am. J. Epidemiology Vol. 109, 273 (1979).
6.	E. Washburn et al., Cancer Causes and Control Vol. 5, 299-309 (1994).
7.	General Accounting Office, Electromagnetic Fields (GAO/RCED-94-115, Washington DC, June 1994).
8.	D. Hafemeister, "Background Paper on Power Line Fields and Public Health," (http://www.calpoly.edu/~dhafemei), 8 May 1995.
	


The author is with the Physics Department at California Polytechnic State University, San Luis Obispo, CA 93407.