ARTICLES


The American public seems concerned with the potential environmental impact of nuclear power, yet largely unaware of the "carbon problem". As these articles indicate, physicists are concerned with both. Politicians' prime concern often seems to be simply getting elected, whatever the present or future environmental or security problems may be. The guiding premise of our Forum has been that physicists have an obligation to help the public force their politicians to deal effectively with these problems. They can do so as individuals, via "non-partisan" educational groups such as the Forum, or via issue-oriented "pressure groups" (such as FAS, UCS, etc.). But they should be active!


The Science and Politics of Climate

Freeman J. Dyson

Talk given at the American Physical Society Centennial Meeting, Atlanta, Georgia, March 25, 1999

Responding to the Joseph A. Burton Award Given by the APS Forum on Physics and Society

Three agencies of the US government have serious programs of climate research: NASA, NOAA and the Department of Energy. I shall talk mostly about the Department of Energy because that is my home territory. The Department of Energy program is the smallest of the three. Anybody who has been primarily involved with the NASA or NOAA programs could tell similar stories about them. My involvement began at the Oak Ridge National Laboratory in 1972. Alvin Weinberg, who was director of Oak Ridge for many years, started a program of climate studies there. He was running a major nuclear power development program, with a large effort devoted to studying the environmental and public health problems of nuclear power. He decided to broaden the environmental studies to include effects of power-plants burning fossil fuels. Weinberg is an interesting character in many ways. He is himself a strong pro-nuke. He helped to build the first nuclear reactors at Oak Ridge and spent most of his life promoting nuclear power. But he likes to listen to opposing views. He collected at Oak Ridge a bunch of brilliant people, including anti-nukes as well as pro-nukes, to study the environmental problems associated with all kinds of energy. One of the anti-nukes at Oak Ridge was Claire Nader, the sister of Ralph Nader. Weinberg liked her and always listened to what she had to say. Allan Poole was another unusual character in the group around Weinberg. Poole had been for some years a Buddhist monk in Thailand. He was an expert on tropical forests. Another member of the group was Jack Gibbons, who later became head of the Office of Technology Assessment and science advisor to President Clinton.

The practical advice that Alvin Weinberg gave to the Department of Energy was to increase the funding of field measurements, physical measurements in the atmosphere and biological measurements on the ground. The purpose of measurements in the atmosphere was to test the climate models with real data. The purpose of measurements on the ground was to explore the non-climatic effects of carbon dioxide on farms and forests. The department did not pay much attention to his advice. The lion's share of the budget for carbon dioxide research continued to be spent on computer models. The amount of money spent on local observations has been small, but it has been well spent.

Several successful programs of observation have been started in recent years. One of them is a Department of Energy program called ARM, Atmospheric Radiation Measurements. ARM's activities are mainly concentrated at a single permanent site in Oklahoma, where systematic observations of radiation fluxes in the atmosphere are made with instruments on the ground and on airplanes flying at various altitudes. Measurements are made all the year round in a variety of weather conditions. As a result, we have a data-base of radiation fluxes as a function of wave-length, angle and altitude, in clear sky and in cloud and between clouds. One of the most important measurements is made by two airplanes flying one above the other at different altitudes. Each airplane measures the fluxes of radiation coming up from below and down from above. The difference measures the local absorption of radiation by the atmosphere as a function of wave-length. The measured absorption of sunlight turns out to be substantially larger than expected. The expected absorption was derived partly from theory and partly from space-based measurements. The discrepancy is still unexplained. If it turns out that the anomalous absorption measured by ARM is real, this will mean that all the global climate models are using wrong numbers for absorption.
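
The arithmetic behind the two-airplane measurement is worth making concrete. The sketch below (in Python, with made-up numbers; nothing here is ARM data) computes the absorption of a layer as the difference between the net downward flux at the upper aircraft and the net downward flux at the lower one.

```python
def layer_absorption(down_top, up_top, down_bottom, up_bottom):
    """Flux absorbed (W/m^2) by the atmospheric layer between two aircraft.

    Each aircraft measures radiation coming down from above and up from
    below.  The net downward flux at a level is (down - up); whatever net
    flux enters the top of the layer and does not leave the bottom has
    been absorbed in between.
    """
    net_top = down_top - up_top
    net_bottom = down_bottom - up_bottom
    return net_top - net_bottom

# Hypothetical broadband values in W/m^2, for illustration only:
print(layer_absorption(down_top=900.0, up_top=150.0,
                       down_bottom=700.0, up_bottom=120.0))  # -> 170.0
```

In practice the same difference is taken wavelength by wavelength, which is what reveals where in the spectrum the anomalous absorption occurs.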

The ARM program also has active sites in the south-west Pacific and on the north shore of Alaska. The south-west Pacific site made important contributions to the international TOGA program studying El Nino. The south-west Pacific is the place where sea surface temperatures are highest, and El Nino begins with a massive movement of hot surface water from west to east. If we consider the global climate to be a heat-engine, the south-west Pacific is the hot end of the engine and the north shore of Alaska is the cold end. The ARM sites were chosen so that we can study the hot and cold ends of the engine, with the Oklahoma site somewhere in the middle. The original plan for ARM had two additional sites, one in tropical forest and one in desert, but the funding for more sites never materialized.

Another successful program of local observation is measuring directly the fluxes of carbon dioxide moving between the atmosphere and the biosphere. This is done by putting instruments on towers above the local trees or other vegetation. Accurate anemometers (wind-speed meters) measure the vertical motion of the air, while infrared gas analyzers measure the carbon dioxide content at the same place and the same time. Both measurements are made instantaneously, four times a second, so that you are measuring the carbon dioxide carried by each local eddy in the atmosphere as it moves up or down. If, as usually happens in daytime in the summer, the trees are absorbing carbon dioxide, each packet of air moving down carries more carbon dioxide and each packet moving up carries less. You can derive the flux of carbon dioxide going into the trees by multiplying the vertical speed by the carbon dioxide abundance and averaging over time. This is called the eddy covariance method of measuring fluxes. It is remarkably accurate, because it turns out that the vertical speed and the carbon dioxide abundance are almost a hundred percent correlated. When you measure at night or in winter, you find that the flux is going the other way. Trees are then not photosynthesizing but giving off carbon dioxide by respiration. The soil also gives off substantial fluxes of carbon dioxide, mostly from respiration of microbes and fungi. The eddy covariance method does not distinguish between vegetation and soil. It measures the total flux leaving or entering the atmosphere.
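
Here is a minimal numerical sketch of the eddy covariance calculation (the 4-per-second sampling rate is the one quoted above; the data are synthetic, invented purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4 * 3600                        # one hour of samples at 4 per second

# w: vertical wind speed (m/s); c: CO2 density (mg per m^3 of air).
# Daytime summer case: air moving down carries more CO2, air moving up
# carries less, so c is strongly anti-correlated with w.
w = rng.normal(0.0, 0.3, n)
c = 700.0 - 10.0 * w + rng.normal(0.0, 0.5, n)

# The flux is the covariance of the fluctuations: the mean of w' * c'.
flux = np.mean((w - w.mean()) * (c - c.mean()))
corr = np.corrcoef(w, c)[0, 1]
print(f"flux = {flux:.2f} mg CO2 m^-2 s^-1 (negative = into the canopy)")
print(f"correlation of w and c = {corr:.3f}")   # close to -1, as in the text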

For many years the eddy covariance measurements were made in only three places in the world, one over a temperate forest in Massachusetts, one over a tropical forest in Brazil, and one over a boreal forest in Canada. Steven Wofsy at Harvard was the pioneer who got the whole thing started at the site in Massachusetts (Wofsy et al., 1993). The results of the first measurements were startling. The Massachusetts forest was absorbing carbon at a rate of 3.7 tons per hectare per year, far more than was expected for a mature forest. If you supposed that all the temperate forests of the world were absorbing carbon at this rate, the result would be an absorption of 5 gigatons of carbon per year, which happens to be almost exactly the amount of missing carbon that disappears from the atmosphere. The Amazon forest shows an absorption of one ton per hectare per year, not so large but still more than was expected (Grace et al., 1995). The Canadian forest is emitting carbon at a rate of 0.3 tons per hectare per year, probably mostly from soil respiring more as the arctic climate grows warmer (Goulden et al., 1998). If these numbers are also representative of forests all over the world, the tropical forests and the boreal forests roughly cancel each other out, the tropical forests absorbing and the boreal forests emitting about a gigaton each. The total for all forests would then be 5 gigatons of absorption.
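
The extrapolation from one hectare to the globe is a one-line calculation; the sketch below simply inverts it to show the forest area that the 5-gigaton figure implies (the area is a derived round number, not a figure from the measurements):

```python
rate = 3.7                 # tons of carbon per hectare per year (Wofsy et al.)
global_sink = 5.0e9        # 5 gigatons of carbon per year, in tons
implied_area = global_sink / rate
print(f"implied temperate forest area: {implied_area:.2e} hectares")  # ~1.4e9
```

About 1.4 billion hectares, which is of the same order as common estimates of the world's temperate forest area.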

Finally, during the last few years, a serious program of eddy covariance measurements has been started, with instrumented sites in many countries around the world, to see whether the results observed at the first three sites are really representative of forests in general. A consortium called Ameriflux has been organized with 24 sites in North America, and many other sites are operating in Europe and Asia. Results so far seem to confirm the earlier measurements. One temperate forest site in Italy measures 5 tons per hectare per year absorption, and one boreal forest site in Sweden measures half a ton per hectare emission. Within a few years, we will know for sure whether the temperate forests are really the main sink of the missing carbon. And the same technique of eddy covariance can be used to monitor the carbon fluxes over agricultural croplands, wetlands and grasslands. It will give us the knowledge required, so that we can use the tools of land management intelligently to regulate the carbon in the atmosphere. Whether we manage the land wisely or mismanage it foolishly, we shall at least know what good or harm we are doing to the atmosphere.

Besides ARM and Ameriflux, there is a third highly successful program of local measurements called ATOC, Acoustic Thermometry of Ocean Climate, the brain-child of Walter Munk at the Scripps Institution of Oceanography. ATOC uses low-frequency underwater sound to measure ocean temperatures, (ATOC Consortium, 1998). A signal is transmitted from a source on top of a seamount at a depth of 900 meters near San Francisco, and received at six receivers in deep water around the north Pacific. The times of arrival of signals at the receivers are accurately measured. Since the speed of propagation depends on temperature, average temperatures of the water along the propagation paths can be deduced. The main obstacle that Walter Munk had to overcome to get the ATOC project started was the opposition of environmental activists. This is a long and sad story which I don't have time to tell. The activists decided that Munk was an evil character and that his acoustic transmissions would endanger the whales in the ocean by interfering with their social communications. They harassed him with lawsuits which delayed the project for several years. Munk tried in vain to convince them that he also cares about the whales and is determined not to do them any unintentional harm. In the end the project was allowed to go forward, with less than half of the small budget spent on monitoring the ocean and more than half spent on monitoring the whales. No evidence was found that any whale ever paid any attention to the transmissions. But the activists are continuing their opposition to the project and its future is still in doubt.
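
The principle reduces to one derivative. Travel time is t = L/c, and sound speed rises with temperature, so a small change in average temperature shifts the arrival time; the sketch below inverts that relation (the path length and the sensitivity coefficient are rough assumptions of mine, not ATOC parameters):

```python
L = 5.0e6       # assumed path length (m): basin-scale, thousands of km
c = 1500.0      # nominal sound speed in seawater (m/s)
dc_dT = 4.0     # assumed sensitivity (m/s per deg C); a few m/s/degC is typical

# t = L / c  =>  dt = -(L / c**2) * dc_dT * dT, inverted:
dt = -0.1       # arrival 0.1 s earlier: warmer water, faster sound
dT = -dt * c**2 / (L * dc_dT)
print(f"average warming along the path: {1000 * dT:.0f} millidegrees C")  # ~11
```

Millidegree sensitivity over basin-scale paths is what makes acoustic thermometry attractive for detecting a slow warming trend beneath the year-to-year noise.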

During the two years that the ATOC system has been operating, seasonal variations of temperature have been observed, giving important new information about energy transport in the ocean. If measurements are continued for ten years and extended to other oceans, it should be possible to separate a steady increase of temperature due to global warming from fluctuations due to processes like El Nino that vary from year to year. Since the ocean is the major reservoir of heat for the entire climate system, a measurement of ocean temperature is the most reliable indicator of global warming. We may hope that the activists will one day admit that an understanding of climatic change is as essential to the preservation of wildlife as it is to the progress of science.

It is time now to wind up this talk and summarize what we have learned. There is good news and bad news. The good news is that we are at last putting serious effort and serious money into local observations. Local observations are laborious and slow, but they are essential if we are ever to have an accurate picture of climate. The bad news is that the climate models on which so much effort is expended are unreliable. The models are unreliable because they still use fudge-factors rather than physics to represent processes occurring on scales smaller than the grid-size. Besides the general prevalence of fudge-factors, the climate models have other more specific defects that make them unreliable. First, with one exception, they do not predict the existence of El Nino. Since El Nino is a major and important feature of the observed climate, any model that fails to predict it is clearly deficient. Second, the models fail to predict the marine stratus clouds that often cover large areas of ocean. Marine stratus clouds have a large effect on climate in the oceans and in coastal regions on their eastern margins. Third, the climate models do not take into account the anomalous absorption of radiation revealed by the ARM measurements. This is not a small error. If the ARM measurements are correct, the error in the atmospheric absorption of sunlight calculated by the climate models is about 28 watts per square meter, averaged over the whole earth, day and night, summer and winter. The entire effect of doubling the present abundance of carbon dioxide is calculated to be about 4 watts per square meter. So the error in the models is much larger than the global warming effect that the models are supposed to predict. Until the ARM measurements were done, the error was not detected, because it was compensated by fudge-factors that forced the models to agree with the existing climate. Other equally large errors may still be hiding in the models, concealed by other fudge-factors. Until the fudge-factors are eliminated and the computer programs are solidly based on local observations and on the laws of physics, we have no good reason to believe the predictions of the models.
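
For scale, the roughly 4 watts per square meter quoted for doubled carbon dioxide comes from the standard logarithmic forcing approximation (a commonly used simplified expression, not something derived in this talk):

```latex
\Delta F \approx 5.35\,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}},
\qquad
\Delta F(2C_0) \approx 5.35\,\ln 2 \approx 3.7\ \mathrm{W\,m^{-2}}.
```

Against this, the 28 watts per square meter discrepancy revealed by ARM is larger by a factor of about seven.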

The bad news does not mean that climate models are worthless. Syukuro Manabe, who ran the climate modeling program at the Geophysical Fluid Dynamics Laboratory at Princeton, always used to say that the purpose of his models was not to predict climate but to understand it. Climate models are still, as Manabe said, essential tools for understanding climate. They are not yet adequate tools for predicting climate. If we persevere patiently with observing the real world and improving the models, the time will come when we are able both to understand and to predict. Until then, we must continue to warn the politicians and the public: don't believe the numbers just because they come out of a supercomputer.

References:

ATOC Consortium, 1998. Ocean Climate Change: Comparison of Acoustic Tomography, Satellite Altimetry and Modeling, Science, 281, 1327-1332.

Goulden, M. L. et al., 1998. Sensitivity of Boreal Forest Carbon Balance to Soil Thaw, Science, 279, 214-217.

Grace, J. et al., 1995. Carbon Dioxide Uptake by an Undisturbed Tropical Rain Forest in Southwest Amazonia, 1992 to 1993, Science, 270, 778-780.

Wofsy, S. C. et al., 1993. Net Exchange of CO2 in a Mid-Latitude Forest, Science, 260, 1314-1317.

Freeman J. Dyson

Institute for Advanced Study, Princeton, New Jersey

dyson@ias.edu

Nuclear Power and the Large Environment

David Bodansky

Talk given at American Physical Society Centennial Meeting, Atlanta, Georgia, March 25, 1999

1. Introduction

The development of nuclear energy has come to a near halt in the United States and in much of the rest of the world. The construction of new U.S. reactors has ended, and although nuclear electricity generation has risen in the past decade, due to better performance of existing reactors, a future decline appears inevitable as individual reactors reach the end of their economically useful lives.

An obstacle to nuclear power is the publicly perceived environmental risk. During this development hiatus, it is useful to step back and take a look at nuclear-related risks in a broad perspective. For this purpose, we categorize these risks as follows:

• Confined risks. These are risks that can be quantitatively analyzed, and for which the likelihood and scale of possible damage can be made relatively small.

• Open-ended risks. These are risks that cannot be well quantified by present analyses, but which involve major dangers on a global scale.

As discussed below, public concern has focused on risks in the confined category, particularly reactor safety and waste disposal. This has diverted attention from the more threatening, open-ended risks of nuclear weapons proliferation, global climate change, and potential scarcity of energy in a world of growing population. The rationale for this categorization and the connection between nuclear power and these open-ended risks are discussed below.


2. Confined risks.


a. Nuclear reactor accidents.

The belief that reactor accident risks are small is based on detailed analyses of reactor design and performance, and is supported by the past safety record of nuclear reactors, excluding the accident at Chernobyl in 1986. Defects in the design and operation of the Chernobyl reactor were so egregious that the Chernobyl experience has virtually no implications for present reactors outside the former Soviet Union. Chernobyl is a reminder, however, of the need for careful, error-resistant design if there is to be a large expansion of nuclear power in many countries.

At the end of 1998 there had been over 8000 reactor-years of operation outside the former Soviet Union, including about 2350 in the United States. Only one accident, that at Three Mile Island, has marred an otherwise excellent safety record. Even at TMI, although the reactor core was severely damaged, there was very little release of radioactivity to the environment outside the reactor containment. Subsequently, U.S. reactors have been retrofitted to achieve improved safety and, with improved equipment and greater attention to careful procedures, their operation has become steadily more reliable.

A next generation of reactors can be even safer, either through a series of relatively small evolutionary steps that build directly upon past experience or through more radical changes that place greater reliance on passive safety features--such as cooling water systems that are directly triggered by pressure changes (not electrical signals) and that rely on gravity (not pumps). It would in fact be remarkable if the accumulated past experience, both good and bad, would not improve the next generation.

b. Nuclear waste disposal

The second dominant public concern is over nuclear wastes. Current plans are to dispose of spent fuel directly, without reprocessing, keeping it in solid form. Confinement of the spent fuel is predicated on its small volume, the ruggedness of the planned containers, the slowness of water movement to and from a site such as Yucca Mountain, and the continual decrease in the inventory of radionuclides through radioactive decay.

Innumerable studies have been made to determine the degree to which the radionuclides will remain confined. One way to judge the risks is to examine these studies as well as independent reviews. An alternate perspective on the scale of the problem can be gained by considering the protective standards that have been proposed for Yucca Mountain.

Proposed standards were put forth in preliminary form by the EPA in 1985. These set limits on the release of individual radionuclides from the repository, such that the attributable cancer fatalities over 10,000 years would total fewer than 1000. This target was thought to be achievable when the only pathways considered for the movement of radionuclides from the repository were by water. However, the development of the site was put in jeopardy when it was later recognized that escaping carbon-14 (14C) could reach the "accessible environment" relatively quickly in the form of gaseous carbon dioxide. A release over several centuries of the entire 14C inventory at Yucca Mountain would increase the worldwide atmospheric concentration of 14C by about 0.1%, corresponding to an average dose of about 0.001 mrem per year for hundreds of years. The resulting collective dose to 10 billion people could be sufficient to lead to more than 1000 calculated deaths.
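
The step from 0.001 mrem per year to a four-digit death count is pure multiplication over an enormous population and time span. A sketch of the arithmetic (the 500-year duration and the linear no-threshold risk coefficient of about 5 x 10^-4 fatal cancers per person-rem are my assumptions, not figures from the EPA analysis):

```python
dose = 1e-6            # 0.001 mrem/yr expressed in rem per person per year
population = 1e10      # 10 billion people
years = 500            # "hundreds of years"; 500 assumed for illustration
risk = 5e-4            # fatal cancers per person-rem (LNT assumption)

collective_dose = dose * population * years        # person-rem
print(f"{collective_dose:.1e} person-rem -> "
      f"{collective_dose * risk:.0f} calculated deaths")  # ~2500
```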

It is startling that 14C might have been the show-stopper for Yucca Mountain. It appeared that this could occur, until Congress took the authority to set Yucca Mountain standards away from the EPA pending future recommendations from a panel to be established by the National Academy of Sciences (NAS). The panel issued its report in 1995. It recommended that the period of concern extend up to one million years and that the key criterion be the average risk to members of a "critical group" (probably numbering fewer than 100), representing the individuals at highest risk from potentially contaminated drinking water. It was recommended that the calculated average risk of fatal cancer be limited to 10⁻⁶ or 10⁻⁵ per person per year. According to the estimates now used by federal agencies to relate dose to risk, this range corresponds to between 2 mrem/year and 20 mrem/year.
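
The dose equivalents quoted for the NAS risk range follow from the same kind of coefficient. Assuming roughly 5 x 10^-4 fatal cancers per person-rem (my assumption for the federal dose-to-risk estimate), the conversion is:

```latex
\frac{10^{-5}\ \mathrm{yr^{-1}}}{5\times 10^{-4}\ \mathrm{rem^{-1}}}
  = 0.02\ \mathrm{rem/yr} = 20\ \mathrm{mrem/yr},
\qquad
\frac{10^{-6}\ \mathrm{yr^{-1}}}{5\times 10^{-4}\ \mathrm{rem^{-1}}}
  = 2\ \mathrm{mrem/yr}.
```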

Taking the NAS panel recommendations into consideration, but not fully accepting them, the EPA in August 1999 proposed a standard whose essential stipulation is that for the next 10,000 years the dose to the maximally exposed future individual is not to exceed 15 mrem per year. This may be compared to the dose of roughly 300 mrem per year now received by the average person in the United States from natural radiation, including indoor radon.

Attention to future dangers at the levels represented by any of these three standards can be contrasted to our neglect of much more serious future problems, to say nothing of the manner in which we accept larger tolls today from accidents, pollution, and violent natural events. While we have responsibilities to future generations, the focus should be on avoiding potential disasters, not on guarding people thousands of years hence from insults that are small compared to those that are routine today.


c. Fuel cycle risks

Risks from accidents in the remainder of the fuel cycle, which includes mining, fuel production, and waste transportation, have not attracted as much attention as those for reactor accidents and waste disposal, in part because they manifestly fall into the confined-risk category. Thus, the September 1999 accident at the Tokaimura fuel preparation facility resulted in the exposure of many of the workers, including two cases of possibly fatal exposures. It involved an inexcusable level of ignorance and carelessness and may prove a serious setback to nuclear power in Japan and elsewhere. However, the effects were at a level of harm that is otherwise barely noticed in a world that is accustomed to coal mine accidents, oil rig accidents, and gas explosions. The degree of attention given the accident is a measure of the uniquely strict demands placed on the nuclear industry.


3. Open-ended risks


a. Nuclear weapons proliferation.

The first of the open-ended risks to be considered is that of nuclear weapons proliferation. A commercial nuclear power program might increase this threat in two ways:

  • A country that opts for nuclear weapons will have a head start if it has the people, facilities, and equipment gained from using nuclear power to generate electricity. This concern can explain the U.S. opposition to Russian efforts to help Iran build two nuclear power reactors.
  • A terrorist group might attempt the theft of plutonium from the civilian fuel cycle. Without reprocessing, however, the spent fuel is so highly radioactive that it would be very difficult for any sub-national group to extract the plutonium even if the theft could be accomplished.


To date, the potential case of Iran aside, commercial nuclear power has played little if any role in nuclear weapons proliferation. The long-recognized nuclear weapons states---the United States, the Soviet Union, the United Kingdom, France, and China---each had nuclear weapons before they had electricity from nuclear power. India's weapons program was initially based on plutonium from research reactors and Pakistan's on enriched uranium. The three other countries that currently have nuclear weapons, or are most suspected of recently attempting to gain them, have no civilian nuclear power whatsoever: Israel, Iraq, and North Korea.

On the other side of the coin, the threat of future wars may be diminished if the world is less critically dependent on oil. Competition over oil resources was an important factor in Japan's entry into World War II and in the U.S. military response to Iraq's invasion of Kuwait. Nuclear energy can contribute to reducing the urgency of such competition, albeit without eliminating it. A more direct hope lies in stringent control and monitoring of nuclear programs, such as that attempted by the International Atomic Energy Agency. The United States' voice in the planning of future reactors and fuel cycles and in the shaping of the international nuclear regulatory regime is likely to be stronger if the United States remains a leading player in the development of civilian nuclear power.

In any event, the relinquishment of nuclear power by the United States would not inhibit potential proliferation unless we succeeded in stimulating a broad international taboo against all things nuclear. A comprehensive nuclear taboo is highly unlikely, given the heavy dependence of France, Japan, and others on nuclear power, the importance of radionuclides in medical procedures, and the wide diffusion of nuclear knowledge---to say nothing of the unwillingness of the nuclear weapons states to abandon their own nuclear weapons.

b. Global climate change.

The prospect of global climate change arises largely from the increase in the atmospheric concentration of carbon dioxide that is caused by the combustion of fossil fuels. While the extent of the eventual damage is in dispute, there are authoritative predictions of adverse effects impacting many millions of people due to changes in temperature, rainfall, and sea level. Most governments profess to take these dangers seriously, as do most atmospheric scientists. Under the Kyoto agreements, the United States committed itself to bring carbon dioxide emissions in the year 2010 to a level that is 7% lower than the 1990 level. Given the 11% increase from 1990 to 1997, this will be a very difficult target to achieve.
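
To see how demanding the commitment is, combine the two percentages just quoted: 1997 emissions stood at 1.11 times the 1990 level, while the target is 0.93 times that level, so

```latex
1 - \frac{0.93}{1.11} \approx 0.16,
```

that is, a cut of roughly 16% from 1997 emissions in little more than a decade, against a rising trend.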

Nuclear power is not the only means for reducing CO2 emissions. Conservation can reduce energy use, and renewable energy or fusion could in principle replace fossil fuels. However, the practicality of the necessary enormous expansion of the most promising forms of renewable energy, namely wind and photovoltaic power, has not been firmly established. Additionally, we cannot anticipate the full range of resulting impacts. Fusion is even more speculative, as is the possibility of large-scale carbon sequestration. If restraining the growth of CO2 in the atmosphere warrants a high priority, it is important to take advantage of the contribution that nuclear power can make---a contribution clearly illustrated by French reliance upon nuclear power.

c. Global population growth and energy limits.


The third of the open-ended risks to be considered is the problem of providing sufficient energy for a world population that is growing in numbers and in economic aspirations. The world population was 2.5 billion in 1950, has risen to about 6 billion in 1999, and seems headed to some 10 billion in the next century. This growth will progress in the face of eventual shortages of oil, later of gas, and still later of coal.

The broad problem of resource limitations and rising population is sometimes couched in terms of the "carrying capacity" of the Earth or, alternatively, as the question posed by the title of the 1995 book by Joel Cohen, How Many People Can the Earth Support? As summarized in a broad review by Cohen, recent estimates of this number range from under 2 billion to well over 20 billion, centering around a value of 10 billion.

The limits on world population include material constraints as well as constraints based on ecological, aesthetic or philosophical considerations. Perhaps because they are the easiest to put in "objective terms," most of the stated rationales for a given carrying capacity are based on material constraints, especially on food supply, which in turn depends upon arable land area, energy, and water.

Carrying capacity estimates made directly in terms of energy, in papers by David Pimentel et al. and by Gretchen Daily et al., are particularly interesting in the present context as illustrations of the possible implications of a restricted energy supply. Each group concludes that an acceptable sustainable long-term limit to global population is under 2 billion, a much lower limit than given in most other estimates. Both envisage a world in which solar energy is the only sustainable energy source. For example, the Pimentel paper concludes that a maximum of 35 quads of primary solar energy could be captured each year in the United States, which, at one-half the present average per capita U.S. energy consumption rate, would suffice for a population of 200 million. For the world as a whole, the total available energy would be about 200 quads, which Pimentel et al. conclude means that "1 to 2 billion people could be supported living in relative prosperity."
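
The Pimentel numbers can be checked with round present-day figures (the U.S. primary energy consumption of about 94 quads per year and population of about 270 million are my round assumptions for the late 1990s):

```python
per_capita = 94 / 270            # quads per year per million people (~0.35)
half_rate = per_capita / 2       # half the present U.S. per capita rate

print(f"35 quads supports about {35 / half_rate:.0f} million people")    # ~200
print(f"200 quads supports about {200 / half_rate / 1000:.1f} billion")  # ~1.1
```

The 200-quad world figure thus lands near the bottom of the "1 to 2 billion" range.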

One can quarrel with the details of this argument, including the maximum assumed for solar power, but it dramatically illustrates the magnitude of the stakes, and the centrality of energy considerations.


4. Conclusions.

If a serious discussion of the role of nuclear power in the nation's and world's energy future is to resume, it should focus on the crucial issues. Of course, it is important to maintain the excellent safety record of nuclear reactors, to avoid further Tokaimuras, and to develop secure nuclear waste repositories. But here, considering probabilities and magnitudes together, the dangers are of a considerably smaller magnitude than those from nuclear weapons, from climate change, and from a mismatch between world population and energy supply.

The most dramatic of the dangers are those from nuclear weapons. However, as discussed above, the implications for nuclear power are ambiguous. For the other major areas, the picture is much clearer. Nuclear power can help to lessen the severity of predicted climate changes and can help ease the energy pressures that will arise as fossil fuel supplies shrink and world population grows. Given the seriousness of the possible consequences of a failure to address these matters effectively, it is an imprudent gamble to let nuclear power atrophy in the hopes that conservation and renewable energy, supplemented perhaps by fusion, will suffice.

It is therefore important to strengthen the foundations upon which a nuclear expansion can be based, so that the expansion can proceed in an orderly manner — if and when it is recognized as necessary. Towards this end, the federal government should increase support for academic and industrial research on nuclear reactors and on the nuclear fuel cycle, adopt reasonable standards for waste disposal at Yucca Mountain, and encourage the construction of prototypes of the next generation of reactors for use here and abroad. Little of this can be done without a change in public attitudes towards nuclear power. Such a change might be forcibly stimulated by a crisis in energy supply. It could also occur if a maverick environmental movement were to take hold, driven by the conclusion that the risks of using nuclear power are less than those of trying to get by without it.

David Bodansky

Department of Physics, Box 351560

University of Washington

Seattle, WA 98195

bodansky@phys.washington.edu

References:

1. Joel E. Cohen, How Many People Can the Earth Support? (W.W. Norton & Co., New York, 1995).

2. David Pimentel et al., "Natural Resources and Optimum Human Population," Population and Environment: A Journal of Interdisciplinary Studies 15, no. 5 (May 1994), 347-369.

3. Gretchen C. Daily, Anne H. Ehrlich and Paul R. Ehrlich, "Optimum Human Population Size," Population and Environment: A Journal of Interdisciplinary Studies 15, no. 6 (July 1994), 469-475.