Reviews

 

 

Beyond Growth: The Economics of Sustainable Development

By Herman E. Daly, Beacon Press, Boston, 1996, ISBN 0-8070-4709-0 (paper)

Many of us believe that human impacts on the environment have brought us into a new era. Herman Daly, professor at the University of Maryland's School of Public Affairs and former member of the Environment Department of the World Bank, puts his finger on the economics of that change. We have moved, he argues, from a natural environment that was empty and thus effectively limitless to one that is filled up.

Every economist needs to read this book. It could be aptly subtitled "Economics As Though Earth and Natural Law Mattered." In the tradition of E. F. Schumacher's Small Is Beautiful, Amory Lovins' Soft Energy Paths, and Nicholas Georgescu-Roegen's The Entropy Law and the Economic Process, it steps outside the conventional economist's world of infinite environmental sources and sinks, into the real, finite world. This book will certainly interest environmentalists. Because it takes natural law into account, it should also interest scientists. My guess, however, is that most economists will not read it, because it begins from assumptions that fall outside their customary world view, a world view that Daly shows to be dangerously outdated but that is impervious to change because it is seldom explicitly acknowledged.

My own liberal-arts physics course includes environmental topics, and touches on their economic feedbacks. Although these feedbacks are increasingly obvious, business students inform me that their business courses have little to say about such matters. Judging from Daly's book, this situation is the rule. Today's academic economists operate in a sort of vacuum, an empty world bereft of natural and environmental limitations.

Daly's unconventional economics begins from assumptions that are by now commonplace among environmentalists and scientists. The conventional macroeconomic vision treats the economy as an isolated loop in which industry and consumers exchange goods, services, and capital. Daly places this loop within the natural world, with its finite and degradable throughput of energy and matter. He connects these fundamentals with the two great thermodynamic laws, for instance in a figure (page 29) showing Georgescu-Roegen's "entropy hourglass," in which low-entropy solar energy flows through Earth at a constant (hence limited) rate, while limited terrestrial resources of matter and energy flow from low to high entropy.

Starting from an open system changes everything. The closed-loop analysis might have made sense in a relatively empty world. But today, when 33-50% of the land has been transformed by human action, when atmospheric CO2 has increased by 30%, when more than half of all accessible surface fresh water is put to human use, when 25% of the bird species have been driven to extinction and two-thirds of marine fisheries are fully exploited or depleted, it is folly to assume that the environment does not enter into the economic equation (see Jane Lubchenco, "Entering the Century of the Environment," Science 23 January 1998, pp. 491-497). In fact we live in a full world in which exponential expectations must be replaced by steady-state sustainability. Unfortunately, according to Daly, conventional economists have yet to figure this out.

To convey a sense of Daly's ideas I will quote, without comment, several of his observations.

"Sustainable growth [is] a clear oxymoron to those who see the economy as a subsystem [of the natural world]. At some point quantitative growth must give way to qualitative development as the path of progress. I believe we are at that point today." (p. 7)

"There is thus an important asymmetry between our two sources of low entropy. The solar source is stock-abundant, but flow-limited. The terrestrial source is stock-limited, but flow-abundant (temporarily). Peasant societies lived off the abundant solar flow; industrial societies have come to depend on enormous supplements from the limited terrestrial stocks. Reversing this dependence will be an enormous evolutionary shift." (30)

"Our national accounts are designed in such a way that they cannot reflect the costs of growth, except by perversely counting the resulting defensive expenditures as further growth. ...Unsustainable consumption is treated no differently from sustainable yield production (true income) in GNP. ...To design national policies to maximize GNP is...practically equivalent to maximizing depletion and pollution." (40-42)

"The tradeable pollution permits scheme...is a beautiful example of the independence and proper relationship among allocation, distribution, and scale. . .This scheme limits the total scale of pollution, need not give away anything but can sell the rights for public revenue, yet allows reallocation among individuals in the interest of efficiency" (52-53).

"As our exactions from and insertions back into the ecosystem increase in scale, the qualitative change induced in the ecosystem must also increase, for two reasons: ...the first law of thermodynamics [and] the second law of thermodynamics." (58)

"While all countries must worry about both population and per capita resource consumption, it is evident that the South needs to focus more on population, and the North more on per capita resource consumption. ...Without for a minute minimizing the necessity of population control, it is nevertheless incumbent on the North to get serious about consumption control." (61)

"Kenneth Boulding got it right fifty years ago [when he stated that] 'the objective of economic policy should not be to maximize consumption or production, but rather to minimize it, i.e. to enable us to maintain the our capital stock'." (p. 68)

"In sum, we found that empirical evidence that GNP growth has increased economic welfare in the U.S. since about 1970 is nonexistent." (97)

"These considerations also suggest a concept of 'overdevelopment' as correlative to 'underdevelopment': an overdeveloped country might be defined as one whose level of per capita resource consumption is such that if generalized to all countries could not be sustained indefinitely." (106)

"Population control is the sine qua non of sustainable development... Birth control [in less developed countries] is already practiced by the upper and urban classes. The relatively high rate of reproduction of the lower class insures an 'unlimited' supply of labor at low wages which promotes inequality in the distribution of income. ...A low birth rate tends to equalize the distribution of per capita income, in two ways: it reduces the number of heads among which a wage must be shared in the short run, and it permits the wage to rise by moving away from an unlimited supply of labor in the long run." (125)

"None of this is meant to imply that carrying capacity is only relevant to developing countries. If the U.S. had worried about carrying capacity, it would not have become so dangerously dependent on depleting petroleum reserves belonging to other nations. If the U.S. cannot even pass a reasonable gasoline tax to discipline unsustainable consumption, is it realistic to expect Paraguay and Ecuador to control population?" (p. 128)

"We therefore need a compensatory tariff to correct for differences in internalization of external costs among the nations. This is derided as 'protectionism' by free traders. But ...the compensatory tariff ...protects an efficient national policy of cost internalization against standards-lowering competition from countries that, for whatever reason, do not count all environmental and social costs." (147)

"Globalism does not serve world community--it is just individualism writ large. We can either leave transnational capital free of community constraint, or create an international government capable of controlling it, or renationalize capital and put it back under control of the national community. I favor the last alternative. I know it is hard to imagine right now, but so are the others. It may be easier to imagine after an international market crash." (148)

"A maximum income [is] part of the institutional basis appropriate to a steady-state economy. The notion of a maximum income ...seems to strike people as mean, petty, and invidious. I believe this is because growth in total wealth is assumed to be unlimited. ...Inequality is increasing in the U.S. and will become an issue again in spite of political efforts to deny its importance. ...At some point distribution must become an issue. ...In 1960 after-tax average pay for chief executives was about twelve times that of the average factory worker. ...In 1995 it is well over one hundred. ...No one is arguing for an invidious, forced equality. ...But bonds of community break at or before a factor of one hundred. Class warfare is already beginning." (202-203)

"To capture the cluster of values expressed by 'sustainability-sufficiency-equity-efficiency' in one sentance, I suggest the following: We should strive for sufficient per capita wealth--efficiently maintained and allocated, and equitably distributed--for the maximum number of people that can be sustained over time under these conditions." (220)

Art Hobson

University of Arkansas at Fayetteville

ahobson@comp.uark.edu

The Truth of Science

By Roger G. Newton, Harvard University Press, Cambridge, MA, 1997, ISBN 0-674-91092-3

When David Mermin panned The Golem in the pages of Physics Today (March and April 1996), he served public notice that physics had discovered social constructivism and found it annoying. Physicists might grant that natural science, being pursued by interacting humans, provides scope for sociological study. But to suggest that scientific results are nothing but a social construct, constrained little or not at all by the real world, is flagrant nonsense.

Roger G. Newton's irritation prompted him to write The Truth of Science to describe the intellectual structure of physical science and the understanding of reality that modern physics engenders. The book is "not intended as a polemic," but the offending sociologists are said to be "robbing rational thought of all intellectual and cognitive value, leaving its expression a hollow rhetorical shell" (pp. 3-4). Their views are called "malignant," their statements "disingenuous," "cynical," and "arrogant," and they themselves are found guilty of "anti-intellectualism" and "hubris." These are intellectual fighting words. Unfortunately, the author's arguments are inadequate to back them up.

For example, Newton thinks he has caught David Bloor in a contradiction, between Bloor's "principle of symmetry" (that both true and false beliefs are to receive causal sociological explanation) and his statement that "when we talk of truth...we mean that some belief...portrays how things stand in the world" (p. 34). Newton takes "how things stand in the world" to amount to "an external criterion of truth" that must either conflict with the sociological explanation or render it superfluous, but there simply is no need to interpret Bloor's statement that way. It seems more likely that Bloor is just affirming the common sense of what people mean when they speak of truth.

An analogy offered by Alan Chalmers, adopted by Newton against Bloor, is similarly misplaced. The idea is that when science comes up with correct results, no more external explanation is required than when a soccer player kicks the ball into the opposing goal--"he just followed the rules" (p. 34). For this analogy to hold, there would have to be a set of hard and fast rules by which science is done, just as there is a set of rules for playing soccer. No one has successfully articulated such a set of rules. It would be the holy grail for philosophers of science, who by now seem to have given up the quest.

Let sociologists investigate the social processes by which consensus is achieved in science. Let them, if they choose, seek to view those processes from the vantage point of the participants, refraining from judgment in the light of subsequently settled scientific knowledge. Where they make specific false or exaggerated claims to the effect that scientific results are just a social construct, these may best be debunked on a case-by-case basis, as Newton himself does with regard to assertions by Paul Forman and Lewis Feuer that "acausal" quantum concepts were an outgrowth of the political milieu of Weimar Germany. By appealing to the historical record, he effectively deflates their thesis (pp. 27-28).

Newton devotes only one chapter to attacking the social constructivists, but they might quote from his book for their own purposes. For example, on pages 108-9 we learn that there are fashions and fads in science, and on page 169 we are told that "Richard Feynman went to extremes in replacing all remnants of wave fields...by the quantum motion of particles". "Feynman diagrams", the author complains, "have become firmly entrenched in the language and imagination of physics." Entrenched!

What is the truth of science? Newton identifies coherence as the key criterion establishing the truth of scientific knowledge, and approvingly quotes John Ziman: "Scientific knowledge eventually becomes a web or network of laws, models, theoretical principles, formulae, hypotheses, interpretations, etc., which are so closely woven together that the whole assembly is much stronger than any single element" (p. 209). Where this edifice, as Newton calls it, makes direct contact with our experience, it works, in that it enables us to build powerful technology, accurately predict the outcome of experiments, and reliably assert that certain things will not happen. Scientists rightly dismiss parapsychology, astrology, and witchcraft because they are incoherent with the established edifice.

The problem with all this is that we cannot enforce a criterion of coherence against outsiders while giving the members of our club a free pass--not without inviting the social constructivists to our table. If scientific knowledge is tightly interconnected, then the entire edifice stands or falls together. There can be no "revolutionary escape clause" (p. 106) permitting serious consideration of, say, the Bohr atom, which clearly violated the laws of physics when introduced. When Newton says that "even though parts of the edifice may be found to be rotten, the coherence of the body of scientific truths accounts for its stability" (p. 209), he is denying the effectiveness of coherence as a criterion. The "edifice" is the coherent body of scientific truth. How can part of it be "rotten"? If something rotten can be part of it, why not ESP?

The preface to this book assures us that "no specific knowledge of physics is assumed," but my impression is that any reader lacking a substantial background in physics will come away from this book more intimidated than enlightened. Being informed on page 215 that "the validity of scientific statements...rests on evidence that could, in principle, be checked by anyone with the needed fundamental knowledge and apparatus," our reader will understand that "anyone" is not likely ever to include him or her. "The truth of science" is to be judged by a tiny group of initiates. The role of everyone else is to appreciate from afar and send money.

Allan Walstad

Physics Department

University of Pittsburgh at Johnstown

Johnstown, PA 15904

awalstad+@pitt.edu

Science in Culture

Special Issue of Daedalus, Winter 1998, $7.95

This 40th anniversary issue of Daedalus, the journal of the American Academy of Arts and Sciences, honors Gerald Holton, Mallinckrodt Professor of Physics and History of Science at Harvard, who launched the journal as a quarterly. The issue's ten articles were compiled from presentations at a conference honoring Holton. Editor Stephen Graubard writes that this issue "acknowledges that both the humanities and the social sciences are as relevant to [the cultural development of science] as are ... the physical and the biological sciences."

Holton's article "Einstein and the Cultural Roots of Modern Science" examines Einstein's creativity from the perspective of Holton's belief "that the full understanding of any particular scientific advance requires attention to both content and context." Holton writes that "Einstein's assertion of obstinate non-conformity enabled him to clear the ground ruthlessly of obstacles impeding his great scientific advances." Holton tracks Einstein's development from a rebellious youth that had a "selective reverence for tradition....The essential point for [Einstein] was freedom, the 'free play' of the individual imagination, within the empirical boundaries the world has set for us." Having access to many of Einstein's papers and correspondence, Holton was "impressed... by his courage to place his confidence, often against all available evidence, in a few fundamental guiding ideas or presuppositions." Holton attributes Einstein's breadth partly to wide reading within an education that went far beyond narrow scientific training. Holton concludes that Einstein had a "life-long struggle for simplicity and against ordinary convention."

The breadth of this collection is revealed in "Misconceptions: Female Imaginations and Male Fantasies in Parental Imprinting" by Wendy Doniger, Distinguished Service Professor of the History of Religions at the University of Chicago, and Gregory Spinner at Tulane University. The article discusses the belief in different cultures over many centuries that "a woman who imagines or sees someone other than her sexual partner at the moment of conception may imprint that image upon her child ... thus predetermining its appearance, aspects of its character, or both." The authors write that "this essay will consider a number of stories about the workings of maternal imagination, impression, or imprinting, terms that are often conflated." They discuss the Hebrew Bible, Aristotle, Heliodorus, Jerome and other Greek and Latin sources, Maimonides, various Midrashim, Goethe, Hoffmann, and Voltaire. They note "Christian theories grow out of the father's fear that his child may not be his child, or, rather, that he can never be sure that his child is his child ... unless, of course, he trusts his wife. Resemblance was the straw that men grasped in the storm of their sexual paranoia." The authors include a discussion of ancient India. They conclude that "what seems most astonishing in all of this is the extent to which the seemingly most plausible explanation for the birth of a child who does not resemble his father--namely, the fact that some other man fathered him--is rejected by most of our sources (with the notable exception of the Jewish sources) in favor of fantasies of a most extravagant nature."

In "The Americanization of Unity," Harvard's Peter Galison treats the unity of science, based on the work of Philipp Frank. He discusses a variety of attempts to develop a unified approach based upon "a unification through localized sets of common concepts."

"Fear and Loathing of the Imagination of Science," by Lorraine Daston, Director of the Max Planck Institute for the History of Science, is aimed at exploring "how and why large portions of the educated public--and many working scientists--came to think [that good science does not require imagination], systematically opposing imagination to science." Immanuel Kant believed that originality was necessary for genius and therefore denied Newton the title of genius because in Newton's work there were not the type of originalities to be found, for example, in Homer. Datson notes that, in the 19th century, "at the crossroads of the choice between objective and subjective modes stood the imagination" and that the French psychologist, Theodore Ribot "could not free himself from a certain suspicion that imagination was linked to scientific error: the 'false sciences' of astrology, alchemy, and magic." In this 19th century view, "pure facts, severed from theory and sheltered from the imagination, were the last, best hope for permanence in scientific achievement."

Edward O. Wilson, Research Professor in Entomology at Harvard, writes on "Consilience Among the Great Branches of Learning." Contrasting the Western development of science with that of China, he notes that the Chinese made brilliant advances "but they never acquired the habit of reductive analysis in search of general laws. ...Western scientists also succeeded because they believed that the abstract laws of the various disciplines in some manner interlock. A useful term to capture this idea is consilience." Wilson writes that consilience enables one to see modern science in a clearer perspective. He illustrates this with the discipline of biology, discussing evolutionary space-time, ecological space-time, organismic space-time, cellular space-time, and biochemical space-time. He writes that "two superordinate ideas unite and drive the biological sciences at each of these space-time segments. The first is that all living phenomena are ultimately obedient to the laws of physics and chemistry; the second is that all biological phenomena are products of evolution, and principally evolution by natural selection." He argues that the social sciences and humanities will change as the brain sciences and evolutionary biology come to "serve as bridges between the great branches of learning. ...The boundary between [C. P. Snow's] two cultures is instead a vast, unexplored terrain of phenomena awaiting entry from both sides. The terrain is the interaction between genetic evolution and cultural evolution. ...The relation between biological evolution and cultural evolution is, in my opinion, both the central problem of the social sciences and humanities and one of the great remaining problems of the natural sciences." He believes "biology is the logical foundational discipline of the social sciences," an idea that may be novel to many social scientists.

In "Physics and History," Nobel laureate Steven Weinberg considers "the uses that history has for physics, and the dangers both pose to each other." He believes that "one of the best uses of the history of physics is to help us teach physics to nonphysicists. Physicists get tremendous pleasure out of being able to calculate all sorts of things. Nonphysicists, for some reason, do not appear to experience a comparable thrill in considering such matters. This is sad but true. History offers a way around this pedagogical problem: Everyone loves a story." However, the danger that scientific knowledge presents to history is "a tendency to imagine that discoveries are made according to our present understandings." This leads us to "lose appreciation for the difficulties, for the intellectual challenges, that [past scientists] faced." While agreeing that "a scientific theory is in a sense a social consensus, it is unlike any other sort of consensus in that it is culture-free and permanent. This is just what many sociologists of science deny." He argues that the laws of nature are also "culture-free and...permanent," and that "as the number of women and Asians in physics has increased, the nature of our understanding of physics has not changed." He concludes that "physical theories are like fixed points, toward which we are attracted. Starting points may be culturally determined, paths may be affected by personal philosophies, but the fixed point is there nonetheless."

Harvard's Dudley Herschbach and Bretislav Friedrich contributed "Space Quantization: Otto Stern's Lucky Star," describing the Stern-Gerlach experiment and its historical development. The experiment showed the reality of space quantization and "provided compelling evidence that a new mechanics was required to describe the atomic world." This well-written article brings the reader back to those early days when quantum mechanics was debated and developed.

E.H. Gombrich, emeritus historian at the University of London, contributed "Eastern Inventions and Western Response." He describes "the Western response to the technical inventions that had reached Europe from the East. ...In the venerable civilizations of the East, custom was king and tradition the guiding principle. The spirit of questioning, the systematic rejection of authority, was the one invention the East may have failed to develop. It originated in ancient Greece."

Harvard's James S. Ackerman contributed "Leonardo da Vinci: Art in Science." Reviewing many of da Vinci's writings, he focuses upon the artist's "gift and patience for intensive observation," which served as the foundation for both his scientific investigations and his artistry. Ackerman looks in particular at da Vinci's descriptions of the workings of the human heart.

The final short essay, "Educational Dilemmas for Americans," by Patricia Albjerg Graham, historian of education at Harvard, asks "[why] does education present itself as such a persistent dilemma in the United States?" Her conclusion is that Americans change their minds about what tasks schools should accomplish, are ambivalent about what the adolescent experience should be, and are not really sure how important school is in a child's education. She notes the tremendous disruption that was caused when teachers who had spent years teaching in one manner suddenly had to adjust to the demands of the special education movement and to a philosophy whose purpose was "to prevent dropouts, not to create learners." Noting that there is a growing tendency for students to work during the school year, she writes that "calculus is more valuable in terms of discipline than a perfect attendance record at McDonald's." She concludes that "schools are more important for the children of the poor than they are for the children of the affluent" and that "it is thus an extraordinary tragedy that the worst schools--whether in terms of faculty and administrative skills or per-pupil expenditures--serve the children who most need excellent schools, the children of the poor, while the best ones serve the children who have the most educational alternatives, the children of the well-educated and prosperous."

This edition of Daedalus is remarkable in the breadth of the articles presented. Each brings an insight into an aspect of how culture and science intertwine.

John F. Ahearne

ahearne@sigmaxi.org

 

Possibly Vast Greenhouse Gas Sponge Ignites Controversy, by Jocelyn Kaiser, Science, 1998, vol. 282, pp. 386-387.

Science staff writer Jocelyn Kaiser noticed an inconsistency between two papers [1,2] published in the October 16 issue of Science. The Earth's carbon cycle is not known well enough to account for a missing one or two petagrams (10^15 g) of carbon per year. Both studies reported huge new carbon sinks. Such sinks would bear on possible "credits" for industrialized carbon output (human activity puts some 7.1 petagrams of carbon into the atmosphere yearly) under the recent Kyoto agreement on worldwide greenhouse gas production. Not only does each study report a huge local sink (40% of the missing carbon in South America [1]; 90% in North America [2]), but the two do not agree closely with one another, as the simple arithmetic sketched below makes clear. The disagreement might easily be attributed to differences in methodology and the primitive state of the art in carbon budgeting.

Perhaps unfortunately, each reporting team [1,2] had considerable representation from national laboratories in the region it reported as a sink. Adding this to the recent commitment at Kyoto, one has a difficult time imagining how these analyses could have been done completely free of national interest. The issue is credibility, not exaggeration or dishonesty. We must keep in mind that, if scientific work is to be used for treaty enforcement, the people required to limit carbon emissions can be expected to adhere to the treaty only if they know the evidence is sound and the enforcement is fair.

Kaiser points out several studies, done in different ways, that contradict the results of [1,2]. In particular, a study in press estimates the sink rate per unit area in North America at only about one-third of the rate reported in [2]. Of the numerous possible ways to determine a carbon sink rate over continents, [1] used on-site measurement of tree-trunk thicknesses, and [2] used computer modeling based on seawater and atmospheric CO2 data.

One feels obliged to point out that, should Kyoto be given enforcement teeth, and should "credits" for local sinks be allowed to permit excess local industrial greenhouse gas production, neither on-site measurement nor computer inference from distantly related data is going to be credible in any kind of world enforcement system. One would suggest either that "credits" be abandoned as too prone to bias, or that an objective system of greenhouse gas assessment be set up, agreeable to all signers. Because worldwide averages over large areas are the proper context for monitoring greenhouse gas balance over months or years, we would suggest that published satellite analyses be used to document every enforcement action. This means specification, if not development, of satellite measurement protocols and standards before any further action is taken. In the future, under Kyoto, advisory or appellate actions might be based on local interests, but the system will never work if individual nations are permitted court-like introduction of their own evidence to argue for special treatment.
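
As a back-of-the-envelope check (added here for illustration; it is not in Kaiser's article), let M stand for the missing carbon and use only the figures quoted above:

\[
0.40\,M + 0.90\,M = 1.30\,M > M,
\qquad
M \approx 1\text{--}2~\mathrm{Pg\;C/yr}.
\]

Taken at face value, the two reported sinks would together absorb roughly 130% of the carbon that is unaccounted for, so at least one of them must be substantially overestimated.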

[1] Oliver L. Phillips, et al., "Changes in the Carbon Balance of Tropical Forests: Evidence from Long-Term Plots," Science, 1998, vol. 282, pp. 439-442.

[2] S. Fan, et al., "A Large Terrestrial Carbon Sink in North America Implied by Atmospheric and Oceanic Carbon Dioxide Data and Models," Science, 1998, vol. 282, pp. 442-446.

John Michael Williams

jwill@pacbell.net

Atomic Audit: The Costs and Consequences of U.S. Nuclear Weapons Since 1940

Stephen Schwartz, editor, Brookings Institution Press, 1998

Killing Detente: The Right Attacks the CIA

Anne Cahn, Penn State Press, 1998, 232 pages, $35, $18 (paper).

Science aims to link cause and effect for natural phenomena. Linking cause and effect for historical events is often more difficult, since history cannot be rerun with varied parameters. Despite the difficulty, it is worthwhile to review the causes behind the magnitude of the U.S. nuclear buildup. Two critical questions should guide this analysis: How much of the $5.8 trillion (1996 dollars) that the U.S. spent to build 70,000 nuclear warheads, deployed on 75,000 missiles and 8,600 bombers, was too much? And was the effectiveness of the Soviet military exaggerated with false predictions? These two books go a long way toward quantifying the costs, and explaining the large size, of the buildup.

Atomic Audit, edited by Stephen Schwartz, thoroughly compiles the U.S. costs--$5.8 trillion between 1940 and 1996--for its atomic arsenal, from the fuel cycle to the weapons to the delivery systems and to decommissioning. As a companion book, Killing Detente supplies some of the causes that prompted this spending level. Anne Cahn describes the history behind the 1970s Team B estimates of the Soviet nuclear triad. These estimates provided some of the rationale for the $5.8 trillion in spending, and helped bring about the demise of U.S.-Soviet detente. At the 1993 Senate hearings [1] on the Evaluation of the U.S. Strategic Nuclear Triad, former Secretary of Defense Caspar Weinberger (1981-87) said, "Yes, we used a worst-case analysis. You should always use a worst-case analysis in this business. You can't afford to be wrong. In the end, we won the Cold War, and if we won by too much, if it was overkill, so be it." Make no mistake: it was better to have too much nuclear hardware and avoid Armageddon than the converse. However, since there is no limit to spending under this argument, and since more nuclear hardware can be destabilizing, it is useful to separate truth from fantasy if we are to learn from the past.

Atomic Audit provides an excellent, comprehensive description of each of the nuclear-capable systems: 14 types of deployed heavy bombers, 47 deployed and 25 canceled missiles, 91 types of nuclear warheads. It describes and quantifies the technical and economic facts for the Manhattan Project ($25 billion), command/control, SIOP targets, defensive systems, dismantlement, nuclear waste, plutonium disposition, production and testing victims, secrecy, congressional oversight, and much more. It describes programs that were developed but never really deployed, such as the nuclear-powered aircraft ($7 billion), the 1970s Safeguard ABM system ($25 B), SDI ($40 B), nuclear-propelled missiles ($3 B), Project Orion (a nuclear-explosion-propelled rocket to the stars, $50 million), and Plowshare (peaceful nuclear explosions, $0.7 B). Certainly the SIOP strategic target list, which peaked at 12,000 in 1990, dictated the size of the U.S. triad. The Brookings group concludes the obvious: (1) Nuclear weapons were much more expensive than the projected 3% of all military spending; the $5.8 T total was 31% of all military spending ($18.7 T) and 44% of all non-nuclear military spending ($13.2 T). (2) Congress, the Defense Science Board, and most presidents made errors of omission and commission. (3) The SIOP target list of 10,000 targets was more than excessive, increasing the size of U.S. nuclear forces.

In Killing Detente, Anne Cahn gives us the detailed background of the 1970s "Team B" estimates. At the end of President Ford's term, the future members of the Committee on the Present Danger lobbied hard to create a process which would, in essence, write National Intelligence Estimates (NIEs) from outside the government. This happened at a time when the CIA was weakened from attacks coming from both the left and the right. The left was angry about CIA covert action in Vietnam, such as Project Phoenix, which was responsible for the deaths of 20,000 civilians. The right was angry about projections of an unwinnable war in Vietnam, projections that "no amount of bombing would deter North Vietnam from its objective." Because of the very close primary elections in 1976, Ford stopped using the word "detente" and allowed a team of outsiders to create a rival NIE. CIA Director George Bush signed off on the birth of Team B with "Let her fly, OK, GB."

Cahn describes how Professor Richard Pipes of the Harvard History Department became the leader of Team B, an unbalanced panel which greatly feared the Soviets. In only three months, Team B developed three NIE alternatives on Soviet strategic objectives (a catch-all that covered every possibility), ICBM accuracy, and low-altitude air defense. These Team B reports laid the foundation for the nuclear buildup of the 1980s. When Pipes testified before a House committee, he stated that if "it [the intelligence community's NIE's] were presented in a seminar paper to me at Harvard, I would have rejected or failed [it]." Now that 20 years have passed since the Team B trilogy was written, it seems only fair to judge Professor Pipes' work alongside the NIEs. As one who consulted many NIEs over a decade at the Senate Foreign Relations Committee, the State Department, and ACDA, I have seen benchmarks by which to grade Pipes' work, and I give him an "F" for failure. So that you can judge for yourself, I have included parts of Team B's declassified (Top Secret) reports at the end of this review, with my comments in brackets.

In conclusion, the Soviet Union caused honest fear in the U.S., but we could have used better scholarship than Team B offered. Perhaps we have learned this lesson. In 1995 Congress was concerned that the NIE on the emerging ballistic missile threat from smaller countries had understated the threat. In contrast to Team B, the congressionally mandated commission that examined that NIE--the Rumsfeld Panel--was well balanced and stayed clear of the extremists who could have captured the process. The panel analyzed those cases that were possible, but did not specify which ones were likely.

I strongly recommend both books as excellent studies on the Cold War nuclear arms race. Hopefully, we can learn from the errors of our predecessors.

Statements by Team B:

On Soviet Low Altitude Air Defense: "It specifically does not address Soviet capability against the B-1, cruise missiles or advanced penetration aids." [This forced a comparison of future Soviet air defenses against 1976-vintage U.S. airplanes, without cruise missiles or B-1 and B-2 bombers.]

"Put more starkly, it is not inconsistent with current evidence that the Soviets believe they have and may actually possess the inherent ability to prevent most, if not all, penetrating U.S. bombers (of the kind presently in the force, in raid sizes of a few hundred) from reaching targets the Soviets value." [It is hard to believe "most if not all."]

"In future years, high-energy laser weapons may play a role in the air defense of the Soviet Union.... Accordingly, they could possibly begin deploying ground-based laser antiaircraft weapons in the 1985-1990 time period, if they so desire." [The U.S. continues to fund laser anti-aircraft weapons, but without being able to successfully deploy it.]

On Soviet ICBM Accuracy: "Considering the magnitude of this effort and the fact that much of the Western research in this area is available to them, we find it hard to believe... that Soviet G&G errors will be significantly greater than those of the United States. For this reason, we will assume these errors to be equal to those of the Minuteman III in 1975, 70 m downrange and 35 m crossrange." [It is the strong consensus view that Soviet accuracy has always been poorer, by more than a factor of two, than U.S. accuracy. Since a factor of 2 in accuracy corresponds to a factor of 8 in yield (see the scaling sketched below), this is a very large effect. To say that the Soviet missiles will have an accuracy of 70 m downrange and 35 m crossrange is far beyond the pale.]
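
The factor-of-eight remark can be made explicit with the standard rule of thumb for hard-target lethality (a sketch added for illustration, not taken from either book): lethality K varies as the two-thirds power of the yield Y divided by the square of the circular error probable (CEP), so

\[
K \propto \frac{Y^{2/3}}{\mathrm{CEP}^{2}},
\qquad
\mathrm{CEP} \to 2\,\mathrm{CEP}
\;\Longrightarrow\;
Y \to \bigl(2^{2}\bigr)^{3/2}\,Y = 8\,Y
\quad\text{(to hold } K \text{ constant)}.
\]

That is, a missile with twice the miss distance needs roughly eight times the yield to hold the same hardened target at risk.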

On Soviet Strategic Objectives: "After some apparent division of opinion intermittently in the 1960's, the Soviet leadership seems to have concluded that nuclear war could be fought and won." [Since our SLBMs have always been invulnerable at sea, and since some bombers and ICBMs would survive, it is not plausible that a Soviet leader could believe the Soviet Union could actually destroy all of the U.S. nuclear forces and prevent the U.S. from destroying Russian cities. Since the Cuban missile crisis, the Soviet leadership has indicated no desire to risk a nuclear confrontation.]

"We have good evidence that it [the Backfire bomber] will be produced in substantial numbers, with perhaps 500 aircraft off the line by early 1984. [The Soviets had 235 Backfire bombers in 1984, which need considerable in-air refueling.]

"Given this extensive commitment of resources and the incomplete appreciation in the U.S. of the full implications of many of the [ASW] technologies involved, the absence of a deployed system by this time is difficult to understand. The implication could be that the Soviets have, in fact, deployed some operational non-acoustic systems and will deploy more in the next few years." [The logic is that if powerful Soviet ASW has not been observed, it will be there in the next few years.]

"... we cannot with any assurance whatever forecast the probability or extent of success of Soviet ASW efforts. However, we are certain that these probabilities are not zero, as the current NIE implies." [The strong consensus view is that at-sea U.S. SLBMs were never threatened by Soviet ASW, and that hypothetical new ASW technologies have all failed." (See reference 1, GAO report on the triad.)]

"... it is clear that the Soviets have mounted ABM efforts in both areas of a magnitude that is difficult to overestimate." [New Soviet ABM systems did not make much technical progress, nor were they deployed. Thus, "difficult to overestimate" is fear mongering.]

"While it may be possible (though often erroneously, in our view) to disparage the effectiveness of each component of Strategic Defense taken separately, the combined and cumulative efforts may posses considerable strategic significance." [Twenty years later SDI is not capable of destroying ICBMs.]

Reference: 1. Evaluation of the U.S. Strategic Nuclear Triad, Senate Hearing 103-457, June 10, 1993.

David Hafemeister
Physics Department
California Polytechnic State University
San Luis Obispo, CA 93407