Reprint: The Fable of Managed Earth

Editor’s Note: This article is republished with permission from Ehrenfeld, D. (2014). The fable of managed earth. In G. Wuerthner, E. Crist, & T. Butler (Eds.), Keeping the Wild: Against the Domestication of Earth (pp. 85–108). The Foundation for Deep Ecology.

We must judge with more reverence the infinite power of nature, and with more consciousness of our own ignorance and weakness…. Why do we not remember how much contradiction we sense even in our own judgment, how many things were articles of faith to us yesterday, which are fables to us today? — Michel de Montaigne, Essays, 1580

Human civilization can thrive only in a healthy natural world. For at least two centuries, environmentalists, conservationists, and ecologists—greens—have, to their everlasting credit, made this point, showing that technology, for all its genius, will not last if it stands alone, damaging the natural world and disregarding the essential place of nature in our lives. Techno-optimism is a deeply flawed worldview—not only morally and ethically but also technologically. Yet in the midst of planetary-scale destruction, technology remains seductive; even some greens now proclaim the coming of a gardened planet, in which all nature is tamed, preserved, and managed for its own good by enlightened, sophisticated humans.[1] But these “neo-greens,” or “ecological modernists” as some call them, are doomed to disappointment: The gardened planet is only a virtual image; it will never happen in the real world.

We do not need to be prophets to know that we do not have the technological ability to produce and sustain a smoothly running, completely managed Earth. Of the existing technologies that are supposed to service a managed Earth, it is easy to show that many don’t work well now, and they will be even more prone to failure in a future without extensive natural systems to serve as emergency backup.

From a human perspective, planetary gardening can be divided into a number of critical management areas. These include: food production; energy production; global climate control by geoengineering; accident prediction/control/repair; restoration of damaged ecosystems; assuring water supplies; regulation of human population size; and the maintenance of cooperative working relationships among nations. I will concentrate on the first four, but the others are also critically important. All of these processes must interact smoothly; positive adjustment of one set of variables should not negatively affect others.

I.     Sustainable food production

Beginning in the 1940s, a technology that came to be known as “the Green Revolution” created enormous increases in crop production, primarily the grains—rice, wheat, corn, etc.—which comprise the bulk of our food supply. These increases were achieved by breeding dwarf plants that could respond to the application of synthetic nitrogen fertilizer by increasing their production of edible grain rather than growing longer stems and more leaves. The dramatic increase in food production brought about by the Green Revolution saved many millions of people from starvation. Yields of rice, the first crop to benefit from Green Revolution technology, increased as much as tenfold, and prices fell accordingly. Norman Borlaug, the geneticist who was the father of the Green Revolution, was awarded the Nobel Peace Prize for his achievement.

An essential feature of the new agricultural technology was the growing of grains in fertilized, irrigated monocultures—only one crop at a time in supersized fields. In these very large fields, the plants were more accessible to the machinery that applied not only the necessary chemical fertilizer but also the newly developed insecticides, herbicides, and fungicides needed to protect the vulnerable crops from the insect pests, weeds, and fungi that thrive in monocultures. The big fields also allowed more convenient use of the irrigation apparatus that provides water to wash the fertilizer into the soil, and to water the dwarf crops, whose small root systems are less able than roots of traditional varieties to extract water from dry soils. The dramatic yield increases brought about by the Green Revolution continued through the 1960s, 1970s, and 1980s. By the 1990s, it was becoming clear that yields, especially of wheat and rice, had started to plateau. Farmers around the world had achieved the maximum benefit that the technology had to offer. Lester R. Brown, then president of the Worldwatch Institute, wrote in 1997:

In every farming environment, where yields are increased substantially, there comes a time when the increase slows and either levels off or shows signs of doing so…. During the four decades from 1950 to 1990, the world’s grain farmers raised the productivity of their land by an unprecedented 2.1 percent per year, but since 1990, there has been a dramatic loss of momentum in this rise.[2]

According to Vital Signs 2006–2007, world grain production per person peaked around 1985.[3] A growing world population (a growth propelled, ironically, by the Green Revolution) needs more food, but supply is no longer increasing proportionally.

Nevertheless, people had become accustomed to the idea that technology would solve their food problems, and technology appeared to be about to respond. Genetic engineering of food crops rose to the fore in the 1990s and in the early twenty-first century. People hoped that genetically modified (GM) crops would end world hunger.

But the great increases in crop yields that were supposed to be the result of genetic engineering have not materialized, and they seem unlikely to do so in the foreseeable future. In fact, compared with conventional crops, GM yields have often decreased, and sometimes the quality of the GM seeds is poor.[4] Yet despite this mixed performance, by the beginning of the second decade of this century, the acreage planted to GM crops in the United States, Brazil, China, and other countries had increased substantially. This increase happened for a variety of reasons, some related to transient agricultural advantages of the new crops but another significant factor being the link between economic subsidies and the political power of the multinational corporations that produce the GM seeds. By contrast, the nations of the European Union and India have largely rejected GM crops out of fear of their biological and socioeconomic side effects.

At the time of this writing, the proponents and opponents of genetic engineering are waging a fierce battle, with victories and defeats on both sides. Genetic engineering is not likely to disappear, but its claims of potentially ending world hunger have no basis in reality; GM crops are not another Green Revolution.

What went wrong after forty years of the Green Revolution, and then, more quickly, with genetic engineering?

The Green Revolution has fallen victim to a host of intractable problems. It depends entirely on cheap energy to produce the synthetic nitrogen fertilizer; to make and run the machinery that is needed on the monoculture farms; and to package and transport the crop surpluses to distant markets. From the 1970s into the 1990s, cheap energy was becoming a thing of the past.

The monoculture fields that were so much a part of the Green Revolution were also causing serious problems. The heavy equipment used on the fields was compacting and breaking down the soils, increasing erosion, and decreasing soil fertility. The chemicals used to combat the pests, weeds, and diseases that are a hallmark of monoculture were affecting the integrity of ecosystems as well as the health of humans and other species. Irrigation required large amounts of energy, and it was drawing down scarce groundwater reserves. And the shift from many small farms to a smaller number of large ones, combined with the displacement of farmworkers by machine labor, caused a mass migration of people from rural areas to cities all over the world, from São Paulo to Manila, creating huge urban slums.

Genetic engineering has had less time than the Green Revolution to reveal its problems, but so far they seem just as numerous and intractable. Some are specific to this technology; others are shared with the Green Revolution.

One problem specific to genetic engineering is that its exaggerated claims are based on a genetic fallacy. It is common knowledge that most genes have more than one function, often many more, and that expression of these functions can be influenced by the changing environment of the cell, of the entire organism, and of the external world. But the hype surrounding genetic engineering is grounded in the false belief that one gene does one thing—even when the gene is moved from one species to another—and that its expression remains constant over time. Sometimes this is true; frequently it is not. The public sees only the illusion of one gene, one function; the high failure rate of genetic engineering is proof that this hype cannot be trusted. For example, in March of 2012, Reuters reported that a group of plant scientists were warning that Monsanto’s GM corn, which had been engineered to resist corn rootworm, was “losing its effectiveness,” potentially leading to “significant production losses.”

Similarly, in November of 2011, the U.S. Department of Agriculture, in an extensive study of Monsanto’s “drought-tolerant” corn (MON87460), concluded that “equally drought resistant varieties produced through conventional breeding techniques are readily available.”[5]

Contrary to the claims of agribusiness, genetically engineered crops have caused an increase in the use of pesticides. This is hardly surprising, because the companies that develop and sell the genetically engineered seeds are the same companies that produce the agricultural chemicals. For example, seeds genetically engineered to contain Bt (Bacillus thuringiensis) toxin, a naturally occurring bacterial pesticide, kill some pests, but their use enables other pests, previously viewed as minor disturbances, to rush in and fill the ecological void, with unexpected consequences. In a May 2010 Nature article, Jane Qiu gives an example:

More than 4 million hectares of Bt [GM] cotton are now grown in China. Since the crop was approved, a team led by Kongming Wu, an entomologist at the Chinese Academy of Agricultural Sciences in Beijing, has monitored pest populations at 38 locations in northern China, covering 3 million hectares of cotton…. Numbers of mirid bugs,…previously only minor pests in northern China, have increased 12-fold since 1997, they found…. [and according to Kongming Wu] ‘Mirids are not susceptible to the Bt toxin, so they started to thrive when the farmers used less pesticide [for the bollworms].’ [The mirids also eat] green beans, cereals, vegetables and various fruits…. The rise of mirids has driven Chinese farmers back to pesticides.[6]

A perhaps more serious problem caused by agricultural technology—both Green Revolution and genetic engineering—is the erosion of the genetic base upon which all of agriculture depends. For more than ten thousand years, farmers have been cultivating and saving the seeds of the plants they have found most productive; most resistant to pests, diseases, droughts, and floods; and most delicious. Tens of thousands of local varieties of hardy crop plants that yield high-quality food even under adverse conditions are the heritage of these millennia of farming. The best seeds have always been saved and passed on to the next generation by the farmers who grew them, and, since the nineteenth century, they have also been produced and sold by many seed companies. However, starting with the Green Revolution, and accelerating with the rise of genetic engineering, restrictive patent laws and the growing power of the agricultural chemical companies (which now own the major seed companies) have caused the loss of thousands of preexisting crop varieties. Many corporate owners of these varieties have deliberately discontinued them in order to make way for their own, patented seeds. Restrictive laws in some countries now punish farmers who save their seeds. Loss of agricultural varieties is a worldwide phenomenon. For example, according to Dr. H. Sudarshan, in India, where in the first half of the twentieth century there were an estimated 30,000 indigenous varieties of rice, it is now predicted that soon just 50 varieties will remain, with the top ten accounting for more than three-fourths of the subcontinent’s rice acreage.[7]

The spread of genetically engineered crops poses a threat to traditional varieties and wild relatives of our crops. Despite corporate claims to the contrary, genetically engineered genes are escaping from the planted fields and contaminating the gene pools of traditional crops and their wild relatives. It is a paradox that the success of the Green Revolution, GM crops, and conventional agriculture largely depends on the preservation of the gene pools that are now being deliberately discarded by industrial agriculture, wiped out by herbicides, or accidentally contaminated with engineered genes. The genetic engineers are sawing off the very branch on which they sit.

Another effect of the genetic contamination is the transfer of the genes conferring the genetically engineered traits from the crops to the weeds. In another, more recent Nature news article, in August 2013, Jane Qiu reports that transgenes from rice crops genetically engineered to resist the herbicide glyphosate have crossed over into weedy relatives of the rice. Not only have the weeds become resistant to the weed killer, but they now have higher rates of photosynthesis, grow more shoots and flowers, and produce 48–125 percent more seeds per plant than their non-transgenic relatives. An ecologist at Shanghai’s Fudan University stated that “making weedy rice more competitive could exacerbate the problems it causes for farmers around the world.”[8]

Monocultures have been praised for their high yields, but even these appear to be an illusion. The physicist and agricultural scientist Vandana Shiva has exposed what she calls “the myth of productivity.”[9] Traditional polyculture systems, where many different crops are grown close together on the same farms, actually produce more food per acre than do modern monocultures. A mixture of corn, cassava, and peanuts yields less corn per acre than a GM corn monoculture, but it produces two and a half times as much total food per acre. As Shiva points out, “The Mayan peasants in the Mexican state of Chiapas are characterized as unproductive because they produce only two tonnes of corn per acre. However, the overall food output is twenty tonnes.” Shiva concludes that “industrial breeding has actually reduced food security by destroying small farms and the small farmers’ capacity to produce diverse outputs of nutritious crops.”
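Shiva's arithmetic is easy to check. In the sketch below, the two-tonne corn figure is the one quoted above; the other per-acre yields are hypothetical, chosen only to show how a monoculture can win on a single crop while losing on total food:

```python
# A per-acre sketch of Shiva's "myth of productivity." The 2-tonne corn
# figure comes from the text; the other yields are hypothetical, chosen
# only to show how a monoculture can win on one crop yet lose on total food.
monoculture = {"corn": 9.0}                                   # tonnes/acre (hypothetical)
polyculture = {"corn": 2.0, "cassava": 12.0, "peanuts": 6.0}  # tonnes/acre (hypothetical mix)

def total_food(yields):
    """Total food output per acre, summed over all crops grown."""
    return sum(yields.values())

print(monoculture["corn"] > polyculture["corn"])          # True: monoculture wins on corn
print(total_food(polyculture) > total_food(monoculture))  # True: polyculture wins overall
```

The point is structural rather than numerical: judging farms by the yield of one crop hides the output of everything else grown on the same land.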

II.     Sustainable energy production

It was cheap energy that powered the Green Revolution and the entire industrial revolution of the twentieth century. Chief among the sources of energy was oil, a concentrated energy source that was easy to extract from the ground. Coal and natural gas completed the trio of “fossil fuels,” carbon-rich substances that were the end result of millions of years of decay of plants buried deep underground. Although vast, the underground reserves of fossil fuels are finite, and the easily extracted parts of these reserves have been largely depleted.

As the physicist Albert Bartlett pointed out,[10] with an increase in fuel consumption of 7 percent per year, a typical twentieth-century growth rate, the amount of oil consumed in ten years equals the total consumed in all of recorded history prior to that decade. In other words, simple arithmetic shows that if oil consumption grows at a rate of 7 percent per year between 2010 and 2020, we will have used during that decade an amount of oil equal to all the oil consumed in all the years before 2010. Clearly, such extraction rates cannot continue, and they haven’t. The economist Herbert Stein put it succinctly in what has become known as Stein’s Law: “If something cannot go on forever, it will stop.”
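Bartlett's claim is a general property of exponential growth: at 7 percent per year the doubling time is about ten years, and each doubling consumes roughly as much as everything that came before. A quick numerical check, in arbitrary units and assuming steady growth over a long prior history:

```python
# Numerical check of Bartlett's rule: at 7 percent annual growth, the oil
# consumed in one decade roughly equals all oil consumed before it.
# Units are arbitrary; growth is assumed steady over a long prior history.
r = 1.07                                   # 7% annual growth factor
prior = sum(r**t for t in range(-300, 0))  # all consumption before year 0
decade = sum(r**t for t in range(0, 10))   # consumption in the next ten years
print(round(decade / prior, 2))            # ratio close to 1
```

The ratio comes out just under one, confirming the rule of thumb.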

The cheap energy that helped produce industrial civilization is nearly gone, as anyone who buys gasoline knows. This author remembers once, in the midst of a “gas war” during the 1950s, buying gas at 11 cents a gallon to fill the tank of his gas guzzler; now gasoline is more than thirty times as expensive. Some of the difference is due to a drop in the value of the dollar; most is because of dwindling supplies of cheap oil. Modern technologies of prospecting for new oil reserves are very sophisticated, yet new oil discoveries peaked in the 1960s. And oil consumption continues to grow, propelled by consumer demand and industrial expansion in China and India. However, according to World Energy Outlook 2010, global oil production peaked in 2006, and it is expected to decline from 70 million barrels per day in 2006 to less than 16 million in 2035. The International Energy Agency, the U.S. Joint Forces Command, and the oil companies themselves all know that cheap oil is a thing of the past.

The loss of cheap oil (and cheap oil = cheap energy) is an incontrovertible fact, so the technophiles have turned to the idea that technology will invent oil substitutes to power our technological civilization, and they keep alive their hopes that cheap energy will continue to be available to run a managed planet. Coal-to-liquid conversion; nuclear fission or fusion; hydrogen; tar sands and oil shale; fracking for natural gas; offshore and deep-sea oil and gas drilling; and the “renewables,” including solar power, wind power, and biofuels, are expected to rescue us.

But the cold facts tear this dream to pieces. True, nearly all of the celebrated energy substitutes are technically feasible and have been shown to work, but all suffer from one or more major problems. They require large-scale investment and have long lead-in periods. They frequently need expensive government subsidies. Some routinely cause serious environmental damage and have high greenhouse gas emissions. Some are subject to major accidents. Their processing may place great demands on scarce freshwater supplies and can require high energy inputs for production. They may not be capable of producing enough energy to replace what we now use. And all the new energy substitutes are guaranteed to be more expensive, often much more expensive, than conventional oil.

The University of Manitoba’s Vaclav Smil, one of the world’s leading energy experts, writing in the May–June 2011 issue of American Scientist, looked at the substitutes for conventional oil and dubbed them “the latest infatuations.”[11] They reminded him of the scientist at the grand academy of Lagado, in Gulliver’s Travels, who had spent eight years on a project for extracting sunbeams out of cucumbers. (Actually, as mentioned below, cucumbers probably could be used for biofuel, but nobody in their right mind would think that the world’s energy needs could be met by cucumbers.)

Enthusiasm for the new energy sources waxes and wanes, as it does for any new fad. A few years ago the fad was hydrogen: Hydrogen-powered cars and distributed energy systems were the rage. But when people stopped to think, they realized that hydrogen is not a primary energy source (there are no hydrogen wells)—it takes money and energy to extract it from natural gas or water. Also, hydrogen is highly explosive (remember the Hindenburg disaster); is corrosive; and, in liquid form, even contains much less energy per gallon than does oil. Not surprisingly, we hear less about hydrogen cars now than we did in 2000.

Before hydrogen, nuclear fusion was going to save us. It was thought that ordinary seawater, believed to be in endless supply, could serve as the fuel for a fusion reactor. The first patents for fusion reactors were registered in 1946. In 2012, sixty-six years and millions of research and development dollars later, I heard a lecture from a prominent fusion scientist who was as enthusiastic as ever about the limitless potential of fusion. When asked how long it would take to get a working reactor, she replied that it would take about thirty to forty more years.

Nuclear fission power plants have existed for decades in many countries. The oldest operating commercial nuclear power plant in the United States, New Jersey’s Oyster Creek plant, has been producing power since 1969, and it is not scheduled to shut down until 2019. Until the Fukushima Daiichi disaster caused by the Tohoku earthquake and tsunami in March of 2011, many assumed (despite the earlier accidents at the Three Mile Island and Chernobyl plants) that nuclear power would ease the transition to a new, renewable energy world. Since Fukushima, fission has become an increasing cause for concern: Few new reactors are being built; Germany has announced that it will abandon nuclear power completely by 2022; and, after Fukushima, Japan closed or suspended its 50 nuclear reactors.

Moreover, as noted by Mark Bittman in The New York Times, on August 24, 2013:

The dangers of uranium mining, which uses vast amounts of water…[are] barely regulated or even studied. Thousands of uranium mines have been abandoned, and no one seems to know how many remain to be cleaned up. The cost of that cleanup…will be borne by taxpayers…. Then there’s disposal of spent fuel…. Decades into the nuclear age there remains, incredibly, no real plan for this…. The economic viability of nuclear power is no more encouraging. Plants continue to close and generation rates continue to drop…. Subsidies for nuclear power have been more than double the expense of power generation itself.[12]

U.S. oil shales and the Canadian tar sands contain large reserves, but the environmental damage associated with the extraction of the oil is enormous; a great deal of freshwater is used in the process; the energy ratio, Energy Returned Over Energy Invested (EROEI), is terrible—only about three barrels of oil out for every two barrels put in; and the need to construct new pipelines to transport the heavy, toxic crude oil from remote production sites many miles to distant refineries generates grave political and environmental problems. Offshore oil, another heralded energy source, is extremely expensive, and it was dealt a serious blow by the Deepwater Horizon explosion. The Deepwater Horizon drilling rig cost a billion dollars to build and a half-million dollars a day to operate—while it lasted.[13]
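The tar sands EROEI quoted above (about three barrels out for every two invested) can be restated as a small calculation; the function and figures below simply encode the ratio given in the text:

```python
# EROEI (Energy Returned Over Energy Invested) restated as code, using the
# tar sands figure from the text: about three barrels out per two invested.
def eroei(energy_out, energy_in):
    """Ratio of usable energy delivered to energy spent obtaining it."""
    return energy_out / energy_in

tar_sands = eroei(3, 2)           # 1.5
net_fraction = 1 - 1 / tar_sands  # share of output that is net energy gain
print(tar_sands)                  # 1.5
print(round(net_fraction, 2))     # 0.33: only a third of the output is net gain
```

An EROEI of 1.5 means two of every three barrels produced merely repay the energy invested; only the third barrel is gain.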

Improvements in the efficiency of energy generation and use can save us a great deal of energy. These improvements are both desirable and possible. Again, however, they are unlikely to meet the energy needs of a highly managed planet. Modern agriculture has a much lower energy efficiency than that of traditional farming systems, which take advantage of the free energy subsidies offered by nature. And even when efficiencies materialize, there is the Jevons Paradox, first described by the English economist W. Stanley Jevons in 1866: Increased efficiency of energy production leads to increased consumption. Using the coal industry as his model, Jevons showed that improvements in efficiency led to lower cost of the product, which in turn caused a rebound increase in consumption of the coal. This paradox applies to other sources of energy besides coal.
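The Jevons Paradox can be illustrated with a toy calculation. Every number below is an illustrative assumption (especially the demand elasticity), not data; the point is only that when demand for an energy service is sufficiently price-elastic, an efficiency gain can raise total fuel use:

```python
# Toy illustration of the Jevons Paradox. All numbers are illustrative
# assumptions; the point is only that elastic demand can turn an
# efficiency gain into higher total fuel consumption.
efficiency_gain = 0.25   # machines deliver 25% more service per unit of fuel
elasticity = 1.5         # assumed price elasticity of demand for the service

cost_drop = efficiency_gain / (1 + efficiency_gain)  # service gets 20% cheaper
service_rise = cost_drop * elasticity                # demand grows 30% (linearized)
fuel_change = (1 + service_rise) / (1 + efficiency_gain) - 1
print(round(fuel_change, 2))  # 0.04: fuel use rises about 4% despite the gain
```

With less elastic demand the rebound is only partial, but as Jevons observed for coal, the savings promised by efficiency alone are rarely realized in full.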

     Renewable energy. Let us take a closer look at renewable energy—solar, wind, and biofuels, the great hope of the neo-greens. According to Smil, the renaissance of renewable energy “has led to exaggerated expectations rather than to realistic appraisals.” In 2011, he wrote:

Promoters of new renewable energy conversions that now appear to have the best prospects to make significant near-term contributions—modern biofuels (ethanol and biodiesel) and wind and solar electricity generation—do not give sufficient weight to important physical realities concerning the global shift away from fossil fuels: to the scale of the required transformation, to its likely duration, to the unit capacities of new converters, and to enormous infrastructural requirements resulting from the inherently low power densities with which we can harvest renewable energy flows and to their [irregularity].[14]

     Solar power. In his well-researched book Green Illusions, environmentalist Ozzie Zehner states:

If actual installed costs for solar projects in California are any guide, a global solar program [to replace fossil fuels in powering the planet] would cost roughly $1.4 quadrillion, about one hundred times the United States GDP. Mining, smelting, processing, shipping, and fabricating the [solar] panels and their associated hardware would yield about 149,000 megatons of CO2. And everyone would have to move to the desert, otherwise transmission losses would make the plan unworkable.[15]

Future costs of solar panels may come down with technological innovations (costs may already have started to plateau), but as Zehner notes:

Cheaper photovoltaics won’t offset escalating expenditures for insurance, warranty expenses, materials, transportation, labor, and other requirements. Low-tech costs are claiming a larger share of the high-tech solar system price tag.[16]

Passive solar power, which involves energy savings in heating and cooling achieved by sophisticated architectural design and construction, has been proving its worth for millennia, as the natives of New Mexico demonstrated in the tenth century with their remarkably energy-efficient housing complex, which we call Pueblo Bonito. These energy efficiencies were built into Pueblo Bonito from the start of construction. Modern passive solar houses can be equally energy efficient and are a joy to live in. But many, perhaps most, existing homes have limited potential for passive solar improvement.

Solar power has an important role to play among the energy sources of the future, but it does not seem to be about to replace cheap oil in maintaining our present industrial civilization.

     Wind power. Wind power, like solar, is receiving a great deal of enthusiastic praise, some of it justified. I am among those who find the sight of a row of giant, stately wind turbines with their slowly moving blades thrilling and beautiful, but, admittedly, I don’t live near them. Denmark is the pioneer in wind energy: In 2012, Denmark got 25–30 percent of its power from the wind, and now the country hopes to raise this figure to 50 percent or more. Denmark also produces half of the world’s wind turbines. Like solar power, wind has a great deal to offer an energy-challenged future. Wind power is not, however, all smooth sailing.

In The New York Times on August 15, 2013, Diane Cardwell chronicled the problems experienced by Green Mountain Power, whose wind turbines line the ridge of Lowell Mountain in Vermont.[17] These problems are typical of those experienced by the wind power industry. Some of the difficulties include “curtailments,” mandated cutbacks in energy production when the grid will not accept the wind power energy, either because the electric company can get energy cheaper elsewhere or for technical reasons involving the interface between fossil fuel generated electricity and wind power. Other difficulties involve the size of the lines carrying the power. When curtailments occur, the wind turbines must operate at a fraction of their potential output. In her article entitled “Intermittent Nature of Green Power Is Challenge for Utilities,” Cardwell writes:

Because energy produced by wind…is intermittent, its generating capacity is harder to predict than conventional power’s. And a lack of widely available, cost-effective ways to store electricity generated by wind only compounds the complex current marketplace…. [One wind power CEO noted that] at full operating capacity he can lose $1,000 an hour if the electricity is not sold. “We have a grid system that’s not smart…it’s a 100-year-old system—and they run it like fossils and nukes are the only things that matter and the rest of us, they can fiddle with,” he said.[18]

Integrating wind power into an electrical system that receives inputs from fossil fuel and nuclear plants plus, increasingly, solar installations involves daunting economic and technical challenges. Some of these will be fairly straightforward to resolve over time; others, like the difficulty or impossibility of storing excess wind power when the grid cannot accept it, are much harder to fix.

Among the other problems that are an inseparable part of wind power are the fact that wind turbines kill bats and migrating birds, that wind power installations on the roofs of city buildings are noisy and hard to maintain, that turbine installations on ridgetops damage and fragment some of the last undisturbed wildlife habitats, and that many people complain that the huge turbines spoil their view of the countryside or of their neighboring coastal waters.

Bat and bird kills by turbines are easy to document. Numerous counts have been published of dead bats and birds collected under turbines; but there is as yet no evidence that any populations are threatened by wind power, and some radar studies have shown birds flying well above the turbines during migration. Urban wind power production on the tops of tall buildings has been promoted by neo-greens as a renewable source of energy in cities, but noise and maintenance issues are likely to limit the potential of urban wind energy for the foreseeable future. Even outside of cities, some people living in rural areas near wind turbines complain of health problems such as insomnia, anxiety, palpitations, and nausea, allegedly related to the low frequency noise. The existence of this “Wind Turbine Syndrome” is still debated.[19] As for the question of unsightliness of the windmills, there is no right answer; some love them, some don’t.

     Biofuels. Biofuels are another mixed blessing as a replacement for vanishing cheap fossil fuel energy. The idea of biofuels is straightforward: Use plants to capture the energy of sunlight (like the Lagado cucumbers), and get some of that energy back by extracting energy-rich substances from the plants (sugars and other hydrocarbons) that can be either turned into fuel, such as ethanol, by chemical processing or used directly as a diesel fuel substitute. Corn, sugarcane, soy, rapeseed, palm and other tree oils, grasses, algae, and the desert plant called Jatropha are some of the plants used for biofuel.

Like solar and wind power, biofuels have a dark side. Some of the plants grown for biofuel, especially the grasses, can escape from cultivation and become invasive species, particularly harmful in agricultural fields. The EROEI of biofuels is troubling. Corn ethanol from the American Midwest has an EROEI ratio of about 1.0 or even lower, meaning that if we total the energy costs of growing the corn, harvesting it, and then processing it, we find that the amount of energy we get back is only equal to or less than the energy we put in, clearly a losing proposition. Meanwhile, we have wasted land that could have been used for growing food and have also driven up the price of corn. The EROEI of other biofuels can be better than that of corn ethanol, but not always enough to offset the other difficulties of the technology.

If the results for corn ethanol are so poor, why does the Midwest in the United States continue to produce so much of it? The answer is political: Midwestern states receive huge federal subsidies for growing corn and producing ethanol, and few politicians are willing to tell the truth about corn ethanol and risk the wrath of midwestern voters.

The land used to grow biofuel plants is unavailable for growing food in a hungry world. True, plants like Jatropha grow well in dry, nutrient-depleted soils that are not suited for crops. But the conceivable supply of Jatropha-derived biofuel could run only a tiny fraction of the world’s vehicles.


Timothy Beardsley summed up the problems with biofuels in an editorial titled “Biofuels Reassessed,” in the October 2012 issue of BioScience:

It takes a lot of land, a lot of water, and a lot of energy to produce biofuel crops and convert them into usable fuels. The displacement of food crops by biofuels has already increased food prices, and many have argued that such effects will put limits on the biofuel enterprise…. The enthusiasts are right that improvements [in biofuel technology] are possible…and the seriousness of the looming energy crisis—only partly ameliorated, at substantial environmental cost, by fracking—argues for the continuation of such efforts. Still…it is important to understand biofuel’s limitations.[20]

Beardsley cites scientific studies showing that the amount of biofuel that could be produced globally is roughly a quarter of previously published estimates:

All these numbers exclude losses due to manufacturing the fuel…. Actual current global primary productivity suggests strongly that biofuels have less promise than many had thought…. Some new biofuels may yet alleviate the human predicament, but nobody should be under any illusions about the constraints that nature—ultimately through the laws of thermodynamics—has put in the way.[21]

In concluding this section on renewable energy, we should heed the words of Vaclav Smil: “None of us can foresee the eventual contours of new energy arrangements—but could the world’s richest countries go wrong by striving for moderation of their energy use?”[22] In other words, the best thing we can do to sustainably run the Earth and our own civilization is to depend less on technologies of control and more on regulation of our own self-destructive consumption.

III.     Geoengineering to control climate change

To begin, climate change is a reality. In 1981, NASA physicist James Hansen calculated the extent of global warming he expected in the near future, based on man-made CO2 emissions. Three decades later, these calculations have proven exceptionally accurate.[23] Temperatures have risen to meet or exceed Hansen’s predicted levels; polar ice is melting; and drought-prone areas are receiving less rainfall. In recent years, other consequences of climate change—more frequent and more violent storms, and rising sea levels—have forced themselves on our attention. In a May 9, 2012, article in The New York Times, Hansen writes that if we were to continue to burn conventional fossil fuels and to exploit Canada’s tar sands:

Concentrations of carbon dioxide in the atmosphere eventually would reach levels higher than in the Pliocene era, more than 2.5 million years ago, when sea level was at least 50 feet higher than it is now…. Disintegration of ice sheets would accelerate out of control. Sea levels would rise and destroy coastal cities. Global temperatures would become intolerable. Twenty to 50 percent of the planet’s species would be driven to extinction. Civilization would be at risk. That is the long-term outlook. But near-term, things will be bad enough. Over the next several decades, the Western United States and the semi-arid region from North Dakota to Texas will develop semi-permanent drought, with rain, when it does come, occurring in extreme events with heavy flooding. Economic losses would be incalculable. More and more of the Midwest would be a dust bowl. California’s Central Valley could no longer be irrigated. Food prices would rise to unprecedented levels.[24]

Other parts of the world, including its most populous nations, China and India, are already experiencing the effects of climate change. In China, the Gobi Desert is expanding, moving toward the Yellow River, and is within 100 miles of Beijing. Growth of the Gobi is the result of not only climate change but also careless use of groundwater and indiscriminate logging in the past. Groundwater use and logging can be and are being controlled to some extent by the government, and millions of trees are being planted at the edge of the desert to halt its advance, but global warming is a continuing presence. In India, now the world’s sixth-largest emitter of greenhouse gases (carbon dioxide, methane, and nitrous oxide), disastrous floods have been attributed to climate change; melting of the Hindu Kush ice mass is accelerating; and sea-level rise is forcing saltwater into coastal aquifers, contaminating drinking water.

The solution to the problem of climate change is obvious: We must immediately halt the expansion of greenhouse-gas release and quickly start to reduce it below present levels. A number of well-publicized, high-level meetings of governments have confronted this issue, with some positive results. But international environmental agreements are subject to compromise and delay; meanwhile, greenhouse gas levels continue to rise. Impatient with the political process, some scientists have decided that geoengineering offers the best hope of managing our planet. Geoengineering solutions fall into three categories: dimming the sunlight reaching Earth; using plant photosynthesis to take up and reduce the carbon dioxide already in the atmosphere; and capturing carbon dioxide, turning it into charcoal, and burying it in the Earth.

There are various proposed ways to reduce the sunlight reaching the Earth. One solution, inspired by the observed effects of volcanic eruptions, would be to spray solar-reflective sulfates into the stratosphere, perhaps from a giant balloon. Other schemes include using rockets to send tiny reflectors into space, growing lighter-colored crops genetically engineered to reflect sunlight, painting all roofs white, and covering the Earth’s deserts with reflective Mylar.

Some of these ideas, like desert Mylar and lighter-colored crops, are too preposterous to deserve comment. Careful evaluation shows that most of the other schemes, like painting roofs white, would not have enough effect to make a significant difference in global warming. Injecting 5 million tons of sulfates per year into the stratosphere could, like other sunshade schemes, make a difference, especially in the tropics, but it could also disrupt monsoons, bringing famine to millions, and, according to Oxford’s Tim Palmer,[25] “You might turn the Amazon to desert.” Sending enough tiny reflectors into space could require an estimated 20 million rocket launches. And if there were bad side effects, how would we get our little reflectors back?

Using plants to pull carbon dioxide out of the atmosphere through photosynthesis has no obvious adverse side effects, and it does have the added benefit of putting oxygen back in place of the carbon dioxide removed. Planting forests of relatively fast-growing trees can tie up a good deal of carbon dioxide. Reforestation is generally a good idea, not just because of carbon sequestration but because of beneficial effects on local climate, water storage, and stream flow.

Reforestation, however, is slow, varies greatly from country to country, and can present ecological and social challenges. It can still be a win-win way to slow climate change, but planet managers are an impatient lot, and for many of them reforestation is simply too slow.

Algae in the world’s oceans remove a great deal of carbon dioxide by photosynthesis, and some climate engineers might ask, Why not fertilize the oceans, increase the algal numbers, and pull out more carbon dioxide? This would slow climate change, benefit marine food webs that are based on algae, and even, in closed systems, provide algal biomass to be used as animal food or for biofuels. That’s the theory, and it works to some extent. Dumping iron fertilizer in the ocean does stimulate algal growth; the algae do remove carbon dioxide; and, when they die, some of them take the carbon out of harm’s way by sinking to the bottom of the ocean.

Unfortunately, ocean fertilization with iron can also stimulate toxic algal blooms and cause production of the greenhouse gas nitrous oxide. And when the algae die, as they do in vast numbers during blooms, the decomposition of algal bodies that stay at the surface pulls oxygen from the water while putting carbon dioxide back in the atmosphere. In closed, artificial systems, unlike ocean fertilization, the main difficulties are the costs of building, maintaining, and aerating the containers for the algae and the problem of scale—these systems will have limited impact on global climate change and biofuel energy production.

Carbon capture and storage is a geoengineering method that can reduce climate-changing carbon dioxide. The carbon dioxide is captured and removed at point sources, usually the smokestacks of large fossil fuel power plants, and then moved to sites where it can be deposited underground. This is a good idea, but one whose impact is limited because there are so many nonpoint sources of greenhouse gases. The principal risk of carbon capture and storage is leakage of the gas back into the atmosphere from its underground burial sites (declining oil fields, saline aquifers, un-mineable coal seams, and other suitable geological formations). Deep-well injection of unwanted substances has caused earthquakes. Needless to say, carbon capture and storage is a great deal more expensive than simply letting the gas escape into the atmosphere, and it may require government-sponsored incentives and subsidies.

Geoengineering has a great appeal to those looking for quick and simple solutions to overwhelming, complex problems. Such searches tend to promote tunnel vision, in which the gaze is always on simple models and their associated technical solutions, not on the many, sometimes serious, unpredictable, and unmanageable side effects produced by geoengineering technologies. Vaclav Havel, author and first president of the Czech Republic, wrote in The New York Times on September 27, 2007:

I’m skeptical that a problem as complex as climate change can be solved by any single branch of science. Technological measures and regulations are important, but equally important is support for education, ecological training and ethics— a consciousness of the commonality of all living beings and an emphasis on shared responsibility.[26]

IV.     Accident prediction, control, and repair

Our global management systems rest on a precarious edifice of predictions. These include predictions about the sustainability of industrial agriculture; the safety of nuclear power plants; the stability of the global political structure; the efficacy of our ecological restorations; the future of globalization—especially global trade; the continuation of economic growth; and, above all, the ability of our technology to solve any problems we face, now or in years to come.

These predictions are often unwarranted and very dangerous. One would think that the first priority of the planet managers would be to look at their past predictions and assumptions and see how well they have worked out. But this might involve admitting failure and, more important, shutting off sources of revenue for the failed projects. Consequently, risk assessments made at the start of projects are frequently “cooked,” unwarranted justifications for enterprises scheduled to go ahead no matter what.

In their book Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future,[27] geologists Orrin Pilkey and Linda Pilkey-Jarvis show how a model of future beach erosion and coastal sand movements has been used to justify escape from reality and allow construction of questionable shoreline structures and buildings. The standard model used in beach engineering is the Bruun Rule, which describes how shorelines retreat in response to rising sea levels. This simple model to describe a complex process has some general validity, but, as the authors note:

The Bruun Rule resides in a world dominated by engineers rather than scientists. It is a world where it is not possible to admit defeat and walk away or to respond flexibly, one where an answer must be found…and where the answer, to be credible, is best found by the most sophisticated means possible…. Evidence continues to accumulate from all over the world that the basic assumptions behind the Bruun model are very wrong. Yet it continues to be widely applied by coastal scientists, who should know better, and blindly applied by social scientists, planners, and international agencies concerned with how future global trends will affect coastal cities.[28]

When the Bruun Rule is used to predict the rate of erosion of a particular shoreline, one has to know only the rate of sea-level rise and the slope of the shoreface on that particular beach. Two variables; it’s easy. But as Pilkey and Pilkey-Jarvis show, there are at least 31 variables that matter, including beach subsurface geology, sand grain size, coastal sediment supply, beach nourishment projects, storm types and frequency, shoreline vegetation, upland bluffs and dunes, dam construction and removal in neighboring rivers, and history of dredging.
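The two-variable prediction the Bruun Rule prescribes can be sketched as follows. This is the simplified textbook form (horizontal retreat equals sea-level rise divided by average shoreface slope), and the beach numbers are hypothetical:

```python
def bruun_retreat(sea_level_rise_m, shoreface_slope):
    """Shoreline retreat (m) under the simple two-variable Bruun Rule:
    horizontal retreat = sea-level rise / average shoreface slope."""
    return sea_level_rise_m / shoreface_slope

# Hypothetical beach: 0.3 m of sea-level rise on a shoreface slope of 0.01 (1:100).
retreat_m = bruun_retreat(0.3, 0.01)
print(f"{retreat_m:.1f}")  # prints 30.0 (meters of predicted retreat)
```

The point Pilkey and Pilkey-Jarvis make is precisely that this calculation is too simple: at least 31 other variables, applied in an unknowable order, govern what a real shoreline does.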

Even if you know how each of the factors works and interacts with other factors, including sea-level rise, in causing shorelines to retreat, you still can’t predict the future because you don’t know the order in which the factors will occur…. On different shorelines the various parameters will be of varying importance, over varying time frames. This is ordering complexity. This is why shoreline retreat related to sea-level rise cannot ever be accurately predicted.[29]

Ordering complexity can make some management predictions absurd. Pilkey and Pilkey-Jarvis give, as the ultimate preposterous example, the Department of Energy’s Total System Performance Assessment (TSPA) for the proposed nuclear waste repository at Yucca Mountain, Nevada. The assessment of the chances of radioactive leaks from the underground repository, based on hundreds of models, is that it will be safe for more than a hundred thousand years. Yet, as the authors show, there are at least 15 important factors that will affect the seriousness of future leaks. The values of these factors were not known when the TSPA was formulated, and many will never be known.

In 2009, the Environmental Protection Agency issued a rule requiring that the Department of Energy (DOE) strictly limit the amount of radiation from the facility to no more than 15 millirems per year for the first ten thousand years after the facility’s closure, and requiring the DOE to show that the nuclear waste repository will resist earthquakes, volcanic activity, climate change, and container leakage for 1 million years. The risk assessment charade came largely to a halt when work on Yucca Mountain was ended by Congress in 2011, for political reasons. It remains to be seen whether it will be started again.

Ordering complexity is only one kind of complexity that makes the long-term predictions and assumptions used in planet management unreliable. The other is structural complexity. The pioneer in studying the hazards of structural complexity is Charles Perrow, Professor Emeritus of Sociology at Yale. Using the well-studied 1979 accident at the Three Mile Island nuclear plant as his model, Perrow showed how the sheer complexity of the nuclear plant made accidents inevitable and unpredictable—“normal.”

The operating system of a nuclear power plant has a large number of separate subsystems, many of which interact in ways that cannot be directly observed, and in ways that might not be understood even if they were observed. Moreover, the operating systems interact with safety systems, which are themselves complex and often cannot be directly observed.

In his book Normal Accidents: Living With High-Risk Technologies, Perrow describes how the accident at Three Mile Island was caused by the failure of a pressure-relief valve, which let radioactive water boil out onto the floor of the reactor building.[30] The control room operators could determine this only indirectly, from a variety of gauge readings, while three audible alarms sounded and many of the 1,600 lights on the control panels flashed simultaneously. Only 13 seconds elapsed between the valve’s failure and the moment the accident became irrevocable. The scene in the control room was chaos.

Several hours after the start of the accident, control room personnel and supervisors were still arguing about what was happening. The valve stayed open for two hours and twenty minutes, until a new shift came on and somebody thought to check it. But the accident was just getting started. Two reactor coolant pumps did not work (possibly because of steam bubbles in the lines), and levels of coolant began to drop alarmingly, the most feared event in a nuclear plant. The two dials indicating reactor pressure gave diametrically opposite readings.

Then, thirty-three hours into the accident, an ominous bang was heard in the control room. It was a hydrogen explosion inside the reactor building. No one had expected this. Frantic discussions occurred between the plant operators and the commissioners of the Nuclear Regulatory Commission. The emergency pumps, like all electric motors, can produce sparks; when hydrogen accumulates, a spark can cause an explosion that could destroy the reactor building. Should the pumps be turned off or kept running? Opinions varied. That an explosion did not happen was in good measure a matter of luck.

Because of the vast complexities of nuclear plants, paradoxically including their safety systems, the operators did not actually know what was happening while the accident was going on. But they had to do something. In this sort of situation, Perrow notes, you form a mental model of events. You imagine what is happening, based on the inadequate and partially erroneous information that you have. “You are actually creating a world that is congruent with your interpretations, even though it may be the wrong world. It may be too late before you find that out.”[31]

In other words, the complex systems that we invent to manage and run our world cannot be made fail-safe. And if we add economic and ecological interactions, our constructed systems become still more complicated and accident-prone.

Here is an example: On April 20, 2010, the Deepwater Horizon oil drilling rig in the Gulf of Mexico suddenly exploded in flames. As chronicled by Joseph Tainter and Tadeusz Patzek in their book Drilling Down: The Gulf Oil Debacle and Our Energy Dilemma:

Everything seemed to be under control, with the computers in charge and their sensors humming. The people assigned to watch these computers, and act on their advice, were content and getting ready to go to sleep…. Suddenly all hell broke loose, and it became clear that the people watching the computer screens did not understand what the computers were telling them. It took just a few seconds for their false sense of security to go up in the same flames that consumed the Deepwater Horizon in two days.[32]

When the flames were extinguished, the accident was far from over. Several months later, the well was finally capped. By then, an estimated 210 million gallons of oil had leaked into the gulf. Various attempts were made to contain the oil or mitigate its effects, using state-of-the-art technologies. But several years later, we still do not know the long-term effects of this accident on the thousands of species living in the immensely complicated gulf ecosystem, or on the human communities of the adjacent land areas.

Tainter, a professor in the Department of Environment and Society at Utah State University, and Patzek, Chairman of the Department of Petroleum and Geosystems Engineering at the University of Texas, analyze in detail the causes of the accident. At the end of their book, they conclude:

The Deepwater Horizon was a normal accident, a system accident. Complex technologies have…ways of failing that humans cannot foresee. The probability of similar accidents may now be reduced, but it can be reduced to zero only when declining [energy returns] makes deep-sea production energetically unprofitable. It is fashionable to think that we will be able to produce renewable energies with gentler technologies, with simpler machines that produce less damage to the earth, the atmosphere, and people. We all hope so, but we must approach such technologies with a dose of realism and a long-term perspective.[33]

Three Mile Island and Deepwater Horizon teach us a simple lesson: We cannot predict all the accidents that will occur in our managed world; and even if we could predict them, we could not prevent many of them from happening. Disasters in our complex systems are bound to take place, and the techno-utopians’ models offer no credible ways of fixing them.

V.     Other global management concerns

Successful global management requires addressing other essential issues besides the concerns discussed above. Briefly, they include:

     Ecological restoration and preservation: In some cases, restoration of damaged ecosystems is possible if done with care and ecological knowledge; in others, it can be difficult or impossible. Restorations are often confounded by ignorance of the component species and complexity of the specific ecosystem; by prior species extinctions; by major soil or water changes; and by lack of sufficient funds to do the restoration properly or to monitor it after the restoration is complete.

Preservation can be as hard as restoration. Moving species endangered by climate change to more favorable climate zones (“assisted colonization”) and reintroducing recovering populations of endangered species to their original habitats are both challenged by the limitations of our ecological knowledge. This is not a reason to abandon restoration and preservation efforts, but it should make us think twice before we boast about how green the coming garden planet will be.

     Maintenance of adequate supplies of clean freshwater will be essential for sustainable global management; it is not happening now, and there are no affordable technologies on the horizon that will assure water for everyone, especially in the face of climate change. Already, international fights over water management complicate tense politics in the Middle East, South Asia, and parts of Africa. Water will undoubtedly be one of the greatest obstacles to a managed planet.

     Growing populations require more space, more food, more water, more mineral resources, and more energy than stable ones; and they produce more waste. The Earth’s population is growing: Estimates published by the United Nations (UN) in June of 2013 suggest an increase from today’s 7.2 billion to 9.6 billion by 2050.[34] Population growth models are no more reliable than any long-term predictions involving thousands of variables (climate and sea level, disease, ethnic conflicts and warfare, economic changes, etc.), and this sort of unreliability will greatly increase the difficulty of managing a gardened Earth. A point to consider is that per capita consumption is increasing more than twice as fast as population in many places around the world.
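As a rough check on the UN figures cited above, the implied compound annual growth rate can be computed. This is a back-of-envelope sketch, not a demographic model, which is exactly the kind of long-term projection the paragraph warns is unreliable:

```python
# Back-of-envelope check on the UN figures cited in the text: what constant
# annual growth rate takes 7.2 billion (2013) to 9.6 billion (2050)?
start, end, years = 7.2e9, 9.6e9, 2050 - 2013
annual_rate = (end / start) ** (1 / years) - 1
print(f"{annual_rate:.2%}")  # prints 0.78% (implied constant annual growth)
```

Real population trajectories, of course, depend on the thousands of variables (disease, conflict, economics, climate) that make any constant-rate assumption a fiction.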

A managed world assumes good working coordination between nations. The Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) shows that this is occasionally possible.[35] By 2013, 178 nations had ratified the convention, which protects—at least on paper—thousands of endangered animal and plant species from over-exploitation. With exceptions, this protection has been moderately successful. A great weakness of the treaty, however, is that member countries can take reservations (exceptions) for specific species. Iceland, Japan, and Norway have taken reservations that allow them to hunt some baleen whale species, and Saudi Arabia has taken a reservation for falcons. CITES is an encouraging model; nevertheless, the proliferation of regional military conflicts, terrorism, religious and ethnic strife, exhaustion of resources, and political instability do not bode well for cooperative management of the planet.

I have considered the various threats to the neo-green vision individually, but of course they interact, usually making the situation worse. For example, scarcity of cheap energy affects modern food production and water availability, while causing us to rely on increasingly dangerous energy technologies, which are prone to accidents that we are unable to predict. Similarly, climate change has a major impact on food, water, international relations, and energy use.


In conclusion, the paragraphs above give only an incomplete sampling of the reasons why many of the dreams of the planet-managing neo-greens and ecological modernists are likely to turn into nightmares. In his chilling short story “The Machine Stops,” written more than a century ago, E. M. Forster described the chaos and total collapse that descended on a managed world when the “Mending Apparatus,” which had always repaired everything that was broken, itself began to fail: “Man, the flower of all flesh, the noblest of all creatures visible, man who had once made god in his image…was dying, strangled in the garments that he had woven.”[36]

The dream-to-nightmare scenarios outlined here do not have to become reality. We can keep trying to make the world a better place, using any safe technology that is proven or seems promising. For instance, we already know that traditional polycultures can reliably produce far larger amounts of food than can industrial monocultures year after year, with less input of chemical fertilizers and pesticides. The field is wide open to apply careful, modern scientific research to improve this performance still further. And in the case of our energy deficit, reduction of consumption is safer, easier, faster, and more effective than deep-sea oil drilling or nuclear power.

Wendell Berry wrote in The Unsettling of America that “what has drawn the Modern World into being is a strange, almost occult yearning for the future. The modern mind longs for the future as the medieval mind longed for Heaven.”[37] This yearning, embodied in the blind worship of technology, has led us astray—if we open our eyes and look at who and where we are, we have our best chance of finding out where to go next. I end with a quote from my book The Arrogance of Humanism, published in 1981, with words that I believe are as applicable now as the day they were written:

Not all problems have acceptable solutions…. There is…no need to feel defeated by the knowledge that there are limits to human power and control…. [We should start] with the honest admission of human fallibility and limitations, and from this realistic base [rise to the] challenge to construct a good life for oneself, one’s family, and one’s community…. We simply start with realism and then free the human spirit for high adventure, struggle, and an unknown fate.[38]

[1] F. Pearce, “New Green Vision: Technology as Our Planet’s Last Best Hope,” Yale Environment 360 (15 July 2013).

[2] L. Brown, “Can We Raise Grain Yields Fast Enough?,” World Watch, Worldwatch Institute (July-August 1997): 8-17; see also F. Magdoff and B. Tokar, “Agriculture and Food in Crisis: An Overview,” in Agriculture and Food in Crisis: Conflict, Resistance, and Renewal, ed. F. Magdoff and B. Tokar (New York: Monthly Review Press, 2010), pp. 10-17; D. Ehrenfeld, “Agriculture in Transition,” in Beginning Again: People and Nature in the New Millennium (New York: Oxford University Press, 1993/1995), 164-74.

[3] B. Halweil, “Grain Harvests Flat,” in Vital Signs 2006-2007: The Trends That Are Shaping Our Future (New York: Norton, 2006), 22-24.

[4] M. A. Altieri, Genetic Engineering in Agriculture: The Myths, Environmental Risks, and Alternatives, 2nd ed. (Oakland, CA: Food First Books, 2004); D. Ehrenfeld, Becoming Good Ancestors: How We Balance Nature, Community, and Technology (New York: Oxford University Press, 2009), 4-13; C. M. Benbrook, “Who Controls and Who Will Benefit from Plant Genomics?” Presented at the 2000 Genomic Seminar Genomic Revolution in the Fields: Facing the Needs of the New Millennium (Washington, D.C.: American Association for the Advancement of Science Annual Meetings, 19 Feb. 2000).

[5] APHIS (Animal and Plant Health Inspection Services), U.S. Department of Agriculture, “Monsanto Company Petition (07-CR-191U) for Determination of Non-regulated Status of Event MON 87460, OECD Unique Identifier: MON 87460-4, Final Environmental Assessment” (Washington, D.C.: U.S.D.A/APHIS, Nov. 2011).

[6] J. Qiu, “GM Crop Use Makes Minor Pests Major Problems,” Nature (13 May 2010). doi:10.1038/news.2010.242.

[7] H. Sudarshan, “Forward” in V. Ramprasad, Hidden Harvests: Community Based Biodiversity Conservation (Bangalore, India: Green Foundation, 2002), 4-6.

[8] J. Qiu, “Genetically Modified Crops Pass Benefits to Weeds,” Nature (16 Aug. 2013). doi:10.1038/nature.2013.13517; H. Thompson, “War on Weeds Loses Ground: The Rise of Herbicide-resistant Varieties Drives a Search for Fresh Methods of Control,” Nature 485 (24 May 2012): 430.

[9] V. Shiva, “Globalization and the War Against Farmers and the Land,” in The Essential Agrarian Reader, ed. N. Wirzba (Lexington, KY: University Press of Kentucky, 2003), 121-39.

[10] A. A. Bartlett, “Forgotten Fundamentals of the Energy Crisis,” Am J. of Physics 46 (1978): 876-88.

[11] V. Smil, “Global Energy: The Latest Infatuations,” American Scientist 99, no. 3 (2011): 212-19.

[12] M. Bittman, “The New Nuclear Craze,” The New York Times, 24 Aug. 2013, p. A21.

[13] For the cost of the Deepwater Horizon drilling platform, see J. Tainter and T. Patzek, Drilling Down: The Gulf Oil Debacle and Our Energy Dilemma (New York: Springer, 2012), 5.

[14] Smil, “Global Energy: The Latest Infatuations,” American Scientist 99, no. 3 (2011): 212-19.

[15] O. Zehner, “Solar Cells and Other Fairy Tales” in Green Illusions: The Dirty Secrets of Clean Energy and the Future of Environmentalism (Lincoln, NE: University of Nebraska Press, 2012), 3-30.

[16] Ibid.

[17] D. Cardwell, “Grappling with the Grid: Intermittent Nature of Green Power is Challenge for Utilities,” The New York Times, 15 Aug. 2013, pp. B1, B6; see also O. Zehner, “Wind Power’s Flurry of Limitations,” in Green Illusions, 31-60.

[18] Cardwell, “Grappling with the Grid: Intermittent Nature of Green Power is Challenge for Utilities,” The New York Times, 15 Aug. 2013, pp. B1, B6.

[19] K. French, “‘Never Stops, Never Stops. Headache. Help.’: Some People Living in the Shadows of Wind Turbines Say They’re Making Them Sick. Almost As Upsetting: Their Neighbors Don’t Feel a Thing,” New York Magazine, 23 Sept. 2013, p. 28.

[20] T. Beardsley, “Biofuels Reassessed,” BioScience 62 (2012): 855; see also S. Raghu et al., “Adding Biofuels to the Invasive Species Fire,” Science 313 (2006): 293.

[21] Beardsley, “Biofuels Reassessed,” BioScience 62 (2012).

[22] Smil, “Global Energy: The Latest Infatuations,” American Scientist 99, no. 3 (2011): 212-19.

[23] J. Major, “1981 Climate Change Predictions Were Eerily Accurate,” io9 (16 Aug. 2012).

[24] J. Hansen, “Game Over for the Climate,” The New York Times, 9 May 2012.

[25] See S. Battersby, “Cool It: From Sunshades to Making the Seas Bloom, There Are Plenty of Ideas About How to Stop the Planet Warming. But Will Any of Them Work?” New Scientist 215, no. 2883 (22 Sept. 2012): 31-55; J. Winston, “Geoengineering Could Backfire, Make Climate Change Worse,” Wired UK, 16 July 2012; C. Hamilton, “Geoengineering: Our Last Hope, Or a False Promise?” The New York Times, 27 May 2013.

[26] V. Havel, “Our Moral Footprint: The Earth Will Survive—But Will We?” The New York Times, 27 September 2007, p. A33.

[27] O. H. Pilkey and L. Pilkey-Jarvis, Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future (New York: Columbia University Press, 2007).

[28] Pilkey and Pilkey-Jarvis, Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future, 101.

[29] Pilkey and Pilkey-Jarvis, Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future, 107.

[30] C. Perrow, Normal Accidents: Living With High-Risk Technologies (Princeton, NJ: Princeton University Press, 1999); see also D. Ehrenfeld, “When Risk Assessment is Risky: Predicting the Effects of Technology” in The Energy Reader: Overdevelopment and the Delusion of Endless Growth, ed. T. Butler, D. Lerch, and G. Wuerthner (Sausalito, CA: Foundation for Deep Ecology in collaboration with Watershed Media and Post Carbon Institute, 2012), 77-83.

[31] C. Perrow, Normal Accidents: Living With High-Risk Technologies (Princeton, NJ: Princeton University Press, 1999), p. 28.

[32] J. Tainter and T. Patzek, Drilling Down: The Gulf Oil Debacle and Our Energy Dilemma (New York: Springer, 2012), pp. 7-8.

[33] Tainter and Patzek, Drilling Down: The Gulf Oil Debacle and Our Energy Dilemma.

[34] C. Sullivan and ClimateWire, “Human Population Growth Creeps Back Up,” Scientific American (June 14, 2013).

[35] Convention on International Trade in Endangered Species of Wild Fauna and Flora (accessed Sept. 12, 2013).

[36] E. M. Forster, “The Machine Stops” (1909) in The Collected Tales of E. M. Forster (New York: Modern Library, 1968), 144-97.

[37] W. Berry, The Unsettling of America (San Francisco: Sierra Club Books, 1977), 56.

[38] D. Ehrenfeld, The Arrogance of Humanism (New York: Oxford University Press, 1981), 211, 228-29.
