Public opinion about the environment is often marked by unwarranted pessimism about the state of our air, water, and natural resources. But the most recent government data show that America in general, and Michigan in particular, have seen impressive gains in environmental quality since the first Earth Day 30 years ago. This report presents decades of facts and figures on Michigan and U.S. air and water quality, land use, and other environmental factors to show that, far from worsening, environmental conditions have actually improved substantially, and are likely to continue improving.
April 22, 2000, the 30th anniversary of the first Earth Day, provides us with a good opportunity to take a step back and assess what strides we have made toward a cleaner and better environment in which to live, work, and play.
If you asked most Americans, they would say that we are losing ground, that the air we breathe is dirtier and the water we drink more polluted than ever. Polls, in fact, consistently find majorities who believe that environmental quality in the U.S. is declining. But in this case, perception does not match reality.
This report presents decades of government facts and figures on Michigan and U.S. air and water quality, land use, and other environmental factors to show that, far from worsening, environmental conditions have actually improved substantially, and are likely to continue improving.
Air quality, for example, has improved dramatically over the past generation. The U.S. Environmental Protection Agency reports that nationwide, ambient levels of all six pollutants thought to adversely affect outdoor air quality have declined significantly since the 1970s. Between 1976 and 1997, ambient levels of ozone, the major contributor to urban smog, decreased 30.9 percent. Sulfur dioxide levels, the primary component of acid rain, decreased 66.7 percent, while nitrogen oxides decreased 37.9 percent, carbon monoxide decreased 66.4 percent, and lead decreased a dramatic 97.3 percent. Particulate matter, commonly known as dust and soot, decreased 25.5 percent since 1988, the first year for which particulate data are available. Michigan cities monitored by the EPA are below the health-based thresholds set by the Clean Air Act for these pollutants and are experiencing downward trends. Most Michigan cities not only meet the national standard, but are below the national average.
National water quality shows similar improvement. Thanks to investments in wastewater treatment facilities, by 1992 all sewage generated in the United States was treated before discharge. Largely as a result, discharges of toxic organic chemicals have declined 99 percent since 1970, and discharges of toxic metals have declined 98 percent.
There also have been large improvements in water quality and wildlife health in and around the Great Lakes over the past 30 years. Today it is once again possible to fish in the Great Lakes, and even to drink their water in most locations. In fact, the environmental challenge facing the Lakes today no longer comes mainly from industrial pollution or toxics, but from biological threats: Some 145 non-native or "exotic" species now found in the Great Lakes are crowding out indigenous species and their habitat.
Unfortunately, monitoring water quality is much more difficult and costly than monitoring air quality, and the measures currently used are seriously defective. Nevertheless, Michigan has a superior record in monitoring water quality and has impressive results to report. While all 50 states taken together assessed only 17 percent of their rivers and streams in the 1996 National Water Quality Inventory, Michigan assessed 40 percent and found 93 percent of assessed rivers, streams, and lakes to be "fully supporting," which means they are safe for swimming and fishing.
Natural resources, including forests and wetlands, are making a comeback as well. U.S. forests now cover nearly 30 percent of the nation's total land area, and have remained stable for most of this century. Each year the United States plants more trees than it harvests and has done so since 1950. A full two-thirds of the deforestation experienced in North America took place between 1850 and 1910, and there is about three times more forestland in North America today than there was in 1920. In Michigan, roughly 44 percent of the state is covered in forest, while only 10 percent of the state's land area is considered "developed."
The rate of U.S. wetlands conversion also has decreased dramatically. For every 60 acres of wetlands converted to cropland annually from 1954 to 1974, only 3 acres were converted annually from 1982 to 1992. Since 1980, the United States has experienced no net loss of wetlands.
What accounts for these and other environmental gains? The seemingly obvious conclusion is to give all credit to such regulations as the 1970 Clean Air Act and Clean Water Act. A longer-term look, however, reveals a more complicated picture. For example, although the data for air pollution are not well quantified prior to 1970, studies indicate that air quality was improving rapidly before the passage of the Clean Air Act.
Why? While government regulations undoubtedly play a role, research suggests that the "wealth effect" of a growing economy is the key to an improved environment. As the Michigan and U.S. economies grow, so does their ability to control pollution and protect resources. Economic growth also means improved technology and, therefore, more efficient uses of raw materials and natural resources. Data suggest that it is this growth, combined with an increasing public demand for a clean environment, that has driven many environmental improvements over the past 30 years.
For this reason, environmentalists should not regard economic concerns as a hindrance to effective policy, but should embrace economic growth as the key to further environmental improvements. Moreover, if Americans want the improvement of the past generation to continue, they should look to innovative new policies that incorporate and promote economic growth. Such policies not only best address today's environmental situation, but provide the most promising future for tomorrow's environment as well.
"Earth Day may be a turning point in American history. It may be the birth of a new American ethic that rejects the frontier philosophy that the continent was put here for our plunder."
Senator Gaylord Nelson (D-Wisconsin), April 22, 1970
"The bulldozer mentality of the past is a luxury we can no longer afford. Our roads and other public projects must be planned to prevent the destruction of scenic resources and to avoid needlessly upsetting the ecological balance."
California Governor Ronald Reagan, February 1970
Long before this year's 30th anniversary of Earth Day on April 22, 2000, it was evident that environmentalism had taken its place as one of the pre-eminent social movements in American public life, comparable in its impact to the movements for abolitionism, temperance, women's suffrage, and civil rights. The late Robert Nisbet, distinguished professor of sociology at Columbia University, predicted 20 years ago that, "It is entirely possible that when the history of the twentieth century is finally written, the single most important social movement of the period will be judged to be environmentalism."
Compared with other prominent social movements, environmentalism may be regarded as the most rapidly successful in American history. Whereas the civil rights movement toiled for decades to achieve its principal political and legal goals, culminating in the Civil Rights Act of 1964, the principal political and legal goals of environmentalism, such as the Clean Air and Clean Water Acts, as well as the establishment of the Environmental Protection Agency in 1970, were achieved before many environmental organizations were even founded. Also, there have been large and tangible improvements in several categories of environmental concern since the first Earth Day, in some cases beyond what was hoped for in 1970. But the environment was not a wholly new issue at the time it sprang to life in the late 1960s.
As far back as 1949, author Fairfield Osborn warned in Our Plundered Planet that environmental disaster loomed unless there was a "complete revolution in man's point of view toward the earth's resources and toward the methods he employs in drawing upon them." Silent Spring, Rachel Carson's 1962 warning about the pesticide threat to wildlife, was a sensation, and set in motion a train of events that soon led to the ban on the pesticide DDT in the United States.
President Lyndon Johnson championed several early environmental measures, some of which were substantive, such as clean air and pesticide regulation, and some of which were cosmetic, such as highway beautification. But the environment was not seen as a mass political issue that could capture and move the sentiment of the nation.
Neither Richard Nixon nor Hubert Humphrey talked about the environment in the 1968 presidential campaign, a time when "green power" still meant the Irish vote. Gallup didn't think the issue was worth polling until 1965, and the early polls generated ho-hum results. A Harris poll in the mid-1960s reported a majority against higher taxes and higher consumer prices to pay for environmental cleanup. Today polls consistently find large majorities willing to pay higher prices for a cleaner environment.
Gallup's 1965 poll found that only 28 percent considered air pollution to be a serious problem, while only 35 percent thought water pollution was a serious problem. By 1969, these numbers had risen to 69 and 74 percent. Yet there were still only two registered environmental lobbyists in Washington at the time. But after the Santa Barbara oil spill in January 1969, environmental episodes became big news.
In June, five months after Santa Barbara, a pile of logs, picnic benches, and other debris that had collected beneath a railroad trestle over the Cuyahoga River in Cleveland caught fire when sparks from a passing train ignited the kerosene and oil floating on top of the river. The fire burned for only 24 minutes, not long enough for the Cleveland Plain-Dealer to snap a photo. Hence it was reported briefly in the back pages of the paper, and didn't attract much attention until months later, when a National Geographic magazine article on river pollution gave the episode fresh attention nationwide.
The reaction to the Cuyahoga River fire is an excellent illustration of what economists call the "wealth effect," i.e., how the public demands higher environmental quality as society becomes more affluent. The Cuyahoga, which the mayor of Cleveland had described as an "open sewer" as far back as 1881, had caught fire twice before, in 1936 and 1952, but neither incident touched off fanfare or general outrage. Both were regarded as the price of progress, but by 1969 such a price was no longer acceptable.
The Affluent Society did not want to be the Effluent Society. While some environmentalists at the extreme fringe attack modern industrial society, it is rising wealth that has made environmentalism not only popular, but possible. "These wild things," Aldo Leopold reminds us in A Sand County Almanac, "had little human value until mechanization assured us of a good breakfast." As the 1970s began, the environment as a political issue was here to stay.
In 1970, Time magazine named the environment "Issue of the Year." Not to be outdone, its sister publication Life designated the 1970s as "the environmental decade." California Assembly Speaker Jesse Unruh, the originator of the phrase "Money is the mother's milk of politics," offered a corollary: "Ecology has become a substitute for the word 'mother.' "
On April 22, 1970, the first "Earth Day" was held. Wisconsin Senator Gaylord Nelson, one of the prime forces behind the event, proclaimed, "Earth Day may be a turning point in American history. It may be the birth of a new American ethic that rejects the frontier philosophy that the continent was put here for our plunder."
That year, the Readers' Guide to Periodical Literature entries for the environment and related subjects took up less than a page and a half. The following year, 1971, the entries required five pages, signaling the growth curve toward national prominence.
The coming of environmentalism marked more than just a turning point in domestic politics; it also marked a decisive turning point in the nature of government regulation, with far-reaching constitutional implications. Hitherto the object of government regulation was to ensure fairness: fairness to competitors and potential competitors, and fairness to consumers. The regulatory goal of fairness was clearly a derivative of the idea of equal rights and equal opportunity that is central to American political life. The new regulation starting in the 1970s was more ambitious and social in nature, and a marked departure from the old kind of regulation.
President Nixon's creation of the Environmental Protection Agency (EPA), which was cobbled together from parts of other federal agencies by executive order, marked the beginning of a regulatory revolution. No other federal agency has ever been created in this fashion. Nixon had first wanted an act of Congress to create a cabinet-level Department of Environment and Natural Resources, but abandoned the plan because of congressional opposition. Others in the Nixon White House thought the new EPA should be more like the National Institutes of Health, conducting research and recommending environmental standards and strategies for Congress and the states to incorporate into legislation. But Nixon wanted the EPA to be an operating and enforcement-oriented agency. Ironically, the EPA that emerged is a more powerful agency than a cabinet-level department, because unlike a cabinet department that is more closely under the authority of the White House, the EPA enjoys more independence as an administrative agency.
The most striking aspect of the EPA and other regulatory agencies created contemporaneously, such as the Consumer Product Safety Commission (CPSC) and the Occupational Safety and Health Administration (OSHA), is that they were the first agencies with a mandate to range widely throughout the economy and set their own policy strategy largely without the deliberation or input of Congress. The CPSC, for example, originally considered regulating women's high-heel shoes, along with swimming pools, tricycles, and artificial turf. The CPSC's first chairman, Richard Simpson, remarked that, "It's possible to make this a complete witch-hunt . . . . I suppose if there were enough complaints about the hazards of wearing maxicoats, we could even regulate length."
Throughout the previous decades of the twentieth century, government regulatory agencies had always been highly specific to narrow aims and usually a single industry; the Securities and Exchange Commission and the Food and Drug Administration are good examples of the old kind of regulation. The one prior exception to single-industry regulation, the Federal Trade Commission, created in 1914, is the exception that proves the rule. Its undefined, wide-ranging scope caused Congress and the executive branch to curb its power within a few years of its creation precisely because of its unaccountability.
Not only was the EPA created without a specific statutory basis, but the Nixon reorganization plan that created the agency left the term "environment" undefined. Nixon thought these matters should be left for the EPA's administrator to decide. The trouble with leaving "the environment" undefined is that it leaves ambiguous mankind's place in it. Are humans part of "the environment," and therefore the object for whom the agency was created? In other words, is the EPA just a glorified public health agency? Or is "the environment" a transcendent category, in which case the EPA may regulate in the interests of nature herself? The lack of congressional deliberation and a specific statutory mission has made the EPA a battleground for the competing views of what "the environment" comprises.
For example, in 1970, some members of Congress wondered whether the EPA might claim jurisdiction over population control. A Nixon-appointed commission had, after all, doubted whether rising population was of any future benefit to the United States. Ironically, Sen. Edmund Muskie (D-Maine) had proposed a bill in Congress that would have created an "Environmental Quality Administration." Muskie's bill would have demanded from Congress a detailed statement of the agency's goals and purposes, and would have included a congressional intent to balance environmental protection with economic growth, a proviso without which Muskie judged that the proposal would fail.
Muskie's "EQA" would have been a more circumspect agency than Nixon's EPA. But Muskie's bill never even earned a committee hearing. A similar proposal in 1967 had been opposed by Sen. Robert F. Kennedy. Quite aside from the ideological battles that would swirl around the EPA and other new quasi-independent, economy-wide regulatory agencies, there was the question of how this new kind of regulation changed the relations of citizens to their national government.
Under the new regulation, administrative questions that had always been strictly local in character were centralized in the federal bureaucracy. The EPA's earliest mission was essentially a public-works program to build new sewers and water treatment plants throughout the nation. Hardly a single yard of sewer line was subsequently laid without the direct involvement and supervision of the EPA. The centralization of previously local decisions had a necessarily degrading effect on state and local government and contributed to the rising sense of remoteness from government that has increasingly plagued American public life over the last generation.
All of this happened largely without any kind of fundamental, substantive public debate about the principles of the new regulation. As this regulation was extended further and further during the 1970s, a backlash would grow into a potent force. In many ways, the regulatory revolution was President Nixon's most profound legacy.
"Probably more new regulation was imposed on the economy during the Nixon administration than in any other presidency since the New Deal," wrote Herbert Stein, who served on Nixon's Council of Economic Advisers.1
Stein never bothered to count up the new agencies, or he would have struck the "probably" from the front of his sentence. Between 1970 and 1974, eight new independent regulatory agencies and eight new agencies within the executive branch were created. In addition, 13 existing independent agencies, and 22 executive branch agencies, were substantially reformed and strengthened during these years. It represented a vast expansion and centralization of government power, penetrating local and remote reaches of the private economy.
These numbers represent nearly three-quarters of the regulatory apparatus of the federal government. The significance of this expansion was not obvious then, nor is it now, because of the murkiness surrounding the constitutional status of the new economy-wide "social" regulation, and because of the abstract nature of the separation-of-powers questions this kind of activity raises.
In the last few years, federal courts have begun to reassert the "non-delegation" doctrine, which stipulates that Congress cannot delegate its legislative functions to executive branch or other agencies, in a few areas of regulatory rule-making. The U.S. Court of Appeals for the D.C. Circuit has held up the EPA's latest clean air rules for ozone on these grounds (American Trucking Associations v. EPA). There has also been some sentiment to require Congress to vote on all regulations, so that elected officials, rather than appointed, and often anonymous, regulators, would be publicly accountable for the rules that government agencies impose. Either reform would restore a measure of responsibility and accountability to the entire range of federal regulatory activities.2
The goal of the environmental movement, however, has always been broader than mere legislation. Going back at least to the kind of thoughts expressed in Aldo Leopold's Sand County Almanac, environmentalism has sought to change public attitudes about man's place in nature. Like other serious social movements, environmentalism aims to found new modes of thinking and new orders of social life to bring about a different world. Its success in doing so can be measured by the simple example of trash. In the late 1960s, the public message was "stop littering." By the 1990s, the ubiquitous public message became "recycle."
To raise the example of recycling, however, is to evoke the most searching questions about the nature and extent of social transformation that the environmental ethic is advancing. While recycling can be viewed as a sign of great triumph (people who once threw bottles and paper out their car windows now take great care to recycle them), some environmentalists view recycling as a sign of failure, because the "culture of consumption" has remained fundamentally unchanged. And for some kinds of environmental thought, nothing short of a wholesale transformation of the economic and social systems of the world will suffice.
It is not only production processes that must be changed, but democratic political institutions. Martin Lewis, professor of environmental studies at Duke University, describes the "central proposition" of radical environmentalism as the view "that human society, as it is now constituted, is utterly unsustainable and must be reconstructed according to an entirely different socioeconomic logic . . . . Most importantly, eco-radicals inform us that economic growth must simply come to an end."3
The kind of thinking that looks beyond real problems to the need for revolutionary, "holistic" new social structures represents not environmentalism, but utopianism. This kind of utopianism is the greatest hindrance to serious environmentalism because it breeds an unrealistic, if not erroneous, understanding of how the world works and an intolerance that paves the way for political coercion.
Some environmentalists have been open and explicit about their support for greater political control of people. The Ecologist magazine's Blueprint for Survival, a 1972 manifesto about a simpler, decentralized form of social organization, forthrightly declares that "great restraint" among the people is required to make the "long transition" to this better world: "Legislation and the operations of police forces and the courts will be necessary to reinforce this restraint."4 (Emphasis added.) In Ecology and Socialism, British author Martin Ryle wrote, "If one is honest, however, about the objectives which an ecologically enlightened society would set for itself, it is difficult to avoid concluding that the state, as the agent of the collective will, would have to take an active law-making and enforcing role in imposing a range of environmental and resource constraints."5
To be sure, most of the prominent environmental organizations in the United States do not advocate this kind of thoroughgoing utopianism. Yet if so-called "mainstream" environmentalism eschews coercive utopianism and revolutionary social intentions, it nonetheless tends to be radically disaffected with the way the world works and harbors a latent romanticism about the ideal form of socio-economic organization for humankind. At its heart, even so-called "mainstream" environmentalism shares with radicals the fundamental doubt that economic growth and technological advance are good things. This, in turn, leads the environmental movement to be pessimistic in its outlook. Author Mark Dowie, for example, argues that environmentalists "have been unable to produce a significant improvement in the country's environmental health," even though the facts say otherwise.6
The fundamental doubt about progress and economic growth causes environmentalism to reject markets and economic ways of thinking. David Brower, founder of Friends of the Earth and other green activist groups, has said that "economics is a form of brain damage," a remark that makes rational discussion difficult. Jeremy Seabrook, author of The Myth of the Market: Promises and Illusions, wrote that, "If it had been the purpose of human activity on earth to bring the planet to the edge of ruin, no more efficient mechanism could have been invented than the market economy."7 Seabrook's comment is remarkable in light of the much more severe environmental devastation found in the socialist command economies of Russia and eastern Europe. Environmental improvement did not begin until these nations embraced market economies after 1990.
Professor Martin Lewis, who considers himself a left-leaning environmentalist, provides a note of realism about this anti-market viewpoint: "[I]n seeking to dismantle modern civilization [the environmental movement] has the potential to destroy the very foundations on which a new and ecologically sane economic order must be built." The view that economic growth and technological progress are bad things, Lewis believes, "should be more deeply challenged as a threat to nature itself . . . . 'Primal' economies have rarely been as harmonized with nature as they are depicted; many have actually been highly destructive." To the contrary, Lewis concludes, "capitalism, despite its social flaws, presents the only economic system resilient and efficient enough to see the development of a more benign human presence on the earth."8
The pessimism that often accompanies environmentalism is ill-suited for both the naturally optimistic American character and the realities of the modern world, where economic growth and progress are the hope, and not the threat, of the future. The lesson of the past century has been that environmental progress depends on economic and technological progress, which are best produced by dynamic markets. Environmental progress in the twenty-first century will build upon this foundation.
Michigan is the result of a slow-motion geological cataclysm, having been formed two million years ago when a glacier swept down from the north, carving out the Great Lakes on three sides of the state, and grinding flat the topography of the Middle West. Michigan has the largest amount of waterfront of any sovereign entity in the world not located on an ocean: it has 3,250 miles of shoreline along four of the five Great Lakes, accounting for nearly 60 percent of the total Great Lakes shoreline. Michigan also boasts the world's largest living object, a nearly 40-acre fungus growing under a forest floor in the Upper Peninsula that scientists think could be 1,500 years old.
Michigan is also America's paradigmatic industrial state. Michigan has been the scene of large-scale man-made transformations of the natural environment as well as the engine of the American economy. Lumber was the first big industry in Michigan in the nineteenth century, with thousands of acres clear-cut for timber and to make way for agriculture. Then in 1881, a wildfire burned half of the "thumb" area near Saginaw Bay. Late in the century large copper deposits were discovered on the Keweenaw Peninsula. In the twentieth century, Michigan also became a center for chemicals and machine tools.
But Michigan is best known for the auto industry. Detroit is one of those rare cities whose very name symbolizes a major industry (automobiles) in the same way "Hollywood" denotes the entertainment industry. Some environmentalists regard the automobile as public enemy number one.9 Yet the auto industry has been at the forefront of efforts to apply technological innovation to the problem of reducing air pollution. New automobiles today emit less than five percent as much pollution as they did at the time of the first Earth Day in 1970, a fact that accounts for a large portion of the improvement in air quality. To be sure, the federal government mandated much of this innovation, with the auto industry protesting that rapid breakthroughs might not be feasible or affordable. More recently, however, there are signs that the auto industry is beginning to look farther ahead than legislation can contemplate. Experiments with fuel cells and other cutting-edge technology suggest that zero- or near-zero-emission autos are within the foreseeable future. In the meantime, Volvo has introduced a car equipped with a radiator covered in a specialty coating that purportedly converts ozone smog into oxygen at rates that exceed the ozone-forming chemicals emitted by the car. In other words, driving this car will actually clean up the air. The specialty coating is currently expensive and not yet affordable for the mass market, but it is an example of the kind of innovation that industry is now developing on its own.
This long history makes Michigan an ideal laboratory to see how the modern economy has transformed humanity's relationship with the environment. What is especially notable about Michigan's progress over the last 25 years is that it has been able to achieve significant improvements in its environment while suffering through a wholesale restructuring of its economy amidst energy shocks and recession. Twenty years ago employment in the auto industry started plummeting, and Michigan was viewed as the capital of the "rust belt." It is at such times that environmental progress is thought to be at risk, because a shrinking economy cannot afford the expense of regulation. Yet Michigan proved to be highly adaptive. In the 1990s, Michigan's unemployment rate was below the national average, even though auto industry employment is little more than half what it was 25 years ago. And, as this report discusses, Michigan has enjoyed considerable environmental improvement.
Air quality in the United States has improved dramatically over the past 30 years as a result of the combined efforts of industry, individuals, and government. Approximately half of the U.S. population, however, still believes that air quality is deteriorating. According to a survey by the Foundation for Clean Air Progress, roughly 50 percent believe that air quality is worse than it was 10 years ago, 42 percent believe it is better, and 8 percent believe it is the same.10
Air Quality: Emissions and Ambient Levels
Air quality regulations target six "criteria" pollutants: ozone (O3), sulfur dioxide (SO2), nitrogen dioxide (NO2), particulate matter (PM), lead (Pb), and carbon monoxide (CO). Emissions of volatile organic compounds (VOCs) are also tracked because they react in sunlight with nitrogen oxides to form ozone.
Air quality trends are measured in two ways: emissions and ambient levels. Emissions estimates gauge the amount of material that comes out of a smokestack, automobile tailpipe, or other source, and are typically expressed in pounds or tons. Ambient levels refer to the actual concentration of a pollutant in the air, and are measured at more than 600 U.S. sampling stations in parts per million or parts per billion.
While both emissions and ambient levels show significant decline over the past quarter-century, this report focuses on ambient levels. These measure the real exposure to pollution, from which health experts and scientists can discern the actual threats posed to human health, and also quantify potential environmental hazards.
Ambient concentrations depend not only on the amount of man-made emissions, but also on many meteorological factors such as temperature, sunlight, air pressure, humidity, wind, and rain. For example, hot summers, such as those of 1983 and 1988, produced higher ozone levels, while cool summers (1992 brought the second-coolest summer in the United States in the last 100 years) produced lower air pollution levels. The EPA is currently trying to develop models that will adjust for meteorological conditions to permit better trend analysis.
It should be noted that some air pollution, especially ozone-forming hydrocarbons and particulates, is naturally generated in substantial amounts by trees and other vegetation. These issues are discussed further in the analyses of the individual pollutants.
This report presents the latest national annual trend data from the Environmental Protection Agency (EPA); the EPA's air quality data for 1998 were not available at the time of publication. The trend data for 1976 through 1997 can be found in the Pacific Research Institute's 1999 edition of the Index of Leading Environmental Indicators, available at www.pacificresearch.org.
As Table 1 shows, there have been major decreases in ambient levels of air pollution since 1976. The EPA notes, "Since 1970, total U.S. population increased 31 percent, vehicle miles traveled increased 127 percent, and the gross domestic product (GDP) increased 114 percent. During that same period, notable reductions in air quality concentrations and emissions took place."11
Aggregate emissions decreased by 31 percent between 1970 and 1997.12 Chart 1 illustrates these trends.
Air quality is one of the great Michigan success stories. Michigan cities monitored by the EPA are below the health-based thresholds set by the Clean Air Act for all six "criteria" pollutants (lead, carbon monoxide, ozone, particulates, sulfur dioxide, and nitrogen dioxide, described beginning on the next page) and are experiencing downward trends. Most Michigan cities not only meet the national standard, but are below the national average. The one exception to this is Detroit, which is slightly above the national average for particulates and sulfur dioxide. On the other hand, the Foundation for Clean Air Progress in Washington, D.C. lists Detroit as one of the nation's 10 best cities in terms of ozone reductions over the last decade, which is especially notable because ozone is the most stubborn pollutant to control.13
Charts 2-7 display trend data for ambient air pollution levels in Michigan cities, since it is ambient levels, rather than emissions, that are the important factor in determining actual exposure and health risk to humans. These charts include measurements for each city the EPA monitors; the EPA does not monitor all six criteria pollutants in each location.
Lead is a soft, dense, bluish-gray metal used in piping, batteries, weights, gunshot, and crystal. Of the six criteria pollutants, lead is the most toxic. When ingested through food, water, soil, dust, or inhaled through the air, it accumulates in the body's tissues and is not readily excreted. Excessive exposure to lead can cause anemia, kidney disease, reproductive disorders, and neurological impairments such as seizures, mental retardation, and behavioral disorders.14
The highest concentrations of lead are found in areas surrounding ferrous and nonferrous smelters, battery manufacturers, and other stationary sources of lead emissions. The decline in ambient lead concentration is the greatest success story in the effort to reduce air pollution.
Ambient lead concentrations in the United States decreased 97 percent between 1977 and 1996. Most of this reduction was achieved through the introduction of unleaded gasoline and the elimination of lead compounds from paints, coatings, and point sources such as smelters and battery plants.
Young children are the most vulnerable to lead exposure; high blood-lead levels retard brain and IQ development in small children. Children who live in older housing that has lead-based paint are still at risk for high blood-lead levels, but the pervasive threat of lead from poor urban air is a problem of the past. Lead in blood samples is a much better indicator of the public health impact of lead than outdoor air quality. Between the late 1970s and 1991, the proportion of people with more than 10 micrograms of lead per deciliter of blood declined from 78 percent to 4.3 percent.15
When fuel and other substances containing carbon burn without sufficient oxygen, they produce carbon monoxide (CO), a colorless, odorless, and, at high levels, poisonous gas. Although trace amounts of CO occur naturally in the atmosphere, transportation sources account for 79 percent of the nation's total emissions. In cities, automobile exhaust may be responsible for as much as 95 percent of all CO emissions. Industrial processes, non-transportation fuel combustion, and natural sources such as wildfires are other sources of CO emissions.
Average ambient CO concentrations fell seven percent in 1997, and have declined 66 percent since 1976. Ambient CO levels have met the EPA's target "good" range since 1993. It is noteworthy that these reductions occurred at sites across all monitoring environments (urban, suburban, and rural) despite a 100-percent increase in vehicle miles traveled (VMT).
Ground-level ozone is the primary contributor to urban smog, although sulfur, nitrogen, carbon, and fine particulate matter contribute to smog's formation as well. Ozone is not emitted directly into the air but forms when volatile organic compounds (VOCs) combine in sunlight with NO2, dependent upon weather-related factors. This makes it difficult to accurately predict changes in ozone levels due to reductions in VOCs and NO2. VOCs evaporate into the atmosphere from motor vehicles, chemical plants, refineries, factories, and consumer and commercial products such as lighter fluid and perfume. VOCs are also emitted naturally by trees and other vegetation.
Even though ozone is the most persistent air quality problem, ambient levels declined by 30 percent between 1976 and 1997. Trends in ambient ozone concentrations are influenced by various factors: year-to-year changes in meteorological conditions, population growth, VOC-to-nitrogen-oxide ratios, and fluctuations in emissions from ongoing control measures. Ozone problems occur most often on warm, clear, windless afternoons.
The December 1991 National Academy of Sciences report on ozone revealed that most of the variation in ozone comes from "natural fluctuations in the weather," not from "year-to-year changes in emissions." Therefore, it concludes that current ozone reduction strategies may be misguided because they do not account for naturally occurring VOCs.16
Particulate matter is the general term for a mixture of solid particles, including pieces of dust, soot, dirt, ash, smoke, and liquid droplets or vapor directly emitted into the air, where they are suspended for long periods of time. Particulates can affect breathing, damage paints, and reduce visibility. These particles derive from stationary, mobile, and natural sources. Such sources include forest fires and volcanic ash; emissions from power plants, motor vehicles, wood stoves, and waste incineration; and dust from mining, paved and unpaved roads, and wind erosion.
In 1987, the EPA changed its regulatory focus from total suspended particulates (TSPs) to PM-10, suspended particulates that are 10 micrometers or smaller. The change reflected the recognition that smaller particles are more likely to be inhaled deep into the lungs; PM-10 is therefore a better indicator of health impact than TSP. In 1997, the EPA revised the standards again and now also regulates particles 2.5 micrometers and smaller (PM-2.5). Ambient air quality data (which begin in 1957 for about 60 areas) indicate that urban air quality for PM, as measured by TSP, has been improving since 1957. From 1988 to 1997, air quality concentrations of PM-10 measured at monitoring sites across the country decreased 25 percent.
Sulfur dioxide (SO2) is a colorless gas that forms from the burning of fuel containing sulfur, mainly coal and oil, as well as from industrial and manufacturing processes, particularly the generation of electrical power. Environmental factors such as temperature inversion, wind speed, and wind direction also affect SO2 levels. Ambient levels of sulfur dioxide decreased 66 percent between 1976 and 1997, and the United States has met the EPA's designated "good" category since 1981.
Nitrogen oxides (NOX) form naturally when nitrogen and oxygen combine through bacterial action in soil, lightning, volcanic activity, and forest fires. Nitrogen oxides also result from human activities including high-temperature combustion of fossil fuels by automobiles, power plants, industry, and the use of home heaters and gas stoves. Environmental agencies particularly track the light brown gas nitrogen dioxide (NO2) because when it combines with volatile organic compounds (VOCs) in the presence of sunlight, it forms ground-level ozone.
The national average for ambient levels of nitrogen dioxide decreased by 38 percent from 1976 to 1997. Since 1992, all monitoring locations across the country have met the national NO2 air-quality standard.
While we await the EPA data for 1998, the 30th anniversary of Earth Day and the first Clean Air Act is a good time to review long-term factors in air quality improvement. In 1999, a major contribution to the scholarship on this subject appeared: Indur M. Goklany's Clearing the Air: The Real Story of the War on Air Pollution.17
Goklany's review of the subject shows that analyzing air quality improvements is more complicated than it seems. The apparent conclusion to draw from EPA data is that air quality improvements result from regulations such as the 1970 Clean Air Act. Goklany's detailed analysis, however, reveals a more complicated picture. Because comprehensive air quality monitoring did not begin until after the first Clean Air Act, our knowledge of the pre-1970 period is based on the monitoring samples available. However, the samples that do exist provide a consistent picture, and suggest an improving trend before 1970.

Local government efforts to control air quality problems began long before 1970. Chart 8 shows the growth in local air pollution control agencies that occurred with the growth in the public perception of air pollution as a major problem. Very early in the twentieth century, visible air pollution, especially smoke, was identified as a problem to be reduced. Less visible forms of pollution, especially ozone, were recognized more slowly.
The smoke problem was practically solved by the 1960s. In Pittsburgh, between 1946 and 1955, the hours of heavy smoke dropped by 96.6 percent from 298 to 10 hours, drastically improving atmospheric visibility. Nonetheless, state air pollution programs increased after 1960 and smoke restrictions became more stringent.18
Chart 9 shows the trend for settleable dust in the industrial city of Pittsburgh between 1925 and 1965. The rapid decline in the early years between 1925 and 1940 is attributable to the simple efficiency gains from industry upgrading its technology. The industrial drive for cost-saving efficiency also typically leads to cleaner technology. In 1938, dustfall records in Pittsburgh averaged 60.0 tons per month per square mile, but by 1955 the figure had declined to 48.9 tons.19
Although the pre-1970 data for air pollution are not as well quantified as the post-1970 data, studies indicate that air quality was improving rapidly before the passage of the 1970 Clean Air Act. For example, Paul Portney, an environmental economist with Resources for the Future, writes that it is "extremely difficult to isolate the effects of regulatory policies on air quality, as distinct from the effects of other potentially important factors," because "some measures of air quality were improving at an impressive rate before 1970."20
Historical data show that ambient levels of sulfur dioxide were reduced by almost 50 percent between 1964 and 1970 in New York City (see Chart 10).
Similar improvements occurred nationwide. Based on 21 urban monitors, the mean annual average of SO2 fell approximately 40 percent between 1962 and 1969. This set the stage for progress after the Clean Air Act; the national average dropped more than 60 percent between 1974 and 1997 (see Chart 11).21
Although there is little data on ambient carbon monoxide (CO) concentrations until the early 1970s, the available data from 1963-1968 suggest that CO may have begun improving in the mid-1960s, at least in urban areas. These data are from the federally operated six-city CAMP network, which includes Chicago, Cincinnati, Denver, Philadelphia, St. Louis, and Washington. Goklany notes, "The fact that declines apparently began before the Federal Motor Vehicle Control Program went into effect indicates that stationary source reductions played a role in the initial turnaround; those improvements then gathered momentum as an increasing number of vehicles became subject to federal tailpipe controls starting with the 1968 model year."22
The Organization for Economic Cooperation and Development (OECD) notes in a 1991 report on the U.S. environment that "emissions have also fallen in other OECD countries, sometimes by as much or proportionally more than in the United States," but with less stringent regulation.23 Although such findings seem counter-intuitive, they demonstrate the difficulty in calculating the effectiveness of the large investment the United States has made in environmental regulation, with air quality control now costing nearly $40 billion per year.
These analyses are not meant to imply that regulations have played no role in air quality improvements over the past 30 years. Rather, they indicate that the regulatory approach is neither the only nor the most effective way to address air quality problems. Improvements in the United States occurred before the enactment of regulations, and during the same time period other developed nations with less strict regulations saw improvements not only comparable but, in some cases, greater. As these developments suggest, regulations are not the sole cause of U.S. air quality improvements; rather, they are the mechanism that the United States chose to use in response to the public's demand for a cleaner environment.
The rising costs of regulations, coupled with the decreasing health benefits they deliver, indicate that the current regulatory approach may not be the best way to achieve further progress. Continued improvement is likely to come from technological breakthroughs, such as clean-fuel cars and upgrading industrial processes. Regulatory strategies might also be adapted to emphasize emission reductions on certain days of the year when meteorological conditions suggest that a high pollution day can be expected, similar to the manner in which electric utilities practice "load shedding" on days of peak electricity demand.
Air in homes, workplaces, and schools often has higher levels of pollution than outdoor air. Although governments and international bodies such as the World Health Organization define healthy air in terms of the air quality at a fixed point outdoors, indoor air quality, especially in the home, is a much better indicator of the effects of air pollution on public health. The average person spends approximately 93 percent of his or her time indoors, five percent in transit, and only two percent outdoors.24 Presently, air quality standards are specified for each pollutant in terms of its concentration in outdoor air or its mass in a fixed volume of outdoor air. No measurements of long-term indoor air quality are available for the home or elsewhere.
Traditional indoor air pollution sources include heating and cooking equipment that use fossil fuels and biofuels, environmental tobacco smoke, household cleaning solutions used or stored in the home, radon, lead, and biological pollutants such as bacteria, viruses, mold, dust mites, and animal dander. According to an EPA study, major stationary and mobile sources accounted for only two to 25 percent of personal exposure to volatile organic compounds and pesticides. Smoking, dry-cleaned clothes, and chloroform from heated water in the home, on the other hand, were found to be two to five times larger sources of exposure than outdoor emissions sources.25
Calculations for the United States indicate that one gram of indoor particulate matter emissions can have a greater effect on total exposure of the population than one kilogram (1000 grams) released by a power plant from a relatively high stack.26 The average "nonsmoking" household's in-home concentrations of PM-10 between 1940 and 1990 declined 91 percent, according to EPA estimates of residential fuel combustion emissions divided by the corresponding number of occupied housing units. Further, 99 percent of the improvements occurred before 1970, before the imposition of federal regulations on stationary sources.27
Air quality is the great success story of environmental protection in the United States. Improvements in both indoor and outdoor air quality began decades before federal legislation. Both technological change and affluence allowed households and industries to switch to cleaner fuels. This shows that people will voluntarily take action to improve their personal environment, even at a cost to themselves, with or without the government's intervention. On the whole, air quality trends are favorable almost everywhere.
The EPA calculates a composite measure of the criteria pollutants called the "Pollutant Standards Index" (PSI). The PSI is the tool local meteorologists use to give warnings about "unhealthful" air quality. The PSI ranges up to a value of 500; a PSI value of 100 is the threshold of unhealthful air. Chart 12 shows the improvement in air quality as measured by the PSI in southern California and the 94 other largest metropolitan areas in the nation. Between 1988 and 1997, the EPA notes, the total number of days with PSI values greater than 100 decreased 56 percent in southern California (which is tracked separately) and 66 percent in the remaining major cities in the United States. In other words, even in smoggy southern California people face unhealthful air less than half as often as they did a decade ago.
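The index itself is computed by linear interpolation: each measured concentration is mapped into an index band between fixed breakpoints, and the day's PSI is the highest such sub-index across the pollutants. A minimal sketch, using illustrative PM-10 breakpoints (24-hour average, micrograms per cubic meter); the exact breakpoint tables are the EPA's, and the function name is ours:

```python
# Sketch of the piecewise-linear interpolation behind the PSI.
# Each tuple is (conc_low, conc_high, index_low, index_high);
# the breakpoints below are illustrative PM-10 values.
PM10_BREAKPOINTS = [
    (0, 54, 0, 50),
    (55, 154, 51, 100),    # an index of 100 marks the unhealthful threshold
    (155, 254, 101, 150),
    (255, 354, 151, 200),
    (355, 424, 201, 300),
    (425, 604, 301, 500),
]

def psi(concentration: float) -> float:
    """Linearly interpolate the index within the matching breakpoint band."""
    for c_lo, c_hi, i_lo, i_hi in PM10_BREAKPOINTS:
        if c_lo <= concentration <= c_hi:
            return i_lo + (i_hi - i_lo) * (concentration - c_lo) / (c_hi - c_lo)
    raise ValueError("concentration outside index range")

print(psi(54))    # top of the "good" range
print(psi(154))   # the unhealthful threshold
```

In practice a sub-index like this is computed for each criteria pollutant monitored in a city, and the reported PSI is the maximum of them, which is why a single stubborn pollutant such as ozone can keep a city's PSI above 100.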
The EPA's National Water Quality Inventory (NWQI) assesses the conditions of rivers, lakes, and estuaries every two years. The NWQI is mandated by the 1972 Clean Water Act and is the primary means for informing Congress and the public about the nation's general water quality conditions.
We have, however, omitted the NWQI data from this report due to concerns about the quality of the data. Unlike air-quality data, which have been measured consistently, water-quality data lack the consistent measurement needed to evaluate progress over time. To be sure, monitoring water quality is much more difficult and costly than monitoring air quality, but the measures currently used are seriously defective.
In 1992, the EPA admitted that the NWQI data "cannot be used to determine trends in national water quality or to compare water quality among individual States."28 The U.S. Geological Survey has been similarly critical, warning in 1991 that inconsistencies and methodological problems with the data "preclude any analysis of trends."29 The most serious limitation of the NWQI is that only a portion of our water bodies are evaluated, and the same points are not evaluated yearly. For example, in the 1996 NWQI, only 19 percent of the nation's 3.6 million miles of rivers and streams were evaluated, and only 40 percent of the nation's 41 million acres of lakes, ponds, and reservoirs were sampled (see Chart 13). NWQI data for estuaries were more complete, with 72 percent of the 39,000 miles of estuaries sampled.
The NWQI is not conducted by the federal government, but is instead compiled from state agency reports. There is wide variation in how thoroughly different states assess their water quality. Waters that some states deem "impaired," for example, would be found "non-impaired" by other states using a different method of evaluation. And, as we have observed in previous editions of this report, the combination of scarce resources and the bureaucratic incentive to locate the worst water conditions may lead to an overstatement of water quality problems. But in the absence of a more systematic monitoring and evaluation system, it is impossible to know.
The defects of the current approach to water quality monitoring were explored in detail in a May 1999 report released by Public Employees for Environmental Responsibility (PEER). The report, Murky Waters: Official Water Quality Reports Are All Wet, was produced by anonymous employees within the U.S. Environmental Protection Agency (EPA), as well as by current and former employees within select state environment agencies. It provides an insider's account of how the EPA's system of monitoring and assessing water quality has broken down.
PEER charges that states have no incentive to deliver accurate reports or to achieve comparability whereby the water quality records of states can be meaningfully compared with each other or tracked consistently on a year-to-year basis. In order to make a valid assessment of water quality in each state, rivers and streams must be consistently counted and sampled every year. However, as Table 2 shows, the EPA and the states are not sure how many miles of rivers and streams they have.
For example, in 1990, Arizona claimed to have 17,537 stream miles. Two years later that number jumped to 150,000. In California, the same phenomenon can be observed. In 1990, California reported that it had 26,970 stream miles. By 1992, this figure jumped to 203,313 miles. The reason for these sudden jumps is that the EPA implemented a new method of determining total stream miles.
Other changes were implemented during the 27 years that states and the EPA were collecting data. In 1988, the EPA issued a guidance document suggesting that states could count hundreds of miles of rivers and streams as a single water body with a single assessment based on as little as one sample.30 PEER's report cites an example from Washington State, where one river was listed as being more than 3,900 miles long with no monitoring data ever associated with it. The EPA's guidance document caused the number of stream miles to increase substantially, but the number of rivers assessed went down by as much as 60 percent from 1982 to 1996.31
The most significant change the EPA made was to the 305(b) reports the states must submit to the EPA every two years. The change allowed for a new way to report water quality. The new category, called "evaluated waters," was added to the category of "monitored waters" on which states should already have been reporting. The new category made it appear as if states were assessing more streams and rivers. The EPA wrote to the states that "it is expected that states will strive to increase the number of waters that they assess by tapping new data sources, including 'evaluated' waters in their assessments...."32 The result was, once again, a large increase in assessed miles with a corresponding decrease in the quality of that assessment.
The United States has spent nearly $600 billion on water quality improvement since 1970, and doubtless this has achieved substantial reductions in water pollution, especially "point source" pollution such as wastewater treatment plants and effluents from industrial facilities. But without a consistent monitoring system, evaluating past policy and creating new strategies must be done behind a veil of ignorance. The problem of systematic monitoring and assessment is especially acute with "non-point" sources of water pollution (i.e., stormwater and agricultural runoff) that may be more serious than point sources. A more thorough and consistent method of water quality monitoring needs to be developed in order to guide policy and prevent taxpayers and private citizens from wasting billions of dollars on the wrong measures.
While we await innovations in monitoring, new directions in water quality improvement are being pioneered by cooperative efforts at the state and local level. Several case studies suggest that the typical method of regulating point sources is obsolete, and that breakthroughs in water quality improvement will involve creative, voluntary measures crafted at the local level. These case studies show how businesses, farmers, environmental groups, scientists, and concerned citizens have joined together to assess and improve water quality on their own. We will examine two below.
At the end of the 1980s, one of North Carolina's largest bodies of water was in dire need of help. From May 1991 to July 1992, at least nine major fish kills had occurred in the Pamlico Sound. The cause, scientists discovered, was countless "phantom" microbes, a form of swimming algae, that burst forth from a state of suspended animation, swam up into the water, and released a poison that killed millions of fish before becoming inert again a few hours later.
Both the Tar River and the Pamlico Sound were virtually dead. In September of 1989, the North Carolina Department of Environmental Management declared the water body to be "nutrient sensitive" due to the unhealthy amounts of phosphorus and nitrogen draining from the land surrounding the Pamlico Sound. Because of the declaration, the state was required to come up with a plan to improve water quality by reducing point-source emissions. Local governments were facing extreme financial hardship if they alone were required to reduce point sources sufficiently to restore the water quality of the Pamlico-Tar basin.
The citizens of the area petitioned the state to try a different strategy. Instead of relying on a study and plan of action by the state bureaucracy, they gathered people from around the region who had a stake in the water quality of the river. Their plan was to evaluate the effects of all pollution sources. It was the first time a group of environmentalists, industrialists, and scientists joined together with the state to work out a solution. Calling themselves the Pamlico-Tar River Foundation, they committed themselves to reducing the amount of pollution in the water and finding agreeable solutions for those whose livelihood depends on the Tar River and the Pamlico Sound.
The project was broken up into two phases. The first phase gathered some much-needed information. It had been previously thought that the contamination of the rivers was from water treatment plants. However, upon testing for point-source pollution, the foundation learned that only 15 percent of the pollution came from point sources, while the remaining 85 percent was coming from non-point sources.33 Furthermore, there were no benchmark data available to tell the members of the foundation what levels of point and non-point source pollution the waterways could handle safely or how much of the pollution was naturally occurring.
The decision was made to find ways of reducing the point and non-point-source pollution simultaneously. The water treatment plants were organized into one association. They were now going to deal with their pollution as a single entity, instead of facility by facility and town by town. By doing this, the plants could look at the average contamination of all the plants, instead of focusing solely on their own output. This maximized the overall reduction in pollution. Their goal was to reduce point-source contamination by 35 percent over the next five years. However, the plan was so successful that there was an 80-percent reduction in just one year.34
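The arrangement the treatment plants adopted is essentially a pollution "bubble": compliance is judged on the pooled discharge of the whole association against a collective cap, rather than plant by plant. A minimal sketch with invented plant names and figures (nothing below comes from the Tar-Pamlico records):

```python
# Hypothetical sketch of pooled "bubble" compliance: the association's
# combined discharge is compared against a single collective cap instead
# of holding each plant to an individual limit.  All figures invented.
plants = {"Greenville": 120.0, "Washington": 80.0, "Tarboro": 40.0}  # lbs/day
individual_cap = 90.0                 # per-plant limit: one plant alone fails
collective_cap = individual_cap * len(plants)  # same total allowance, pooled

fails_alone = [name for name, load in plants.items() if load > individual_cap]
pooled_total = sum(plants.values())

print(fails_alone)                     # the plant that would fail on its own
print(pooled_total <= collective_cap)  # the association complies as a group
```

The design choice matters economically: a plant with cheap abatement can over-comply to cover a plant where reductions are expensive, so the group hits the same total at lower cost, which helps explain why the 35-percent goal was exceeded so quickly.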
The next problem was trying to help the farmers who were contributing to the non-point-source pollution. The water treatment association decided that instead of having to expand their treatment facilities at a cost of millions of dollars for little nutrient reduction, the plants would pay farmers to adopt Best Management Practices (BMPs) to reduce their runoff. The association donated $850,000 to a pilot program run by the North Carolina Division of Soil and Water Conservation designed to help farmers install better irrigation systems and storage facilities for their water.
Phase One was completed in December 1994. Phase Two began in January 1995 and set specific yearly reduction targets for contaminants in the water over the next nine years. Also in 1995, new standards were implemented on confined animal farms. New farms could not operate without proper irrigation practices and nutrient-rich-water holding facilities. Existing farmers had until 1997 to comply with the new regulations.
So far the Pamlico-Tar River Foundation's efforts have been successful, although the effects of Hurricane Floyd have set progress back slightly. Still, there is much cause for optimism. Since the project's inception, the average amount of water flow from wastewater treatment plants has increased by 34 percent, while at the same time the amount of nitrogen has decreased by 34 percent and the amount of phosphorus has decreased by 57 percent.
The Pamlico-Tar River Foundation is an example of how human ingenuity was able to successfully reduce pollution in a waterway. It is being used as a model for other communities facing similar situations. According to Bruce Yandle, a professor of economics at Clemson University, the Pamlico-Tar River Foundation solution worked because it was in the best interests of everyone involved. Says Yandle: "Tar-Pamlico introduced a set of economic incentives that made pollution control pay and everyone gained, including the river."35
The mighty Mississippi starts as a small stream in Lake Itasca, Minnesota, and flows some 2,350 miles to the Gulf of Mexico. Even before Mark Twain romanticized the Mississippi, it was the nation's principal interstate waterway, and it continued to be a major transportation artery even after the coming of the railroad and interstate highways.
However, the towns and businesses that grew up along its banks have contributed to the water pollution that has plagued the river's health. For the millions of residents who live by its banks, the Mississippi supplies water for drinking, irrigation, power plant cooling, and recreation. It is said that by the time the water of the Mississippi reaches the Gulf of Mexico, it has passed through seven different people.
The Mississippi River supports a diverse array of fish and wildlife that live in its channels, backwaters, and wetlands. According to the U.S. Geological Survey (USGS), the Mississippi Flyway is the migration corridor for 40 percent of North America's waterfowl and shorebirds. A 40-mile reach of the Upper Mississippi River has been characterized as the single most important inland area for migrating diving ducks in the United States. And 154 species of fish and 50 species of freshwater mussels have been recorded in the river system.36
Downstream states bear the brunt of water-quality problems that accumulate as the river makes its course south. High rates of nutrient loading upstream have contributed to the development of a 7,700-square-mile zone of reduced dissolved oxygen in the Gulf of Mexico.37 The same high nutrient loads have also increased the amount of algae in the water, creating a "dead zone" roughly the size of New Jersey that can no longer sustain sea life for most of the year.
The people of the Upper Mississippi River Basin have begun to respond. Recognizing this as a serious problem that doesn't just affect those downstream, they have begun looking for ways to help clean up the river. Government agencies have hitherto focused mainly on point sources of pollution: heavy industries, sewage-treatment plants, food-processing companies, slaughterhouses, and other large sources of wastes whose pipes empty into the river. However, just as the Pamlico-Tar River Foundation found, while some of these nutrients come from point sources such as factories or municipalities, approximately 90 percent come from non-point sources, such as septic systems and farm drainage lines.38
According to a 1994 U.S. Geological Survey report, 71 percent of U.S. cropland lies in watersheds where at least one agricultural pollutant violates criteria for recreation or ecological health.39 For example, the Minnesota River (a Mississippi River tributary) flows through land that is 92-percent agricultural, and 50,000 households on the banks of the river have inadequate sewage systems.40 While each farm or sewer alone contributes a small amount of damage to the river, cumulatively they generate an enormous amount of pollution.
Many groups are following the example of the Pamlico-Tar River project, bringing industry leaders and environmentalists together to devise new solutions to these old problems, solutions not contemplated in typical Clean Water Act measures that target mostly point sources. For example, a state-level program called Reinvest In Minnesota (RIM) works to purchase permanent conservation easements along threatened watersheds to reduce the rate of runoff from agricultural soil. Efforts of this kind are in their infancy, but they point the way toward an effective strategy for safeguarding the Mississippi for the twenty-first century.
Michigan has a superior record in monitoring water quality and has impressive results to report. While all 50 states taken together assessed only 17 percent of their rivers and streams in the 1996 National Water Quality Inventory, Michigan assessed 40 percent. As Charts 14 and 15 show, 93 percent of assessed rivers, streams, and lakes were deemed "fully supporting," which means that they are safe for both swimming and fishing.
As the largest freshwater system in the world, the Great Lakes are a pre-eminent environmental concern. At the time of the first Earth Day in 1970, it was popular to say that Lake Erie was "dead," especially since the Cuyahoga River (the one that caught fire in Cleveland) drained into it.
Michigan is bordered by four of the five Great Lakes, with 3,250 miles of shoreline. Water quality and wildlife health in and around the Great Lakes have improved greatly over the past 30 years. Today it is once again possible to fish in the Great Lakes, and even to drink their water in most locations. The most significant threat to the Lakes' ecological balance no longer comes mainly from industrial pollution or toxics but from biological invaders, a disruption that is less a byproduct of industrial activity than of our interconnected world. The proliferation of zebra mussels, a non-native species that entered the region chiefly in the ballast water of cargo ships, currently presents one of the more significant environmental challenges for Lake Michigan and the other Great Lakes. The zebra mussel is only one of 145 non-native or "exotic" species now found in the Great Lakes, where they crowd out indigenous species. Yet the Great Lakes Initiative and many environmental activists continue their crusade against chlorine and other synthetic chemicals that no longer pose a serious threat to the Lakes. Professor Bill Cooper of Michigan State University comments, "If one wished to allocate scarce monetary and human resources so as to maximize the reduction in ecological risk per unit of resource expended, one would do more good by regulating and/or limiting the introduction of exotics than by obtaining marginal reductions in trace levels of existing toxicants."41 Michigan and other states have moved quickly to develop aquatic nuisance management plans, and ships transiting the Great Lakes now face a bevy of requirements designed to eliminate the discharge of biologically contaminated water.
According to the 1996 National Water Quality Inventory, all 3,250 miles of Michigan Great Lakes shoreline are considered "impaired" for some purpose, though only one mile of shoreline is classified as impaired for swimming, and only 80 miles of shoreline area are impaired for drinking. The chief problems are sediment and runoff that impair the shoreline areas for aquatic life.
On the other hand, the Great Lakes are a phenomenal success story in reducing persistent, bioaccumulative toxics such as PCBs (polychlorinated biphenyls), HCB (hexachlorobenzene), and DDE (dichlorodiphenyldichloroethylene). Charts 16 through 18 show the dramatic decline in the traces of these chemicals found in herring gull eggs in the Great Lakes.
There are two lessons to draw from the current state of water-quality policy in the United States. First, a better monitoring system is necessary. Public Employees for Environmental Responsibility (PEER) has several recommendations, including a requirement that states use a uniform, numeric biological criterion for rivers and streams, increased federal funding for states whose monitoring efforts are inadequate, and a set of sampling guidelines to be consistently applied in all 50 states.
Given the complexity and difficulty of water monitoring, it is probably not possible to gather comprehensive monitoring data on an annual basis as is done for air quality, but some reliable measures must be developed. Second, the experience of the Pamlico-Tar river basin shows the promise of cooperative approaches that emphasize incentives and tradable-rights markets among affected parties. Early efforts at water quality improvement are a prototype for what is being called "civic environmentalism," an approach that emphasizes the harnessing of local knowledge and decentralized decision-making.42
Every five years the U.S. Department of Agriculture produces the National Resources Inventory (NRI), which surveys land uses and conditions on non-federal lands. The chief purpose of the NRI is to evaluate soil conditions and monitor soil erosion, but the process of sampling more than 800,000 locations, mostly in rural areas, yields considerable ancillary data.
A summary (but not the complete data) of the most recent NRI was released in November 1999, and it generated considerable public attention because of its finding that the rate of urbanization has doubled over the last five years. Between 1988 and 1992, according to NRI figures, 7.3 million acres of land in the United States were developed, while between 1993 and 1997, 15.9 million acres, over 3 million a year, were developed. These findings have provided fresh fuel for the controversy over urban sprawl.
"These new figures confirm," Vice President Al Gore said when releasing the NRI, "what communities across America already know. Too much of our precious open space is being gobbled up by sprawl."
There is little doubt that the unprecedented prosperity of the 1990s has led to an increase in land development, but there is good reason to question whether the NRI figures are accurate. Changes of this magnitude over such a short period of time, in any category of economic activity, are highly unusual absent some extraordinary factor. Until the detailed state-by-state data are released, it will be impossible to conduct a thorough analysis. A number of observations and anomalies, however, can be noted from the NRI summary.
The NRI itself cautions that, "Statistics derived from the NRI database are estimates and not absolutes. This means that there is some amount of uncertainty in any result obtained using NRI data." That the degree of uncertainty may be very large is suggested by comparing some of the specific land development figures in the NRI with other sets of data. For example, one of the eye-popping findings of the NRI is that Pennsylvania, one of the slowest growing states in the nation, developed as much land (over 1.1 million acres) as fast-growing Texas and Georgia.
One way of understanding the difficulty with these figures is to compare the NRI land-development estimates with population growth, which is done in Table 3. From this, a simple ratio of the land developed per new resident of the state can be calculated. By this comparison, Pennsylvania developed at a rate of 28 acres for every new resident of the state, which is far out of line even with other slow-growing northeastern states such as New York and Ohio. Other anomalies can be observed in some of the fast-growing sunbelt states.
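Table 3's ratio is just land developed divided by population growth. A minimal sketch using the Pennsylvania figures cited above (1.1 million acres developed and 28 acres per new resident; the implied number of new residents is back-calculated from those two numbers, not an official Census figure):

```python
# Acres developed per new resident, the simple ratio used in Table 3.
# The NRI acreage (1.1 million) and the 28:1 ratio come from the text;
# the implied count of new residents is back-calculated, not a Census figure.

def acres_per_new_resident(acres_developed: float, new_residents: float) -> float:
    """Land developed per new resident over the same period."""
    return acres_developed / new_residents

pa_acres_developed = 1_100_000                      # Pennsylvania, 1993-1997 (NRI)
pa_implied_new_residents = pa_acres_developed / 28  # back-out from the 28:1 ratio

print(round(acres_per_new_resident(pa_acres_developed,
                                   pa_implied_new_residents)))  # 28
print(round(pa_implied_new_residents))  # about 39,286 new residents
```

A higher ratio means more land consumed per unit of population growth, which is why Pennsylvania's 28 acres per new resident stands out against faster-growing states.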
If the NRI figures are correct, California is developing its land quite efficiently, if efficiency is defined narrowly (and inadequately) as using the least amount of land per new resident. By these numbers, California is developing more "efficiently" than Texas, Georgia, Arizona, and even "smart growth" Maryland. Most startling of all, California used less land per person than Oregon, whose land-use planning policies are widely praised as the best in the nation. Nevada and Arizona present an especially striking anomaly.
Sprawl critics have been claiming that Las Vegas is being developed at a rate of two acres an hour. Yet the NRI reports a land-use rate for the whole state of Nevada less than half of that total. Nevada's population growth was nearly twice the population growth of neighboring, and fast-growing, Arizona, yet Arizona is reported to have developed four times as much land.
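The "two acres an hour" claim converts readily into the NRI's five-year accounting period. The conversion below is simple arithmetic, not a figure from either source:

```python
# Convert the sprawl critics' "two acres an hour" claim for Las Vegas
# into annual and five-year totals comparable to the NRI reporting period.
acres_per_hour = 2
acres_per_year = acres_per_hour * 24 * 365   # 17,520 acres a year
acres_per_nri_period = acres_per_year * 5    # 87,600 acres over five years

print(acres_per_year)        # 17520
print(acres_per_nri_period)  # 87600
```

At that rate, Las Vegas alone would account for roughly 87,600 acres over a five-year NRI period; by the report's account, the NRI's statewide Nevada figure is less than half of that.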
It should be kept in mind that ascertaining the amount of developed land is a secondary purpose of the NRI. Its primary purpose is to determine the amount and condition of various rural land resource categories, especially farmland. Most of its nearly 800,000 "sample points" are in farming and rural areas.
It is true that in states with modest population growth such as Pennsylvania, land development is driven less by population growth than by people leaving cities and "spreading out" in the suburbs. Nearly half of Pennsylvania's counties, including the counties where Philadelphia and Pittsburgh are located, are losing population, while ex-urban counties are growing at a rate much faster than the state as a whole, so the simple ratio of land to population growth in Table 3 does not settle the question.
A way of cross-checking the NRI figures, and generating an independent estimate of land development, is to examine other data on development activity that has actually taken place.
Table 4 shows the number of single- and multi-family building permits for the two five-year periods (1988-1992 and 1993-1997), along with the square feet of commercial construction, including schools and other public buildings, and the miles of roads built. As can be seen from these figures, there was not a doubling of building activity between the two periods.
Tables 5 and 6 convert these figures into acreage estimates for the amount of land developed. This technique finds that 5.5 million acres of land were developed from 1988-1992, while the NRI for those years estimated that 7.3 million acres had been developed. The difference between these two estimates is 1.8 million acres, about 33 percent. The NRI figure may well be correct; estimating land development from permits and other data sets would not capture parks and planned open space within housing and other kinds of development. Not all development is contiguous, meaning that open space between development would be considered "developed" in NRI statistics even if it is bare land.
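The 1988-1992 comparison is internally consistent, as a quick check of the arithmetic shows (both estimates are from the text):

```python
# Difference between the permit-based estimate and the NRI estimate
# of land developed, 1988-1992 (millions of acres, from the text).
nri_estimate = 7.3
permit_based_estimate = 5.5

difference = nri_estimate - permit_based_estimate
percent_difference = difference / permit_based_estimate * 100

print(round(difference, 1))       # 1.8 million acres
print(round(percent_difference))  # about 33 percent
```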
From 1993 to 1997, this technique yields an estimate of 6.1 million acres developed, a 600,000-acre increase from the previous five-year period, or more than 100,000 acres a year, versus the NRI estimate of 15.9 million acres developed. The variance between these estimates is a staggering 9.8 million acres, a difference of roughly 160 percent. It does not seem plausible that this large difference could be accounted for through non-contiguous development or other open space that would be counted as "developed" land.
A large error in the NRI estimate for the amount of land developed is not unprecedented. In 1981, the Department of Agriculture's Soil Conservation Service (SCS) estimated that three million acres a year were being developed, the same rate as is now reported in the NRI. But the SCS had to withdraw that estimate by 1984, after persistent inquiries by various scholars established that the SCS had overestimated development by a factor of three. A repeat of this episode may be on the way when the final 1997 NRI data are released for review.
Regardless of whether and how much the NRI estimates for the rate of land development are revised, the figures need to be put in a larger perspective. The NRI estimates that urbanized land now accounts for 105.4 million acres of land. This represents about 5.6 percent of the total land area in the continental United States (i.e., excluding Alaska and Hawaii). Most western states, where "sprawl" is said to be most rapidly occurring, are typically much less than 5 percent developed (Arizona and Nevada are only about 1 percent developed, while Utah is less than 2 percent developed) because of the large desert, mountain, and range areas in these states. Older, eastern states with more population typically have a higher proportion of their land classified as developed. Charts 19 and 20 show the proportion of various land uses in the United States and in Michigan, respectively.
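The 5.6-percent figure can be reproduced from the NRI's 105.4 million developed acres and the land area of the lower 48 states. The total land area used here (about 1.89 billion acres) is an assumed round figure, not a number from the report:

```python
# Share of continental U.S. land area classified as developed.
# Developed acreage is from the NRI; the lower-48 land area
# (~1.89 billion acres, i.e. ~2.95 million sq mi x 640) is an assumption.
developed_acres = 105.4e6
continental_us_acres = 1.89e9

developed_share = developed_acres / continental_us_acres * 100
print(round(developed_share, 1))  # about 5.6 percent
```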
The U.S. Department of Agriculture (USDA) measures two kinds of soil erosion: wind erosion, and what is called "sheet and rill" erosion. Wind erosion is self-explanatory; everyone recalls images of the "dust bowl" during the 1930s, when high winds blew away tons of drought-parched topsoil in the heartland of the nation. Wind erosion is prevalent in the arid western states that have drier soil and less natural ground cover, while many eastern and southern states experience no measurable wind erosion at all. Sheet erosion is the removal of thin layers of soil over the whole surface, chiefly through raindrop splash and surface water flow. Rills are channels small enough to be obliterated by normal tillage operations.
The USDA measures soil erosion every five years as a part of its National Resources Inventory (NRI). The USDA released the most recent data in November 1999. They show that wind erosion, which had been increasing during the late 1970s and 1980s, has declined in the 1990s (see Chart 21).
The NRI findings show that sheet and rill erosion has experienced a consistent decline at a rate of slightly more than two percent per year, or about 40 million tons per year (see Chart 22).
Soil erosion leads to a decline in agricultural productivity, and the sediment from erosion contributes to water degradation. The EPA regards soil erosion as one of the most important national environmental problems, though experts disagree about the severity of the issue. The rate and severity of erosion depends on local conditions and soil types. Hence it is difficult to make national generalizations. The title of a 1987 report from the U.S. Department of Agriculture is instructive: Soil Erosion: Dramatic in Places, But Not a Serious Threat to Productivity.48
Government policy aims to reduce soil erosion to "tolerable levels" (known as T-values) by 2010, and to zero by 2025. T-values for cropland represent the maximum amount of erosion that will still indefinitely support agricultural productivity. T-values range from one to five tons per acre per year, depending on soil type and location. The good news is that the NRI figures show soil erosion is occurring at the low end of the T-values on two-thirds of the cropland acreage in the United States. Only about one percent of cropland is experiencing a high rate of soil erosion.
There is no uniform natural rate of soil erosion, and one leading textbook argues that "there is no solid basis for tolerance soil loss values."49 The normal rate of soil erosion under natural vegetation is thought to be in approximate equilibrium with the rate of soil formation. It is helpful to consider some numbers. One ton of soil per acre (the low end of the T-values) is equal to a uniform depth of .007 inches (.18 mm). At one ton lost per acre per year, it would take roughly 143 years to lose an inch of topsoil. This is approximately the rate of erosion on uncultivated cropland. Even this rate, however, may be faster than the rate at which new soil is formed. Readers should not conclude, however, that U.S. farmland is in any serious danger.
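The depth figures above are straightforward unit conversions. The .007-inch depth per ton per acre is the report's own number; the sketch below just checks what follows from it:

```python
# Check the topsoil arithmetic: one ton per acre per year spread evenly
# is about .007 inches of depth. How long does it take to lose a full inch?
depth_inches_per_year = 0.007  # one ton of soil per acre, from the text
mm_per_inch = 25.4

print(round(depth_inches_per_year * mm_per_inch, 2))  # 0.18 mm per year
print(round(1 / depth_inches_per_year))               # about 143 years per inch
```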
The USDA itself, in a 1994 report, said that "loss of farmland poses no threat to U.S. food and fiber production."50 The chief reason to be concerned about soil erosion is that it is a factor in "non-point" water pollution, i.e., soil erosion carries fertilizer and pesticides, as well as dirt, into streams and rivers.
Charts 23 and 24 show that the largest gains in reducing soil erosion have come on cultivated cropland, the result of educational efforts and improved technology in farming practices.
Charts 25 and 26 display average soil erosion findings for Michigan. Michigan has been below the national average for both kinds of erosion on cultivated cropland.
The leading national measurement of toxic substances is the Toxics Release Inventory (TRI), which the Environmental Protection Agency produces every year. The TRI is an unwieldy measure because it tracks "releases" of more than 600 chemicals that vary widely in their hazardous character. In addition, the EPA refines the TRI from year to year, changing the number of chemicals tracked, the threshold for reporting, and the kind of commercial operations that must report. In the early years, the TRI reported on about 300 chemicals, but starting in 1995, the EPA doubled the inventory to more than 600. These changes make it difficult to discern trends.
The EPA has helpfully broken out the data against a 1988 baseline that includes only the chemicals included in the original inventory (shown in Chart 27). This measure shows a 42-percent decline in toxics releases since 1988, a reduction of nearly 1.5 billion pounds. The chemical industry, not surprisingly, has shown the largest decrease of all industries included in the TRI, with a 50.8 percent reduction in releases since 1988.51 These industry reductions reflect mostly productivity gains and technological improvements, as well as concerted efforts to reduce "releases" for public relations purposes.
Chart 28 shows the last three years according to an updated (1995) baseline. Both Charts 27 and 28 show a slight increase in "releases" in 1997 over 1996.
The TRI has significant limitations as a measure of toxic risk and environmental quality, and is not a very useful indicator of future trends. The TRI cannot be compared, for example, to the trendlines for air or water quality. First, the term "release" is misleading. While it does include compounds that are released into the air, it also includes chemicals that are disposed of in hazardous waste facilities, and even chemicals that are recycled on the premises. It would be more accurate to call the TRI the "Toxics Use Inventory."
Second, the TRI is incomplete and arbitrary in many ways. It does not include any data from government facilities, such as military bases. It does not capture data from many kinds of small businesses, such as "mom and pop" auto body shops. Third, not all chemicals are created equal in their toxic properties.
Measuring chemicals simply by the pound can be highly misleading, since an ounce of one chemical may be more toxic than 10 pounds of another. Hence, the TRI is of limited use in making judgments about environmental quality and toxic risk. The EPA itself has repeatedly cautioned readers about these limitations. The most recent TRI, for 1997, warns: "TRI reports reflect releases and other waste management activities of chemicals, not exposures of the public to those chemicals. Release estimates alone are not sufficient to determine exposure or to calculate potential adverse effects on human health and the environment."52 (Emphasis added.)
Professor George Gray of Harvard University's Center for Risk Analysis puts the problem more bluntly: "Chemical use does not equal chemical risk . . . . Simply knowing how many pounds are used provides no information about health or environmental risks."53 Gray points out that supermarkets can be required under the TRI to report their use of acetic acid as a "toxics release," even though the acetic acid may be in the form of vinegar used in salad dressing. Better to replace chemical use reporting, argues Gray, with chemical risk reporting. "Risk evaluations should be certified by independent agents, just like financial data are certified by accounting firms," Gray says. "This would provide firms with strong incentives to reduce risk and would provide firms and citizens both with useful information . . . . We must focus on how chemicals are used, not whether they are used."
Chart 29 shows the TRI results for Michigan on the EPA's 1988 baseline: a 48.7 percent reduction.
"Sustainable development" has become the leading environmental theme of our time. Like most great issues, the discussion about sustainable development involves an argument about our future. It is a concept of both common sense and controversy. It reflects common sense because no one is for a mode of life that diminishes our capital stock, which would make future generations poorer, or degrades our living conditions, which would make current and future generations less healthy. Yet sustainable development is also a subject of controversy because of the difficulty of comprehending the myriad linkages between environmental factors in a dynamic world.
Clashing conceptual frameworks lead to widely varying conclusions about what constitutes "sustainability." Still less is there any clear direction for public policy with regard to sustainability; at this point the policy discussion resembles the Woody Allen gag about trying to find a framework to turn a concept into an idea. The conceptual difficulties with the issue arise chiefly because of a lack of clarity and definition about what "sustainable development" means.
Most discussion begins with the UN World Commission on Environment and Development's 1987 definition: "To meet the needs of the present without compromising the ability of future generations to meet their own needs." This definition is too vague and general to be helpful. The President's Council on Sustainable Development, which has been laboring over the subject since 1993, recognizes this definition to be "inexact." Even some environmental groups have expressed misgivings about the plastic nature of sustainable development. Greenpeace once described sustainable development as the "deceptive jargon" of anti-environmentalism.
Some Alternative Definitions of Sustainable Development
"Sustainable utilization is a simple idea: we should utilize species and ecosystems at levels and in ways that allow them to go on renewing themselves for all practical purposes indefinitely."
Robert Allen, How To Save The World (Totowa, New Jersey: Barnes & Noble Books, 1980), p. 18
"Sustainability might be redefined in terms of a requirement that the use of resources today should not reduce real incomes in the future."
Anil Markandya and David Pearce, "Natural Environments and the Social Rate of Discount," Project Appraisal, Vol. 3, No. 1, 1988, p. 11
"The core idea of sustainability . . . is the concept that current decisions should not impair the prospects for maintaining or improving future living standards . . . . This implies that our economic systems should be managed so that we live off the dividend of our resources, maintaining and improving the asset base."
Robert Repetto, The Global Possible: Resources, Development, and the New Century (New Haven: Yale University Press, 1985), p. 10
"Sustainable development implies using renewable natural resources in a manner which does not eliminate or degrade them, or otherwise diminish their usefulness to future generations. . . . Sustainable development further implies using non-renewable (exhaustible) mineral resources in a manner which does not unnecessarily preclude easy access to them by future generations . . . . Sustainable development also implies depleting non-renewable energy resources at a slow enough rate so as to ensure the high probability of an orderly societal transition to renewable energy sources."
Robert Goodland and G. Ledec, "Neoclassical Economics and Principles of Sustainable Development," Ecological Modelling, No. 38 (1987), p. 37
"[Sustainable development is] development without growth in throughput of matter and energy beyond regenerative and absorptive capacities."
Herman Daly and Robert Goodland, "Environmental Sustainability: Universal and Non-negotiable," Ecological Applications, Vol. 6, no. 4 (1996), p. 1002
Sustainable development is of little use if it is just a "motherhood and apple pie" concept. Until a more exacting definition of sustainable development is found, it will be a solution in search of a problem, not unlike the evanescent enthusiasm for "participatory democracy" in the 1960s. Ultimately this does a disservice to sound environmental policy and to the idea of sustainable development itself. Environmental scientist Timothy O'Riordan warned in 1988: "It may only be a matter of time before the metaphor of sustainability becomes so confused as to be meaningless, certainly as a device to straddle the ideological conflicts that pervade contemporary environmentalism."54
The one seemingly clear application of the concept of sustainable development is global warming, i.e., if the temperature rise from CO2 emissions will cause catastrophic consequences, then our current way of life is unsustainable and will have to change. But is this conclusion as clear cut as it seems?
Leaving aside the issue of whether the catastrophic scenario of global warming is firmly established and predictable, what the imperative of sustainability requires us to do is not self-evident. A prominent theme is that we should abide by the "precautionary principle," i.e., that we should undertake significant changes in our mode of life just in case the catastrophic scenario comes to pass. Aside from the tenuous logic of the precautionary principle (which could readily justify a multi-trillion dollar space-based "precautionary" defense against asteroids striking Earth), it is arguable whether mankind would be better able to sustain itself and the rest of nature in the face of such change with the technology and wealth that will be accumulated with further growth. This weakens the case for requiring a vast transformation in our mode of life through institutional and policy means that remain murky and speculative, not to mention expensive. We won't begin to solve the issue here, but a few observations are in order.
The core idea of sustainable development is that future generations will have the means to meet their own needs better than we are able to meet our own today without degrading the natural environment. This implies that there will be adequate resources for future generations, and that sufficient care should be taken not to pollute our air and water, and not to despoil our forests and biological habitats. This suggests that the question can be divided into two halves: resource use and pollution. When thinking about resources, it is further necessary to divide them into two categories. The renewable resources include forests, water, food supplies, and animal species, while fossil fuels and minerals are non-renewable.
The clearest application of the idea of sustainable development applies to the use of renewable resources, such as forestlands, watersheds, wildlife, and other self-generating resources and ecosystems. Sustainable use of renewable resources is easier to measure. In fact, when a renewable resource is used in an unsustainable way, faster than it can replenish itself, it takes on the character of a highly scarce, non-renewable resource. Typically, if there is a well-functioning market for the resource, price increases are a signal of overexploitation. In this respect, sustainable development resembles the late economist Herbert Stein's memorable dictum: If something can't go on forever, it won't.
It is easier to devise solutions for problems of renewable resource use. Indeed, the United States has shown the way toward correcting unsustainable practices with regard to forestry; forestland in the United States and other industrialized nations has been increasing for more than 40 years. There are signs that the unsustainable exploitation of rain forests throughout the world is beginning to ebb as well.55
Many instances of unsustainable resource use can be attributed not only to the lack of a well-functioning market, but to perverse institutional or legal incentives, such as a lack of property rights to resources, or (especially in underdeveloped nations) a lack of ready resource alternatives.56 Groundwater resources in the U.S., for instance, are often overused because of subsidies, a lack of tradable rights to water ("use it or lose it"), and a lack of clear property rights to water tables. Overfishing in the oceans provides an even clearer example.
It is easy to imagine that cattle might be scarce, just as buffalo became scarce, if they were owned in common and were taken from one vast domain, rather than being privately owned on separate ranches. While the exact analogue to barbed wire for fishing grounds in the ocean may be hard to conceive, assigning ownership rights to the ocean should not be much more difficult than assigning ownership rights to the radio frequency spectrum, as is currently being done throughout the world.
The United States should encourage developing nations to follow this general strategy. Much of the destruction of forest resources that is of present concern is due to unsound government policies that private owners would not likely have undertaken to the same extent, if at all. Vice President Al Gore notes in Earth in the Balance: "the most serious examples of environmental degradation in the world today are tragedies that were created or actively encouraged by governments, usually in pursuit of some notion that a dramatic reordering of the material world would enhance the greater good." And Alan Durning of the Worldwatch Institute has noted that "tenure is a key determinant of the sustainability of forest economies . . . nationalizing the forests sabotaged traditional management, creating the free-for-all it purported to avert."
There is much enthusiasm for "getting the incentives right." This produces nods of agreement on the general level, and furious disagreement about its specific application. "Getting the incentives right" should mean chiefly assigning property rights to environmental goods, rather than using government power to set the "correct price" for the use of a commonly held environmental good. Any so-called "market-based incentive" policy that involves government setting the "correct price" to establish a "level playing field" is inherently flawed, because it misunderstands the nature of markets and prices.
The government will always lack the necessary knowledge to set the "right" price, and such policies will usually introduce new distortions into the marketplace that will likely be counterproductive and wasteful of resources. A genuine "market-based policy" allows dynamic prices set through the decentralized marketplace to act as signals to individuals about the relative cost, economic as well as environmental, of their choices. Hence the hallmark of sound market-based policy is determined by the extent to which it is decentralized.
The implications of sustainable development become more difficult to sort out with respect to non-renewable resources: fossil fuels, minerals, and so forth. On the surface, it is obviously impossible to use a non-renewable resource "sustainably"; each unit of a non-renewable resource used is one less unit from a finite pool.57 But it is not immediately clear whether and how non-renewable resources such as oil, gas, and minerals should be conserved. It is not even self-evident that running out of a resource necessarily impinges on the ability of future generations to meet their needs.
For example, using whale oil in the nineteenth century as an input for energy or manufacturing was clearly unsustainable. And obviously it wasn't sustained. The hunting of whales to near extinction may have threatened the biological diversity of the planet, but the depletion of whale oil as a resource did not impede succeeding generations from growing and meeting their needs, and, not coincidentally, protecting and restoring the whale population at the same time.
This points to the problem of having a static view of our resource consumption and production, and the paradoxical problem of having a sufficiently long time horizon. The paradox is that our technological and resource utilization mix is certain to continue changing as rapidly as, if not more rapidly than, it has for the past century, yet the longer the time horizon we try to anticipate, the less certain we can be of the conditions and challenges facing our successors. Put more plainly, it is impossible for our generation to know what resources future generations will need, and in what proportions.
A resource planner in 1900 who worried about the resource needs of the year 2000 would have been taking care to secure supplies of kerosene and firewood for heating and lighting, copper for telegraph wires, rock salt for refrigeration, horses for transportation, and large amounts of land to grow feed stock for draft animals. Certainly this planner would not have known to secure large supplies of oil and gas, as they were only starting to come into major use and their supplies were abundant.
It is possible to conceive generally of technological advances in the next 75 years that will make today's resource concerns as obsolete as a concern for rock salt would have been 75 years ago. In arguing in favor of a "promethean environmentalism," Duke University Professor Martin Lewis points to the prospect of "molecular nanotechnology," i.e., programmable molecules, which would be a green technology that might even provide the means of species restoration (shades of Jurassic Park?).58
Such "out-there" ideas may seem as unthinkable today as trips to the moon 100 years ago, or the desktop personal computer 50 years ago, yet these precedents show how the unimaginable becomes the routine. And even well beneath these high-technology frontiers, it is possible to imagine such low-tech practices as mining old landfills for their raw materials, which might be thought of as retroactive recycling.59 These examples are intended to reintroduce the old economic principle of substitution to our thinking about sustainability.
In the classic Economics 101 sense of the term, new materials or methods are substituted when a resource becomes too scarce, and hence expensive. In the environmental arena it has a wider application: we can see a history of resource substitution that is both more efficient and cleaner. Some uses of non-renewable energy, especially oil and gas, are positively "green" technologies compared with the modes of energy use they replaced. For example, the development of the automobile, which is often portrayed as environmental public enemy number one, had several positive environmental tradeoffs.
First, a large amount of land previously devoted to raising feed for draft animals (as much as 25 percent of total agricultural land at the turn of the century) was returned to nature or put to other, higher agricultural uses. Second, the elimination of the use of horses and other draft animals in urban settings obviously led to reduced water pollution and soil degradation, as well as improvements in urban sanitation. There were about 1.4 million horse-drawn transportation vehicles in the United States in 1900. The transport capacity of horses was three-quarters as great as the transport capacity of the railroads in 1900. As late as 1911, the value of horse-drawn transportation equipment produced was greater than the value of railroad equipment produced.60
The average horse consumed about 30 pounds of feed a day, or five tons a year. The amount of land used for growing feedstock for horses peaked at 93 million acres in 1915, an area roughly equivalent to all U.S. cities and suburbs today. Almost no land is used today to grow feedstock for horses (the U.S. government discontinued the data series for feedstock land in 1961, because the acreage had shrunk almost to zero), and this decline in land use to produce draft animal feed has doubtless contributed to the reforestation that has taken place in the United States61 (see Chart 30).
The air and water quality hazards from horse dung are obvious; a single horse would produce 12,000 pounds of manure and 400 gallons of urine a year, much of which fell on city streets. At the time, the state-of-the-art pollution control technology was a broom. In other words, the coming of the car, truck, and tractor has saved 90 million acres of land in the United States, a calculation that is usually left out of the environmental accounting of the internal combustion engine.
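The per-horse figures above can be sanity-checked with a few lines of arithmetic. The inputs (30 pounds of feed per day, 12,000 pounds of manure per year, 1.4 million horse-drawn vehicles) are the ones cited in the text; the one-horse-per-vehicle assumption is ours, purely for illustration.

```python
# Sanity check on the horse-era figures cited in the text.
FEED_LB_PER_DAY = 30
DAYS_PER_YEAR = 365
LB_PER_TON = 2000

# 30 lb/day works out to roughly five tons a year, as the text says.
feed_tons_per_year = FEED_LB_PER_DAY * DAYS_PER_YEAR / LB_PER_TON
print(f"Feed per horse: {feed_tons_per_year:.1f} tons/year")

# Aggregate manure load, assuming (hypothetically) one horse
# per horse-drawn vehicle in 1900:
HORSES = 1_400_000
MANURE_LB_PER_HORSE = 12_000
manure_tons = HORSES * MANURE_LB_PER_HORSE / LB_PER_TON
print(f"Annual urban manure load: {manure_tons / 1e6:.1f} million tons")
```

Under that one-horse-per-vehicle assumption, the manure load alone comes to several million tons a year, which makes the sanitation gains from motorization easy to credit.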
Another positive tradeoff of fossil fuel use has been the preservation of forests. In 1850, 50 percent of timber harvested in the United States was used for fuel. As late as the turn of the century, nearly 25 percent of the energy needs in the country were supplied by burning wood: nearly five billion cubic feet a year. As fuel oil, natural gas, and electricity became widely adopted in the early decades of the century, the use of wood for fuel began declining rapidly, from more than five billion cubic feet in 1900 to less than 500 million cubic feet in 1970.
Although there was no national "spot" market for fuel wood as there was for other commodities in 1900, the price for construction lumber can be taken as a reasonable proxy for fuel wood. The inflation-adjusted price of lumber in 1900 was five times the price of lumber in 1970. It is worth noting in Chart 31 when the decline in wood use halts and heads back up again: during the Great Depression years, when fewer people could afford new gas and oil furnaces, and when businesses reduced spending for new capital equipment. Here is a clear example of the effect of economic growth, and the lack of it, on resource use and environmental quality. It is also ironic to recall that during the "energy crisis" of the 1970s one of the favored popular remedies was a return to wood stoves, which would have represented a step backward for both air quality and forest habitat.
There is an important lesson here for developing nations, where up to 80 percent of wood harvested is used for fuel. The development of fossil fuel energy could help lead to conservation of biologically valuable forestlands, and in some cases a reduction in air pollution. The development of cheap fossil fuel energy in the developing world will be the chief means of generating the surplus capital necessary to afford cleaner, higher-technology energy systems in the fullness of time.62 Demands that developing countries move directly to high-tech or "alternative" clean energy sources are not realistic.63
These resource and technological substitutions were fairly crude compared to what is conceivable today, yet they delivered large gains in environmental improvement. If past is prologue, we can look forward to even larger environmental gains in the twenty-first century. But the main point is this: The answer to the question of how non-renewable resources should be used today turns on whether a non-renewable resource is used in a manner that leads to a positive tradeoff of other environmental goods.
Such a tradeoff might include more sustainable use of renewable resources, such as forests, and the development of permanent capital stock that will offer future generations more options and means to meet their needs. It is entirely appropriate to regard the use of non-renewable resources, especially fossil fuels, as "intermediate" modes on the way toward still more efficient and cleaner future modes, and therefore consistent with a dynamic understanding of sustainable development.
This is one reason why projections that come in the form, "There are only X years of Y resource at current usage rates," are not a very helpful or illuminating way of thinking about the issue. It is probably impossible to determine the practical optimum depletion rate for non-renewable resources. The common-sense observation should be made at this point that every generation in human history has inherited more resources than the previous one. There is no reason to suppose that this will not continue for as far as it is possible to project. Doomsday scenarios of future scarcity and catastrophe, which have been issued regularly since the beginning of the industrial revolution, have consistently failed to come about. Hence, straight-line projections of current use trends should be viewed skeptically.
While it may be reasonable to suppose that there is an eventual limit to the process of resource substitutability and technological innovation that has constantly expanded the resource pool throughout history, there is no compelling evidence to conclude that this limit is imminent within the next few generations. At this point, the outer limit of resource creation is more a speculative than a scientific estimation. Most current non-renewable resources, including oil and gas, will last several generations even at projected rates of increased use.
In light of this understanding, energy efficiency policy goals need to be carefully assessed. Efficiency of any kind is always to be desired, but if it costs a constant $2 for every $1 of energy efficiency achieved, total resources are probably being wasted rather than conserved, and therefore the "efficiency" may not be environmentally benign. Many advocates of alternative energy argue that efficiency-forcing policies contribute to increased productivity and help create new markets. If this is true, such policies would be that economic rarity, the free lunch.
It may be true that in some cases managers lack proper information about environmental technologies that increase energy efficiency in a cost-effective way. But this would be an information problem, not a policy problem. Efficiency-forcing policies can only be justified if it can be shown economy-wide that such policies induce companies to make profitable innovations that they had somehow previously overlooked.64 Therefore, ambitious policy goals that would require unreasonable costs, or divert capital from potentially more productivity-improving investments, should be eschewed. While the wonks of the world debate this issue, the private sector is rendering the debate obsolete and irrelevant in many respects through its own practical steps.
Many resource-intensive industries are finding that it is not only possible, but profitable, to reduce material usage and pollution, and they have embarked on ambitious programs to reduce material use and pollution without the prod of legal mandates. In other words, we have reached a point at which environmental improvement in our industrial processes need no longer be regarded necessarily as a deadweight cost, as was supposed at the time of the first regulatory mandates 30 years ago. This accounts, for example, for the chemical industry's 50-percent reduction in "releases" in the Toxics Release Inventory. Some companies have set targets of as much as a 90-percent reduction in air emissions, 50-percent reduction in wastewater emissions, and 20-percent reduction in energy use per unit of production.65 The point is, when sustainable development is conceived as a technical problem rather than a global metaphysical and social problem, progress becomes both manageable and measurable.
Even when more exacting definitions of sustainable development are offered, confusion still persists because of honest scientific disputes and uncertainties about the facts, and differing opinions about how economies adapt over time to changing resource constraints. Consider Herman Daly's definition that sustainable development is "development without growth in throughput of matter and energy beyond regenerative and absorptive capacities."
Scientists can point to favorable trends in resource bases, increasing efficiencies of production, and falling amounts of pollution, reaching the conclusion that we are on the way to achieving, if not already achieving, Daly's definition. Yet Daly and many others take a dimmer view, arguing that the imperative of sustainable development requires "steady state" economics, which would include zero population growth, centralized command of natural resources, and controls on individual incomes and personal wealth.
Some economists have called Daly's "steady state" idea "a return to a regulated caveman culture."66 His view throws a spotlight on the implication some have drawn that economic growth itself is unsustainable and should be stopped or drastically curtailed.67 The most stark expression of this view is found in Paul and Anne Ehrlich's equation for human environmental impact,
I = P × A × T
where I = environmental impact, P = population, A = affluence, and T = technology.
In other words, any increases in population, wealth, and technology are inherently damaging to the environment, no matter what mitigating measures are possible. It is a schematic for the most extreme pessimism and would require wholesale transformation of human society and political institutions if it were made the basis of policy.
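A toy calculation shows why treating P, A, and T as independent multipliers misleads. The rates below are illustrative numbers of our own choosing, not the Ehrlichs' or Goklany's: if impact per unit of affluence (T) falls faster than population and affluence grow, total impact I declines even as P and A rise.

```python
# Toy I = P * A * T projection with illustrative (hypothetical) rates.
# Population and affluence grow, but impact per unit of affluence
# falls faster, so total impact declines over time.
P0, A0, T0 = 1.0, 1.0, 1.0                         # index values at year 0
p_growth, a_growth, t_change = 0.01, 0.02, -0.04   # assumed annual rates

impact = [
    P0 * (1 + p_growth) ** year
    * A0 * (1 + a_growth) ** year
    * T0 * (1 + t_change) ** year
    for year in range(51)
]

print(f"Impact index, year 0:  {impact[0]:.2f}")
print(f"Impact index, year 50: {impact[-1]:.2f}")  # below 1.0: impact fell
```

The point of the sketch is simply that the sign of the outcome depends on the interplay of the three factors, which the equation's pessimistic reading assumes away.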
Indur Goklany offers an elegant and compelling refutation of the Ehrlichs' equation, and in the process explains why the signs point to a sustainable future.68 The most significant flaw is the assumption that population, affluence, and technology are wholly independent factors with no relation to each other. To the contrary, these three factors are highly interdependent, mostly in favorable ways.
Rising affluence, for example, cuts fertility rates. The richest nations of the world have fertility rates below replacement level and falling populations, a condition that would be true of the United States in the absence of high rates of immigration. The world fertility rate has fallen by nearly half since 1960, from 5.58 to 2.75, and with it the global rate of population growth, from 2.07 percent in 1967 to 1.33 percent in 1998. The stabilization of world population can be expected as the rest of the world grows more affluent.
The environmental impact of technology is exactly backward from what the Ehrlichs' equation suggests. The amount of energy used and pollution emitted per dollar of economic activity has been falling for as long as reliable long-term data exist. In the United States, energy intensity has been falling by one percent per year since 1800. That is, it takes one percent less energy each year to produce the same amount of goods.
Goklany has examined specific air pollutants in the United States, finding, for example, that a dollar of economic activity today generates only 0.084 times as much sulfur dioxide emissions as a dollar of economic activity in 1900. In other words, changing technology has delivered a more than tenfold reduction in SO2 pollution per unit of economic output in the twentieth century. Other pollutants show even larger declines: 30-fold for volatile organic compounds and particulates, and 100-fold for lead.
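These figures are easy to check by simple compounding; the sketch below just does the arithmetic on the rates quoted above (a one percent annual decline in energy intensity since 1800, and Goklany's 0.084 ratio for sulfur dioxide per dollar).

```python
# Compounding check on the intensity figures cited in the text.

# One percent less energy per dollar each year, compounded over the
# two centuries since 1800:
energy_intensity_ratio = 0.99 ** 200
print(f"Energy per dollar, now vs. 1800: {energy_intensity_ratio:.3f}")

# Goklany's SO2-per-dollar ratio, today vs. 1900:
so2_per_dollar_ratio = 0.084
print(f"SO2 reduction factor: {1 / so2_per_dollar_ratio:.1f}x")
```

The second number confirms the "more than tenfold" characterization; the first shows how a seemingly modest one percent annual improvement compounds into a large cumulative decline.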
This trend means, among other things, that today's worldwide carbon emissions are nearly 60 percent less than what they would have been were we still using 1950 technology. As developing nations become wealthier, we can expect to see a convergence of environmental performance that approaches the progress of the United States and other western nations. An example of what this convergence should look like can be seen in Chart 32, which shows sulfur dioxide trends in the U.S. and some of the eastern European nations that have embraced market economies in the last decade.
Goklany's conclusion is worth quoting at length:
The future could see a world in which the population has stabilized, is richer, cleaner, and with room for both humanity and the rest of nature, or one which is more populated, poor and polluted and where the rest of nature is pinched for space and water. The odds of the former are increased by bolstering the co-evolving, mutually reinforcing forces of economic growth, technology, and trade by strengthening the institutions that are their mainstays. These institutions are free markets; secure property rights to both tangible and intellectual products; fair, equitable and relatively transparent rules to govern those markets and enforce contracts; institutions for accumulating and converting knowledge into useful and beneficial products; and honest and predictable bureaucracies and governments . . . [I]ndustrial ecology can play an important role in moving such solutions closer to perfection, and in accelerating society's various environmental transitions so that technological change and economic growth are transformed from being problems to becoming solutions in the quest for a sustainable industrial society.69
As we begin the twenty-first century, the environmental discourse is showing signs of maturing. At the time of the first Earth Day, 30 years ago, most people in the business community thought environmental protection would be ruinously expensive, while many environmentalists were pessimistic that substantial improvements were possible in the coming decades. Yet major improvements came quickly. To be sure, regulation has been very expensive, but it has not been ruinously expensive in most cases.
This is not to say that none of the cost was wasteful or that particular regulatory prescriptions were optimal. Greater environmental gains might have been available through different strategies, but this policy argument will go on forever. Often the good-faith cost estimates of regulatory compliance proved to be too high because the ingenuity and productivity of American business was underestimated.
The more recent experience shows that the basic economic trend of falling material and energy intensity in the production of goods and services is converging with environmental concerns in a way that was not foreseen at the time of the first Earth Day. As this report has shown, in some areas, such as air quality, we have consistent, high-quality measurements to show our progress. In other areas, we have major gaps in our data or only fragmented knowledge, and in some cases no measures of any kind. This is especially the case for wildlife habitat and biodiversity.
However, current trends suggest that environmental policy may slowly become less adversarial in character, evolving into the consensus issue it was expected to be at the time of the first Earth Day. This may mean that the radical elements of the environmental movement (those individuals and groups who reject economic growth and technological innovation) will be marginalized, as Peter Huber suggests in his recent book Hard Green: Saving the Environment from the Environmentalists.70 This can only be a positive development for those who recognize the vital link between environmental health and economic growth, and for our environment itself.
The authors wish to thank the staffs of the Mackinac Center for Public Policy and Pacific Research Institute, especially Mackinac Center President Lawrence Reed, Mackinac Center Graphic Arts Manager Daniel Montgomery, PRI editorial director K.L. Billingsley, PRI research director Lisa MacLellan, and the PRI marketing team of Jennifer Berkowitz and Laura Dykes. G. Tracy Mehan, director of the Michigan Office of the Great Lakes, provided peer review for this report.
Steven Hayward is director of the center for environmental and regulatory reform at the Pacific Research Institute in San Francisco. He holds a Ph.D. in American studies and an M.A. in government from Claremont Graduate School. Dr. Hayward writes frequently on a wide range of current topics including environmentalism, law, economics, and public policy. His commentaries have appeared in National Review, Reason magazine, Policy Review, the Intercollegiate Review, and dozens of daily newspapers including The New York Times, San Francisco Chronicle, and Orange County Register.
Elizabeth Fowler is a public policy fellow with the center for environmental and regulatory reform at the Pacific Research Institute (PRI) in San Francisco. She specializes in market-based environmental research with an emphasis on California issues. Ms. Fowler is co-author of two PRI studies, the 1999 Index of Leading Environmental Indicators and Ending California's Water Crisis: A Market Solution to the Politics of Water. Her Op-Eds have appeared in major newspapers including the San Diego Union-Tribune and the Orange County Register.
Laura Steadman is a research assistant with the center for environmental and regulatory reform and the center for enterprise and opportunity at the Pacific Research Institute (PRI) in San Francisco. Her work on urban sprawl and other issues has been published in periodicals including San Francisco Business Weekly and the Triangle Business Journal in North Carolina.
1. Herbert Stein, Presidential Economics: The Making of Economic Policy from Roosevelt to Clinton (Washington, D.C.: AEI Press, 1994), p. 190.
2. See David Schoenbrod, Power Without Responsibility: How Congress Abuses the People Through Delegation (Yale University Press, 1993).
3. Martin W. Lewis, Green Delusions: An Environmentalist Critique of Radical Environmentalism (Duke University Press, 1992), pp. 2-3.
4. Edward Goldsmith, et al., A Blueprint for Survival (Harmondsworth: Penguin, 1972), p. 50.
5. Martin Ryle, Ecology and Socialism (London: Radius, 1988), p. 60.
6. Mark Dowie, Losing Ground: American Environmentalism at the Close of the Twentieth Century (MIT Press, 1996).
7. Quoted in ibid., p. 106.
8. Lewis, pp. 6, 9.
9. For an extreme expression of this point of view, see Jane Holtz Kay, Asphalt Nation: How the Automobile Took Over America and How We Can Take It Back (New York: Crown Books, 1997), p. 24. See also James Q. Wilson, "Cars and Their Enemies," Commentary, July 1997, pp. 17-23 and James A. Dunn, Jr., Driving Forces: The Automobile, Its Enemies, and the Politics of Mobility (Washington: Brookings Institution, 1998).
10. Foundation for Clean Air Progress, available at http://www.cleanairprogress.org/survey/answers.cfm.
11. Environmental Protection Agency, National Air Quality and Emissions Trends Report, 1997 (Research Triangle Park, NC: Air Quality Trends Analysis Group, 1998), p. 9.
12. Ibid., p. 9.
13. "Progress in Reducing Ozone Exceedance Days in Ten Major U.S. Cities, 1987-1999," prepared for the Foundation for Clean Air Progress, Washington, D.C., by Tech Environmental, Inc., Waltham, MA, October 1999.
14. Council on Environmental Quality, Environmental Quality, 25th Anniversary Report (Washington, D.C.: The Executive Office of the President, 1995), p. 184.
15. Goklany, p. 65, and D. Brown, "Lead Level in Americans' Blood Has Fallen 75% Since the Late '70's," Washington Post, July 27, 1994.
16. Rethinking the Ozone Problem in Urban and Regional Air Pollution (Washington, D.C.: National Research Council, 1991).
17. Indur M. Goklany, Clearing the Air: The Real Story of the War on Air Pollution (Washington, D.C.: Cato Institute, 1999).
18. Goklany, pp. 21-22.
19. Goklany, p. 21.
20. Paul R. Portney, "Air Pollution Regulation," in Paul R. Portney, editor, Public Policies for Environmental Protection (Washington, D.C.: Resources for the Future, 1990), p. 40.
21. Goklany, p. 56.
22. Ibid.
23. OECD Economic Surveys - United States (ISSN: 0474-5329, November 1991).
24. Goklany, p. 43, and EPA, Report to Congress on Indoor Air Quality Volume II: Assessment and Control of Air Pollution, Office of Air and Radiation, EPA/400/1-89/001C, 1989.
25. Goklany, p. 44; Lance Wallace, "A Decade of Studies of Human Exposure: What Have We Learned?" Risk Analysis 13, April 1993; and Wayne Ott and John Roberts, "Everyday Exposure to Toxic Pollutants," Scientific American, February 1998.
26. Goklany, p. 44, and Kirk R. Smith, "Fuel Combustion, Air Pollution Exposure, and Health: The Situation in Developing Countries," Annual Review of Energy and the Environment 18, 1993.
27. Goklany, p. 44.
28. Public Employees for Environmental Responsibility, Murky Waters: Official Water Quality Reports Are All Wet, May 1999, p. 28.
29. Ibid., p. 38.
30. Ibid., p. 14.
31. Ibid., Table 2.
32. Ibid., p. 17.
33. U.S. Environmental Protection Agency, For the Health of a River: The Story of the Tar River in Eastern North Carolina, 1994. For more information contact the Center for Policy and Legal Studies, Clemson University.
34. Ibid.
35. Ibid.
36. U.S. Geological Survey's Upper Midwest Environmental Sciences Center, About the Upper Mississippi River System. Available at http://www.umesc.usgs.gov/umesc_about/about_umrs.html.
37. U.S. Geological Survey, "Ecological Status and Trends of the Upper Mississippi River System," 1998, p. 5-5.
38. Tom Meersman, Star Tribune Special: "The Minnesota River in Crisis," December 1999.
39. R. A. Smith, G. E. Schwarz, and R. B. Alexander, Regional Estimates of the Amount of Land Located in Watersheds with Poor Water Quality, U.S. Geological Survey Open File Report 94-399.
40. Meersman, December 1999.
41. Bill Cooper, State of the Great Lakes 1993 Annual Report, April 1994.
42. For more on the concept of "civic environmentalism," see www.civicenvironmentalism.org.
43. The median size house lot for single-family homes is .35 acres, according to the U.S. Census Bureau.
44. Assumes one acre of street is needed for each acre of residential housing.
45. Author's calculations based on data about miles of roads constructed from Highway Statistics, Federal Highway Administration.
46. F.W. Dodge/McGraw Hill, as reported in 1998 Statistical Abstract of the United States, Table No. 1195. This data includes schools and other public sector construction.
47. Estimate assumes a 5:1 ratio of parking space and landscaping to floor area.
48. U.S. Department of Agriculture, Agricultural Outlook (Washington D.C.: Economic Research Service, 1987).
49. Frederick R. Troeh, J. Arthur Hobbs, and Roy L. Donahue, Soil and Water Conservation (Englewood Cliffs, NJ: Prentice Hall, second edition, 1991), p. 115.
50. Marlow Vesterby, Ralph E. Heimlich, and Kenneth E. Krupa, "Urbanization of Rural Land in America," USDA, Economic Research Service, Agricultural Economic Report 673, March 1994.
51. EPA, 1997 Toxics Release Inventory, pp. 4-14.
52. 1997 TRI, p. 1-6. For a more complete discussion of the TRI, see Volokh, Green, and Scarlett, "The Toxics Release Inventory, Stakeholder Participation, and the Right to Know," Reason Public Policy Institute, Policy Study #246, available at www.rppi.org/righttoknow.html.
53. George Gray, "Forget Chemical Use, Let's Report Risk," Risk in Perspective, (Cambridge: Harvard Center for Risk Analysis, Vol. 4, No. 4, April 1997), p. 1.
54. Timothy O'Riordan, "The Politics of Sustainability," in R.K. Turner, ed., Sustainable Environmental Management: Principles and Practice (London: Belhaven Press, 1988), p. 29.
55. The deforestation rate of the Amazon rainforests, which had been averaging 1.2 percent a year in the 1980s, is showing signs of slowing down, in large part because of increased consciousness about the folly of deforestation and changes in policies that encourage deforestation.
56. World Bank environmental consultant John Pezzey comments, "Environmental policy is all about internalizing externalities; and internalizing externalities usually amounts to establishing some kind of property rights over the environment."; Sustainable Development Concepts: An Economic Analysis (Washington, D.C.: World Bank Environmental Paper Number 2, 1992), p. 30.
57. Economist R. Kerry Turner comments: "It makes no sense to talk about the sustainable use of a non-renewable resource (even with substantial recycling effort and reuse rates). Any positive rate of exploitation will eventually lead to exhaustion of the finite stock." Turner, "Sustainability, Resource Conservation and Pollution Control: An Overview," in R.K. Turner, ed., Sustainable Environmental Management: Principles and Practice (London: Belhaven Press, 1988), p. 13.
58. Martin Lewis, Green Delusions, pp. 143-44.
59. Landfills are already rapidly becoming obsolete as recycling and materials recovery processes mature. Some of our current recycling mandates and practices are proving to be an impediment to this favorable development.
60. Historical Statistics of the United States, Colonial Times to 1970 (U.S. Census Bureau, 1975), Series S-4, p. 818; Series P-362, p. 702.
61. Historical Statistics of the United States, Colonial Times to 1970 (U.S. Census Bureau, 1975), Series K-498, p. 510.
62. Indeed, transnational research on the issue of economic development and environmental quality validates the view that while emerging economies suffer deteriorating environmental quality for a time as industrial output increases, environmental quality begins to improve as net wealth increases beyond a certain point. This has been labeled the "J-curve" phenomenon. Moreover, this "wealth effect" provides the basis of broad popular support for environmental improvement; as a people become wealthier, their demand for environmental improvement increases. See Don Coursey, "The Demand for Environmental Quality," Working Paper, Business, Law, and Economics Center, Washington University, 1992.
63. Fossil fuel energy development might actually lead to improvements in air quality in the short run if it leads to a reduction in the use of wood, cow dung, and other inefficient and highly polluting fuel sources. See Gregg Easterbrook, "Forget PCBs, Radon, Alar: The World's Greatest Environmental Dangers Are Dung Smoke and Dirty Water," New York Times Magazine, September 11, 1994, pp. 60-63.
64. "Economists are rightly wary of believing that such opportunities can exist . . . . The few studies of links between environmental compliance costs and productivity suggest that such wariness is fully justified." "When Green Is Good," The Economist, November 20, 1993, p. 19.
65. Dimensions of this kind of practice, and several case studies, are detailed by the World Business Council for Sustainable Development, available at http://www.wbcsd.ck.
66. H.S. Burness and R.G. Cummings, "Thermodynamic and Economic Concepts as Related to Resource Use Policies: A Reply," Land Economics Vol. 62, no. 3 (1986), p. 323.
67. The UN World Commission on Environment and Development was emphatically not among this camp. It called for economic growth as central to any strategy of sustainability.
68. Goklany's complete algebraic analysis of this topic can be found in Clearing the Air, pp. 69-73; also in Goklany's unpublished paper, "The Future of the Industrial System," prepared for the International Conference on Industrial Ecology and Sustainability, Troyes, France, September 22-25, 1999.
69. Goklany, "The Future of the Industrial System," p. 21.
70. Peter Huber, Hard Green: Saving the Environment from the Environmentalists (A Conservative Manifesto) (New York: Basic Books, 2000). See also Marian R. Chertow and Daniel C. Esty, eds., Thinking Ecologically: The Next Generation of Environmental Policy (New Haven: Yale University Press, 1997).