STRATEGIES FOR DEALING WITH ENVIRONMENTAL RISK CONFLICTS


CONFLICT RESEARCH CONSORTIUM

Working Paper 94-68 August, 1994(1)

By Guy Burgess

Co-Director, Conflict Research Consortium

University of Colorado, Boulder


(1) This paper is based upon a talk given to the Center for Environmental Journalism at the University of Colorado at Boulder, April 27, 1994. All ideas presented are those of the author and do not necessarily represent the views of the Consortium, the University, or the Hewlett Foundation. For more information, contact the Conflict Research Consortium, Campus Box 327, University of Colorado, Boulder, CO 80309-0327. Phone: (303) 492-1635; e-mail: crc@cubldr.colorado.edu.


© 1994. Conflict Research Consortium. Do not reprint without permission.


I plan to talk about one component of the Conflict Research Consortium's Program on Intractable Conflict. I have a number of brochures that I will pass around describing other parts of this multi-campus, community and university initiative to look at difficult and continuing conflicts, which include environmental risk conflicts, "not in my back yard, you don't" problems, and conflicts between deep ecologists and the wise-use movement. We are also examining civil rights conflicts, abortion, gay rights, and a lot of other issues. One of the major problems in dealing with many intractable conflicts is the problem of technical risk and uncertainty; that is what I plan to talk about today.

I thought that it would be useful to begin by talking about how we, as a society, deal with environmental risks. My approach to the problem is based upon Charles Lindblom's principle of "remediality." Lindblom would have observed that it is very hard to get people to agree on a single "best" way of dealing with risk and uncertainty, just as it is hard to get them to agree on the meaning of true "justice." He would, however, also have noted that it is much easier to get them to agree on a number of serious problems that deserve remedial attention and should be fixed. Based upon this principle, I would like to make a number of somewhat disjointed observations about dynamics that distort environmental decision-making in ways that most people would agree are unwise and, therefore, worth taking steps to avoid.

I start with something that I call the "Certainty Trap" or the "Omniscience/Omnipotence Trap." This is the illusion that many people have that they are smart enough to "figure it all out" and, thereby, eliminate any risk or uncertainty. For a long time the "hard scientists" thought there really wasn't any irreducible randomness in the world; they could figure it all out (given enough money). But more recently they have discovered Chaos Theory, which, from the perspective of a social scientist, is a respectable way of saying that there are things about the universe that we can't predict.

Another example of how we really can't predict the future and eliminate risk is the story of what it would have taken to avoid the Rely tampon disaster. You may remember that a few years ago something like one out of every 10,000 women who used the Rely tampon fell victim to toxic shock syndrome, a devastating and frequently fatal disease. Now imagine that you are in the product-testing department and responsible for avoiding this kind of problem. To test for a problem that occurs once in every 10,000 cases, you need a sample large enough to yield at least 100 instances of the problem, which means a sample size of one million (100 times 10,000)! If you start looking at long-term risks that may not appear for 5, 10, 20, 30, or even 40 years, that means tracking a million people for 40 years, a clearly impossible task. Then there is the ethical problem: how do you get a million "guinea pigs" to try something to see if it is going to work? These and a host of related problems mean that certain kinds of risks are simply unavoidable.
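
For readers who want to check that arithmetic, here is a minimal sketch in Python; the 1-in-10,000 rate and the 100-case threshold are the speaker's illustrative figures, not measured values.

    # A sketch of the sample-size arithmetic; both figures are the speaker's.
    incidence = 1 / 10_000        # roughly one affected user per 10,000
    cases_needed = 100            # cases needed to measure the rate with any precision
    sample_size = cases_needed / incidence
    print(f"Required sample size: {sample_size:,.0f} people")   # 1,000,000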

Senator Muskie, in Congressional hearings, used to ask for a "one-armed scientist." He was sick of experts who kept saying that, on the one hand, it might be like this and, on the other hand, it might be like that.

If you believe that certainty is possible, one obvious approach to environmental problems, and one with a lot of political appeal, is something you might call the "Fruitless Certainty Search." Politicians hate to make decisions that, by implication, say they think it is okay to place the general public at risk. It is much easier to say, "Do another study." I have been doing an informal survey of official reports for the last several years, and I have yet to see a report that doesn't end with the suggestion that more study is needed.

Another common problem is the "Delay-Default Syndrome." People get the impression that if they keep studying a problem, and studying it some more, they are not making any decisions. That's not true. They are making a decision to pursue the default, business-as-usual alternative. We can't decide, for instance, what to do about high-level nuclear waste, so we have study upon study upon study. To give you some idea of how out of control this has become, we are studying how to write warning signs to warn people that some really dangerous stuff is buried 2,000 feet underground. What's more, these warning signs have to be written so that they can be understood by somebody who has no knowledge of any currently known language. In an attempt to be safe, we are spending a lot of time worrying about what Nevada is going to be like when a great climate change transforms it into a lush wetland. Meanwhile, we are making a default decision to leave this high-level waste sitting around urban centers, in often questionable storage tanks that may, in fact, be leaking.

Another thing that comes up is something I call the "Ostrich Trap." A fundamental principle of dealing with risk and uncertainty is that the further you look into the future, the greater the level of uncertainty. You can see this mathematically if you want. If you have 90 percent confidence in your ability to predict one year out, you multiply that by 90 percent again for the second year out, and again for the third year out. When you do this, 90 percent falls to 81 percent and then to about 73 percent. Pretty soon, you discover that you know very little.
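
Here is a minimal sketch of that compounding arithmetic in Python; the 90 percent annual confidence is purely the illustrative figure used above.

    # If each year ahead can be predicted with 90 percent confidence, an
    # n-year forecast carries roughly 0.9**n confidence.
    annual_confidence = 0.90
    for years_out in range(1, 11):
        confidence = annual_confidence ** years_out
        print(f"{years_out:2d} years out: {confidence:.0%}")
    # 1 year: 90%, 2 years: 81%, 3 years: 73%, ..., 10 years: 35%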

A fundamental principle of sound decision-making under conditions of risk and uncertainty is that you don't say, "Well, I have accurately predicted the future. I have made a decision, and I won't reconsider or change it." This is sticking your head in the sand like an ostrich. What you should do, which is much more sensible, is say, "Well, this is what I think now, but I am going to come back and look at this again and again. If, as the future comes nearer, I find my predictions were wrong, I am going to change what I did. Not only that, I am going to design my decision in ways which allow me to change easily."

The most spectacular example of this mistake might be the Washington Public Power Supply System, or WPPSS ("Whoops"). During the early '70s, before the great oil embargo, they needed reliable forecasts of the growth in demand for electricity in the Pacific Northwest. So they got out a terrific forecasting tool, a ruler, and assumed that the rapid growth rate of the previous 20 years would simply continue unchanged!

In order to meet this demand, they decided that the cheapest alternative was a series of six really big nuclear power plants. They assumed that there was no uncertainty about the design of the plants, possible safety problems, or public opposition and, therefore, started building the plants. They also assumed that the demand would still be there once the long construction period required by these large plants was completed.

Soon after this, the oil embargo sent energy prices way up, forcing the demand curve down to the point where they didn't really need all of the new plants. Still, construction was underway and they were committed. Meanwhile, public opposition and safety problems caused costs to skyrocket. Ultimately WPPSS went broke. The reason was that they took a course of action that left them no flexibility when the demand forecasts changed or when nuclear power proved less suitable than predicted. A series of smaller, quicker-to-build power plants that more closely tracked the demand curve would have been a vastly better way of dealing with this inflexible, ostrich-trap problem.

Another problem is what I call the "QED Syndrome," which comes from my high school geometry class. When we got through with a proof, we wrote QED at the bottom, indicating that the proposition had been proven and we didn't have to think about it any more. One of the things that happens is that people shop around for studies that tell them what they want to hear. They then decide that their self-serving hypothesis has been proven and that they no longer need to think about the possibility that they might be wrong. As a result, you'll find people on different sides of an issue believing contradictory studies, which they interpret as incontrovertible proof that their view is correct and the other guys are obviously off the deep end.

One example of this is the late Dr. Carl Johnson, who was director of the Jefferson County Health Department and, from what I could tell, a really conscientious and caring man. He did a study on the health impacts of Rocky Flats that contradicted all of the other studies on the impact of radiation that had been done around the world. Still, his work was published around the world and he became famous. The sad fact was that some of these studies (which were never replicated) would have flunked an undergraduate statistics class, even in this age of grade inflation. For example, Johnson confused correlation coefficients with measures of statistical significance. The numbers didn't mean anything, but they told people what they wanted to hear.
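
To see the distinction being missed here, consider a small, hypothetical illustration in Python; the data are invented, and scipy's pearsonr is used only because it reports both the coefficient and a p-value.

    # Hypothetical data: a sizable correlation coefficient that is not
    # statistically significant because the sample is tiny.
    from scipy import stats

    x = [1, 2, 3, 4, 5]
    y = [2, 1, 4, 3, 9]              # invented observations
    r, p = stats.pearsonr(x, y)      # coefficient and two-sided p-value
    print(f"r = {r:.2f}, p = {p:.2f}")
    # r comes out around 0.8, but p is roughly 0.1, so the correlation would
    # not normally be judged significant at the usual 0.05 level.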

Another issue is something called the "Risky Shift," which psychologists have studied for years. They find that in certain situations, for instance with a group of rock climbers in Eldorado Springs, a sort of bravado effect sets in. Pretty soon the climbers are competing with one another by making riskier and riskier (and stupider and stupider) ascents just to prove they are tougher than the next guy.

You will often see this with people who routinely work in some sort of hazardous situation. I never found anybody at Rocky Flats who thought it was very dangerous. When I was active in mountain rescue in Colorado, nobody ever thought it was very dangerous either. But, looking back on that experience, there were at least five times when I was almost killed, yet at the time it was "no big deal." This is a familiarity effect that breeds a certain degree of contempt for many hazards.

There is also the problem of the "Cautious Shift," which psychologists have long known as the flip side of the Risky Shift. The Cautious Shift works in much the same way, except that instead of leading people to take more and more risks, it leads them to be more and more cautious. You see this in politics, especially with hot-button propaganda and political advertising techniques. It is very, very difficult (in fact, probably politically suicidal) to give a campaign speech saying, "I am for taking greater risks with the public health than my opponent."

Since modern science recognizes that there is no such thing as zero risk, a candidate can always try to get ahead by promising to be safer than his opponent. This can produce a bidding war of increasing caution, which eventually reaches a point where the costs of risk reduction far, far outstrip the benefits.

Another challenge is the "Order of Magnitude Trap." I once chaired a panel with David Hawkins, who said something that I thought was very wise: "One of the great privileges of being a scientist is to understand the meaning of orders of magnitude or powers of 10." There are only three or four orders of magnitude between a funky old TNT bomb and a nuclear bomb thousands of times more powerful. There are roughly four orders of magnitude between the speed of an ant walking across the sidewalk and the speed of the space shuttle.

In my Rocky Flats work, we had risk estimates ranging from 10^-2 to 10^-8, which people didn't understand. They would say, "It's a risk. Minus two, minus eight. What's the difference?" Well, the difference between a minus two and a minus eight is six orders of magnitude, a factor of one million. That is a staggering number.
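
For anyone who wants to check that, here is a one-line version of the arithmetic in Python; the 10^-2 and 10^-8 figures are the bounds quoted above.

    # The number of orders of magnitude between two quantities is the
    # base-10 logarithm of their ratio.
    import math

    high_risk, low_risk = 1e-2, 1e-8
    ratio = high_risk / low_risk
    print(f"{math.log10(ratio):.0f} orders of magnitude "
          f"(a factor of {ratio:,.0f})")   # 6 orders, a factor of 1,000,000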

I got into a bit of a fight in writing the final report for the Rocky Flats Blue Ribbon Citizen's Committee. The Committee concluded that the risk estimates were "thorough, complete and objective and should be a sound basis for policy-making." However, when I looked at the numbers and the uncertainty limits, it was clear that trying to protect the public health by moving Rocky Flats would cost somewhere between $2 billion and $3 billion per life saved. (This was way back when people thought that we would need a plant like this forever.) By saving lives, I mean avoiding a cancer that might occur 40 years after an accident. I figured out that $2-3 billion was roughly the cost of all the health care in Colorado, all the hospitals, all the doctors, all the ambulances, all the nursing homes, for a whole year. It was an absolutely unconscionable waste of money. But nobody would even let me put that calculation in the report. A risk was a risk. Nobody cared what those little order-of-magnitude numbers were.

There is also what I call the "Pricelessness Trap," the sense that human life is priceless. One of the characteristics of our society is that we are often willing to spend any amount of money to save somebody. People will tell you how callous and horrible it is to put a dollar value on human life. But if you look at it in terms of opportunity costs, then human life has to have some value. If you are spending money, time, and effort saving somebody from exposure to lead in old mining tailings in a wilderness area that nobody ever visits, you are not doing something else. The key question is, "Where is the limited amount of time and effort best spent?" You could, in theory, make a list of all the risks in our society, natural and manmade, along with whom they affect. Then you could allocate the limited resources available to the risk-reduction efforts that produce the greatest (and most equitably distributed) benefits until you run out of money. That point defines, in practical terms, the value of human life.
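
Here is a rough sketch, in Python, of that allocation logic; every project name, cost, and lives-saved figure below is hypothetical and chosen only to illustrate the idea.

    # Rank hypothetical risk-reduction projects by cost per expected life
    # saved and fund them, best value first, until the budget runs out.
    projects = [
        # (name, cost in $ millions, expected lives saved)
        ("Urban air quality program",     50.0, 100.0),
        ("Drinking water treatment",      30.0,  40.0),
        ("Remote mine-tailings cleanup", 200.0,   0.1),
    ]

    budget = 100.0   # $ millions, hypothetical
    for name, cost, lives in sorted(projects, key=lambda p: p[1] / p[2]):
        if cost <= budget:
            budget -= cost
            print(f"Fund {name}: ${cost / lives:,.1f}M per life saved")
    # The cost per life saved of the last project funded is, in practical
    # terms, the implied value of a human life under this budget.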

Another real concern is something I call the "Contradictory Expert Problem." I ran into a Washington agency that keeps a list of experts. The media could call up the secretary and say, "Get me an expert who will say this and an expert who will say the opposite." The secretary would go down the list and give the reporter the names they wanted. Soon you would find the two experts in the newspaper or on television saying opposite things. This cancels out completely anything that science has ever learned about an issue. Avoiding it takes some digging into the credentials of the people arguing the various positions. There are reasonable, moderate people who can tell you what the prevailing scientific opinion is, where the frontiers of knowledge are, where the reducible uncertainties are, and when somebody like Dr. Johnson is using statistics that are utterly indefensible.

Another issue that gets tied up with this is the common belief that spending money doesn't matter as long as it is not yours. You get this with litigation. If you can sue somebody and force them to pay for it, then the fact that costs greatly outweigh benefits doesn't matter. You also see this with the growing furor over unfunded federal mandates. This is a new trick that our bankrupt Federal government has been using, which allows them to spend money even though they don't have any. What they are doing is simply requiring state and local governments as well as private individuals and corporations to spend their money on Federal priorities. All of this is done without ever really accounting for how much people are spending and what the benefits are.

There is also the "Pentagon Toilet Seat Syndrome." You may remember a few years ago the Pentagon got caught spending $500 a piece for toilet seats. Anytime that there is a lot of money to be made doing things where the benefits are uncertain and exist only in some thick technical report estimating reductions in national security or environmental risks, there is a larcenous temptation for people to take advantage of the situation and spend lots of money on things that don't really matter.

In my antiwar days, I always used to complain that I'd never seen a "hawk" complaining about a wasteful Pentagon program. Now the test for environmentalists is whether they are willing to oppose wasteful environmental programs because they are not a sensible investment of public funds.

There is also the "Posterity Trap." Kenneth Boulding liked to remember Marx (not Karl but Groucho) who asked, "What has posterity ever done for me?" While it's true that posterity does little for the current generation, its also true that any society that doesn't take care of succeeding generations will not be around very long.

There are really two parts to this. One is environmental degradation. As we use up natural resources and poison and pollute the air and water, we incur long-term costs that are most likely to fall on our children or our children's children. But there is another part of the sustainability issue that is often ignored, which is financial sustainability. All you have to do is look closely at the scope of the Federal deficit problem, then add the problem of unfunded, off-budget mandates, unmaintained infrastructure, and the fancy finagling done with Social Security, and it gets pretty scary.

Thus, one aspect of environmental protection involves directly protecting the environment itself. A second crucial aspect involves pursuing the fiscal policies needed to provide the resources to protect the environment over the long term. If you read about the countries of Eastern Europe, you learn that their environment is in disastrous shape because they simply don't have the money it takes to take care of it.

Another problem to be concerned about is "Good Old-Fashioned Deception." In the Rocky Flats project I was involved in, the Department of Energy was persuaded that it should consider relocating Rocky Flats as a way of reducing health risks for metropolitan Denver. It elected to conduct a $5 million study of the cost of relocation, the problems of decommissioning and decontamination, and the risks associated with the current operation. Still, nobody was going to believe a study that DOE did of itself. To get around the problem, Tim Wirth raised half a million dollars in no-strings-attached money for an independent blue-ribbon oversight committee. I worked on the staff of that committee. We spent three years looking over the shoulders of all the DOE consultants who were studying the risks of Rocky Flats. We put together a pretty prestigious nationwide commission of expert advisors to review different aspects of the study. They all came back and said that they thought the study was professionally done and, with a couple of minor exceptions, thorough, complete, and objective.

One of the key parts of the study was based on Building 371. The most dangerous thing done at Rocky Flats was reprocessing plutonium in various waste forms back into plutonium metal, which could then be machined into bomb parts. The reprocessing phase is when plutonium is in its most dangerous chemical forms.

At the time of the study they were constructing Building 371, a fancy, new, tornado-proof, earthquake-proof, robot-controlled super building to handle these operations. They assumed that Building 371 had a risk of virtually zero, and all of the risk studies were based on this. Just after the final report was issued, DOE admitted that the building simply didn't work. It never, to my knowledge, produced at more than five percent of capacity, which meant that DOE was doing a great deal of reprocessing using informal, untested, unevaluated procedures in buildings that were not really designed for it. All of this was in the midst of the Reagan arms buildup and the surge in nuclear arms production. Concealing the fact that DOE was unable to bring this facility on-line threw the entire risk assessment off. All of the good work was largely worthless.

One last observation. There is also an "Escalation and Polarization Problem," a reciprocal cycle of distrust. It usually begins when one side does something, either intentionally or accidentally, that the other side thinks is not very reasonable. Maybe it's an ill-considered, inflammatory statement about how the other guys are reckless and stupid. The other side then does something inflammatory, and pretty soon nobody trusts anybody. They shout insults back and forth. Once this builds up, it spills over to similar cases all across the country. There is such bad feeling about nuclear power, for instance, that if somebody really tried to do it right, nobody would believe them. Escalation is also a slippery slope: it's easy to fall into and very hard to climb out of.

These are a few observations about the kinds of things you need to think about when dealing with risk and uncertainty.