6 Cognitive Psychology: Emphasis on Thinking

Do you recycle aluminum cans? Why or why not? Please write down your reasons before reading on.

If you answered yes to this question, your reasons probably included thoughts about the importance of saving natural resources. You may even have mentioned that it takes a lot of energy to extract aluminum from bauxite ore, but even if you did not know this technical fact, you probably contemplated the importance of conserving scarce resources. If you answered no, perhaps you reasoned that it does not really make that much difference. Or that recycling bins are never available when you need them. Or that recycling is just a fad. These kinds of responses illustrate a basic tenet of cognitive psychology: in order to understand behavior, we must understand people's thoughts.

Cognitive psychology, which we will define here as the study of thinking, focuses on the way people make sense of their worlds. The typical cognitive psychologist studies human behavior in a laboratory, running experiments so that hypotheses about mental functioning can be tested. Like behaviorism, cognitive psychology is conducted as a science. But unlike behaviorism, cognitive psychology focuses on what is going on inside people when they make decisions, formulate solutions, surmise meaning, and so on. Unlike behaviorism, cognitive psychology studies the mind and how it works. According to cognitive psychologists, we cannot understand human action until we understand what and how people think.

The way people understand environmental problems is crucial for understanding their responses to them. If I believe there are endless supplies of old-growth timber left, I will not be particularly distressed to see logging trucks carrying massive trees to the lumber mill. If, on the other hand, I think that those trees come from the last 5 percent of our ancient forests (which I do), I will be more concerned. I will also be more likely to try to save old-growth forests by supporting groups who are trying to preserve them and by finding ways to reduce my own use of wood products, such as paper. Consumption behaviors are determined by our knowledge and beliefs about the environment.

Or at least, so it would seem. Actually, what cognitive (along with social) psychologists have shown us is that the relationship between our beliefs and our behavior is much more complicated than we would suspect; that we like to think we are more rational and logical than we actually are; and that we are easily tricked by the limitations of our own perceptual and reasoning processes. Yet our attempt to create meaning, no matter how faulty and botched, is an important organizing feature of our behavior. Just as the social psychologists showed that our irrational thoughts influence behavior, so the cognitive perspective holds that what goes on inside the organism is crucial for understanding behavior.

In this chapter we will examine the historical roots of cognitive psychology, discussing its most important principles to give you a clear sense of how the field looks at human behavior. In doing so, I will organize the discussion around an information-processing model that assumes that our behavior is a function of the quality of our information and how adequately we process it. This information-processing model has been very productive in cognitive psychology and has delivered wonderfully intriguing insights about how our minds work.
But as you will see, I also believe that there is a real danger in accepting the implications of this research at face value, a danger that can exacerbate an already serious split we experience between planet and self. Thus, I will conclude this chapter by analyzing this danger and suggesting ways to circumvent it. But let us begin by exploring how cognitive psychology came to be so important in the field of psychology in the first place. In a few short decades, the cognitivist viewpoint has replaced the behavioral one as the dominant viewpoint in psychology.

The Cognitive Revolution

For many years, especially the 1950s and 1960s, behaviorism eclipsed all other approaches and schools in psychology. By the middle of the 20th century, behaviorists filled the halls of the country's most important academic institutions, and most psychologists believed that the behavioral approach would only continue to rise in importance. Behaviorism was so successful that to even speak about thinking, memory, mind, consciousness, or reasoning would signify one's allegiance to outdated and unscientific mentalism.

The American Zeitgeist of the 1950s helped bolster behaviorism's reign. In the U.S., public dismay over the Soviet Union's successful launch of the first space satellite helped fuel the "space race," making science and technology a matter of national priority. As psychology's most ultrascientific approach, behaviorism continued to have widespread appeal. Psychologists interested in mental events certainly continued their work, but were increasingly forced to a second-tier position beneath their behaviorist colleagues.

Several important trends in the 1960s, however, coalesced to weaken and finally overthrow the dominance of behaviorism, so that people often say that a revolution took place. That revolution, the cognitive revolution, put mind—or inner psychological events that cannot be directly observed—back on top again. In the 1990s, the ascendant force in psychology is the cognitive viewpoint, principles of which have spread to influence most other areas of psychology, such as child and school psychology, psychotherapy and counseling psychology, industrial and organizational psychology, and, especially, social psychology. (In that field, which was discussed in Chapter 3, Lewin's legacy ensured that mental life would be considered an appropriate matter for understanding the social behavior of the individual. The cognitive revolution simply re-emphasized that point.)

But why would mainstream psychology shift its direction so radically in just a few short decades? Whereas in the 1950s the majority of academic psychologists were doing experiments on laboratory animals, counting responses in mazes and Skinner boxes, the 1980s saw the majority working with human beings again, theorizing about what went on between those human ears that would account for their observable behaviors. Although it is much messier and riskier to test hypotheses about inner events of human beings (which cannot be directly observed, much less controlled), most scientific psychologists are back to it with the enthusiasm of Wundt, if not with his methods. What could account for this swift change? Is it, as many behaviorists insist, a regression to an earlier, less scientific psychology? Whenever a major shift occurs in an academic discipline, many factors both within and outside the academy play a role.
Within the field, the number of psychologists quickly grew as the affluence of post-war America allowed many people to pursue their interests in a flourishing discipline. Increasing numbers brought increasing diversity of thinkers, who chafed against the elegantly reasoned behaviorism.1 For example, many young psychologists became persuaded of behaviorism's limits when a psycholinguist at MIT attacked it vigorously for its difficulty in explaining children's language development. Noam Chomsky argued effectively that children could not possibly learn language by the S-R mechanisms of operant conditioning because their learning is far too quick and ordered.2 That is, children's language errors show that they inherit the capacity to learn language by grammatical rules, allowing the child to produce statements that have never been reinforced. Children use rules efficiently, if not always correctly. When a child says "there are some mouses," she demonstrates an understanding of the rule "make a plural by adding 'es'" even though she has never before heard or been reinforced for saying "mouses." Although Skinner's work on verbal learning, which had been written without access to Chomsky's attack, was a plausible answer to Chomsky,3 it was not seen as entirely successful by many psychologists. The publication of both works in 1957 was a case of bad timing, since it gave the appearance that Skinner was unable to reply convincingly to Chomsky's assault.

Chomsky's role also illustrated the importance of outside factors in debilitating the dominance of behaviorism. The late 1960s and early 1970s were tumultuous times in the country as well as the universities: the antiwar movement, of which Chomsky was a vigorous spokesperson, converged with social unrest over racism, poverty, sexism, and environmental problems to challenge the status quo both inside and outside academia. Beyond the rarefied atmosphere of the laboratory, behaviorism looked ill-equipped to answer the concerns of student and public social movements. Also, behaviorism contained an implicit ideology of control. Running animal subjects under tightly controlled laboratory conditions seemed artificial, if not distasteful. Skinner's arguments for better control of human social institutions directly opposed the romanticism that was surfacing in the larger culture, romanticism that celebrated human freedom and liberation from the dehumanizing institutions of the military, the corporation, and the "military-industrial complex." Despite Skinner's continued attempts to apply behavioral principles to social problems until his death in 1990, behaviorism's tenets began to appear increasingly brittle in the context of widespread social turmoil. These societal issues drew many people to social psychology, whose participants were always more cognitively and less behaviorally oriented than their colleagues in other subfields. As a result, the role of human meaning and mental life assumed importance again.

1Atkinson, R. L., Atkinson, R. C., Smith, E. E., and Bem, D. J., Introduction to Psychology, 11th ed. (Fort Worth, TX: Harcourt Brace Jovanovich College Publishers, 1993), p. A-12.
2Chomsky, N., Syntactic Structures (The Hague: Mouton, 1957).
3Skinner, B. F., Verbal Behavior (New York: Appleton-Century-Crofts, 1957).
The Scientific Study of Thinking: Jean Piaget

Even though behaviorism reigned through the midpart of the century, an important minority voice was expressed through the brilliant work of Jean Piaget (1896-1980). Piaget was a Swiss psychologist who never worked in the United States and was never influenced by the laboratory-based behavioral psychology that was so popular during his lifetime. Instead Piaget studied children: children at play, children at home, children doing what they do in ordinary settings. In his attempt to integrate biology, genetics, and philosophy, he formulated an enormously influential theory of cognitive development and became child psychology's most important theorist. His views about cognitive functioning had an important impact on how American psychologists looked at thinking, and his contributions helped lay the groundwork for the cognitive revolution.

Piaget developed most of his ideas from interacting with his own children, making what has been called "amazing discoveries ... a host of fascinating, hardy phenomena which were under everybody's nose but which few were talented enough to see."4 For example, take the game of peek-a-boo, which most of us have played with a baby at some time or another. Piaget noticed, through painstaking analysis, that babies younger than about 9 months do not understand disappearing objects. As soon as you hide something (such as your face) behind a screen or a handkerchief, the young infant loses interest. But eventually, somewhere in the second 6 months of life, the baby begins to look for the occluded object, expressing surprise and laughter when it suddenly reappears. From these observations, Piaget theorized that as the infant matures, it develops a sense of object constancy, a belief that the object exists even though it cannot be seen.

Like the object-relations theorists whom we discussed in Chapter 4, Piaget believed that the sense of object constancy is the pivotal achievement of the first stage of life, what he called the sensorimotor period (birth to 2 years). Through practicing motor actions, the child learns to internalize images. The child plays peek-a-boo because she or he has developed the ability to store images (like your face). This accomplishment lays the basis for all of thinking, for without the ability to internalize images, we could not do cognitive work "in our heads."

Piaget went on to outline several more stages, all leading to the eventual capacity to think about and solve problems with less and less sensorimotor information and more internalized cognitive information. The ability to deal in more abstract terms accompanies what Piaget called the process of decentration, the ability to take in more information in the formation of a concept. We can understand Piaget's meaning of decentration by examining his most famous demonstration of cognitive development, the conservation experiment. The conservation experiment tests the child's ability to recognize that a quantity has not changed, even though its physical appearance is different. Imagine that a young child watches you pour milk into two identical glasses. You keep evening out the amount of milk until the child agrees that both glasses have the same amount.

4Jerome Kagan, himself a widely renowned child psychologist, made this remark about Piaget, as cited by Hunt, M., The Story of Psychology (New York: Doubleday, 1993), p. 354.
Then you pour one of the glasses into a taller, but narrower glass. You ask the child if both glasses have the same amount of milk. A child in the preoperational stage (2 to 7 years, roughly) answers that the taller glass has more because the milk is higher. Piaget believed this answer comes from the child's inability to keep track of more than one operation: pouring the milk into the skinnier glass created a higher surface level (one operation), but also a narrower width to the milk (another operation). Preoperational children focus on only one feature of the problem: the higher level of milk. If they could decentrate, they would realize that both the height and the width of the milk have changed, canceling out each other's effects and resulting in a conservation of total volume.

Piaget also observed that children in this age group endorse the concept of animism, the idea that everything, even nonliving objects such as a chair or a house, is alive. According to Piaget, cognitive development ensures that the child outgrows animism as she or he matures. Piaget's ideas about decentration and animism will be discussed in Chapter 7 as we consider the concept of identification with the ecological world.

But for now, the important point is that Piaget's theory of cognitive development is based on the idea of operations. An operation is a rule that is applied to a problem. Notice the similarity of Piaget's thinking to that of Chomsky's—both emphasize that rules are used to arrive at decisions. This rule-based approach to thinking quickly became a dominant feature of cognitive psychology. It gained momentum from other work that had been going on in the military during and after the Second World War on human perceptual problems. Studying how human beings detect signals, and how they respond to complicated machinery (such as an airplane display board when an enemy plane is sighted), engineering psychologists had laid out a theory of how humans make quick response decisions. This way of looking at thinking as the application of rules provided a centrally important feature of the early cognitive psychology.5

5Lachman, R., Lachman, J. L., and Butterfield, E. C., Cognitive Psychology and Information Processing: An Introduction (Hillsdale, NJ: Lawrence Erlbaum Associates, 1979).

When All You Have Is a Computer, Everything Looks Like Information

The work of Chomsky, Piaget, and the engineering psychologists laid important groundwork for the fall of behaviorism. Of all the pressures on behaviorism, however, the most important one was the creation of the computer. Computers were developed directly from the high-speed calculating machines designed for military purposes during World War II, so that decisions could be made about where best to move troops and supplies. The first electronic computer, ENIAC, was built in 1946 at the University of Pennsylvania; computers quickly emerged as a principal tool for processing information in the postwar period. In 1948 a mathematician named John von Neumann and a neurophysiologist named Warren McCulloch attended a conference at the California Institute of Technology and presented the idea that computers could be compared to human brains.6 The computer revolution in psychology had begun.

The metaphor of the brain as a computer had instant appeal. Both seem to depend on digital events: the firing of a neuron and the switching of a bit are on/off occurrences.
More importantly, both the brain and the computer "process information": data are fed in, transformed according to rules or decisions, and output. The mind is assumed to work like a computer program: both implement a series of actions that depend on the outcomes of preceding operations. In other words, both minds and computers are sophisticated calculating machines that can "decide" what to do next based on outcomes that have previously occurred. Complicated but orderly decision sequences allow both the machine and the mind to behave intelligently.

Computer scientists such as Herbert Simon and Allen Newell soon set out to develop computer programs that could think, that would perform logical work like the kind humans do, which, if you are a mathematician, means proving theorems. Soon the first artificial intelligence programs were developed that could prove logical theorems in about the same manner and at about the same rate that humans do. The field of artificial intelligence demonstrated that, at least with some types of problems, the human mind did operate like a computer program. Work in artificial intelligence continued to grow, and by the late 1970s, cognitive psychologists increasingly saw psychology as the science of information processing. As historian of psychology Morton Hunt has noted, Herbert Simon (and others) believed "the computer [to be] as important for psychology as the microscope had been for biology; ... other enthusiasts said the human mind and the computer were 'two species of the genus information-processing system.'"7

Thus, from an information-processing point of view, we humans input information, run various operations on it, and then act on the basis of our program outcomes. With this model, the unobservable "machinations" of the mind could now be studied by comparing them to computer programs. The fact that the mind is unobservable was no longer problematic: cognitive psychologists could infer the program by studying the observable input and the output. Seeing the mind as a machine fits comfortably with the Western worldview that we described in Chapter 2: the mind, like the world, is rational, knowable, and predictable; all we have to do is figure out its orderly workings.

Information Processing: The Constraints of GIGO

There is a familiar saying in computer science: garbage in, garbage out (GIGO). No computer can do a good job if the incoming information is faulty. If our minds are like gigantic computer programs, our behavior is dependent on the accuracy of the information on which our programs operate. If the information is limited or distorted, our behavior will likely be inappropriate. One way to understand our continued environmentally destructive behaviors, then, is to see them as outcomes of faulty information. Among other problems, information can be inadequate because it is wrong, because it is limited, or because it is irrelevant. We will look at these three types of information problems in turn.

Wrong Information

Obviously GIGO can result from bad information. Good decision making requires accurate knowledge, but information that appears accurate in the present can later be discovered to be inaccurate.

6Von Neumann, J., and McCulloch, W., Paper presented at California Institute of Technology conference "Cerebral mechanisms in behavior" (1948). Cited by Hunt, M., ibid., p. 514.
7Hunt, M., ibid., p. 540.
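The GIGO point can be put in the chapter's own computational terms. The sketch below is purely illustrative (the function, the single decision rule, and the numbers are all invented, not taken from any real model), but it shows how a perfectly orderly "program" still yields a poor decision when fed a wrong belief, such as an inflated estimate of how much old-growth forest remains:

```python
# A minimal sketch of GIGO: behavior as the output of orderly rules
# applied to input information. Every name and number here is
# invented for illustration, not taken from any real model.

def decide(belief, rules):
    """Apply each mental 'rule' in turn and return the final output."""
    result = belief
    for rule in rules:
        result = rule(result)
    return result

# One simple rule: act only if the resource seems scarce.
rules = [lambda fraction_left: "act to preserve" if fraction_left < 0.25
         else "not concerned"]

print(decide(0.50, rules))   # wrong input (a generous guess) -> "not concerned"
print(decide(0.05, rules))   # accurate input (the text's ~5%) -> "act to preserve"
```

The rule is applied flawlessly both times; only the quality of the input changes the outcome. The historical examples that follow show the same pattern on a larger scale.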
For example, since the Second World War, the Atomic Energy Commission has continually lowered the maximum permissible radiation doses for both nuclear plant workers and the general public.8 Early nuclear tests were conducted without adequate protection for workers in part because officials mistakenly believed that small and moderate doses were not harmful. Similarly, heated debates about logging in the Pacific Northwest hinge on widely varying estimates of how much old-growth timber is left, as well as the future sustainability of the tree plantations that replace ancient forests. Because good forest management depends on good information, the Clinton administration in 1993 funded the Eastside Ecosystem Management Project, a comprehensive scientific study of the Columbia Basin region, so that management policy will be based on correct information. Similarly, Kai Lee has urged that environmental policy be derived from "civic science ... irreducibly public in the way responsibilities are exercised, intrinsically technical, and open to learning."9 No information is perfect, but a commitment to improving information and revising environmental decisions accordingly is an important principle of good policy.

Limited Information

There is good reason to believe that our information is inadequate because it is limited. First of all, we are limited by the hard wiring of our sensory apparatus. We see only a tiny range of the entire spectrum of electromagnetic radiation, namely wavelengths between approximately 400 and 700 nanometers, which we call light; but the continuum of electromagnetic energy extends from cosmic rays as short as 4 trillionths of a centimeter to radio waves as long as several miles. Thus, we are blind to the vast majority of this information: "instead of experiencing the world as it is, people experience only about one trillionth of outside events: a small world indeed!"10

Furthermore, the vast majority of us who are not visually impaired are visual-dependent. Our sight mechanism uses a greater part of our cortical brain than do our other senses—hearing, smell, touch, or taste—leading us to rely more heavily on visual information than any other kind. As researchers Ornstein and Ehrlich have noted,

tree-dwelling ... made it inevitable that human beings would become predominantly "sight animals" rather than "smell" or "taste" animals. This sensory emphasis on sight has many consequences in today's world. We notice the "visual pollution" of litter much more readily than we do carcinogens in automobile exhausts, potentially deadly chemicals in drinking water, or toxic contaminants in cooking oil.11

If we cannot see something, we are not likely to find it important. This visual dependency makes it difficult to respond to ozone depletion or global warming.

8Gerber, M. S., On the Home Front: The Cold War Legacy of the Hanford Nuclear Site. Chapter 4: "Radiobiology: The Learning Curve" (Lincoln, NE: University of Nebraska Press, 1992), pp. 171-200.
9Lee, K. N., Compass and Gyroscope: Integrating Science and Politics for the Environment (Washington, DC: Island Press, 1993), p. 161.
10Ornstein, R., and Ehrlich, P., New World, New Mind: Moving Toward Conscious Evolution (New York: Simon and Schuster, 1989), p. 73.
11Ibid., p. 21.
Since we cannot directly see greenhouse gases or chlorofluorocarbons (CFCs), we are less likely to notice their significance, or to keep their importance paramount in our thinking. On the other hand, the invisibility of some hazards contributes to our fears, as we will see later in this chapter. Experiencing (even low-level) anxiety but not acting on it helps maintain a split between planet and self.

I certainly was not convinced of the toxic effects of cleaning chemicals and industrial pollutants, for example, until I got severely sick with a liver disease about 5 years ago. My immune system was very weakened, and for several months during my recuperation, I was able to tell if a room I walked into had been cleaned with chemical solvents because I felt dizzy and weak within a few minutes. I have never been sensitive to such chemicals before or since, so I tend not to think about them anymore. But I can remember at the time being appalled at the general sea of invisible toxins to which we are unconsciously subjected without our permission. I can remember thinking then that those wacky environmentalists who are fighting overuse of chemicals and pesticides have a darn good point, one that I had never "seen" before.

Most institutions concerned about public opinion employ the principle of visual dependency. Using the principle of "out of sight, out of mind," for example, the U.S. Forest Service, in its published guidelines for forest management, officially sanctions cosmetic strips, that is, intact forests directly bordering public highways. The USFS calls these strips "viewsheds," and they are maintained so that the public will not be overly concerned about clear-cutting, since clear-cuts are ugly and tend to arouse public reaction. In published planning documents for each National Forest, the size and placement of viewsheds are explicitly specified. Viewsheds are defined as

Areas (viewsheds) with high visual sensitivity (as seen from selected travel routes, developed use areas, or water bodies), manage to attain and perpetuate an attractive, natural-appearing landscape. Timber is managed on a scheduled basis and used to develop a large tree appearance and vertical diversity. Uneven-aged management is emphasized.12

Because so much clear-cutting has been intentionally hidden from public view, a Spokane, Washington, environmental group (called "Lighthawk") offers legislators free flights over the Pacific Northwest so they can see the extent of clear-cutting before voting on forest management questions. And in my own experience, I was not very concerned about deforestation until I saw huge patches of shaved forests in the Cascades as I drove back and forth to Seattle from my home in eastern Washington. Visual dependency is a powerful principle of our information-processing system, a principle that has been exploited by all sides of environmental debates.

Sometimes, however, our reliance on vision can backfire. Consider the "owls versus jobs" controversy, which the press has claimed signifies the debate about the forests of the Pacific Northwest. The debate is an example of the importance of how we frame questions. Spotted owls are believed to depend on old-growth forests, that is, forests that have not been previously cut and replanted.

12U.S. Department of Agriculture Forest Service, Summary: Final Environmental Impact Statement, Umatilla National Forest (Portland, OR: Forest Service, 1990), p. S-29.
Old-growth forests are more complex ecosystems than replanted forests, so spotted owls serve as an indicator species: their presence signifies the health of a complex system of interdependent species and habitat. Thus, from an ecological point of view, spotted owls are important not just because of the species itself, but because they serve as a measuring rod of the health of the many other species that occupy the same rich environmental niche. Spotted owls were chosen to serve as the "canary in the mine shaft," a signal species whose living existence demonstrates the ability of other species to survive. But any number of other species could have been selected, from the red-backed vole to the mycorrhizal fungi.13 The only available legal means for saving the last old-growth forest was for environmentalists to argue that the 1973 Endangered Species Act protects the owl from extinction.

Environmentalists were smart to choose the owl: the owl is a strong visual image and a lot cuter than a vole or a fungus. Its large eyes and small nose comprise what physiologists call a "neotenic face," meaning a face that approaches the proportions of a baby's face. We have a built-in, genetically hard-wired response to neotenic faces14—we find them "cute." We like them, and we feel protective of them.

But the owl's visual appeal has also backfired. Because the visual image of the owl is so strong, it has been difficult for environmentalists to remind the public that the owl serves only as an indicator species. When the debate is framed as "owls versus jobs" by a headline-hungry press, families whose livelihoods depend on the timber industry quite rightly ask how environmentalists could possibly think that saving some owls would be more important than feeding their children. Because of the owl's visual power, it is difficult for the public to remember that the owl signifies an entire forest ecosystem, comprising hundreds, perhaps thousands of species, on whose healthy functioning humans are also dependent. An endangered ecosystem is the issue, and some Congresspeople are working to revise the Endangered Species Act into an Endangered Ecosystem Act.

Analogously, environmental philosopher J. Baird Callicott has articulated Aldo Leopold's assertion that we must learn to develop a "land aesthetic" that goes beyond our naive visual dominance. Our uneducated reliance on vision leads us to value wilderness only when it is pretty. Instead we must learn to perceive much more complex ecosystems, even if such perception requires seeing beyond prettiness. Such an aesthetic will require education:

The land aesthetic is sophisticated and cognitive, not naive and hedonic; it delineates a refined taste in natural environments and a cultivated natural sensibility. The basis of such refinement or cultivation is natural history, and more especially, evolutionary and ecological biology. ... The beauty of a bog is a function of the palpable organization and closure of the interconnected living components. ...
Thus ... an autonomous natural aesthetic must free itself from the prevailing visual bias.15

In addition to the limitations imposed by our visual dominance, two more principles of our perceptual system are likely to give us GIGO. The first is the principle of selective attention. Even though we perceive an extremely small range of electromagnetic information, if we were to notice everything in that range, our experience of the world would be chaotic—a "blooming, buzzing confusion," to use William James' well-known phrase. Instead, our perception is quite selective. In order to make sense of the world, we must relegate large portions of it to "ground" as we focus on some "figure." We unconsciously make these decisions all the time. I do not notice my left foot stretching in front of my right as I am walking—instead I am concentrating on where I am going. Similarly, I do not notice the fossil fuel burning in my gas tank as I drive, the electric power feeding my computer as I write, or the trees that have been used to build the room in which I sit. We are cognitive misers, delegating attention only to those items that need it, and tuning out the rest.

In Chapter 3, we discussed the role of motivated selective attention—e.g., defense mechanisms—which enables us to avoid potentially anxiety-provoking stimuli. But even without the role of anxiety reduction, we fail to notice these and many other features of our environment because of a second principle of our nervous system: because of its hard wiring, habituation occurs when stimuli do not change. Our nervous systems are built to signal changes in our environment, rather than constancies. Stimuli that do not change quickly lose their ability to fire neurons in our nervous system; consequently, situational features that remain the same fade from our awareness, whereas those that change too slowly never reach our awareness at all. Like the frog that will jump out of a pot of very hot water if suddenly thrown in, but will allow itself to be boiled to death if placed in a very slowly heating pot, we humans will endure quite noxious environmental events if they are introduced gradually enough. The smog level of Los Angeles is a good example of the role of habituation. As researchers Ornstein and Ehrlich put it,

a visitor to the L.A. basin, arriving on a smoggy day, is often immediately appalled by the quality of air he or she is expected to breathe. But, as with many other constant phenomena, the locals hardly notice. A few years ago one of us arrived at John Wayne Airport in Orange County in the early evening to give a lecture. Every streetlight was surrounded by a halo of smog, and his eyes immediately began watering profusely. As a visitor from the (relatively) smog-free San Francisco area, he felt obliged to kid his host: "Well, at least we have a nice clear night for the lecture." His host's serious response: "Yeah—you should have been here a couple of weeks ago. We had a lot of smog then."16

These two processes—selective attention and habituation—are crucially important in producing the GIGO problem that many of us blame on the media.

13Maser, C., The Redesigned Forest (San Pedro, CA: R. & E. Miles, 1988). I am grateful to Shirley Muse for helping me with this reference.
14Ornstein and Ehrlich, ibid., pp. 83-84.
15Callicott, J. B., "The land aesthetic," in Chapple, C. K., and Tucker, M. E., eds., Ecological Prospects: Scientific, Religious, and Aesthetic Perspectives (Albany: State University of New York Press, 1994), pp. 178, 181.
16Ornstein and Ehrlich, ibid., p. 76.
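Habituation can be mimicked with a toy model in which the "neuron" responds to changes in a stimulus rather than to its absolute level. The sketch below is a loose illustration (the decay constant and the stimulus values are invented), but it reproduces the frog-in-the-pot pattern: a sudden jump produces a strong response, while the same final level reached gradually barely registers.

```python
# A toy habituation model: the response tracks *changes* in the
# stimulus and decays while the stimulus holds steady. The decay
# constant and stimulus values are invented for illustration.

def habituating_response(stimuli, decay=0.5):
    """Return a response trace that spikes on change and fades
    toward zero when the stimulus stops changing."""
    responses, previous, level = [], None, 0.0
    for s in stimuli:
        if previous is not None:
            level += abs(s - previous)   # novelty re-excites the response
        level *= decay                   # constancy lets it fade
        responses.append(round(level, 2))
        previous = s
    return responses

print(habituating_response([0, 0, 8, 8, 8, 8]))   # sudden smog: [0.0, 0.0, 4.0, 2.0, 1.0, 0.5]
print(habituating_response([0, 2, 4, 6, 8, 8]))   # gradual smog: [0.0, 1.0, 1.5, 1.75, 1.88, 0.94]
```

The visitor to Los Angeles is the first trace; the long-time resident is the second.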
Most of us are poorly informed about pressing environmental problems, catching only glimpses of fleeting stories quickly taken up and then dropped by the press, television, and radio. But media officials defend their poor coverage by pointing out that the public has a very limited attention span and will not attend to stories that do not change. The same old bad news—about population growth, about global warming, about resource depletion—does not sell papers or retain viewers. If there is nothing new, we will not pay attention. Yet most of us need far more intricate information than we are currently presented. Brief headlines that oversimplify issues ("owls vs. jobs" is a classic example) jeopardize our ability to make sophisticated decisions about complicated issues. When National Public Radio covered the confirmation hearings of Supreme Court nominee Stephen Breyer, Nina Totenberg demonstrated this problem. She described the Senate Judiciary Committee's questions inquiring into his previous rulings against environmental regulations as "arcane and esoteric." Breyer's pro-business, anti-environmental record seemed to me the only interesting element of the otherwise rubber-stamp Congressional hearings, yet NPR (as did other media) failed to address the environmental implications of his appointment. No wonder the public has a similar difficulty.

Irrelevant Information

Our information is often limited, but just as often we have the opposite problem: too much information produces GIGO, especially if the information confuses us. Unfortunately, many of our reasoning difficulties come from the use of irrelevant information. If problems are presented simply enough, we can usually come up with an appropriate answer, but unfortunately, life is usually not very simple. More often, we get distracted by irrelevant information. For example, consider the following set of statements (based on a problem that Henri Zukier17 presented to his laboratory subjects):

1. Assuming you have access to sunlight, heating a house with passive solar heat costs 30 to 40% less over the lifetime of the house than heating with conventional systems, such as electricity, gas, coal, or oil. If you want to save money on heating, which system should you choose?

2. You are planning your dream home, which you plan to live in for the rest of your life. Assuming you have access to sunlight, heating a house with passive solar heat costs 30 to 40% less over the lifetime of the house than heating with conventional systems, such as electricity, gas, coal, or oil. Passive solar systems add 5 to 10% to the construction costs. Some people believe that rooftop solar collectors are ugly and detract from the architectural design of a structure. In order to get the maximum efficiency from your passive system, you have to open and shut windows and shades to regulate heat distribution, although this could be accomplished by an inexpensive computer. Your access to solar energy could be disrupted if someone decided to build an interfering structure, since right now there are no laws to guarantee owner access to sunlight (such legislation has been opposed by builders of high-density developments). If you want to save money on heating, which system should you choose?

17Zukier, H., "The dilution effect: The role of the correlation and the dispersion of predictor variables in the use of nondiagnostic information," Journal of Personality and Social Psychology, 43 (1982): 1163-1174.
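Before reading the analysis that follows, notice that the second problem reduces to the same arithmetic as the first once the distractors are set aside. Here is a back-of-the-envelope sketch; every dollar figure in it is invented for illustration, and only the 30 to 40 percent savings and the 5 to 10 percent construction premium come from the problem text:

```python
# Isolating the money-relevant facts in the second problem above.
# The dollar figures are invented for illustration; only the 30-40%
# savings and 5-10% construction premium come from the problem text.

construction_cost = 200_000   # assumed cost of the dream home
annual_heating = 2_000        # assumed conventional heating bill
years = 40                    # "the rest of your life," roughly

premium = (0.05 * construction_cost, 0.10 * construction_cost)
savings = (0.30 * annual_heating * years, 0.40 * annual_heating * years)

print(f"Extra construction cost: ${premium[0]:,.0f} to ${premium[1]:,.0f}")
print(f"Lifetime heating savings: ${savings[0]:,.0f} to ${savings[1]:,.0f}")
# Even in the worst case ($20,000 premium vs. $24,000 savings), solar
# saves money; aesthetics and window-opening are beside the point.
```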
The irrelevant information (irrelevant to the question of saving money) contained in the second problem is likely to distract readers from the crucial sentence "heating a house with passive solar heat costs 30 to 40% less over the lifetime of the house than heating with conventional systems" and confuse the issue. Since most environmental questions are complex and involve many different considerations, it is difficult not to get waylaid.

The public's tendency to use irrelevant information has recently become a concern among policymakers whose job it is to convince the public to reduce energy use. Most energy-conservation programs are designed with the assumption that the public is primarily interested in saving money and will act rationally to do so. This assumption has been called the "rational-economic model." According to environmental psychologist Paul Stern, the rational-economic model rests on the "underlying behavioral assumption that technologies that will, over their useful life, save their owners and operators money will be adopted once the owners become aware of the benefits."18

But research shows that instead of using a purely rational-economic model for decisions about conservation issues, most people rely on a "folk model," which looks irrational to energy experts. In the words of social psychologist Costanzo and his colleagues:

The "folk model" typically used by individual consumers calculates current dollar savings as compared to preadoption expenditures and fails to reveal that the initial cost of the investment is paid back faster because of rising fuel prices. Thus, folk calculations based on naive and "irrational" assumptions cause consumers to make fewer energy-saving investments than an expert analysis would recommend. In addition, a variety of non-economic factors (e.g., style, status, performance, safety, comfort, and convenience) influence decision making and contribute to the apparent irrationality of conservation behavior.19

These additional dimensions are important to consumers, often more important than price. Thus, the concept of irrelevant information suggests the question "irrelevant to whom?" We will return to this question at the end of this chapter when we discuss risk assessment. For now, let me simply make the point that when experts (either policy analysts or cognitive psychologists) define a problem, they typically do it in narrower terms than laypersons do.

Advertisers often employ irrelevant information to increase the desirability of their products. The information printed on a plastic bag I recently received from a shopkeeper demonstrates this tendency (see Figure 6.1).

Figure 6.1
Environmentally Compatible Packaging
Nontoxic When Incinerated
Nonleaching in Landfills
All Inks Meet Federal Consumer Product Safety Act Regulation 1303
Recyclable
"because we care. . . ."

18Stern, P. C., "What psychology knows about energy conservation," American Psychologist, 47 (10) (1992): 1224-1232, p. 1224.
19Costanzo, M., Archer, D., Aronson, E., and Pettigrew, T., "Energy conservation behavior: The difficult path from information to action," American Psychologist, 41 (5) (1986): 521-528, p. 525.
Ink regulation and incineration are irrelevant information, since plastic bags in landfills are rarely burned at temperatures high enough to be nontoxic. Claiming the product is "recyclable" borders on the fraudulent, since at present less than 1 percent of plastic bags are recycled at all, and this company neither recycles bags nor uses recycled ones. "Reusable" would have been a more appropriate term. Inaccurate and irrelevant information, however, is displayed in an attempt to make this company appear environmentally conscious.

But we do not need the efforts of advertising to become confused by irrelevant information. More distressingly, there are two ways in which we actively pursue irrelevant information. One is called the confirmation bias. When testing our hunches against incoming data, we make the egotistical mistake of looking for confirming information rather than disconfirming information. For example, P. C. Wason presented his subjects with the number sequence 2, 4, 6, and asked them to discover the correct rule that described the pattern (in this case the correct rule is "any three increasing numbers").20 In order to discover the rule, they could generate other three-number sequences and ask the experimenter whether or not their new sequences fit the rule. When the subjects thought they had discerned the rule, they tried to name it. After testing out additional examples, most people confidently named a wrong rule (usually "add two to the previous number") because the examples they generated were meant to confirm their hypothesis rather than disconfirm it. Unless you actively seek a disconfirmatory example, you are unlikely to discover the correct rule. Try this little experiment on a few friends and you will see the principle more clearly. Seeking lots of confirmatory information feels good, but is not very useful.

We generally do not like to experience disconfirmations, so we do not seek them. Consequently, we tend to read material that confirms our views (I subscribe to High Country News, Utne Reader, and The Nation, rather than The National Review and Our Land, a Wise Use movement publication). I have also had to force myself to ask colleagues whose opinions I know are different from mine to read drafts of this book. Their reactions are much more valuable, though not as comfortable, as responses from colleagues who already agree with me.

In addition to succumbing to confirmation biases, we tend to seek out irrelevant information because of our need to believe that we have some control over our world. We like to think that our actions have impact, and we are prone to overinterpret our behavior; hence, we overinterpret random events in terms of illusory correlations. Some people would consider the field of astrology, for example, and its data on planet positions, as useless information: completely chance or random events incorrectly interpreted by human beings seeking meaning. In the arena of environmental concerns, the principle of regression toward the mean leads us to interpret essentially random events as meaningfully related to some human action. Here is how it works: chance events fall on a normal curve, with extreme occurrences far less likely than more typical ones.

20Wason, P. C., "On the failure to eliminate hypotheses in a conceptual task," Quarterly Journal of Experimental Psychology, 12 (1960): 129-140.
Because extreme events are rare, the next event is likely to be less extreme, simply due to chance. For example, extremely hot days are more likely to be followed by cooler ones than by equally hot ones, simply by chance alone. But human beings, looking for meaning, are prone to explain such occurrences in terms of human actions. In the words of environmental educator Kai Lee:

The significance of regression artifacts in environmental science may seem less important than in the allegedly "softer" social sciences, until we recall that environmental remedies are applied precisely because some aspect of the environment is in an extreme state. Regression to the mean predicts, for example, that after a species has been declared endangered it will tend to become more abundant. This is not an effect at all, but a reflection of the fact that the human decision to declare a population in bad trouble is based upon its being in extremis. To the extent that that condition is caused by a variety of factors—as is virtually always the case in the natural setting—some of them will fluctuate in the next year, and the fluctuations will on average tend to bring the population up. In the early years of the Columbia Basin program, before any of the rehabilitation measures could be carried out, there was a resurgence of salmon populations from the historic lows of the late 1970s, when the Northwest Power Act was passed. It took a special effort of political will not to take credit for this change, even though there was as yet no cause to which such an effect could be attributed.21

In other words, because human beings are prone to make up explanations for random events, we can easily misinterpret our actions as causing something that is happening by chance alone. Failure to recognize regression toward the mean is at the center of scientific debates about global warming. Are recent temperature rises due to human-caused increases in carbon dioxide, or are they part of a randomly occurring pattern of temperature changes? At present, scientists disagree on which explanation is most likely.

Information Processing: The Constraints of Faulty Programs

GIGO is not our only information-processing problem. Even when we do have comprehensive, relevant, and accurate information, we often have difficulty reasoning well. Cognitive psychologists have shown that what we do with information is often problematic because we commonly make several types of processing errors: we overuse our preconceptions and expectations, we employ illogical reasoning strategies, and our quantitative literacy is often limited. Let us look at each of these kinds of errors.

Preconceptions and Expectations

Much as we might want to appear unbiased, total open-mindedness is not only impossible but also unintelligent. Our learning and experience give us a framework for interpreting information and situations, so that meaning is dependent on our pre-existing beliefs. The world would be totally chaotic without some pre-existing biases, or to paraphrase best-selling educator Allan Bloom, too much open-mindedness would make our brains spill out!22 But while pre-existing beliefs are necessary, they can, of course, also get in our way. Cognitive psychologists have shown how prone we are to be fooled by our biases and expectations.

21Lee, K. N., ibid., pp. 71-72.
22Bloom, A., The Closing of the American Mind: How Higher Education Has Failed Democracy and Impoverished the Souls of Today's Students (New York: Simon and Schuster, 1987).
For example, a classic study by Adelbert Ames demonstrated how easy it is to perceive a full-grown adult as smaller than a nursery school child, if both are placed in a trapezoidal room. Even when we are told that the room is trapezoidal, we perceive it as rectangular because we expect it to be rectangular (Fig. 6.2).

Figure 6.2. Misperception of size is due to the assumption of a regular room. The room is trapezoidal in shape; the boy is actually much closer than the adult. Source: Copyright Baron Wolman 1982.

Similarly, when social psychologists showed a film segment of an incident in Lebanon involving the killing of civilian refugees, both pro-Israeli and pro-Arab audiences saw the portrayal as biased against their side.23 And football fans have been shown to interpret the same call differently depending on its outcome for their team.24 Unsurprisingly, pre-existing biases play an important role in debates about environmental issues by potently affecting our perception and interpretation of an event. Consider how two different people interpreted the same traffic accident on I-95 in Springfield, Massachusetts, involving a tractor-trailer carrying 11,000 pounds of radioactive uranium that overturned and burned:

A representative of the antinuclear group Nuclear Information and Resource Service [said] that "People should be plenty concerned," since the accident signaled more trouble in the future: "Accidents happen at the same rate to nuclear shipments as for all other shipments—one per every 150,000 miles the truck travels." In contrast, a representative for the U.S. Council for Energy Awareness, which is supported by the nuclear industry, took the accident as a signal of assurance: "The system works," he said. "We had an accident including fire and there was no release of radioactivity."25

Given humans' proclivity for interpreting events to fit their pre-existing biases, it is no wonder that discourse about environmental problems is often divided, if not derisive.

Heuristics as Illogical Reasoning Strategies

In order to think about and comprehend a complex world, people depend on reasoning heuristics. A heuristic is any reasoning device that helps us to think quickly and efficiently, like a rule of thumb that allows us to function effectively because it usually works. On my campus, male administrators and faculty can usually be recognized from a distance because the former wear coats and ties, but the latter do not. But heuristics do not always work: they often lead us into wrong snap judgments.

23Vallone, R. P., Ross, L., and Lepper, M. R., "The hostile media phenomenon: Biased perception and perceptions of media bias in coverage of the 'Beirut Massacre,'" Journal of Personality and Social Psychology, 49 (1985): 577-585.
24Hastorf, A. H., and Cantril, H., "They saw a game: A case study," Journal of Abnormal and Social Psychology, 49 (1954): 129-134.
25Cvetkovich, G., and Earle, T., "Environmental hazards and the public," Journal of Social Issues, 48 (1992): 1-20, p. 8.
For example, consider the following problem, which I have formulated on the basis of cognitive psychologists Tversky and Kahneman's now classic experiment26:

John is a 31-year-old white male, single, outspoken, and very committed to environmental issues. He and his friends have demonstrated in many confrontational protests over logging, mining, and land-use operations. Which statement is more likely:

1. John is a bank teller
2. John is a bank teller and a member of EarthFirst!

Most people think statement 2 is more likely because of what cognitive psychologists call a representativeness heuristic. Because the description fits the stereotype of an EarthFirstler, most people choose statement 2. But the conjunction of two events can never be more likely than either event alone. That is, any one event occurs more frequently alone than it occurs with another event. If, say, 2 percent of men fitting John's description are bank tellers, the fraction who are bank tellers and EarthFirst! members must be smaller still. Thus, statement 1 is more likely. Representativeness heuristics lead us to reason poorly, and our language often betrays our poor reasoning. The frequency with which conservatives refer to environmentalists as "radical environmentalists," for example, suggests that their mental representation of an environmentalist does not include room for moderates or conservatives.

A representativeness heuristic can be created by one vivid experience, which produces a memory so strong that only it is available when we encounter that category. When this happens, it is called an availability heuristic. For example, recently I had the pleasure of meeting the parents of a student at my college in a downtown cafe. She introduced me to her parents, who were very pleasant, and we got to talking about a recent lecture we had all attended given by Denis Hayes, founder of Earth Day. I mentioned that I had just written about much of the same material in the first chapter of this book (the earth's limited carrying capacity, the problems of overpopulation in the Third World, and overconsumption by the industrialized countries). Her father expressed an enthusiastic interest in my book, and asked me to let him know when it comes out, as he wanted very much to read it. As he handed me his card, I remember swallowing my shock: he is a high-ranking officer of the Weyerhaeuser Company, the largest timber company in the country.

I found it surprising that he would be so pleasant and so genuinely interested in reading my book. Why? Because I too, of course, rely on my faulty mental images. When confronted with new information, I compare it to a mental image I have already formulated about that category. My mental representation of Weyerhaeuser officials was based on one presentation by a forest management executive 5 years ago, whom I experienced as rigidly technocratic, since his lecture was filled with endless tables, which he used to try to prove that Weyerhaeuser engages in sustainable forestry. I was unconvinced by his numbers and by the way he handled questions from the audience. Consequently, I believed that all Weyerhaeuser officials would have the same rigid and technocratic manner. I formed an entire mental category from one vivid example, and used it to interpret new information—an availability heuristic.

26Tversky, A., and Kahneman, D., "Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment," Psychological Review, 90 (1983): 293-315.
I was amazed by the warmth and interest of this new person with whom I was lucky enough to chat before I could apply my faulty availability heuristic. As David Myers describes in his social psychology textbook, "people are slow to deduce particular instances from a general truth but are remarkably quick to infer general truths from a vivid instance."27

It is possible, however, to use the availability heuristic positively, which Myers points out was done by researcher Marti Gonzales and her colleagues when persuading people to sign up for financing for energy conservation home improvements:

They trained California home energy auditors to communicate their findings to homeowners in vivid, memorable images. Rather than simply point out small spaces around doors where heat is lost, the auditor would say "if you were to add up all the cracks around and under the doors of your home, you'd have the equivalent of a hole the size of a football in your living room wall." With such remarks, and by eliciting the homeowners' active commitment in helping measure cracks and state their intentions to remedy them, the trained auditors triggered a 50 percent increase in the number of customers applying for energy financing programs.28

By drawing a vivid image of a "hole the size of a football in your wall," these researchers were much more convincing than when speaking in more abstract, conceptual terms. In both cases, the information is the same, but the more vivid example is more persuasive because it creates a more memorable image.

Quantitative Illiteracy

Both our preconceptions and our illogical reasoning heuristics weaken the performance of our information-processing systems, and make us look, at least to the "rational experts," like pathetically irrational beings. And if all these examples were not enough, our quantitative illiteracy makes us look even worse. Unless we have some technical training in a quantitative field, most of us have difficulty conceptualizing very big and very small numbers. A billion may as well be a trillion or a gazillion—we do not deal in these kinds of numbers often enough to have a well-developed understanding of their differences. Consequently, when environmental problems are described in quantitative terms, many of us lose track of the numbers and reason poorly. For example, the estimated cost of cleaning up the Hanford Nuclear Reservation's plutonium, toxic chemicals, heavy metals, leaking radioactive waste tanks, groundwater and soil contamination, and seepage into the Columbia River is $30 to 50 billion.29 How does this number compare to the U.S. deficit? To the cost of Social Security? To federal expenditures on education? Most of us have no idea. In each case, we have probably heard the figures at some time, but they are too big to be meaningfully related.30 Consequently, it is difficult for us to make good decisions about environmental clean-up relative to other societal projects.

27Myers, D., Social Psychology, 4th ed. (New York: McGraw-Hill, 1993), p. 55. I gratefully acknowledge relying on Myers' thoughtful second chapter on social beliefs for much of the material in this section on reasoning problems.
28Myers, ibid., p. 56, describing a study by Gonzales, M. H., Aronson, E., and Costanzo, M. A., "Using social cognition and persuasion to promote energy conservation: A quasi-experiment," Journal of Applied Social Psychology, 18 (1988): 1049-1066.
29Seager, J., Earth Follies: Coming to Feminist Terms with the Global Environmental Crisis (New York: Routledge, 1993), p. 35.
30The figures are: the 1994 U.S. deficit was $280 billion, the cost of Social Security was $320 billion, and expenditures on education were $50 billion. Source: Statistical Abstract of the United States, 114th ed. (Washington, DC: U.S. Department of Commerce, Economics and Statistics Administration, Bureau of the Census, 1994), Table 507, p. 332. For more comparisons of military vs. environmental expenditures, see Figure 3.3.
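One remedy for big-number blindness is to put such figures on a human scale. The sketch below uses the Hanford estimate from the text and the 1994 budget figures from footnote 30; the U.S. population of roughly 260 million in 1994 is an added, approximate figure:

```python
# Putting the chapter's big numbers side by side. The dollar figures
# come from the text and its footnote 30; the 1994 U.S. population of
# roughly 260 million is an added, approximate figure.

population = 260_000_000
figures = {
    "Hanford cleanup (low estimate)": 30e9,
    "Hanford cleanup (high estimate)": 50e9,
    "1994 federal deficit": 280e9,
    "Social Security": 320e9,
    "Federal education spending": 50e9,
}

for name, dollars in figures.items():
    print(f"{name}: ${dollars/1e9:.0f} billion, "
          f"about ${dollars/population:,.0f} per person")
```

On a per-person scale, even the high Hanford estimate is comparable to a single year of federal education spending and well under a single year's deficit, a comparison that is nearly impossible to make while the numbers stay in the billions.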
Similarly, our pro-environment behaviors are often undertaken with less than optimal results because we do not understand their quantitative dimensions. For example, electric lights use about 5 percent of home electricity. In the 1970s the energy crisis induced many people to conserve energy by being very conscientious about turning off lights. When their behaviors failed to show any impact on their electricity bills, people gave up trying to save energy altogether. Unfortunately, they did not realize that home heating and cooling use 50 to 70 percent of domestic energy, so that turning down one's thermostat in the winter and turning it up in the summer would save far more energy than conscientiously turning off lights. Choosing energy-efficient major appliances is the single most important class of behaviors; refrigerators alone use about 19 percent of household energy (a back-of-the-envelope sketch at the end of this passage shows the arithmetic). Likewise, many people are quite conscientious about recycling but less aware that reducing use is a far more effective way to save natural resources. For example, buying products in refillable plastic containers (such as shampoo from local health food stores) is far more important than recycling plastic bottles.

Inexperience with quantitative information makes us susceptible to framing effects, which are induced when the same information is structured in different ways. For example, consumers prefer ground beef described as "75% lean" over that described as "25% fat," and more students rate a condom as effective if it has a "95% success rate" in stopping AIDS than if it has a "5% failure rate."31 Moreover, people appear to be more motivated to avoid a loss than to achieve a gain. For example, Suzanne Yates showed that people were more likely to invest in a water-heater wrap if it was presented as a way to avoid losing money rather than as a way to save it!32

In summary, the picture that cognitive (and social) psychology paints of human beings is not an especially attractive one. We are easily duped by our prejudices, our heuristic errors, our need to justify our actions, and our inexperience in quantitative matters. We like to think of ourselves as rational and open-minded, but research shows that we are anything but. There are already emotional reasons to leave environmental decisions to the experts because of our anxieties and our defenses. Our intellectual limitations simply add another excuse. Since all this seems to undermine the subtitle of this book, healing the split between planet and self, let me add a very important point here. Although we are all easily deceived by these processes (Amos Tversky once said: "we didn't set out to fool people; all our problems fooled us, too"33), we do not have to be. Instead, we can learn to avoid these errors by being taught how they function as well as how to use counteracting strategies.

31Myers, D., Psychology, 4th ed. (New York: Worth Publishers, 1995), pp. 335-336.
32Yates, S., Using prospect theory to create persuasive communications about solar water heaters and insulation. Unpublished doctoral dissertation, University of California, Santa Cruz.
33Quoted by Myers, D., Psychology, 3rd ed. (New York: Worth Publishers, 1992), p. 292.
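The lighting example above shows how a little arithmetic can rescue a good intention (this is the sketch promised earlier). In the following Python sketch, the percentage shares come from the text, treated for simplicity as shares of one household total; the annual usage figure and the assumed behavior changes are hypothetical numbers chosen only for illustration:

```python
# Hypothetical household using 10,000 kWh per year; the percentage
# shares come from the text, while the behavior changes are made up.
annual_kwh = 10_000

lighting_share = 0.05         # lights: about 5 percent
heating_cooling_share = 0.60  # heating and cooling: midpoint of 50-70 percent

# Heroic vigilance: cut lighting use in half by switching lights off.
lights_saved = annual_kwh * lighting_share * 0.50
# Modest habit: a thermostat setback that trims heating/cooling by 10 percent.
thermostat_saved = annual_kwh * heating_cooling_share * 0.10

print(f"Lights off half the time: {lights_saved:.0f} kWh saved per year")
print(f"10% thermostat setback:   {thermostat_saved:.0f} kWh saved per year")
# 250 vs. 600 kWh: the invisible habit wins, which is why the
# light-switchers saw no change in their bills.
```

Doing such arithmetic is itself a counteracting strategy; so is deliberately taking up the other side of an argument.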
For example, in a classic study of pre-existing biases, Stanford University students were given mixed evidence on the deterrence effect of the death penalty. They interpreted the evidence to fit their pre-existing attitudes about capital punishment, showing clear distortions in their perceptions and reasoning. However, when they were subsequently asked to imagine that the evidence supported the opposite position (to see the merit of the opposing side), they became much less slanted in their evaluation of the evidence. Thus, explaining the opposite view can reduce our proclivity for prejudice and bias.34

In my own experience of teaching psychology, I have seen students quickly become sensitized to the natural tendency to commit these reasoning errors. I like to demonstrate in class how the errors work and how easy they are to commit by giving students problems to work on, but I have to do it before they read the chapter in which the reasoning problems are described—otherwise, students will avoid making them. What this means is that we can easily learn to reason more effectively if we are taught to avoid our naive errors.

Using Cognitive Psychology to Solve Environmental Problems

The main message of cognitive psychology is that our inappropriate environmental behaviors are due to inadequate, mistaken, distorted, or missing information about the consequences of our actions. Some of this impoverished information stems from intentional efforts on the part of advertisers, government officials, and military establishments to keep us ignorant about the impacts of our actions; but a lot of our difficulties also come from our own illogical and self-defeating cognitive limitations, which lead us to ignore and/or mishandle important information. Below I have listed some of the typical errors discussed in this chapter. Before we discuss a specific example of how to work with them, you might find it useful to review their meaning now.

Perceptual Information Errors        Cognitive Information Errors
Limited time frame                   Confirmation bias
Visual dependence                    Illusory correlation
Selective attention                  Expectations
Habituation                          Availability heuristic
Irrelevant information               Quantitative illiteracy
                                     Framing

34Lord, C. G., Lepper, M. R., and Preston, E., "Considering the opposite: A corrective strategy for social judgment," Journal of Personality and Social Psychology, 47 (1984): 1231-1243.

To demonstrate how these insights can be used to confront our environmental difficulties, let us return to the problem of overconsumption. Consider how frequently these types of errors lead us to make environmentally inappropriate choices of consumer items. Most of us have an extremely limited time frame and focus on the short-term utility of our purchases. Even if we do become more sophisticated about time span and consider the lifetime of the product, say, when we purchase a car or a refrigerator, the extended time frames are still much too short for intelligent behavior regarding our planetary predicament.
Perhaps a more appropriate time frame would be that used by the Iroquois, a Native American people said to practice decision-making in terms of the "7th generation": will this choice benefit or damage the 7th generation of my descendants?35 Although it is not clear exactly what decisions the Iroquois made on this criterion, it is clear, from our own culture, that most decisions we make about environmental issues use a much shorter time frame than what is needed. On both the personal and the policy level, we rarely think beyond a few years into the future. As we discussed in Chapter 5, elected decision makers are concerned with re-election on a 2- to 6-year cycle, which makes their commitment to longer-range solutions difficult to maintain.

Related to the problem of shortened time frame is the problem of visual dependency. We are seduced into thinking that what is pretty is what is good, because we have so little information about the invisible effects of what is pretty. A beautifully landscaped golf course is environmentally harmful because of the water and chemicals expended to maintain it. People are more apt to vote to protect photogenic wilderness than wilderness that is not photogenic, like a bog; most people cannot "see" the biodiversity dependent on the bog. And our dependence on fashion drives an enormous amount of overconsumption, so that we procure the latest and the newest rather than the most long-lasting or the most environmentally friendly. Part of our need for the new is due to the process of habituation: we grow tired of what we are used to and long for the stimulation of something new. Making better choices will require decisions based on sustainability: intentional focus on longer time frames, on hidden effects, and on learning to value what is familiar more than what is new.

For example, our consumer choices are continually fed by poor information, which we do little to counteract. A common example is our choice of food. Most of us have very little idea where our food is grown, and would be dismayed to learn how much wasted energy, especially fossil fuel, went into getting it onto our grocer's stand. Our ignorance makes it seem as if the food just appears there, and as if the costs of its production and waste products are innocuous, or covered by the price of the food. Just the opposite is true. I live in one of the most highly productive areas in the world for growing apples (Washington state). However, the apples available to me in the chain grocery stores are only apples that have been shipped thousands of miles; local apples are not available because they are being shipped to other regions. Apples seem like a responsible choice of food because I know that they are locally produced; the availability heuristic misleads me into thinking that so are the apples in my local supermarket.

In more general terms, the way we frame food choices helps keep us ignorant of the environmental consequences of those choices. When selecting foods at the store, we generally consider price, menu, quality, and convenience. More recently, governmental agencies have required food labeling to inform the consumer of the nutritional dimensions of those choices: fat content, calories, and chemical additives are now listed.

35Lyons, O., "The Iroquois perspective," in Willers, B., ed., Learning to Listen to the Land (Washington, DC: Island Press, 1991).
No information is yet made available, however, about relevant environmental concerns: the number of gallons of gasoline used to grow and distribute the food product; the number of people injured by pesticide application; and whether the food was produced in the United States or on foreign soil, where environmental regulations are more lax. Reframing food choices in these terms would lead to the obvious conclusion that buying strawberries in January is a less appropriate action than buying them at the local farmer's market in June.

Making more appropriate consumer choices on food and other products will not be easy, especially as we are intentionally kept ignorant of the environmental consequences of our choices by advertising and industry. Asking questions about the environmental costs of certain products often gets one vague, confusing, or inadequate answers. For example, I decided to ask some questions about a regular purchase I make every month: coffee. Living in the Pacific Northwest, I have become something of a coffee addict, and I enjoy drinking my morning latte (espresso and steamed milk) as one of my favorite activities of the day. While researching this chapter, I ran across some material describing the horrendous conditions under which many Latin American coffee growers work. Forced off their small farms by escalating debt, many peasants now labor on coffee plantations that pay poorly, force workers to endure harmful agricultural chemicals, and provide inadequate housing. Moreover, large coffee plantations typically entail huge monocrops; trees that do not produce coffee are removed. Such trees provide nesting places for bird populations, so coffee production also undermines biodiversity. I wanted to know whether my coffee company bought directly from farmers or from plantations, how much of its price went to the growers, whether it could get organic coffee, and whether the coffee was grown in diverse crops along with shade trees. After five long-distance phone calls, I finally had one returned. The answers to my questions involved a lot of complicated information about buyers, markets, coffee trade agreements, etc. I finally learned that my coffee company buys the best quality beans for the lowest price possible, meaning that it buys from large plantations that grow coffee in monocrops with pesticides. I also learned that the company puts 4 percent of its profit back into social programs. Unfortunately, this small percentage of company profits goes toward solving social problems that the company itself helps create. I have decided to order from an Alternative Trade Organization next time; Alternative Trade Organizations (see Appendix) buy coffee directly from farmers so that they can remain on their own land and make a decent living. These small farms retain the shade trees and produce coffee without pesticides that harm workers.

Demanding better information is only part of our problem. What we do with the information we do have is probably an even bigger part. Because of strong biases to retain basic core beliefs, we are likely to distort, diminish, or otherwise degrade information that contradicts them. Forcing ourselves to argue for an opposing viewpoint is a critically important strategy for avoiding such reasoning errors. Recall the Stanford study in which students were asked to argue against their own position on the death penalty.
When they did so, they were able to evaluate relevant evidence far more objectively. Arguing the opposite viewpoint is not easy, but its effects are valuable in exposing our reasoning errors. Pick an environmental issue about which you have a strong opinion and put yourself through the same exercise. For example, I strongly believe that we ought not to allow oil and gas development in the Arctic National Wildlife Refuge (ANWR) in Alaska. The coastal area to which oil and gas companies have asked Congress for access is home to several hundred animal species and 7000 indigenous people who live sustainably in that complex ecosystem. Degrading this priceless refuge for an unknown supply of nonrenewable energy seems unnecessary to me, a matter of sheer greed on the part of oil and gas companies. Forcing myself to argue the opposite viewpoint has disclosed several reasoning errors I have made with the information I have, as well as gaps in that information. Let me run my counterbelief argument past you, so you can see my cognitive difficulties. Here is my best effort at arguing that the oil companies should have access to the Alaska refuge:

The United States needs to develop its domestic oil production. Currently we import about half our oil, principally from the Middle East. Look at what our dependence on the Persian Gulf led us to in January of 1991: a war with Iraq. If we calculate the cost of imported oil to include the military costs we incur to secure it, we pay about $100 a barrel, over three times the market price. Developing our own reserves would produce much cheaper oil and alleviate our dependency on a politically dangerous neighborhood of suppliers who can raise prices at will. Our need for oil will only increase in the next decades as our population and use of automobiles continue to climb. The Department of Energy estimates that by the year 2000 we could be importing 60 to 70 percent of our oil, which would increase our debt and lead to severe inflation, widespread economic recession, and perhaps even a major depression. Drilling for oil in the ANWR will help ensure our national security—by strengthening our economy as well as reducing our involvement in the politically volatile Middle East. Preventing the exploration is an example of how the federal government has failed to develop a coherent energy policy for the next decades.

First, forcing myself to make the counterargument has illuminated the relationship between energy consumption and military costs. When President Bush claimed that Iraq's invasion of Kuwait "threatened our American way of life," he was referring to our enormous dependence on oil from the Middle East. Limiting our own production of oil increases our military involvement in this dangerous geographical area. I had not seen the military implications of the ANWR before making this counterargument. I had framed the issue as one of oil company greed versus wildlife and indigenous culture preservation, the two images available from media representations of the debate. Forcing myself to make the counterargument makes it easier for me to see the military and national security issues that are also relevant, though rarely discussed. Second, the problem of a federal energy policy is a real one, I think, and I appreciate the oil companies' frustrations with a chaotic series of rules and regulations that subsidize some parts of the industry while denying growth to others.
In part because of President Bush's previous work as an oil company executive, I had thought of the oil industry and the federal government as far too chummy. I pictured lavish business lunches where government and oil officials work out cozy arrangements to profit the oil industry. President Bush triggered an availability heuristic that made me overestimate the closeness between federal and oil company administrators. Arguing the oil companies' position on the ANWR forced me to see the gap between government and oil officials and to appreciate that the latter feel they have been discriminated against on this aspect of their industrial development. As long as I had access only to my own point of view, it was easy to minimize the differences between it and the points of view of others.

Third, I had pictured drilling for oil as killing land animals and dislodging people from their native land. Actually, the proposed drilling would take place offshore, where the greatest deposit of oil appears to be. Such drilling would cause some environmental damage, especially to the marine environment, but not the sort I had pictured, which was on-land drilling.

After going through this exercise, I am not convinced of the safety or wisdom of opening up the ANWR, but I am more open to listening to the arguments in its favor. Furthermore, I have a better sense of the information I need to make an informed decision: How much would drilling really reduce our dependency on the Middle East? Would our national security be threatened because foreign military sabotage of the Alaska pipeline is easier to accomplish than interrupting our supplies from the Middle East? What kind of marine wildlife damage would occur? What would happen to the indigenous cultures? These are questions that did not occur to me before I forced myself to argue the opposite position. I also notice how easy it is for me to use the availability heuristic and to minimize the differences between my viewpoint and others' on topics about which I have strong opinions. Basically, I am prone to overgeneralizing, and I can become a more thoughtful and effective individual by being vigilant about that tendency.

To summarize, from a cognitive viewpoint we can change our environmentally inappropriate choices by getting better information about the effects of our actions. Noticing that institutional structures intentionally keep us naive requires that we begin asking better questions and pursuing uncomfortable answers. All institutions have some vested interest in their own point of view, and intentionally or unintentionally distort information in order to maintain it. This is no less true of environmental groups than it is of businesses, governments, military organizations, or local landowners. In general, we need to get better information and make better judgments about it. We can do this by:

1. Asking difficult questions about environmental issues. Pursuing answers, even when they are not forthcoming. Learning more about the environmental consequences of our actions, especially our consumer choices. A number of good guides are now available that give information about the environmental impact of consumer goods. (I have described some of them in the Appendix.) Expressing our preferences to store managers for organic, nonpolluting, locally produced products, and for bulk items without unnecessary packaging. Shopping at locally owned farmers' co-ops.
Choosing products with as little packaging as possible.

2. Forcing ourselves to make a counterargument in order to discover our reasoning errors. Noticing the reasoning errors we are most likely to make, and keeping vigilant about them. Being willing to admit that our information and/or reasoning is flawed, and being open to learning more about our limitations.

3. Being confident enough about our intelligence to learn more about complicated environmental issues, and refusing to leave them entirely to experts.

This last point is so important to healing the split between planet and self that I want to spend the rest of the chapter looking at it in more detail. If we are to use the insights of cognitive psychology effectively, we will have to examine where they have been used against us.

Risk Assessment: Whose Quantification Problem Is It?

One of the arenas in which the public's poor reasoning ability has been most frequently noted is in public attitudes about environmental hazards. Since the publication of Rachel Carson's Silent Spring in 1962,36 which persuasively documented the toxic effects of chemical pesticides (such as DDT) on air, water, and wildlife, public concern about environmental hazards has continuously grown, becoming an important issue in government and industry circles. The Environmental Protection Agency (EPA) was founded in 1970, shortly after the National Environmental Policy Act (NEPA) was passed in 1969. NEPA requires formal environmental assessments by any federal agency proposing actions that have environmental impact. The EPA is charged with enforcing over 9000 environmental regulations and protections, and its jurisdiction has steadily grown as federal regulations on air, water, and endangered species protection have been passed. In 1991, the EPA employed about 17,000 people, over three times the original workforce that formed the agency in 1970.

Both NEPA and the EPA supply an important market for the form of applied cognitive psychology called "risk assessment." Risk assessment involves four factors: identifying hazards, estimating probabilities of damage, reducing risks, and communicating risks to the public. From the start, a salient issue in risk assessment has been the discrepancy between what the public and what the experts regard as hazardous. For example, a 1979 study by cognitive psychologists Slovic, Fischhoff, and Lichtenstein37 showed that two amateur groups—the League of Women Voters and college students—judged the seriousness of 30 hazards significantly differently than did a group of experts. The following table lists some of the more striking discrepancies. Note especially the first item, nuclear power. Whereas the two nonexpert groups rated it as the highest risk among the 30 hazards (indicated by the number 1), the experts rated it quite low (20th of 30). The public thus fears nuclear power much more than the experts do.

36Carson, R., Silent Spring (Boston: Houghton Mifflin Co., 1962).
37Slovic, P., Fischhoff, B., and Lichtenstein, S., "Rating the risks," Environment, 21 (1979): 14-20, 36-39.
Judged Rank of 30 Hazards (low rank = high risk)

Hazard                 League of Women Voters   College Students   Experts
Nuclear power                    1                      1             20
Pesticides                       9                      4              8
Motor vehicles                   2                      5              1
Hunting                         13                     18             23
Skiing                          21                     25             30
Mountain climbing               15                     12             29
Electric power                  18                     19              9

From Gifford38

In related studies, these researchers have also shown that public fears are shaped by the availability heuristic: we tend to overestimate the incidence of infrequent events (such as botulism) and underestimate the incidence of frequent ones (such as automobile accidents). We remember the extremely unusual if it gets much press, but we tend to forget, and thus underestimate, the likelihood of more commonplace, less publicized occurrences. Consequently, we may fear unlikely hazards more than likely ones.39

Data like these are frequently cited by risk management professionals as documentation of the public's ignorance, even irrationality, about risk issues. In the words of environmental psychologists Cvetkovich and Earle:

The public's frequent divergence from the conclusions of technical risk assessments has been said to reflect judgment "biases," and to indicate inconsistency and perhaps an inability to understand complex scientific and technical issues. Characterized in this way, public reactions to hazards are seen as a reflection of a general "scientific illiteracy." Proposals for better science education often follow in the wake of such characterizations. Other commentators have painted even darker images of the public as basically irrational and suggested that public responses should be excluded from hazard management decisions altogether. Public opposition to the development of nuclear power has often been described in this way.40

By illuminating the common reasoning errors of human beings, cognitive (and social) psychologists run the danger of disempowering the public. As Fischhoff has noted, "psychologists can contribute to a sort of disenfranchisement—by reducing the perceived need to let the public speak for itself."41 One does not need to look far to find instances of public officials dismissing the public's credibility or its ability to reason. A former director of the Hanford Nuclear Reservation in Washington State once "described public concerns about nuclear power development as 'mass paranoia.'"42 During William K. Reilly's confirmation hearings as director of the EPA, New York Senator Daniel Patrick Moynihan warned him: "Above all, do not allow your agency to become transported by middle-class enthusiasms." Reilly claimed to interpret this to mean "pay attention to science; don't be swayed by the passions of the moment."43 However, it could just as easily have been interpreted to mean "Don't let public concern dissuade you from what your experts tell you." To some degree, it probably was. Although I do not mean to imply that the public is always right, I do believe that the tendency of many experts to disparage public viewpoints is a troubling feature of professional hazard assessment. Risk assessment can be useful, but it can also be used to justify harmful environmental practices to which the public is subjected without consent.

38Gifford, R., Environmental Psychology: Principles and Practice (Boston: Allyn and Bacon, 1987), p. 254.
39Fischhoff, B., "Psychology and public policy: Tool or toolmaker?" American Psychologist, 45 (5) (1990): 647-653.
40Cvetkovich and Earle, ibid., 1-20.
41Fischhoff, ibid., p. 647.
42Cvetkovich and Earle, ibid., p. 5.
43Reilly, W. K., "Why I propose a national debate on risk," EPA Journal (March/April 1991). Reprinted in Goldfarb, T. D., ed., Taking Sides: Clashing Views on Controversial Environmental Issues, 5th ed. (Guilford, CT: Dushkin Publishing Group, 1994), p. 93.
For these reasons, it is crucial for the public to understand the rubric of risk assessment so that it can retain its voice in policy decisions. It is also imperative that risk experts see their work in the context of people's daily lives. For this, they need the public's voice. Although the field of risk assessment becomes quantitatively complicated very quickly, the discrepancy between public and expert estimates of risk can be understood in basic terms. And this basic difference is at stake in more than risk assessment—it is also fundamental to the way environmental issues are addressed in our society. At bottom, it lies at the heart of our split between planet and self. Let me show how.

Most fundamentally, the public and the experts use different definitions of risk. Professionals usually define risk as the number of deaths caused by a hazard in 1 year (or some other time unit): 150,000 deaths per year from smoking, 17,000 from handguns, 100 from nuclear power. Thus, by this index, smoking is said to be 1500 times more risky than nuclear power. Number of fatalities per year is easy to count, and thus easy to think about. When one has a quantitative background, it is especially easy to think that everything important can be quantified. This illusion is furthered by complicated formulas that estimate exposure rates, event probabilities, financial costs, and other effects. For example, a recently developed computer program called "Demos" can weigh the number of excess deaths caused every year by a hazard against the price of regulatory controls and the dollar value assigned to one death. Demos, like other risk-assessment programs, requires that a number be given for the value of one human life. Social costs are then calculated as the sum of the control costs and the mortality costs. Because of this kind of quantitative effort, some people, such as former EPA policy analyst Ken Bogen, call risk assessment a form of "probabilistic cannibalism"44 that trades lives for dollars.

Abstract numbers can hide the effects of environmental hazards; as risks become quantified, their social dimensions get lost. For example, consider the issue of pesticides. In the words of environmental studies expert G. Tyler Miller,

According to the National Academy of Sciences, pesticides account for 2.1% of all U.S. cancer deaths each year. That means that pesticides licensed for use in the United States legally kill about 10,000 real, but nameless, Americans a year prematurely from cancer, without the informed consent of the victims.45

People who are harmed by environmental hazards are often uninformed and nameless. They are also unequally distributed among the population; there is extensive evidence that environmental risks are incurred more often by lower-income groups, minorities, and children.46 These considerations disappear, however, when risks and risk outcomes are quantified. Furthermore, these considerations are likely to be seen as irrelevant by risk analysts. But they are not irrelevant to the public. The public often uses a different definition of risk than do experts.

44Quoted by Miller, G. T., Living in the Environment: An Introduction to Environmental Science, 7th ed. (Belmont, CA: Wadsworth Publishing Co., 1993), p. 552.
45Ibid., p. 552.
46Opotow, S., and Clayton, S., "Green justice: Conceptions of fairness and the natural world," Journal of Social Issues, 50 (1994): 1-12; also see Laituri, M., and Kirby, A., "Finding fairness in America's cities? The search for environmental equity in everyday life," Journal of Social Issues, 50 (1994): 121-140.
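To see how narrow that expert definition is, here is a minimal Python sketch of the deaths-per-year index and of a Demos-style social-cost formula. The death counts are the ones cited above; the dollar figures are placeholders I have invented, which is precisely the point at issue:

```python
# Deaths-per-year index, using the annual death counts cited above.
deaths_per_year = {"smoking": 150_000, "handguns": 17_000, "nuclear power": 100}

ratio = deaths_per_year["smoking"] / deaths_per_year["nuclear power"]
print(f"By this index, smoking is {ratio:.0f} times riskier than nuclear power")

def social_cost(control_cost, excess_deaths_per_year, value_per_life):
    """Demos-style accounting: control costs plus mortality costs."""
    return control_cost + excess_deaths_per_year * value_per_life

# Hypothetical hazard: $40 million in regulatory controls, 25 excess
# deaths per year, one life 'valued' at $1 million.  Every dollar
# figure here is an invented placeholder.
print(f"Social cost: ${social_cost(40e6, 25, 1e6):,.0f} per year")
```

Voluntariness, equity, catastrophic potential: nothing the public cares most about has a term anywhere in this formula.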
Remember the example given earlier in this chapter about passive solar heating: human beings tend to make decisions on the basis of a lot of information, information that experts might narrowly define as "irrelevant." But "laypeople have different, broader definitions of risk, which in important respects can be more rational than the narrow ones used by experts."47 More specifically, public estimates of risk

are correlated with their view of how well the question is understood, how equitably the danger is distributed, how well individuals can control their exposure and whether risk is assumed voluntarily. . . . When people are asked to order well-known hazards in terms of the number of deaths and injuries they cause every year, on average they can do it pretty well. If, however, they are asked to rank those hazards in terms of risk, they produce quite a different order. People do not define risk solely as the expected number of deaths or injuries per unit time.48 (emphasis added)

Thus, when one looks more fully at how the public actually makes risk assessments, one sees that the public is not stupid, although the model it uses is closer to the "folk model" than to the "rational-economic" model described earlier in this chapter. Rather than indicating public irrationality or paranoia, the discrepancy between public and expert estimates of risk can be understood as the experts' failure to assess adequately the public's definition of risk, which includes the degree to which a hazard affects innocent bystanders and whether the risk has been voluntarily undertaken.

A more accurate model of public risk assessment, generated by Morgan49 and shown in Figure 6.2, maps public perception of risk along two dimensions: degree of controllability (including fatality, equitability, risk to future generations, and voluntariness) and degree of observability (including knowledge available to those exposed, delay of effects, and amount of scientific knowledge available). This model of risk space has been shown to predict public perceptions of risk and public calls for government regulation. For example, risk space explains why the public is so concerned about nuclear accidents, even when the fatality estimates are only 100 per year. What makes nuclear accidents so frightening is their catastrophic consequences, their high risk to future generations, their involuntary exposure, and the secrecy with which their probabilities are protected from public scrutiny. But matters of observability, voluntariness, equitability, and knowledge are much more difficult to quantify than are numbers of deaths per year. Some critics of risk assessment suggest that we will never be able to quantify them adequately, and so should not use risk assessment to make major policy decisions.

47Morgan, M. G., "Risk analysis and management," Scientific American (July 1993): 32-41, p. 32.
48Morgan, ibid., p. 35.
49Morgan, ibid., p. 41.
Figure 6.2 Risk space has axes that correspond roughly to a hazard's "dreadfulness" and to the degree to which it is understood. One axis runs from observable hazards (known to those exposed, effect immediate, old risk, risks known to science) to unobservable ones (unknown to those exposed, effect delayed, new risk, risks unknown to science); the other runs from controllable hazards (not dread, not globally catastrophic, consequences not fatal, equitable, low risk to future generations, easily reduced, risk decreasing, voluntary) to uncontrollable ones (dread, globally catastrophic, consequences fatal, not equitable, high risk to future generations, not easily reduced, risk increasing, involuntary). The plotted hazards range from trampolines, skateboards, and aspirin to pesticides, DNA technology, and nuclear reactor accidents. Risks in the upper right quadrant of this space are most likely to provoke calls for government regulation. Source: Illustration by Johnny Johnson from "Risk Analysis and Management" by M. Granger Morgan. Copyright © 1993 by Scientific American, Inc. All rights reserved.

For example, Senator David Durenberger (R-Minnesota) has argued that it is impossible to compare risk hazards: "How does one compare a case of lung cancer in a retired petrochemical worker to the loss of cognitive function experienced by an urban child with lead poisoning?"50 Others argue that risk assessment is an inadequate, even a dangerous, way to make policy decisions because it fools us into thinking that we can make rational, objective decisions based on numerical formulas. "Risk management is, fundamentally, a question of values. In a democratic society, there is no acceptable way to make these choices without involving the citizens who will be affected by them."51

50Durenberger, D., Mott, L., and Sagoff, M., "A dissenting voice," EPA Journal (March/April 1991). Reprinted in Goldfarb, ibid., p. 98.
51Morgan, ibid., p. 32.

Such controversies over the method of risk assessment have recently been described by environmental philosopher K. S. Shrader-Frechette as
portraying two fundamentally wrong approaches to human issues: the "naive positivist approach," which assumes that all dimensions of human decision making can be objectively quantified, and therefore that logical decisions can be made (although probably only by trained experts); and the "cultural relativists," who argue that since values are impossible to avoid in risk decisions, all risks are simply social constructions. Both approaches, she believes, are wrong because they underestimate or overestimate the role of values. The naive positivists assume that values can be avoided; the cultural relativists assume that everything is a matter of values, since the process of risk assessment is itself a socially constructed proposition. But while values cannot be avoided, there are still physical events that kill people, whether or not one has conceptualized them with a risk assessment model. Furthermore, risk assessment can help us think about hazards if we do not lose track of its limitations. Therefore, I appreciate Shrader-Frechette's thinking when she says

the challenge, for any risk evaluator who holds some sort of middle position (between the cultural relativists and the naive positivists), is to show how risk evaluation . . . can be rational and objective, even though there are no completely value-free rules applicable to every risk-evaluation situation ... [I] argue [for] a "middle position," which I call "scientific proceduralism." This . . . account is based on the notion that objectivity in hazard evaluation ... is tied to at least three things: 1) the ability of risk evaluations to withstand criticism by scientists and laypeople who are affected by the hazards; 2) the ability of risk evaluations to change, as new facts about probabilities are discovered; and 3) the ability of risk evaluations to explain and predict both risks and human responses to them.52

In other words, I doubt that we can afford to dispense with the service that risk assessment performs—forcing us to explicate our assumptions, empirically measure what we can, and consider the diversity of concerns that different people will bring to the problem—simply because some terms in the formula are difficult to measure. Risk assessment should be improved, not abandoned; moreover, the public needs to participate more effectively in decisions based on risk assessment. The danger of both the naive positivist and the cultural relativist positions is that both lead to risk management by experts who have increasingly narrow and distorted understandings of public concern. Both threaten our democratic rights and responsibilities.

52Shrader-Frechette, K. S., Risk and Rationality: Philosophical Foundations for Populist Reforms (Berkeley, CA: University of California Press, 1991), p. 8.

Healing the Split: Retaining a Voice

Even though the Department of Energy recently approved an $85,000 grant to a Washington, DC, psychiatrist to help "counter the public's irrational fear" about nuclear power,53 public attitudes regarding nuclear power continue to be skeptical. Does that mean that the public is irrational? If public fears about nuclear accidents are irrational, one wonders why the nuclear utilities would need the Price-Anderson Act, which limits the liability of nuclear energy companies to $7 billion, a small fraction of the estimated damage from a worst-case accident. Critics point out that the law is an unfair subsidy of the nuclear industry; since the industry could never afford to pay for total cost coverage, it would not have been able to develop without this law. More disturbingly, since the nuclear industry claims that it would go bankrupt without this legislation, it must believe that nuclear accidents are likely.
If they are likely, the public is not so irrational.54 Most of us do not yet have adequate information to make an informed decision about the safety of nuclear power, or about any number of other pressing environmental hazard issues. Many of us assume the experts know what they are doing, even though public trust in industry officials is not high.55 Leaving important risk decisions to experts while not entirely trusting them is a good example of our split between planet and self: we assume the questions are too complicated, too enormous, and too overwhelming for us as individuals, and so it becomes easiest to let others make the decisions for us.

In many areas, leaving decisions to experts makes sense. For example, automobiles and buildings should be designed by engineers and architects with far more knowledge than the public could ever be expected to possess. And because we live in a representative democracy, we entrust elected and nonelected leaders alike to be good stewards of society, to be educated in their areas of influence, and to make responsible decisions. But engineering problems and risk assessments are fundamentally different matters, because the latter involve human values. There is no reason to think that experts can make the crucial judgments about the monetary value of human life any better than can the layperson. What would make it any easier for an industry official than for a housewife to make decisions about how many lives are worth risking for any particular technological advancement?

Risk assessors, like everyone else, are prone to minimize concerns that are not visible. For example, in June 1990, the Nuclear Regulatory Commission proposed that most low-level nuclear waste be exempted from federal regulation. Deregulation would mean that the waste would be treated like ordinary trash: dumped in landfills, burned, or recycled into manufactured consumer products. The NRC admitted that exposure to such radiation would probably cause 2500 more cancer deaths per year among Americans, but argued that this loss of life would be acceptable since it would save the nuclear industry at least $1 billion over the next 20 years.56 To many laypersons, killing 2500 people to save the nuclear industry money looks irrational, especially when, as in this case, one death is implicitly valued at $20,000 (2500 deaths a year for 20 years is 50,000 deaths; $1 billion divided by 50,000 comes to $20,000 per life). Who is more irrational, the public or the experts? My point is that neither is irrational, but the two are likely to use different criteria for evaluating rational decisions. Certain decisions, such as the dollar value of human life, are not answerable from a numerical formula, even though it would be easier if they were. It is well worth monitoring the way in which experts think about and document risk assessment because without public input, their judgments (like everyone's) are likely to be distorted by the institutions that train, hire, and pay them.

53Shrader-Frechette, ibid., p. 14.
54Shrader-Frechette, ibid., p. 15.
55Covello, V. T., "Public confidence in industry and government: A crisis in environmental risk communication," in Miller, ibid., p. 572. See also Pilisuk, M., and Acredolo, C., "Fear of technological hazards: One concern or many?" Social Behaviour, 3 (1988): 17-24; and Cvetkovich, G., and Earle, T. C., "The construction of justice: A case study of public participation," Journal of Social Issues, 50 (1994): 161-178.
56Miller, ibid., p. 493.

In response, risk experts argue that daily life involves all kinds of risks that we ordinarily do not think about.
For example, one of the riskiest things we do is to get in our cars and drive. Every time I get in my car, I decide that driving is worth the risk of dying. Motor vehicles kill about 50,000 Americans per year, almost three times as many as handguns (which kill 17,000).57 Since every time I drive I risk dying in an accident, and that risk accumulates over a lifetime, the chances of dying over a 50-year driving history come to about 1 in 100 (with roughly 250 million Americans, 50,000 deaths a year is an annual risk of about 1 in 5,000, which compounds to about 1 in 100 over 50 years). Driving is far riskier than being exposed to nuclear waste or pesticides, yet we do it every day without thought. Many risk experts would point out that a rational response to risk assessment would be to put more money into driving safety than into gun safety (and much less into nuclear safety).

On the other hand, driving involves something of an informed risk choice. If asked, most people realize that they take a risk when they drive, so that it at least seems possible to reduce the risk by not driving. People do not have that same sense of choice when they undergo the risks of nuclear power. No one asked them if they were willing to undertake these risks, or gave them an alternative in case the answer was no. I believe that when we ask people to undergo risks from technological hazards, we ought to listen to and take seriously their definition of risk. The public's perception of a hazard's invisibility, controllability, and unfair distribution should be addressed. And in the process of public discussion, the public needs to learn more sophisticated thinking about how risk assessment is conducted and how risk estimates are derived. Again, my point is not that the public is always right and the experts always wrong, but that both need to be educated to the other's way of thinking when decisions about risks are made in a democratic society.

The larger question here, however, is the matter of abdicating our responsibility, and with it, our connection with the future of our ecosystem, by making the facile assumption that it is all too complicated for our little selves to deal with. Industry and government officials will unconsciously encourage this split because public involvement in decision making makes their lives more difficult. To the extent that cognitive psychology has illuminated the perceptual and cognitive limitations of human beings, it too has colluded with our split, and it too has jeopardized our democratic institutions as well as our psychological well-being. But cognitive psychology has also shown that people can learn to reduce their mistakes, and "given balanced information and enough time to reflect on it, they can do a remarkably good job of deciding what problems are important and of systematically addressing decisions about risks."58 Each of the biases I have discussed in this chapter can be overcome, if we are alerted to them and practice methods of subverting them. Unless trained to do otherwise, experts are just as likely to make these errors. We all need to address problems of sustainability with a rigorously attuned intelligence, an intelligence that is strengthened, rather than undermined, by learning about our cognitive limitations.

57Slovic, P., Fischhoff, B., and Lichtenstein, S., "Rating the risks," Environment, 21 (1979): 14-20. Reprinted in Glickman, T. S., and Gough, M., eds., Readings in Risk (Washington, DC: Resources for the Future, 1979).
58Morgan, ibid., p. 40.