2013: WHAT *SHOULD* WE BE WORRIED ABOUT?

John Tooby
Founder of the field of Evolutionary Psychology; Co-director, Center for Evolutionary Psychology; Professor of Anthropology, UC Santa Barbara
Unfriendly Physics, Monsters From The Id, And Self-Organizing Collective Delusions

The universe is relentlessly, catastrophically dangerous, on scales that menace not just communities, but civilizations and our species as well. A freakish chain of improbable accidents produced the bubble of conditions that was necessary for the rise of life, our species, and technological civilization. If we continue to drift obliviously inside this bubble, taking its continuation for granted, then inevitably—sooner or later—physical or human-triggered events will push us outside, and we will be snuffed like a candle in a hurricane.

We are menaced by gamma ray bursts (which scrub major regions of their galaxies free of life); nearby supernovae; asteroid and cometary impacts (which strike Jupiter every year or two); Yellowstone-like supereruptions (the Toba supereruption was a near-extinction event for humans); civilization-collapsing coronal mass ejections (which would take down the electrical grids and electronics underlying technological civilization in a way they could not recover from, since their repair requires electricity supplied by the grid; this is just one example of the more general danger posed by the complex, fragile interdependence inherent in our current technology); and many other phenomena, including some unknown to us. Here is one that no one talks about: the average G-type star shows a variability in energy output of around 4%. Our sun is a typical G-type star, yet its observed variability over our brief historical sample is only 1/40th of that, roughly 0.1%. If or when the Sun returns to a more typical level of variability, the effect will dwarf any other climate concern.

The emergence of science as a not wholly superstitious and corrupt enterprise is slowly awakening our species to these external dangers. As the brilliant t-shirt says, an asteroid is nature's way of asking how your space program is doing. If we are lucky, we might have time to build a robust, hardened planetary and extraplanetary hypercivilization able to surmount these challenges. Such a hypercivilization would have to be immeasurably richer and more scientifically advanced to prevent, say, the next Yellowstone supereruption or buffer a 2% drop in the Sun's energy output. (Indeed, ice ages are the real climate-based ecological disasters and civilization-enders—think Europe and North America under a mile of ice.) Whether we know it or not, we are in a race to forge such a hypercivilization before these blows fall. If these threats seem too distant, low probability, or fantastical to belong to the "real" world, then let them serve as stand-ins for the much larger number of more immediately dire problems whose solutions also depend on rapid progress in science and technology.

This raises a second category of menaces—hidden, deadly, ever-adapting, already here—that worry me even more: the evolved monsters from the id that we all harbor (group identity, the appetite for prestige and power, and so on), together with their disguised offspring, the self-organizing collective delusions that we all participate in and mistake for reality. (As the cognoscenti know, the technical term "monsters from the id" originated in *Forbidden Planet*.) We need to map and master these monsters, and the dynamics through which they generate collective delusions, if our societies are to avoid near-term, internally generated failure.

For example, cooperative scientific problem-solving is the most beautifully effective system for the production of reliable knowledge that the world has ever seen. But the monsters that haunt our collective intellectual enterprises typically turn us into idiots instead. Consider the cascade of collective cognitive pathologies produced in our intellectual coalitions by ingroup tribalism, self-interest, prestige-seeking, and moral one-upmanship: It seems intuitive to expect that being smarter would lead people to have more accurate models of reality. On this view, intellectual elites ought to have better beliefs and should guide their societies with superior knowledge. Indeed, the enterprise of science is—as an ideal—specifically devoted to improving the accuracy of beliefs.

We can pinpoint where this analysis goes awry, however, when we consider the multiple functions of holding beliefs. We take for granted that the function of a belief is to be coordinated with reality, so that actions based on it succeed. The more often beliefs are tested against reality, the more often accurate beliefs displace inaccurate ones (for example, through feedback from experiments, engineering tests, markets, and natural selection). But holding a belief has a second kind of function, one that affects whether people consciously or unconsciously come to embrace it: the social payoff from being coordinated, or the social cost of being discoordinated, with others' beliefs (recall Socrates' execution for "failing to acknowledge the gods the city acknowledges"). The mind is designed to balance these two functions: coordinating with reality and coordinating with others. The larger the payoffs to social coordination, and the less often beliefs are tested against reality, the more social demands will determine belief—that is, network fixation of belief will predominate. Physics and chip design will have a high degree of coordination with reality; the social sciences and climatology will have less.
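To make that mechanism concrete, here is a minimal illustrative sketch (mine, not the essay's) of network fixation of belief: a toy simulation in which each agent either conforms to its coalition's consensus or occasionally tests a belief against reality. The model, the parameter names (`social_weight`, `test_rate`), and the numbers are assumptions chosen purely for illustration.

```python
import random

def settled_belief_error(social_weight, test_rate, n=60, steps=300, truth=0.0, seed=7):
    """Toy model of 'network fixation of belief'.

    Each agent holds a scalar belief about a quantity whose true value is `truth`.
    Every step, each agent either conforms outright to the group consensus or,
    with probability `test_rate`, checks a noisy observation of reality and
    weighs it against the consensus according to `social_weight`.
    Returns how far the group's settled consensus ends up from the truth."""
    rng = random.Random(seed)
    beliefs = [rng.uniform(0.5, 1.5) for _ in range(n)]   # the coalition starts out collectively wrong
    for _ in range(steps):
        consensus = sum(beliefs) / n
        for i in range(n):
            if rng.random() < test_rate:                   # belief tested against reality
                evidence = truth + rng.gauss(0, 0.1)       # noisy feedback from experiment, market, etc.
                beliefs[i] = social_weight * consensus + (1 - social_weight) * evidence
            else:                                          # belief answers only to the guild
                beliefs[i] = consensus
    return abs(sum(beliefs) / n - truth)

# Frequent reality-testing plus weak conformity pressure stays anchored to the truth;
# rare testing plus strong conformity pressure locks in a self-organizing collective delusion.
print("test often, conform little :", round(settled_belief_error(social_weight=0.1, test_rate=0.5), 2))
print("test rarely, conform a lot :", round(settled_belief_error(social_weight=0.9, test_rate=0.02), 2))
```

Under these assumptions, the frequently tested, weakly conformist group converges on the true value, while the rarely tested, strongly conformist group settles on a shared error inherited from its starting consensus, which is all the toy model is meant to show.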

Because intellectuals are densely networked in self-selecting groups whose members' prestige is linked (for example, in disciplines, departments, theoretical schools, universities, foundations, media, political and moral movements, and other guilds), we incubate endless, self-serving elite superstitions, with baleful effects: Biofuel initiatives starve millions of the planet's poorest. Economies around the world still apply epically costly Keynesian remedies despite the decisive falsification of Keynesian theory by the post-war boom (government spending was cut by two-thirds and ten million veterans were dumped into the labor force, yet the collapse never came; Samuelson had predicted "the greatest period of unemployment and industrial dislocation which any economy has ever faced"). I personally have been astonished over the last four decades by the fierce resistance of the social sciences to abandoning the blank-slate model in the face of overwhelming evidence that it is false. As Feynman pithily put it, "Science is the belief in the ignorance of experts."

Sciences can move at the speed of inference when individuals need only consider logic and evidence. Yet sciences move glacially (Planck's "funeral by funeral") when the typical scientist, dependent for employment on a dense ingroup network, has to get the majority of her guild to acknowledge fundamental, embarrassing disciplinary errors. To get science systematically moving at the speed of inference—the key precondition to solving our other problems—we need to design our next-generation scientific institutions to be more resistant to self-organizing collective delusions, by basing them on a fuller understanding of our evolved psychology.