2015 : WHAT DO YOU THINK ABOUT MACHINES THAT THINK?

Julia Clarke
John A. Wilson Professor and HHMI Professor, Jackson School of Geosciences, University of Texas at Austin
Machine Thought Will Never Have More Than A Metaphorical Relationship With Human Thought

The way we use language is flexible, generous, and creative, the product of our own peculiar intelligence. But human thought and machine thought are not the same, and their differences are worth examining closely.

We might argue that machine "thinking" stands in a model-phenomenon relationship to human thought: a necessarily simple description of a complex process of interest that nonetheless might be adequate and certainly may be useful. These words, and machines themselves, could both be viewed as a kind of shorthand for the things we want to get at. Describing a machine as "thinking" could be a simple heuristic convenience, or machine design might be explicitly biomimetic. Indeed, we very often co-opt the language of biology to talk about objects of our own creation. We see machines evolving, their thinking becoming more and more like our own, perhaps surpassing it in key, perhaps even threatening, ways.

However, we should remember that machine "evolution" is not a biological process but a human, creator-driven one. It is natural or biological only in that it results from the actions of natural, biology-bound humans. This definition of "natural" leads to several core problems. Biological evolution is not a creator-driven process. Structures cannot be dreamt up or driven into being by an entrepreneurial spirit or a curiosity-driven mind. Biologists, philosophers, and social scientists studying how we teach evolution have repeatedly shown the damage caused by imbuing biological evolution with intentionality or teleology. Talking about machines "evolving" greater cognitive capacity holds back our own understanding; it perpetuates a profound misunderstanding about the nature of the evolutionary process. A second, linked outcome of describing machine "thinking" as natural is that all human-caused modification of the Earth system, whether through neglect or war, is similarly naturalized.

Certainly there is some truth we are communicating with analogies like "the brain is a machine" or "machine thinking," but this says more about the form and structure of how we make sense of the world. We would do well to remember that any cognitive attributes unique to humans are the result of the vagaries and contingencies of our ~6 million years of separation from any other lineage alive today. Indeed, abstract thought is often estimated to be closer to a mere 50,000 years old, or, if we are optimistic, 200,000 years old. This particular form of abstract thought is exceptionally young, appearing in the last moments of Earth history. It is these faculties that lead us to homologize machine thought and human thought.

The processes behind technological innovation and biological innovation are fundamentally different, and the interactors in these processes are similarly distinct. In technological innovation, there is some product or functionality, "thought" or "thinking," that we want to see happen and move toward. This process is fundamentally unlike biological evolution. Human cognition evolved in populations of individuals completely unlike machines, which, like Lamarck's giraffes, can acquire within their "lifetimes" the characteristics needed for some new functionality. Innovation in biological evolution proceeds like a prolonged improvisation. There is only genetic and trait variability within populations, with the environment and chance determining how long those traits persist.

So what is lost by thinking about machines "thinking"? I would argue that we lose sight of key aspects of the phenomena that we are relating through analogy. Biological evolution occurs in populations and is not goal-directed. It is not trying to solve a problem. It is the vagaries of the history of both Earth and life that have led to current human cognitive faculties. Not only are the processes behind these things distinct; their results are very different. Take language: can a machine use terms so imprecisely? If we allow machines to "think," do we begin to see ourselves increasingly as only thinking machines?

Will our human cognitive faculties be shaped by interacting with technology? It is important to remember how diverse and downright enormous the human population is. Computer use has not been linked to passing more offspring into the next generation. Most of the human population still has limited access to technology. The evolution of our species will be slow, and it will be importantly influenced by our environment and by collective access to clean water, nutritious food, and health care. If we could remember to be as inclusive in our discussions of humanity as we are in what we want to call thinking, we might end up in a better place.