Roy Baumeister: John asked us to give some of our own personal quest or struggle or background on this. I don't know. The thing is, I was actually raised by wolves, and it was not a happy childhood; you miss out on a lot of things. It taught me a lot, but, you know, when I got to adolescence I didn't want to be a wolf anymore, and what they taught me was useless.
Ever since, I've been trying to figure out, so as not to miss out on any more, what human life was all about and psychology is good for that. Figure out how the parts fit together. And to do that, one has to be something of a generalist because I have to know what's in all the parts, and well, today, to be a generalist, you know, you got to be fast because there's so little time, so much to know.
I go from area to area, trying to size things up. One thing I've learned is that caring about what the right answer is just slows you down. It gets in the way. And these are topics that a lot of people care very much about and have strong opinions on. I'd rather just not care. I aspire not to have political views.
Going from area to area, I notice some patterns that come up over and over again. One is that I've become increasingly skeptical of reductionism. It seems like reductionism is always proved wrong in the long run. In psychology we had behaviorism and Freudian psychoanalysis, each of which was going to explain everything. They explained some things and we learned a lot, but they could not explain everything. Far from it.
To some extent we're now going through this with the brain and evolution: many people think these will explain everything. Well, certainly we are going to learn a lot, and have already. But we need to attend both to the continuities, the ways in which we're the same as animals, and also to the ways in which we are different, in order to put them together.
Beyond the reductionism, another thing is that motivation tends to be undervalued compared to cognition and ability. And a third point is that we tend to focus on the individual, so we neglect and undervalue the interpersonal dimension; things are perhaps more interpersonal than we are typically inclined to think.
That said, in terms of trying to understand human nature, and morality too: nature and culture certainly combine in some ways to do this, but I'd put them together in a slightly different way. It's not that nature's over here and culture's over there and they're both pulling us in different directions. Rather, nature made us for culture. I'm convinced that the distinctively human aspects of psychology, the human aspects of evolution, were adaptations to enable us to have this new and better kind of social life, namely culture.
Culture is our biological strategy. It's a new and better way of relating to each other, based on shared information and division of labor, interlocking roles and things like that. And it's worked. It's how we solve the problems of survival and reproduction, and it's worked pretty well for us in that regard. And so the distinctively human traits are often ones that are there to make this new kind of social life work.
Now, where does this leave us with morality? Well, its purpose is not so much to facilitate individual salvation or perfection, or whatever, as I quoted MacIntyre in our discussions earlier today; rather, morality is the set of rules that enables people to live together. It serves the purpose of making the culture work, since culture depends on people cooperating with each other: trust, shared assumptions, things like that.
Although nature and culture, in that sense, are working together, there are some conflicts; in particular, nature's made us, at least in a very basic way, selfish. The brain is selfish, and maybe it's the selfish gene, not the selfish individual or whatever. But there's still a natural selfishness, whereas culture needs people to overcome this to some degree because you have to cooperate with others and do things that are detrimental to your short-term, and even your long-term self-interest. In order for culture to work, you have to keep your promises, you have to wait your turn, pay your taxes, even maybe send your offspring into battle to risk their lives. It goes against the grain, biologically. But these are the sorts of things that morality promotes, to try to get people to overcome their natural selfish impulses, to do things that make the system work. And that benefits everyone in the long run.
Morality does this, and of course laws do, too. We haven't said that much about laws, but laws regulate behavior in a lot of the same ways that morality does. They prescribe a lot of the same things, restraining self-interest to do what is better for the group, so that the system will operate effectively. There is a big difference between laws and morals, though, which lies mainly in the force behind them. In practice, people do moral things out of concern for their reputation, and morality is based, therefore, on long-term relationships. If you cheat someone you're living next door to, for the rest of your life they're going to know that, and other people are going to know that, and you'll be punished and it will compromise your outcomes in the long term.
As society got larger and more complex and moved toward more interactions among strangers, laws had to step in to take over, because you can cheat a stranger whom you'll never see again and get away with it. Anyway, you're seeing here the neglected interpersonal dimension in understanding morality. Morality depends on relationships. And it's there, again, to regulate interpersonal behavior so that people cooperate, so that the system can work.
Now, consider some of the traits that evolved to enable people to overcome these selfish impulses so as to do what's best for the group and the system and so on. Among those, self-regulation is central. I think part of why I got invited here is that I have a history of doing research on self-regulation and self-control. The essence of self-regulation is to override one response so that you can do something else, usually something that's more desirable, better either in the long run or better for the group.
That is why we've called self-control the moral muscle. I'm going to unpack that and comment on both parts. It's moral in the sense that it enables you to do these morally good things, sometimes at a cost to self-interest. If you look at lists of morals, whether it's the Seven Deadly Sins or the Ten Commandments or a list of virtues and so on, they're mostly about self-control, and you can really see self-control as central to them. The Seven Deadly Sins of gluttony, wrath, greed and the rest are mostly self-control failures. Likewise, the virtues are exemplary patterns of self-control. So that's the moral part of the 'moral muscle': it's a capacity that enables us to do these moral actions, which are good for the group, even though they mean overcoming short-term self-interest.
The muscle part emerged from our lab work, independent of any moral aspect. There seems to be a limited capacity to exert self-control that gets used up. It's like a muscle: it gets tired. As we found in many studies, after people do some kind of self-control task and then go to a different context with completely different self-control demands, they do worse on it, as if they had used a muscle and it got tired.
So it's a limited resource that gets exhausted. There are other aspects of the muscle analogy, too. If you exercise self-control regularly, you get stronger. I wouldn't want people to say, well, if self-control and morality are a limited capacity, I'm never going to exert self-control because I don't want to waste it. No, au contraire, you should exert it regularly; it will make you stronger and give you greater capacity to do things.
And certainly we find that when people have exerted this muscle and it's tired, so to speak, or when they've depleted their resources (ego depletion is the term for it), then behavior drifts toward being less moral. For example, we found that people are perhaps more gratuitously aggressive toward somebody else after they've exerted self-control and used up some of their "moral muscle" resources.
In a study on cheating and stealing that we published a couple of years ago, people had to type up an essay about what they had done recently, either without using any words containing the letter 'a' or without using any words containing the letter 'x'. A lot more words contain an 'a' than an 'x', so the former requires much more self-control and overriding. When you're trying to compose a sentence and you keep reaching the point of, oh look, there's an 'a' in that word, you have to override it; using self-control to keep overriding one response and coming up with another depletes people's resources. So they were more depleted in the "A" condition than in the "X" condition.
Afterwards, they went to another room, supposedly for another experiment, where they took an arithmetic test and were paid according to how many problems they got right. They either scored it themselves, or the experimenter scored it for them. Across the four conditions (depleted or not, crossed with self-scored or experimenter-scored), people got about the same number right, except for the depleted people who scored their own tests; they somehow claimed to have gotten a whole lot more right. It was not plausible that they had actually gotten smarter by virtue of having typed while avoiding words with the letter 'a' in them, because when the experimenter scored them, he couldn't find any difference; they got about the same number right. But when nobody was checking, their answer sheet was shredded, and they simply reported, you know, I got six correct, then suddenly they got a whole lot more correct. So that suggests an increase in lying and cheating, and effectively stealing money from the experiment.
There are some other findings, too: depleted people are more likely to engage in sexual misbehavior, and so on. So moral behavior does seem to go down when people have depleted their moral-muscle capacity. More recently, we've been working with Marc Hauser to see whether depletion changes how people make moral judgments of others; that's proving a little more slippery. But again, this kind of process is geared toward regulating your own behavior more than your thinking about others, so it's not surprising that the effect shows up there, in behavior.
A couple of other findings are relevant here. Choice seems to deplete the same muscle as self-control; it's the same resource. We have people make a lot of choices about which of two products they would buy and so on, and afterwards their self-control is impaired, too, so making choices uses up the resource needed for self-control. That resource seems to be tied to physiological processes. We found changes in glucose levels in the bloodstream, so something about performing these advanced acts of self-control draws down glucose in the bloodstream.
If you give people a drink after the depletion manipulation, lemonade mixed with either sugar or Splenda, the ones who got Splenda still perform badly, but for the ones who got sugar, it gives a quick dose of glucose to the bloodstream and suddenly their behavior is more self-controlled: in some cases more moral, making more rational decisions and so forth. And conversely, if they're depleted from exerting self-control, their choice processes change to become more shallow and so forth.
In terms of self-regulation plus choice, you start to think that the same capacity, the same resource, is used for choosing and for self-control, and maybe for a couple of other things as well. There are some data on initiative. So instead of talking about it in terms of regulatory depletion, we're trying to come up with a bigger term, and that's how I got to talking about free will.
Free will is another of these topics where people are very emotional on both sides and have a lot of passionate feelings. I don't really want to deal with that. Let me try to forestall some of it by saying that in trying to develop a scientific theory of free will, I assume nothing supernatural, nothing noncausal. Let's understand the processes by which people make these choices and exert self-control. I think there is a social reality corresponding to this: behaving with self-control, behaving morally, making moral choices, and making certain kinds of choices are the things we associate with free will. So in that sense, there's a real phenomenon there. Whether it deserves the term 'free will' depends on this or that definition.
I'm surprised that at this conference, and at a conference I attended in Israel with Bloom and Pizarro on morality, nobody really mentioned free will in any talk or discussion. Yet it seems to me that this is a natural way to build and extend this theory. Part of my interest in this topic is that morality assumes the person can do different things. It says, well, this act is good and that act is bad, so it's a way to persuade you to do one thing rather than the other.
Likewise, moral judgments about people are based on the assumption that the person could have acted differently, and essentially the moral judgment says the person should have acted differently. Legal judgments, of course, work very much the same way.
So I see I'm well ahead of schedule here. Let me comment on a couple of other points. In terms of evolution and morality, there was a recent article by David Barash saying, well, there's a fairness instinct, you can see it in other animals. He cited Frans de Waal's study, in which monkeys got mad if they saw another monkey getting a nicer treat for the same action, and said, look, there's a fairness instinct. Again, I'm skeptical of reductionism, and we need to attend to both the continuities and the differences between human and animal behavior. To call it a fairness instinct seems a little overstated; it's a step in that direction, but it's not that impressive.
If you have two dogs and you give one of them a treat, the other looks at you like, well, what about me? But what you don't see is the over-benefitted one complain. That dog doesn't say, well, I'll share my biscuit with you, or, I'm not going to eat mine until the other dog gets one too. Yet human behavior does show some of those patterns. So I think if we want to see a fairness instinct, we need to see both the over-benefitted and the under-benefitted one complain. And perhaps even more, to get to the human pattern, you have to have a third party saying, no, you got more than this one and that's not fair, and intervening to redistribute, as happens all over the world in human societies.
At the Israel conference, Paul Bloom was talking about moral progress, too, and Steven Pinker has a recent book on that as well, I gather. Yes, the world has gotten to be a better place, but again, I'm not sure that we're morally better people. The laws, which I mentioned, are very much responsible for accomplishing that. It's a lot of third-party intervention telling people not to do those things. That really reflects some things that are new in human culture, perhaps not seen so much in other creatures.
Let me draw some conclusions here. Culture, I want to say, is humankind's biological strategy. It's our new way of solving the basic biological problems of survival and reproduction. We take our sick children to the hospital, we ask the government to give tax breaks for research, or to provide tax breaks to families with children, or whatever. Culture has been very successful; it has worked very well for us, but it requires a lot of advanced psychological traits.
One might ask, if culture works so well, why don't other species use it? Well, they don't have as many capacities; culture requires advanced psychological capabilities. So human evolution maybe added some new things, or at least took what was small in other animals and made it larger and more central. Self-control is present in other animals, but it needs to be developed much more thoroughly in humans, because culture has a lot more rules and regulations, laws and morals and so on. So there's a lot more need to override your behavior to bring it into line with standards.
As for morality in the full-fledged sense: I'm going with the cultural materialist view that culture is a system that basically has to provide for the material and social needs of individuals, and so it regulates behavior toward that end. Morality, in the full-fledged sense, comes with culture. It tells people to override their self-interest, at least their short-term self-interest, and to follow the system's rules. The system works, and because of that we all live better, but we all have to cooperate to a significant degree in order for the system to work. And so morality is the set of rules that helps us do that.
Self-control, then, is one of the crucial mechanisms that had to improve in humans to enable culture to succeed. It's an inner capacity, limited and energetically expensive, for overriding responses and altering one's behavior to fit the requirements of the system so that it will work. And then free will: again, you can see continuity with animals, there is choice and agency in other creatures, and free will is perhaps a more advanced form of agency that evolved out of that, adapted to working in culture, using meaningful reasons and operating within the context of the shared group.
It enables the human animal to relate to its social and cultural environment. The basic agency of a squirrel, say, enables that little animal to deal with its physical environment, but free will, as a more advanced form, enables the human being to deal with its cultural environment. It recognizes that as humans we can be somewhat more than animals, controlling our behavior in these advanced ways as needed to make the system work. And once the system works for us, it provides the immense benefits that it has.