In my previous post, I discussed my introduction to the science behind the rationality problems all humans suffer from. I later found another book, this one called Evil Genes: Why Rome Fell, Hitler Rose, Enron Failed, and My Sister Stole My Mother’s Boyfriend, that introduced me to the biology behind our emotional — and sometimes irrational — thinking.
This time, I’m going to mostly just go with quotes from the book, as they say it all:
The Limbic System’s Role in “Emotional Thinking”
The role of emotion in shaping “rational” thinking is tremendously underrated. Strong evidence shows that human behavior is the product of both the rational deliberation that takes place in the front areas of the cerebral cortex and the “emote control” — emotional reasoning — that originates in the limbic system. …As Princeton sociologist Douglas Massey writes: ‘Emotionality clearly preceded rationality in evolutionary sequence, and as rationality developed it did not replace emotionality as a basis for human interaction. Rather, rational abilities were gradually added to preexisting and simultaneously developing emotional capacities….’
Human behavior…is not under the sole control of either affect or deliberation but results from the interaction of these two qualitatively different processes… Emote control is fast but is largely limited to operating according to evolved patterns. Deliberation is far more flexible… but is comparatively slow and laborious. (p. 187)
Our Lack of Rational Thinking When We Have a Vested Interest
Just prior to the 2004 Bush-Kerry presidential election, two groups of subjects were recruited – fifteen ardent Democrats and fifteen ardent Republicans. Each was presented with conflicting and seemingly damaging statements about their candidate, as well as about more neutral targets such as actor Tom Hanks…
…when the participants were asked to draw a logical conclusion about a candidate from the other — “wrong” — political party, the participants found a way to arrive at a conclusion that made the candidate look bad, even though logic should have mitigated the particular circumstances and allowed them to reach a different conclusion. Here is where it gets interesting. When this “emote control” began to occur, parts of the brain normally involved in reasoning were not activated. Instead, a constellation of activations occurred in the same areas of the brain where punishment, pain, and negative emotions are experienced.
Once a way was found to ignore information that could not be rationally discounted, the neural punishment areas turned off, and the participant received a blast of activation in the circuits involving rewards – akin to the high an addict receives when getting his fix. In essence, the participants were not about to let facts get in the way of their hot-button decision making and quick buzz of reward. ‘None of the circuits involved in conscious reasoning were particularly engaged,’ says Westen. ‘Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones.’ (p. 189)
It Affects Everyone
“motivated reasoning” — that is, political bias (in this case, at least) — appears to be qualitatively different from reasoning when a person has no strong emotional stake in the conclusions to be reached.
‘Everyone from executives and judges to scientists and politicians may reason to emotionally biased judgments when they have a vested interest in how to interpret “the facts,”’ according to Westen. (p. 190)
The Need for Tolerance — Why You Can’t Trust Your Own “Rational” Thoughts
This next quote struck home for me because of all the pro and con views on Prop 8 flying around. I ask both sides of the issue to read it humbly and with an open mind:
Similar reasoning has led kindhearted individuals to support “feel-good” programs such as busing, which seemed, on the face of it, to be an outstanding method to integrate school systems. Opponents of this program – whatever their reasons – were seen as racists, which meant that rational concerns about the program were discounted. The result was that cities such as Detroit were devastated as the well-to-do moved to the suburbs, out of range of the managed busing system. This worsened the segregation the busing had been designed to remedy. (p. 191)
But simply looking at the research results, one must conclude that people’s first emotional responses about what’s wrong, who is to blame, or how to proceed, particularly in relation to complex issues, must always – always – be considered suspect. There is no simple algorithm for teasing rationality from emotion. An ardent Democrat or Republican, a dyed-in-the-wool community union organizer, a young devotee of Scientology, a Palestinian suicide bomber, or a KKK grand kleagle could each read the above paragraphs and think, I’m not irrational – it’s those other idiots who can’t see the obvious. But we all have pockets of irrationality, some large, some small, no matter if we are mathematicians who make our living doing proofs, wealthy philanthropists, or stay-at-home wives. If there is one thing that is important for us to know, it is that emote control allows our best traits – love, caring, loyalty, and trust – to be used as manipulative levers. (p. 192)
Comments (9)
Nicely said.
Jonathan Haidt is an excellent resource on emotions and moral reasoning. Here are a few links that are absolutely fascinating:
http://www.believermag.com/issues/200508/?read=interview_haidt
http://faculty.virginia.edu/haidtlab/articles/haidt.the-moral-emotions.manuscript.html
http://www.edge.org/3rd_culture/haidt07/haidt07_index.html
Excellent topic. In my consulting work (in the area of strategic planning), I review ‘avoiding reality’ tendencies that prevent individuals from evaluating risk and return rationally. These tendencies include:
* general avoidance – unpleasant, not equipped to deal with
* ego – pride cometh before the fall because it is so blinding.
* selective listening – confirmation bias
* emotional overinvestment – unwilling to face the risks to success
* wishful thinking
* false dichotomy
I am always looking for practical ways for leaders to better ‘confront reality’. The key elements I have observed so far are: 1. humility – recognize that these tendencies are natural, 2. courage – the willingness to see things as they are rather than as we would prefer them to be, and 3. a safe environment for sharing information and a willingness to be influenced.
This doesn’t bode well for the Prop 8 debate.
Motivated reasoning probably makes a lot of sense from an evolutionary point of view. Molding our own thinking to fit the tribe’s consensus viewpoint is probably a survival and reproduction advantage, even when the logic is faulty. I’m going to ponder this next month …. during fast and testimony meeting.
Author
“I’m going to ponder this next month … during fast and testimony meeting.”
Why wait? Ponder it right now while reading the newspaper, or listening to Fox News/CNN. Or during a lecture at a university. Or at a Sunstone symposium. Or while listening to conservative or liberal radio.
Heck, think of it while reading Mormon Matters right now!
I will ponder it now and I intend to read the referenced book. Topics like this one are why I visit this blog. Thanks.
I just wanted to make a not so subtle point that emotional, illogical thought is rampant in our religious convictions. We should probably be aware of motivated reasoning when we consider our own religious beliefs and the way others react when we share them.
Author
“I just wanted to make a not so subtle point that emotional, illogical thought is rampant in our religious convictions. We should probably be aware of motivated reasoning when we consider our own religious beliefs and the way others react when we share them.”
Of this, I have no doubt. And I completely agree.
But what I was indirectly wondering out loud in #5 is why you picked that group (i.e. Mormons in a testimony meeting), and that alone, as your target when this is a problem in abundance everywhere and with everyone.
Seldom, I don’t know you or your religious leanings, so can I use you for an honest experiment? I mean no offense by this, just honestly curious.
You see, if I were a betting man, I’d guess you don’t self-identify as a “typical Mormon in a testimony meeting,” because upon hearing that all human beings are irrational, the first thing you indicated was that “typical Mormons in a testimony meeting” were irrational. Thus my doubts that you were actually speaking of yourself or people like yourself.
But then again, maybe you do. Maybe you really did, upon hearing that people were irrational, immediately think of yourself and people that believe like yourself. It’s fully possible, though admittedly rare.
Can I do a quick experiment without offending? (You can always just choose to not answer.)
Here is the experiment:
Let’s, for the sake of this experiment, define “a typical Mormon in a testimony meeting” as someone who believes that Joseph Smith actually had golden plates of ancient origin written by ancient Nephites, that the Book of Mormon is in some sense a “translation” of that record, and that the record itself was then quite literally and physically taken away by an Angel afterwards.
If you do believe this, congratulations, you are one of the rare few (or at least you’re having a rare moment) in that you are questioning your own beliefs and religious convictions first, and not someone else’s.
If you can’t answer the above in the affirmative, then you were actually just questioning other people’s beliefs — that you don’t actually share — as irrational and not your own. You were literally fulfilling this statement from the quote above: “I’m not irrational – it’s those other idiots who can’t see the obvious.”
Would you be willing to honestly tell us which it was?
Again, I’m not saying this in a bad way. I think it would be a good example, and there is no shame, in my mind, in falling into a trap (or seeing others as irrational) that we are all guilty of a lot of the time. Heck, I’ll be the first to admit that I’m guilty of it too more times than I can possibly conceive or count, so no worries if you were too. 🙂
“Would you be willing to honestly tell us which it was?”
I consider myself a faithful skeptic. I acknowledge this label makes me more irrational than most. Why would someone devote so much time and effort to a cause that they don’t firmly believe in?
Testimony meeting is a time when it’s very uncomfortable being a skeptic. The discomfort comes from the combination of my own doubts and the certainty of belief expressed by others. I’ve heard very few testimonies where the person expresses their desire to live by faith in spite of their doubts.
It’s actually comforting to think that people have a genetic or biologically driven predisposition to believe. I wonder if it’s a stronger predisposition in some than in others. The concept somewhat mirrors the principle that knowledge of truth is a spiritual gift. It is something I will ponder during testimony meeting.
Author
Seldom,
For what it is worth, I label myself a “skeptical believer” (as opposed to a “faithful skeptic” or, as I’ve heard elsewhere, a “faithful doubter”). The emphasis is on “believer.”
I can relate to what you are saying about how some of us (I include myself) just don’t feel the same feelings of complete certainty sometimes expressed in testimony meetings. At one time I felt out of place, though I don’t any more.
But don’t we skeptical types feel certainty in other circumstances? I think we do. It’s built into our biology so there are no exceptions. A person that never felt “certain” about a basic set of beliefs in life would have to be institutionalized and would be non-functional as a human being, even if they were actually more “correct” technically speaking. This is because “certainty” is a necessary thing to act and “acting” is always faith.
Because of this, I have learned to not see expressions in testimony as anything problematic or concerning. They are expressions of faith and they are an accurate description of how the person really feels – certain – before they acted on that faith. At a biological level, what they are saying is completely accurate. You could find equivalent statements in any religion or political party because that’s just the way it is.
And I note that those that have concerns over testimonies are always guilty of some equivalent offense of “non-rational” certainty, but just in some other aspect of their lives. It’s easy to find examples in everyone.
Oh, one more thought for you. If you do choose to bear testimony by expressing “faith” rather than “knowing,” why would you feel compelled to also express “doubt”? If you were to say “I have faith Jesus is the Christ,” it would say it all. A few people might wish you had said “know,” but not many. And it implies a lack of complete certainty by definition. I’m not sure a testimony meeting is the best place to be expressing “doubt,” as that goes against the proclaimed purpose of the meeting.
That being said, it would be nice if we could culturally catch up to the teachings of our leaders and allow for people to express “faith” rather than “knowledge” without any perceived inferiority of the testimony. The GA’s have taught this for many years as an acceptable alternative, but culturally it’s not accepted as the same and seems to be looked down upon sometimes. But I think it’s getting more acceptable to use words like “faith” or “belief” all the time.
“I wonder if it’s a stronger predisposition in some than in others.”
It is, according to the current scientific thinking. But then again, I doubt science will ever figure out a way to make this solely nature. It will always be partially nurture, partially nature, and partially choice.