One reason that conflicts are often difficult to resolve is the nearly universal human need for self-justification. Most of us believe that we are smart, ethical, and competent. If we find ourselves in a situation that threatens to make us feel stupid, immoral, or incompetent, we experience “cognitive dissonance,” a feeling of discomfort and anguish. Likewise, when we hold any strong commitment to a person, group, or cause, we will feel dissonance when evidence challenges the wisdom of that commitment. Dissonance, like hunger or thirst, is so unpleasant that we are motivated to reduce it.
Here’s how it works. In one of my early experiments, when people volunteered to join a weekly discussion group, I randomly assigned half of them to a condition where they had to go through a severe initiation to gain admission; those in the control condition were admitted with only a small degree of effort. Then each individual listened to the same tape recording of a discussion held by the group they had just joined. The discussion was boring and almost completely worthless. Those in the control group saw it for what it was—boring and worthless. But those who went through the severe initiation rated the group as significantly more interesting and worthwhile, and were eager to attend the upcoming meeting. Going through hell and high water to gain admission to a boring group would have made them feel stupid; therefore, they unconsciously emphasized the few positive aspects of the group discussion and turned a blind eye to its many negative qualities.
Similarly, when people are deeply committed to a particular point of view, they have a tendency to close their minds to any evidence that they might be wrong. In the criminal justice system, when a married woman is murdered in her own home, police and prosecutors know from experience that the perpetrator is likely to be her husband. If they find the merest shred of evidence pointing to his guilt, they will move to arrest and convict him. But if further evidence turns up implicating a different, unknown perpetrator, they will be inclined to dismiss it. To take the dramatic example of Michael Morton, the police were so sure they had arrested the right man that they ignored a bloody bandana found near the crime scene. He was convicted and given a life sentence. After many years of litigation, prosecutors were forced to run a DNA test on the bandana. It revealed that the blood was not Morton’s but that of a convicted rapist and murderer. Not only did an innocent man serve 25 years in prison, but the true culprit was free to murder others—and did—until he was finally apprehended.
According to the Innocence Project, this tragic miscarriage of justice is not a rare event. Hundreds of people currently languish in prison because police and prosecutors ignored crucial exculpatory evidence. I am not suggesting that many of those police and prosecutors are evil—only that once they are convinced they have the guilty party, they blind themselves to information that is dissonant with that belief, and proceed to justify everything they do to convict.
Our human need for self-justification extends to the groups with which we are deeply identified. We want to believe that our group is right and the opposing group is wrong. Social psychologist Lee Ross took peace proposals created by Israeli negotiators and labeled them as Palestinian proposals, and vice versa, then asked Israeli citizens to judge them. The Israelis liked the Palestinian proposal attributed to Israel more than they liked the Israeli proposal attributed to the Palestinians. In short, like the participants in my initiation experiment, their evaluation of the proposal was influenced by their need to justify their pro-Israel/anti-Palestinian attitude. Ross lamented, “If your own proposal isn’t going to be attractive to you when it comes from the other side, what chance is there that the other side’s proposal is going to be attractive when it comes from the other side?”1
Elliot Aronson is a social psychologist. He is the only person in the 120-year history of the American Psychological Association to have won all three of its major awards: for distinguished research, teaching, and writing. He has written 21 books including The Social Animal, Not By Chance Alone, and (with Carol Tavris) Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts.
1 Quoted in Eric Jaffe, “Peace in the Middle East may be impossible: Lee D. Ross on naive realism and conflict resolution,” American Psychological Society Observer 17 (October 2004): 9–11.