Suppose you are reading an article, and suddenly you hit a sentence that bothers you. For instance, let’s suppose you are a psychologist who reads a finding that genetics affects behavior, and since you are strongly anti-eugenics, a feeling of dismay arises in you. (This could happen if you felt that this finding implies that different ethnic groups might have behavioral differences, and further, that some groups might have worse behavior than others.)
But the feeling comes fast, and you then feel the need to explain why the claim is wrong. Without the feeling that started the process, you would have no desire to delve deeper, or to find reasons why the assertion is wrong. Sometimes this kind of response is dismissed as rationalization of emotional bias, but I don’t think that is always the case.
I was recently reading about semantic-pointer theory, which says that a concept can be represented at several levels, each level more abstract than the one before. If true, this might mean that you feel a “mismatch” at a high, abstract level, which could indeed be real, and then you search for the explanation of why you feel so uneasy with an assertion you just encountered. The mismatch that you sensed could be with a deeply held ethical belief, but it could also be a different type of mismatch, a mismatch with knowledge that you have.
In 1957, Leon Festinger proposed that human beings strive for internal psychological consistency in order to mentally function in the real world. He argued that a person who experiences internal inconsistency tends to become psychologically uncomfortable, and so is motivated to reduce the cognitive dissonance: by changing parts of the cognition, to justify the stressful behavior; by adding new parts to the cognition that causes the psychological dissonance; or by actively avoiding social situations and contradictory information that are likely to increase the magnitude of the cognitive dissonance.
But reducing some kinds of inconsistency is actually desirable. If you see a contradiction between two pieces of information, obviously some thought on your part (or an attempt to find new information), is necessary to get at the truth.
Festinger showed that reducing cognitive dissonance can lead to strange results. For instance, there was a religious cult that believed an alien spacecraft would soon land on Earth to rescue them from earthly corruption. At the appointed place and time, the cult assembled; they believed that only they would survive planetary destruction; yet the spaceship did not arrive.
This obviously led to discomfort.
Had they been victims of a hoax? Had they vainly donated away their material possessions?
Rather than believe that, most of the cult ended up accepting the idea that due to their own efforts to spread “light” they had saved the earth. The aliens from outer space had given planet Earth a second chance at existence.
The logic here may seem crazy, but if you have a fixed point that you do not doubt (in their case it was the destruction of the earth), then if you encounter data that contradicts that fixed point, it makes sense to find an explanation for the data that leaves the fixed point intact. In a more sensible scenario, your fixed point might be that light cannot travel faster than a certain speed, and if you encounter data that contradicts that fixed point (such as an experiment that was reported in major newspapers a few years ago), chances are you will doubt it, and double check. As well you should.
Let’s suppose you are a small company that makes a living by ‘fracking’, and you see a report that earthquakes have increased in areas where there is a lot of fracking. Your self-interest creates an emotion of dismay, and you will look for evidence that contradicts that observation. You find out that it is true, but that the main reason for it is the method of disposal of wastewater (injecting it deep into the ground), and that another method of disposal can be found. So here, the emotion you felt has led to a good result.
If you see yourself as a hero, and you display cowardice in a situation, you may feel dissonance, and that may inspire soul-searching. If you see yourself as a stickler for truth, and find you have fallen for a scam, that too might prompt self-examination. If you see yourself as decent, but large numbers of your Facebook friends are sharing a regrettable photo from your wild youth, that too might inspire thought.
The big problem in this type of thinking arises in cases where your fixed point (whatever it is) should give way to the unpleasant information you have just been exposed to, but your emotions prevent it from giving way.
One suggestion I want to make in this post is that detecting inconsistencies and trying to resolve them is a good thing. If the contradiction causes you discomfort and you try to remove that discomfort, you are not necessarily “rationalizing” away the facts.
The other point is that ‘feelings’ may be an unconscious precursor to understanding. Albert Einstein said something that illustrates this: “But the years of anxious searching in the dark for a truth that one feels and cannot express, the intense desire and the alternations of confidence and misgiving until one achieves clarity and understanding, can be understood only by those who have experienced them.”