Everybody thinks they know how their mind works, but they don’t. You can ask someone why they like their boyfriend, or why they chose a job, or whether a book changed their opinion of global warming, and they’ll think about it for a moment and happily give you an answer. But they’re making it up.
The experiments were done ages ago, and the research continues, still teasing apart actual cause from psychological effect. We now know that what people tell us about their own mental processes is quite thoroughly inaccurate. We all believe that we have this magic thing called “introspection” that lets us see what is going on in our own minds, but in reality we don’t. It’s a fictional superpower.
The research on this point is really quite good. It’s not even a new finding, having been understood for at least the last fifty years. And yet this simple but important fact has never quite managed to make it into popular culture.
Perhaps no one wants to believe it.
For example, the simple act of writing an essay arguing against what you actually believe changes your mind somewhat; if you’re anti-abortion and you have to write an essay arguing for reproductive rights, you’ll be somewhat more pro-choice at the end of it (and vice versa). It seems that merely making the right mental connections changes your beliefs. That is surprising, but the truly weird part is this: if you ask people whether they’ve changed their opinion, they’ll tell you no. Even weirder, if you ask them a week later what their original belief was, they’ll tell you they’ve always been pro-choice. (This paper is the classic study.) The effect is repeatable with people persuaded by verbal arguments, who, sure enough, will swear to experimenters that they have always held their new position.
And yet we think we know what we’re thinking.
It goes on. Most of us have heard of the halo effect, where we unjustifiably infer one positive trait (such as intelligence) from another (such as attractiveness), but how many of us are conscious of doing it, and would admit to it? If you give insomniacs a sugar pill and tell them it will increase their heart rate and anxiety, they actually fall asleep earlier, because they falsely attribute their racing minds to the pill rather than to their insomnia. Even such basic perceptions as taste are not immune, as recent experiments show: given identical food, people (Americans, anyway) will tell you that meals labeled “full-fat” are tastier than their “low-fat” counterparts, and give adamant nonsense reasons for their preference.
It gets absurd. If you ask people to choose their favorite scent from four identical-smelling objects, they’ll mostly choose the rightmost one. No one believes this; suggesting that position was a factor produces the most thoroughly alarming looks…
When asked what goes on in our own minds, we mostly make it up. We find plausible explanations, and mistake them for an interior causality that, evidence shows, we can’t actually perceive.
The classic in this field is the 29-page review paper by Nisbett and Wilson, Telling More Than We Can Know: Verbal Reports on Mental Processes, which combs the previous 50 years of psychological research and discusses many of these examples. The authors write,
When subjects were asked about their cognitive processes, therefore, they did something that may have felt like introspection but which in fact may have been only a simple judgement of the extent to which input was a representative or plausible cause of output. It seems likely, in fact, that the subjects in the present studies, and ordinary people in their daily lives, do not even attempt to interrogate their memories about their cognitive processes when they are asked questions about them.
An important question remains. If we’re mostly telling stories about the workings of our own minds, then why do we believe them? There are many possibilities. Perhaps because we do have access to an enormous amount of introspective state (memories, emotions, sensations, goals), we mistakenly believe we know everything that goes on in our minds, when in reality our conscious access is partial and fragmented. Perhaps we experience snippets of our reasoning processes, the fleeting words and images of deep thought, and mistake these symptoms for deeper causes. But I prefer Nisbett and Wilson’s closing comment:
It is naturally preferable, from the standpoint of prediction and subjective feelings of control, to believe that we have such access. It is frightening to believe that one has no more certain knowledge of the workings of one’s own mind than would an outsider with intimate knowledge of one’s history and the stimuli present at the time.
No one said the truth would be easy.
I think it was Minsky who said something like “I bet consciousness is a stack trace,” though I can’t recall the attribution. I’ve always liked that one as a summation of this position.
Another keeper: Alva Noë here at UCB recently said something about how if a rock thrown into the air suddenly acquired consciousness midflight, it would have a half dozen reasons for its apparent direction before it hit the ground. He’s doing some interesting work; you may dig him if you’ve not read him before. One of the new wave of ’empirical philosophers’.