The Intuition Trap


Inborn knowledge

Nobody likes to be ignorant—to be unaware of things. Ignorance naturally presents a disadvantage for survival: it means being unaware of benefits or dangers in the world that could be exploited or avoided, such as a generous source of clean water, or a lurking predator.

In the modern world, the nature of ignorance has shifted towards the complex. One no longer worries about predators, but rather about other, often imperceptible dangers and disadvantages, such as being lied to by massive organisations that do not have one’s best interests in mind (conspiracy theories come to mind, but even superstitious cults and science denialists fall into this category, believing that science is an organised web of lies).

I wonder if these worries stem from the disparity between one’s actual level of knowledge and understanding of the world and one’s desired level.

Imagine if someone told you that you were born with all the knowledge in the world. How would you feel, knowing you could work everything out immediately, with little to no sustained cognitive effort? We have something like that: our intuition—a source of knowledge that everyone is born with; the “feeling” of whether something is or isn’t what we think it is.

Or, in more contemporary terms, the gut feeling of whether something is right or wrong. Imagine being born with such power! This is a common theme in Japanese power fantasy works such as isekai (meaning “otherworld”) or tensei (meaning “reincarnation”) novels, where the protagonist is given virtually infinite knowledge about the world from the get-go. As expected, this is a very popular genre. The protagonist, armed with immense knowledge, is able to quickly spot deceit, make unique observations, and manipulate their environment to become more powerful than everyone else.

There’s no need to study, contemplate, or think. One needs only to “feel” the right answer and it will come. Such protagonists can outsmart everyone without even trying; they were born as gods.

The power fantasy

But this power fantasy doesn’t exist only in fiction; fiction can represent reality to the extent that the most engaging works often fulfil some desire that maintains reader interest. After all, is the point of fantasy not to imagine something spectacular, something magical, something that helps us escape from our relatively disappointing reality, even if only temporarily?

What if someone could promise the same fantasy in real life? What if someone told you to just trust your intuition? What if you really were born with all the knowledge in the world? This is what I believe lays the foundation for not only conspiracy theories but also many religious beliefs and personal philosophies (which sometimes devolve into cults)—it’s all about intuition. Unfortunately for science (and philosophy), reality can often be counter-intuitive: the base rate fallacy, correlation mistaken for causation, the just-world fallacy, the affect heuristic… the list goes on. (For a few more examples, check out this article.)

These are not unavoidable errors, and, despite this post seemingly arguing against relying on innate knowledge, I would argue that we do have innate knowledge—intuition, even—that helps us spot the errors in our intuition. This is where my argument may fall short for its invocation of a reductionistic mindset, but I believe our ability to spot logical contradictions—or rather, our literal inability to hold contradictory beliefs, even outside of cognitive dissonance—is the greatest reasoning ability humans are innately born with.

Contradiction and science

Though it doesn’t provide a set of axioms, manifesting instead as a single rule, I’m tempted to venture as far as to say that all scientific and technological progress has been based on it: “don’t contradict yourself”.

Say that we believe a feather will fall more slowly than a steel anvil in a vacuum (the lack of air resistance is very hard to intuit), and someone shows us a demonstration of a feather and a steel anvil falling at the same speed. If we believe that what we saw in the demonstration is true, then we are now holding two contradictory beliefs: the feather falls more slowly than the steel anvil, but also falls at the same speed. How can that be?
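
A minimal sketch of why the demonstration comes out that way, assuming uniform gravity and no air resistance: the gravitational force on an object is proportional to its mass, but so is its resistance to acceleration, and the mass cancels out:

$$a = \frac{F}{m} = \frac{mg}{m} = g$$

So the feather and the anvil both accelerate at the same rate, $g \approx 9.81\,\text{m/s}^2$, regardless of their masses.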

Like two cats on a windowsill, even though there’s plenty of space in our memory to hold both of these beliefs, their coexistence cannot be tolerated: clearly, one has to go.

It doesn’t matter which belief goes first. If we think it’s the latter—that what we saw in the demonstration was an illusion, or manipulated—then we will discard it. If we believe our senses were accurate and the demonstration was fair, then the former belief will have to go.

Of course, reality rarely makes sense if we only look at isolated logical steps. The former belief has to have come from somewhere, as does the latter—the belief that the demonstration was fair, that the sensory input we received can be trusted, and so on. These, however, are issues far beyond the scope of this discussion. Our senses cannot be trusted, but it doesn’t really matter; we can never be absolutely certain of anything, but we can work from probabilities. Good enough is still enough. (That was technically two sentences.)

A different way to put it:
If we believe that evidence matters, we cannot also believe that it doesn’t matter. We cannot use logic to dismiss logic; doing so contradicts itself. We cannot simultaneously affirm and deny something, such as by following a syllogism from agreed-upon axioms only to deny its logical conclusion.
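
In propositional terms, this is just the law of non-contradiction, paired with modus ponens for the syllogism case:

$$\neg(P \land \neg P), \qquad \big((P \to Q) \land P\big) \to Q$$

Accepting the premises $P \to Q$ and $P$ while denying the conclusion $Q$ leaves us asserting both $Q$ and $\neg Q$: exactly the kind of contradiction the mind refuses to hold.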

It is very difficult to hold actual contradictions in the mind, as they tend to “slip” through gaps we unconsciously create, such as “exceptions”. Like oil and water, or two opposing magnets, the beliefs find their own sides. And it is precisely this that I am referring to: not the ability to create exceptions à la hypocrites, who are often accused of contradicting themselves. They are not contradicting themselves; closer analysis will reveal that they rely on a list of exceptions precisely in order to avoid genuinely contradicting themselves.

Though this is often the result of fallacious reasoning, the underlying mechanic of avoiding contradictions remains. The term “hypocrite” can also be applied to believers in superstition or conspiracy theorists, for they claim to value evidence while at the same time rejecting it. It won’t take long to realise that they aren’t actually struggling with the abundance of counterevidence in the contemporary scientific literature—they’ve merely dismissed it in favour of their own, or put it aside in the exception box of “it doesn’t matter”.

Nobody can genuinely contradict themselves.

We can always contrive ways to deal with these contradictions and put them in their own boxes, but this isn’t what science is about. Science doesn’t try to amass a huge list of exceptions just to cope with contradictions—say, seeing an image of Earth taken from space and having to come up with a hundred exceptions in order to continue believing that the Earth is flat.

Rather, it’s about leveraging our natural tendency to avoid contradictions, maintaining consistent use of our initial axioms (not redefining them mid-reasoning), and applying Occam’s razor—otherwise known as the law of parsimony—which guides us to discard one side of a contradiction instead of inventing a thousand and one convoluted exceptions to let the two sides co-exist.

Socratic questioning and a brief dialogue from After Life

Socratic questioning works off our inability to contradict ourselves; it follows that people who hold contradictions are necessarily unaware of them. Take, for example, this age-old argument against Creationism, as adapted by the Netflix series “After Life”:

“Thought you were atheist.”
“Doesn’t mean I don’t believe in anything. It means I don’t believe in any god.”
“How can you not believe in God?”
“Which one?”
(Silence)
“What do you mean?”
“Well, er, Zeus?”
“Who?”
“Greek god. Or Ra, or Ganesh?”
“No, not those ones, the real one in the Bible.”
“Yahweh.”
“Just God!”
“Well, you know how you don’t believe in all those other gods I mentioned? That’s how I don’t believe in yours.”
“How can you not believe that someone created all this, though?”
“Why do you believe that someone created it all?”
“Because it’s so good; it can’t just be chance, can it?”
(Nods)
“What, the Big Bang? Everything came from nothing? That’s impossible.”
“You’re right. God did it.”
“Right.”
“So, where did God come from?”
“He’s always been around.”
(Shrugs)
“There you go. Easy, innit?”
1. The Creationist’s insistence on “not those ones, the real one in the Bible” demonstrates a common fallacy (No true Scotsman).
2. The closing exchange (“So, where did God come from?” / “He’s always been around.”) contains a contradiction that the Creationist is necessarily unaware of: everything is said to require a creator, except God. A common counter is the invocation of special pleading, a form of double standard that exempts God from the question; yet another method to get around the contradiction.


And dismissing its disproven ideas is what science does, often with little hesitation. This self-correcting mechanism is what makes it so powerful and useful. After all, if it’s always trying to correct itself (at least on paper, temporarily putting aside the replication crisis, predatory journals, publication bias, and so on; no plan survives first contact with the enemy known as reality), how can it stay wrong forever?

Though I am reluctant to use the word “truth” due to its connotation of deceit (the opposite of “truth” is often “lie”), there is no deceit here; the universe simply began as unknown to us. Science eventually converges on the truth: it is the most accurate method we have of looking at reality, perhaps analogous to the “correct_belief” variable mentioned in my previous post.

Hypocrisy

Given that my argument is against what we may consider taking the path of least resistance, or taking one's innate knowledge and abilities for granted instead of carefully, methodically, and slowly thinking things through, I wonder if I can be accused of hypocrisy.

For example, I can read faster than the average person, which presents an innate advantage: given the same amount of effort, I can parse more information than others can. I can also intuitively spot logical inconsistencies, which leads me to rely on my intuition the same way others do.

If we apply the argument "people want to believe that they already know everything" against me, someone expending similarly low amounts of effort, I find that it fits quite well.

In this post, I have written against the overreliance on intuition and labelled it a form of power fantasy, but once we shift our focus to the low-effort and instant-gratification nature of its appeal, a certain hypocrisy begins to emerge.

People are different. As a result of the constant and complex interplay between nature and nurture, people have different abilities, different opportunities, and different experiences.

Let's imagine we're walking through an empty mall, for example. There are stores lined up everywhere we look, their names displayed prominently across their storefronts, with smaller signs along the windows and walls giving more details about what they sell, such as the process of making their clothes, bags, or even coffee.

For the sake of simplicity, we'll assume that everyone in this example is equally disinterested in these stores, spending only the minimum time and effort looking at them, and that this time and (subjective) effort are the same for everyone.

One person walks through this mall, briefly glancing at the stores while passing them, and ends up remembering only the names of some of the stores, and maybe a bit of the text from the signs, though barely enough to make anything out.

Another person does the same, spending the same amount of time and effort, and walking at the same pace. Yet, this person remembers where the cafe imports its beans, the name of the new windproof material being touted by the travel store, and the seasonal item on the restaurant's promotional menu.

The latter might read faster or have a better command of English, or they might be able to think more quickly and hold more information in their head. Let's have these two people talk about their findings.

We ask the first person which store they found most interesting, and they give us an interesting name, saying that it stood out the most because of the pun in it.

But when we ask the second person the same question, they give us a different name, saying that the brewing process and the type of beans used at this cafe were what interested them the most.

Let's follow our two participants for a week and see how they make their shopping decisions, assuming they aren't allowed access to internet reviews. As expected, the former gravitates towards the stores with the fanciest names and most ostentatious decor, while the latter prefers speciality stores that are often comparatively dull on the outside but sell much higher-quality products. I hope you can see where this is going.

Remember, they're still spending the same amount of effort looking at things, which we confirmed at the end of the week using the proprietary new technology invented for the sake of this analogy.

Now, let's go back to reality. Substitute these fun stores that focus on superficial appeal with the usual astrology, superstition, and other pseudoscience, then substitute the dull speciality stores with dense but reliable academic texts.

I hope it makes sense why some people lean towards the former beliefs instead of the latter. It's still the same amount of effort, but not everyone can accomplish the same things with that effort—not everyone can finish reading a 1,000-word essay before dinnertime, at which point they have to stop and can't read any more. Is it really a surprise that they prefer shorter texts that tell them they already know everything they need to know and only need to "feel" the truth?

So, where is the hypocrisy?

It's simple. By asking the former group to parse all that information, what we're really asking them to do is expend more effort—more effort than the latter group—in order to read and understand everything. That is something, I'm sure, the latter group isn't willing to do either. It's not the same demand. Reading relatively dense material, thinking critically (a term that's been abused by conspiracy theorists), and learning to differentiate the right dense material from the wrong dense material (science vs pseudoscience) takes far more effort for the former group than for the latter.

The time and effort demanded may even border on, or exceed, a level we consider reasonable, regardless of which group we're talking about, even if only to train one's critical thinking ability. One may also spot a chicken-and-egg situation here: it's not clear whether one learns critical thinking because one naturally has a talent for it, or whether one has a "talent" for it because one has studied it.

To sum it up:
If hypocrisy is defined as demanding that others do something one can't or doesn't do oneself, then asking others to expend significantly more intellectual effort than one does may be considered hypocrisy. This is not to say that people shouldn't try harder, nor that naturally talented people don't have to try hard. The example provided was a bit extreme; in reality, the vast majority of people can learn critical thinking without investing an unreasonable amount of time and effort. But the situation is more complicated than that, especially with everything else competing for one's attention on the internet. Social media is anathema to anyone learning critical thinking for the first time, especially when conspiracy theorists use the same phrase to mean something completely different.

Let's move on.

Ignorance, fear, and lack of trust

It may be that those who lack the ability to easily differentiate between truth and falsehood are more sceptical and untrusting of others, preferring instead to trust themselves—their intuition. A common theme in many cults is the "feeling" of the truth of the universe (or even "feeling the universe", whatever that means), often sprinkled with the word "quantum" picked out of a word blender.

After all, they claim, why trust them—or anyone else—if the truth is already within oneself? Ironically, the "truth" turns out to be to trust them in the end and offer monthly or yearly donations for their "teachings". One can only imagine what there is to teach if everyone already has the "truth" within themselves, but they do it anyway. They mislead people into thinking that the trust they feel is what "truth" feels like. The follower thinks they're trusting themselves; in reality, they're trusting someone who has found a way to take advantage of their weaknesses.

What about conspiracy theorists? There are two ways one can interpret the white trails left by planes in the sky: one may say that they're formed by changes in air pressure at the wings or wingtips, or one may say that they're caused by water vapour and impurities in the engine exhaust, which cause water droplets to form and freeze into ice crystals. These are called contrails. But this isn't the most intuitive physics, and it can be difficult for some people to grasp.

Therefore, there's also a third way—a wrong way—to interpret them: they're "chemtrails", chemicals the government releases into the air that do whatever the conspiracy theorist happens to be most afraid of (it varies depending on who you ask). It is an understandable perspective. After all, according to our intuition, there's "nothing" in the sky, and if we see anything there, it has to have been put there by something (or someone). Therefore, it has to be the government dumping chemicals, which have to be harmful... for some reason (hint: fear and irrationality are inextricably linked).

It's peculiar that harm is the default assumption, when the reasoning need not lead to that conclusion. Nobody worries about chemtrails secretly improving eyesight, vaccines secretly boosting intelligence, or radio towers giving people superpowers. There is clearly a market for such beliefs, after all: we have people administering homeopathic nothings, and people hovering their hands over others while allegedly transferring "energy", so why not "mask-wearing improves hearing ability"?

These people—from believers in the supernatural to conspiracy theorists—despite being ostensibly different, are perhaps merely two sides of the same coin: an overreliance on one's intuition, and the fear of being lied to—the fear of being in the dark. Like someone with poor eyesight, it's difficult for them to move with the same freedom as everyone else.

When unable to see whether there's really danger ahead, like a fatal fall from a cliff, one's best strategy may simply be to fear everything to the extreme. It is these people who are most vulnerable to telling each other that there is danger ahead even though none of them can see well, and most susceptible to being led astray by kind strangers offering to hold their hand and lead them to safety, while slyly leading them to the strangers' own homes.

Closing

Nobody has perfect vision, and it always pays to be a little bit sceptical of everything (unless you're Laplace's demon), but just as with the criteria for being legally blind, there is a point where scepticism turns into irrational fear and is no longer useful, let alone valid.

This is where the analogy ends, because unlike our eyes, we can improve our "vision", so to speak. Proper education can be a major influence, though it is not the only factor (most things in this world can't be attributed to a single cause). Many factors influence how people think, and it would be naive to discount social pressures from the local culture, tradition, and peers, along with other socioeconomic factors such as lack of access to information, even if only by means of a language barrier.

Along with the discussion in the "Hypocrisy" section above, people also differ from each other in their innate abilities; some people are less proficient at critical thinking and logical reasoning, for example. Though I still want to emphasise the just-world fallacy: perhaps partially as a result of increasingly popular computer games, a lot of people seem to believe that everyone is overall equal, as if everyone has the same number of "skill points", just allocated differently.

There is no evidence for that, nor do we know of any correcting force in the universe that guarantees everyone is equal—people can have different amounts of "skill points", and there may even be people who are genuinely better at everything than someone else. Of course, reality isn't as straightforward as this, and one can improve one's skills over time, but it helps to recognise the limits.

One who is told that they can do anything risks becoming dejected and depressed after realising there are certain things they can't do no matter how hard they try. They may even tell themselves that they are a failure, whereas someone who has acknowledged their unique characteristics and limitations may forgive themselves more easily in the domains they're particularly weak in, and instead focus their energy on the things they're better at.

Of course, the latter perspective does require crushing the soul out of free will (pun intended), but it is the most realistic, and I don't think the "loss" would be so great if children weren't taught throughout their childhoods that souls exist. That said, vitalism and its relatives do have innate origins—the concept of "souls", and thus free will, has been said to originate from our intuitive understanding of the world. It is why we don't wonder (not even for a second) where the caterpillar has gone after seeing a butterfly emerge from its cocoon, or whether our childhood friend is really still the same person despite definitely being much shorter and less ugly back then.

Our intuition is extremely useful for seeing the world, but we have to recognise that it can also lead us astray. One often falls into the temptation of wanting quick and easy solutions, such as believing that intuition is all one needs. We can do better; we're smart enough for that.