The Powerful Role of Magical Beliefs in Our Everyday Thinking

New research on magical thinking challenges many traditional views of cognition.
The idea that you would believe something that you know to be impossible is only one of many strange and counterintuitive properties of our mind. Image: Aron Visuals, via Unsplash
By: Gustav Kuhn

Adults often deny believing in magic, but on closer inspection, much of our behavior is more magical than we think. Eugene Subbotsky, who for over 40 years has studied the development of magical thinking, has suggested that in adults, magical beliefs are simply suppressed and can be reactivated given the appropriate conditions. His research also suggests that when denial of a magical belief is costly, adults are happy to give up their belief in the power of physical causality and view the world in terms of magical explanations.

Subbotsky’s findings show that magical thinking is deeply ingrained in our day-to-day thoughts and behaviors, and that magical and scientific beliefs can happily coexist inside our minds. But why does such thinking exist in the first place?

This article is adapted from Gustav Kuhn’s book “Experiencing the Impossible.”

In children, magical beliefs provide fuel for imaginary role-playing and fantasizing that helps them master difficult problems and maintain a feeling of independence and power. Similar concepts also play a role in our adult lives. Magical beliefs can help us deal with complex situations that we would otherwise simply fail to comprehend, and they can make the inanimate world more understandable. For example, human-computer interactions rely on a deep-rooted magical belief that is typically known as the user illusion. Every time you empty your computer’s trash folder, you happily accept the magical belief that the files within have been deleted. Accepting this user illusion is far more manageable than having to deal with the complexity of computer programming.

Another aspect is the illusory sense of control that magic provides, with magical beliefs offering a helping hand in situations beyond our rational control. Control is an important coping strategy, and a lack of control can lead to mental health issues such as anxiety and depression. Anthropologist Bronisław Malinowski argued that magical beliefs and superstitious behaviors allow people to reduce the tension created by uncertainty and help fill the void of the unknown. Malinowski noticed, for example, that the behavior of fishermen in the Trobriand Islands changed depending on where they fished. In the inner lagoon, fishing was straightforward, with little ritual. When fishermen set sail for the open sea, however, there were much higher levels of superstitious behavior, often involving elaborate rituals. The water in the inner lagoon was always calm and the fishing consistent, with little risk and, consequently, a high level of perceived control. Fishing in the open sea, on the other hand, was more dangerous, with prospects that were much less certain, resulting in a lower sense of control.

More recent studies have provided further support for this connection. During the 1990–1991 Gulf War, researchers observed more magical thinking and superstitious behavior in people who lived in areas under direct threat of a missile attack, compared to those in low-risk areas. In their study of superstitious rituals employed during high-stress examinations, Jeffrey Rudski and Ashleigh Edwards observe that the frequency of students’ exam-related magical rituals increases as the stakes increase. Intriguingly, students report that they frequently use these rituals while denying any causal effectiveness. Superstitious behavior therefore seems to give us the illusion of control, which can reduce anxiety during stressful situations and consequently improve performance. As with homeopathic medicine — which can have the same healing power as a placebo, suggesting that its effects are all in our mind — many of these rituals might actually work, albeit through unintended or indirect mechanisms.

Few doubt that magical beliefs can provide an illusory sense of control, but why do normal people also develop and maintain magical beliefs in ordinary, nonstressful contexts? The social psychologist Jane Risen suggests that magical beliefs result from some of the shortcuts and heuristics that our minds use to reason about the world. According to Risen, there is nothing intrinsically special about magical beliefs; they simply reflect some of the biases and quirks found in our everyday cognition. Let’s examine this theory in a bit more detail.

In recent years, psychologists have proposed that we use two fundamentally different mental processes to solve cognitive tasks. In his influential book “Thinking, Fast and Slow,” Nobel laureate Daniel Kahneman proposes that our reasoning and decision making rely on two separate mental processes. One of them, System 1, operates quickly and requires little cognitive effort. Rather than analyzing a problem in all its detail, it uses simple heuristics to come up with quick, intuitive answers. In many situations, this is an effective and reliable strategy. But as with any shortcut, it can lead to errors.

An example of this is the availability heuristic, a cognitive shortcut that helps us evaluate the importance or prevalence of an event based on the ease with which we can remember the appropriate information. Information that comes to mind more easily is weighted more heavily. This is why, for example, most people vastly overestimate the likelihood of dying from a shark attack. Such attacks are extremely rare; you are far more likely to be killed by a cow. Yet unlike cow attacks, shark-related deaths are widely reported in the press and so pop into your mind more easily, thereby influencing your beliefs.

System 1 is fast, but it is not necessarily accurate. System 2, the other mental process, is far more accurate, but it operates in a controlled, step-by-step manner, which makes it slow and effortful. According to Kahneman, most of our day-to-day decisions are made through System 1, with System 2 intervening to override these intuitive assessments when they go wrong. Because engaging System 2 is so effortful, however, many of these wrong answers go unnoticed, especially when the intuitive answer feels correct.

Let me illustrate this using a famous problem-solving task. Try to solve the following problem: The combined cost of a bat and a ball is $1.10. The bat costs $1.00 more than the ball. How much does the ball cost? Before reading on, take a few moments to solve the problem. (No, really. Give it a try.) The answer that immediately springs to most people’s minds is $0.10. But this is incorrect. If the ball cost $0.10 and the bat cost $1.00 more, the total would be $1.20, not $1.10. The correct answer is actually $0.05. Even though solving this problem does not require sophisticated mathematics, more than half of the participants at elite universities and more than 80 percent of participants at less selective universities answered it incorrectly.
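For readers who want to see the arithmetic spelled out, here is a minimal, illustrative sketch in Python (not part of the original article) that brute-forces the ball's price in one-cent steps; the function name is of course hypothetical.

```python
# Illustrative check of the bat-and-ball arithmetic: work in whole cents
# to avoid floating-point rounding surprises.

def solve_bat_and_ball(total_cents=110, difference_cents=100):
    """Find the ball's price by trying every whole-cent value."""
    for ball in range(total_cents + 1):
        bat = ball + difference_cents
        if ball + bat == total_cents:
            return ball, bat
    return None

print(solve_bat_and_ball())  # (5, 105): the ball costs 5 cents and the bat $1.05
```

Equivalently, writing the ball's price as x gives x + (x + 1.00) = 1.10, so 2x = 0.10 and x = 0.05.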

If you came up with $0.10 as the answer, you relied on System 1 and did not invest enough cognitive energy to check your answer. Had you done so, you would certainly have spotted the error because the problem is not particularly challenging. The fact that most people fail to check their answer suggests that System 2 is often lazy and inattentive. There is huge pressure on the brain to save its cognitive resources. System 1 requires less effort and is much more likely to be used, even though it occasionally makes mistakes. People who come up with $0.10 as the answer have replaced “the bat costs $1.00 more than the ball” with a simpler statement: “The bat costs $1.00.” According to Kahneman, most of our cognitive reasoning is carried out by System 1, but once System 2 spots a mistake, it corrects it and enables us to come up with the correct answer.

Jane Risen recently suggested that, in many situations, System 2 notices the mistake but still does not correct it, acquiescing to the erroneous conclusion. The idea that you would continue to believe something that you know to be wrong sounds rather odd, but of course, this is exactly what we observe during magical thought processes. For example, when participants refuse to drink a beverage labeled “cyanide” — fully aware that the label is false and the drink does not contain cyanide — they know that they are acting irrationally, just as they know that cutting up a picture of a loved one causes no real harm. It is clear from participants’ verbal reports that people realize that their feelings toward these objects are unfounded but that they feel them anyway. For example, I know that there is nothing special about my particular wedding ring, but I feel strongly about it nonetheless.

Risen argues that superstitions and other powerful intuitions can be so compelling that we simply cannot shake them off, despite knowing that they are wrong. According to her, System 2 is not simply lazy and inattentive but also “a bit of a pushover”: it will not override the result of System 1 if the feelings associated with that result are too strong. Many magical beliefs occur because we rely on System 1’s simple heuristics and apply them in situations where these rules do not hold. Even though System 2 knows they are wrong, it fails to correct the erroneous logic and thus acquiesces to magical beliefs.

The idea that you would believe something that you know to be impossible seems rather counterintuitive. However, this is only one of many strange and counterintuitive properties of our mind. It is important to note that Risen’s new model of cognition does not apply exclusively to magical thinking and can explain a wide range of rather irrational behaviors.

For example, in 2015, British gamblers lost a staggering £12.6 billion. In 2016, American gamblers lost even more: $116.9 billion. People’s probability judgments clearly have some rather irrational characteristics. Many of these judgments are based on System 1 responses, which people often know are wrong. Imagine that you can win a prize by drawing a red marble from a bowl that contains both red and white marbles, and you can choose whether you’d like to draw from a small bowl or a large one. The small bowl holds 10 marbles, one of which is red (a 10 percent chance of winning). The large bowl holds 100 marbles, fewer than 10 of which are red (a less than 10 percent chance of winning). You know the odds, which are clearly marked on each bowl. So which bowl would you choose? Rather surprisingly, over 80 percent of people chose the large bowl, even though they knew that the odds of winning would be lower. We are evidently compelled to choose this bowl because it contains the larger number of winning marbles.
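To make those odds concrete, here is a small illustrative sketch (again in Python, and not from the original article); the exact number of red marbles in the large bowl is a hypothetical stand-in, since the article says only that it is fewer than ten.

```python
# Comparing the two bowls: the large bowl holds more winning marbles in
# absolute terms, yet offers a lower probability of winning.
import random

def win_probability(red, total):
    return red / total

small_bowl = win_probability(red=1, total=10)   # 0.10
large_bowl = win_probability(red=9, total=100)  # 0.09 under the assumed count

print(f"small bowl: {small_bowl:.0%}, large bowl: {large_bowl:.0%}")

# A quick simulation makes the same point: over many draws, the small bowl
# pays off slightly more often despite containing fewer red marbles overall.
def simulate(red, total, trials=100_000):
    wins = sum(random.random() < red / total for _ in range(trials))
    return wins / trials

print(round(simulate(1, 10), 3), round(simulate(9, 100), 3))
```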

This is one of numerous situations in which System 1 makes a decision based on a heuristic (i.e., choose the option with the largest number of winning marbles), while System 2, which knows the odds, fails to override this intuitive yet suboptimal decision. Likewise, sports gamblers are reluctant to bet against the favorite, even if the potential payout for backing the underdog is higher. Again, System 2’s failure to override such decisions contributes to the astronomical profits made by casinos and bookmakers and influences consumer behavior and stock markets around the world.

We’ve explored here our beliefs in “real” magic and the important role these play in much of our day-to-day behavior. The current research on magical thinking challenges many traditional views of cognition — in particular, the view that childhood magical beliefs are replaced by rational and scientific reasoning in adulthood. Instead, it has become apparent that rational and magical thoughts cohabit deep inside our minds. Most previous models of cognition have struggled to accommodate the coexistence of magical and scientific thought processes, hence the need to revise our models of cognition.

Understanding our magical beliefs also helps us understand the experience of performance magic, because witnessing a magic performance results in a coexistence of contradictory beliefs. The magician Teller — the silent one in the Penn & Teller duo — describes magic tricks in terms of experiencing things as real and unreal at the same time, while Jason Leddington suggests that the experience of magic results in a conflict between our beliefs about the world and the automatic alief that the trick itself elicits. These ideas share many similarities with the theories of magical thinking discussed here.

In light of this new research, the idea of simultaneously holding contradictory beliefs or experiences seems entirely plausible. It is tempting to think of magic as simply a form of fringe entertainment that deals with unique experiences rarely encountered during day-to-day life. However, magical beliefs play an important role in our everyday cognitive processes.


Gustav Kuhn is Reader in Psychology at Goldsmiths, University of London, and a member of the Magic Circle. He is the author of “Experiencing the Impossible: The Science of Magic,” from which this article is adapted.
