
Chapter 1. Delusional Beliefs and the Madness of Crowds: What Are Beliefs, and Why Are Some of Them Pathological?

Publication: Decoding Delusions: A Clinician's Guide to Working With Delusions and Other Extreme Beliefs
Abnormal beliefs, otherwise described as delusions, are a common feature of severe mental illness and are often reported by patients with both nonaffective and affective psychoses (Picardi et al. 2018). In DSM-5-TR (American Psychiatric Association 2022), these kinds of beliefs are defined as
fixed beliefs that are not amenable to change in light of conflicting evidence. Their content may include a variety of themes (e.g., persecutory, referential, somatic, religious, grandiose). . . . Delusions are deemed bizarre if they are clearly implausible and not understandable to same-culture peers and do not derive from ordinary life experiences. (p. 101)
This definition is a change from the third and fourth editions of DSM, which described a delusion as “a false belief based on incorrect inference about external reality that is firmly sustained despite what almost everyone else believes and despite what constitutes incontrovertible and obvious proof or evidence to the contrary” (American Psychiatric Association 1994, p. 765). This is similar to the definition contained in ICD-11:
A belief that is demonstrably untrue or not shared by others, usually based on incorrect inference about external reality. The belief is firmly held with conviction and is not, or is only briefly, susceptible to modification by experience or evidence that contradicts it. The belief is not ordinarily accepted by other members of the person’s culture or subculture (i.e., it is not an article of religious faith). (World Health Organization 2018, MB26.0)
Definitions such as these raise numerous philosophical and practical quandaries. For example, some kinds of nonpathological beliefs, such as political beliefs, are notoriously resistant to counterargument or inconsistent evidence (Lodge and Taber 2013; Westen 2008). It is also unclear what would count as “incontrovertible and obvious proof or evidence to the contrary.” Moreover, the idea that delusions are false beliefs collapses in the case of rare examples such as when a pathological belief is ill-founded but true (e.g., patients with delusional jealousy often drive their spouses into the arms of others) (Enoch and Trethowan 1979), when the belief is untestable (as in the case of most religious delusions), or when it is impossible in practice to determine the truth of a patient’s claims (e.g., a complaint of being victimized by intelligence services may seem unlikely, but it is not obvious how it could be definitively discounted) (Cermolacce et al. 2010).
A further problem with the requirement that delusional beliefs should be false is that false beliefs are common (e.g., most people think that they are better than average compared with other people), and, rather than being products of neural dysfunction, many of these beliefs are tolerable products of normal cognitive processes, arise from reasoning biases that favor the avoidance of costly errors (trusting someone who is untrustworthy) at the expense of less costly errors (failing to trust someone who is trustworthy), or are in other ways evolutionarily adaptive (Haselton and Nettle 2006; McKay and Dennett 2009). For these reasons, many other features of delusions have been highlighted by commentators, for example, that they are idiosyncratic, seem incredible to others, are highly preoccupying, or are usually the cause of personal distress and thereby interfere with the individual’s ability to cope with everyday life (Oltmanns and Maher 1988).
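The cost-asymmetry argument can be made concrete with a small worked example. The following sketch (in Python) uses entirely hypothetical probabilities and costs, invented for illustration rather than taken from the cited studies, to show how a suspicious default belief can minimize expected cost even though it is factually wrong most of the time.

```python
# Illustrative sketch only: the probability and costs below are hypothetical,
# chosen to display the error-management logic, not figures from the cited work.

p_trustworthy = 0.8      # assume most strangers are in fact trustworthy
cost_betrayal = 100.0    # cost of the expensive error: trusting the untrustworthy
cost_missed_ally = 5.0   # cost of the cheap error: distrusting the trustworthy

# Expected cost of each default stance toward a stranger
expected_cost_trusting = (1 - p_trustworthy) * cost_betrayal    # 0.2 * 100 = 20.0
expected_cost_distrusting = p_trustworthy * cost_missed_ally    # 0.8 * 5 = 4.0

# Distrust is factually wrong 80% of the time, yet it is the cheaper policy,
# so a bias toward suspicion can be adaptive rather than a sign of dysfunction.
print(f"expected cost if trusting by default:    {expected_cost_trusting}")
print(f"expected cost if distrusting by default: {expected_cost_distrusting}")
```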
One approach that can be traced to the very beginning of psychiatry has been to characterize at least the most severe delusions as “bizarre.” The German psychiatrist Emil Kraepelin (1856–1926), whose foundational work on the classification of psychiatric disorders I have described elsewhere (Bentall 2003), emphasized the nonsensical nature of many delusions and argued that this feature could be used to distinguish between the patients he diagnosed as having dementia praecox (the term he coined for what modern mental health professionals call schizophrenia) and those who had paranoia (a diagnosis he used to describe patients who had any kind of delusion when no other symptoms were present; Cermolacce et al. 2010). Delusions have been said to be bizarre if they depart from “culturally determined consensual reality” (Kendler et al. 1983) or violate agreed-on understandings about what is possible (e.g., in the case of a patient who believes that his neighbor is stealing electricity through the walls of his home) (Mullen 2003). However, although clinicians can usually agree which of their patients are deluded, the distinction between bizarre and nonbizarre beliefs has proven difficult to operationalize, with the consequence that the reliability of this distinction is poor (Bell et al. 2006; Spitzer et al. 1993).
A fundamental challenge to these efforts is the propensity of ordinary people to believe weird things that seem in many ways as irrational or incomprehensible as those seen on any psychiatric ward. Indeed, it has been argued that drawing a line between delusions and widely accepted but arguably irrational beliefs is a fundamental Hilbert problem that must be solved if psychology is to progress as a science (Ross and McKay 2017). (The term Hilbert problem comes from the German mathematician David Hilbert (1862–1943), who in 1900 identified 23 fundamental but unsolved problems in mathematics. Hilbert’s list has been influential in driving mathematical research up to the present time.) In this chapter, I explore the similarities and differences between delusions and beliefs and attitudes that are widely accepted as nonpathological (in particular religious and political beliefs) and show how confusion about this issue reflects misconceptions and naiveté about the nature of belief systems in general. Following this analysis, I make a new proposal about why delusions are different from nonpathological beliefs.

Psychotic Delusions and the Madness of Crowds

To begin, it is helpful to consider the weird beliefs of ordinary people. An attempt to catalog these kinds of beliefs was made by the Scottish journalist Charles Mackay (1814–1889), whose work Memoirs of Extraordinary Popular Delusions and the Madness of Crowds was first published in three volumes (Mackay 1841) with a preface that begins as follows:
In reading the history of nations, we find that, like individuals, they have their whims and their peculiarities; their seasons of excitement and recklessness, when they care not what they do. We find that whole communities suddenly fix their minds upon one object and go mad in its pursuit; that millions of people become simultaneously impressed with one delusion, and run after it, till their attention is caught by some new folly more captivating than the first. We see one nation suddenly seized, from its highest to its lowest members, with a fierce desire for military glory; another as suddenly becomes crazed upon a religious scruple, and neither of them recovering its senses until it has shed rivers of blood and sowed a harvest of groans and tears, to be reaped by its posterity. (p. 1)
Mackay’s first volume dealt with what he called “national delusions,” in which he included economic bubbles (e.g., the Mississippi scheme, in which French investors in 1720 lost huge amounts of money after speculating on land for French settlements in Louisiana), prophecies such as those of Nostradamus, and “the influence of politics and religion on the hair and beard” (Mackay 1841, p. 1). The second volume, on “peculiar follies,” included long accounts of the Crusades and the witch-hunting epidemic in Europe during the sixteenth and seventeenth centuries, endeavors that Mackay regarded as ultimately self-defeating. Finally, Mackay discussed “philosophical delusions,” such as alchemy (a pseudoscience and the precursor of modern chemistry in which believers hoped to find a way of turning base metals into gold) and the “magnetizers” (i.e., medical practitioners who, at various times, had claimed the ability to cure diseases by manipulating magnetic fields; these approaches included Mesmerism, which was a forerunner of modern hypnotism).
History has since provided any modern cartographer of belief with numerous further examples that would expand Mackay’s list enormously. For example, shortly after Mackay’s book was published, in 1856, the Xhosa nation in the Eastern Cape, who were locked in a long series of wars with white settlers, embraced a prophecy by a teenage girl named Nongqawuse that if they killed their cattle and destroyed their crops, their ancestors would reward them with a mighty army that would rise from the sea and defeat the British; so many joined what became known as the “cattle-killing cult” that an estimated 40,000 died of starvation and the survivors were reduced to servitude to their sworn enemies (Peires 1989).
Another striking example of a national delusion was, arguably, the rise of Nazi ideology in the 1930s, which was fueled by myths about German history, notably the “stab-in-the-back” myth that the country’s capitulation at the end of the First World War was caused by Jews and German socialists (Evans 2021). The U.S. subprime mortgage crisis, which sparked the global recession of 2007–2008, would have to be included as an example of an economic bubble (Lewis 2010). Additional examples of national delusions include numerous new pseudosciences, such as creation science, which claims to provide scientific evidence for the biblical account of the origins of the universe (Pigliucci 2018); quack medicines, such as homeopathy, which claims that drugs become more potent the more they are diluted with water; varieties of medical skepticism, such as the belief that vaccines cause autism; conspiracy theories, such as the belief that NASA faked the Moon landing in 1969 (Brotherton 2015); and exotic new-age religions. Indeed, the number and variety of “crazy” belief systems seems to change with dizzying speed, which would make the task of the compiler of the catalog even more difficult. At the time of this writing, for example, believers in the QAnon conspiracy (so named because it originated from a social media post by “Q,” who claimed to be an anonymous U.S. intelligence officer) believe that the 45th president of the United States, Donald J. Trump, is engaged in a secret war against a global cabal of Satan-worshipping pedophiles and sex traffickers (Roose 2020).
The willingness of a large proportion of the population to believe such apparent absurdities raises the intriguing question of whether there could be a type of irrational or ill-founded belief that is so widely embraced as to be nearly universal. A candidate was proposed by Lerner (1980), who marshaled a large volume of evidence to argue that the belief in a just world is such a fundamental delusion. This idea of a just world is a commonplace feature of storytelling: from an early age, we like to see heroes rewarded and villains punished (Jose and Brewer 1984). However, this has real-world consequences, one of which is victim derogation, the tendency to believe that the unfortunate bring misfortune on themselves; hence, the poor are assumed to be feckless and lazy (Bénabou and Tirole 2006), and victims of rape are criticized for wearing short skirts or for consuming too much alcohol (Russell and Hand 2017). Among the many intriguing observations made about these kinds of beliefs are historical changes documented by Malahy et al. (2009), who noticed that many social psychologists had used the same questionnaire—the Just World Scale (Rubin and Peplau 1975)—in studies conducted with U.S. college students. Examining 28 studies that had been published between 1975 (a few years before Ronald Reagan entered the White House and introduced neoliberal economic policies) and 2006, they found that belief in a just world increased across this period, correlating positively with increases in income inequality as measured by the Gini coefficient (arguably evidence that the world is unjust).

Delusions of Psychiatric Patients

Many of the efforts to characterize the delusions of psychiatric patients have focused on their content. As the current DSM-5-TR definition cited earlier states, these beliefs tend to follow particular themes. The most studied type is the persecutory or paranoid delusion (Bentall et al. 2001; Freeman 2016), in which the individual believes “that someone, or some organisation, or some force or power, is trying to harm them in some way; to damage their reputation, to cause them bodily injury, to drive them mad or to bring about their death” (Wing et al. 1974, p. 175). The central feature of this kind of belief is an extreme sense of vulnerability and of being under attack coupled with an intense feeling of apprehension or fear (Boyd and Gumley 2007). Some definitions (e.g., the one in DSM-5-TR) also include the belief that someone close to the individual is being threatened with harm, although other commentators have cautioned that beliefs of this kind do not really belong to the paranoid category (Freeman and Garety 2000). These beliefs are particularly common in patients with a diagnosis on the schizophrenia spectrum; for example, in a large clinical trial that recruited patients very soon after they first became known to psychiatric services, 235 patients (91.8%) scored above the clinical cutoff for suspiciousness when their symptoms were assessed with the Positive and Negative Syndrome Scale (Moutoussis et al. 2007).
Delusions of reference, in which innocuous events are attributed special meaning, have been much less studied than paranoid beliefs, despite also being very commonly encountered in psychiatric practice (Startup et al. 2009). Sometimes these beliefs are included within the paranoid grouping (Green et al. 2008). However, empirical studies have shown that they fall into two separate types: delusions of observation, in which the patient believes that they are being spied on or gossiped about; and delusions of communication, in which they believe that some innocuous message or sign (e.g., a radio broadcast) is directed at the self. Only the former type is associated with beliefs about persecution (Startup and Startup 2005).
Grandiose delusions have also been studied only rarely (Knowles et al. 2011) but typically involve beliefs about special identity, special talents, a special mission in life, or extreme wealth (Leff et al. 1976). One hypothesis about these beliefs, dating back at least as far as the work of the psychoanalyst Karl Abraham (1911/1927), is that they are the product of some kind of psychological defense against depression or low self-esteem. However, a recent qualitative study of recovered grandiose patients found that these beliefs more often seem to reflect a desperate need for a purpose and meaning in life rather than a need to feel superior to others (Isham et al. 2019).
Delusions of control, sometimes called passivity phenomena, involve the belief that feelings, drives, and volitional acts are under the control of others. These types of delusions have sometimes been considered to have special status with respect to the diagnosis of schizophrenia because German psychiatrist Kurt Schneider (1887–1967) included them in his list of first-rank symptoms that he thought were characteristic of the disorder (Schneider 1959). Phenomenologically speaking, these symptoms seem to involve a loss of the sense of agency (Gallagher 2015) or ownership of actions and feelings (Bortolotti and Broome 2008), which has led to research to try to identify the neuropsychological origins of this kind of deficit (Frith 2012). If this account is correct, one implication is that delusions of control might be closely related to hallucinatory phenomena, such as auditory-verbal hallucinations, which occur when self-generated cognitive processes such as inner speech are misattributed to an external source (Bentall 1990), rather than to the other types of delusions considered here. (See Chapter 13, “Who Are You?”)
Last, it is not uncommon for delusions to have religious content (Brewerton 1994). One study estimated that about a quarter of patients experiencing a first episode of psychosis have delusions of this kind (Siddle et al. 2002). Numerous other, much rarer delusional systems have been intensively studied because they either are associated with specific neuropsychological impairments, such as Capgras syndrome, in which the individual believes that a loved one has been replaced by an impostor (Young et al. 1990), or lead to specific medical complications, such as delusional parasitosis, which frequently leads to unnecessary interventions by dermatologists (Hylwa et al. 2011; Munro 1978), but these are not considered further here.
A striking feature of these themes is that they are universal. In a meta-analysis of 102 studies from around the world (Collin et al. 2022), paranoid delusions were consistently found to be most common, present in 64.5% of the patients studied. Ideas of reference occurred in 39.7%, grandiose delusions in 28.2%, delusions of control in 21.6%, and religious delusions in 18.3%. These estimates were almost completely unaffected by various geographic and cultural covariates, such as whether the samples were from developed industrialized nations or developing nations or whether the countries considered had high or low levels of inequality. One possible interpretation of this finding is that the themes reflect common existential themes that affect all humankind, such as the need to distinguish between trustworthy and untrustworthy others (paranoia), the need to make sense of ambiguous communications (reference), and concerns about social rank and the meaning of life (grandiose and religious delusions).
However, this is not to say that delusions are uninfluenced by the social, cultural, and political milieus in which the individual lives, although these influences become evident in only a few studies that have conducted more fine-grained analyses. For example, a study of Egyptian patients found that those who were least educated tended to have religious delusions relating to Islam, whereas those who were middle-class and educated tended to have secular and science-based delusions (El Sendiony 1976). A study from Malaysia compared patients in Penang, on the northwestern coast of the country, where the population is predominantly Chinese, with patients in Kota Bharu, in Kelantan on the eastern side, where the population is mainly Malay and Muslim (Azhar et al. 1995). Once again, paranoid delusions were the most common type of delusion in both communities, followed by grandiose delusions. However, the grandiose delusions of the Kelantan patients typically concerned power or wealth, whereas those of the Penang Chinese patients were more often concerned with status. Among the Kelantan patients, delusions often focused on interpretation of the Koran; for example, patients thought that they had been specially chosen by God or were descendants of the Prophet.
A striking example of how context can color delusional content was reported during the recent COVID-19 pandemic when, in population surveys conducted in four countries, it was found that a small proportion of people had developed paranoid ideas about the virus (e.g., that others were trying to infect them); it turned out that those who developed these beliefs also scored highly on more general measures of paranoid thinking (Ellett et al. 2022).

Difficult Cases

Not surprisingly, difficult cases in which mental health experts struggle to agree on the delusional status of a belief system are not uncommon. Within clinical settings, a degree of ambiguity about this issue is often tolerated by psychiatric staff, and the problem usually generates attention only in rare and extreme cases, often those in which criminal behavior or violence is involved.
For example, in July 1984, two Mormon fundamentalists, Ron and Dan Lafferty, visited the home of their brother, Allen, in the town of American Fork, Utah. Allen was away working at the time, but they were greeted by Allen’s wife, Brenda. After entering the house, the two men murdered both Brenda and her 15-month-old daughter, Erica (Krakauer 2003). Arrested after a half-hearted attempt to evade law enforcement agencies, the Laffertys claimed that the murders had been carried out on the instruction of Jesus Christ; Dan later asserted that he was the Prophet Elijah. At trial, their crime presented a conundrum for mental health professionals, who were unable to agree on whether the brothers had a shared psychotic illness or were merely in the grip of an extreme religious ideology. A similar dilemma faced mental health professionals at the trial of the Norwegian mass murderer Anders Behring Breivik, who in July 2011 bombed Norwegian government offices in Oslo, killing eight people, before shooting and killing 69 young political activists attending a summer camp on the nearby island of Utøya (Melle 2013; Parnas 2013). Breivik justified his actions on the grounds that he was a member of a secret organization, the Knights Templar, that was fighting feminism, the “Islamification” of his country, and the “cultural suicide of Europe.”
Legal attempts to adjudicate these kinds of cases have usually focused on questions of culpability and the possibility that a person accused of a crime should be considered not guilty on the grounds of insanity (in this context, it is important to note that insanity is a legal and not a clinical concept). In many jurisdictions, the relevant legislation employs some version of the M’Naghten rule, named after Daniel M’Naghten, a Scottish woodturner who on January 20, 1843, shot Edward Drummond, a civil servant whom he had mistaken for the prime minister, Robert Peel. By modern standards, M’Naghten would probably be diagnosed as psychotic because he held a complex set of highly paranoid ideas about the Tory government that was in power at the time. According to the rule that now bears his name, established by the British House of Lords, a successful not guilty plea requires the defense to establish that “the party accused was labouring under such a defect of reason from disease of the mind, as not to know the nature and quality of the act he was doing, or if he did know it, that he did not know that what he was doing was wrong” (Allnutt et al. 2007, p. 294). However, in recent times this defense has rarely been successful. Several U.S. states, including Utah, where the Laffertys were tried, removed the defense from their statute books after John Hinckley Jr. tried but failed to assassinate Ronald Reagan on March 30, 1981, in an attempt to impress actress Jodie Foster. Even so, in many states it is still possible for the defense to raise mental health issues in mitigation or to question whether a defendant is fit to stand trial. Norway, where Breivik was tried, has been almost unique in simply requiring proof of psychotic illness (determined by standard psychiatric criteria, with no evidence of impaired judgment required) as grounds for a not guilty verdict.
The testimonies of the expert psychiatric witnesses at the trials of Ron Lafferty and Anders Breivik ultimately proved contradictory and controversial. (Dan was tried separately from his brother, conducted his own defense, and rejected any suggestion that his mental health was impaired.) In both cases, the court was unconvinced by the psychologists and psychiatrists who argued that the defendants’ crimes were caused by their delusions and sided with those who argued that their beliefs were not pathological. Making the case that there was nothing pathological about Ron Lafferty’s beliefs, Noel Gardner, a psychiatrist at the University of Utah Medical School, acknowledged that the defendant’s beliefs were unusual but appealed to the madness of crowds:
Many of us believe in something referred to as trans-substantiation. That is when the priest performs the Mass, that the bread and the wine become the actual blood and body of Christ. From a scientific standpoint, that is a very strange, irrational, absurd idea. But we accept that on the basis of faith, those of us who believe that. And because it has become so familiar and common to us, that we don’t even notice, in a sense, it has an irrational quality to it. Or the idea of the virgin birth, which from a medical standpoint is highly irrational . . . . (Krakauer 2003, p. 301)
Commenting on the Breivik case, one observer lamented that the disputes about his mental health could have been avoided if only those who had examined him had paid less attention to the content of his beliefs and instead focused more on subtle phenomenological features that marked out true delusions (Parnas 2013).

Phenomenology and the Continuum Debate

The phenomenological approach to psychiatry traces its roots to the work of European philosophers, notably Franz Brentano (1838–1917); Edmund Husserl (1859–1938); Martin Heidegger (1889–1976); and Karl Jaspers (1883–1969), who was also a psychiatrist (see Bovet and Parnas 1993; Broome et al. 2012). Clinicians working in this tradition have argued that psychosis is a disturbance of the way the individual experiences their existence in the world, which can only be revealed by the clinician who uses empathy as a tool for understanding the unique meaningful connections that compose the patient’s mental life. These connections are held to be quite different from the kinds of causal relationships that are the concern of the natural sciences (Jaspers 1913/1963). It is this disturbance of experience (arguably absent in the case of Anders Breivik but less obviously so in the case of the Lafferty brothers) that is thought to prove that delusions are qualitatively different from ordinary beliefs and attitudes.
Jaspers (1913/1963) noted that the beliefs of psychiatric patients are typically held with great conviction, are resistant to counterargument, and seem bizarre to observers, but he was aware that these criteria could also be applied to other fervently held beliefs and attitudes. He therefore argued that meeting these three criteria was not sufficient for beliefs to qualify as true delusions as opposed to what he termed “overvalued” ideas. Because true delusions do not arise meaningfully from the individual’s personality and life experiences, the clinician will fail to empathize with the patient, no matter how hard he or she tries. Delusions, therefore, could only be “explained,” presumably as some kind of disorder of the CNS; they are the consequence of a more or less sudden breakdown in meaning (Jaspers 1913/1963). Taking this argument to its logical extreme, a later phenomenologically inclined researcher not only rejected the idea that delusions are mistaken beliefs but argued that they are “empty speech acts, whose informational content refers to neither world nor self” (Berrios 1991, p. 12).
This approach leads to the often-made distinction between the form and content of a belief, for example as articulated by Kurt Schneider:
Diagnosis looks for the “How” (form) not the “What?” (the theme or content). When I find thought withdrawal, then this is important to me as a mode of inner experience and as a diagnostic hint, but it is not of diagnostic significance whether it is the devil, the girlfriend or a political leader who withdraws the thoughts. Wherever one focuses on such contents, diagnostics recedes; one sees then only the biographical aspects or the existence open to interpretation. (quoted in Hoenig 1982, p. 396)
This distinction has led phenomenologists to emphasize the affective and experiential aspects of delusional thinking rather than what patients say they believe (Feyaerts et al. 2021). The plausibility of this approach, of course, depends on the success with which these mental states can be characterized. One strategy has been to focus on the period that precedes the onset of the fully developed delusional system. For example, in detailed studies of more than 100 patients with psychosis—mostly soldiers with paranoia symptoms—conducted in a military hospital during the Second World War, German psychiatrist Klaus Conrad (1905–1961) claimed to identify a series of stages through which their ideas evolved (Conrad 1958/2012; see also Bovet and Parnas 1993 and Mishara 2010). First, according to Conrad, there is an initial phase of das Trema (derived from Greek, colloquial for “stage fright”) or delusional mood, which may last for a few days or much longer, in which the patient feels a sense of tension, that there is something in the air, but is unable to say what has changed. At first this applies only to certain events and objects, but it gradually widens to encompass everything in the patient’s world, creating suspiciousness, fear, and a sense of separation from others. This leads to a state of apophany (revelation) in which the delusion appears suddenly, as an “a-ha!” experience, often bringing about a sense of relief. Finally, in the anastrophe (turning back) phase, the patient feels themself to be the passive focus around which the delusional business of the world is revolving. In psychiatric research, these ideas have been influential in attempts to identify very early prodromal or basic symptoms of psychosis (e.g., Klosterkötter et al. 2001) but otherwise have been subject to very little empirical investigation.
Without a doubt, phenomenological research has been useful in making us think more broadly about psychopathological phenomena, but it has not been without limitations. One, which will not detain us here, is the problem that people have when trying to put private experiences into words to report them; the philosopher Ludwig Wittgenstein (1953) provided a compelling analysis of the limits of language in this regard. A more important limitation for the present purposes concerns the assumption of abnormality that has been made in these studies. Phenomenologists have generally conceived “normality” in terms of either coherence (whether experiences are in agreement with other experiences) or optimality (whether experiences contribute to the richness and differentiation of intentional objects in the world) rather than in statistical terms (Heinämaa and Taipale 2018). Within this framework, it is of course still necessary to consider a variety of experiences, yet phenomenological researchers have generally focused only on those of people diagnosed as having mental illness and have neglected to consider the variety of ordinary beliefs and attitudes (Connors and Halligan 2021). This has led them to underestimate the madness of crowds.
To see how serious this oversight is, we can consider religious beliefs, which often have exactly the kind of experiential component that phenomenologists think is the key to understanding delusions. Probably the best-known example of a profound change in religious belief is the conversion to Christianity of Saul of Tarsus (later known as Paul the Apostle). Born a Roman citizen to a devout Jewish family, he was the beneficiary of a broad education by the standards of his time but, as a young man, assisted in the persecution of the early Christians. At some point between 31 and 36 C.E., while traveling on the road to Damascus, he underwent a sudden and dramatic mystical experience, the nature of which has ever since been the subject of theological as well as psychological debate, a debate made possible in part because the event is described differently in different passages of the New Testament. According to the most widely quoted account in the Acts of the Apostles (which describes the event in the third person):
And as he journeyed, he came near Damascus: and suddenly there shined round about him a light from heaven.
And he fell to the earth, and heard a voice saying unto him, Saul, Saul, why persecutest thou me?
And he said, Who art thou, Lord? And the Lord said, I am Jesus whom thou persecutest: it is hard for thee to kick against the pricks.
And he trembling and astonished said, Lord, what wilt thou have me to do? And the Lord said unto him, Arise, and go into the city, and it shall be told thee what thou must do.
And the men which journeyed with him stood speechless, hearing a voice, but seeing no man.
And Saul arose from the earth; and when his eyes were opened, he saw no man: but they led him by the hand and brought him into Damascus.
And he was three days without sight, and neither did eat nor drink. (King James Bible, Acts 9:3–9)
Occasionally, neurologists have attempted to explain away episodes of this kind as the product of epilepsy. Indeed, modern studies have found that patients with temporal lobe epilepsy often show high levels of religiosity, and one study even claimed to have detected abnormal brain waves in a patient with epilepsy who had a messianic experience while being monitored by electroencephalogram (Tedrus et al. 2015). However, it seems very unlikely that all religious experiences can be accounted for in this way. Spiritual encounters not only have been reported by key figures in all three of the Abrahamic religions but also seem to be surprisingly common among ordinary people. This was demonstrated by a research program initiated by Sir Alister Hardy (1896–1985), an Oxford-based marine zoologist and one-time Antarctic explorer, who believed that human spirituality is an evolved capacity and that the spiritual strength that results from religious experiences contributes to resilience in the face of stress. Compiling more than 6,000 first-person accounts of religious experiences sent to him by members of the general public, he reported that many of these accounts (29%) described a pattern of events that convinced the individual that they were meant to happen; others described the direct presence of God (27%), prayers being answered (25%), being looked after or guided by a presence (22%), or an awareness of the sacred in nature (16%) (Hardy 1979). Detailed interviews conducted later with a small number of people who had contacted the center that Hardy established found that these types of experiences could not be distinguished from psychotic experiences in terms of either content or form (Jackson and Fulford 1997).
The logical alternative to the idea that delusions are qualitatively different from other beliefs, typically favored by psychologists, is to propose a continuum between normal and abnormal believing. Yet, arguably, the interpretation of the evidence that appears to support this hypothesis has also been limited by simplistic assumptions about the nature of normal beliefs and attitudes. Two kinds of evidence are often cited to support the continuum.
The first type of evidence concerns the prevalence of abnormal beliefs in the general population as revealed in epidemiological studies. For example, in a study of people attending appointments with their general practitioners in southwestern France using the Peters et al. Delusions Inventory (Peters et al. 1999b), of those who had no history of psychiatric treatment, 69.3% reported that people they knew were not who they seemed to be, 46.9% reported telepathic communication, 42.2% reported experiencing seemingly innocuous events that had double meanings, and 25.5% reported that they were being persecuted in some way (Verdoux et al. 1998). In the epidemiological Netherlands Mental Health Survey and Incidence Study (NEMESIS), 3.3% of the 7,000 participants were judged to have delusions, and 8.7% were estimated to have similar ideas that were judged to be not clinically significant because they were not associated with distress (van Os et al. 2000). A later German study confirmed that the delusions of psychiatric patients and apparently similar beliefs in nonpatients are mainly distinguishable in terms of the distress associated with them rather than by either conviction or the extent to which the individual is preoccupied with the belief (Lincoln 2007).
This evidence is not decisive because it is possible that psychopathological phenomena are more prevalent than previously supposed but nonetheless qualitatively distinct from normal psychological phenomena. Hence, the second type of evidence often appealed to in support of the continuum hypothesis has been obtained by examining the distribution of beliefs in the population more closely using appropriate statistical techniques. Studies that have attempted this have typically focused on paranoid beliefs. For example, Freeman et al. (2005) administered a checklist of paranoid thoughts to an online convenience sample of more than 1,000 predominantly female university students who were asked to rate each item (e.g., “People would harm me if given an opportunity”) on frequency over the past month, conviction, and distress. The three scales were highly correlated, and total scores formed a smooth exponential decay curve, with large numbers of participants endorsing nonpathological items and rarer items being endorsed only by those who had high total scores. A subsequent study by the same group used items picked out of the 2014 U.K. Adult Psychiatric Morbidity Survey of a representative sample of 7,000 adults (Bebbington and Nayani 1995). The analysis, which used sophisticated statistical techniques, identified four separate components of paranoia—interpersonal vulnerability, ideas of reference, mistrust, and fear of persecution—and again found that total scores on the items were distributed along an exponential decay curve.
One limitation of these studies is that they included no clinical samples. Elahi et al. (2017) compiled data on more than 2,000 healthy participants (mainly students and predominantly female), 157 patients with prodromal psychosis, and 360 patients with psychosis from previous studies that had used the same paranoia measure. The study used three separate taxometric methods, which have been developed to discriminate between continua and taxa (classes of individuals with unique characteristics), and the analyses were carried out both on the entire sample and on the nonclinical participants alone. The findings strongly supported a continuum model whether the clinical participants were included or excluded.
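For readers who want to picture what a “smooth exponential decay curve” of checklist scores looks like under a continuum model, the following minimal Python simulation may help. Everything in it is hypothetical (the latent trait, the item thresholds, and the sample size are invented for illustration), and it is not the analysis used in the studies cited above; it simply reproduces the qualitative pattern they describe, in which total scores thin out smoothly toward the high end and the rarest items are endorsed mainly by high scorers.

```python
# Illustrative sketch only: simulated data with arbitrary, hypothetical
# parameters, not a reanalysis of the studies cited above.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_items = 10_000, 16

# Hypothetical latent paranoia trait: most people low, a long right tail.
trait = rng.exponential(scale=1.0, size=n_people)

# Hypothetical checklist: each item has a "difficulty" threshold, and the
# probability of endorsing it rises with the latent trait (logistic link).
thresholds = np.linspace(1.5, 5.0, n_items)
p_endorse = 1.0 / (1.0 + np.exp(-(trait[:, None] - thresholds)))
endorsed = rng.random((n_people, n_items)) < p_endorse
total_scores = endorsed.sum(axis=1)

# Frequency of each total score: a smooth decline from low to high scores.
counts = np.bincount(total_scores, minlength=n_items + 1)
for score, count in enumerate(counts):
    print(f"total score {score:2d}: n = {count}")

# The least-endorsed item is endorsed mostly by people with high total scores.
rarest = endorsed.mean(axis=0).argmin()
print("mean total score, endorsers of rarest item:",
      round(total_scores[endorsed[:, rarest]].mean(), 1))
print("mean total score, everyone else:",
      round(total_scores[~endorsed[:, rarest]].mean(), 1))
```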
Earlier I criticized the phenomenologists for simplistic assumptions about the nature of normal beliefs, and the same charge can be directed against continuum theorists. The studies I have just described, which were based on questionnaires and structured interviews focused entirely on belief content, ignored the kinds of experiential aspects of believing that the phenomenologists have highlighted and that I have previously suggested are often evident in both normal and abnormal beliefs. Indeed, for the most part, psychological research has treated beliefs simply as propositions written on an inner list that is accessible only to the believer but that (assuming the respondent is truthful) investigators can access by asking the right questions. What seems to be missing from both the phenomenological and the psychological approaches is an adequate understanding of what believing entails.

Understanding Belief Systems

The concept of belief is ubiquitous in the human sciences and plays a central role in disciplines as diverse as history, anthropology, sociology, economics, and psychology. Indeed, to catalog its central role in how we attempt to understand the human mind and behavior would be a formidable task. At times philosophers have attempted to dispense with the concept but failed (Stich 1996), not least because the belief that there are no such things as beliefs is, of course, a belief. There is a plausible case to be made that the central role of belief is what distinguishes the social sciences from the natural sciences; this is implicit, for example, in Winch’s (1958) attempts to define the unique features of the former. It is all the more surprising, then, that there has been no serious psychological analysis of what is involved in believing.

Cultural Origins of Our Understandings of Belief

A contemporary philosophical treatment of belief begins with the following observation:
Contemporary analytic philosophers of mind generally use the term “belief” to refer to the attitude we have, roughly, whenever we take something to be the case or regard it as true. To believe something, in this sense, need not involve actively reflecting on it: Of the vast number of things ordinary adults believe, only a few can be at the fore of the mind at any single time . . . . Many of the things we believe, in the relevant sense, are quite mundane: that we have heads, that it’s the 21st century, that a coffee mug is on the desk. Forming beliefs is thus one of the most basic and important features of the mind. (Schwitzgebel 2015)
Given the central role of this “basic and important feature of the mind” in human affairs, it is not surprising that evidence of believing is as old as recorded culture; the oldest literary works in the Western canon, The Iliad and The Odyssey, imply the existence of a wide range of beliefs (e.g., about the role of the gods in human affairs). However, according to the Oxford English Dictionary, the oldest uses of the word belief in the English language are all theological. For example, the first recorded use of the mental conviction form of belief appears to be a Middle English passage in Ælfric’s Homily on the Nativity of Christ, written in about 1175. This reads as Ðesne laf we æteð þonne we mid bileafan gað to halige husle ure hælendes lichame, which can be translated as “This bread we eat when we with faith go to the holy Eucharist of Our Lord’s body.” This history is not accidental: nearly all of the earliest English language texts were written by monks, and the creation of theological doctrine implies the need for a deep contemplation about the nature of belief.
Nonetheless, the scope of the word belief and how people have thought about the process of believing have changed with time. From the very beginning, Christianity was a highly propositional religion, perhaps as a consequence of the early Church’s need to defend itself from the many dissenting cults (Ebionites, Docetists, Gnostics, and Arians, to name a few) that were eventually eliminated after Constantine the Great made Christianity the official religion of Rome in the early fourth century (Holland 2012). In the face of these challenges, orthodoxy became the child of heresy, and the early Church devoted considerable energy to the development of checklists of beliefs, known as creeds, that could be used to induct members, unify congregations, and distinguish between true believers and heretics (Hinson 1979). This may be the cultural origin of the inner-list idea, in which beliefs are assumed to be propositions in the mind or brain that can be read by a skillful interrogator or clinical psychologist.
Later, in the Medieval period, when Catholic orthodoxy intruded into every aspect of daily life, to believe was simply to assent to the doctrines of the Church and therefore something that was effortless and never questioned. Inevitably, this way of thinking ended with the Reformation; the emergence of competing theologies necessitated a transformation of believing into something that was difficult, often subject to error, and that might lead to a mistaken understanding of the world (Shagan 2018). These developments ultimately led to the birth of science and our modern conception of belief as an individual assessment of the likelihood of certain facts based on an assessment of probabilities, a transformation that was associated with dramatic changes in the way that ordinary people saw the world. A typical well-educated European in 1600, for example, believed that witches and werewolves existed, that a murdered body would bleed in the presence of a murderer, that a rainbow was a sign from God, and that comets portended evil—all ideas that were widely regarded as ridiculous a century later (Wootton 2015).
These historical observations point to what appears to be a paradox: the concept of belief is indispensable, yet it would be naive in the extreme to expect human psychological processes to mirror precisely the changing conceptual architecture of the English language. Indeed, English (and presumably all other languages) includes many words that are to some extent interchangeable with the word belief (e.g., attitude, value, opinion, conjecture, prediction). This rich language is useful in everyday life but has probably impeded any attempts to construct a unified understanding of belief. Although it seems obviously a mistake to try to construct a separate psychological theory aligned with each of these terms, that is what has happened in practice.
To make sense of what is involved in belief, then, we must identify psychological processes that correspond approximately to the ordinary language term belief and its near synonyms but without assuming a precise fit. This is a challenging task that is well beyond the space available in this chapter and requires synthesizing findings from a wide range of research areas that have been largely ignored by psychopathologists, such as political psychology, the psychology of religion, and the broader disciplines of sociology and anthropology. However, it is one that I hope to show, in the few pages available to me, has immense promise for informing our understanding of the distinction between pathological and nonpathological beliefs.

There Is No Inner List

Elsewhere I have defended the view that beliefs are propositional and underpinned by the human capacity for language (Bentall 2018). Observing the family dog staring hopefully at the food cupboard at dinnertime might prompt the assumption that animals can hold beliefs, and there is clearly a lot that is belief-like about their behavior. Contemporary comparative psychology (Pearce 2008) and computational models of associative learning (Dayan and Abbott 2005) treat both Pavlovian and instrumental conditioning as complex processes in which the animal adjusts its predictions about the world, adapting to changing circumstances in a way that allows it to efficiently use the resources in its environment, a set of processes that could certainly be characterized as belief-like. It is also worth noting that animals find the state of being unable to predict the world highly aversive; Pavlov showed that dogs presented with conditioned stimuli that are impossible to distinguish become highly distressed, a phenomenon he called experimental neurosis (Liddell 1947), and later research using both Pavlovian and instrumental conditioning paradigms showed similar effects in a wide range of species (Mineka and Kihlstrom 1978). All this being said, no dog has ever become either a jihadist or a political scientist.
The human capacity to form sentences allows us to use grammatical rules and word combinations to construct sophisticated descriptions of the world (e.g., the blueprint of a nuclear submarine), rules to follow (e.g., the recipe for sourdough bread), novel statements about complex contingencies (e.g., “If there is no food in the valley during winter, the next best place to look for grub is in the forest at the far side of the mountain”), and abstract concepts (e.g., E = mc2) that are denied to other species. Moreover, because human beings internalize language at an early age and use it as a vehicle of thought (Fernyhough 2016; Vygotsky 1962), there is a consequent enhancement of cognitive capacity even when we decline to share our thoughts with other people. The invention of literacy was likely associated with a further expansion of human cognitive skills (Ong 1982) and enabled beliefs to be recorded and then passed on from generation to generation. It is impossible to overestimate the significance of this ability; without it there would be no culture or science.
It is important to recognize that by claiming that beliefs are propositional and language based, I am not claiming that beliefs involve only propositional processes. There are sound philosophical reasons for supposing that our knowledge of the world ultimately rests on a bedrock of processes that are nonpropositional (Wittgenstein 1969; see also Gipps and Rhodes 2011; Rhodes and Gipps 2008), and there is indeed experimental evidence that associative processes play an important role in many aspects of language (e.g., Colunga and Smith 2005).
Nor does claiming that beliefs are propositional necessitate an inner-list model. Stich (1996), in his attempt to dispense with the concept of belief, argued that there was nothing known to neuroscience that corresponded to the idea of a stored catalog of propositions and, moreover, that connectionist models of neural architecture could demonstrate belief-like behavior in the absence of any stored representations of this kind. This argument, as Stich later conceded, did not achieve its original goal of eliminating the need for the concept of belief, but it does undermine the inner-list notion. Moreover, the list metaphor fails for many other reasons. For example, it has difficulty accounting for novel beliefs, such as my conviction that there are no teacups on the far side of the Moon, and also inadequately captures the interdependency between beliefs (my belief about the absence of teacups on the Moon is underwritten by beliefs about the nature of the Moon and teacups). More important from a clinical perspective, it fails to reflect the dynamic, shifting ways in which beliefs are constructed and adjusted, either in the course of argument with other human beings (Edwards and Potter 1992) or when we deliberate with ourselves.
This dynamic negotiation of belief is evident when psychiatric patients discuss their beliefs with other people (Georgaca 2000) and during the evolution of delusional systems. For example, Peter Chadwick (2008), a British psychologist who became psychotic after suspecting that his neighbors knew about his bisexuality and transvestism, described how he spent many months constructing the all-preoccupying narrative that eventually became his delusional system:
But all [of these events earlier in his life] were eventually collected up, knitted together and turned into a delusional web of thoughts and feelings that in the end drove me to multiple suicide attempts that very nearly succeeded in killing me. In madness, no moment of one’s existence seems to be wasted; it is as if one’s whole life, and the depths of one’s very being in selective perspective, have been made magically clear in their awful and portentous significance. (p. 5)
To borrow a metaphor from the philosopher Dan Dennett (1991): When we administer a questionnaire or ask someone what they believe, we are not asking our interlocutor to read out from an inner list; instead, we are prompting the person to author the latest in an endless series of multiple drafts of what they believe. This draft will, in due course, be followed by many further drafts in the future.

Certainty

As noted in the subsection “Delusions of Psychiatric Patients,” it has often been said that delusions are unusual in being held with extraordinary conviction. Many studies, either using interview methods (e.g., So et al. 2012) or using experimental methods to observe how patients change their minds in the face of conflicting information (e.g., Woodward et al. 2008), have shown that belief inflexibility is associated with the certainty with which patients hold their delusional beliefs (Zhu et al. 2018). At the same time, other theorists have attempted to analyze belief inflexibility in terms of problems of updating prior beliefs (McKay 2012) or as a neurobiological error in the mechanisms responsible for processing discrepancies (errors) between expected events and those that actually occur (Corlett 2018).
A general problem with these approaches is that they have given insufficient attention to the resistance to attitude change evidenced by nonpathological but personally meaningful beliefs, such as political attitudes (Lodge and Taber 2013; Westen 2008). In a study by Colbert et al. (2010), people with delusions, people who had recovered from delusions, and healthy control subjects were asked whether they were willing to consider whether their beliefs (delusional in the case of patients, idiosyncratic but meaningful in the case of the control subjects) were mistaken. Personally meaningful beliefs were held with equal conviction, and equally inflexibly, in all three groups.
These observations highlight a feature of beliefs that is implicit in the philosophical account cited in the subsection “Cultural Origins of Our Understandings of Belief” (Schwitzgebel 2015): beliefs come, as it were, in two parts—a proposition (statement about the world) and a subjective estimate of the certainty that the proposition is true. This estimate is usually not explicitly articulated, but English (and I imagine every other human language) furnishes us with a rich variety of means to express our certainty implicitly (“I guess that . . .,” “I think that . . .,” “I believe that . . .,” “I expect that . . .,” and so on). Experimental studies show that in ordinary life, people’s subjective confidence usually correlates fairly well with the accuracy of what they believe (Koriat 2012).
Intriguingly, the subjective feeling of knowing something can sometimes become completely divorced from actual knowing. A common experience of this kind is the tip-of-the-tongue phenomenon, which was elegantly described at the end of the nineteenth century by the philosopher-psychologist William James (1893):
Suppose we try to recall a forgotten name. The state of our consciousness is peculiar. There is a gap therein; but no mere gap. It is a gap that is intensely active. A sort of wraith of the name is in it, beckoning us in a given direction, making us at moments tingle with the sense of our closeness and then letting it sink back without the longed-for term. If wrong names are proposed to us, this singularly definite gap acts immediately so as to negate them. (p. 251)
A noteworthy feature of this phenomenon is that it involves a state of awareness of knowing something that the individual does not yet know (Koriat 2000). It is therefore not far-fetched to suggest that this kind of experience might be in some way related to the Trema experience of something being in the offing, which Klaus Conrad (1958/2012) saw as a prelude to delusions.
Whether or not this parallel is correct, research evidence suggests that at least some patients with psychosis and people with a disposition to psychosis are impaired in their ability to estimate the certainty of their beliefs (Balzan 2016). For example, Moritz et al. (2015) asked a large population sample to rate their confidence in their answers in a game modeled after the television show “Who Wants to Be a Millionaire?” and found that a poor correlation between confidence and accuracy was associated with paranoid thinking. These observations suggest that it might be fruitful for psychopathologists to consider the processes by which human beings make these kinds of judgments and study them in clinical samples.
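As an illustration of what a poor correlation between confidence and accuracy means operationally, the following minimal Python sketch computes a simple confidence-accuracy correlation for two hypothetical respondents, one reasonably well calibrated and one uniformly overconfident. The quiz data, numbers, and scoring are invented for illustration and are not those used by Moritz et al. (2015).

```python
# Illustrative sketch only: hypothetical quiz responses, not the task or
# scoring used in the study cited above.
import numpy as np

rng = np.random.default_rng(1)
n_questions = 40  # hypothetical quiz length

def confidence_accuracy_r(accuracy, confidence):
    """Pearson (point-biserial) correlation between whether each answer was
    correct (0/1) and the confidence rating (0-100) given for that answer."""
    return np.corrcoef(confidence, accuracy)[0, 1]

# Hypothetical well-calibrated respondent: confidence tracks correctness.
correct_a = rng.integers(0, 2, size=n_questions)
conf_a = np.clip(40 + 40 * correct_a + rng.normal(0, 15, size=n_questions), 0, 100)

# Hypothetical poorly calibrated respondent: uniformly high confidence
# regardless of whether the answer is right or wrong.
correct_b = rng.integers(0, 2, size=n_questions)
conf_b = np.clip(rng.normal(85, 10, size=n_questions), 0, 100)

print(f"calibrated respondent:   r = {confidence_accuracy_r(correct_a, conf_a):.2f}")
print(f"uncalibrated respondent: r = {confidence_accuracy_r(correct_b, conf_b):.2f}")
```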
Cognitive psychologists studying human judgment (Koriat 2012) and consumer psychologists interested in attitudes toward commercial products (Tormala and Rucker 2018) have conducted a great deal of research on this topic. In general, the various theoretical models that they have proposed view certainty judgment as a metacognitive skill that is driven either by the direct experience of knowing or by inference from contextual factors.
The distress shown by animals when they are exposed to unpredictable environments (Liddell 1947; Mineka and Kihlstrom 1978) suggests that a direct sensitivity to uncertainty may be a fundamental feature of the nonpropositional learning systems shared by all vertebrate species. Moreover, some human studies have demonstrated that certainty can be read directly from our cognitive processes. For example, easily recalled knowledge is typically judged as more certain than difficult-to-recall knowledge (Alter and Oppenheimer 2009); hence, fast response times when recalling information are generally a strong predictor of confidence in the accuracy of memories (Zakay and Tuvia 1998). People are also more likely to feel a high level of certainty about a belief if the relevant information that is available to them is highly consistent (Smith et al. 2007) or if their beliefs are based on direct personal experience rather than information they have obtained secondhand from someone else (Wu and Shaffer 1987).
However, certainty judgments are sometimes inferred from self-knowledge, which leads to unjustifiably high perceptions of certainty when self-knowledge is inaccurate. For example, certainty increases when people believe that they have spent time and effort reflecting and elaborating on their beliefs, even when they have been misled into overestimating the time they have spent on this process (Barden and Petty 2008). Similarly, simply coaxing people to think that they have personally arrived at a theory that is in fact given to them is sufficient to increase their appraisal that the theory is true (Gregg et al. 2017). A complication is that individuals with the least knowledge in a particular domain (whose beliefs are least accurate) are generally poorest at judging their own knowledge, a phenomenon that is often referred to as the Dunning-Kruger effect after the psychologists who first studied it (Dunning et al. 1995; Ehrlinger et al. 2008).
These kinds of studies, for the most part conducted on ordinary people, provide important clues about why some beliefs—not only delusions—are held with extraordinary conviction. However, the way that beliefs are organized likely also plays a role.

Belief Systems

In a monograph on anti-Semitism penned shortly after the Second World War, the philosopher Jean-Paul Sartre (1948) noted that Enlightenment scholars had tended to treat beliefs as atomized, rational propositions that could each be evaluated independently. Anti-Semitism, Sartre argued, dispelled this assumption because hostility toward Jews seemed to be predictably related to other beliefs and attitudes, such as authoritarianism. This observation points to one of the most important features of human beliefs, which is that far from being randomly related to one another, they are often organized into master interpretive systems by which individuals interpret the world and their place within it (Bentall 2018). Examples of these master interpretive systems include religious beliefs (McCauley and Graham 2020; Norenzayan and Gervais 2013), political ideologies (Huddy et al. 2013), conspiracy theories (Brotherton 2015; Douglas et al. 2017), and beliefs about the supernatural (Dean et al. 2021).
Interestingly, these systems for the most part are correlated with one another, which suggests that there might be some common factors that lead people to embrace them. It is perhaps unsurprising that religiosity is associated with belief in the supernatural (Lindeman and Svedholm-Häkkinen 2016; Thalbourne 1995), but positive correlations have also been reported between religiosity and conservatism (Schlenker et al. 2012) and between conservatism and conspiracy theories (Galliford and Furnham 2017). Studies have shown that belief in conspiracy theories, in turn, is positively correlated with religiosity (Mancosu et al. 2017; Newheiser et al. 2011) and belief in the paranormal (Darwin et al. 2011; Newheiser et al. 2011; Swami et al. 2011). Importantly, these kinds of associations transcend the division between clinical phenomena and the madness of crowds. For example, and again not surprisingly, paranoia correlates with the tendency to believe in conspiracy theories (Imhoff and Lamberty 2018), but it is also associated with paranormal beliefs (Darwin et al. 2011). Similarly, both delusionality (Peters et al. 1999a) and paranoia (Ayeni et al. 2011) have been observed to be positively associated with religiosity. Although it is not possible to do justice here to the vast research that has been conducted on each of these types of belief systems, these literatures have some very important implications for understanding delusions that can be summarized briefly. In what follows, I focus mainly on political beliefs and conspiracy theories.
An important question addressed by political psychologists concerns why beliefs form distinctive patterns or structures of the kind intuited by Sartre. For example, historically, political actors have generally fallen into two groups: those who seek stability and order versus those who seek progress and rapid reform. This difference can be traced back as far as ancient Greece (Hibbing et al. 2014) but is today most commonly referred to as the right-left dimension, reflecting the seating arrangements of the French Estates General during the French Revolution, where those who opposed the Ancien Régime and the Bourbon monarchy sat on the left. This distinction is evident in modern political attitudes and voting behavior (Jost et al. 2009), although other ways of describing variations in political belief have also been proposed; for example, a considerable volume of research has focused on authoritarian traits, which are usually thought to align with the right end of the political spectrum (Adorno et al. 1950; Stenner 2005).
Underneath this apparently simple structure, however, lies considerable complexity. In a landmark study, the American political scientist Philip Converse (1964/2006) noted that associations between specific beliefs can be sustained by logical links (e.g., the belief that government should be as small as possible implies support for low tax rates and miserly benefits for the unemployed). Political belief systems can therefore be thought of as networks of interconnected attitudes that influence one another, and the same is presumably true for other kinds of belief systems. However, when Converse analyzed both quantitative and qualitative data collected from American voters in the 1950s and early 1960s, he found that these kinds of logical relationships held only at the extreme left (“liberal” in U.S. parlance) and right (“conservative”) ends of the spectrum, where ideologically committed voters had often spent many years actively refining their belief systems and developing sophisticated arguments to handle apparent contradictions. Outside these extremes, voters often cast their ballots on flimsier grounds, such as a niche policy promise (a new law to prevent cruelty to animals) or even a vague feeling that a particular politician is trustworthy. Consistent with Converse’s findings, more recent research has shown that the correlation between attitudes toward economic conservatism (free market economics) and social conservatism (family values) is greater at either end of the political spectrum than in the center (Feldman and Johnston 2014).
Note that the interconnectivity of beliefs within a system helps to explain why these systems are highly resistant to counterargument or refuting evidence. Many psychiatric patients, like political ideologists, have often spent many years finessing their theories and eliminating any contradictions within them; the firsthand account given by Peter Chadwick quoted in the subsection “There Is No Inner List” seems to illustrate this process. As a consequence, patients’ beliefs are not atomized but woven into elaborate systems analogous to those formed by committed ideologists. Such a highly interconnected network of beliefs is likely to be resistant to perturbation because each individual belief within the system is sustained by the beliefs connected to it: a change to one belief is blocked by the rigidity of all the beliefs associated with it.
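This resistance can be illustrated with a toy simulation (my own sketch, not a model drawn from the literature discussed in this chapter) in which each belief is repeatedly pulled toward agreement with the beliefs it is connected to. When a single belief in a tightly interconnected cluster is challenged, its neighbors simply pull it back.

```python
# Toy sketch of a mutually supportive belief network (illustrative only).
# Each belief is endorsed (+1) or rejected (-1) and is repeatedly updated
# toward the majority view of the beliefs it is connected to.
import random

def settle(beliefs, links, steps=200):
    """Randomly pick beliefs and align each with the net pull of its neighbors."""
    for _ in range(steps):
        node = random.choice(list(beliefs))
        pull = sum(beliefs[other] for other in links[node])
        if pull != 0:
            beliefs[node] = 1 if pull > 0 else -1
    return beliefs

# Five beliefs, each connected to all the others (a tightly knit system)
links = {i: [j for j in range(5) if j != i] for i in range(5)}
beliefs = {i: 1 for i in range(5)}

beliefs[0] = -1        # "refute" a single belief in the system
settle(beliefs, links)
print(beliefs)         # belief 0 is almost certainly restored to +1 by its neighbors
```

In a sparsely connected network, by contrast, a challenged belief has fewer neighbors to pull it back into line, which captures, in a very crude way, why a belief embedded in an elaborate system is harder to revise than one standing alone.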
Although these kinds of logical associations are clearly important in explaining the resilience of belief systems to challenge, they are not the only factor at work, and to see why this must be the case, it is useful to consider conspiracy theories. These theories make a particularly interesting comparison with delusions, not least because, as noted earlier, they sometimes appear to be as “mad” as any beliefs observed in the psychiatric clinic. Conspiracy theories have often been embraced by extreme political projects on both the left and right of the political spectrum; for example, they played an important role in the Nazi project to turn Germany into an authoritarian state (Evans 2021), and they have long been part of political discourse in the United States (Hofstadter 1964). Unhelpfully, they are often confused with paranoia; for example, in a seminal essay, American historian Richard Hofstadter (1964) made this mistake by talking about the “paranoid style” in American politics. As we have seen, the two kinds of beliefs are indeed often correlated (Imhoff and Lamberty 2018), but factor analytic studies have shown that they are psychologically distinct, with different psychological predictors (Alsuhibani et al. 2022). For example, whereas paranoia is associated with low self-esteem, belief in conspiracies is associated with high self-esteem and narcissism (Alsuhibani et al. 2022; Cichocka et al. 2016). Consistent with this latter finding, people who score highly on measures of conspiracy thinking are more likely to endorse conspiracy theories if they think that they are endorsed only by a minority of other people (Imhoff and Lamberty 2017).
Someone who believes in one type of conspiracy is very likely to believe in others (Brotherton et al. 2013; Bruder et al. 2013). However, this cannot be explained by logical processes. For example, there is no logical connection between the belief that the American government has imprisoned extraterrestrials in Area 51 and the belief that Donald Trump is waging a secret war against pedophiles. Indeed, people are capable of believing in conspiracies that appear, at least on the surface, logically contradictory, for example, that Princess Diana faked her own death and that she was assassinated by the British Secret Service (Alsuhibani et al. 2022; Wood et al. 2012).
So if the glue holding together the network of conspiracy beliefs is not a set of logical links between the individual beliefs, what could it be? A tempting answer is that it is some kind of hidden, superordinate belief that is linked to all of the individual propositions in the network. In the case of conspiracy theories, this superordinate belief is presumably something about the duplicitousness of governments and institutions. This hypothesis is supported by results from a recent study of conspiracy theories and paranoia in three European countries (the United Kingdom, Ireland, and Spain); in all three, conspiracy theories were uniquely associated with mistrust in political institutions, whereas paranoia was uniquely associated with a tendency to judge unfamiliar faces as untrustworthy (Martinez et al. 2022).
This general idea that there are fundamental beliefs underlying surface beliefs crops up in many places in the psychological literature. One version of this idea can be found in Beck’s (1987) theory of depression, which includes as a component a negative cognitive triad of beliefs about the self, the world, and the future that, when subjected to factor analysis, seems to yield a single dimension of negative beliefs about almost everything (McIntosh and Fischer 2000). Another version can be found in the work of the social psychologist Jonathan Haidt (2013), who has argued that political ideologies can be explained in terms of variations in five moral foundations: the importance people place on caring for others, commitment to fairness, loyalty to one’s group, respect for rank and status, and concerns about sanctity (purity). Lerner’s (1980) proposal that human beings cling to the fundamental belief that the world is just is yet another example of this kind of theorizing. Researchers studying skeptical attitudes toward science have also pointed to the importance of understanding root beliefs in order to explain surface beliefs, such as conspiracy theories and the idea that climate change caused by the human consumption of fossil fuels is a hoax (Hornsey and Fielding 2017). Most recently, Clifton et al. (2019) built on these models in an ambitious attempt to identify a hierarchy of primal beliefs. According to this theory, various fundamental beliefs about the world are organized into three main kinds: beliefs that the world is safe versus dangerous, that it is enticing versus dull, and that it is alive versus mechanistic.
The problem with invoking more fundamental beliefs in the attempt to explain surface ones is that eventually we run out of beliefs. A cognitive-behavioral therapist might say that the downward arrow technique can go only so far. As Wittgenstein (1969) pointed out, there ultimately has to be a bedrock of knowledge that requires no justification on which we can anchor our propositions about the world. Indeed, although fundamental beliefs look like propositions, this is arguably only an illusion created by the fact that we have to use language to express them; the idea that we should respect rank and status or that the world is dull and unexciting cannot be evaluated for truth-value in the same way that, say, we can evaluate the belief that the Earth revolves around the Sun. Moreover, in many cases it is not hard to see how proposed fundamental beliefs map onto more basic, nonpropositional psychological processes, some of which are shared with other species. For example, the big three primal beliefs proposed by Clifton et al. (2019), described in the previous paragraph, seem to reflect sensitivity to threat, sensitivity to reward, and specifically human cognitive processes that have evolved to allow us to understand the behavior of other human beings. Similarly, the sanctity dimension of Haidt et al.’s (1997) moral foundations appears to be related to feelings of disgust, which is why disgust sensitivity is related to political conservatism and hostility to migrants (Aarøe et al. 2017).

Belief, Culture, and the Uniqueness of Delusions

Revisiting the Laffertys

There is one final characteristic of beliefs that we must consider before returning to the question of whether and how delusions are different from the madness of crowds. To understand this property, it is helpful to consider again the beliefs of Ron and Dan Lafferty, which provoked such disagreement among the mental health professionals who testified at Ron’s trial. Recall that the two brothers held that they had murdered their sister-in-law and her infant daughter on the instruction of Jesus Christ and that Dan believed, and to this day still believes, that he is the Prophet Elijah. It is worth adding that, unlike Anders Breivik (who later said that he had been deliberately exaggerating his paranoid beliefs to mislead psychiatrists), the Laffertys were reticent about sharing their beliefs with investigators and never recanted. The complete backstory to the murders was uncovered only some years later by investigative journalist Jon Krakauer (2003) in his remarkable book Under the Banner of Heaven.
Polygamy has long been a source of conflict among the Church of Jesus Christ of Latter-Day Saints (LDS Church; the institutional authority of the Mormon religion), the U.S. government, and some of the Church’s followers. The practice was introduced by the religion’s founder, Joseph Smith, in Illinois in the 1830s and was officially advocated by the LDS Church from 1852 onward, by which time the membership had moved and become a dominant force in the territory of Utah. Although an 1862 law passed by the U.S. Congress prohibited plural marriage, many Mormons were undeterred, believing that they were protected by the First Amendment of the U.S. Constitution. This position became untenable in 1879, when the U.S. Supreme Court ruled that the amendment protected religious belief but not all religiously inspired practices. In 1890, the LDS Church officially rejected polygamy, which allowed Utah to be recognized as a U.S. state. However, a substantial minority of Mormons interpreted this rejection as a betrayal of one of their core values, and some fundamentalists continue to practice plural marriage today. In a small number of communities in the borderlands where Nevada, Arizona, and Utah meet, this practice has led to widespread sexual abuse of young (often underage) women, who have been passed from one Mormon elder to another. In 2006, Mormon fundamentalist leader Warren Jeffs was placed on the Federal Bureau of Investigation’s Ten Most Wanted Fugitives list, and he is currently serving a long jail sentence for multiple sexual offenses against children.
The Laffertys came from a strict Mormon family ruled by an authoritarian and violent father. In adulthood they drifted toward fundamentalism, and they mixed with various fundamentalist cults in the months preceding their murder of Brenda and Erica, including a group who called themselves The Prophets, who believed that they could teach people how to receive messages directly from Jesus Christ. It was during this period that Ron’s wife divorced him after he demanded a plural marriage. He was also excommunicated by the mainstream LDS Church. Ron and Dan blamed these events on their brother’s wife, Brenda, who was vocally opposed to plural marriage. It was in these circumstances that Ron believed he had been told by Jesus to “remove” Brenda and Erica. Hence, the Lafferty brothers’ behavior and beliefs can be understood once the cultural context is known. To appreciate the implications of this observation for delusions, we need to unpack what it means when we say that a belief can be understood in its cultural context.

Belief Propagation

An important feature of beliefs is that they are transmittable. Indeed, this is one of the main evolutionary advantages conferred by language: propositions expressed in words can be passed from one person to another, either through direct speech or via intermediate media (e.g., television programs, articles, or chapters such as this one), allowing knowledge to be shared and accumulated across time. A culture emerges when a large number of people, typically but not always located in the same geographic area, share a set of representations that can be normative (e.g., “With fish, drink white wine”), complex (e.g., common law or Einstein’s theory of relativity), nonverbal (e.g., national flags), or multimedia (e.g., the saying of Mass). Of course, these representations include beliefs, and therefore, to a large extent, to be interested in culture is to be interested in the epidemiology of beliefs (Sperber 1996).
The processes involved in belief dissemination are complex, and there is space to make only a few brief observations about them here. A popular metaphor famously proposed by Richard Dawkins (1976) compares beliefs to genes: ideas are “memes,” subject to selection processes that determine whether they survive transmission from one person to another. It is ironic, therefore, that this model, despite receiving considerable attention for a while (Blackmore 1999) and even provoking the creation of a new journal (the Journal of Memetics), has not survived the process of academic natural selection (the journal closed in 2005 after just 8 years). Among the theory’s limitations is that it failed to generate testable hypotheses about the conditions under which selection would occur or about the cognitive capacities required for someone to acquire a belief and pass it on. It also fell short in explaining both the variation in human ideas and the creativity with which these ideas are expressed (Atran 2001). On the last point, it is important to recognize that human communication does not involve the exact replication of beliefs; instead, the outcome is usually some degree of resemblance between the beliefs of the speaker and those of the listener (Sperber 1996).
An older and in many ways more fruitful metaphor (although still a metaphor) that will resonate with readers at the present time involves comparing belief propagation to disease transmission. This idea goes back to the work of the Nobel Prize–winning and personally troubled discoverer of the mechanism responsible for the transmission of malaria, Ronald Ross (1857–1932), who developed mathematical models that he thought explained both the transmission of infections and the transmission of ideas (Kucharski 2020). The main virtue of this metaphor is that it allows us to break down belief propagation into several distinct stages in the transmission process: the initial creation of beliefs, the vector (medium of transmission), belief characteristics that make the beliefs a good fit with the receiving person, and, finally, whether the receiving person has an adequate immune response to them. I briefly discuss each of these stages before pointing to the implications of this model for the understanding of delusions.
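Before turning to these stages, the epidemic framing itself can be conveyed with a minimal sketch. The model below is a generic susceptible-infected-recovered scheme with arbitrary parameters, not Ross’s own mathematics or Kucharski’s models: people who have never encountered an idea are “susceptible,” those currently spreading it are “believing,” and those who have stopped passing it on are “recovered.”

```python
# Minimal, illustrative sketch of the epidemic metaphor for belief propagation
# (arbitrary parameters; not Ross's or Kucharski's actual models).
def spread(days=200, population=1_000, transmission=0.25, recovery=0.1):
    """Simple discrete-time SIR-style model starting from a single believer."""
    susceptible, believing, recovered = population - 1, 1, 0
    for _ in range(days):
        new_believers = transmission * believing * susceptible / population
        newly_recovered = recovery * believing
        susceptible -= new_believers
        believing += new_believers - newly_recovered
        recovered += newly_recovered
    return round(susceptible), round(believing), round(recovered)

# With these (arbitrary) numbers, roughly 90% of the population is eventually exposed
print(spread())

# A belief that is never passed on does not spread at all, which is the sense
# in which, as argued later in this chapter, delusions fail to propagate
print(spread(transmission=0.0))
```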
First, let us consider the origins of beliefs. These origins are often lost in time and hard to pinpoint because, of course, many beliefs are elaborated, are combined with others, and morph as they pass from the mind of one human being to another. However, in the case of master interpretive systems, it is sometimes possible to identify “patient zero”—Jesus, the Prophet Muhammad, Karl Marx, and Adolf Hitler come to mind, although these individuals were all shaped by their own cultural backgrounds and none of them acted alone. As the philosopher Quassim Cassam (2019) pointed out, conspiracy theories are often created by conspiracy entrepreneurs who actively proselytize them, such as right-wing shock jocks in the United States (e.g., Alex Jones) or the anonymous “Q” behind the QAnon conspiracy theory. Conspiracy theories, unlike paranoia, therefore very often have a specific and easily identified political goal. It is not hard to imagine who benefits from the idea that the 2012 Sandy Hook Elementary School shooting was faked or from the idea that the 2020 U.S. presidential election was rigged.
Vectors are clearly important in the transmission of beliefs, and people who can master the most efficient vectors of their time are likely to be rewarded by seeing their beliefs proliferate far and wide. One part of Martin Luther’s genius, after he nailed his 95 Theses onto the door of the Castle Church in Wittenberg in 1517, was that he was able to exploit the latest transmission technology—the printing press—so that his ideas could be rapidly conveyed in pamphlets across Europe, eventually triggering the Protestant Reformation (Pettegree 2015). The Nazis were particularly adept at using radio propaganda in prewar Germany and continued to use it to consolidate their support and spread their anti-Semitic ideas once they had gained power (Adena et al. 2015). Today, of course, for better or worse, we have the internet. After Twitter conducted a purge of millions of suspected bot (automated propaganda) accounts in 2018, a survey of elected representatives in European national parliaments found that far-right politicians experienced the greatest loss of Twitter followers (Silva and Proksch 2021); it was therefore fairly obvious which end of the political spectrum was making the most use of social media bots, perhaps the most efficient vector of all time.
An efficient vector can sometimes facilitate the accidental transmission of an idea, leading to what looks like an episode of mass delusion. For example, in April 1954, in Seattle, Washington, there was considerable public concern about small pits that citizens had begun to notice in the windshields of their cars. Because this occurred shortly after a series of U.S. nuclear tests in the Pacific, residents worried that these pits could have been caused by nuclear fallout, a hypothesis that was widely discussed in local newspapers, causing considerable alarm. Eventually, concern at the state and national levels (the U.S. president, Dwight D. Eisenhower, was consulted) led to a scientific inquiry that revealed there was nothing unusual about the small windshield blemishes, which usually went unnoticed. A subsequent survey revealed that most Seattle residents had first heard of the blemishes through local newspaper reports (Medalia and Larsen 1958). Presumably, once informed of the mysterious and possibly fallout-related windshield pits, the good people of Seattle inspected their cars and discovered blemishes that had always been there.
The question of fit between belief systems and the receiving person mainly concerns the extent to which the beliefs address the receiver’s existential concerns. It will not have escaped most readers’ notice that master interpretive systems, and the fundamental beliefs and nonpropositional processes that underlie them, all carry significance for our ability to cope with the great challenges of life, such as how to live a life that is rewarding, free of danger, and meaningful and how to maintain our status in a society of our peers while protecting ourselves and those we love from threats from outside our group. That this is so should not be a surprise; our passage through life is inherently subject to risks and hazards, and, unlike animals, we are blessed (if indeed it is a blessing) with the ability to describe and contemplate these challenges, including the fact that our lifetimes are finite (Becker 1973). Hence, existential concerns have been either explicitly or implicitly highlighted in theoretical accounts of religious beliefs (Willer 2009), political ideologies (Solomon et al. 2015), and conspiracy theories (Douglas et al. 2017) and, as we saw earlier in the subsection “Delusions of Psychiatric Patients,” seem to explain the most common delusional themes observed in psychiatric patients.
The rapid proliferation of master interpretive systems will therefore depend on the extent to which contextual factors activate these concerns; as the well-known aphorism goes, “There are no atheists in foxholes” (Jong et al. 2012). Nongqawuse’s 1856 prophecy that killing their cattle would help the Xhosa defeat the British in the Eastern Cape was no doubt made especially salient by the fact that the nation had already experienced a series of defeats by the white settlers and was also under threat from a lungsickness epidemic that was crippling their herds (Peires 1989). Studies have shown that anxiety-provoking situations (Grzesiak-Feldman 2013) or the experience that life is uncontrollable (van Prooijen and Acker 2015) leads to a greater willingness to believe in conspiracy theories, so it is perhaps not surprising that the stab-in-the-back myth was facilitated by the sense of humiliation felt by the German people following their defeat by the Allied powers in the First World War and the economic difficulties that ensued (Evans 2021). More recently, it has been shown that people with authoritarian tendencies feel moved to vote for populist leaders, such as Trump in the United States and Marine Le Pen in France, or for populist policies, such as Brexit, only if they feel that their values are threatened (Stenner and Haidt 2018).
One type of existential threat that seems particularly powerful and yet difficult to avoid is awareness of our mortality. It is perhaps unsurprising that death anxiety is associated with religiosity, although the relationship may be complex and nonlinear (Jong et al. 2018), possibly accounted for by religion having a soothing effect in strong believers (Jong et al. 2013). Death anxiety has also been reported to be associated with paranormal and conservative beliefs (Tobacyk 2007; Wong 2012) and, interestingly, the severity of mental illness in psychiatric patients (Menzies et al. 2019). A considerable volume of work by social psychologists under the rubric of terror management theory has explored the effects of deliberately manipulating thoughts about death in laboratory experiments (Solomon et al. 2015); the general finding is that asking people to think about what it will be like to die leads not only to increased support for preexisting political ideologies but also to a shift toward the right end of the political spectrum (Burke et al. 2013) (although readers should be aware of some concerns about the replicability of these findings; Klein et al. 2022).
Of course, belief entrepreneurs often know how to exploit existential anxieties. Authoritarians, in particular, know that highlighting threats to normative values (e.g., the possibility of being “swamped” by migrants from other cultures) helps them to win votes. It is therefore no accident that populists at different times in history (e.g., Hitler in Germany, Slobodan Milošević in Serbia, and Trump in the United States) have often made the same promises to voters: to make their country great again, to regain control (often by revoking international treaties), and to keep out the culturally and ethnically different (Ben-Ghiat 2020). Depressingly, this seems to be a very effective formula.
Finally, someone who is exposed to a belief does not necessarily have to adopt it. The term slow thinking (sometimes called analytic reasoning) was introduced by Daniel Kahneman (2012) to describe the kind of thoughtful deliberation that allows people to decide whether something they have been told is reasonable. This style of thinking is easy to measure using simple puzzles (Frederick 2005), and individuals vary considerably in their propensity to indulge in this kind of reasoning. Some people, it seems, will believe a message as long as it is consistent with their existing worldview, whereas others think very carefully before accepting it.
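The best known of these puzzles, taken from Frederick’s (2005) Cognitive Reflection Test, is the bat-and-ball problem: a bat and a ball cost $1.10 in total, the bat costs $1.00 more than the ball, and the respondent is asked what the ball costs. The fast, intuitive answer of 10 cents violates the stated total, as the small check below confirms; only slower, analytic reasoning yields the correct answer of 5 cents.

```python
# Checking the intuitive and analytic answers to the bat-and-ball problem
# (values in cents to keep the arithmetic exact).
BAT_PREMIUM = 100   # the bat costs $1.00 more than the ball
TOTAL = 110         # together they cost $1.10

intuitive_ball = 10
print(intuitive_ball + (intuitive_ball + BAT_PREMIUM) == TOTAL)   # False: sums to 120

correct_ball = 5    # from ball + (ball + 100) = 110, so 2 * ball = 10
print(correct_ball + (correct_ball + BAT_PREMIUM) == TOTAL)       # True
```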
It is intuitively obvious that good analytic reasoning should immunize people against bizarre theories. Most conspiracy theories do not survive the common-sense test. For example, 46,000 people worked for NASA during the Moon landings, so if those landings were faked, all of those people would have to have been telling the same lie for a very long time; arguably, it would have been easier to go to the Moon (Aaronovitch 2009). A large number of studies have tested this idea by measuring slow thinking in relation to a wide range of belief systems. Poor analytic reasoning is associated with religiosity and belief in the paranormal (Pennycook et al. 2012, 2016), conspiracy theories (Swami et al. 2014), and an inability to spot fake news (Bronstein et al. 2019) or “pseudo-profound bullshit” (Pennycook et al. 2015). It has also recently been suggested that impaired analytic reasoning might be important in psychosis and delusions (Ward and Garety 2019), but the empirical studies carried out to date have produced mixed results. For example, Freeman et al. (2012, 2014) found that self-reported “intuitive thinking” was associated with paranoia in nonclinical samples but not in psychiatric patients. More recently, in a study that controlled for the covariation between paranoia and conspiracy theories in a large population sample, Alsuhibani et al. (2022) found that poor analytic reasoning was much more closely associated with the latter. This finding might seem surprising given the widespread assumption that the abnormal beliefs of psychiatric patients must reflect an impairment in thinking. However, perhaps it will seem less so when I explain the most important implication of this account of belief propagation for understanding delusions. This implication can be stated very simply: Delusions do not propagate.

Conclusion

In this chapter I have examined various attempts to distinguish between pathological and normal beliefs, noting that many proposals that have been made by psychopathologists in the past have failed to draw a clear line between the bizarre beliefs observed in the psychiatric wards and the madness of crowds. A common failure in all of these endeavors has been to underestimate the complexity of normal beliefs. For example, phenomenological researchers have assumed that the subtle characteristics they have identified in association with delusional thinking are absent in normal belief systems, but this is not the case. Many psychological researchers, by contrast, have simply focused on the content of beliefs without considering the dynamic ways that they are formed and linked to one another. An important implication of the wide range of research that I have considered here is that there is much that is inherently social about believing. Beliefs usually do not happen in isolation but are formed in the course of our relationships with other people.
This has led me to highlight one way in which delusions seem to differ from other types of belief systems: they do not propagate. The standard psychiatric way of describing this characteristic is to say that they are idiosyncratic, but what makes them idiosyncratic is that they are generally not formed in discussion with other people and do not get passed from one person to another. Deluded patients on psychiatric wards do not acquire their beliefs from conversations with other people, and they do not form societies to finesse their delusional doctrines or develop schemes to ensure that other people see the world as they do. There are no paranoia entrepreneurs. Why this is the case is, frankly, not known, but it is not difficult to speculate about possible explanations. Broadly, the factors that might be responsible fall into two types: features of the social world in which the deluded patient lives and abnormalities in the psychological mechanisms that sustain negotiation and belief sharing with other people.
With regard to the social world of the patient, one factor known to affect individuals’ judgments about the certainty of their beliefs is consensus—among most ordinary people, a belief is judged to be more likely to be true if it is shared by others (Clarkson et al. 2013). Social isolation might therefore make it difficult for individuals to estimate the degree of consensus around their beliefs and, conceivably, lead to the acceptance of bizarre beliefs that might otherwise, before they became too rigidly developed, be rejected after discussion with others. There is considerable evidence that people with psychosis or with psychotic traits tend to be lonely and to have impoverished social networks, with some evidence that social isolation often precedes the onset of psychosis (Gayer-Anderson and Morgan 2013). Consistent with the idea that lack of social contact may facilitate the onset of delusions, epidemiological studies have found that people who are more isolated (e.g., because they lack friendships or people to communicate with) are much more likely to report positive symptoms, especially paranoid beliefs (Butter et al. 2017), and that lack of identification (the sense of belonging to groups) is also a risk factor for these kinds of beliefs (McIntyre et al. 2018). However, against the social isolation hypothesis, once patients are ill, the severity of positive symptoms does not seem to be related to social network characteristics (Degnan et al. 2018). Moreover, it is difficult to imagine that most deluded patients are unaware that their beliefs are not shared by others. A further complication when understanding the relationship between isolation and delusions is that any apparent association might be the consequence of reverse causation—people who express beliefs that seem crazy might find that their friends start to shun them.
The other possibility is that delusions are facilitated and maintained by impairments in the psychological processes that are responsible for belief sharing. Anthropologists (Boyer et al. 2015) and evolutionary psychologists (Sutcliffe et al. 2012) have argued that human beings have evolved complex psychological mechanisms that enable us to establish coalitions with others. Bell et al. (2021) recently argued that abnormalities in these kinds of mechanisms might explain why delusions are typically social in content. As they pointed out, the majority of psychological research directed toward understanding delusions has assumed that they arise from a failure of reason. Researchers have therefore tried to identify cognitive and emotional abnormalities that could explain this failure. An important implication of the evidence reviewed here is that when these mechanisms are examined, delusions do not seem to be very different from other kinds of organized belief systems. Perhaps instead of assuming that delusions are a consequence of cognitive failures, we should consider whether they arise from some kind of disruption of the processes involved in sustaining social relations.

Questions for Discussion

1.
Are all religious beliefs, some religious beliefs, or no religious beliefs delusional?
2.
In everyday life, how do we know how certain we are about something we believe? Is certainty a feeling?
3.
Can you devise a set of criteria that would allow you to reliably distinguish between paranoid delusions and conspiracy theories?
4.
If you wanted to change someone’s political beliefs, how would you go about it? Would your approach be different from that taken when conducting therapy with a patient who has paranoid delusions?
5.
Why are delusional beliefs not passed from one person to another? Can you conceive of a situation in which they might be?

Key Points

It has proven difficult to discover criteria that distinguish between the delusions of psychiatric patients and nonpathological but bizarre beliefs (the madness of crowds). Although belief is a central and indispensable concept in the social sciences, there is no coherent theory of human beliefs that could assist us in developing the required criteria.
Human beliefs consist of language-based propositions that are sometimes organized into complex systems (e.g., religious and political ideologies, conspiracy theories), in which case they tend to be rigid and resistant to contradiction.
Beliefs are transmittable between individuals, and the process by which transmission occurs can be broken down into components analogous to those involved in the transmission of a virus. One way in which delusions are unique is that they are not shared with others.

References

Aaronovitch D: Voodoo Histories: The Role of the Conspiracy Theory in Shaping Modern History. New York, Penguin Group, 2009
Aarøe L, Petersen MB, Arceneaux K: The behavioral immune system shapes political intuitions: why and how individual differences in disgust sensitivity underlie opposition to immigration. Am Polit Sci Rev 111:277–294, 2017
Abraham K: Notes on the psychoanalytic investigation and treatment of manic depressive insanity (1911), in Selected Papers of Karl Abraham, M.D. Translated by Bryan D, Strachey A. Edited by Jones E. London, Hogarth, 1927, pp 137–156
Adena M, Enikolopov R, Petrova M, et al: Radio and the rise of the Nazis in prewar Germany. Q J Econ 130(4):1885–1939, 2015
Adorno TW, Frenkel-Brunswik E, Levinson DJ, Sanford RN: The Authoritarian Personality. New York, Harper Row, 1950
Allnutt S, Samuels A, O’Driscoll C: The insanity defence: from wild beasts to M’Naghten. Australas Psychiatry 15(4):292–298, 2007 17612881
Alsuhibani A, Shevlin M, Freeman D, et al: Why conspiracy theorists are not always paranoid: conspiracy theories and paranoia form separate factors with distinct psychological predictors. PLoS One 17(4):e0259053, 2022 35389988
Alter AL, Oppenheimer DM: Uniting the tribes of fluency to form a metacognitive nation. Pers Soc Psychol Rev 13(3):219–235, 2009 19638628
American Psychiatric Association: Diagnostic and Statistical Manual of Mental Disorders, 4th Edition. Washington, DC, American Psychiatric Association, 1994
American Psychiatric Association: Diagnostic and Statistical Manual of Mental Disorders, 5th Edition, Text Revision. Washington, DC, American Psychiatric Association, 2022
Atran S: The trouble with memes: inference versus imitation in cultural creation. Hum Nat 12(4):351–381, 2001 26192412
Ayeni OB, Ayenibiowo KO, Ayeni EA: Religiosity as correlates of some selected psychological disorders among psychiatric outpatients in Lagos State. IFE PsychologIA 19:114–128, 2011
Azhar MZ, Varma SL, Hakim HR: Phenomenological differences of delusions between schizophrenic patients of two cultures of Malaysia. Singapore Med J 36(3):273–275, 1995 8553090
Balzan RP: Overconfidence in psychosis: the foundation of delusional conviction? Cogent Psychol 3(1):1135855, 2016
Barden J, Petty RE: The mere perception of elaboration creates attitude certainty: exploring the thoughtfulness heuristic. J Pers Soc Psychol 95(3):489–509, 2008 18729690
Bebbington P, Nayani T: The Psychosis Screening Questionnaire. Int J Methods Psychiatr Res 5:11–19, 1995
Beck AT: Cognitive models of depression. J Cogn Psychother 1:5–37, 1987
Becker E: The Denial of Death. New York, Free Press, 1973
Bell V, Halligan PW, Ellis HD: Diagnosing delusions: a review of inter-rater reliability. Schizophr Res 86(1–3):76–79, 2006 16857345
Bell V, Raihani N, Wilkinson S: Derationalizing delusions. Clin Psychol Sci 9(1):24–37, 2021 33552704
Bénabou R, Tirole J: Belief in a just world and redistributive politics. Q J Econ 121:669–746, 2006
Ben-Ghiat R: Strongmen: How They Rise, Why They Succeed, How They Fall. London, Profile Books, 2020
Bentall RP: The illusion of reality: a review and integration of psychological research on hallucinations. Psychol Bull 107(1):82–95, 1990 2404293
Bentall RP: Madness Explained: Psychosis and Human Nature. New York, Penguin, 2003
Bentall RP: Delusions and other beliefs, in Delusions in Context. Edited by Bortolotti L. London, Palgrave Macmillan, 2018, pp 67–96
Bentall RP, Corcoran R, Howard R, et al: Persecutory delusions: a review and theoretical integration. Clin Psychol Rev 21(8):1143–1192, 2001 11702511
Berrios G: Delusions as “wrong beliefs”: a conceptual history. Br J Psychiatry Suppl(14):6–13, 1991 1840782
Blackmore S: The Meme Machine. New York, Oxford University Press, 1999
Bortolotti L, Broome M: A role for ownership and authorship in the analysis of thought insertion. Phenomenol Cogn Sci 8:205–224, 2008
Bovet P, Parnas J: Schizophrenic delusions: a phenomenological approach. Schizophr Bull 19(3):579–597, 1993 8235460
Boyd T, Gumley A: An experiential perspective on persecutory paranoia: a grounded theory construction. Psychol Psychother 80(Pt 1):1–22, 2007
Boyer P, Firat R, van Leeuwen F: Safety, threat, and stress in intergroup relations: a coalitional index model. Perspect Psychol Sci 10(4):434–450, 2015 26177946
Brewerton TD: Hyperreligiosity in psychotic disorders. J Nerv Ment Dis 182(5):302–304, 1994 10678313
Bronstein MV, Pennycook G, Bear A, et al: Belief in fake news is associated with delusionality, dogmatism, religious fundamentalism, and reduced analytic thinking. J Appl Res Mem Cogn 8:108–117, 2019
Broome MR, Harland R, Owen GS, Stringaris A (eds): The Maudsley Reader in Phenomenological Psychiatry. Cambridge, UK, Cambridge University Press, 2012
Brotherton R: Suspicious Minds: Why We Believe Conspiracy Theories. London, Bloomsbury, 2015
Brotherton R, French CC, Pickering AD: Measuring belief in conspiracy theories: the Generic Conspiracist Beliefs Scale. Front Psychol 4:279, 2013 23734136
Bruder M, Haffke P, Neave N, et al: Measuring individual differences in generic beliefs in conspiracy theories across cultures: conspiracy mentality questionnaire. Front Psychol 4:225, 2013 23641227
Burke BL, Kosloff S, Landau MJ: Death goes to the polls: a meta-analysis of mortality salience effects on political attitudes. Polit Psychol 34(2):183–200, 2013
Butter S, Murphy J, Shevlin M, Houston J: Social isolation and psychosis-like experiences: a UK general population analysis. Psychosis 9:291–300, 2017
Cassam Q: Conspiracy Theories. Cambridge, UK, Polity Press, 2019
Cermolacce M, Sass L, Parnas J: What is bizarre in bizarre delusions? A critical review. Schizophr Bull 36(4):667–679, 2010 20142381
Chadwick PK: Delusional thinking from the inside: paranoia and personal growth, in Persecutory Delusions: Assessment, Theory and Treatment. Edited by Freeman D, Bentall RP, Garety P. Oxford, UK, Oxford University Press, 2008, pp 3–19
Cichocka A, Marchlewska M, de Zavala AG: Does self-love or self-hate predict conspiracy beliefs? Narcissism, self-esteem, and endorsement of conspiracy theories. Soc Psychol Personal Sci 7:157–166, 2016
Clarkson JJ, Tormala ZL, Rucker DD, Dugan RG: The malleable influence of social consensus on attitude certainty. J Exp Soc Psychol 49(6):1019–1022, 2013
Clifton JDW, Baker JD, Park CL, et al: Primal world beliefs. Psychol Assess 31(1):82–99, 2019 30299119
Colbert SM, Peters ER, Garety PA: Delusions and belief flexibility in psychosis. Psychol Psychother 83(Pt 1):45–57, 2010 19712542
Collin S, Rowse G, Martinez A, Bentall RP: The prevalence of delusional themes in clinical groups: a systematic review and meta-analyses of the global literature. Submitted to Clin Psychol Rev, 2022
Colunga E, Smith LB: From the lexicon to expectations about kinds: a role for associative learning. Psychol Rev 112(2):347–382, 2005 15783290
Connors MH, Halligan PW: Phenomenology, delusions, and belief. Lancet Psychiatry 8(4):272–273, 2021 33743872
Conrad K: Beginning schizophrenia: attempt for a Gestalt-analysis of delusion (1958), in The Maudsley Reader in Phenomenological Psychiatry. Edited by Broome MR, Harland R, Owen GS, Stringaris A. Cambridge, UK, Cambridge University Press, 2012, pp 176–193
Converse P: The nature of belief systems in mass publics (1964). Critical Review 18:1–74, 2006
Corlett P: Delusions and prediction error, in Delusions in Context. Edited by Bortolotti L. London, Palgrave Macmillan, 2018, pp 35–66
Darwin H, Neave N, Holmes J: Belief in conspiracy theories: the role of paranormal belief, paranoid ideation and schizotypy. Pers Individ Dif 50:1289–1293, 2011
Dawkins R: The Selfish Gene. New York, Oxford University Press, 1976
Dayan P, Abbott LF: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Cambridge, MA, MIT Press, 2005
Dean CE, Akhtar S, Gale TM, et al: Development of the Paranormal and Supernatural Beliefs Scale using classical and modern test theory. BMC Psychol 9(1):98, 2021 34162430
Degnan A, Berry K, Sweet D, et al: Social networks and symptomatic and functional outcomes in schizophrenia: a systematic review and meta-analysis. Soc Psychiatry Psychiatr Epidemiol 53(9):873–888, 2018 29951929
Dennett DC: Consciousness Explained. London, Allen Lane, 1991
Douglas KM, Sutton RM, Cichocka A: The psychology of conspiracy theories. Curr Dir Psychol Sci 26(6):538–542, 2017 29276345
Dunning D, Leuenberger A, Sherman DA: A new look at motivated inference: are self-serving theories of success a product of motivational forces? J Pers Soc Psychol 69:58–68, 1995
Edwards D, Potter JP: Discursive Psychology. Thousand Oaks, CA, Sage, 1992
Ehrlinger J, Johnson K, Banner M, et al: Why the unskilled are unaware: further explorations of (absent) self-insight among the incompetent. Organ Behav Hum Decis Process 105(1):98–121, 2008 19568317
Elahi A, Perez Algorta G, Varese F, et al: Do paranoid delusions exist on a continuum with subclinical paranoia? A multi-method taxometric study. Schizophr Res 190:77–81, 2017 28318838
Ellett L, Schlier B, Kingston JL, et al: Pandemic paranoia in the general population: international prevalence and sociodemographic profile. Psychol Med Sep 6:1–8, 2022 36065655 Epub ahead of print
El Sendiony MF: Cultural aspects of delusions: a psychiatric study of Egypt. Aust N Z J Psychiatry 10(2):201–207, 1976 1067839
Enoch MD, Trethowan WH: Uncommon Psychiatric Syndromes, 2nd Edition. Bristol, UK, John Wright, 1979
Evans RJ: The Hitler Conspiracies: The Third Reich and the Paranoid Imagination. New York, Penguin, 2021
Feldman S, Johnston C: Understanding the determinants of political ideology: implications of structural complexity. Polit Psychol 35(4):337–358, 2014
Fernyhough C: The Voices Within. London, Profile Books, 2016
Feyaerts J, Henricksen MG, Vanheule S, et al: Delusions beyond beliefs: a critical overview of diagnostic, aetiological, and therapeutic schizophrenia research from a clinical-phenomenological perspective. Lancet Psychiatry 8(3):237–249, 2021
Frederick S: Cognitive reflection and decision making. J Econ Perspect 19:25–42, 2005
Freeman D: Persecutory delusions: a cognitive perspective on understanding and treatment. Lancet Psychiatry 3(7):685–692, 2016 27371990
Freeman D, Garety PA: Comments on the content of persecutory delusions: does the definition need clarification? Br J Clin Psychol 39(4):407–414, 2000 11107494
Freeman D, Garety PA, Bebbington PE, et al: Psychological investigation of the structure of paranoia in a non-clinical population. Br J Psychiatry 186:427–435, 2005 15863749
Freeman D, Evans N, Lister R: Gut feelings, deliberative thought, and paranoid ideation: a study of experiential and rational reasoning. Psychiatry Res 197(1–2):119–122, 2012 22406393
Freeman D, Lister R, Evans N: The use of intuitive and analytic reasoning styles by patients with persecutory delusions. J Behav Ther Exp Psychiatry 45(4):454–458, 2014 25000504
Frith C: Explaining delusions of control: the comparator model 20 years on. Conscious Cogn 21(1):52–54, 2012 21802318
Gallagher S: Relationship between agency and ownership in the case of schizophrenic thought insertion and delusions of control. Rev Philos Psychol 6:865–879, 2015
Galliford N, Furnham A: Individual difference factors and beliefs in medical and political conspiracy theories. Scand J Psychol 58(5):422–428, 2017 28782805
Gayer-Anderson C, Morgan C: Social networks, support and early psychosis: a systematic review. Epidemiol Psychiatr Sci 22(2):131–146, 2013 22831843
Georgaca E: Reality and discourse: a critical analysis of the category of “delusions.” Br J Med Psychol 73(Pt 2):227–242, 2000 10874481
Gipps RGT, Rhodes JE: Delusions and the non-epistemic foundations of belief. Philos Psychiatr Psychol 18(1):89–97, 2011
Green CEL, Freeman D, Kuipers E, et al: Measuring ideas of persecution and social reference: the Green et al. Paranoid Thought Scales (GPTS). Psychol Med 38(1):101–111, 2008 17903336
Gregg AP, Mahadevan N, Sedikides C: The SPOT effect: people spontaneously prefer their own theories. Q J Exp Psychol (Hove) 70(6):996–1010, 2017 26836058
Grzesiak-Feldman M: The effect of high-anxiety situations on conspiracy thinking. Curr Psychol 32:100–118, 2013
Haidt J: The Righteous Mind: Why Good People Are Divided by Politics and Religion. New York, Penguin, 2013
Haidt J, Rozin P, McCauley C, Imada S: Body, psyche, and culture: the relationship of disgust to morality. Psychol Dev Soc J 9:107–131, 1997
Hardy A: The Spiritual Nature of Man: A Study of Contemporary Religious Experience. New York, Oxford University Press, 1979
Haselton MG, Nettle D: The paranoid optimist: an integrative evolutionary model of cognitive biases. Pers Soc Psychol Rev 10(1):47–66, 2006 16430328
Heinämaa S, Taipale J: Normality, in The Oxford Handbook of Phenomenological Psychopathology. Edited by Stanghellini G, Broome MR, Fernandez AV, et al. New York, Oxford University Press, 2018, pp 284–297
Hibbing JR, Smith KB, Alford JR: Differences in negativity bias underlie variations in political ideology. Behav Brain Sci 37(3):297–307, 2014 24970428
Hinson EG: Confessions of creeds in early Christian tradition. Review and Expositor 76:5–16, 1979
Hoenig J: Kurt Schneider and anglophone psychiatry. Compr Psychiatry 23(5):391–400, 1982 6754244
Hofstadter R: The paranoid style in American politics. Harper’s Magazine, November 1964, 77–86
Holland T: In the Shadow of the Sword: The Battle for Global Empire and the End of the Ancient World. Boston, MA, Little, Brown, 2012
Hornsey MJ, Fielding KS: Attitude roots and jiu jitsu persuasion: understanding and overcoming the motivated rejection of science. Am Psychol 72(5):459–473, 2017 28726454
Huddy L, Sears DO, Levy JS (eds): The Oxford Handbook of Political Psychology, 2nd Edition. New York, Oxford University Press, 2013
Hylwa SA, Bury JE, Davis MDP, et al: Delusional infestation, including delusions of parasitosis: results of histologic examination of skin biopsy and patient-provided skin specimens. Arch Dermatol 147(9):1041–1045, 2011 21576554
Imhoff R, Lamberty PK: Too special to be duped: need for uniqueness motivates conspiracy beliefs. Eur J Soc Psychol 47:724–734, 2017
Imhoff R, Lamberty P: How paranoid are conspiracy believers? Toward a more fine-grained understanding of the connect and disconnect between paranoia and belief in conspiracy theories. Eur J Soc Psychol 48(7):909–926, 2018
Isham L, Griffith L, Boylan AM, et al: Understanding, treating, and renaming grandiose delusions: a qualitative study. Psychol Psychother 94(1):119–140, 2019 31785077
Jackson M, Fulford KWM: Spiritual experience and psychopathology. Philos Psychiatr Psychol 4(1):41–65, 1997
James W: Principles of Psychology, Vol 1. New York, Holt, 1893
Jaspers K: General Psychopathology (1913). Translated by Hoenig J, Hamilton MW. Manchester, UK, Manchester University Press, 1963
Jong J, Halberstadt J, Bluemke M: Foxhole atheism, revisited: the effects of mortality salience on explicit and implicit religious belief. J Exp Soc Psychol 48:983–989, 2012
Jong J, Bluemke M, Halberstadt J: Fear of death and supernatural beliefs: developing a new Supernatural Belief Scale to test the relationship. Eur J Pers 27:495–506, 2013
Jong J, Ross R, Philip T, et al: The religious correlates of death anxiety: a systematic review and meta-analysis. Religion Brain Behav 8:4–20, 2018
Jose PE, Brewer WF: Development of story liking: character identification, suspense, and outcome resolution. Dev Psychol 20:911–924, 1984
Jost JT, Federico CM, Napier JL: Political ideology: its structure, functions, and elective affinities. Annu Rev Psychol 60:307–337, 2009 19035826
Kahneman D: Thinking, Fast and Slow. New York, Penguin, 2012
Kendler KS, Glazer WM, Morgenstern H: Dimensions of delusional experience. Am J Psychiatry 140(4):466–469, 1983 6837787
Klein R, Cook C, Ebersole C, et al: Many Labs 4: failure to replicate mortality salience effect with and without original author involvement. Collabra: Psychology 8(1):35271, 2022
Klosterkötter J, Hellmich M, Steinmeyer EM, Schultze-Lutter F: Diagnosing schizophrenia in the initial prodromal phase. Arch Gen Psychiatry 58(2):158–164, 2001 11177117
Knowles R, McCarthy-Jones S, Rowse G: Grandiose delusions: a review and theoretical integration of cognitive and affective perspectives. Clin Psychol Rev 31(4):684–696, 2011 21482326
Koriat A: The feeling of knowing: some metatheoretical implications for consciousness and control. Conscious Cogn 9(2 Pt 1):149–171, 2000 10924234
Koriat A: The self-consistency model of subjective confidence. Psychol Rev 119(1):80–113, 2012 22022833
Krakauer J: Under the Banner of Heaven: A Story of Violent Faith. New York, Doubleday, 2003
Kucharski A: The Rules of Contagion: Why Things Spread—And Why They Stop. London, Profile Books, 2020
Leff JP, Fischer M, Bertelsen A: A cross-national epidemiological study of mania. Br J Psychiatry 129:428–442, 1976 990656
Lerner MJ: Belief in a Just World: A Fundamental Delusion. New York, Springer, 1980
Lewis M: The Big Short: Inside the Doomsday Machine. New York, WW Norton, 2010
Liddell HS: The experimental neurosis. Annu Rev Physiol 9:569–580, 1947 20288843
Lincoln TM: Relevant dimensions of delusions: continuing the continuum versus category debate. Schizophr Res 93(1–3):211–220, 2007 17398072
Lindeman M, Svedholm-Häkkinen AM: Does poor understanding of physical world predict religious and paranormal beliefs? Appl Cogn Psychol 30:736–742, 2016
Lodge M, Taber CS: The Rationalizing Voter. Cambridge, UK, Cambridge University Press, 2013
Mackay C: Memoirs of Extraordinary Popular Delusions and the Madness of Crowds: A Study in Crowd Psychology, Vols 1–3. London, Richard Bentley, 1841
Malahy LW, Rubinlicht MA, Kaiser CR: Justifying inequality: a cross-temporal investigation of U.S. income disparities and just-world beliefs from 1973 to 2006. Soc Justice Res 22:369–383, 2009
Mancosu M, Vassallo S, Vezzoni C: Believing in conspiracy theories: evidence from an exploratory analysis of Italian survey data. South Eur Soc Polit 22:327–344, 2017
Martinez A, Shevlin M, Valiente C, et al: Paranoid beliefs and conspiracy mentality are associated with different forms of mistrust: a three-nation study. Front Psychol 13:1023366, 2022 36329737
McCauley RN, Graham G: Hearing Voices and Other Matters of the Mind. New York, Oxford University Press, 2020
McIntosh CN, Fischer DG: Beck’s cognitive triad: one versus three factors. Can J Behav Sci 32(3):153–157, 2000
McIntyre JC, Wickham S, Barr B, Bentall RP: Social identity and psychosis: associations and psychological mechanisms. Schizophr Bull 44(3):681–690, 2018 28981888
McKay R: Delusional inference. Mind Lang 27(3):330–355, 2012
McKay RT, Dennett DC: The evolution of misbelief. Behav Brain Sci 32(6):493–510, discussion 510–561, 2009 20105353
Medalia NZ, Larsen O: Diffusion and belief in a collective delusion: the Seattle windshield pitting epidemic. Am Sociol Rev 23:180–186, 1958
Melle I: The Breivik case and what psychiatrists can learn from it. World Psychiatry 12(1):16–21, 2013 23471788
Menzies RE, Sharpe L, Dar-Nimrod I: The relationship between death anxiety and severity of mental illnesses. Br J Clin Psychol 58(4):452–467, 2019 31318066
Mineka S, Kihlstrom JF: Unpredictable and uncontrollable events: a new perspective on experimental neurosis. J Abnorm Psychol 87(2):256–271, 1978 565795
Mishara AL: Klaus Conrad (1905–1961): delusional mood, psychosis, and beginning schizophrenia. Schizophr Bull 36(1):9–13, 2010 19965934
Moritz S, Göritz AS, Gallinat J, et al: Subjective competence breeds overconfidence in errors in psychosis: a hubris account of paranoia. J Behav Ther Exp Psychiatry 48:118–124, 2015 25817242
Moutoussis M, Williams J, Dayan P, Bentall RP: Persecutory delusions and the conditioned avoidance paradigm: towards an integration of the psychology and biology of paranoia. Cogn Neuropsychiatry 12(6):495–510, 2007 17978936
Mullen R: The problem of bizarre delusions. J Nerv Ment Dis 191(8):546–548, 2003 12972859
Munro A: Monosymptomatic hypochondriacal psychosis manifesting as delusions of parasitosis: a description of four cases successfully treated with pimozide. Arch Dermatol 114(6):940–943, 1978 666333
Newheiser A-K, Farias M, Tausch N: The functional nature of conspiracy beliefs: examining the underpinnings of belief in the Da Vinci Code conspiracy. Pers Individ Dif 51:1007–1011, 2011
Norenzayan A, Gervais WM: The origins of religious disbelief. Trends Cogn Sci 17(1):20–25, 2013 23246230
Oltmanns TF, Maher BA (eds): Delusional Beliefs. Hoboken, NJ, Wiley, 1988
Ong WJ: Orality and Literacy: The Technologizing of the Word. London, Routledge, 1982
Parnas J: The Breivik case and “conditio psychiatrica.” World Psychiatry 12(1):22–23, 2013 23471789
Pearce JM: Animal Learning and Cognition: An Introduction. London, Psychology Press, 2008
Peires JB: The Dead Will Arise: Nongqawuse and the Great Xhosa Cattle-Killing Movement of 1856–7. Bloomington, Indiana University Press, 1989
Pennycook G, Cheyne JA, Seli P, et al: Analytic cognitive style predicts religious and paranormal belief. Cognition 123(3):335–346, 2012 22481051
Pennycook G, Cheyne JA, Barr N, et al: On the reception and detection of pseudo-profound bullshit. Judgm Decis Mak 10:549–563, 2015
Pennycook G, Ross RM, Koehler DJ, Fugelsang JA: Atheists and agnostics are more reflective than religious believers: four empirical studies and a meta-analysis. PLoS One 11(4):e0153039, 2016 27054566
Peters E, Day S, McKenna J, Orbach G: Delusional ideation in religious and psychotic populations. Br J Clin Psychol 38(1):83–96, 1999a 10212739
Peters ER, Joseph SA, Garety PA: Measurement of delusional ideation in the normal population: introducing the PDI (Peters et al. Delusions Inventory). Schizophr Bull 25(3):553–576, 1999b 10478789
Pettegree A: Brand Luther: How an Unheralded Monk Turned His Small Town Into a Center of Publishing, Made Himself the Most Famous Man in Europe—and Started the Protestant Reformation. New York, Penguin, 2015
Picardi A, Fonzi L, Pallagrosi M, et al: Delusional themes across affective and non-affective psychoses. Front Psychiatry 9:132, 2018 29674982
Pigliucci M: Nonsense on Stilts: How to Tell Science From Bunk, 2nd Edition. Chicago, IL, University of Chicago Press, 2018
Rhodes J, Gipps RGT: Delusions, certainty and the background. Philos Psychiatr Psychol 15(4):295–310, 2008
Roose K: What is QAnon, the viral pro-Trump conspiracy theory? New York Times, October 19, 2020
Ross RM, McKay R: Why is belief in God not a delusion? Relig Brain Behav 7:316–319, 2017
Rubin Z, Peplau LA: Who believes in a just world? J Soc Issues 31:65–90, 1975
Russell KJ, Hand CJ: Rape myth acceptance, victim blame attribution and just world beliefs: a rapid evidence assessment. Aggress Violent Behav 37:153–160, 2017
Sartre J-P: Anti-Semite and Jew. New York, Schocken Books, 1948
Schlenker BR, Chambers JR, Le BM: Conservatives are happier than liberals, but why? Political ideology, personality, and life satisfaction. J Res Pers 46:127–146, 2012
Schneider K: Clinical Psychopathology. New York, Grune & Stratton, 1959
Schwitzgebel E: Belief, in The Stanford Encyclopedia of Philosophy. Stanford, CA, Stanford Center for the Study of Language and Information, 2015. Available at: https://plato.stanford.edu/archives/sum2015/entries/belief. Accessed December 12, 2021.
Shagan EH: The Birth of Modern Belief: Faith and Judgment from the Middle Ages to the Enlightenment. Princeton, NJ, Princeton University Press, 2018
Siddle R, Haddock G, Tarrier N, Faragher EB: Religious delusions in patients admitted to hospital with schizophrenia. Soc Psychiatry Psychiatr Epidemiol 37(3):130–138, 2002 11990010
Silva BC, Proksch S-E: Fake it ’til you make it: a natural experiment to identify European politicians’ benefit from Twitter bots. Am Polit Sci Rev 115:316–322, 2021
Smith SM, Fabrigar LR, MacDougall BL, Wiesenthal NL: The role of amount, cognitive elaboration, and structural consistency of attitude-relevant knowledge in the formation of attitude certainty. Eur J Soc Psychol 38(2):280–295, 2007
So SH, Freeman D, Dunn G, et al: Jumping to conclusions, a lack of belief flexibility and delusional conviction in psychosis: a longitudinal investigation of the structure, frequency, and relatedness of reasoning biases. J Abnorm Psychol 121(1):129–139, 2012 21910515
Solomon S, Greenberg J, Pyszczynski T: The Worm at the Core: On the Role of Death in Life. New York, Penguin, 2015
Sperber D: Explaining Culture: A Naturalistic Approach. Oxford, UK, Blackwell, 1996
Spitzer RL, First MB, Kendler KS, Stein DJ: The reliability of three definitions of bizarre delusions. Am J Psychiatry 150(6):880–884, 1993 8494062
Startup M, Startup S: On two kinds of delusion of reference. Psychiatry Res 137(1–2):87–92, 2005 16226316
Startup M, Bucci S, Langdon R: Delusions of reference: a new theoretical model. Cogn Neuropsychiatry 14(2):110–126, 2009 19370435
Stenner K: The Authoritarian Dynamic. Cambridge, UK, Cambridge University Press, 2005
Stenner K, Haidt J: Authoritarianism is not a momentary madness, but an eternal dynamic within liberal democracies, in Can It Happen Here? Authoritarianism in America. Edited by Sunstein CR. New York, HarperCollins, 2018, pp 175–220
Stich SP: Deconstructing the Mind. New York, Oxford University Press, 1996
Sutcliffe A, Dunbar R, Binder J, Arrow H: Relationships and the social brain: integrating psychological and evolutionary perspectives. Br J Psychol 103(2):149–168, 2012 22506741
Swami V, Coles R, Stieger S, et al: Conspiracist ideation in Britain and Austria: evidence of a monological belief system and associations between individual psychological differences and real-world and fictitious conspiracy theories. Br J Psychol 102(3):443–463, 2011 21751999
Swami V, Voracek M, Stieger S, et al: Analytic thinking reduces belief in conspiracy theories. Cognition 133(3):572–585, 2014 25217762
Tedrus GMAS, Fonseca LC, Fagundes TM, da Silva GL: Religiosity aspects in patients with epilepsy. Epilepsy Behav 50:67–70, 2015 26133113
Thalbourne MA: Further studies of the measurement and correlates of belief in the paranormal. J Am Soc Psych Res 89:233–247, 1995
Tobacyk JJ: Death threat, death concerns, and paranormal belief. Death Educ 7:115–124, 1983
Tormala ZL, Rucker DD: Attitude certainty: antecedents, consequences, and new directions. Consumer Psychology Review 1:72–89, 2018
van Os J, Hanssen M, Bijl RV, Ravelli A: Strauss (1969) revisited: a psychosis continuum in the general population? Schizophr Res 45(1–2):11–20, 2000 10978868
van Prooijen J-W, Acker M: The influence of control on belief in conspiracy theories: conceptual and applied extensions. Appl Cogn Psychol 29:753–761, 2015
Verdoux H, Maurice-Tison S, Gay B, et al: A survey of delusional ideation in primary-care patients. Psychol Med 28(1):127–134, 1998 9483688
Vygotsky LS: Thought and Language. Cambridge, MA, MIT Press, 1962
Ward T, Garety PA: Fast and slow thinking in distressing delusions: a review of the literature and implications for targeted therapy. Schizophr Res 203:80–87, 2019 28927863
Westen D: The Political Brain: The Role of Emotion in Deciding the Fate of the Nation. New York, Public Affairs, 2008
Willer R: No atheists in foxholes: motivated reasoning and religious belief, in Social and Psychological Bases of Ideology and System Justification. Edited by Jost JT, Kay AC, Thorisdottir H. New York, Oxford University Press, 2009, pp 241–268
Winch P: The Idea of a Social Science and Its Relation to Philosophy. London, Routledge, 1958
Wing JK, Cooper JE, Sartorius N: The Measurement and Classification of Psychiatric Symptoms. London, Cambridge University Press, 1974
Wittgenstein L: Philosophical Investigations. Oxford, UK, Blackwell, 1953
Wittgenstein L: On Certainty. Translated by Paul D, Anscombe GEM. Edited by Anscombe GEM, von Wright GH. Oxford, UK, Blackwell, 1969
Wong SH: Does superstition help? A study of the role of superstitions and death beliefs on death anxiety amongst Chinese undergraduates in Hong Kong. Omega (Westport) 65(1):55–70, 2012 22852421
Wood MJ, Douglas KM, Sutton RM: Dead and alive: beliefs in contradictory conspiracy theories. Soc Psychol Personal Sci 3:767–773, 2012
Woodward TS, Moritz S, Menon M, Klinge R: Belief inflexibility in schizophrenia. Cogn Neuropsychiatry 13(3):267–277, 2008 18484291
Wootton D: The Invention of Science: A New History of the Scientific Revolution. New York, Allen Lane, 2015
World Health Organization: International Classification of Diseases, 11th Revision. Geneva, World Health Organization, 2018
Wu C, Shaffer DR: Susceptibility to persuasive appeals as a function of source credibility and prior experience with the attitude object. J Pers Soc Psychol 52(4):677–688, 1987
Young AW, Ellis HD, Szulecka TK, De Pauw KW: Face processing impairments and delusional misidentification. Behav Neurol 3(3):153–168, 1990 24487239
Zakay D, Tuvia R: Choice latency times as determinants of post-decisional confidence. Acta Psychol (Amst) 98:103–115, 1998
Zhu C, Sun X, So SH: Associations between belief inflexibility and dimensions of delusions: a meta-analytic review of two approaches to assessing belief flexibility. Br J Clin Psychol 57(1):59–81, 2018 28805246

Information & Authors

Published In
Decoding Delusions: A Clinician's Guide to Working With Delusions and Other Extreme Beliefs
Pages: 3–46

History
Published in print: 12 June 2023
Published online: 5 December 2024
© American Psychiatric Association Publishing

Authors
Richard Bentall, Ph.D., FBA
