It is common to come across people who believe extraordinary things, and who maintain those beliefs even in the face of huge amounts of contradictory evidence. For example, despite vast amounts of evidence suggesting otherwise, there are people who believe that aliens create crop circles, that astrology can predict their future, and that the next Adam Sandler movie will be any good. A delusion can be defined as an extraordinary belief that is strongly held despite the presence of seemingly overwhelming evidence to the contrary. Delusions are of particular interest to psychologists and neuroscientists because they occur in a number of neurological disorders, as well as in seemingly healthy individuals. For example, a variety of paranoid or grandiose delusions frequently occur in psychotic disorders such as schizophrenia. Delusions involving bizarre forms of misidentification, such as the belief that a loved one is an imposter (the Capgras delusion), can also occur, often in forms of dementia such as Alzheimer’s disease, and even in older adults who show no other noticeable cognitive impairment (1). Delusions of various types also occur in Parkinson’s disease, in depression, and as a result of other brain traumas such as those caused by strokes.
One error or two?
On a theoretical level there has traditionally been a distinction between 1-step and 2-step theories of delusions. 1-step theories (e.g. 2) suggest that a single perceptual deficit causes delusions: the delusion represents the most logical response to the bizarre perceptual information the brain is receiving as a result of that deficit. For example, paranoid delusions may be caused by a perceptual bias towards threat signals, which leads the sufferer to conclude that some overbearing threat must be present to explain the constant warnings coming from the sensory environment. In contrast, 2-step models (e.g. 3) argue that in addition to a perceptual deficit, there must also be a second, cognitive deficit. Such theories are motivated in part by the finding that some individuals exhibit very similar perceptual deficits to those with delusions, but nevertheless do not hold delusional beliefs. For example, there are individuals with bilateral damage to specific parts of the frontal lobe who, like patients with the Capgras delusion, experience a lack of familiarity when they come into contact with a particular close relative. However, in contrast to the Capgras patients, the frontal lobe patients do not come to believe that the relative is an imposter (4). Instead they are able to understand that it is their experience that has changed, rather than their relative. While 1-step theories suggest that delusions are caused by a single neuro-perceptual deficit, which varies depending on the nature of the delusion, 2-step theories require that an additional, separate deficit exists within the neural system involved in the formation and evaluation of beliefs. Variation in this second cognitive stage explains the likelihood of adopting a delusional belief in the context of disrupted perceptual experience, and hence the difference between the Capgras and frontal lobe patients.
How are beliefs formed and updated?
If delusions are underpinned by a 2-step deficit, with the second, cognitive step being similar across delusional disorders, then the question arises: what exactly is the nature of this cognitive deficit? Recently an answer has been proposed based on the insight that our ability to navigate the world is achieved through a process of inferential learning (e.g. 5). In short, it is proposed that the brain creates representations of how the external world is organized based on the information it receives. These models of the world by their nature encapsulate our belief system, as they contain representations of how different pieces of information are related, and what is likely to occur in any given situation. These models also allow the brain to predict both upcoming external stimulation and internal experience. When actual experience differs from what is expected, signals communicating this discrepancy (referred to as prediction-error signals) are sent back to the areas that generated the prediction, with the purpose of updating the model from which the original prediction arose. This process, when working optimally, allows us to adapt to new, unexpected information while at the same time enabling the majority of unexceptional information we encounter to be processed quickly and with minimum effort (because it has been predicted in advance).
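The core idea of prediction-error-driven updating can be illustrated with a minimal sketch. This is not an implementation of any of the models cited above; it is a simple delta rule, with a made-up learning rate, showing how a belief (here just a number) is nudged towards each new observation in proportion to the prediction error:

```python
# A minimal sketch of prediction-error learning (a simple delta rule).
# The learning rate is a hypothetical value chosen purely for illustration,
# not one taken from the models cited in the text.

def update_belief(prediction, observation, learning_rate=0.1):
    """Shift a belief towards an observation in proportion to the
    prediction error (observation minus prediction)."""
    prediction_error = observation - prediction
    return prediction + learning_rate * prediction_error

# Expected input produces no error, so the belief is untouched...
belief = update_belief(0.5, 0.5)

# ...while repeated surprising observations gradually pull the
# model towards the new evidence.
belief = 0.0
for _ in range(50):
    belief = update_belief(belief, 1.0)
print(belief)  # close to 1.0 after many surprising observations
```

The key property, mirroring the text, is that expected input generates no error signal and so costs the model nothing, while surprising input drives learning.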
Within this system the updating of beliefs can be framed using the principles of Bayesian inference, whereby the decision as to whether to adopt one of (say) two explanations for an unexpected stimulus is taken by balancing the inherent (prior) probability of each explanation, based on the current model of the world that the individual holds, against the likelihood of the unexpected stimulus having occurred if each explanation were true. When faced with a surprising or anomalous experience, such as those caused by the perceptual deficits believed to underpin the first step of delusion formation, a belief will only be revised if the probability of the sensation occurring under the new belief, relative to its probability under the existing belief, is great enough to outweigh the gap in the prior probability of the two beliefs. To adopt an atypical or delusional belief, whose prior probability would usually be very low, new evidence would have to appear that is almost inexplicable within the current belief system, while being fully explainable under the new belief. For example, to believe that the moon is made of cheese would probably require you to actually travel to the moon, dig a bit of it up, put it in your mouth and taste cheese. Any lesser form of evidence would be discarded as a coincidence or trick, as the prior probability of the moon being made of cheese given your existing belief system is (or at least should be) extremely low!
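This balancing act can be sketched in a few lines of code. The numbers below are invented purely for illustration (the text gives no actual probabilities): the point is simply that the posterior odds in favour of a new belief are the prior odds multiplied by the likelihood ratio, so a tiny prior can swamp even quite dramatic evidence:

```python
# A toy Bayesian comparison of two candidate beliefs.
# posterior odds = prior odds x likelihood ratio.
# All numbers are hypothetical, chosen only to illustrate the argument.

def posterior_odds(prior_new, prior_old, lik_new, lik_old):
    """Odds in favour of the new belief after observing the evidence."""
    return (prior_new / prior_old) * (lik_new / lik_old)

# "The moon is made of cheese": the prior is minuscule, so even evidence
# that is ~1000x more likely under the cheese hypothesis leaves the
# posterior odds far below 1, and the old belief survives.
odds = posterior_odds(prior_new=1e-9, prior_old=1.0 - 1e-9,
                      lik_new=0.99, lik_old=0.00099)
print(odds < 1.0)  # the cheese hypothesis is still rejected
```

Only evidence that is essentially inexplicable under the old belief (a likelihood ratio large enough to overwhelm the prior odds) would tip the balance, which is exactly the point of the cheese-tasting example above.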
Delusions: A problem with prediction error?
In delusions it is proposed that this process of error-dependent updating of beliefs is disrupted, most likely through a process whereby the weight (or importance) given to various prediction-error signals is sub-optimal (e.g. 6, 7). If prediction-error signals are given undue weight, then potentially unimportant deviations from expectation will be flagged as highly salient, and in turn given unwarranted influence in updating our belief system. An anomalous experience that would normally not be treated as particularly relevant to understanding how the world works, either because of the unusual context in which it occurred or because of its infrequency, would, if this deficit existed, be treated as important enough to warrant a change in the individual’s belief system. In terms of Bayesian inference, a system which gives undue weight to prediction errors would be one with a bias towards accepting the influence of new anomalous experiences without fully taking into account the relative prior probabilities of the competing beliefs (which would usually strongly favour the non-delusional belief) (8). A less convincing anomalous experience would therefore suffice to overturn an existing non-delusional belief.
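One simple (and entirely hypothetical) way to model this "undue weight" is to raise the likelihood ratio to a power before combining it with the prior odds: a weight of 1 is the optimal Bayesian observer, while a weight above 1 lets the new evidence punch above its proper strength. The figures below are again invented for illustration only:

```python
# A hedged sketch of how over-weighting prediction errors could tip a
# Bayesian update towards a delusional belief. Raising the likelihood
# ratio to a weight w > 1 is one simple, hypothetical way of modelling
# "undue weight" on new evidence; w = 1 is the optimal observer.

def weighted_posterior_odds(prior_odds, likelihood_ratio, w=1.0):
    """Posterior odds with the evidence term over- or under-weighted."""
    return prior_odds * likelihood_ratio ** w

prior_odds = 1e-4         # the imposter belief starts out very unlikely
likelihood_ratio = 50.0   # absent familiarity favours the imposter belief

# With optimal weighting the prior wins and the belief is unchanged;
# with the evidence over-weighted, the odds flip past 1 and the
# delusional belief is adopted.
print(weighted_posterior_odds(prior_odds, likelihood_ratio, w=1.0) < 1.0)
print(weighted_posterior_odds(prior_odds, likelihood_ratio, w=3.0) > 1.0)
```

Exactly the same evidence thus leaves the healthy observer's belief intact while overturning it in the over-weighted system, which is the contrast the Capgras example below turns on.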
As an example, reconsider the aforementioned difference between patients with frontal lobe lesions and those with the Capgras delusion. In both types of patient the feeling of familiarity that is expected to appear on the physical recognition of a known person is absent. In the non-deluded individual, while this discrepancy is noted, it is not used to adopt the ‘imposter explanation’ because the correct weight is given to the prediction error and it is therefore not strong enough to overturn an otherwise functioning belief that the individual is who they claim to be (a belief that would be supported by several other pieces of information). In contrast the deluded individual gives far too much weight to the unexpected experience of non-familiarity, and the model is changed to accommodate it through the acquisition of the belief that the person is an imposter. As the prediction error deficit in such cases is restricted to the perceptual system dedicated to familiarity processing, other evidence that is contradictory to the imposter hypothesis, but which comes from a different source (e.g. people telling the deluded individual that they are wrong) is not treated with the same weight as the experience of absent familiarity. The delusion is therefore maintained even in light of strong contradictory evidence.
More widespread delusions
Whereas the Capgras delusion tends to be monothematic (i.e. it relates to just one known person having been replaced by an imposter, rather than people in general being imposters), faulty prediction-error signalling can also be used to explain more widespread delusional thinking such as paranoia. For example, one potential consequence of the incorrect updating of belief systems is that the model of the world that the individual holds will itself become further divorced from reality, making it less able to accurately predict upcoming stimulation. This in turn will lead to a further increase in the frequency of prediction errors, to the extent that surprising or anomalous information would seem to occur with baffling frequency. If the deficit in prediction error exists across more than one perceptual domain, the inferential response might be to adopt a paranoid outlook to explain this constant uncertainty in the world. For example, a delusion that MI5 are spying on the sufferer might be the best explanation for a world where objects and strangers seem to take on a sinister level of salience, and unexpected events seem to happen with alarming frequency (6).
Is healthy belief formation optimal, or are we all deluded?
A strength of a model of delusions based on deficits in inferential learning is that it can also be used to explain the characteristics of general belief formation. For example, deficits in prediction-error signalling may explain why some otherwise healthy individuals tend to adopt a wide variety of irrational beliefs. Such people may lack the perceptual deficit that causes the bizarre but specific anomalous experiences suffered by individuals with clinical delusions, but they may share with the clinical group a general deficit in inferential reasoning which results in a tendency to accept unusual beliefs that are poorly supported by the available evidence. Along similar lines, departures from optimal processing (in Bayesian terms) may explain more general cognitive biases that seem to be present in most people (including scientists!) and which are therefore presumably hard-wired in the human brain because they confer some adaptive evolutionary advantage. For example, most people display a ‘belief bias’: the tendency to evaluate the validity of evidence based on their prior beliefs, rather than on the inherent validity of the evidence as could be assessed through logical reasoning (9). This bias could be said to reflect a system of inferential learning that is sub-optimal (in Bayesian terms) in the opposite direction to that seen in delusions, such that we have a bias towards evaluating beliefs more in terms of their prior probability (as we see it) without fully taking into account new evidence.
More generally the processes of inferential learning and belief formation may be able to explain why people who have had relatively similar types of upbringing and experience can often exhibit very different sets of beliefs. These differences are likely to be in part due to differences in the process of belief formation between individuals. It would seem very unlikely that anybody’s brain is able to process information in strict accordance with Bayesian inference, given that neural signals are coded through the transmission of neurotransmitters between groups of neurons, a process that is naturally susceptible to a significant amount of noise. Differences in beliefs between people are presumably therefore inevitable, as is the likelihood that we all, at some time, adopt irrational convictions. Of course these are just things that I believe, and I may be deluded in believing them!
(1) Holt, A.E., & Albert, M.L. (2007). Cognitive neuroscience of delusions in aging. Neuropsychiatric Disease and Treatment, 2(2), 181-189.
(2) Maher, B.A. (1974). Delusional thinking and perceptual disorder. Journal of Individual Psychology, 30, 98-113.
(3) Coltheart, M., Langdon, R., & McKay, R. (2011). Delusional belief. Annual Review of Psychology, 62, 271-298.
(4) Tranel, D., Damasio, H., & Damasio, A.R. (1995). Double dissociation between overt and covert face recognition. Journal of Cognitive Neuroscience, 7(4), 425-432.
(5) Friston, K. (2003). Learning and inference in the brain. Neural Networks, 16(9), 1325-1352.
(6) Fletcher, P.C., & Frith, C.D. (2009). Perceiving is believing: a Bayesian approach to explaining the positive symptoms of schizophrenia. Nature Reviews Neuroscience, 10(1), 48-58.
(7) Corlett, P.R., Taylor, J.R., Wang, X.J., Fletcher, P.C., & Krystal, J.H. (2010). Toward a neurobiology of delusions. Progress in Neurobiology, 92(3), 345-369.
(8) McKay, R. (2012). Delusional inference. Mind & Language, 27(3), 330-355.
(9) Markovits, H., & Nantel, G. (1989). The belief-bias effect in the production and evaluation of logical conclusions. Memory & Cognition, 17(1), 11-17.