Dragon 1: Limited cognition

Robert Gifford identified and named the sub-species of the first dragon; these have since been updated by van Bronswijk et al. (2021) as follows.

Limited cognition or reasoning ability

Human thinking is not always logical or rational (Gifford, 2011). At least since the studies of Tversky and Kahneman (1974), we have known that humans tend to simplify their thinking with rules of thumb (heuristics).

The main purpose of these heuristics is to reduce uncertainty and to cope as well as possible with limited time and knowledge. Even though they are often useful for decision-making, they can also lead to systematic errors (biases).

In the first dragon family, Limited Cognition or Limited Thinking Capacity, Robert Gifford uses the example of the climate crisis to explain how irrational thinking can affect our decisions. Apart from error-prone rules of thumb, our actions can also be influenced more generally by limited cognitive capacities (e.g. the Finite Pool of Worry; Hansen et al., 2004) or a limited attention budget (the Finite Pool of Attention; Sisco et al., 2020).

The dragon species of the Limited Thinking family are divided into:

  • Ancient brain – I live in the here and now.
  • Ignorance – What I don’t know can’t hurt me.
  • Environmental numbness or deadening – Climate change in the news again?
  • Insecurity – I can’t, I don’t know, I don’t want to.
  • Optimistic bias – It’s going to be okay!
  • Putting it in perspective – It’s worse elsewhere and at other times.
  • (Lack of) perceived action control and (lack of) perceived self-efficacy – Can I change anything at all?

Ancient brain

In evolutionary terms, human beings have long been preoccupied with coping as well as possible in the here and now – be it securing food, meeting the demands of the social group or defending against danger.

This contrasts with today’s challenges, such as the climate crisis, which are unfolding slowly, sometimes even unnoticed, and independently of personal well-being.

As a result, people often do not feel (directly) affected. This can lead to a low motivation to adapt their behaviour, making, for example, the reduction of CO2 emissions less likely (Gifford, 2011).

Trope and Liberman (2012) also describe in their Construal Level Theory that we humans focus our perception on the present and overweight it in our decisions. With greater psychological distance (temporal, spatial, interpersonal), self-regulatory capacity increases and a person’s core values come to the fore.

With lower psychological distance, situational evaluations become more important. Taking the example of a citizen’s decision on wind power, this could mean that voting behaviour at low psychological distance is more dependent on the environment, for example the opinion of the social group.

This lower psychological distance could be due to the spatial proximity of the planning area or an imminent voting deadline. With greater psychological distance, the decision would instead be based more on one’s own values and convictions.


Ignorance

Ignorance arises from a lack of awareness that certain problems even exist, or because certain connections are unknown. This can be compounded by a lack of technical and scientific understanding (Gifford, 2011). So, does this mean that more knowledge simply needs to be taught? Not necessarily. Presenting too much information for a decision can lead to overload or excessive demands (Artinger et al., 2016).

It may even be that in some situations the heuristics just mentioned lead to better decisions than evaluating a large amount of information (Gigerenzer & Selten, 2002). The reason is the less-is-more effect: a lot of information means a lot of effort and can therefore make it harder to reach an optimal decision (Goldstein & Gigerenzer, 2002). Conversely, this does not mean that facts should be presented incompletely; they must still be formulated clearly and correctly.

Ignorance can also come from the media: differing presentations in reports can lead to confusion and thus to a reduced readiness to act (Gifford, 2011; Hoggan & Littlemore, 2009).

However, the influence of the media also depends on a person’s motivation to form a correct and well-founded opinion in the first place (Valentino & Nardis, 2013). According to the effect of selective exposure to information (Klapper, 1960), people tend to seek out only those reports that confirm their existing views. The reason is that contradictory information can create an unpleasant feeling of inner tension (see Cognitive Dissonance; Festinger, 1957), and choosing reports that confirm one’s own views is exactly how this tension is avoided.

Environmental numbness or deadening

Many situations contain more information and detail than we can process (Artinger et al., 2016; Gifford, 2011).

This is also true of a large-scale problem like the climate crisis.

Through excessive confrontation with the topic of “climate change”, habituation sets in and our attention is aroused less and less (Gifford, 2011; Stoknes, 2014).

The consequence: people become numb in their engagement with the climate problem.

However, communicating only the severity of environmental degradation can also be counterproductive; it is important to point out possible solutions and opportunities at the same time. Otherwise emotional capacity is exceeded, which not only leads to numbing but causes the problem itself to be “warded off”: the issue is suppressed instead of being dealt with. Climate communication must therefore change, i.e. not just focus on sending a message, but also encourage people to take action.


Insecurity

The next type in the dragon family Limited Thinking is the state of uncertainty and insecurity. Uncertainty means hesitation, and hesitation means inhibition of action (Gifford, 2011). Uncertainty can sometimes even be useful, namely when we “abuse” it for our own conscience: being uncertain is a convenient excuse for dismissing one’s own options for action while actually justifying one’s own selfish behavior (see the dragon Denial).

Scientists are thus faced with the challenge of meeting the demand for accurate and probability-based reporting without, in the worst case, “provoking” uncertainty and a systematic underestimation of risks.

People encounter uncertainties and probabilities in many aspects of climate change (Gifford, 2011; Yousefpour & Hanewinkel, 2016):

  • Which behavior makes the most sense?
  • What options are available to me at all?
  • And how can I still manage to protect my own needs and habits?

Roelich and Giesekam (2019) investigated the influence of uncertainty on decision-makers, for example in politics. One difficulty in decision-making is that greenhouse gases come from various emitters and it is not always clear who contributes what. In addition, decision-makers have to rely on models and forecasts whose predictions hold only with a certain probability. A long-term reduction of CO2 emissions is therefore complex and marked by uncertainties at various points.

Optimistic bias

The next dragon type is called optimistic bias. Through overconfidence, people may underestimate the risks of climate-change-related environmental disasters (Gifford, 2011). We then assume that a positive future will occur even without having to do, or refrain from doing, anything. This optimism also makes us feel less guilty for our actions and less likely to take responsibility for their consequences. Optimistic bias can also serve self-esteem: for example, people assume that they already act more pro-environmentally than others (Bergquist, 2020).


Putting it in perspective

If we want to meet the challenge of limiting the global temperature increase to 1.5 °C together, we have to take responsibility – each and every one of us. This can be overwhelming and can motivate people to abdicate responsibility for environmental degradation. A simple way to do just that is to insist that events in the distant future or in other places are much worse than in the subjective “here and now” (cf. the dragon species Ancient Brain; Gifford, 2011).

This inhibits the willingness to act on site and in the present.

The reason: we feel less affected and therefore less responsible.

Lack of perceived action control and self-efficacy

The last two dragon types of the Limited Cognition or Thinking Capacity family are called Lack of Perceived Action Control and Lack of Self-Efficacy (Gifford, 2011). They also lead to inhibition of action intentions.

When people believe they can do nothing because of the magnitude of a problem such as climate change, they do not take action at all (Ajzen, 2002). In extreme cases, this even leads to the belief that no one else can do anything either (Gifford, 2011; Mayer & Smith, 2019). One psychological cause of this is the subjective assessment of how much control, if any, someone or something has over what actually happens (see Locus of Control; Rotter, 1966).

In addition, a great many people and organisations are involved in the emission of greenhouse gases. Again, the question arises:

  • How much responsibility do I personally actually bear?
  • And above all: How much control do I actually have over whether something changes?

The perception of a lack of control is fueled by the (not unfounded) assumption that however much we do to combat climate change, it will be of no use if other countries and companies think only of their own profits and continue to emit greenhouse gases (Stoknes, 2014).

Uncertainty about how we should act thus arises from uncertainty about who bears how much responsibility, or has how much influence, at which point. This leads to a diffusion of responsibility, which can already be observed in everyday situations: imagine we are standing in a park with a group of other people and suddenly someone calls for help. Often very few people take action, either because they are unsure or because they imitate the people around them, who stand still and do nothing. This behavior in group situations is called the bystander effect (Darley & Latané, 1968; Latané & Darley, 1969).

The climate crisis even reveals a global bystander effect (Mills, 2020): governments and opinion leaders do not act as environmental role models that could be imitated by citizens. They do nothing or far too little. Many people do the same. Just like individual citizens, “social leaders” are often overtaxed, unrealistically optimistic and underestimate the risks. In other words, everyone stands by while the climate calls for help.

But there is also good news: People are taking to the streets, protesting, and breaking the powerlessness (Swim et al., 2019). Moments of collective power emerge and weakened self-efficacy is strengthened by the group feeling (Jugert et al., 2016). Even people who were previously only inactive observers or bystanders are mobilised, because strikes not only strengthen the belief in collective potential, but can also reduce former prejudices against the demonstrators (Swim et al., 2019).


Dohm, Lea; Peter, Felix; van Bronswijk, Katharina (Eds.). (2021). Climate Action – Psychologie der Klimakrise. Giessen, Germany: Psychosozial-Verlag.

Gifford, Robert. (2011). The dragons of inaction: Psychological barriers that limit climate change mitigation and adaptation. American Psychologist, 66(4), 290–302.