Cognitive bias

Classical economics assumes individuals are rational decision makers: they gather extensive information, examine all alternatives, and choose the option that maximises personal satisfaction. In reality, we rarely make decisions in such a manner.

Mount Everest tragedy

In 1996 two Mount Everest expedition teams were caught in a storm high on the mountain. Both team leaders and three team members died during the storm. These were commercial expeditions, meaning clients paid to be guided by professional mountaineers.


It takes about two months to climb Mount Everest, including a six-week acclimatisation and preparation period. The expedition group has to establish a series of camps along the route to the summit, starting with base camp. The final push to the summit is an 18-hour round trip: you have to leave at night and climb through the dark. If all goes well, climbers reach the summit by midday. Afterwards, they must descend quickly to reach Camp IV before nightfall. Supplemental oxygen is critical.


During this particular expedition, the mountaineers violated some of their own rules. There is a well-known principle: if you cannot reach the top by one or two o'clock in the afternoon, you should turn around, because climbing down in darkness is dangerous. In May 1996, many of the expedition members did not reach the summit until late afternoon. The turnaround rule was ignored. A storm hit the mountain, and the climbers died high up there.

Cognitive Biases

Human beings are cognitively limited (bounded rationality). We cannot be as comprehensive in our information gathering and analysis as economists assume. Instead, we search for alternatives only until we find an acceptable solution (satisficing); we do not keep looking for a perfectly optimal one.
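The contrast between optimising and satisficing can be sketched in a few lines of Python. The job-offer numbers and the aspiration level below are invented purely for illustration:

```python
def satisfice(options, acceptable, evaluate):
    """Return the first option whose score meets the aspiration level,
    instead of scanning every option for the true optimum."""
    for option in options:
        if evaluate(option) >= acceptable:
            return option
    return None  # no acceptable option found

# Hypothetical job offers (annual pay); aspiration level: at least 50,000.
offers = [42_000, 48_000, 55_000, 90_000]
chosen = satisfice(offers, acceptable=50_000, evaluate=lambda pay: pay)
print(chosen)  # 55000: the search stops before the better 90,000 offer is seen
```

A satisficer accepts the first "good enough" option; an optimiser would have to evaluate every offer before deciding.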

In many situations, we take shortcuts. People employ heuristics, or rules of thumb, to make decisions. Most of the time these shortcuts serve us well: they save a great deal of time and we still arrive at good decisions. Sometimes, however, they lead us astray. Our cognitive limitations produce systematic errors in judgement, known as cognitive biases.

1. Overconfidence Bias

Human beings are systematically overconfident in their judgements. Such positive self-assessments were present in the Mount Everest case: judging by previous expeditions, the team leaders were confident they would make it this time too.

2. Sunk-Cost Effect

The sunk-cost effect is the tendency to escalate commitment to a course of action in which one has made substantial prior investments of time, money, and other resources. Rational decision makers would base their choices on the marginal costs and benefits of their actions, ignoring what is already spent. In the face of high sunk costs, however, people become overly committed to an activity even when its results are poor.

In the Mount Everest case, the climbers did not want to “waste” the time, money, and other resources they had invested over many months of preparation. So they violated the turnaround rule and kept climbing. When you are that close to the top, how do you stop?

3. Recency Effect

The recency effect is a form of the availability bias: the tendency to place too much weight on the information and evidence that come to mind most readily, especially recent experience. In the Everest case, the weather had been good in recent years, so the climbers underestimated the probability of a severe storm, and with it the probability of failure.

Adapted from

Bazerman, M.H. and Moore, D.A., 1994. Judgment in managerial decision making (p. 226). New York: Wiley.

Common biases in decision making

In general, three heuristics are distinguished: the availability, representativeness, and confirmation heuristics. Together they give rise to eleven specific biases, several of which are summarised below.

Heuristic definition

Individuals rely on rules of thumb (heuristics) to lessen the information-processing demands of making decisions.

Availability heuristic

We judge how common an event is by the ease with which we can recall instances of it.

Retrievability bias

Some things are easier to retrieve from memory than others. Individuals judge frequency by how their memory is organised and which search strategies come easiest, not by actual commonness.

Base rate fallacy

People tend to ignore background information relevant to the problem, such as base rates, and instead assume that causes and consequences resemble each other.
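A classic illustration of neglecting base rates, with hypothetical numbers not taken from the text above, is a diagnostic test that looks accurate yet produces mostly false positives when the condition is rare. Bayes' rule makes this concrete:

```python
# Hypothetical numbers for illustration only: a condition with a 1% base rate
# and a test with 90% sensitivity and a 9% false-positive rate.
base_rate = 0.01
sensitivity = 0.90       # P(positive | condition)
false_positive = 0.09    # P(positive | no condition)

# Total probability of a positive result, then Bayes' rule.
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
p_condition_given_positive = sensitivity * base_rate / p_positive

print(round(p_condition_given_positive, 3))  # 0.092
```

Despite the seemingly accurate test, fewer than one in ten positive results actually indicates the condition, because the 1% base rate dominates the calculation.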

Gambler’s fallacy

Simple statistics tell us that each independent event in a sequence is equally likely to occur. Yet individuals believe that random and non-random sequences will "balance out".
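A quick simulation, using a hypothetical fair coin, shows that a streak does not change the odds of the next flip:

```python
import random

random.seed(42)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

# Collect the flip that follows every run of three tails in a row.
after_three_tails = [
    flips[i]
    for i in range(3, len(flips))
    if not flips[i - 1] and not flips[i - 2] and not flips[i - 3]
]
p_heads = sum(after_three_tails) / len(after_three_tails)
print(round(p_heads, 2))  # close to 0.5: the streak changes nothing
```

If the gambler's fallacy were correct, heads would be "due" after three tails and the printed frequency would exceed 0.5; it does not.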

Small sample size fallacy

Simple statistics state that we are more likely to observe an unusual event in a small sample than in a large one, yet people draw confident conclusions from small samples anyway.
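A short simulation, with invented sample sizes and an arbitrary 80%-heads threshold, shows how much more often an "unusual" result appears in small samples than in large ones:

```python
import random

random.seed(1)

def extreme_rate(sample_size, trials=20_000):
    """Fraction of samples of fair-coin flips with at least 80% heads."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads >= 0.8 * sample_size:
            extreme += 1
    return extreme / trials

rate_small = extreme_rate(5)    # roughly 0.19 (4+ heads out of 5)
rate_large = extreme_rate(100)  # essentially 0 (80+ heads out of 100)
print(rate_small, rate_large)
```

Getting 80% heads from a fair coin happens in nearly one in five samples of size 5, but practically never in samples of size 100.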

Conjunction fallacy

The conjunction fallacy occurs when a conjunction of two events is judged more probable than one of its components alone, even though P(A and B) can never exceed P(A). The richer, more detailed description intuitively feels more plausible.
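The underlying probability rule can be checked directly: however the events are defined, the conjunction is a subset of each component, so its probability can never be larger. A sketch with arbitrary, made-up probabilities (the trait labels echo the classic "Linda" problem):

```python
import random

random.seed(7)
n = 100_000

# Arbitrary hypothetical probabilities for two independent traits.
a = [random.random() < 0.10 for _ in range(n)]  # e.g. "is a bank teller"
b = [random.random() < 0.30 for _ in range(n)]  # e.g. "is active in a movement"

p_a = sum(a) / n
p_a_and_b = sum(x and y for x, y in zip(a, b)) / n

print(p_a_and_b <= p_a)  # True by necessity: the conjunction is a subset of A
```

People who commit the fallacy effectively rank p_a_and_b above p_a, which no assignment of probabilities can justify.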

Confirmation trap

People naturally tend to seek information that confirms their expectations and hypotheses, even when searching for disconfirming, falsifying evidence would be more informative.
