Cognitive bias in investment decision-making
We’ve compiled a list of the main unconscious biases and their influence on the way we invest and make financial decisions.
To understand the impact of cognitive bias on active management, download our free white paper by investment process expert, Eric Rovick. It offers practical steps for portfolio managers who want to recognise and manage bias in their daily investment processes.
Anchoring Bias
When we rely too much on the first piece of information we come across when making a decision.
Availability Heuristic (aka Availability Bias)
A mental shortcut by which one overestimates the importance or likelihood of something based on how easily an example or instance comes to mind.
This bias is important because of its impact on how accurately we perceive risk. What we remember, or how easily something comes to mind, is influenced by many things, but media coverage is usually a big factor.
“People tend to assess the relative importance of issues by the ease with which they are retrieved from memory—and this is largely determined by the extent of coverage in the media. Frequently mentioned topics populate the mind even as others slip away from awareness…”.
Daniel Kahneman – Thinking, Fast and Slow (2011)
Bandwagon Effect
Believing something is true or correct because many other people do.
Blind Spot Bias
Demonstrated when we think we’re less prone to cognitive bias than those around us.
“People see themselves differently from how they see others. They are immersed in their own sensations, emotions, and cognitions at the same time that their experience of others is dominated by what can be observed externally.”
Emily Pronin – How We See Ourselves and How We See Others (2008)
Conjunction Fallacy
Linked to the representativeness heuristic, this is the tendency to assume that specific conditions are more probable than general ones.
Clustering Illusion
The tendency to overestimate the importance of small patterns or clusters found in a large amount of data.
Confirmation Bias
Also called confirmatory bias or myside bias, this is the tendency to search for, interpret, and remember information in a way that confirms our existing preconceptions. This unconscious bias can lead us to miss findings or ignore evidence that could otherwise change our view.
“Now, there was a smart man, who did just about the hardest thing in the world to do. Charles Darwin used to say that whenever he ran into something that contradicted a conclusion he cherished, he was obliged to write the new finding down within 30 minutes. Otherwise his mind would work to reject the discordant information, much as the body rejects transplants. Man’s natural inclination is to cling to his beliefs, particularly if they are reinforced by recent experience–a flaw in our makeup that bears on what happens during secular bull markets and extended periods of stagnation.”
Conservatism Bias
When we cling to an initial viewpoint even when there’s new information or evidence that challenges it.
The Curse of Knowledge
When knowledge of a topic diminishes our ability to think about it from a less-informed, but more neutral, perspective.
The Disposition Effect
An expression of loss aversion, the disposition effect was identified and named by Hersh Shefrin and Meir Statman in 1985. It is the tendency, common to both professional and amateur investors, to hold on to losing investment positions for too long, whilst selling winners too soon.
Research into the investment impact of the disposition effect by Terrance Odean (Are Investors Reluctant to Realize Their Losses? – 1998) showed that winners that were sold outperformed losers that were retained by an average excess return of 3.4% per annum.
“Meir Statman and I… coined the term disposition effect as shorthand for the predisposition toward get-evenitis.”
Hersh Shefrin (2000)
The Endowment Effect
When we consider an asset that we already own as more valuable than similar assets that we don’t. Also known as divestiture aversion.
The term was coined in 1980 by Richard Thaler, who was the first person to systematically study the bias.
One of the most famous experiments into the endowment effect is the 1990 research carried out by Daniel Kahneman, Jack Knetsch and Richard Thaler. In this study, some of the participants were each given a mug. They were not told the retail price of the mug and were each asked to list the lowest price they’d be willing to sell it for. Other participants didn’t receive a mug and were asked how much they’d be willing to pay to buy one.
The difference in price from each group for the same mug was striking: those with the mug listed selling prices that were too high for the buyers (on average, they would not sell for less than $5.25). In contrast, those buying the mug did not want to pay more than $2.25 – $2.75.
This experiment was repeated with other objects, including pens. Even when the price tag was left on the pen, the same effect was demonstrated – with pen sellers listing sale prices of $4.25-$4.75, even when the pen’s price tag showed $3.98.
The Framing Effect
Drawing different conclusions from the same information, depending on how or by whom that information is presented.
The Gambler’s Fallacy
The belief that future probabilities are altered by past events. This bias is a product of “representativeness”, a psychological phenomenon that leads us to rely too heavily on heuristics (rules of thumb) or “stereotypical thinking” when making decisions or judgements.
Linked to the hot-hand fallacy in basketball, where observers predict a player will continue to play well because recent performance has been so good:
“….statistically, there is no such thing as a hot hand. This is not to say that we do not see streaks. We do see them, but the point is that they have no predictive power! The idea that streaks have predictive power is an illusion. The fact is, as a species, humans have very poor intuition about random processes, whether the process is coin tossing or whether it is three-point attempts in basketball. In particular, representativeness leads us to extrapolate recent performance.”
Hersh Shefrin et al – Behavioral Finance: Biases, Mean–Variance Returns, and Risk Premiums (2007)
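The point that streaks carry no predictive power in a random process is easy to check directly. The following is a small illustrative simulation (not from the article or the cited paper): after three heads in a row, a fair coin still lands heads about half the time.

```python
import random

# Illustrative simulation: a streak of three heads in a fair coin-flipping
# process tells you nothing about the next flip.
random.seed(42)
flips = [random.random() < 0.5 for _ in range(200_000)]  # True = heads

# Collect every flip that immediately follows three heads in a row.
after_streak = [flips[i] for i in range(3, len(flips))
                if flips[i - 3] and flips[i - 2] and flips[i - 1]]

p = sum(after_streak) / len(after_streak)
print(f"P(heads | three heads in a row) ~ {p:.2f}")  # close to 0.5
```

Streaks of three heads occur thousands of times in a run this long, yet the conditional frequency of heads afterwards stays at the unconditional 50%.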
Hyperbolic Discounting
The tendency to favor short-term gains over greater gains available in the long term.
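A common formalization of this bias (not given in the article) is the one-parameter hyperbolic discount function V = A / (1 + kD), contrasted with standard exponential discounting. The parameter values below are illustrative assumptions; the sketch shows the preference reversal hyperbolic discounting produces.

```python
# A minimal sketch of hyperbolic vs. exponential discounting.
# The discount parameters k and r are illustrative, not empirical estimates.

def hyperbolic_value(amount, delay_days, k=0.2):
    """One-parameter hyperbolic discount: V = A / (1 + k*D)."""
    return amount / (1 + k * delay_days)

def exponential_value(amount, delay_days, r=0.001):
    """Exponential (time-consistent) discounting: V = A * (1 + r)**(-D)."""
    return amount * (1 + r) ** (-delay_days)

# $110 in 31 days vs. $100 in 30 days: viewed from a distance, the larger,
# later reward wins; once the smaller reward is immediate, preference flips.
prefer_later_far = hyperbolic_value(110, 31) > hyperbolic_value(100, 30)
prefer_later_near = hyperbolic_value(110, 1) > hyperbolic_value(100, 0)

print(prefer_later_far)   # patient when both rewards are distant
print(prefer_later_near)  # impatient when $100 is available today
```

Under exponential discounting the ranking of the two rewards never flips as time passes; the reversal is the signature of hyperbolic discounting.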
Herd Mentality
The tendency to follow the actions of a larger group.
House Money Effect
The tendency to take on greater risks when investing with profits. The name is derived from the casino-related expression “playing with the house’s money”.
The effect was identified by Richard Thaler and Eric Johnson in their 1990 paper, Gambling With the House Money and Trying to Break Even: The Effects of Prior Outcomes on Risky Choice.
The mental accounting behind this bias suggests that a successful outcome in a recent trade or investment with above-average risk leads to a temporary increase in the investor’s risk tolerance. As a result, the investor seeks even more risk with their next trade.
“How is risk-taking affected by prior gains and losses? While normative theory implores decision makers to only consider incremental outcomes, real decision makers are influenced by prior outcomes”.
Thaler and Johnson (1990)
The IKEA Effect
This bias explains the tendency for people to place a disproportionately high value on objects that they partially created themselves, regardless of the quality of the end result.
In the original 2011 research, The IKEA effect: When labor leads to love, consumers assembled IKEA boxes, folded origami, and built Lego sets. Participants saw their amateurish creations as similar in value to those created by experts, and expected others to share that view. The IKEA effect was only evident when participants had successfully completed their assembly task; when they built and then destroyed their creations, or failed to complete them, the cognitive bias dissipated.
“The overvaluation that occurs as a result of the IKEA effect has implications for organizations more broadly, as a contributor to two key organizational pitfalls: sunk cost effects (Arkes and Blumer 1985; Staw 1981), which can cause managers to continue to devote resources to failing projects in which they have previously invested (Biyalogorsky, Boulding, and Staelin 2006), and the “not invented here” syndrome, in which managers refuse to use perfectly good ideas developed elsewhere in favor of their – sometimes inferior – internally-developed ideas.”
Norton, Mochon and Ariely (2011)
The Illusion of Control
This occurs when we tend to overestimate our ability to control events or outcomes. The effect was identified by Ellen Langer in 1975. Her research showed that people can behave as if chance events are accessible to personal control. The bias was notable in people’s perception of gambling and the paranormal.
In a series of experiments, Langer found that people were more likely to demonstrate the illusion of control when ‘skill cues’ were present – i.e. there was something in the chance situation that could be readily associated in the individual’s mind with the exercise of personal skill or control. A simple form of this effect was found in casinos: when rolling dice in a craps game, people tend to throw harder when they need high numbers and softer for low numbers.
The 2003 research, Trading on illusions: Unrealistic perceptions of control and trading performance, (Fenton-O’Creevy*, Nicholson, Soane and Willman) explored the illusion of control in the context of financial market decision-making:
A group of traders were asked to watch a graph being plotted on a computer screen, similar to a real-time graph of a stock price or index. Using three computer keys, they had to raise the value as high as possible. They were warned that the value showed random variations, but that the keys might have some effect. In fact, the fluctuations were not affected by the keys. The traders’ ratings of their success measured their susceptibility to the illusion of control. This score was then compared with each trader’s performance. Those who were more prone to the illusion scored significantly lower on analysis, risk management and contribution to profits. They also earned significantly less.
“Traders with a high propensity to illusion of control exhibit a lower profit performance and earn less than those with low illusion of control. There is also support for a link between illusion of control and poor risk management and analysis.”
Mark Fenton-O’Creevy et al – Trading on Illusions (2003)
* Professor Mark Fenton-O’Creevy is a member of Essentia’s Advisory Board.
Illusion of Validity
Describes the tendency to overrate our ability to make accurate predictions, especially when analyzing data that presents a consistent pattern or tells a coherent story.
Information Bias
The tendency to seek information even when it does not affect action. Better decisions can often be made with less information – more is not always better.
Loss Aversion
The tendency for people to prefer avoiding losses to acquiring gains. Associated with Kahneman and Tversky’s prospect theory, the behavioral model that shows how people decide between alternatives when the probability of outcomes is unknown.
“In human decision-making, losses loom larger than gains.”
Kahneman and Tversky – Prospect Theory: An Analysis of Decision under Risk (1979)
Mental Accounting Bias
Also known as the “two-pocket theory”, this is when we divide our money into separate categories according to subjective criteria, such as where the money came from, or what we intend to use it for. In reality, money is fungible and one dollar is worth as much as the next, whatever its source or purpose. This bias, and its impact on how rational we are in our spending and investment decisions, represents a “fungibility violation”.
“Mr. and Mrs. J have saved $15,000 toward their dream vacation home. They hope to buy the home in five years. The money earns 10% in a money market account. They just bought a new car for $11,000 which they financed with a three-year car loan at 15%.”
Richard H. Thaler – Mental Accounting and Consumer Choice (1985)
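The cost of the “two-pocket” choice in Thaler’s example can be put in numbers. This is a rough, illustrative calculation assuming simple one-year interest and ignoring taxes and the loan’s amortization schedule:

```python
# Thaler's example: $15,000 saved at 10% for a vacation home, while an
# $11,000 car is financed at 15%. Simplified to one year of simple interest.

savings, loan = 15_000.0, 11_000.0
savings_rate, loan_rate = 0.10, 0.15

# Two-pocket choice: keep the vacation fund intact, borrow for the car.
interest_earned = savings * savings_rate          # 1,500
interest_paid = loan * loan_rate                  # 1,650
net_two_pocket = interest_earned - interest_paid  # -150: a net yearly loss

# Fungible-money choice: pay cash for the car, save the remainder.
net_fungible = (savings - loan) * savings_rate    # +400

print(net_two_pocket, net_fungible)
```

Treating the vacation fund as untouchable costs the couple roughly $550 a year relative to simply paying cash, which is exactly the fungibility violation the bias describes.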
Observational Selection Bias
The effect of suddenly noticing something we didn’t notice much before and wrongly deducing from this that its frequency has increased.
Observer Expectancy Effect
When a researcher anticipates a certain result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it.
Optimism Bias
This is seen when we tend to overestimate the probability of positive outcomes and fail to acknowledge the potential for adverse consequences.
Outcome Bias
The tendency to judge a decision by its eventual outcome, rather than by the quality of the decision when it was made. This behavioral tendency leads us to de-emphasize the events preceding an investment outcome, whilst overemphasizing the outcome itself.
“The fact that something worked doesn’t mean it was the result of a correct decision, and the fact that something failed doesn’t mean the decision was wrong. This is at least as true in investing as it is in sports.”
Howard Marks – Inspiration from the World of Sports Memo (2015)
Overconfidence Bias
When confidence in our own judgements is greater than the objective accuracy of those judgements.
Pessimism Bias
The tendency to focus on what can go wrong and overestimate the likelihood of negative events.
Prospect Theory
This theory is concerned with how we make decisions when taking risk into consideration, and with the likely ‘prospects’ of a given decision or gamble.
The theory was first presented by Daniel Kahneman and Amos Tversky in their 1979 paper, Prospect Theory: An Analysis of Decision Under Risk, and then developed until 1992. Their work on the theory won Kahneman the Nobel Prize in Economic Sciences in 2002 (Tversky died in 1996).
The 1979 paper was a critique of expected utility theory and its largely rational model of individual decision-making. Kahneman and Tversky presented what they saw as a psychologically more realistic model in which, for financial decision-making, the sense of gain or loss relative to a reference point matters as much as the utility of the money itself.
Kahneman and Tversky observed a number of effects which, taken together, form prospect theory.
They found that decision-makers will predominantly go for the sure thing when choosing between a sure gain and a risky gain of equal or better expected value. This attraction to guaranteed outcomes – even when there is the possibility of a better result – is a function of something they called the certainty effect.
For example, most people prefer to win $100 with certainty, rather than entering a gamble whereby, with the toss of a coin, they can either win $150 or take home nothing.
Kahneman and Tversky also found distinct decision making behavior around losses. They called this effect loss aversion. According to this, losses have a greater emotional impact than a gain of the equivalent amount. In simple terms, we dislike losing more than we like winning. Indeed, they argued that the feeling of pain due to a loss is 2-2.5 times (on average) greater than the feeling of pleasure felt from an equivalent gain.
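This asymmetry can be sketched with the prospect-theory value function. The parameters below (curvature 0.88 and loss-aversion coefficient 2.25) are the median estimates from Tversky and Kahneman’s 1992 follow-up paper on cumulative prospect theory, not figures from this article:

```python
# Prospect-theory value function with Tversky & Kahneman's 1992 median
# parameter estimates: v(x) = x**0.88 for gains, -2.25 * (-x)**0.88 for losses.

ALPHA, BETA, LAM = 0.88, 0.88, 2.25

def value(x):
    """Subjective value of a gain or loss x relative to a reference point."""
    if x >= 0:
        return x ** ALPHA              # concave over gains
    return -LAM * (-x) ** BETA         # convex and steeper over losses

# Losing $100 hurts more than winning $100 pleases:
ratio = abs(value(-100)) / value(100)
print(round(ratio, 2))  # 2.25
```

The ratio of pain to pleasure for equal-sized outcomes is exactly the loss-aversion coefficient, in line with the 2–2.5 times figure quoted above.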
As a result of these findings, Kahneman and Tversky noted that, whilst people will tend to avoid risk when gains are at stake, they will not hesitate to take on risk if there is a chance of avoiding losses (even if this means rejecting a smaller, certain loss in favor of the chance of a larger one). They illustrated this with the following example:
You must choose between one of the two gambles:
Gamble A: A 100% chance of losing $3000.
Gamble B: An 80% chance of losing $4000, and a 20% chance of losing nothing.
Next, you must choose between:
Gamble C: A 100% chance of receiving $3000.
Gamble D: An 80% chance of receiving $4000, and a 20% chance of receiving nothing.
Kahneman and Tversky found that whilst 92% chose B, only 20% of people chose D (a prospect of similar risk or likelihood).
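As an illustrative sanity check (not part of the original paper), the expected values of the four gambles can be computed directly, which shows that the popular choices sit on opposite sides of the expected-value ranking:

```python
# Expected values of Kahneman and Tversky's four gambles.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

gamble_a = [(1.0, -3000)]              # sure loss
gamble_b = [(0.8, -4000), (0.2, 0)]    # risky loss
gamble_c = [(1.0, 3000)]               # sure gain
gamble_d = [(0.8, 4000), (0.2, 0)]     # risky gain

for name, g in [("A", gamble_a), ("B", gamble_b),
                ("C", gamble_c), ("D", gamble_d)]:
    print(name, expected_value(g))
# Most subjects chose B over A -- risk-seeking for losses despite B's worse
# expected value (-3200 vs. -3000) -- yet chose C over D, giving up the
# better expected value (+3200) for certainty. This mirror-image pattern
# is known as the reflection effect.
```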
“Prospect theory turned out to be the most significant work we ever did… We retained utility theory as a logic of rational choice but abandoned the idea that people are perfectly rational choosers.”
Daniel Kahneman – Thinking, Fast and Slow (2011)
Reactive Devaluation
When we reject proposals that are potentially favorable to us simply because they come from another party, opponent or rival.
Recency Bias
When people weigh recent events and observations more heavily than those in the past.
This is a version of the availability heuristic whereby we tend to base our thinking disproportionately on whatever comes most easily to mind. In an investment context, this can be dangerous because we are likely to lean more heavily on our experience of recent investment performance when considering future returns.
In an interesting article in the Wall Street Journal, The Financial Price of Forgetting Bad Times, Dr Shlomo Benartzi and Dr Alan Castel revealed that older people can display marked recency bias, with a focus on positive memories. This has potentially significant implications for the important investment decisions they make as they approach retirement.
“We look at the most recent evidence, take it too seriously, and expect that things will continue in that way.”
Dan Ariely – Predictably Irrational (2010)
Regret Aversion
The tendency to avoid making decisions that we fear we could later regret.
Risk Compensation
Suggests that we adjust our behavior according to our perception of the risk level, becoming less careful when we feel safer and more cautious when the perceived risk level increases.
Status Quo Bias
Evident when people resist change and prefer things to stay the same or stick with previous decisions.
Sunk Cost Effect
The tendency to throw good money after bad. It can lead us to continue investing in a project based on our earlier decisions rather than on its current objective merits, even when new evidence suggests that the original decision was probably wrong. Also called irrational escalation.