
Investment Bias Part 1: Curing Investment Bias

April 29, 2014 17 Comments

Finally! After pouring my heart and soul into this project for more than two months, it’s here! This post is the main introduction to my eGuide (the Practical Guide To Curing Investment Bias), which is available for FREE via subscription to ContrarianVille. Sign up and a free copy will immediately be sent to your inbox. If you have already subscribed to ContrarianVille and would like a copy, please re-subscribe to the new email list (you can unsubscribe from the old one), send me an email, or contact me via Twitter @ContrarianVille.



Instead of being sensible and identifying ourselves with our brains, we identify ourselves with a very small operation of the brain – which is the faculty of conscious attention.
-Alan Watts

To demonstrate just how minor a role conscious processing plays in the brain’s overall activity, consider this fact: according to physiologists, the brain receives, processes, and filters some 11 million bits of sensory information every second, yet the conscious mind can process only about 40 bits per second. The “data” we are consciously aware of therefore represents only about 0.00036% of the information processing that goes on inside the brain.
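The arithmetic behind that percentage is easy to verify (the 11 million and 40 bits-per-second figures are the estimates cited above):

```python
# Rough bandwidth comparison: total sensory processing vs. conscious processing.
total_bits_per_sec = 11_000_000   # estimated sensory input the brain processes
conscious_bits_per_sec = 40       # estimated conscious processing capacity

fraction = conscious_bits_per_sec / total_bits_per_sec
print(f"Conscious share: {fraction:.7%}")  # about 0.00036%
```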

Daniel Kahneman likens our perception of our own conscious awareness to a supporting character in a movie who thinks that she’s the hero. Croskerry et al. point out that cognitive psychologists estimate that the average person makes roughly 95% of their day-to-day decisions based on unconscious intuitions rather than deliberate, conscious problem solving.

The vast majority of people are completely unaware of how much of their lives is dictated by unconscious processing.

Since the Enlightenment we have viewed people as highly rational, with some small exceptions. Thanks to advances in evolutionary biology, cognitive neuroscience, and history, we now know that this view is patently false: people continually make substantial, systematic deviations from rationality.

An old quote (whose author I can’t quite recall) is very apt here:

Humans are not the descendants of fallen angels, but of risen apes.

Contrary to what Descartes thought, “we” (meaning our conscious awareness) are not a “disembodied intellect” at all. We are not separate from our brains, our brains are not separate from our bodies, and our bodies are not separate from nature. There is no evidence that humans are the only members of the Animal Kingdom born without an array of hard-wired instincts (i.e., we are not born with a “blank slate” – Steven Pinker has a great book on this), or that we do not develop conditioned, instinctual responses over our lifetimes.

Because so much of our thought process is hidden from us, built on vast networks of associated memories, and driven by a search for connections rather than by logic, systematic errors in reasoning occur whenever the unconscious mind plays a role in decision making (which, to at least some degree, it does at all times).

According to Croskerry et al., the first stages of “cognitive debiasing” involve moving from ignorance to awareness: awareness not only of the existence of the various cognitive biases, but also of the extent to which they influence our daily decisions.

It is therefore my goal in this introductory post to cover these first stages and convince you that human beings are not merely imperfectly rational in a few small, anomalous ways – we make large, persistent, and systematic deviations from rationality on a continual basis.

The methods and techniques for overcoming many forms of cognitive bias will then be detailed in the Practical Guide To Curing Investment Bias.

According to Daniel Kahneman, people:

-overweight the present over the future;
-overweight losses over gains;
-overweight the amount over the probability (as with lottery tickets);
-overweight information that is currently available over information we do not have;
-overweight emotional factors over dry factors;
-overweight vivid examples over statistical information;
-underweight base rates, because of a tendency to think about and focus on individual cases.
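The last item in that list, base-rate neglect, is easiest to appreciate through a worked example. The numbers below are hypothetical (not from Kahneman): a test that is 99% accurate for a condition with a 1-in-1,000 base rate still yields mostly false positives, because the base rate dominates.

```python
# Hypothetical base-rate problem, solved with Bayes' rule.
base_rate = 0.001       # P(condition)
sensitivity = 0.99      # P(test positive | condition)
false_positive = 0.01   # P(test positive | no condition)

p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
p_condition_given_positive = sensitivity * base_rate / p_positive
print(f"P(condition | positive test) = {p_condition_given_positive:.1%}")  # about 9%
```

Intuition fixates on the vivid “99% accurate” figure; the base rate drags the true answer down to roughly 9%.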

And Charlie Munger adds that people have an enormous tendency to under-recognize the power of incentives in decision making.

Many of the cognitive biases that plague decision makers would likely have been highly adaptive – that is, useful – in the ancestral environment in which we evolved.

For example, animals that are (relatively speaking) far more fearful of losses than they are desirous of gains would be more likely to avoid taking big risks with their lives, and would likely have survived to pass on their genes at greater rates. This explains how the well-documented phenomenon known as loss aversion could have evolved in our early ancestors – and studies show that loss aversion plays an enormous role in economic decision making. The average loss aversion ratio ranges from about 1.5x to 2.5x (meaning that most people are roughly twice as fearful of losses as they are hopeful of equivalent gains). To risk $100 on a coin flip, most people would require roughly a $200 pay-off (“Heads I win $200, tails I lose $100”) – contrary to rational decision making, which dictates that people ought to be willing to take any bet with a positive expected value (at 50-50 odds, any reward in excess of $100 for a $100 risk).
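The coin-flip numbers above can be sketched as a quick calculation; the loss-aversion weight of 2.0 used here is simply the midpoint of the 1.5x–2.5x range cited above:

```python
# Rational expected value vs. the loss-averse "felt" value of the coin flip.
p_win = 0.5
win, loss = 200, 100
loss_aversion = 2.0  # losses weighted roughly twice as heavily as gains

expected_value = p_win * win - (1 - p_win) * loss               # +50: rationally acceptable
felt_value = p_win * win - (1 - p_win) * loss_aversion * loss   # 0: the loss-averse tipping point
print(expected_value, felt_value)
```

Any payoff below $200 leaves the felt value negative at a loss-aversion ratio of 2, which is why most people refuse bets that expected value says they should take.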

What loss aversion means is that there is a kink in people’s value function at the origin, where the origin represents an arbitrary reference point distinguishing losses from gains. Anything below the reference point is a loss and anything above it is a gain. Almost everybody knows this intuitively; here’s a Kahneman and Tversky example:

If today Jack and Jill each have a wealth of $5 million, but yesterday Jack had only $1 million whereas Jill had $9 million, would they be equally happy (i.e., have the same utility)?

Clearly Jack will be thrilled that his net worth quintupled overnight from $1 million (his reference point) to $5 million, and clearly Jill will be crushed and mortified, as she has lost almost half her wealth (her reference point being $9 million). This phenomenon shouldn’t exist according to standard economics – all that ought to matter are final states (such as the final state of utility).
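The Jack-and-Jill asymmetry falls directly out of the prospect-theory value function. This sketch uses the commonly cited Tversky–Kahneman parameter estimates (curvature of about 0.88, loss-aversion coefficient of about 2.25); those numbers are assumptions here, not figures from this post:

```python
def pt_value(outcome, reference, alpha=0.88, lam=2.25):
    """Prospect-theory value of `outcome` relative to a reference point.

    Gains are valued concavely; losses are scaled up by lam, producing
    the kink at the origin described above.
    """
    x = outcome - reference
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# Same $5M end state, different reference points.
jack = pt_value(5_000_000, reference=1_000_000)  # a $4M gain: positive value
jill = pt_value(5_000_000, reference=9_000_000)  # a $4M loss: negative, larger in magnitude
print(jack > 0, jill < 0, abs(jill) > jack)
```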

And that’s just the problem: the word “ought”. Economists largely consider the way people ought to behave, but they ought to consider the way people actually behave as well.

Charlie Munger put it best:

How could economics not be behavioral? If it isn’t behavioral, what the hell is it? And I think it’s fairly clear that all reality has to respect all other reality. If you come to inconsistencies, they have to be resolved, and so if there’s anything valid in psychology, economics has to recognize it, and vice versa.

Daniel Kahneman was shocked when he found out that the worldview of the economists just down the hall from him was that people are highly rational, self-interested, and that their tastes do not change – all known to be demonstrably false assumptions in the psychological literature for quite some time. Richard Thaler coined the term “Econs” to poke fun at the seemingly different organism that inhabited the realm of economic theory compared with how humans actually behave in the real world.

Almost all of the following examples are directly from Kahneman and Tversky.

For instance, Econs should be indifferent between alternative ways of describing the same thing (termed “equivalent frames”), but humans are not. Kahneman and Tversky showed that people view cold-cut meat labeled “20% fat free” far more favourably than the same meat labeled “80% fat” – even though the two labels convey exactly the same information from different perspectives, or frames.

Econs are supposed to view the future as a multitude of possible events, each occurring with some probability, but Kahneman, Shiller, and other researchers show that the human mind is wired to collapse the probability of a future event into one of three rough categories: won’t happen, might happen, or will happen.

Econs are guided by the long-term prospects of wealth and maximizing utility, but humans are guided by the immediate emotional impact of gains and losses. -Kahneman

Modern Portfolio Theory, which is based on the standard economic view – and therefore on the assumption that people behave as Econs – accepts that there are emotional investors, but asserts that rational investors swiftly arbitrage away any price distortions, and thus that rational investors dominate stock market prices. But Thomas Howard asks us to consider the ramifications if this picture were backwards. What if emotional, heavily biased investors predominantly drive stock market fluctuations? How would that change our understanding of markets and portfolio management?

As Nobel prize laureate Robert Shiller’s research concluded:

After all the efforts to defend the efficient markets theory there is still every reason to think that, while markets are not totally crazy, they contain quite substantial noise, so substantial that it dominates the movements in the aggregate market. The efficient markets model, for the aggregate stock market, has still never been supported by any study effectively linking stock market fluctuations with subsequent fundamentals.

So I hope I’ve convinced you that humans are far from perfectly rational, and I hope you’re motivated to tackle the Practical Guide To Curing Investment Bias, which explores in depth a wide variety of specific cognitive biases that affect investment decision making and suggests tailor-made methods for overcoming them.

Don’t be, as Munger says, a one-legged man (or woman) in an ass-kicking contest. Subscribe and get a FREE copy of the Practical Guide To Curing Investment Bias today – arm yourself against your emotions and protect your investment returns from cognitive errors.

All citations for information and quotes in this article can be found in the eGuide.

Follow me on Twitter: @ContrarianVille

Comments (17)


  1. Andrew says:

    I’ve heard a few times now that loss aversion is irrational. Whoever came up with that idea is apparently unfamiliar with the fact that death is permanent; once you have $0 worth of resources (including non-monetary resources, of course), you are dead and cannot place any more bets. In this situation, a rational actor will demand that expected gains be higher than expected losses. The larger the proportion of one’s total resources that a lost bet will wipe out, the larger the winnings that a rational actor should demand to compensate.
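(Editorial aside: Andrew’s ruin argument can be illustrated with a toy simulation – hypothetical numbers throughout. Each flip below has positive expected value, yet a bettor who stakes most of their wealth every round ends up far poorer than a cautious one, because wealth compounds multiplicatively.)

```python
import random
import statistics

def final_wealth(fraction, rounds=200, rng=None):
    """Bet `fraction` of current wealth each round on a 50/50 flip that
    pays 1.2x the stake on a win (so each flip has positive expected value)."""
    rng = rng or random.Random()
    wealth = 1.0
    for _ in range(rounds):
        stake = fraction * wealth
        wealth += 1.2 * stake if rng.random() < 0.5 else -stake
    return wealth

rng = random.Random(42)
cautious = statistics.median(final_wealth(0.05, rng=rng) for _ in range(2000))
reckless = statistics.median(final_wealth(0.95, rng=rng) for _ in range(2000))
print(f"median wealth betting 5% per flip:  {cautious:.2f}")   # grows above 1.0
print(f"median wealth betting 95% per flip: {reckless:.2e}")   # near-total ruin
```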

    • Mike Mask says:

      Andrew, that’s an interesting point that you raise. And in general I do agree with your reasoning, which is why I suspect that there were tremendous selective pressures favouring the development of loss averse tendencies in our early ancestors (though not for the prevention of financial death, but actual death).

      The question to ask based on your line of reasoning is this: do wealthy individuals display similar levels of loss aversion over smaller sums of money as middle class folks?

      Of course loss aversion makes sense if one is poor. But what if you are dealing with bets that are small relative to your total wealth?

      If a multimillionaire refuses a heads-or-tails flip where if he wins he gets $15 and if he loses he loses only $10, then it is hard to describe that as rational behaviour. In general though, even if loss aversion is rational in the sense that you are describing, it is not rational according to the dominant expected value view that forms a foundation for rational decision making under uncertain conditions. So while loss aversion can be described as rational under certain conditions and in different domains, it certainly pokes holes in the standard economic view of how humans ought to behave under uncertainty.

      Great comment!

      • Andrew says:

        Testing the effect across social classes and income ranges as you suggest would make for a fascinating and useful experiment. How much of the apparent size of the effect is based on experiments with penurious college students, who may be slightly hungry at the time of the experiment and thinking about whether they’ll have enough cash to afford pizza tonight?

        It would also be interesting to test on people who were raised rich but ended up poor, and vice versa, to see if childhood experience has an outsize effect.

        • Mike Mask says:

          Exactly what I was thinking: many of these experiments are done with poor college kids and that undoubtedly biases the results.

          My suspicion is that loss aversion in the financial realm is an evolutionary holdover, much like our sweet tooth, which (very likely) developed because sugar was incredibly rare in the ancestral environment and signalled fruit. It served a useful purpose back then, even though in modern times – surrounded by chocolate, fast food, and high-fructose corn syrup – it has led to an obesity epidemic. Just as the sweet tooth is built into human biology and leads to poor results in the modern environment, I believe loss aversion is a domain-general psychological feature that evolved to reduce the chance that we take risks with our lives.

          So because of that, I would suspect that loss aversion still affects wealthy people – though likely to a lesser degree than, say, a poor college student.

          However, as I am typing this I did a scholar search and found a fascinating paper that you may be interested in reading:

          It seems that the authors are suggesting that loss aversion may have developed for specific contexts, and may not be domain general after all. I’ll have to look into it more…

          • Mike Mask says:

            I’m still reading the article but here is a quote from it: “We predict that loss aversion will be exacerbated in some contexts but may be erased—or even reversed—in others.”

            Seems like it would be right up your alley.

          • Mike Mask says:

            “Throughout most of human history, resource losses could have resulted in starvation and death and were thus a more important consideration than gaining an extra bit of food. Even though modern, urban environments are often different from ancestral environments, our deep-seated tendency would not have been eradicated”

          • Mike Mask says:

            I apologize for so many comments, but this article is absolutely fascinating.

            “These findings suggest that, in order to make predictions about when people are likely to fall prey to biases such as loss aversion, it is important to consider what motivational state they are in at the time.”

          • Andrew says:

            Looks interesting. I had a feeling that I couldn’t have been the first person to raise these objections. :-)

            And it looks like Kahneman et al. were responding to the deficient take on “rationality” used by economists in the ’60s and ’70s to build their theoretical sand castles on top of homo economicus. In that context, they can be forgiven for calling loss aversion “irrational”.

      • Andrew says:

        One more idea: When it comes to multimillionaires (or anyone at the top of the food chain), they often have the power to force their real-life bets to be structured so that the expected value is positive for them and negative for the less powerful actor. (And/or they have more freedom to refuse less favourable bets, which has an effect in the same direction.)

        A powerful actor who ignored this advantage would end up less well off than a powerful actor who consistently leveraged it. This seems again to make the behaviour rational, this time for the rich instead of the poor.

        • Mike Mask says:

          I think what you are saying here is that even though the net expected value to society is negative, the extremely wealthy can structure bets such that the expected value to *them* is incredibly positive while the expected value to society in general, or to the less powerful, is negative.

          This I would agree with, but this is in line with standard economic reasoning and isn’t really about loss aversion, unless I’m missing something.

          People aren’t loss averse when it comes to how well-off society is, but rather to how well-off *they* are.

          What you are saying reminds me of too-big-to-fail bankers with no long-term skin in the game and no consequences for poor long-term results, who take massively leveraged bets to increase the odds of obtaining similarly massive bonuses and then jump ship before the chickens come home to roost. It is an absolute tragedy for society that our system is set up in a way that actually encourages and incentivizes this appalling behaviour.

          • Andrew says:

            (I think that) I’m saying that you wouldn’t be able to tell the difference between power leveraging and loss aversion in an experiment. Even in the case of the poor students, they have some leverage: If they don’t participate, the researcher doesn’t get a paper, and if the researcher doesn’t get a paper, maybe they don’t get tenure. How do you tell if someone is trying to avoid loss or is trying to leverage an advantage?

          • Andrew says:

            “…which is an absolute tragedy for society that our system is set up in such a way to actually encourage and incentivize this appalling behaviour.”

            The profit motive, like any incentive system, is gameable.

          • Andrew says:

            From the article: “…and the bias would have continued to be adaptive to the extent that there were even occasional periods when people must live near the margin of survival.”

            “Poverty and Famines” by Amartya Sen and “Mother Nature” by Sarah Hrdy (not a misspelling) are great works of scholarship on the social/economic and evolutionary forces, respectively, that shape responses under marginal conditions. Both worth a read if you haven’t read them already.

  2. Andrew says:

    Sorry for yet another response… what you’re saying has really got me thinking. :-)

    A bet always has a fixed cost, at the very least in time and in enforcement of outcomes, which is another reason for a rational actor to accept only bets with positive expected outcomes. For a rational actor to participate, the expected payoff should at least match the expected fixed cost.

    Poor and rich:

    A poor person may not have the social capital to get other people to enforce the outcome of a bet if the person offering it decides to rip them off. In order to make a rational decision, then, they have to factor in the fixed cost of enforcing the outcome on their own before taking the bet. (Example: If you were to invest in a shady Chinese reverse-merger company, it would be hugely expensive for you to do anything about the CEO walking away with the money. If the head of the People’s Liberation Army of China were to make the same bet, they would merely have to mention “so-and-so ripped me off” in order for their money to come back to them after some no-doubt-completely-coincidental accident happened to the CEO.)

    A rich person’s time is often very valuable. What’s their fixed opportunity cost for sitting down with you for a $10 bet? Unless the expected value of the bet is enough to make up for the fact that they could’ve been using their time much more lucratively, there’s no reason for them to listen to you or take your bet.
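(Editorial aside: both of Andrew’s adjustments amount to subtracting a fixed participation cost – enforcement, opportunity cost of time – from the raw expected value. A toy illustration with made-up numbers, reusing the $15/$10 flip from earlier in the thread:)

```python
def net_expected_value(p_win, payoff, stake, fixed_cost):
    """Raw expected value of a bet minus the fixed cost of taking part
    (time, enforcement of the outcome, etc.)."""
    return p_win * payoff - (1 - p_win) * stake - fixed_cost

# The $15-vs-$10 coin flip: positive EV in isolation...
print(net_expected_value(p_win=0.5, payoff=15, stake=10, fixed_cost=0))    # 2.5
# ...but negative for someone whose 15 minutes are worth $125.
print(net_expected_value(p_win=0.5, payoff=15, stake=10, fixed_cost=125))  # -122.5
```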

    • Mike Mask says:

      For your last point – I agree that a rich person’s time is very valuable and they likely wouldn’t (and shouldn’t) sign up for such psychological studies involving low betting amounts.

      I think it was either Dan Ariely or Daniel Kahneman who attended a large conference of financial professionals – lots of wealthy hedge fund managers and the like – none of whom were aware that he was about to ask them to take a small amount of time to partake in a psychology experiment.

      Because of social standards, the vast majority of them complied and participated. That is how you would have to approach such a study, in my view. Economic standards are very different from social standards, and you must appeal to the wealthy person’s social standards if you want any chance of getting them to participate, because it would obviously not be in their economic interest to do so.

      And don’t worry at all about the questions and comments – I really appreciate them. I have a good number of readers, but not many actually pose questions and really discuss the issues and topics I write about, so thank you! After all, the biggest reason I started this blog was to aid my own thinking: reading and then writing about what you read leads to a ton of thinking and organizing of one’s thoughts, and discussing the topics with others who have outside viewpoints adds tremendously to grasping these issues.

      Your point on China – any potential costs of enforcing the outcome of a bet would definitely need to be taken into account. Real-world situations are often much messier than the lab, so to really dig into how loss aversion plays out under varying environmental conditions you would have to sterilize the conditions of externalities such as doubt over being paid if you win.

      • Andrew says:

        So what happened when they tried the bet with wealthy fund managers?

        • Mike Mask says:

          I will respond to your other comments here as the threads apparently reached their nesting limit (I’ll look into how to increase it).

          I believe the research they conducted on the wealthy fund managers was of a different sort – my point was that if you want to conduct this sort of study on wealthy folks you need to (a) surprise them and (b) appeal to their social standards rather than their economic standards if you want any hope of doing the desired research.

          I’ll check out those books (articles?) you mentioned when I get a chance – trying to really dig into Alchemy of Finance now, which is great but requires a lot of thinking and re-reading for me!

          - I am not against the profit motive per se – but I am against setting up institutions which are absolutely vital to our entire economy in which short term profits trump all and where there are no real downside risks for the folks making the gambles. I think forcing CEOs of big banks to invest the majority of their familial assets in whichever bank they are running, and keep their assets invested there for 10-20 years after they leave would lead to far less reckless risk taking. This is just one idea though. No system should be set up such that the gamble is “heads I win – tails the taxpayer loses”.

          - I see your point with the students potentially trying to take advantage of the situation. The thing is, in many of these studies the students are required (as a part of their psychology courses) to participate in a certain number of studies. This dramatically reduces any supposed leverage they would have. Also, there are many, many different ways of testing loss aversion – I believe Kahneman did experiments where he simply asked people to imagine that they were betting and the results were the same.
