Thinking Unbiased


This is part 2 of Glimpse the Future – a series about how to make good predictions.
Part 1 is here.
Reading time: ~18 minutes.

People are not accustomed to thinking hard, and are often content to trust a plausible judgment that comes to mind.

– Daniel Kahneman

I hesitate to include this as part of a series about making good forecasts. While overcoming bias is absolutely essential to building forecast models and making sound estimates, it's also exceedingly important to making sound judgments and decisions in general.

The impact of overcoming bias reaches much farther than simply helping us see the future. Being aware of, and effortfully removing, the biases in my decision-making software has yielded me a great return on investment.

In 1974, Amos Tversky and Daniel Kahneman published a groundbreaking paper1 about biases inherent in our thinking which distort our ability to think rationally and arrive at correct conclusions.

Kahneman continued his work on biases and errors in judgment. Although the field of study is clearly psychology, Kahneman's work is most prized in finance and economics, because there is much profit to be made when we understand the weaknesses in our own thinking and recognize the weaknesses in others'. His work eventually earned him the Nobel Prize in Economic Sciences, a rare instance of the prize being awarded to someone from outside the field.

Personally, 2015 proved to be an inflection point that accelerated my pursuit of self-improvement. Kahneman's book Thinking, Fast and Slow kicked off the year and played a vital part in my progression. The studies he conducted revealed dozens of biases which hinder our thinking, and I've classified the most impactful ones into three major categories: lazy thinking, self-esteem preservation, and inertial thinking. These categories are not mutually exclusive, and are often strongly related to one another.

Biases are Born of Heuristics

Heuristics are sets of rules or mental shortcuts we use to help us think, rationalize, and make decisions. They are what we put into place to reduce the effort in reaching conclusions and making judgments. Heuristics are not intrinsically bad; in fact they are necessary for us to be functional, thinking mammals.

Trust is an important heuristic we build. People we put into the trust category no longer require analysis like "are they trying to hurt me?", "how can this cause a bad outcome for me?", "am I putting myself in danger?", or "am I getting screwed?" every time we engage with them. It reduces the cognitive load and is essential for healthy, happy friendships and relationships. It also explains why we are blind to obvious betrayals from someone within our trust circle, and why acts of betrayal are among the most painful and anger-inducing experiences we can encounter.

Even though heuristics are necessary, they also introduce the nasty side-effect of bias leading to irrationality. So while we cannot get by without them, we have to be conscious of which heuristics we are allowing or introducing into our thinking, and be prepared to counteract the inevitable biases that result.

Lazy Thinking

In my experience, the most devastating biases result from lazy thinking heuristics. Lazy thinking can be useful in reducing mental fatigue. We can't afford to think about every decision we make in excruciating detail, such as "what color shirt should I wear today?", "how many creams do I want in my coffee?", or "what time should I leave for work?". Kahneman would say this class of heuristics produces cognitive ease.

Unfortunately, we apply the same heuristics we formed for trivial decisions to important, life-altering judgments as well.

1. Coherent Stories

In order to make sense of the world around us – all the stimuli and random occurrences – we weave stories of cause and effect and try to fit every event neatly into one narrative. The simpler the narrative, the more seductive it is.

An example of our strong preference for coherent stories and simple narratives is “We are experiencing economic decline due to Jews/Chinese/immigrants/the ‘others’.”

The economy is complex machinery. There are many moving parts, small events can have huge, unforeseeable consequences, and multiple factors converge creating unpredictable chaos, like throwing many stones into a pond and watching their ripples collide. And that’s before we even begin considering fair economic distribution.

Volumes of books, articles, and studies can be (and have been) written about this, and yet people typically choose a single-factor narrative and try their cognitive best to fit every new event into it, even if it means forcing a square peg into a triangular hole.

Another result of the coherent-stories heuristic is the conjunction fallacy. In one study, Tversky and Kahneman described a fictional woman, Linda, as follows:

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

There was a one-question survey:

Which is more probable?

  1. Linda is a bank teller.
  2. Linda is a bank teller and is active in the feminist movement.

The coherent story being presented is the stereotype of a feminist.

All feminist bank tellers are bank tellers, and all non-feminist bank tellers are bank tellers. So statement 1 encapsulates everyone in statement 2.

Adding an "and" (a conjunction) can only diminish the probability, but it strengthened the plausibility of the statement, and the study's participants were seduced by that apparent coherence: most chose option 2.
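
To see why option 2 can never be the more probable one, here is a minimal sketch in Python; the specific probabilities are invented purely for illustration, but the inequality holds no matter what numbers we plug in.

```python
# Illustrative numbers only: we don't actually know these probabilities.
p_bank_teller = 0.05            # P(Linda is a bank teller)
p_feminist_given_teller = 0.60  # P(active feminist, given she is a bank teller)

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_teller_and_feminist = p_bank_teller * p_feminist_given_teller

# A conjunction can never be more probable than either of its parts.
assert p_teller_and_feminist <= p_bank_teller
print(p_bank_teller, p_teller_and_feminist)  # 0.05 vs roughly 0.03
```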

2. Representativeness

The representativeness heuristic is the tendency to use a category as a stand-in for an individual case, simply because the individual case appears superficially similar to the category. This results in a whole slew of undesirable biases and erroneous beliefs.

A simple example is the belief that eating fatty foods makes us fat. The fat we are eating resembles the fat on our bodies, so it's easy to believe in causation, even though this conclusion entirely ignores our metabolic processes, hormones such as insulin, blood sugar, and genetics.

This type of categorization can also leave us vulnerable to racism: "all black people are similar in skin color to those participating in the gang violence in Chicago, and thus must also be violent."

3. Substitution

A close cousin of representativeness is substitution. This heuristic kicks in when we are faced with a computationally complex situation and we swap in a known, simpler situation instead, trying to solve it by analogy. We might also be missing a critical piece of information needed to make a decision and substitute a known piece of information in its place.

An example of this occurs in sports, where talent evaluators often project an unknown player's future production by substituting an existing player's production, because the two players share superficial attributes like height, speed, or jumping ability.

Too often I've heard players like Darko Milicic (who?) compared to Dirk Nowitzki (because he's white?) and Hakeem Olajuwon (because he's tall?). For the non-sports fans: Nowitzki and Olajuwon are considered greats of the sport, and Milicic, well, let's just say the predictions made via the substitution heuristic were very wrong.

Substitution is also frequently used in making investment decisions, even by professionals. Rather than assessing a company on its own merits, how many analyses are done by drawing parallels between a startup and Uber, Apple, Facebook, or Google? How often do you hear a company described as the [blank] of [blank], as in "Flipkart is the Amazon of India"? As evidenced by the lackluster performance of most sell-side analysts, far too often.

Startups often use this heuristic against investors. During the dot-com bubble, it was common to substitute clicks, "eyeballs," and other metrics for revenue and profit. Pitch decks are filled with references to already successful startups in unrelated fields as a way of explaining a new business.

4. Availability

Availability bias describes the tendency for us to over- or underestimate the likelihood of an event based on how easy it is to recall that type of event in the past, or envision it happening in the future. In other words, how available it is in our cognition.

One study of couples found that because each partner did chores regularly, each estimated that they personally complete about 70% of the household chores; the estimates sum to well over 100%. Another study found that people are more likely to get a checkup after someone close to them is diagnosed with cancer. A third found that hearing news of muggings increases a person's estimate of the total number of muggings occurring in their city.

Plane crashes are often sensationalized and reported on for extended periods of time. The two tragedies involving Malaysia Airlines in the recent past are an example of this. By making those events easy to recall, the coverage can lead people to judge air travel as more dangerous than driving, even though air travel in aggregate (or even Malaysia Airlines on its own) is much safer than getting into your car and driving to work.

This heuristic is of particular value to politicians and marketers alike since the consequence is that hearing about something repeatedly is likely to influence our decision making. By making certain events more available for recall, media can – intentionally or not – alter our judgment.

Thus, lazy thinking heuristics leave us particularly vulnerable to manipulation. Mass media and politicians often use coherent stories, stereotypes, narratives, false equivalence, and repetition to manipulate our judgments.

Need for Self-Esteem

A second category of biases results from our individual need to feel valuable, important, and superior within our peer group. There's nothing wrong with this phenomenon. We are products of evolution, both biological and social, and the need for self-esteem is natural and essential.

While we shouldn’t let anyone try to convince us that selfishness, self-importance, or self-preservation are vices, we should also be cognizant of the role they play in distorting our decision making and ability to forecast.

5. Self-Attribution

Evidence of this bias is plentiful. When we earn success and accolades, the tendency is to give ourselves the credit. We are more likely to see a causal link between our actions and the resulting success, and to downplay the role of luck.

When someone else achieves the same level of success, we are much more willing to attribute it to chance. Amateur poker players coming off a successful night are much more likely to remember the one brilliant play they made than the several times chance rescued them. The losers from the same table are much more likely to remember the hands where they were statistically ahead but lost to chance, rather than the numerous mistakes they made.

Combine this with the next bias and the trouble compounds.

6. Hindsight Illusion

Either you or someone you know must be like me – a vocal, highly opinionated individual. I make plenty of predictions in social settings – eating with my friends, or at a house party. I have, at various times, incorrectly predicted a housing crash in Vancouver, a market crash in 2015, Netflix stock to plummet, and so on.

There are neither social nor financial costs to my private musings. I’ve also made correct predictions, like the future business prospects of the Royal Bank of Canada and Disney. But what was my degree of certainty at the time I made those predictions? Was I more certain about a Vancouver housing crash and less so about Disney? Even if that were the case, since the Disney prediction was correct, I retroactively assign a great degree of certainty to that prediction. It’s “obvious in hindsight.”

Because of our need for self-esteem, we are likely to bundle the hindsight illusion with self-attribution and fool ourselves into thinking we are much better at estimating and prognosticating than we actually are. This robs us of the opportunity to examine our failures. Incorrect predictions are failures, but so are correct predictions made without enough certainty to act on.

In 2007 and 2008, many market watchers warned of dangers in the housing market.

One of my favorite movies of 2015, The Big Short (with a star-studded cast of Steve Carell, Ryan Gosling, Brad Pitt, and Christian Bale!), tells the story of a group of individuals who predicted the financial crisis both correctly and with enough conviction to act, and thus profited massively.

Hedge fund manager John Paulson quietly made about $4 billion in 2007 betting against the housing market2 through credit default swaps. Some may question the morality and legality of his methods, but without a doubt he was certain enough of his prediction to risk his entire fund while others just talked. Even more impressively, he proved it was no fluke by betting that US banks would recover while European banks faltered, scoring a second payday of roughly $5 billion in 2010.

These are the outliers. Nearly every other pundit, expert, and layman who saw something wrong with the housing market in 2007 and 2008, and has since enjoyed years of haughty righteousness and "I told you so," is suffering from the hindsight illusion. The famed "Dr. Doom," economist Nouriel Roubini, whom many credit with correctly predicting the financial crisis, didn't manage much profit from his foresight. There are only two explanations: 1) he doesn't care about money, which is unlikely since he runs his own firm hoping to make a profit, or 2) he didn't have as much certainty as he would have us believe in hindsight.

7. Optimism Bias

Optimism in this case refers to our willingness to ignore critical unknown information in favor of believing in ourselves. But aren't we taught to always believe in ourselves? I'm not talking about grit; I'm talking about failing to take base rates, other people's failures, and uncertainties into account when making decisions, and choosing instead to focus on our own strengths.

A startup founder might ignore the 90%+ base failure rate, repeat the mistakes of those who came before, and dive headlong into a business believing that whatever they happen to be skilled at is the most important ingredient of success in that particular industry.
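
To make the base-rate point concrete, here is a minimal sketch; the numbers are assumptions chosen for illustration, not statistics about any real founder.

```python
# Assumed for illustration: a 90% base failure rate, and a founder whose
# particular skill (generously) doubles their odds of success.
base_success = 0.10

odds = base_success / (1 - base_success)      # base odds of success: 1 to 9
boosted_odds = 2 * odds                       # doubled odds: 2 to 9
boosted_success = boosted_odds / (1 + boosted_odds)

print(f"{boosted_success:.0%} chance of success")  # ~18%, i.e. still ~82% failure
```

Even under that generous assumption, the base rate dominates the outcome; ignoring it is exactly where the optimism bias does its damage.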

8. Loss Aversion

The inverse of self-attribution, hindsight illusion, and optimism bias is loss aversion. In many cases, loss aversion makes a lot of sense. When it comes to finances, for example, losing $10,000 may do more damage than gaining $12,000 would do good. So even if each outcome had a 50% probability, it could make sense to decline the gamble despite its positive expected value of $1,000.

However, loss aversion here refers to the psychological cost of losing versus the psychological gain of winning. Even when a win and a loss are rationally equal, people prefer avoiding the loss to securing the win. This leads to irrational decisions driven by how a situation is framed: we are more likely to choose something described as having an 80% likelihood of a positive outcome than the same thing described as having a 20% likelihood of a negative outcome, even though the two are logically equivalent, and the reason is to preserve self-esteem.

This could lead us to pass up opportunities, not because we cannot afford the financial implications of losing, but rather because we overestimate the emotional cost of losing.
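
As a minimal sketch of that distinction, the snippet below scores the 50/50 gamble above both in dollars and in "felt" value, assuming (purely for illustration) that a loss stings twice as much as an equal gain feels good.

```python
LOSS_AVERSION = 2.0  # assumption: losses feel twice as bad as equal gains feel good

def felt_value(outcome: float) -> float:
    """Psychological value of a monetary outcome under simple loss aversion."""
    return outcome if outcome >= 0 else LOSS_AVERSION * outcome

# The 50/50 gamble from above: gain $12,000 or lose $10,000.
expected_dollars = 0.5 * 12_000 + 0.5 * (-10_000)                        # +$1,000
expected_feeling = 0.5 * felt_value(12_000) + 0.5 * felt_value(-10_000)  # -$4,000

print(expected_dollars, expected_feeling)  # financially positive, emotionally negative
```

The gamble is attractive in dollar terms but repellent in felt terms, which is why how the same odds are framed can flip our decision.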

Inertial Forces

The last two biases explain why people are generally unable or unwilling to change their minds. They connect to both self-esteem and lazy thinking: self-esteem because we enjoy being right and dislike the psychological cost of admitting we were wrong; lazy thinking because we don't want to bear the cognitive cost of reevaluating a belief.

The combination of the two spirals further and contributes heavily to tribalism. Not only are we predisposed to tribalism by our evolutionary past, but tribalism offers both the cognitive ease of lazy thinking and a safer path to self-esteem by surrounding us with those who agree with us. That's why our natural inclination is often to offload our judgments and reasoning to a particular tribe, whether a religion, a political affiliation, a culture, or a common cause like veganism or opposition to nuclear proliferation.

Sadly, this is the exact opposite of critical thinking.

9. Confirmation Bias

Once we subscribe to a particular narrative, or make a particular judgment, we begin to filter new incoming information. Even when confirming and contradicting observations arrive at the same rate, we are likely to ignore the contradicting observations entirely while accepting the confirming ones to bolster how good we feel about our original position.

The amount of new information contradicting our original position must be overwhelming before we are willing to change our minds. Oftentimes, as more contradictory information arrives, it becomes cognitively cheaper to actively seek out confirming, comforting facts (or even non-facts) than to change our mind. The current state of machine-learned content curation has the negative side effect of helping us build echo chambers and entrench ourselves in our positions in the face of all contradiction.

10. Anchoring Effect

The anchoring effect makes matters even worse. Kahneman and others have repeatedly shown our brain's preference for cognitive ease through its tendency to anchor on the first number or suggestion it hears. For instance, if we are asked to guess Gandhi's age at death after hearing the number 10 in an unrelated context, we are likely to underestimate; the reverse is true if the first number had been 65.

This is also a negotiation tactic favored by pros. They may open with an unreasonably high or low number in order to anchor our perception. For us, the negotiation begins after that first number; for the pro, it hasn't really begun at all. They've simply primed us to agree to a worse deal.

Imagine being successfully anchored to an initial position subconsciously or semi-consciously. Confirmation bias then works against us, and our own biases reinforce the initial judgment. No one needs to do any further convincing; we will actively defend the initial position, no matter how irrational.

How do We Break Free?

At this point, we may be realizing just how many heuristics we’ve unknowingly deployed into our cognitive process, and thus how difficult it is to make rational decisions, judgments, and forecasts.

Thankfully, we are not the first generation to come up against these struggles. Many powerful thinkers have felt the affliction of heuristic biases and have been fighting back, and we can borrow from their wisdom.

Here are three strategies we can deploy today in order to fight back. Fair warning: there are no simple hacks, only the effortful formation of new ways of thinking.

1. Reason Less From Analogy, More From Evidence

This first strategy takes aim at lazy thinking.

Make a list of all the most coherent stories you hold dear. My list includes opinions like “evolution is reality,” “democracy is good,” and “free trade and free markets are the best economic solutions.” Listing these out doesn’t mean I’m suspicious of their truth. Quite the opposite. Since these are some of my strongest opinions, I should be able to easily defend them. Test them out by asking questions like “how do I know for sure?” “what evidence is this opinion based on?” “am I sure this is causal?” and “why must it be this way?”

For narratives we hold dear, coming up with a few why’s should be easy. But we shouldn’t stop there. Now that we’ve listed out facts and figures, we should go check if they are indeed true. Since we’ve disassembled the rationale for our belief, we can more objectively assess each one. Are we substituting simple for complex? Are we using a category to represent an individual based on apparent similarities?

Once we've done this exercise, we can apply the same process to future judgments, opinions, and decisions as often as possible. It will force us into what Kahneman calls "slow thinking,"3 which reduces the likelihood of bias.

The imperative is to hold our opinions carefully, and only as tightly as the evidence allows.

2. Argue Strongly for the Other Side

Charlie Munger, vice chairman of Berkshire Hathaway and one of the greatest multi-disciplinary thinkers we have the luxury of learning from, says, "I never allow myself to have an opinion on anything that I don't know the other side's argument better than they do."

Since we have the unconscious propensity to confirm our own biases and protect our self-esteem, we can combat this by presupposing the other side is correct, and argue for them as strongly as possible. This removes the emotional pain of being wrong, since we have now taken both sides of the argument.

Note that I’m not talking about empathy, but rather learning the rationale of our ideological opponents thoroughly, turning over the same evidence they weighed, and dissecting why they may have arrived at the opposite conclusion as us.

This raises the bar for holding an opinion very high, and any position for which we cannot argue strongly from the other side must be held lightly.

3. Be Willing to Change if Evidence Dictates

While this seems obvious when we hear it, very few of us actually do it. In the current climate of extreme tribalism, it’s actually often considered a weakness to change positions. In other words, the cost to self-esteem is very high. People use words like “flip-flopping” to describe someone who changed their mind about something.

Though it's easier said than done, we should eschew these silly labels and be willing to change our minds in light of new evidence. Renowned economist John Maynard Keynes put it best: "When the facts change, I change my mind. What do you do, sir?"


Lazy thinking, self-esteem preservation, and inertia all come naturally to us, and in order to think better we must actively and effortfully combat these biases.

Hold opinions loosely, be willing to argue from the other side, and change our position when the evidence demands it.

To sum it up in the words of Charlie Munger,

The ability to destroy your ideas rapidly instead of slowly when the occasion is right is one of the most valuable things.

You have to work hard on it. Ask yourself what are the arguments on the other side.

It’s bad to have an opinion you’re proud of if you can’t state the arguments for the other side better than your opponents. This is a great mental discipline.

Don’t forget to subscribe, share, or leave a comment!


1Tversky and Kahneman – Judgment under Uncertainty: Heuristics and Biases

2The Wall Street Journal – Trader Made Billions on Subprime

3Daniel Kahneman – Thinking, Fast and Slow