Foxy Predictions


This is part 3 of Glimpse the Future – a series about how to make good predictions.
If you haven’t yet, check out part 1 and part 2.
length ~10 min

The Greek poet Archilochus originally wrote, “the fox knows many things, but the hedgehog knows one big thing.”

Hedgehogs are those who believe in one grand vision or one big idea. In some situations, this is absolutely an admirable trait. Many startup founders are at least partially hedgie. They push forward into barren wastelands, struggle through the swamplands, plodding on and on until … more than 90% of them fail and fall into obscurity.

Philip Tetlock, the author of Superforecasting: The Art and Science of Prediction, said of hedgehogs, “They tended to have one big, beautiful idea that they loved to stretch, sometimes to the breaking point,” but also noted that this devotion to the One Big Idea enables them to be polished, articulate, and forceful in communicating that idea.

Hedgehogs make for excellent TV pundits and orators. Their opinions are clear, unambiguous, and neatly divide the world into black and white. They are confident, self-assured, and have no patience or tolerance for self-doubt. For most people, the cognitive cost of thinking critically and with nuance is too high. The simplified models of the world offered by hedgehogs are much more palatable. And so hedgehogs come to dominate our media and our mindshare.

Foxes, according to Tetlock, are quite the opposite: “self-critical, eclectic thinkers who were willing to update their beliefs when faced with contrary evidence,” “doubtful of grand schemes,” and rather modest about their abilities.

When it comes to making predictions in particular – which is the theme of this series – foxes handily outclass their hedgehog counterparts. The reason is simple – our world is not simple. Any single- or dual-factor model is statistically likely to be wrong. Simple models, it should be said, are very useful and good predictors within a small range. Once we stretch them “to the breaking point,” the error rate skyrockets, as in this nifty diagram I made.

fox_hedgehog_predictions.png
Figure 1: The gap between the two lines is the error.

Becoming Foxy

Foxy thinkers don’t have one big idea or a single-factor model to predict every situation. The legendary investor and multi-disciplinary-thinker-extraordinaire Charlie Munger said, “the first rule is that you’ve got to have multiple models because if you just have one or two that you’re using, the nature of human psychology is such that you’ll torture reality so that it fits your models, or at least you’ll think it does.”

Munger’s idea is to build “lattices of knowledge” across different fields and relate them to one another slowly over time, without holding any of our suppositions too tightly. It will be a constant struggle not to conflate the excitement of learning a new concept with evidence of that model’s absolute explanatory power. We must be diligent not to let any single particularly powerful or sexy-sounding idea stop us from exploring new models and new ideas.

As we build up mental models from areas such as statistics, machine learning, astrophysics, business, finance, philosophy, and much more, we realize that no one model consistently succeeds and outperforms all others.

We must remain critical, and even self-critical, relentlessly looking for mental biases, adapting to new ideas and new evidence.

There are two types of self-doubt: doubting yourself and doubting your predictions. The first is likely detrimental to our decision-making prowess, but the second is essential. When we start believing our own BS, we are in imminent danger from the Dunning-Kruger effect. The first step of becoming a good forecaster is the same as the first step of Thinking Slowly – build up a skeptical mind1.

Four Keys to Foxington

Hedgehogs are stubborn, seek order, and become highly specialized – and they’ve been shown to be worse than random at making predictions. How does the fox differ?

1. Think Probabilistically

It turns out that knowing the basic concepts of probability makes us severalfold more employable and valuable: it enables our thinking to better approximate reality, and it is essential to making predictions.

Thinking probabilistically means acknowledging the randomness in our universe. Randomness is everywhere – in our politics, our biosphere, our economies, and so on.

thinking_probalistically1.png
Figure 2 – Daily returns of the S&P 500 versus Gaussian distribution

Look at this chart of daily returns in the stock market over a 10-year period in blue, overlaid with the standard Gaussian distribution (or bell-curve). The Gaussian distribution is very important because it accurately approximates many things in nature. It makes the general claim of central tendency: that extreme events far from the average are far less likely than events that approximate the long-run average.

Note that the returns are almost perfectly distributed around 0%, meaning that on any given day, stocks are far more likely to barely move than to swing wildly. It also shows that returns are roughly as likely to be negative as positive, and that extremely positive and extremely negative days are both rare. In fact, stock market returns are more boring than the standard Gaussian distribution – they cluster more tightly around zero than a normalized bell curve would predict (though at the very extremes, real markets also produce slightly fatter tails than the Gaussian implies).
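The clustering the chart describes can be sketched with simulated data. The numbers below are illustrative stand-ins, not real S&P figures: a plain Gaussian is compared against a simple two-regime mixture (mostly calm days, occasional turbulent ones), counting how often each stays inside ±1%.

```python
import random

random.seed(42)
N = 2520  # roughly 10 years of trading days

# Illustrative stand-ins, not real S&P data: plain Gaussian "returns"
# versus a two-regime mixture (mostly calm days, occasional turbulent
# ones) that clusters more tightly around zero.
gauss = [random.gauss(0, 1.0) for _ in range(N)]
mix = [random.gauss(0, 0.6) if random.random() < 0.9 else random.gauss(0, 2.2)
       for _ in range(N)]

def share_within(xs, band=1.0):
    """Fraction of days whose move stays inside ±band percent."""
    return sum(abs(x) <= band for x in xs) / len(xs)

print(f"Gaussian days within ±1%:  {share_within(gauss):.0%}")
print(f"Clustered days within ±1%: {share_within(mix):.0%}")
```

The mixture keeps a much larger share of its days inside the ±1% band – the “more boring than Gaussian” shape described above.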

So if someone asked me to predict the return of the S&P 500 the next day, I can fairly confidently say that it will very likely be between -1% and +1%. Now that might be a boring prediction, but I’ll probably be right. If someone offered a game where I pay $50 every time the S&P moves more than 1% in a day, but I get $50 every time it moves less than that, I would take that deal in an instant.

The opposite approach is to predict that the S&P will return exactly 1.05% tomorrow. The chances of that happening are microscopic. A range of values will always be superior to a specific value simply because it covers more area under the curve.
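The area-under-the-curve argument is easy to make concrete. Assuming, purely for illustration, that daily returns follow a Normal(0%, 1%) distribution, a short sketch can compare the probability of the ±1% range against a pinpoint call of 1.05%:

```python
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for a normal distribution."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

sigma = 1.0  # assumed daily volatility, in percent

# Probability the return lands anywhere in the range [-1%, +1%]:
p_range = normal_cdf(1, 0, sigma) - normal_cdf(-1, 0, sigma)
print(f"P(-1% <= r <= +1%) = {p_range:.0%}")  # roughly 68%

# A pinpoint call like "exactly 1.05%" has zero width, so even a
# generous hair-thin band around it captures almost no area:
p_point = normal_cdf(1.055, 0, sigma) - normal_cdf(1.045, 0, sigma)
print(f"P(r within 1.045%..1.055%) = {p_point:.2%}")
```

Under this toy assumption the range prediction is right about 68% of the time, while the pinpoint call lands well under 1% of the time – which is also why the $50 game above is a good bet.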

Warren Buffett wrote that “I’d rather be approximately right than precisely wrong.”

Sometimes there is bias in the randomness. For example, the inflation rate is almost always positive (Japan being a notable exception). Therefore, the center of the curve showing the distribution of inflation wouldn’t be zero. It would be 2%, 3%, or more, depending on the country’s economic history.

Sometimes the variance is not purely random. Highly positive events might be more likely than highly negative events. For example, the annual return of the S&P 500 is far more likely to be positive than negative.

Probability can also change over time as we acquire more information. Nate Silver, who runs the blog fivethirtyeight.com, specializes in making predictions probabilistically. In fact, this four-point list is adapted from his book The Signal and the Noise: Why So Many Predictions Fail, but Some Don’t.

thinking_probabilistically2.png
Figure 3 – Trump vs Clinton’s chance of winning Ohio in 2016

The chart above shows what his model predicted to be Hillary Clinton’s and Donald Trump’s chances of winning Ohio in the 2016 presidential election. The inputs to his probability model were poll numbers, which swung back and forth after major events, such as the first presidential debate or James Comey’s revelation of FBI investigations into Hillary’s emails. As the data changed, so did the predicted probability of the final outcome. A foxy thinker has no problem believing in August that Trump only had a 25% chance of winning Ohio, and believing in November that his chances were 65% instead.
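One way to picture a probability shifting with new information is a simple Bayesian update. The numbers below are invented for illustration – this is not Silver’s actual model – but they show how a string of favorable polls can legitimately move a forecast from 25% to over 70%:

```python
def bayes_update(prior, p_if_true, p_if_false):
    """P(hypothesis | evidence) via Bayes' rule."""
    numer = prior * p_if_true
    return numer / (numer + (1 - prior) * p_if_false)

# Invented numbers: start in August at a 25% chance of winning Ohio.
p = 0.25
# Assume each favorable poll is twice as likely to appear (60% vs 30%)
# if the candidate is genuinely on track to win.
for poll in range(1, 4):
    p = bayes_update(p, 0.6, 0.3)
    print(f"after favorable poll {poll}: {p:.0%}")
```

Holding 25% in August and 70%+ in November isn’t inconsistency; it’s the same model digesting new evidence.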

2. Don’t Let Your Past Positions Anchor You

In the modern climate of thinking, changing our minds is somehow seen as a weakness. When politicians do it, we call it “waffling” and vote them out of office. So they stop waffling, and instead dig deeper and deeper into their initial positions. We keep electing hedgehogs into government simply because it’s easier to keep track of their positions and we get to save mental energy.

A foxy forecaster doesn’t get tied down by their previous opinions. As we can see from figure 3, we can make multiple predictions over time, and hold opposing positions as the facts change.

There is no real cost to changing our minds, only imagined ones.

One of my favorite quotes comes from the famed economist John Maynard Keynes: “When the facts change, I change my mind. What do you do, sir?” Both Nate Silver and John Maynard Keynes are foxy thinkers.

3. Wisdom of the Crowd

I previously wrote about how contrarianism isn’t just believing the opposite of everyone else2. Being right while everyone else is wrong is the fast lane to fame. However, be aware that most of the time, the consensus is correct.

So if we are going to go off the beaten path, we better have a damn good reason. Foxes don’t follow the crowd, but very often, their independent conclusion will coincide with the popular consensus. When they diverge, turning self-criticism to full-blast is strongly advisable. Go over each assumption, gather more data, and see if we still arrive at the same contradicting opinion. That may very well be the case, and foxes will then need the self-confidence to go against the wisdom of the crowd.
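Why is the consensus usually correct? When individual errors are independent and unbiased, averaging cancels them out. A minimal simulation, with made-up numbers for the classic guess-the-jellybeans setup, shows the crowd’s average landing far closer to the truth than a typical individual guess:

```python
import random
import statistics

random.seed(0)
truth = 100.0  # say, the true number of jellybeans in a jar

# 500 independent guessers: noisy but unbiased (errors don't correlate).
guesses = [random.gauss(truth, 25) for _ in range(500)]

crowd_error = abs(statistics.mean(guesses) - truth)
typical_error = statistics.mean(abs(g - truth) for g in guesses)

print(f"crowd average's error:    {crowd_error:.1f}")
print(f"typical individual error: {typical_error:.1f}")
```

The caveat: this only works when errors are independent. If everyone copies the same pundit, the crowd inherits the pundit’s bias – which is why a divergent conclusion deserves the full self-criticism treatment described above.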

4. No Silver Bullets

Lastly, it’s worth repeating that once we develop a model that seems to work well in certain contexts, we should be very careful not to over-apply it to every situation and scenario. Once we develop a useful model for prediction, be diligent to stay inside the “Everything’s Swell” green box of figure 1 and not wander far outside of it.

As seductive as “the One Idea to predict them all, One Idea to rule them all, and in the darkness, forecast them” sounds, that One Idea simply doesn’t exist. We should all strive to be like Charlie Munger, who recommends building up 80 or 90 different models of thought to help us make useful predictions.


Objectivity is hard because unbiased thinking is hard. Foxes remain self-critical, constantly assessing their predictions and refusing to be seduced by the One Big Idea.

Lastly, I want to acknowledge that stubbornness and specialization are very useful in other contexts of life – just not in making predictions. Combined with grit, they can lead to very successful careers and life journeys. Hedgieness and foxiness also exist on a spectrum.

When making predictions, we should definitely be more like the fox. But in other areas of life, we should be more like the hedgehog. Don’t let “foxy thinking” become the silver bullet applied to all areas of life.

Don’t forget to subscribe for new-post updates!


1 Newcula further reading – The Value of a Skeptical Mind

2 Newcula further reading – The Secret to Contrarianism
