My finances, my projects, my life
December 22, 2024

Investments: don’t confuse cause and effect

Are you planning to go it alone in the investment world? If so, there’s one key statistical rule you need to know. It will help you become the worldly-wise investor who always takes the time to assess the quality of the data presented and knows how to check its sources with intellectual rigour.

In the myLIFE series on behavioural finance, we often try to alert you to the various forms of cognitive bias that can cloud your judgement and lead you to make the wrong investment decisions. In light of these biases, we regularly highlight just how useful professional guidance can be. But there’s nothing to stop you from going it alone.

If this is your chosen path, you’re sure to be on the lookout for reliable indicators on which to base your decision-making. Seeking to assess the potential performance of an investment means engaging in a process of factual analysis, based on figures and, most likely, statistics.

Looking for indicators to guide your decisions is definitely a good move, so long as they are based on reliable and relevant data. Just as a recipe that looks good on paper will lead to a bad dish if you use poor-quality ingredients, good statistical analysis cannot produce reliable results if it’s fed inaccurate or falsified data.

Statistical analysis cannot produce reliable results if it draws on inaccurate or falsified data.

“Garbage in, garbage out!”

There’s a lot of truth in the phrase “garbage in, garbage out”. Feed bad data into a calculation system and you’ll never end up with reliable indicators. If there is any “noise” affecting the quality of the data, the lessons you draw from your analysis are almost certain to be less than crystal clear.

Let’s imagine an investor sitting down to look for financial information to guide his investment decisions, as he does every morning. He visits all of his favourite media outlets, reads his emails and checks his professional and personal social media feeds. As he reads, he finds out that scientists are suggesting there could be a link between eating a particular food and a lower incidence of a given disease. His curiosity is piqued, but he’s not about to waste time. “How can I use this information to invest?” he muses. At this stage, there is no proof of the link in question, and in any case the details of the study are not reported.

A short time later, he reads the latest post by an influential blogger who shares an eye-catching graph: over the last decade, the performance of a stock market index he can invest in has followed an astonishingly similar path to banana production in Brazil. Our investor thinks he’s onto a winner. Why do any more research when the facts are as clear as day?

Do you see the link between these two examples? The two situations are very similar when you stop to think about it. In both instances, the investor would do well to set his reading preferences aside and take the time to dig deeper to understand what could be hidden behind the information. In both cases there may well be a correlation between the two variables, but that doesn’t necessarily mean that there’s a causal link. And in the world of statistics, this distinction between correlation and causality changes everything.

Unlike a causal relationship, a correlation is simply a historical snapshot of how two variables have moved together, and the two things measured may not actually be linked at all. There’s nothing to guarantee that the relationship will hold true in future. In investment, there are bad actors out there who like nothing better than unearthing correlations over statistically insignificant periods to cloud investors’ judgement.
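To make this concrete, here’s a minimal simulation sketch in Python (using numpy; the 60-day window and the 0.7 threshold are arbitrary choices for illustration, not figures from any study). It generates pairs of completely independent random “price” paths and counts how often a short observation window still produces a seemingly impressive correlation, which is exactly the kind of chart a bad actor might present.

```python
# Sketch: independent random walks often look "correlated" over short windows.
import numpy as np

rng = np.random.default_rng(seed=42)

def random_walk(n_steps):
    """Cumulative sum of independent random shocks, i.e. a pure-chance 'price' path."""
    return np.cumsum(rng.normal(size=n_steps))

n_pairs = 10_000
window = 60          # a short, "insignificant" observation period (e.g. 60 trading days)
high_corr = 0

for _ in range(n_pairs):
    a = random_walk(window)
    b = random_walk(window)           # generated completely independently of a
    corr = np.corrcoef(a, b)[0, 1]    # Pearson correlation of the two paths
    if abs(corr) > 0.7:               # would look like a "strong" relationship on a chart
        high_corr += 1

print(f"Share of independent pairs with |correlation| > 0.7: {high_corr / n_pairs:.1%}")
```

Even though the two paths share no cause whatsoever, a non-negligible share of pairs will show a correlation strong enough to impress on a graph. The shorter the window, the easier it is to find one.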

What’s the moral of the story? When you hear about a promising investment strategy, always make sure that it is based on an actual relationship of cause and effect rather than mere correlation.

Remember the golden rule of statistics: correlation is not causality.

Correlation is not causality

Never forget the golden rule of statistical analysis: correlation is not the same as causality. Just because two variables move in almost exactly the same way over time, that doesn’t mean that one change is causing the other.

Before you make an investment decision, always make sure that the information you’re shown doesn’t rest on what are sometimes referred to as “spurious correlations”.

In statistics, a spurious correlation is a link between two variables that appears causal but isn’t. This false correlation is often caused by a third element called a “confounding factor” that isn’t apparent when you assess the information presented.

Our brains love simple narratives and delight in drawing easy-to-understand associations, even where they don’t exist. Jumping to conclusions is what our brains do best, which is why it’s so easy to believe, when we look at a graph, that a movement in variable A is linked to a movement in variable B, or vice versa. If we dig into the statistics a little more, though, it may become clear that such a movement on a graph is a simple coincidence rather than cause and effect.

To help make this a little clearer, here’s an example from everyday life that’s credible, recurrent and easy to debunk. It’s a classic summer trope to show the correlation, generally quite a high one, between the number of ice creams sold and the death rate. But we all know that ice creams aren’t deadly. If it’s a particularly hot summer, it’s only logical that everyone will want to eat more ice cream and that people in a fragile state of health will sadly stand a greater chance of passing away in the heat.

Here, the confounding factor is undoubtedly the heat: it drives both variables at once. By identifying the number of hot days and the temperature beyond which both ice cream sales and deaths increase, we could show that each of the two variables is causally linked to the heat rather than to the other.
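To see how a confounding factor manufactures a correlation of this kind, here’s a minimal simulation sketch in Python (using numpy; every figure and coefficient is invented purely for illustration, not taken from any real dataset). Temperature drives both ice cream sales and deaths, so the two series end up strongly correlated; once the effect of temperature is removed from each series, the leftover correlation is close to zero, revealing that neither causes the other.

```python
# Sketch: a confounder (temperature) creates a spurious correlation between two series.
import numpy as np

rng = np.random.default_rng(seed=1)

n_days = 365
temperature = 15 + 10 * np.sin(np.linspace(0, 2 * np.pi, n_days)) + rng.normal(0, 2, n_days)

# Neither variable depends on the other: both respond to temperature plus their own noise.
ice_cream_sales = 50 + 8 * temperature + rng.normal(0, 20, n_days)
deaths = 5 + 0.3 * temperature + rng.normal(0, 2, n_days)

raw_corr = np.corrcoef(ice_cream_sales, deaths)[0, 1]
print(f"Raw correlation (sales vs deaths): {raw_corr:.2f}")   # typically high

# Control for the confounder: remove the linear effect of temperature from each series
# and correlate the leftovers. A direct sales -> deaths link would survive this step.
def residuals(y, x):
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

partial_corr = np.corrcoef(residuals(ice_cream_sales, temperature),
                           residuals(deaths, temperature))[0, 1]
print(f"Correlation after controlling for temperature: {partial_corr:.2f}")  # close to zero
```

The same logic applies to the banana-production chart: before acting on an impressive overlap between two curves, ask what third factor could be driving both.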

Consider yourself warned! If you would like to find out more about spurious correlations by looking at a few examples, you might be interested in our next article: “Assessing performance: why not everything is as it seems!”