Why do we bother with stock market forecasts?

Research shows people discover the value of humility after recognising the limits of their understanding

“Market forecasts are particularly tricky. No one can see the future – the world is inherently uncertain and surprising things will happen.” Photograph: Shannon Stapleton/Reuters

“Forecasting: the attempt to predict the unknowable by measuring the irrelevant; this task employs most people on Wall Street.”

The words of Jason Zweig, author of The Devil's Financial Dictionary, are particularly apt at this time of year. We hear a lot from financial forecasters every January, as strategists prognosticate on what's in store for markets in the year ahead, even though decades of research confirm the prediction game is a pretty fruitless one.

As far back as 1933, US economist Alfred Cowles concluded even the most successful market forecasters did “little, if any, better than what might be expected to result from pure chance”. Cowles conducted a larger follow-up study 11 years later; the results were no different. Since then, studies have investigated the accuracy of forecasts from market analysts, investment newsletter writers, financial journalists, and various other investment experts. Suffice to say, today’s forecasters are no more accurate than their predecessors in the 1930s.

The stock market is like a beauty contest where you make money not by selecting which face you think is the prettiest, but by guessing who others will think the prettiest

This problem is not confined to the financial domain. Predictions expert Philip Tetlock’s famous study of political forecasts over a 20-year period found the average expert is “roughly as accurate as a dart-throwing chimpanzee”. Countless studies in other fields have reached the same conclusion.


Market beauty contest

Market forecasts are particularly tricky. No one can see the future – the world is inherently uncertain and surprising things will happen. Even if you did know what was going to happen, however, you might not know how markets would respond.

Last month, financial blogger Urban Carmel asked his Twitter followers to predict whether the Federal Reserve would raise interest rates in December and how the market would respond. Twenty-three per cent said the Fed would hike and the S&P 500 would rally; 20 per cent said the Fed would hike but the S&P would drop; 34 per cent said the Fed would pause and the S&P 500 would rally; 23 per cent said the Fed would pause and the S&P would drop.

In other words, not only did respondents not know what would happen, they had no idea how markets would respond. As economist John Maynard Keynes once observed, the stock market is like a beauty contest where you make money not by selecting which face you think is the prettiest, but by guessing who others will think the prettiest – not an easy task.

The same point is made by former Motley Fool columnist Morgan Housel, now a partner at the New York-based Collaborative Fund. “The most important thing to know to accurately forecast future stock prices is what mood investors will be in in the future,” writes Housel. “Will people be optimistic, and willing to pay a high price for stocks?” Or will they be “bummed out” and unwilling to do so? “You have to know that. It’s the most important variable when predicting future stock returns. And it’s unknowable.”

So why do we bother with the forecasting game? Market strategists would say there is a demand for price targets and the like and that they have little choice but to cater to this demand. As for those forecasters who make especially eye-catching predictions, they know inaccurate forecasts will quickly be forgotten, while the occasionally correct one can be milked for years.

People are over-confident and overestimate their understanding of financial markets. In fact, we overestimate our understanding of all kinds of things

Mathematician John Allen Paulos calls this the Jeane Dixon effect, named after the celebrity astrologer who became famous after apparently predicting president John F Kennedy's death. Dixon, who advised Ronald and Nancy Reagan, also predicted that a third world war would begin in 1958, that a cure for cancer would be found in 1967, and that there would be world peace in 2000, but those forecasts were overshadowed by her prediction of a presidential assassination.

Seer-sucker theory

This propensity to remember successful forecasts and discount failures points to our symbiotic relationship with forecasters, writes Forewarned author Paul Goodwin, a University of Bath academic and fellow of the International Institute of Forecasters. We feel uncertainty like a pain, says Goodwin, who points to research showing belief in experts' forecasts can persist even when the forecasts being offered are "manifestly useless".

Goodwin points to what marketing professor Scott Armstrong dubs the seer-sucker theory. If things go wrong, says Armstrong, you can avoid responsibility and blame the forecaster. As a result, “no matter how much evidence exists that seers do not exist, suckers will pay for the existence of seers”. This same theory was advanced by Alfred Cowles.

Asked in later life why his research went largely ignored, he replied: “Even if I did my negative surveys every five years, or others continued them when I’m gone, it wouldn’t matter. People are still going to subscribe to these services. They want to believe that somebody really knows. A world in which nobody really knows can be frightening.”

There’s another explanation as to why forecasters keep forecasting – people are over-confident and overestimate their understanding of financial markets. In fact, we overestimate our understanding of all kinds of things.

In an influential 2002 study conducted by psychologists Leonid Rozenblit and Frank Keil, people were asked to rate their understanding of the workings of mundane things such as a toilet, a zip, a refrigerator and so on. Almost everyone believed they had a good understanding. However, when asked to explain how they worked, people tended to fail miserably.

Successful forecasters think in terms of probabilities, not certainties. They focus on what they don't know as much as on what they do know

People think they understand things with “far greater precision, coherence, and depth than they really do”, noted Rozenblit and Keil, who describe this tendency as an “illusion of explanatory depth”. Similar research by University of Liverpool psychologist Rebecca Lawson found people made frequent mistakes when asked to draw a bicycle, such as believing the chain went around the front wheel as well as the back wheel. Contrary to what people thought, many had “virtually no knowledge of how bicycles function”. We think we know how things work – toilets, fridges, bicycles, economies, stock markets – but our understanding is not nearly as comprehensive as we imagine.

Bright side

There is a bright side, however. Research shows people discover the value of humility after becoming aware of their limited understanding and tend to adopt more moderate, open-minded positions.

Similarly, new research by Philip Tetlock shows there is a good side to forecasting. "People often express political opinions in starkly dichotomous terms, such as 'Trump will either trigger a ruinous trade war or save US factory workers from disaster'," writes Tetlock.

His earlier research showed that successful forecasting is possible, and that a small minority of “super-forecasters” prosper by taking a scientific, probabilistic approach. In Tetlock’s new paper, ordinary participants took part in a multi-month forecasting tournament where they were asked to translate their beliefs into probability judgments and track their accuracy over time. By the end, the tournament had induced an “epistemic humility”: polarisation declined and people became more moderate in their views.
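
The mechanics of tracking accuracy are worth a brief aside. One standard way to score probability judgments is the Brier score, which averages the squared gap between each forecast and what actually happened. The sketch below is a minimal illustration of that idea only – it is not a description of the exact scoring used in Tetlock’s tournaments, and the forecasts in it are invented for the example.

```python
# Minimal sketch: scoring probability forecasts with a Brier score.
# Illustrative only - not the exact metric or data from Tetlock's tournaments.

def brier_score(forecasts, outcomes):
    """Average squared error between probabilities (0-1) and outcomes (0 or 1).
    Lower is better: 0.0 is perfect, 0.25 is what always saying '50%' earns."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# A hedged forecaster ("60% chance the Fed hikes") versus an all-or-nothing one.
outcomes = [1, 0, 1, 1, 0]           # what actually happened
hedged   = [0.6, 0.3, 0.7, 0.6, 0.4] # probabilistic judgments
certain  = [1.0, 1.0, 1.0, 1.0, 0.0] # stark, fully confident calls

print(brier_score(hedged, outcomes))   # 0.132
print(brier_score(certain, outcomes))  # 0.2 - one confident miss costs more
                                       # than all five hedged errors combined
```

The point of scoring forecasts this way is that misplaced certainty is punished heavily, which is one reason tournament participants drift towards more moderate, probabilistic views.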

Successful forecasters think in terms of probabilities, not certainties. They focus on what they don’t know as much as on what they do know. They update or change their forecasts rather than becoming wedded to them. Doing otherwise is dangerous.

As decision scientist Baruch Fischhoff once cautioned: "When both forecaster and client exaggerate the quality of forecasts, the client will often win the race to the poorhouse."