Financial predictions lose their edge after Brexit and Trump

However, market prices still ‘trounce’ polls and pundits in forecasting events

Forecasts in financial markets are heavy on caveats, and surprises can happen

It is often taken for granted in financial circles that if you want to ascertain the likelihood of an event, you're better off consulting market prices and betting markets than heeding the advice of the commentariat or relying on simple opinion polls. However, prediction markets were slow to accept that Donald Trump's rise was anything other than temporary, and they got it badly wrong on Brexit. So is the so-called wisdom of crowds overstated?

Surprises happen in financial markets, but traders can hardly plead that the Brexit result was an unforeseeable black swan event. For weeks, opinion polls indicated the result was too close to call, but betting markets always viewed a pro-Brexit vote as a relatively slim possibility.

The odds on a Brexit vote did tighten in early June, when a raft of polls put the Leave camp in the lead, but even then gamblers remained confident that Britain would ultimately vote to remain in the EU. On the day of the referendum itself, most polls indicated Remain led by between one and two percentage points; although that was well within the margin of error, bookmakers estimated the odds of a Remain vote to be around 90 per cent.
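Percentages like that come from translating bookmakers' prices into probabilities. The minimal sketch below shows that conversion using hypothetical decimal odds – not any bookmaker's actual referendum-day quotes – and strips out the built-in margin, or "overround", before reading off the implied chances.

```python
# Illustrative only: converting bookmaker decimal odds into implied probabilities.
# The odds below are hypothetical, not actual referendum-day prices.

def implied_probabilities(decimal_odds):
    """Convert decimal odds to probabilities, stripping out the bookmaker's margin."""
    raw = {outcome: 1.0 / odds for outcome, odds in decimal_odds.items()}
    overround = sum(raw.values())   # exceeds 1.0 because of the bookmaker's margin
    return {outcome: p / overround for outcome, p in raw.items()}

odds = {"Remain": 1.07, "Leave": 9.0}   # hypothetical decimal odds
print(implied_probabilities(odds))
# roughly {'Remain': 0.89, 'Leave': 0.11}
```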

Global currency markets, too, saw the referendum as a one-way bet. The consensus among economists and investors was that sterling would fall to about $1.30-$1.35 in the event of a Leave vote, but that a Remain vote would see the currency trade around the $1.50 mark.

For most of the week leading up to the vote, sterling was trading at $1.47 or $1.48; shortly after the polls closed on the day of the referendum, sterling had already hit $1.50.
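That "one-way bet" reading can be made concrete with a crude back-of-envelope calculation: treat the spot rate as a probability-weighted average of the two consensus scenario prices cited above. The sketch below is purely illustrative – it ignores risk premia and the full range of possible outcomes – but it shows how little chance the currency market was assigning to a Leave vote.

```python
# Back-of-envelope sketch: the Leave probability implied by sterling's spot rate,
# assuming the two consensus scenario prices cited in the article and ignoring
# risk premia. Purely illustrative, not how FX desks actually price event risk.

remain_price = 1.50    # consensus sterling level on a Remain vote
leave_price = 1.325    # midpoint of the $1.30-$1.35 Leave scenario
spot = 1.475           # roughly where sterling traded in the week before the vote

# If spot = p_leave * leave_price + (1 - p_leave) * remain_price, then:
p_leave = (remain_price - spot) / (remain_price - leave_price)
print(f"implied probability of Leave: {p_leave:.0%}")   # ~14%
```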

As David Rothschild of predictions website PredictWise noted, "financial markets went all in" that Brexit would fail.

Super-forecasters

Predictions experts such as Rothschild were similarly wrongfooted. PredictWise estimated the odds of Brexit to be about 25 per cent, as did the Number Cruncher Politics Brexit Probability Index compiled by UK polling analyst Matt Singh, while prediction market Almanis estimated the odds of a Remain vote to be about 80 per cent. Good Judgment Inc, a panel of so-called "super-forecasters" known for their high accuracy levels, estimated the probability of a Leave vote to be about 24 per cent.

Prediction markets were similarly slow to accept that Donald Trump was going to become the Republican nominee for the US presidency, even though he consistently led in opinion polls within weeks of declaring his candidacy in June 2015.

With both Trump and Brexit, you would have made money by simply believing the polls rather than assuming market prices reflected some kind of special insight not available to pollsters.

That should not be the case. Those betting on the market, as well as predictions experts like David Rothschild, Matt Singh and Nate Silver of the FiveThirtyEight website, have access to polls as well as to other sources of relevant information. Polls can be misleading for all kinds of reasons; for example, Democratic and Republican presidential candidates tend to get a sizable but short-lived bounce in polls following their party conventions in the months leading up to the election. By adjusting polls for such biases, analysts should have an inherent advantage over pollsters.

Additionally, anyone with access to crucial private information may well be tempted to profit from his or her knowledge, which is why inside information is often reflected in market prices long before it becomes public.

What went wrong?

So what went wrong? In the case of Trump, Nate Silver admitted in May that he had fallen prey to “a lot of the same biases as the pundits I usually criticise”. Although forecasters could have been forgiven for being initially sceptical of a candidate as unorthodox as Trump, Silver said he had anchored on his initial forecast, been too slow in updating his forecast as new data rolled in, selectively interpreted the evidence and engaged “in some lazy reasoning”.

Similar criticisms could be made of prediction markets' handling of the Brexit referendum. Nicholas Colas, a market strategist at Convergex, said the result "gives real pause as to whether markets can forecast important political outcomes, particularly where one of the outcomes is not held to be politically correct".

The financial community was largely unanimous in seeing a vote for Brexit as a foolish act of self-sabotage; asset owners, said Colas, “do not understand what those people who do not have wealth think about the world”.

Predictions expert Philip Tetlock, the brains behind Good Judgment's aforementioned super-forecasting panel, has made the same point, saying market professionals and forecasters "were socially far removed from modal Brexit voters".

Other factors may also have been at play. Forecasters must strike a balance between learning too much from history and learning too little, said Tetlock; too much weight may have been placed on past independence referendums such as in Scotland and in Quebec, where voters edged towards the status quo in the final days of the campaigns.

Influenced result?

Ironically, there is also the possibility that betting markets actually influenced the result. Post-referendum polling carried out by PredictWise on June 28th showed a jump in support for Remain. That swing was not because a large proportion of Leave voters had changed their minds; it was because non-voters regretted not casting their vote. Many non-voters may well have assumed their vote was not needed, as the consensus was that Remain would prevail.

The possibility that prediction markets could influence election results has been noted before. A 2013 study of the 2012 US presidential election noted that the (now defunct) predictions market InTrade may have been manipulated by a trader making big bets on Republican nominee Mitt Romney. Although Romney was the underdog, this trader bet millions of dollars that he would win – in all, about one-third of the money bet on Romney over the last weeks of the campaign came from that one individual, resulting in InTrade odds giving Romney a much better chance of victory than other bookmakers did.

Trader A, as the researchers dubbed him, may have been “attempting to manipulate beliefs about the odds of victory in an attempt to boost fundraising, campaign morale and turnout.” It is “worth knowing”, they concluded, “that a highly visible market that drove many a media narrative could be manipulated at a cost less than that of a prime time television commercial”.

Prices beat polls

It would be a mistake, however, to conclude that investors should ignore prediction markets. Most academic literature has found prediction market prices are in fact much more accurate than simple poll data in forecasting elections.

For example, one 2001 study that examined 964 polls over five US presidential elections found market prices were closer to the end result in 74 per cent of cases. When forecasting more than 100 days in advance, market prices beat poll data on every occasion. Another paper coauthored by US economist Justin Wolfers came to the same conclusion, with market prices outperforming "both professional forecasters and polls in a variety of statistical tests".

Prediction market track records may have taken a hit recently, but they "still trounce chance", as Philip Tetlock tweeted in the aftermath of the Brexit result. In contrast, Tetlock's 2005 book, Expert Political Judgment, showed historical forecasts from most political "experts" were "roughly as accurate as a dart-throwing chimpanzee".

Similarly, there is also a danger in “overcorrecting” after unexpected events, as Nate Silver put it. Silver’s main models for predicting US presidential primaries and caucuses in 2016 have correctly predicted the winner around 90 per cent of the time; while there will always be occasional surprises, they have occurred “about as often as they’re supposed to but not more often”.

It is not hindsight to say betting markets underestimated the odds of Brexit: it is baffling that the odds of a Remain vote were as high as 90 per cent on the day of the referendum, considering polls consistently showed it was a close race.

However, while the Brexit shock has highlighted the frailties both of financial markets and of statistical models, it is worth remembering that the alternative for investors – relying on “professional forecasters, polls, pundits, or a combination of the three”, to borrow from the Justin Wolfers study – is an altogether less enticing prospect.