Enhancing key climate datasets and what it tells us about temperature targets

Every 0.1 degree matters as the world attempts to keep within a 1.5-degree rise

The Paris Agreement is frustratingly vague and imprecise as to what constitutes pre-industrial

Research groups around the world provide updates of global surface temperatures that are critical to evaluating climate change trends.

They differ in their starting year, in the corrections made for changing measurement techniques, and in how they deal with gaps between measurement locations. Crucially, all show similar levels of warming over the past 150 years – between 1 and 1.2 degrees above pre-industrial levels – and every 0.1 degree matters.

A major update to a key global temperature dataset moves it from one of the slowest-warming records to the fastest. The Met Office and the University of East Anglia (UEA) have made a series of key improvements to their long-running global temperature dataset, known as HadCRUT, in the first update in eight years.

The revision shows faster warming since the 1970s, with temperatures up to 0.2 degrees higher in recent years, due in part to exceptionally rapid Arctic warming.

It is one of the world’s leading datasets, stretching back to 1850. It brings together measurements of near-surface air temperature made at weather stations around the world with measurements of the temperature of the topmost layer of the ocean: sea-surface temperature.

Updates include: adjustments to handle biases between different ways of measuring sea-surface temperature; a significant increase in the number of weather stations used over land; and applying statistical methods to extend coverage in areas that are still “data-sparse”, such as the Arctic. This provides more accurate estimates of global, hemispheric and regional temperature changes.

HadCRUT5 indicates the average global temperature for 2010-2018 was 1.07 degrees warmer than average conditions in 1850-1900 – 0.16 degrees more than in the previous version of the dataset. It is now more consistent with other datasets during recent decades and shows slightly more warming than most of them over the full period since 1850. Most significantly, the update moves HadCRUT from one of the slowest-warming records to the fastest.

Each of the main global temperature datasets is compiled separately using different methods, but all are based on a finite number of global temperature observations, explains Dr Colin Morice of the Met Office. The approaches result in very similar estimates of overall warming, which "provides scientific confidence in the changes seen and also reinforces the fact that the world has warmed considerably since the mid-century".

Warming

UEA's Prof Tim Osborn underlines: "Climate change has not suddenly got worse: instead, we have an improved estimate of how much warming has taken place."

The main contributor to warming over the past 170 years is human influence on the climate through increasing greenhouse gases in the atmosphere. The effects of human-induced climate change are not limited to surface temperature. Warming of the climate system is seen across a range of indicators, spanning land, atmosphere, oceans and ice, that together build a holistic picture of climate disruption far beyond what would be expected from natural variability.

An updated version of the land-only global temperature dataset, CRUTEM5, has also been issued. “Climate science predicts that the land areas of the world should be warming faster than the ocean areas and our temperature datasets confirm this,” says Prof Osborn.

The dataset is a clear improvement on the prior version, addressing two key issues, according to leading Irish climate scientist Prof Peter Thorne of Maynooth University. "Firstly, modern marine temperature biases arising from the move from mostly ship-based to mostly buoy-based measurements are now accounted for."

Secondly, the impacts of not taking direct measurements everywhere are much better handled, through improved estimation via interpolation over reasonable distances. "We know this is reasonable because, at its very simplest, if November was warm in Dublin, it was likely warm in Cork, Belfast, London and Paris too," he explained.
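The intuition behind that kind of spatial infilling can be sketched with a toy example. This is purely illustrative – the station coordinates and anomaly values below are hypothetical, and HadCRUT5's actual statistical methods are far more sophisticated than this simple inverse-distance weighting – but it shows how nearby observations can be combined to estimate conditions at an unobserved point:

```python
import math

def idw_estimate(stations, target, power=2):
    """Estimate the temperature anomaly at an unobserved point as an
    inverse-distance-weighted average of nearby station readings."""
    num = den = 0.0
    for (x, y), anomaly in stations:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0:
            return anomaly  # target coincides with a station
        w = 1.0 / d ** power  # closer stations get more weight
        num += w * anomaly
        den += w
    return num / den

# Hypothetical November anomalies (degrees C) at three nearby stations
stations = [((0.0, 0.0), 1.2),   # e.g. Dublin
            ((0.5, -1.0), 1.1),  # e.g. Cork
            ((1.0, 1.5), 1.3)]   # e.g. Belfast

# An unobserved point nearby receives a value between its neighbours
estimate = idw_estimate(stations, (0.4, 0.2))
print(round(estimate, 2))
```

The estimate necessarily falls within the range of the surrounding stations, which is the essence of Prof Thorne's "warm in Dublin, likely warm in Cork" point: over reasonable distances, neighbouring observations constrain what an unmeasured location can plausibly be.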

These improvements lead to an increased estimate of warming to date and bring us considerably closer to 1.5 degrees, a key Paris Agreement limit. “By our best estimate across a range of estimates produced by a range of groups of researchers now averaged over the past 5 years we probably stand somewhere close to 1.2 degrees relative to 1850-1900 and possibly a bit higher still relative to a true pre-industrial climate state.”

The revised datasets provide further evidence that the Paris target will be breached sooner, and directly affect the viability of avoiding 1.5 degrees. But on top of that, Prof Thorne notes: “I think it’s fair to say the possible policy implications are vexed and tricky.”

It gets vexed because the Paris Agreement was predicated on an implicit assumption about how much the globe had warmed by that point, an assumption never made explicit in the text. So it has become a thorny issue not just for eminent climate scientists but also for the best environmental lawyers.

The Paris Agreement is frustratingly vague and imprecise as to what constitutes pre-industrial and what the estimated warming to date used to set the aspirational targets of 1.5 and 2 degrees had been, Prof Thorne points out. “It would have been far more sensible to define a target relative to 2015 or say 2006-2015 because that is a well observed period and baselines the target unambiguously to a modern period reference, which would be much less uncertain and much less likely to subsequently change substantively.”

Kevin O'Sullivan

Kevin O'Sullivan is Environment and Science Editor and former editor of The Irish Times