The baseline problem

A baseline is simply a period of successive years whose average temperature is subtracted from a time series to produce temperature “anomalies”.

One normally associates anomalies with weather station data, where the baseline is the set of 12 monthly seasonal averages for each station. What is less well known is that climate model projections also need a baseline, and that the end result depends on both the choice and the length of that baseline period. Subtle changes in this choice can transform the apparent agreement between models and data. We have already met this effect once before when reviewing Zeke's wonder plot. The question one might ask is: why would climate models need any normalising at all?
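To make this concrete, here is a minimal sketch (not the code behind any of the plots here) of how a station anomaly series is built, assuming a pandas Series of monthly mean temperatures indexed by date: compute the 12 monthly means over the chosen baseline period, then subtract them month by month.

```python
# A minimal sketch: monthly anomalies for one station relative to a chosen baseline.
import pandas as pd

def monthly_anomalies(temps: pd.Series, base_start: str, base_end: str) -> pd.Series:
    """temps: monthly mean temperatures indexed by a DatetimeIndex.
    Returns anomalies relative to the 12 monthly means of the baseline period."""
    base = temps[base_start:base_end]                     # restrict to the baseline years
    climatology = base.groupby(base.index.month).mean()   # the 12 monthly "normals"
    normals = temps.index.month.map(climatology)          # matching normal for every point
    return temps - normals.to_numpy()                     # subtract month by month
```

Called with, say, ("1961-01", "1990-12") and then ("1981-01", "2010-12"), the same station produces two different anomaly series even though the underlying temperatures are identical.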

The underlying reason is simple: climate models do not conserve energy at the observed surface temperature. They cannot balance the energy coming in from the sun with the energy going out as IR radiation except by adjusting the mean surface temperature. This problem was beautifully explained by Ed Hawkins in a 2015 talk and later on his blog.

Hawkins slide 1. CMIP5 models all give wildly different values of the global average surface temperature in order to balance energy. Hence the need for baselines.
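As a rough illustration of why each model settles at its own absolute temperature, here is a toy zero-dimensional energy balance (an illustration only, nothing like how a CMIP5 model actually works): quite small differences in planetary albedo or effective emissivity move the equilibrium surface temperature by a degree or two.

```python
# Toy zero-dimensional energy balance:  S0/4 * (1 - albedo) = epsilon * sigma * T^4
S0 = 1361.0        # solar constant, W/m^2
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/m^2/K^4

def equilibrium_temperature(albedo: float, epsilon: float) -> float:
    return ((S0 / 4.0) * (1.0 - albedo) / (epsilon * SIGMA)) ** 0.25

# Two "models" differing only slightly in albedo and effective emissivity:
print(equilibrium_temperature(0.30, 0.612))   # ~288 K
print(equilibrium_temperature(0.29, 0.605))   # ~290 K, about 2 K apart in absolute terms
```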

CMIP5 models all predict different average surface temperatures. The model projections that we see in various IPCC reports and in the press have all been normalised to some arbitrary common baseline, but they are not normalised in the same way as the measured temperature data. Instead, each model is artificially shifted so that it averages to zero during the chosen baseline period. As a direct result, all models now agree with each other that the temperature anomaly is zero within the selected baseline.
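In code, this normalisation amounts to nothing more than the following sketch (the data layout is assumed: annual global-mean temperatures, one column per model, indexed by year).

```python
# A minimal sketch of the baseline normalisation applied to an ensemble of models.
import pandas as pd

def rebaseline_models(models: pd.DataFrame, base_start: int, base_end: int) -> pd.DataFrame:
    """Shift each model by its own mean over base_start..base_end (inclusive),
    so that every model averages to exactly zero inside the baseline window."""
    offsets = models.loc[base_start:base_end].mean()   # one offset per model column
    return models - offsets                            # broadcast the shift down each column
```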

Model projections adjusted in this way can then be compared to the data, once that too has been shifted to the same baseline. This is an arbitrary shift without any scientific basis, yet the apparent agreement between models and data now depends on the choice of baseline. Models can essentially be tuned to fit the temperature data by selecting an optimum timebase. This is the dirty secret behind climate science.
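A hypothetical sketch of that tuning (variable names are illustrative, not taken from any real script): re-baseline both the models and the observations to a candidate window, measure the residual mismatch, and note that sliding the window changes the answer.

```python
# How the residual model-data mismatch depends on the chosen baseline window.
import numpy as np
import pandas as pd

def rms_mismatch(models: pd.DataFrame, obs: pd.Series,
                 base_start: int, base_end: int) -> float:
    m = models - models.loc[base_start:base_end].mean()   # re-baseline every model
    o = obs - obs.loc[base_start:base_end].mean()         # re-baseline the observations
    resid = m.mean(axis=1) - o                            # multi-model mean minus data
    return float(np.sqrt((resid ** 2).mean()))

# Sliding a 30-year window forward in time and printing rms_mismatch(...) for each
# choice would show the apparent agreement improving for more recent baselines.
```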

Fig 2: Simply changing the baseline improves the apparent agreement between models and data.

Figure 2 from Ed Hawkins demonstrates this effect perfectly. Models can appear to be in almost perfect agreement with the data if the baseline spans recent years. Agreement gets much worse when you select a more standard baseline like 1961-1990. This is also why Zeke got such good agreement for the models, as shown below: he chose a baseline which spans the full time interval of all the displayed data!

Zeke’s wonder plot.

Using his baseline, the models and the data had to agree, because by definition both agree that the average temperature anomaly is zero, and indeed it is!

There is another trick which can be used to fine-tune agreement: varying the length of the baseline period. This changes the relative spread of the model data because of short-timescale variations between models. Figure 3 demonstrates the effect of varying the baseline timespan. This animation is from Ed Hawkins' blog and shows how short timebases can radically change the dispersion of the models.

Effects of different time spans on normalisation.
Animation – Ed Hawkins

Choosing a shorter or longer baseline period affects the spread and ordering of individual model projections. This is because the baseline captures just one snapshot of each model's internal variability, freezing it in over a single time interval.
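A minimal sketch of the length effect, under the same assumed data layout as above: re-baseline to windows of different lengths ending in the same year and compare the across-model spread.

```python
# Across-model spread after re-baselining to windows of different lengths.
import pandas as pd

def spread_after_rebaseline(models: pd.DataFrame, base_end: int, length: int) -> pd.Series:
    """Re-baseline to the `length`-year window ending at base_end, then return the
    across-model standard deviation in each year."""
    base_start = base_end - length + 1
    shifted = models - models.loc[base_start:base_end].mean()
    return shifted.std(axis=1)   # short windows freeze in more short-term variability

# e.g. compare spread_after_rebaseline(models, 1990, 5) with
#              spread_after_rebaseline(models, 1990, 30)
```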

In general, a series of measured temperature anomalies can be moved to a different baseline by a uniform shift of all points (a short code sketch follows the figure below). Here, for example, is GHCN V4 calculated relative to different baselines.

Global Land temperature anomalies calculated relative to 5 different baselines. The numbers in brackets are the number of stations contributing for each baseline period.
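A minimal sketch of that uniform shift (assuming an annual anomaly series indexed by year, already expressed relative to some baseline): a single constant is subtracted from every point, so the shape of the curve is unchanged.

```python
# Re-express an anomaly series relative to a new baseline period.
import pandas as pd

def change_baseline(anoms: pd.Series, new_start: int, new_end: int) -> pd.Series:
    """Return the same anomalies expressed relative to new_start..new_end."""
    offset = anoms.loc[new_start:new_end].mean()   # one number for the whole series
    return anoms - offset

# e.g. ghcn_1961_1990 = change_baseline(ghcn_1951_1980, 1961, 1990)   # illustrative names
```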

It is the model projections which change dramatically when different baselines are used. One should always bear this in mind whenever an ensemble of models seems to match the data well. They are simply constrained to do so! That is also why, at any given time, future projections fan out so dramatically 30 years into the future.

Don't worry though, because in 30 years' time all the models will yet again agree with those measured temperature anomalies!

About Clive Best

PhD in High Energy Physics. Worked at CERN, Rutherford Lab, JET, JRC, OSVision.

5 Responses to The baseline problem

  1. Verbascose says:

    Your suspicion that the baseline problem is caused by a lack of conservation of energy is wrong. Any subtle differences in model configuration (e.g. land or ocean surfaces with different albedo or emissivity), or any model parameters resulting in a different albedo or outgoing longwave radiation, will result in a slightly different energy balance and, hence, a slightly different average surface temperature. This is not the result of missing conservation of energy, but of the opposite!

  2. Jerry says:

    Your analysis points out the immaturity of climate “science” as it currently exists. It seems that the climate is changing, but the current “science” is not explaining why. These climate “scientists” are certainly not embracing the concept of challenging models in an attempt to continually ferret out the truth. My concern is that without rigorous debate, we will miss out on any real reasons for this change and turn our focus to things that will cause great sacrifice and yet not address the problem. Perhaps there is no problem or anything we can do about it. But until we have true scientific debate on a broad scale, we will never know.

  3. Steven Crow says:

    I agree with both Verbascose and Jerry. Climate “science” as currently practiced is leftist politics rather than science and is missing opportunities for finding the real mechanisms of global warming. The first thing that should go when real climate science emerges is the word “global”. Clive Best’s global temperature maps reveal that warming is concentrated around the arctic, mainly in Northern Canada and Siberia. The footprints of the warm spots suggest heat flow up from the Earth’s mantle rather than warming from atmospheric CO2 in the temperate areas where it is produced. In any case, CO2 as measured at Mauna Loa can account for a temperature increase no more than 0.25 deg C.

  4. oz4caster says:

    Clive, great work and quite interesting. I have pulled the CMIP5 monthly model mean global surface temperature output for over 40 models from KNMI to compare with climate model reanalysis initialization output (CFSR/CFSV2 and NCEP/NCAR R1). I will be looking at the temperature output rather than converting to anomalies. I have already looked at the Russian model inmcm4 for RCP4.5 and RCP8.5, which compares fairly well, but the running 12-month mean is too low by about 0.5C for 2019 for both RCPs. This model tracks the seasonal peak in NH summer well, but the seasonal minimum occurring with NH winter is too low in the model and seems to account for most of the discrepancy versus the reanalysis.

    Do you have a feel for which model(s) I should compare next?

    I will be publishing the results on my blog as time permits.
