A reduction in Tmax−Tmin of about 0.1C has been observed since 1950. Over land, minimum temperatures essentially always occur at night, so this means that nights have been warming faster than days since 1950. The effect is actually much larger than 0.1C, because nearly 70% of the earth’s surface is ocean, where only a single monthly average ‘anomaly’ is recorded with no day/night distinction. Scaling the 0.1C global figure up to the ~30% land fraction implies that nights over land areas have on average warmed ~0.3C more than daytime temperatures.

If we assume that average land temperatures have risen by ~1C since 1900, then maximum temperatures have risen by only ~0.85C while minimum temperatures have risen by ~1.15C.
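The split quoted above is simple arithmetic, and can be checked in a few lines (a Python sketch; the ~1C land warming and ~0.3C extra night-time warming are the assumptions stated above, not measured inputs):

```python
# Sanity check of the Tmax/Tmin split, using the figures assumed above.
land_warming = 1.0     # assumed average land warming since 1900 (C)
dtr_change = 0.3       # assumed extra night-time (Tmin) warming over land (C)

# Tmean = (Tmax + Tmin) / 2, so a narrowing of the diurnal range by
# dtr_change splits symmetrically about the mean warming.
tmax_rise = land_warming - dtr_change / 2
tmin_rise = land_warming + dtr_change / 2

print(round(tmax_rise, 2), round(tmin_rise, 2))  # 0.85 1.15
```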

This effect may also be apparent in equatorial regions where the night/day and winter/summer temperature differences are much smaller than at high latitudes.

Radiative cooling of the land surface mostly occurs at night. It is much greater when the air is dry, such as over desert regions and at the poles. During the day, convection and evaporation dominate heat loss. Enhanced CO2 slightly reduces the efficiency of night-time cooling. The urban heat island (UHI) effect is also larger at night.

The temperature colour scale used in all animations is this one.

This is the latest data for March 2017, showing the spherical triangulation explicitly. The underlying data I use is GHCN V3 station data, normalised to 1961-1990, combined with HADSST3 ocean data.

There is also a YouTube video covering 137 years, starting in 1880. The distorted earth shape in the early years is caused by poor sampling.

I think that the best way to view these results on the Web is probably to use WebGL. I have been putting this off because, from my past experience, Javascript can quickly become a can of worms. However I may eventually delve into this, as WebGL allows direct interaction with 3-D data.

December 2010 saw a similar pattern, with AO values reaching very low levels, approaching -5.

In December 2010 the lowest ever temperatures were recorded at several UK stations. This is how the global temperature anomaly distribution looked, not too different from 1963.

A negative phase of the AO corresponds to high pressure at the pole. The Jet Stream moves further south with large meanders, dragging cold polar air with it. As a result, polar temperatures actually appear slightly warmer than normal.

The next animation shows the temperature anomalies calculated on the same grid, as described in the last post. I have to use the original aspect ratio this time, otherwise the animated gif washes out the blue colour. Yes, it is still abnormally cold in Antarctica.

The temperature scale for anomalies is ±10C (blue to red). For comparison, here is one of the coldest months of the last decade: January 2008. There were exceptionally cold conditions over Siberia, and the global average was about the same as during the normalisation period (1961-1990).

Animations are all a bit of a gimmick, but I just can’t resist them. However I will try to produce some better quantitative visualisations for a given month.

The most elegant method for spatial integration of irregular temperature data must surely be spherical triangulation over the earth’s surface. It treats each measurement equally by covering the earth’s surface with a triangular mesh of station & SST nodes. Unlike the planar triangulation described previously, spherical triangulation also spans the polar regions. There is no need for any ‘kriging’ or linear interpolation into sparse polar regions, since they are naturally included. I finally deciphered IDL’s spherical triangulation output, thanks to Nick Stokes. Here then is the result in 3D for temperatures in January 2016. The shading for each triangle is the average of its three nodes’ temperature anomalies (-5C to +5C).

I now need to find a better visualisation method, as this one takes way too long; however, at least it shows how the triangulation now covers both poles.

Each triangular area shown in Figure 1 is calculated in 3D cartesian coordinates to derive the area weighting used for averaging. Figure 2 shows how the final spherical results compare to the 2D (lat,lon) triangulation results.
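The area calculation behind Figure 1 can be sketched as follows. This is an illustrative Python version (the post’s actual code is IDL): each (lat,lon) node is mapped to a unit vector, and the area of the planar chord triangle through the three vertices is used as the weight for that spherical triangle.

```python
import math

def to_cartesian(lat, lon):
    """Unit vector on the sphere for a (lat, lon) position in degrees."""
    phi, lam = math.radians(lat), math.radians(lon)
    return (math.cos(phi) * math.cos(lam),
            math.cos(phi) * math.sin(lam),
            math.sin(phi))

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def triangle_area_3d(v1, v2, v3):
    """Area of the planar (chord) triangle through three points in 3D."""
    e1 = tuple(b - a for a, b in zip(v1, v2))
    e2 = tuple(b - a for a, b in zip(v1, v3))
    c = cross(e1, e2)
    return 0.5 * math.sqrt(sum(x * x for x in c))

area = triangle_area_3d(to_cartesian(0, 0),
                        to_cartesian(0, 90),
                        to_cartesian(90, 0))
```

For the three mutually perpendicular points above, the chord triangle has area √3/2 ≈ 0.866; unlike a weighting in (lat,lon) space, this needs no separate cos(lat) correction because the convergence of meridians is already built into the 3D geometry.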

There is really very little difference in the annual temperature anomalies between the spherical results and the 2-D triangulation results. Based on these results it would seem that Cowtan & Way have exaggerated polar warming effects between 2005 and 2013. Figure 3 shows the monthly comparison and just how remarkably similar the 2-D and 3-D results are despite completely independent methods of integration.

Aesthetically the spherical triangulation grid is my favourite. Unfortunately, the extra effort makes only a tiny difference compared to the simpler 2D triangulation based on (lat,lon) coordinates. Despite this, both methods are, in my opinion, better than the simple rectangular gridding used by Hadcrut4 and (partially) by GISS and NOAA. Furthermore, they avoid interpolation.

I will post the code and the data soon.

The second animation shows all winter January temperatures from 1880 to 2016. The extreme winters of 1942, 1947, 1963 and even 2010 are particularly noticeable. The scale shown runs from blue to red (-10C to +10C relative to normal). Black marks extreme temperatures beyond the scale.

In recent posts I have been combining GHCN V3C station data with Hadley HADSST3 ocean data. The station data are located at specific points, while HADSST3 data are recorded sporadically within a regular 5×5 degree grid. In addition, the set of active stations and active SST cells varies from month to month. This is particularly true for the early years of the temperature series.

I have been using IDL to make a Delaunay triangulation in (lat,lon) of all active measurements globally for each month, and then to investigate different interpolations of this onto regular grids. However, following a proposal by Nick Stokes, I realised that this is unnecessary, because the triangulation itself can produce an almost perfect global average. I believe this is probably the best spatial integration possible, because it uses all available data and gives a direct result. There is no need for any interpolation or grid cell averaging. The last two posts showed how such interpolation can introduce biases; this is avoided using triangulation-based spatial averaging. Here is the algorithm I used.

- Each triangle contains one measurement at each vertex. Use Heron’s formula to calculate the area of the triangle.
- Calculate the centroid position and assign this an anomaly equal to the average of all 3 vertex values. This centroid value is then used as the average anomaly for the whole triangular area.
- Use a spatial weighting for each triangle in the integration equal to cos(lat)×area, where lat is the latitude of the centroid.
- Calculate the global average as the weighted mean over all triangles: global average = Σᵢ(wᵢ × Tᵢ) / Σᵢ wᵢ, where Tᵢ is the centroid anomaly of triangle i and wᵢ = cos(latᵢ) × areaᵢ.
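Putting the steps above together, a minimal sketch of the whole algorithm might look like this (in Python rather than the IDL actually used; `triangles` is assumed to be the list of vertex triples produced by the Delaunay step, each vertex being a `(lat, lon, anomaly)` tuple):

```python
import math

def heron_area(p1, p2, p3):
    """Area of a triangle in the (lat, lon) plane via Heron's formula."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    a, b, c = dist(p1, p2), dist(p2, p3), dist(p3, p1)
    s = (a + b + c) / 2
    # max() guards against tiny negative values from rounding
    return math.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0))

def global_average(triangles):
    """Weighted mean anomaly over a list of (v1, v2, v3) vertex triples,
    where each vertex is a (lat, lon, anomaly) tuple."""
    num = den = 0.0
    for v1, v2, v3 in triangles:
        area = heron_area(v1[:2], v2[:2], v3[:2])
        centroid_lat = (v1[0] + v2[0] + v3[0]) / 3
        anomaly = (v1[2] + v2[2] + v3[2]) / 3   # centroid anomaly
        w = math.cos(math.radians(centroid_lat)) * area
        num += w * anomaly
        den += w
    return num / den

# A single triangle: the global average is just its centroid anomaly.
one = [((0, 0, 1.0), (0, 1, 2.0), (1, 0, 3.0))]
print(global_average(one))  # 2.0
```

Note that the normalisation by Σwᵢ is what restricts each month’s average to the area actually covered by active triangles.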

Using this method every possible measurement is used, and no extrapolation outside triangular boundaries is needed. Each month is normalised to the surface area covered by the active triangles alone. This also means that for areas of dense station coverage, like the US and Europe, regional temperature variation can be studied at full resolution. There is no need to average nearby temperature measurements. I think triangulation is a near perfect solution for regional as well as global temperature studies. It is something I will look at next.

To see how all this works in practice we first look at the triangulation results for January 1880 with rather sparse measurements.

In comparison, here is January 2015 with dense data in the US and Europe. I am plotting each individual triangle, which shows just how dense the measurements are in the US. For regional studies it is best to suppress the triangle borders and show just the colour coding.

For all anomalies I use the same normalisation period as Hadcrut4 (1961-1990). To compare with other series, I renormalise those to 1961-1990 as well. The offsets used for each series are shown in the plots. A detailed comparison of recent years is then shown below. The new result lies neatly between those temperature series which use infilling and Hadcrut4, which doesn’t.

A comparison of the monthly values shows a remarkably close agreement with Berkeley Earth.

In conclusion, a new method for calculating spatial averages of irregularly dispersed temperature data has been developed, based on triangulation. The original idea for this method is due to Nick Stokes. The method also has potential benefits for detailed regional studies.

Annual temperature series are available here

Monthly temperature series are available here

IDL code to generate all results is available here

Of particular interest is the extrapolation of data near the poles, which is where most warming has been observed. The problem, though, is that there are very few measurements in that region, especially before 1940. Figure 2 shows triangulation grids for both poles in 1880 and 2016.

In 1880 there are no measurements further south than 70S or further north than 70N, and triangles cover huge distances. Therefore artefacts are likely introduced by kriging into these regions. The 2016 triangulation shows much better coverage, but there are still no data poleward of 75N. Antarctica fares better, because measurements now exist at research stations, including one at the south pole.

Here is the full 167 year comparison.

My conclusion is that before 1940 it is best not to interpolate into unmeasured areas of the world, because coverage is so low; here the Hadcrut4 methodology is preferable. Recent warming is enhanced by interpolation, because empty (lat,lon) cells are filled under the influence of ‘warm’ nearby neighbours. Likewise, natural warming cycles such as El Niño also get enhanced.

Data are available here

IDL code that generates this data is here.

Next I will use the triangulation itself of measurement locations to calculate the global average, avoiding any interpolation.

How does this compare with other ‘kriging’ methods which supposedly remove the coverage bias of Hadcrut4? What I discovered is that the end result depends critically on the grid spacing you interpolate onto. If you choose a fine grid spacing, such as the 1 degree used by Berkeley Earth, then you get an enhanced warming trend over recent years. If however you choose the same grid size as Hadcrut4 (5 degrees), then you get a reduced trend. This implies that a systematic error is introduced by the methodology. Here is the comparison.

The 2 degree results are very similar to Berkeley Earth but give a slightly larger warming trend. However, using the same 5 degree target grid size as Hadcrut4 gives a much reduced warming trend. Cowtan and Way use the HADCRUT4 station data rather than V3C, and their result lies somewhere in the middle. Here is a detailed comparison of results for one month – September 2016.

The 2 degree resolution extends the expanse of each warm zonal area.

The 5 degree resolution is in line with that of HADSST3 and HADCRUT4.

This is Cowtan and Way version 2 which reconstructs ocean and land separately and then blends them during the time period shown.

Does kriging actually improve the accuracy of global temperatures? While it is probably correct that Hadcrut4 has a ‘coverage bias’ over polar regions, what is even clearer is that interpolation to remedy this can itself introduce a systematic warming bias dependent on method and target grid size. The other temperature series all use data infilling based on ‘kriging’ type techniques.
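The grid-size sensitivity described above is easy to reproduce with a toy example. The following Python sketch (purely illustrative, with synthetic data, not the IDL used for the actual results) bins station anomalies into cells of two different sizes and takes a cos(lat)-weighted average of the occupied cells; the two grid sizes give different global means purely because of the binning.

```python
import math
import random

def gridded_mean(stations, cell_deg):
    """Average station anomalies into cell_deg x cell_deg cells, then take
    a cos(lat)-weighted mean over the occupied cells only."""
    cells = {}
    for lat, lon, anom in stations:
        key = (math.floor(lat / cell_deg), math.floor(lon / cell_deg))
        cells.setdefault(key, []).append(anom)
    num = den = 0.0
    for (i, _), vals in cells.items():
        cell_lat = (i + 0.5) * cell_deg          # cell-centre latitude
        w = math.cos(math.radians(cell_lat))
        num += w * sum(vals) / len(vals)
        den += w
    return num / den

# Synthetic example: anomalies grow towards the pole, stations are
# scattered at random latitudes -- changing only the target grid size
# changes the resulting "global" mean.
random.seed(1)
stations = [(lat, random.uniform(-180, 180), 0.05 * lat)
            for lat in [random.uniform(0, 85) for _ in range(500)]]
m2 = gridded_mean(stations, 2)
m5 = gridded_mean(stations, 5)
print(m2, m5)
```

The point is not the particular numbers, only that `m2 != m5`: with identical input data, the choice of target grid alone shifts the spatial average, which is the systematic effect discussed above.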

This clearly shows why the global average of the kriged data gives larger values than Hadcrut4.5 when anomalies at high latitudes are warmer.

However I still prefer Hadcrut4.5 because it makes fewer assumptions, which is especially important for the early years.

One interesting possibility of kriging is that it can also handle irregular grids. This means that you don’t even need to average the data in grid cells first. I might even try this, if I find the time. I suspect this is more or less exactly what Berkeley Earth does.
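As an illustration of interpolating directly from irregular station locations, here is a deliberately simple inverse-distance-weighting sketch in Python. It is only a stand-in for kriging, which would instead derive the weights from a fitted variogram, but it shows that no prior grid-cell averaging is needed:

```python
import math

def idw(stations, lat, lon, power=2):
    """Inverse-distance-weighted estimate at (lat, lon) from irregularly
    located station data. A crude stand-in for kriging: kriging would fit
    a variogram to the data and solve for optimal weights instead."""
    num = den = 0.0
    for slat, slon, anom in stations:
        d = math.hypot(lat - slat, lon - slon)
        if d < 1e-9:
            return anom              # query point sits exactly on a station
        w = 1.0 / d ** power
        num += w * anom
        den += w
    return num / den

# Midway between two equidistant stations, the estimate is their mean.
print(idw([(0, 0, 1.0), (0, 10, 3.0)], 0, 5))  # 2.0
```

Like kriging, this takes the measurement points exactly as they are; the interpolated field then needs no intermediate 5×5 degree cell averaging at all.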
