A sceptic’s guide to global temperatures

Climate change may well turn out to be a benign problem rather than the severe problem or “emergency” it is claimed to be. This will eventually depend on just how much the earth’s climate warms due to our transient but relatively large increase in atmospheric CO2 levels. This is why it is so important to measure the earth’s average temperature rise since 1850 accurately and impartially. It turns out that such a measurement is neither straightforward, independent nor easy. For some climate scientists there sometimes appears to be a slight temptation to exaggerate recent warming, perhaps because their careers and status improve the higher temperatures rise. They are human like the rest of us. Similarly the green energy lobby welcomes each scarier temperature increase to push for ever more funding for their unproven solutions, never really explaining how these could possibly work better than a rapid expansion of nuclear energy instead. Despite over 30 years of strident warnings and the fairly successful efforts of G7 countries to actually reduce emissions, CO2 levels in the atmosphere are still stubbornly accelerating upwards. This is because the developing world has simultaneously striven to raise the wellbeing and living standards of its large populations through the use of ever more coal and oil, exactly as we did. This is our current dilemma. Should they somehow be stopped from burning fossil fuels, or perhaps be compensated financially to ‘transition’ to so-called renewable energy instead? All this again depends on the speed of climate change, which simply translates to the slope of the temperature record.

The good news is that once global emissions start falling, as they inevitably will, the climate will stabilise rather quickly. Yet it may still take a thousand years or more for the earth to fully return to a supposedly “normal” pre-industrial climate. But is this actually what we want? This idyllic normal climate also includes regular ice ages, the last of which reduced the human population outside Africa to just a few thousand individuals struggling to survive. The earth’s climate has in any case been cooling by ~4C over the last 5 million years, mainly due to land uplift caused by plate tectonics, which initiated violent natural swings in climate. Human civilisation only developed within the current Holocene interglacial (the last 10,000 years). Therefore it is in all our interests for the Holocene to continue for as long as possible. Once we get through this temporary climate “transition” we will soon realise that controlling the climate through somewhat enhanced CO2 levels is a far better outcome for humanity than returning to a pre-industrial climate. A 2C colder climate is far worse than a 2C warmer climate.

So how accurately do we really know what the earth’s temperature is? The answer would appear to be not very well at all, since it has been constantly changing since the last IPCC report. The most notable of these changes since 2012 is a dramatic increase in what experts now say the global temperature is compared to what those same experts said it was 10 years ago. The hiatus reported in the latest IPCC report has now completely vanished. How is this possible? The Paris Agreement proposed limiting temperature increases to 1.5C above pre-industrial times, but if you believe the most recent temperature results this limit was already breached in 2016, and will surely be exceeded again with the next major El Nino event. In this report I hope to explain how a combination of new temperature data, further adjustments to those data, and changes in the methods used for calculating global averages can explain why temperatures ended up ~0.25C warmer than they were originally reported in 2012. Somehow each new version of a temperature series essentially rewrites climate history. Global warming always turns out to be far worse than we feared it was a year previously. This ratcheting up of alarm is continuous.

A 7-year evolution in global temperatures. Each new version of a temperature series rewrites climate history.

Recent temperatures now apparently show an extra ~0.25C of warming simply due to a continuous process of adding, merging and adjusting multiple short temperature records. But how is that possible? To answer that question we first need to understand what the term “global temperature” really means.

Temperatures have been measured by national weather stations around the world since the early 19th century. Traditionally these used mercury thermometers inside a Stevenson screen, designed to shield them from direct sunlight while allowing ambient air to mix inside. For this reason the slats of the Stevenson screen are painted white to reflect sunlight. The aim is to measure the ambient air temperature at about 1.5-2m above the ground and not any localised effects. The siting of weather stations should therefore be away from buildings and any other artificial sources of heat.

Weather stations were always read manually by an operator until about the early 1990s. Large stations would be read 8 or more times per day, while smaller ones were read just twice a day, usually at 9am and 4pm. In addition to real-time thermometers, nearly all stations also contained a max/min thermometer. This is a linked dual glass-bulb thermometer in which the mercury on one side can only move up and not down, and vice versa on the other side.

A simple max/min thermometer

These thermometers record the maximum and minimum temperatures reached between each reading, whereupon they are reset by shaking the mercury back down/up again. These max/min thermometers were read and reset once a day. The time of observation (TOBS) was important because if they were read at 9am then the minimum temperature would be for today (early morning) but the maximum temperature would be for yesterday, e.g. at 3pm. However if they were instead read at say 6pm then both maximum and minimum would be for today. The readings were then written in log books, which have since been digitised. Typically each station therefore produces 3 values each day: Tmax, Tmin and Tav. Tav is defined as (Tmax+Tmin)/2, but this is really only an approximation to the average daily temperature, because the shape of the diurnal variation changes every day. A better estimate would be to use the hourly temperature values and integrate them over a 24 hour period. However since these don’t exist for the early data, it is common practice to simply use Tav = (Tmax+Tmin)/2 instead. Note that Tav also increases if just Tmax increases or just Tmin increases. Corrections for TOBS have become notorious, especially for the US stations, because the corrections are time dependent as operating practices evolved over time. The TOBS corrections alone produce the US warming, although on paper these corrections seem correct. However I discovered that it is probably just as good to use the twice-per-day measurements at fixed times to calculate temperature anomalies.
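
To see why (Tmax+Tmin)/2 is only an approximation, here is a minimal sketch using a made-up, skewed diurnal cycle (the numbers are purely illustrative):

```python
import numpy as np

hours = np.arange(24)
# Hypothetical skewed diurnal cycle: cold for most of the night, short warm peak at 3pm
t_hourly = 12.0 + 8.0 * np.exp(-0.5 * ((hours - 15) / 3.0) ** 2)   # degrees C

tav_minmax = (t_hourly.max() + t_hourly.min()) / 2.0   # traditional (Tmax+Tmin)/2
tav_true = t_hourly.mean()                             # 24-hour integrated mean

print(f"(Tmax+Tmin)/2 = {tav_minmax:.2f} C, hourly mean = {tav_true:.2f} C")
# The two differ by over 1 C for this shape, and the difference changes from
# day to day as the shape of the diurnal cycle changes.
```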

Automated weather stations began to be introduced in the 1980s and essentially replaced all manual weather stations by the late 1990s. These log temperatures automatically every hour. This eventually removed any TOBS effects.

Spatial Coverage

The historical record shows a large increase with time in the spatial coverage of weather stations globally. In the 18th and early 19th centuries weather stations existed mainly in Europe, the US and a few colonial outposts. Even today large parts of Africa and South America still have little or no weather station coverage.

Weather stations before 1900

An archive of daily temperatures and rainfall going back to the 18th century (GHCN-Daily) is maintained at NCDC and updated regularly. The data have been subjected to quality control but should otherwise be as close to the original measured values as possible. Monthly temperature averages predate GHCN-Daily and are processed and published as GHCN-Monthly and CRUTEM. Berkeley Earth used a collection of archives to maximise station coverage, but today you can do almost the same by using GHCN-Daily alone. Recently NCDC have released version 4 of GHCN-Monthly, which is roughly 75% monthly averages derived from GHCN-Daily and 25% from national archives. There are 24,000 stations in total, of which 17,000 have data within the standard 30-year period 1961-1990. V4 has been quality controlled, and the adjusted version additionally has homogenisation applied.

Sea Surface Temperatures

The ocean temperature data are perhaps even more complicated, as ‘sea surface temperatures’ were measured almost entirely by ships until the 1980s. Ships move, and they measured temperatures in three different ways: a) wooden buckets, b) canvas buckets and c) engine room intake (ERI) temperatures. There are problems with all three. The first two read too cool because evaporation removes latent heat (each by a different amount), while the third reads too warm. In addition they sample temperatures at different depths, which also clearly depend on swell conditions. Anyone who swims in the Mediterranean soon realises that their toes feel ~5C colder than their shoulders, so there is no absolute SST. As a result ad hoc corrections are applied depending on which type of measurement was made. For the early years ocean temperature data follow the trade routes, sampling only very small areas of ocean. After the 1980s drifting and moored buoys were deployed, improving coverage and accuracy. An archive of historic records is kept at ICOADS, and there are two processed SST temperature series: ERSST4 on a 2-degree grid and HadSST3 on a 5-degree grid. Interestingly, SST is the only dataset that could publish absolute temperatures rather than temperature anomalies, because all measurements are at the same altitude (zero). However, to be compatible with the land values they all produce anomalies, which depend critically on the normalisation period (1951-1980 or 1961-1990), which itself spans the uncertain bucket/ERI period. Since about two thirds of the earth’s surface is ocean, SST values tend to dominate the global average. They are also the most uncertain during the 19th and early 20th centuries because they rely on systematic bias corrections. Of course the temptation is to increase recent temperature trends through these corrections. The new HadSST4 is an example.

The new version, HadSST4, has reassigned ship-based measurements over the period from before WWII up to the early 1990s. Bias adjustments depend on the fraction of measurements made with wooden buckets, canvas buckets or engine room intakes (ERI), which are partly defined by the metadata in ICOADS based on ship logs. The assignment of each measurement to a type of bucket or to ERI is sometimes uncertain. HadSST4 now instead uses the diurnal temperature dependence of the measurements (time of day) to identify which measurement type was used by each ship. The overall bias adjustment to SST will change if this procedure changes the fraction of data falling into each category, since each adjustment is different.

They claim that 75% of measurements could be classified in this way, and that buckets were still in use on US ships into the early 1990s. Since then measurements have been based on drifting buoys and Argo floats, and these recent measurements are unaffected. However that doesn’t matter, because the crucial 1961-1990 normalisation period certainly is affected, and HadSST4 only publishes temperature anomalies, not absolute temperatures. So the net effect of the new assignments is to lower the zero line (the normals) from which anomalies are calculated, and as a result all recent anomalies have indeed increased in ‘temperature’. Hey presto, the oceans have now warmed by an extra 0.1C.

Raw data.
It is almost impossible to get the raw daily measurement data from individual stations as they were originally recorded. GHCN provide a version of the monthly data which they call unadjusted, but which in reality consists of quality-controlled monthly averages from stations, some of which have already been merged with nearby previous stations. Thanks to Ron Graff I was able to access the original hourly and daily data from Australian stations. In a single location like Darwin there are 3 or 4 stations in different places covering different time periods. These then get merged into one station to cover the whole period. GHCN V4 and GHCN-Daily contain all 4 stations, but the Airport is still a merge of all 3, leading to double counting. There are several other similar examples. For Australia I was able to compare the average anomaly from the raw data (hourly and daily) to the adjusted ACORN data. This showed that the warming is dominated by minimum temperatures, whereas maximum temperatures were more or less stable. In other words, warming occurs at night.

First we compare the raw data with the adjusted data for ACORN-SAT.

Raw data compared to adjusted data for Australia (ACORN-SAT)

The raw data don’t show any warming after 1980, whereas the homogenised data clearly do. Even the homogenised data show an interesting effect: it is mostly minimum temperatures that are rising, and this causes the average temperature to increase. This essentially means that warming occurs mostly at night.

Average annual maximum and minimum temperatures across Australia

Another way to view the same thing is to use the annual extreme temperature range (hottest minus coldest) area-averaged across all ACORN stations. The range of extremes is actually reducing as the average temperature anomaly rises.

Annual extreme temperature range (max-min) compared to temperature anomalies for ACORN-SAT stations.

It is the adjustments made to the data through ‘homogenisation’ which always increase warming. So what does homogenisation really mean, and is it justified?

Homogenisation.
Homogenisation is the name given to the process of adjusting the underlying measurement data. The basic assumption is that nearby stations all behave in the same way. They are assumed to warm in synchrony, and any outlier that does not must therefore be due to data problems at that station. Over long periods of time stations can be moved or have instrumental changes, which affect their measurements through break points (step shifts). Another occurrence is when 2 or more nearby stations are merged together to cover longer timescales. An older station may have closed in 1970 but overlap with a more recent nearby station, yielding a continuous 150-year record when combined. Perhaps they are in the same town but differ slightly in altitude, and the temptation then is to combine them by merging the overlap period. The standard homogenisation procedure is to look for breakpoints and for differences in trend with near neighbours, so-called pairwise homogenisation. Sudden kinks are taken as evidence of station moves, which then get corrected by shifting earlier data up or down by a constant offset. However there are more subtle effects that homogenisation produces. The pairwise process has a tendency to align trends so that they all follow the same positive trend. I have seen this clearly in the Australian data, as the sketch and the examples below illustrate.
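
As a rough illustration of the idea (a toy sketch with synthetic data, not the NOAA pairwise algorithm), a breakpoint in a candidate station can be found and corrected from its difference series with a neighbour:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 600  # months
neighbour = 0.002 * np.arange(n) + rng.normal(0, 0.3, n)   # shared regional signal
candidate = neighbour + rng.normal(0, 0.2, n)
candidate[300:] += 0.8                                      # simulated station move at month 300

diff = candidate - neighbour
# crude breakpoint search: pick the split that maximises the jump in the mean difference
jumps = [abs(diff[:k].mean() - diff[k:].mean()) for k in range(24, n - 24)]
k_best = int(np.argmax(jumps)) + 24
offset = diff[k_best:].mean() - diff[:k_best].mean()

adjusted = candidate.copy()
adjusted[:k_best] += offset   # shift the earlier segment to match the later one

print(f"detected break at month {k_best}, estimated offset {offset:+.2f} C")
```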

1. Launceston, Tasmania. The ACORN time series is actually a combination of 3 nearby sites in the city: a) Launceston Pumping Station from 1910-1946, b) Launceston Airport (original site) from 1939-2009 and c) Launceston Airport (current site) from 2004 onwards. The joining of the 3 segments at first sight looks fine, but closer inspection shows that the maximum and minimum temperatures in the central section are being differentially shifted so as to produce a linearly rising temperature trend where none was apparent before. There are no obvious kinks in the raw data to justify this.

2. Alice Springs consists of a merge between the Post Office station (1910-1953) and the Airport since 1953. Note how the animation shows an increase in minimum temperatures at the airport resulting in a linearisation of the trend, neither of which has any direct connection with the merge.


3. Dubbo shows how a trend can be generated from a simple merge of two stations, the Post Office and the airport. This should be a straightforward offset shift between the two locations, but again the shape of the early data is changed, producing a small linear warming trend. This is most likely generated by the pairwise homogenisation.

Notice how the shift in the early Post Office data is time dependent, generating a warming trend where none previously existed.

So why do we use Temperature “Anomalies” anyway?
If there were perfect coverage of the earth by weather stations then we could measure the average temperature of the surface and track changes with time. Instead there is an evolving set of incomplete station measurements, both in place and time. This causes biases. Consider a 5°x5° cell which contains a two-level plateau rising above a flat plain at sea level. Temperature falls by about 6.5C per 1000m of altitude, so the real temperatures at the different levels would be as shown in the diagram. The correct area-averaged surface temperature for that cell would then be roughly (3*20 + 2*14 + 7)/6, or about 16C. What is actually measured depends on where the sampled stations are located. Since the number of stations and their locations are constantly changing with time, there is little hope of measuring any underlying trend in temperature this way. You might even argue that an average surface temperature, in this context, is a meaningless concept.
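
A trivial sketch of that cell (the area fractions and temperatures are the assumed numbers from the example above) shows how the sampled average depends entirely on where the stations happen to be:

```python
# Assumed plateau example: relative area at each level and the temperature there
areas = {"plain (0 m)": 3, "plateau (~900 m)": 2, "summit (~2000 m)": 1}
temps = {"plain (0 m)": 20.0, "plateau (~900 m)": 14.0, "summit (~2000 m)": 7.0}

true_avg = sum(areas[k] * temps[k] for k in areas) / sum(areas.values())
print(f"area-weighted cell average: {true_avg:.1f} C")         # ~15.8 C

# If only plain stations report, the sampled 'average' is 20 C; if a summit
# station is later added it drops to 13.5 C, with no climate change at all.
print((temps["plain (0 m)"] + temps["summit (~2000 m)"]) / 2.0)
```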

The mainstream answer to this problem is to use temperature anomalies instead. Anomalies are typically defined relative to 12 monthly ‘normal’ temperatures calculated over a 30-year period for each station. CRU use 1961-1990, GISS use 1951-1980 and NCDC use the full 20th century. Then, in a second step, these ‘normals’ are subtracted from the measured temperatures to get DT, the station ‘anomaly’ for any given month. These anomalies are averaged within each grid cell, and then combined in a weighted average to derive a global temperature anomaly. Any sampling bias has not really disappeared, but it has been mostly subtracted out. There is still the assumption that all stations within a cell react in synchrony to warming (or cooling). This procedure also introduces a new problem for stations without sufficient coverage during the 30-year period, perhaps invalidating some of the most valuable older stations. There are methods to avoid this based on minimising the squares of offsets, but these rely on least-squares fitting to adjust average values. The end result, however, changes little whichever method is used. Far more important are the coverage biases involved in spatial averaging: how do you combine SST and land data to derive a single global temperature (anomaly)?
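
In code the anomaly step itself is very simple. Here is a minimal sketch, assuming a pandas DataFrame `df` of monthly station means with columns `station`, `year`, `month` and `tav` (these names are my own, not those of any particular archive):

```python
import pandas as pd

def station_anomalies(df, base=(1961, 1990)):
    """Subtract each station's monthly 'normal' computed over the base period."""
    in_base = df["year"].between(*base)
    normals = (df[in_base]
               .groupby(["station", "month"])["tav"]
               .mean()
               .rename("normal"))
    out = df.join(normals, on=["station", "month"])
    out["anomaly"] = out["tav"] - out["normal"]
    # Stations with no data in the base period have no normal and drop out here,
    # which is exactly the problem for the valuable early stations noted above.
    return out.dropna(subset=["anomaly"])
```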

Global averages
Deriving an average global temperature anomaly involves the surface averaging of the combined ocean and station temperatures. Their distribution in space and time varies enormously before about 1950. Different groups adopt different averaging schemes. HadCRUT4 and NCDC adopt a simple (lat,lon) binning, taking the average of temperature anomalies within each bin. A similar process has already been performed on the ocean data (HadSST3/ERSST4). They then form a weighted average over all occupied bins, weighted by cos(lat) to compensate for the change in bin area with latitude. GISS instead use approximately 8000 equal-area cells for binning. Each station within 1200 km of a cell contributes, weighted inversely with its distance from the cell centre (falling to zero at 1200 km). This populates empty cells with data extrapolated from nearby ones, so essentially they interpolate data into empty regions. The area-weighted average is then done in the same way as HadCRUT4. Berkeley Earth (Robert Rohde, 2013) use kriging from the start to derive a temperature distribution projected onto a 1-degree grid. Cowtan and Way (Cowtan & Way, 2014) attempt to correct HadCRUT4 for empty bins by kriging the HadCRUT4 (lat,lon) results into empty bins, in particular extending coverage over Arctic regions. You can see what effect kriging has on empty cells below, where I krig the HadCRUT4 results for 2016 and then the far sparser result for 1864.

Hadcrut4.5 measured anomaly data (no kriging)

Kriged result for February 2016

The infilling of the Arctic depends on just a few nearby isolated data points. As a result these kriging techniques mostly affect only recent data, because before 1950 the data are so sparse that kriging into empty areas changes very little. This introduces a bias: the most recent temperatures are enhanced by interpolating sparse nearby data to cover more of the Arctic regions, where warming is stronger, while the earlier data lack the Arctic coverage needed for kriging to work at all, and so remain biased cold in the Arctic.

Raw H4 data Jan 1884

kriged values Jan 1884
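
Stripped of the infilling details, the basic binning-and-weighting step described above reduces to something like this minimal sketch (assuming `grid` is a 36x72 numpy array of anomalies on a 5-degree grid with NaN in empty cells; the array name is mine):

```python
import numpy as np

def global_mean(grid):
    """cos(latitude)-weighted mean over the occupied cells of a 5-degree grid."""
    lat_centres = np.arange(-87.5, 90.0, 5.0)              # 36 latitude band centres
    w = np.cos(np.radians(lat_centres))[:, None]           # relative area of each band
    w = np.broadcast_to(w, grid.shape)
    occupied = ~np.isnan(grid)
    vals = np.where(occupied, grid, 0.0)
    # Empty cells get zero weight, i.e. they are implicitly assumed to behave
    # like the average of the occupied cells - the root of the coverage bias.
    return np.sum(vals * w) / np.sum(w[occupied])
```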


Finally there are the 3-D techniques which Nick Stokes and I have been developing independently, and which I think are the most natural way to integrate temperatures over the earth’s surface. Weather station data and sea surface temperature measurements are treated as point locations (lat,lon) on the surface of a sphere. A triangulation between these point locations then generates a 3D mesh of triangles, each of whose areas can be calculated. The temperature of each triangle is taken as the average of its 3 vertices, and the global average is the sum of the area-weighted temperatures divided by the surface area of the earth.

Figure 2 Spherical triangulation of CRUTEM4 & HadSST3 (Hadcrut4.5) temperature data for February and March 2017.
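
A minimal sketch of this approach (not Nick Stokes’ code or mine, just the idea): the convex hull of points on the unit sphere gives exactly such a triangular mesh, and planar triangle areas are a good enough weight for illustration.

```python
import numpy as np
from scipy.spatial import ConvexHull

def triangulated_mean(lat, lon, temp):
    """Area-weighted mean of point temperatures triangulated over a sphere.
    lat, lon in degrees, temp the anomaly at each point (all 1-D numpy arrays)."""
    phi, lam = np.radians(lat), np.radians(lon)
    pts = np.column_stack([np.cos(phi) * np.cos(lam),
                           np.cos(phi) * np.sin(lam),
                           np.sin(phi)])
    tri = ConvexHull(pts).simplices                  # each row = one triangle (3 point indices)
    a, b, c = pts[tri[:, 0]], pts[tri[:, 1]], pts[tri[:, 2]]
    area = 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1)   # planar approximation
    t_tri = temp[tri].mean(axis=1)                   # average of the 3 vertices
    return np.sum(area * t_tri) / np.sum(area)
```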

The way you calculate the global average changes the result, but only in recent years. Interpolating into unmeasured regions like the Arctic increases the weighting of the fastest-warming areas. There is a simple reason why the Arctic appears to warm faster than other regions: it starts off much colder. The extra energy needed to raise the temperature from -50C to -49C is much less than that needed to go from, say, +24C to +25C. This follows from the Stefan-Boltzmann law: \frac{\Delta S_1}{\Delta S_2} = \frac{T_1^3}{T_2^3} \approx 0.42. That means only about 40% as much extra radiative forcing (CO2 increase) is needed at the North Pole as in the tropics to produce a 1C rise in temperature. So clearly if you focus on increasing coverage of the Arctic rather than, say, Africa, then you will boost the global average. However this only works if you have stations in the Arctic, and for some reason the Antarctic shows much less warming.
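
For the record, the back-of-envelope derivation behind that 0.42 (taking T_1 = 223 K, about -50C, at the pole and T_2 = 297 K, about +24C, in the tropics):

S = \sigma T^4 \;\Rightarrow\; \Delta S \approx 4\sigma T^3 \Delta T \;\Rightarrow\; \frac{\Delta S_1}{\Delta S_2}\Big|_{\Delta T = 1\,\mathrm{C}} = \frac{T_1^3}{T_2^3} = \frac{223^3}{297^3} \approx 0.42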

CMIP5 Model comparisons

A fundamental problem with all models is that they cannot agree on what the absolute global temperature should be to balance energy. As a result they too rely on temperature anomalies to make their “projections”. To do this they simply normalise to past measured temperatures and tune volcanic forcing. Only future projections can eventually be tested. The error in these projections is still about 1C.

CMIP5 Global surface temperatures taken from the paper. The coloured graphs are Meteorological reanalyses and represent best ‘observed’ global values

By comparing anomalies to past data they can adjust and subtract their ‘normals’ and then tune volcanic and aerosol forcing so as to match past ‘measured’ temperature anomalies. They then project these values into the future. There was clearly a problem for climate science in AR5, because all the models, normalised in this way at 2005, were running much hotter than the data.

AR5 Comparison of global temperature anomalies with CMIP5 models

So were all the models wrong?

SST blending 

Once again Kevin Cowtan (who is a chemist) came to the (partial) rescue of climate science. He noticed that the ocean data measure sea surface temperature, whereas the models predict 2m air temperatures. Theoretically there is a slight difference, because the latent heat of evaporation cools the air just above the sea surface. He ‘blended’ the model variable ‘tos’ (temperature at the ocean surface) over the oceans with ‘tas’ (air temperature 2m above the surface) over land. This slightly reduced the discrepancy between the models and the data, as shown below. Blending is the difference between the red and blue trends.

The blue curve is the average of 6 CMIP5 model results for the global temperature anomaly. The red curve is the blended result corresponding to Hadcrut4.
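
The blending step itself is simple. Here is a minimal sketch (assumed array names, not Cowtan’s code; a full treatment also handles sea ice, which I ignore here):

```python
import numpy as np

def blend(tas, tos, land_frac):
    """Blend model fields on a common grid: 2 m air temperature (tas) over land,
    sea surface temperature (tos) over open ocean, weighted by the land fraction."""
    land_frac = np.clip(land_frac, 0.0, 1.0)
    return land_frac * tas + (1.0 - land_frac) * tos
```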

Of course ocean measurements don’t really measure SST either. The historical mix of bucket, ERI and buoy measurements all sample different depths below the surface. Only satellites can truly measure real SST values. Furthermore some areas of the ocean have average swell heights well over 2m, making any small differences in depth or height above the ocean almost irrelevant.

Warming occurs mostly at night

Normally only the average temperature (Tav) anomaly is presented in the media, and the rise in Tav is what we call global warming. However I decided to analyse both components of Tav, namely the Tmax and Tmin anomalies, in exactly the same way. Here are the results for global temperatures (including oceans).

Comparison of temperature anomalies normalised to 1961-1990 for a) Tmax, b) Tmin, c) Tav. The difference Tmax-Tmin is shown by the blue curve plotted on the right-hand Y-scale.

A reduction in Tmax-Tmin of about 0.1C is observed since 1950. Minimum temperatures occur at night over land areas. This means that nights have actually been warming faster than days since 1950. The effect over land must of course be much larger than 0.1C, because nearly 70% of the earth’s surface is ocean, where only a single monthly average temperature ‘anomaly’ is available. So nights over land areas have on average warmed ~0.3C more than daytime temperatures. If we assume that average land temperatures have risen by ~1C since 1900, then maximum temperatures have really risen only by about 0.85C while minimum temperatures have risen by about 1.15C. This effect may also be apparent in equatorial regions, where the night/day and winter/summer temperature differences are much smaller than at high latitudes.
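
As a back-of-envelope check of those numbers (taking the ocean fraction as ~0.7 and the land Tav rise as 1C):

\Delta(T_{max}-T_{min})_{land} \approx \frac{0.1}{1-0.7} \approx 0.3\,C, \qquad \Delta T_{max} \approx 1 - \frac{0.3}{2} = 0.85\,C, \qquad \Delta T_{min} \approx 1 + \frac{0.3}{2} = 1.15\,C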

Meridional Warming

If you take a spatial temperature series and integrate it over longitude and over the seasons, you get a meridional temperature profile. Here are the results for HadCRUT4 from 1900 to 2016.

Figure 1. All 117 meridional temperature anomaly profiles from 1900 to 2016. Profiles are coloured blue where the annual global anomaly is below -0.2C, grey between -0.2C and 0.2C, yellow between 0.2C and 0.4C, and red above 0.4C. Traces are 80% transparent so that all of them can be seen.

This shows that warming is amplified at high latitudes, as expected, although with surprisingly little evidence of any enhanced change in Antarctica. Another way to view this is to see how little the net average temperatures have changed versus latitude. Seen this way the differences are relatively small, so what seems like a large global increase of 2C is small on this absolute scale. Arctic ice will not disappear while average annual temperatures remain below -15C.

Figure 2. Average temperature profiles from 1900 to 2016 calculated relative to a standard profile. Colour scheme is the same as Figure 1.
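
For completeness, the zonal averaging that produces these profiles is essentially the following (a minimal sketch, assuming `anom` is a (month, lat, lon) numpy array of gridded anomalies for one year with NaN for empty cells; the array name is mine):

```python
import numpy as np

def meridional_profile(anom):
    """Average over all 12 months and all longitudes, leaving one value per latitude band."""
    return np.nanmean(anom, axis=(0, 2))
```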

Over the long history of the earth the climate has been through extreme states, from a ‘Hothouse’ during the Jurassic to a ‘Coldhouse’ during the Snowball Earth episodes and the current Pleistocene ice ages. These are the meridional profiles estimated for such extreme climates.

Credits: Christopher R. Scotese, Paleomap Project, 2015.

The current global average annual temperature is about 14C, rising towards 15C. So we are currently in an Icehouse climate, while 15,000 years ago the earth was in a Severe Icehouse.

A wetter world

A warmer world is likely also going to be a wetter world. GHCN-Daily contains the raw precipitation measurements from about 100,000 weather stations, some dating back to 1780. The rainfall data can be processed in exactly the same way as the temperature data, to derive rainfall anomalies relative to a 1961-1990 average. So I set my computer (an iMac i7) to calculate this, but it took a week of CPU time! Here are the results compared to temperature anomalies.

The top graph shows the change in the global annual average daily rainfall compared to the 30-year ‘normal’ value from 1961-1990 (the rainfall anomaly). The bottom graph compares this to the global average temperature anomaly (CRUTEM4 in blue, and my own GHCN-Daily result in green).

There has indeed been about a 1mm increase in average daily rainfall since about 1980. This is actually beneficial for arid areas. A slight increase in land rainfall makes sense because of more evaporation from the oceans. So dire warnings of famine and drought are likely false, and we can probably adapt better to a warmer but wetter world.

Summary

There is no doubt that human activity has led to a ~40% increase in atmospheric CO2, and that this has so far warmed the planet by roughly 0.8C. This could rise to a doubling of CO2 levels with a temperature rise of 2C by 2100. By then we will have stabilised emissions at some lower level, leading to temperatures peaking at ~3C of warming in the 22nd century, followed by a slow stabilising of the climate at some cooler level, perhaps 1C above the natural climate. This is not a climate disaster. It will not lead to any mass extinctions, flooding of coastal cities or vanishing coral islands. Humans have for millennia dramatically changed the earth’s environment through deforestation, hunting of large animals, use of fire, farming and pollution. Modern life has also introduced an overuse of plastics, much of which ends up in the oceans. Much of the world has no ‘recycling’, so this plastic ends up in rivers and is transported to the sea, endangering marine life. We know how to stop plastic pollution. We know how to improve the environment, but what can we possibly do about climate change? Climate change is a problem in the same way that original sin is a problem. Modern society cannot function without energy. We abandoned renewable energy in the 19th century for good reason; can we really turn the clock back? I doubt it, because today’s renewables aren’t what they claim to be: they can’t even renew themselves without fossil fuels, let alone power transport and heating. Nuclear energy is a solution, as France proved in the 1980s, but we can’t even agree on that any time soon. Meanwhile we pretend to go green and buy electric cars dependent on lithium mining in Chile.

Climate change is a ‘problem’ but not as serious as it is made out to be. It is more of a distraction from accepting that we ourselves are the real problem. Climate change is just a symptom of that. We have been so successful for the last 10,000 years because we learned how to exploit the environment for our benefit. Now we have to learn how to maintain that environment on which our future and the rest of nature depend. If we can do that the climate will then simply look after itself.


35 Responses to A sceptic’s guide to global temperatures

  1. Hugo says:

    Hi, great work….. I fully agree.
    Another factor worth mentioning is the killing of the big land and sea grazers. And of course we destroyed an immense amount of natural rain forest.
    It’s a big shame that politics will ignore this.

    P.S
    ….
    “So I put my computer (iMAC-i7) to calculate this , but it took a week of CPU time!
    If its a database system get a memory disk. if not. …..
    ….. You know the main reason for the huge increase in electrical power consumption is / are computers and similar devices.

    • Clive Best says:

      Thanks,

      Yes, early man killed off mammoths, hairy rhinos etc. at the end of the Ice Age. Likewise the Megafauna in Australia. Our most lethal weapon is fire.

      I promise you it was pure CPU calculation time. There are 100,000 stations, each containing 365 days of data per year. Probably I had a daft algorithm.

      😉

  2. Hugo says:

    Well thank you.. For all the hard work.

    Sure hunting of mammoths etc. but I actually do not think that part really contributed. Probably the meteorite impacts did a lot more for that extinction part. For hunters lack of food is directly connected to survival. Wild meat can not be preserved for extensive time so you need fresh or you die. (sure freeze it.like eskimos..)Agriculture and domestication of animals changed that. You could store food for longer times, and it was plannable. That made also why populations started to really grow.
    But what happened in the last 2000 years or so, in Europe, Africa, the USA and Australia too, and especially the acceleration when firearms became common, made a big difference. Things like the mass killing of the buffalos, elephants, rhinos etc. and the massive whale killings are a big underestimated factor. Plains which were fertile turned into deserts, which has a big influence on soil temperatures.

    The co2 oil part.. real estimates are that we only have found like 30% of all usable hydro carbon reservoirs, which is not that much, considered it was all taken out of circulation too. C02. below 250 is ice age level… Plants need really a lot more to grow optimal. I do think we actually saved our self’s.
    Like you said too. A little warmer you can overcome, as most green loves warm. and most people love homes in the Caribbean. Colder will mean less food. and that’s a lot harder to overcome, with a population which grew from like 1 billion into more than 7 in less than 200 years…..
    And tesla batteries will not do the job. Maybe the fusion test plant in France will.

    I had major discussions over the last ten years. And still have.
    I refer to your great work many times. I live in the tropics on an island 100 meters from the sea. We do not even really have a temperature weather forecast. Temperature is always like 28-33 or so. We have wave heights…. Normally between 1 and 2 meters. I always laugh when people start about sea level rising in mm per year. They forget that a regular storm can create local differences of over 30 meters in hours.. I was born in the Netherlands and lived 8 meters below sea level… The west part drops up to 5 mm per year, while some parts in the east are rising. Nobody there is panicking about rising sea levels… Only about the costs. But being relatively one of the richest countries in the world, that is also a lot of fuss about nothing.

    I love to read your work. I did a lot of research too. The ice drilling part is immensely complex if you really get into it.. But yes, they too are totally dependent on funding, and fear rules over honesty and relativity.

    To put everything in writing and on the internet is a lot of hard work, especially to keep data pure.
    And I know because its part of my work, Data and information is not the same. and with data statistics you can prove anything you want. Depending on the selection.
    Unfortunately i do not have that time, Need to work hard to feed my young children i have 5, and they are like for so many, others our future. We spend trillions on arms and war. Pollute and burn and destroy more than needed. And think global warming is a threat, while probably its extra time given to learn and grow, until the next meteorite.
    So again thank you…

    An extra PC installed with SQL server, to store all the data. and to run query’s on. ??

    • Clive Best says:

      Thanks for the comments,

      Yes it was really the whole scale clearance of land for farming which killed wildlife. Deforestation for palm oil plantations and cattle grazing is continuing. It is hypocritical of us to complain when the third world starts doing the same, but somehow we need to preserve large tropical forests which breath in CO2.

      Whaling was really about oil for lamps and soap before electricity and natural/coal gas became available. It seems incredible that we decimated the population for something seemingly today so banal.

      Yes a 20cm sea level rise over 100 years is tiny compared to 5m spring tides in 6 hours. Tropical temperatures are rather unlikely to change much in the future. You might possibly get slightly more thunderstorms and rain.

      cheers

    • Mike Ellwood says:

      QUOTE: “Maybe the fusion test plant in France will.”

      And maybe it won’t:

      https://thebulletin.org/2018/02/iter-is-a-showcase-for-the-drawbacks-of-fusion-energy

      https://thebulletin.org/2017/04/fusion-reactors-not-what-theyre-cracked-up-to-be

      A better approach:

      http://www.thesciencecouncil.com/pdfs/P4TP4U.pdf

      ( “Prescription for the Planet” by Tom Blees )

      • Clive Best says:

        Let’s stick with nuclear fission then unless there is a breakthrough in fusion. e.g.

        https://www.tokamakenergy.co.uk

        • Mike Ellwood says:

          I wish them well, of course. Perhaps the naysayers regarding fusion will turn out to be as wrong as the naysayers regarding fission are (IMHO) wrong.

          @Hugo: Yes, the nuclear industry hasn’t always done itself a lot of favours in many ways. For one thing, I think at one time it suffered from the kind of hubris that we now see, ironically enough, from religious renewables advocates.

      • Hugo says:

        Well the project is not set up to be commercially successful. But it’s a very good try. All major players work together, and that’s worth something. At least they are not investing in weapons. We probably would have had a solution years ago if plutonium had not been the goal. Thorium reactors were operational..
        Even if they do no succeed. I think a spin off will.

  3. Mr Broccoli says:

    Thanks for this essay Clive. The analysis is fantastic, a real labour of, if not love, then perhaps persistence or obstinacy. It’s a shame that when people stop sitting on the fence they become deaf and so will not even read this great essay. I will give a copy to some of my brother students and hopefully prod them into thinking and discussing instead of regurgitating the “facts” that they have been taught.

    Interesting to see your calculation on the increase in precipitation. A warm wet planet, or a dry cold one, but never a warm dry one, or at least not until a’ the seas gang dry. An increase in precipitation means that on average there must be more water in the atmosphere. I know that climatologists negate the role of water because it cycles fast, but I wonder if a small increase in average water content will not be more powerful than an increase in CO2? I have read that because water absorbs heat across a broad spectrum it is about 30 times more powerful a greenhouse gas than CO2. I have also read that in the lower atmosphere water is about 2%. CO2 is currently 0.04%. If average water concentration increased to 2.01% this would affect warming much more than a small increase in CO2 concentration. Of course water is tricky because it exists in different states and different cloud types.

    • Clive Best says:

      Thanks for the comments!

      A warm wet planet is much better than a cold dry one. It is true that a warmer atmosphere can hold more water, yet the water cycle also speeds up. So more H2O gives more GHE, but more clouds will also cool the surface. Another complication is that the lapse rate reduces, changing towards the moist lapse rate of ~6.5C/1000m. So it is not clear whether the net effect of more H2O is to warm or cool the planet. Most models predict it warms, acting as a positive feedback to CO2. The effect probably isn’t even linear, but however you look at it the oceans stabilise the climate on earth.

  4. Excellent article. Thank you for your work.

  5. Ron Clutz says:

    Thanks Clive for this compendium of facts regarding the attempt to measure global temperatures. You really delivered a guide, not just for sceptics, but for anyone curious enough to learn about all the moving parts relating to this issue. I have also much appreciated your posts describing how using anomalies biases averages in favor of the more volatile temps in higher latitudes.

    Count me among those thinking it futile to pursue global mean temperature in light of all the variability over both time (daily, seasonally and longer) and place (latitude, altitude, land and sea).
    I understand that climatologists are locked into anomaly analysis, but do think it preferable to calculate slopes at the individual sensor and compile those, rather than try to average the temps themselves. Lubos Motl demonstrated this some years ago, and I did some studies also to show the usefulness for local and regional policymaking.

    An example: https://rclutz.wordpress.com/2015/04/26/temperature-data-review-project-my-submission/

    • Clive Best says:

      Ron,

      Yes, anomalies tend to lose sight of what we actually experience – local temperatures and these follow the seasons. So local temperature series keep things in perspective. Here is one of the longest series Central England Temperatures (CET) for two winter/summer months. This spans before the industrial revolution to the modern world.

  6. climanrecon says:

    A very readable and useful post, but there are several issues that I don’t agree with.

    The first one is about the variation of Tmax in Australia, which I find has increased in recent decades (at least in the South East and in New Zealand) but exhibits the classic “Pause” in the 21st century. I agree that it is Tmin that is the pause-buster.

    Here is my reconstruction of 12-month moving average Tmax in the Walgett region of NSW, this was obtained by a combination of “cutting” records at known and detected inhomogeneities, and regional averaging by the “first-difference” method (integration of average interannual temperature changes, independently for each month).


    • Clive Best says:

      I suspect we are plotting different things here. Correct me if I am wrong but I think yours is the annual ‘anomaly’ of Tmax. This does indeed show an increase. However I bet that the same thing for Tmin ‘anomaly’ shows a larger increase.

      My plots are absolute temperatures and are based on the maximum and minimum recorded temperature at each station per year. These are then area averaged over Australia. This covers all stations not just ACORN-SAT. What it shows is that on average the record temperatures haven’t changed, but the record minimum temperatures (at night in winter) have increased.

      These are the record temperatures ever recorded from all Australian stations (Albany must be spurious though! )

      • climanrecon says:

        I don’t use anomalies as such, everything is based on interannual temperature changes, with an arbitrary choice of 2018 as the zero point.

        I don’t think that “Global Temperature” is the demon that is sometimes portrayed, as all that really matters are CHANGES in temperature. If everywhere in the world changes temperature by 1C between (say) 1900 and 2000 then it would be uncontroversial to say that the Global Average Temperature Change was 1C, and an area-weighted average can give the actual value for the Average Change.

        Your absolute temperatures have the problem of changing station locations, such as many being in towns until around 1970, then being relocated to airports.

        • Clive Best says:

          I agree that temperature changes DT are what matters globally.

          Those absolute temperatures however were at fixed stations. To check my results, simply download the daily minimum and maximum temperatures from all 112 stations that form ACORN-SAT. Then spatially average all the annual maxima and minima.

          • climanrecon says:

            OK, station relocation bias doesn’t apply (if ACORN-SAT has done its thing properly, some say it hasn’t), but there is another potential bias, that not all of the locations have data back to 1910, some only start around 1960, especially the ones in the very hot interior.

  7. Steven Crow says:

    Superb research and fine writing. Where can I find a biography of Clive Best?

  8. dpy6629 says:

    Clive, its a very clear and detailed post. Thanks for doing the work!!

  9. Planting trees may not be the answer, certainly not in the Amazon. Recent studies show a massive methane release from the Amazon. Using the short-term (~2 year) multiplier of 84X on methane for CO2 equivalence (as the Amazon continually emits methane):

    42.7 Tg CH4 annually (Pangala, et al, author featured in article) = 3.6 billion tons CO2 equivalent (convert to ton, multiply x 84)
    Amazon uptake = 2.2 billion ton uptake of CO2, annually (NASA, Espirito Santo)

    1.4 billion ton CO2 equivalent NET release!

    https://e360.yale.edu/features/scientists-probe-the-surprising-role-of-trees-in-methane-emissions

  10. Al Shelton says:

    Do you believe in the GHG Theory?

    The UN IPCC said that a doubling of CO2 from 400ppm [parts per million] to 800ppm would result in a global increase of temperature of about 2C degrees.
    The increase is 400ppm. and that is equal to 1 in 2500. Right?
    That is 1 molecule of CO2 in every 2500 molecules of air.
    How can 1 molecule of CO2 “trap” enough “heat” to heat the other 2499 molecules of Nitrogen[N2] and Oxygen[O2] 2 C degrees. The CO2 molecule would have to be nearly as hot as the surface of the sun.
    That is totally absurd. Please explain.
    Thank you

  11. Dan Pangburn says:

    Humanity added to natural warming, but from increased water vapor, not increased CO2. NASA/RSS have been measuring water vapor by satellite and reporting it since 1988 at http://www.remss.com/measurements/atmospheric-water-vapor/tpw-1-deg-product. Fig 3 in my blog/analysis at http://globalclimatedrivers2.blogspot.com is a graph of the NASA/RSS numerical data. When normalized by dividing by the mean, the NASA/RSS data are corroborated by NCEP R1 and NCEP R2.

    Blinded by a misguided focus on the increase in CO2, ‘climate science’ has apparently failed to notice that in the period 1988-2002 about 5 water vapor molecules were added for each CO2 molecule. Since 1900, on average, about 3.6 WV molecules were added for each CO2 molecule. The WV increased about twice as fast as calculated from the average global temperature increase (calculation in Section 8).

    According to Spectracalc/Hitran, at zero altitude there are about 24 H2O molecules for each CO2 molecule and each H2O molecule is about 5 times as effective as a CO2 molecule at absorb/emit of thermal (LWIR) radiation emitted from earth surface. https://pbs.twimg.com/media/ECWhyyDUYAA1P89?format=jpg&name=medium

    A brief explanation of how CO2, in spite of being a ghg has no significant effect on climate is in the last paragraph of Section 2 in my b/a. A more detailed explanation is provided in http://diyclimateanalysis.blogspot.com as follows: (2nd paragraph after Figure 1): “Well above the tropopause, radiation [emitted from molecules there] to space is primarily from CO2 molecules. If you ignore the increase in water vapor near the surface vs time (big mistake), WV averages about 10,000 ppmv. The increase since 1900 in absorbers at ground level is then about 10,410/10,295 = ~ 1%. WV above the tropopause is limited to about 32 ppmv because of the low temperature (~ -50 °C) while the CO2 fraction remains essentially constant with altitude at 410 ppmv; up from about 295 ppmv in 1900. The increase in emitters to space at high altitude (~> 30 km, 0.012 atm), and accounting for the lower atmospheric pressure, is (410 + 32)/(295 + 32) * 0.012 = ~ 1.6%. This easily explains why CO2 increase does not cause significant warming (except at the poles) and might even cause cooling. The exception at the poles (about 13% of earth area) is because it’s cold there at ground level so WV is already very low.”

    Summary, the good news:
    1. WV increase is self-limiting so no catastrophe from warming.
    2. The increasing water vapor is delaying the average global temperature decline expected by many as a result of the quiet sun and eventual decline of net of ocean surface temperature cycles.
    3. CO2 increase has increased plant growth (i.e. food) by at least 15%.

  12. DP said:
    “Humanity added to natural warming, but from increased water vapor”

    This is idiotic. Absolute humidity increases with temperature so there is no way you can flat out assert causality like that. Don’t be such a knob.

  13. Jacques Hagoort says:

    Impressive overview, very well written. One question: according to IPCC SR15 (2018) the temperature in the period 2006-2015 has increased 0.87C since pre-industrial times. How do you reconcile that with your statement that the 1.5C limit was breached already in 2016?

    • Clive Best says:

      The temperature increase depends on which data series you use. So if you base it on the conservative HadCRUT4 data then decadal temperatures have risen by 0.87 ± 0.05 C since the 19th century. However if you use the latest GHCN-V4 combined with HadSST4 and cover the whole globe then the apparent temperature increase is much larger and transiently exceeded 1.5C in the 2016 El Nino year.

      That doesn’t mean that either one is “correct” but rather shows just how large the systematic error is when defining net “warming”.

  14. JPinBalt says:

    A lot of the source land thermometer data is biased by bad station location, nearby growing urban areas, air conditioners put in at some point close by, more cars and concentrated surface urban ozone levels trapping heat, parking lots or new buildings added nearby, another runway added at the airport, or simply the urban heat island effect. Anthony Watts has documented many crap stations still going into official statistics or contaminating data.

    Knowing this, in 2005 they set up for the US a network of pristine, state-of-the-art weather stations away from humans, sited in areas thought likely to remain undeveloped in future. The time series is not that long: USCRN.
    Here is the data set from those USCRN stations:
    https://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets%5B%5D=uscrn&parameter=anom-tavg&time_scale=p12&begyear=2005&endyear=2019&month=5
    which show no warming trend to date since 2005.

    I also seriously question any adjusted and imputed data put out by mathematician and Hansen protégé Gavin Schmidt at NASA GISS, which has become politically aimed at producing a warming trend; I usually look at UAH satellite data.

  15. JonA says:

    Nice article Clive. Has anyone ever used historical barometric maps
    in combination with pair-wise homogenization? Weather fronts could
    easily explain some of the ‘wrong’ values being discarded in
    regional comparisons.

    • Clive Best says:

      Thanks.

      I doubt whether anyone has looked at historic weather patterns. However the temperature averages are monthly values, and it is assumed that weather differences average out. This of course depends on how far away the stations are from each other.

  16. nobodysknowledge says:

    Impressive analysis. I had a reference to it at a discussion at Science of Doom, to present the change in nighttime temperatures. https://scienceofdoom.com/2019/03/31/opinions-and-perspectives-9-pattern-effects/#comment-143997
    One thing that i would emphasize is the cooling of the sea surface of Antarctica. This goes in the opposite direction of a point of no return of melting icesheets. “There seems to be a pattern of cooling sea surface temperatures around Antarctica, south of ca 50 degS. If you look at Nullschool SST anomalies this is clearly shown. Looks like a trend, perhaps linked to the temperature inversion over south polar area? It should be interesting to know if this is a seasonal trend.” and ” It seems that most of the year the SST shows values under the freezing point along the shores of Antarctica.”
    From wikipedia:”In fact, the troposphere over the Antarctic plateau is nearly isothermal. Both observations and calculations show a slight “negative greenhouse effect” – more radiation emitted from the TOA than the surface. Although records are limited, the central Antarctic Plateau has seen little or no warming.” Schwarzschild’s equation for radiative transfer.

    • Clive Best says:

      Thanks,

      There are some interesting comments on SoD’s post!

      The greenhouse effect depends on a positive lapse rate. In Antarctica (and the Arctic in winter) the tropopause almost disappears and the stratosphere touches the ground. Increasing CO2 will indeed increase radiative cooling !

  17. Liz Falconer says:

    Hi Clive,
    Thanks so much for this blog. I’m no climate scientist and the science does make my eyes cross at some points 🙂 But, you make the arguments very clear and you are really balanced in how you interpret data – I have struggled so hard to find that for ages, it’s such a relief to finally do so!
    My PhD was in risk decision making, and the effects of worldview and risk perception on individual and group responses to risk. I then moved on to researching immersive virtual technologies and how we can experience a sense of place in them, continuing on from my earlier work in how we model the world in our minds. AGW passed the tipping point (sorry!!) from scientific enquiry to belief system some years ago, and the polarisation that has been occurring since saddens me, but is also a fascinating case study in risk perception.
    I guess my main concern is for the continuation of proper scientific method, where falsification is seen as being an important part of that process and not quashed as heresy. Publication bias is a well known consequence of consensus bias, and they mutually reinforce each other. I’ve been reading some of the ‘if deniers (hate that term) are so right, why aren’t they publishing more?’ comments on AGW supporting websites – the commentators clearly don’t know how academic publishing works, or are conveniently forgetting!
    That’s why I had to write to say thank you for this blog and to appreciate the enormous amount of work you clearly put into it.
    Thanks again,
    Liz.
