The Atmosphere is getting heavier

The earth’s atmosphere is gaining mass due to fossil fuel burning. When we burn coal we add extra carbon atoms to the atmosphere in the form of CO2. For every O2 molecule that we take out of the atmosphere we release back a CO2 molecule, i.e. the same O2 with an extra carbon atom tacked on. The net effect is to increase the total mass of the atmosphere, resulting in a net increase in atmospheric pressure. How large is this effect, and are there any long term consequences? I decided to look into this after a Twitter exchange. All estimates and any errors are my own. First some facts.

  • Molar mass of CO2 is 44 g/mol
  • Molar mass of O2 is 32 g/mol
  • Mean molar mass of air is 29 g/mol

CO2 levels have increased by about 43% since 1750. This means that about 0.14% of atmospheric oxygen has been converted to CO2. This is also confirmed by measurements.

[Figure: measured decline in atmospheric O2]

O2 levels are falling by about 19 per meg (roughly 4 parts per million) each year. This has no effect on nature or on human health, but it is still significant.

So the net fractional increase in mass of the oxygen component is 0.0014*(44-32)/32 ≈ 0.0005. Oxygen makes up about 21% of the atmosphere by volume and about 23% of its mass. Therefore the increase in atmospheric mass caused so far by fossil fuel burning is about 0.012%, which works out at ~ 6 x 10^14 kg.

This figure is roughly 40% of the annual variation in atmospheric mass due to water vapor (~1.5 x 10^15 kg), so it is certainly not negligible. This increase in total mass M implies a proportional increase in surface pressure P_s through the hydrostatic relationship
P_s = \frac{Mg}{A}
where A is the surface area of the earth. Therefore average surface pressure has increased by ~0.12 mb. Does such a small increase matter? What effects, if any, will this extra mass have?
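Before looking at the effects, the arithmetic can be checked with a few lines of Python. This is only a minimal sketch: the total atmospheric mass (~5.1 x 10^18 kg) and mean surface pressure (~1013 mb) are standard values I have assumed, everything else comes from the figures above.

```python
# Back-of-envelope check of the extra atmospheric mass from fossil fuel burning.
M_CO2 = 44.0               # molar mass of CO2, g/mol
M_O2 = 32.0                # molar mass of O2, g/mol
f_converted = 0.0014       # fraction of atmospheric O2 converted to CO2 since 1750
O2_mass_fraction = 0.23    # O2 share of total atmospheric mass
M_atm = 5.1e18             # total mass of the atmosphere, kg (standard value)
P_surface = 1013.0         # mean surface pressure, mb

# Each converted O2 molecule gains one carbon atom (12 g/mol).
gain_per_O2 = (M_CO2 - M_O2) / M_O2                        # ~0.375
frac_increase = f_converted * gain_per_O2 * O2_mass_fraction

print(f"fractional mass increase : {frac_increase:.2e}")                  # ~1.2e-04
print(f"extra mass               : {frac_increase * M_atm:.2e} kg")       # ~6e+14 kg
print(f"surface pressure increase: {frac_increase * P_surface:.2f} mb")   # ~0.12 mb
```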

  • Firstly, the slight increase in surface pressure combined with a ~40% increase in the partial pressure of CO2 will increase the absorption of CO2 by the world’s oceans.
  • Secondly, the extra CO2 molecules will lead to an enhancement in the dissociation of CO2 by UV into CO and O in the stratosphere.
  • Thirdly, a higher concentration of CO2 will lead to enhanced rock weathering through the dissolving of CO2 in rainwater.

All these tend to increase the natural sinks that remove CO2 from the atmosphere over the long term. The fraction of CO2 emissions retained in the atmosphere has remained stable at about 42%, so slightly more than half of each year’s emissions is absorbed. This fraction has not noticeably changed for several decades. More subtle effects I can think of are as follows.

  • Barometric pressure falls with height as P = P_0 \exp\left(-\frac{M_a g z}{RT}\right). There has been a very small increase in M_a (the mean molar mass of air), which will therefore slightly decrease the scale height RT/(M_a g). The troposphere shrinks a little.
  • There would also be a very small increase in the dry adiabatic lapse rate \Gamma = -\frac{g}{C_p}, because C_p for CO2 is about 16% smaller than that of ‘air’.

The first effect would tend to offset the enhanced CO2 greenhouse effect, whereas the second would enhance it further, although this lapse rate change (a few parts in a hundred thousand) is essentially negligible. In any case, I doubt whether any of this is included in any climate model.
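A rough sketch of these two effects, assuming an isothermal atmosphere at 255 K and the mass-fraction figures estimated above. The change in mean molar mass is taken to equal the fractional mass increase, since each O2 molecule is simply replaced by one CO2 molecule.

```python
# Rough estimate of the change in scale height and dry adiabatic lapse rate.
R = 8.314            # gas constant, J/(mol K)
g = 9.81             # m/s^2
T = 255.0            # representative mean temperature, K (assumption)
M_air = 0.029        # mean molar mass of air, kg/mol
dM_frac = 1.2e-4     # fractional increase in mean molar mass (from the estimate above)

H_old = R * T / (M_air * g)                    # scale height, ~7.5 km
H_new = R * T / (M_air * (1 + dM_frac) * g)
print(f"scale height shrinks by ~{H_old - H_new:.1f} m")   # roughly 1 m

# Dry adiabatic lapse rate Gamma = g/c_p for the air + extra-CO2 mixture.
cp_air = 1005.0      # J/(kg K)
cp_CO2 = 844.0       # J/(kg K), ~16% lower than air
w_extra = 1.8e-4     # extra CO2 mass fraction since 1750 (~120 ppm x 44/29)
cp_mix = (1 - w_extra) * cp_air + w_extra * cp_CO2
print(f"fractional lapse rate increase: {cp_air / cp_mix - 1:.1e}")   # ~3e-05
```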

Update: corrected for Molar mass of CO2 (44 not 41)


Climate Change spiralling out of control

This is an updated animation, made to look similar to that of Ed Hawkins. Click on the image to view the full animation.

The Eemian interglacial was about 3C warmer than today. If we can learn to control CO2 levels then perhaps we can avoid the next devastating glaciation, due to start in a few thousand years’ time. We would then need to keep CO2 > 400 ppm!


Land Temperature Anomalies.

This post examines the methodologies and corrections that are applied to derive global land temperature anomalies. All the major groups involved use the same core station data but apply different methodologies and corrections to the raw data. Although the overall results are confirmed, it is shown that up to 30% of the observed warming is due to adjustments and the urban heat effect.

What are temperature anomalies?

If there were perfect coverage of the earth by weather stations then we could measure the average temperature of the surface and track any changes with time. Instead there is an evolving set of incomplete station measurements, both in place and time, and this causes biases. Consider a 5°x5° cell which contains a two-level plateau rising above a flat plain at sea level. Temperature falls by about 6.5C per 1000 m of height, so the real temperatures at the different levels would be as shown in the diagram. The correct average surface temperature for that grid cell would therefore be roughly (3*20+2*14+7)/6, or about 16C. What is actually measured depends on where the sampled stations are located. Since the number of stations and their locations are constantly changing with time, there is little hope of measuring any underlying trend in the absolute average temperature. You might even argue that an average surface temperature, in this context, is a meaningless concept.

The mainstream answer to this problem is to use temperature anomalies instead. Anomalies are typically defined relative to monthly ‘normal’ temperatures over a 30-year period for each station. CRU use 1961-1990, GISS use 1951-1980 and NCDC use the full 20th century. Then, in a second step, these ‘normals’ are subtracted from the measured temperatures to give ΔT, the station ‘anomaly’, for any given month. These are averaged within each grid cell and then combined in an area-weighted average to derive a global temperature anomaly. Any sampling bias has not really disappeared, but it has been mostly subtracted out. There is still the assumption that all stations within a cell react in synchrony to warming (or cooling). The procedure also introduces a new problem for stations without sufficient coverage in the 30-year baseline period, invalidating some of the most valuable older stations. There are methods to avoid this based on minimizing the squares of station offsets, but these rely on least squares fitting to adjust average values. The end result, however, changes little whichever method is used.
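A minimal sketch of this procedure in Python. The column names and the pandas DataFrame layout are my own assumptions, not those of any particular group.

```python
import numpy as np
import pandas as pd

def global_land_anomaly(df, base_start=1961, base_end=1990):
    """Global land anomaly from monthly station data.
    df columns (assumed): station, lat, lon, year, month, temp (deg C)."""
    # 1. Monthly 'normals' per station over the baseline period.
    base = df[(df.year >= base_start) & (df.year <= base_end)]
    normals = base.groupby(['station', 'month'])['temp'].mean().rename('normal')

    # 2. Subtract the normals to get each station's monthly anomaly.
    #    Stations without baseline coverage drop out here - the problem noted above.
    df = df.merge(normals.reset_index(), on=['station', 'month'])
    df['anom'] = df['temp'] - df['normal']

    # 3. Average the station anomalies within each 5x5 degree grid cell.
    df['latg'] = 5 * np.floor(df.lat / 5) + 2.5
    df['long'] = 5 * np.floor(df.lon / 5) + 2.5
    cells = df.groupby(['year', 'latg', 'long'])['anom'].mean().reset_index()

    # 4. Area-weighted (cos latitude) average of the occupied cells, per year.
    cells['w'] = np.cos(np.radians(cells['latg']))
    return cells.groupby('year').apply(lambda g: np.average(g['anom'], weights=g['w']))
```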

The situation in 1990

CRU had developed their algorithms several years before NCDC first released version 1 (V1) of their GHCN station data in 1989. This contained data from about 6000 stations, many of which overlapped with stations provided by CRU. I have re-calculated the yearly anomalies from V1 and compared them to the then available CRU data (Jones 1988), as shown in Figure 1. The first IPCC assessment (FAR) was written in 1990, and at that time there was essentially no firm evidence for global warming. This explains why the conclusion of FAR was that any warming signal had yet to emerge.

Figure 1: First release of GHCN (1990) compared to Jones et al. 1988.

We see that all the evidence for global warming has arisen since 1990.

Situation Today

The current consensus that land surfaces have warmed by about 1.4C since 1850 is due to two effects. Firstly, the data from 1990 to 2000 consistently show a steep rise in temperature of ~0.6C. Secondly, the data before 1950 have become cooler, and this second effect is a result of data correction and homogenization.

The latest station temperature data available are those from NCDC (V3) and from Hadley/CRU (CRUTEM4.3). NCDC V3C is the ‘corrected & homogenized’ data while V3U is still the raw measurement data. I have calculated the annual temperature anomalies for all three datasets: GHCN V1, V3U (uncorrected) and V3C (corrected). The CRUTEM4 analysis is based on 5549 stations, the majority of which are identical to GHCN. In fact CRU originally provided their station data as input to GHCN V1.

Figure 2 shows the comparison between the raw data (V3U) and CRUTEM4, while Figure 3 compares V1 and V3U with the corrected V3C results.

Figure 2: Comparison of V3 raw with CRUTEM4.3.

Figure 3: Comparison of V1, V3U and V3C global temperature anomalies. V3C are the red points.

The uncorrected data show ~25% less warming than V3C. This is the origin of the increased ‘cooling’ of the past. The corrections and data homogenization have mostly affected the older data by ‘cooling’ the 19th century. There is also a small increase in recent anomalies in the corrected data. A comparison of V3C, GISS, BEST and CRUTEM4 gives a consistent picture of warming of ~0.8C on land surfaces since 1980. Data corrections have had a relatively smaller effect since 1980.

Figure 4: Comparison of all the main temperature anomaly results.

Relationship of Land temperatures to Global temperatures

How can the land temperature trends be compared to ocean temperatures and global temperatures? The dashed curves in Figure 4 are smoothed by a 4-year wavelength FFT filter. We then average all these together to give one ‘universal’ trend, shown as the grey curve. The result is a net 0.8C rise since 1950 with a plateau since 2000 (the Hiatus). Let’s call this trend L.

Now we can take the ocean temperature data (HadSST2) and perform the same FFT smoothing, giving an ocean trend ‘O’. We can then define a global trend ‘G’ based on the fractions of the earth’s surface covered by land and by ocean.

G = 0.69*O + 0.31*L

Next we can simply plot G and compare it to the latest annual HadCRUT4 data.
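The smoothing and blending step can be sketched as follows. The input series here are synthetic placeholders standing in for the land and HadSST2 data, and the FFT low-pass filter simply zeroes all components with periods shorter than 4 years.

```python
import numpy as np

def fft_smooth(y, cutoff_years=4.0):
    """Low-pass filter an annual series: zero all FFT components
    with periods shorter than cutoff_years."""
    freqs = np.fft.rfftfreq(len(y), d=1.0)       # cycles per year
    Y = np.fft.rfft(y)
    Y[freqs > 1.0 / cutoff_years] = 0.0
    return np.fft.irfft(Y, n=len(y))

# Placeholder annual anomaly series standing in for the real land and ocean data.
years = np.arange(1850, 2016)
rng = np.random.default_rng(0)
land = 1.0 * (years - 1850) / 165 + 0.2 * rng.standard_normal(len(years))
ocean = 0.6 * (years - 1850) / 165 + 0.1 * rng.standard_normal(len(years))

L = fft_smooth(land)
O = fft_smooth(ocean)
G = 0.69 * O + 0.31 * L     # ocean covers ~69% of the surface, land ~31%
```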

[Figure: the smoothed global trend G compared with annual HadCRUT4 anomalies]

It is a perfect fit! This demonstrates an overall consistency between the ocean, land and global surface data. Next we look at other possible underlying biases.

CRUTEM3 and CRUTEM4

There are marked differences between CRUTEM3 and CRUTEM4, for two main reasons. Firstly, CRU changed the method of calculating the global average: in CRUTEM3 they used (NH+SH)/2 but switched to (2NH+SH)/3 for CRUTEM4. The argument for this change is that land surfaces in the NH have about twice the area of those in the SH. The second reason was the addition of ~600 new stations in Arctic regions for CRUTEM4. Arctic anomalies are known to be increasing faster than elsewhere. This is also the main reason why 2005 and 2010 became slightly warmer than 1998; in CRUTEM3 the hottest year is still 1998.
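A toy example (made-up hemispheric values) shows why the first change, the weighting, matters when the NH warms faster than the SH:

```python
# Hypothetical hemispheric land anomalies in deg C.
nh, sh = 1.0, 0.4
crutem3_style = (nh + sh) / 2        # 0.70 C
crutem4_style = (2 * nh + sh) / 3    # 0.80 C
print(crutem3_style, crutem4_style)
```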

This process of infilling the Arctic with more data has been continued with the Cowtan and Way [3] corrections to CRUTEM4. This is possible because regular (lat, lon) grid cells multiply fast (and shrink in area) towards the poles.

Figure 5: Differences between CRUTEM3, CRUTEM4 and different global averaging methodologies. All results are calculated directly from the relevant station data.

To summarise

  1. ~0.06C  is due to a large increase in Arctic stations
  2. ~0.1C is simply due to the change in methodology

Arctic stations have large seasonal swings in temperature of 20-40C. Interestingly the increase in anomalies seems to occur in winter rather than in summer.

Urban Heat Effect

Figure 6: Growing cities appear to be ‘cooler’ in the past when converted to temperature anomalies.

The main effect of Urban Heat Islands (UHI) on global temperature anomalies is to ‘cool’ the past. This may seem counter-intuitive, but the inclusion of stations in large growing cities has introduced a small long-term bias in global temperature anomalies. To understand why this is the case, consider a hypothetical rapid urbanization causing a 1C rise relative to a nearby rural station after ~1950, as shown in Figure 6.

All station anomalies are calculated relative to a fixed set of 12 monthly normals, e.g. 1961-1990. This is independent of any relative temperature differences between stations. Even though a large city like Milan is up to 3C warmer than the surrounding area, this makes no difference to their relative anomalies.

Figure 7: Comparison of V3U (blue), V3C (red) and CRUTEM4 (green). The top graph shows the adjustments.

Such differences are normalized out once the seasonal averages are subtracted. As a result, ‘warm’ cities, when plotted as anomalies, appear to be ‘cooler’ than the surrounding areas before 1950. This is just another artifact of using anomalies rather than absolute temperatures. A good example of this effect is Sao Paolo, which grew from a small town into one of the world’s most populated cities. There are many other similar examples (Tokyo, Beijing etc.).
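This renormalization effect can be demonstrated with a toy example (entirely synthetic numbers): a city that warms by 1C around 1950 relative to a flat rural record ends up with anomalies about 1C ‘cooler’ before the warming, once both are referenced to 1961-1990 normals.

```python
import numpy as np

years = np.arange(1900, 2021)

# Synthetic annual mean temperatures (deg C): a flat rural record, and a city
# that warms by 1 C through rapid urbanization centred on 1950.
rural = np.full(len(years), 15.0)
city = 16.0 + 1.0 / (1.0 + np.exp(-(years - 1950) / 5.0))

def anomaly(temps, base=(1961, 1990)):
    """Anomaly relative to the mean over the baseline years."""
    mask = (years >= base[0]) & (years <= base[1])
    return temps - temps[mask].mean()

print("rural anomaly in 1900:", round(anomaly(rural)[0], 2))   # 0.0
print("city anomaly in 1900 :", round(anomaly(city)[0], 2))    # about -1.0
```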

Santiago, Chile. NCDC corrections increase ‘warming’ by ~1.5C compared to CRUTEM4.

To investigate this effect globally, I simply removed the 500 stations showing the largest net anomaly increase since 1920 and then recalculated the CRUTEM4 anomalies. Many of these ‘fast warming’ stations are indeed large cities, especially in developing countries; they include Beijing, Sao Paolo, Calcutta, Shanghai etc. The full list can be found here.
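The selection step can be sketched as follows. This assumes a hypothetical DataFrame `anoms` of yearly station anomalies with columns station, year and anom, and it is only one possible way to operationalize “largest net anomaly increase since 1920”; the global series would then be recomputed with the gridding routine sketched earlier.

```python
def warming_since(anoms, start=1920, recent=2000):
    """Rank stations by their net anomaly increase between the decade after
    `start` and the years from `recent` onwards."""
    early = anoms[anoms.year.between(start, start + 10)].groupby('station')['anom'].mean()
    late = anoms[anoms.year >= recent].groupby('station')['anom'].mean()
    return (late - early).dropna().sort_values(ascending=False)

# fastest500 = warming_since(anoms).head(500).index
# reduced = anoms[~anoms.station.isin(fastest500)]
# ...then regrid and re-average `reduced` as before.
```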

Figure 8: The red points are the standard CRUTEM4.3 analysis of all 6250 stations. The blue points are the result of the same calculation after excluding the 500 stations showing the greatest warming since 1920. These include many large growing cities.

The results show that recent anomalies are little affected by UHI. Instead it is the 19th century that has been systematically cooled, by up to 0.2C. This is because the large cities had mostly already warmed before 1990. Anomalies just measure ΔT relative to a fixed baseline period.

Data Adjustments and Homogenization.

There are often good reasons to make corrections to individual station data. Some measurements are clearly wrong, generating a spike, and these ‘typo’ errors are easy to detect and correct manually. Data homogenization, however, is now done by automated procedures, which can themselves introduce biases. Homogenization smooths out statistical variations between nearby stations and accentuates underlying trends. Systematic shifts in temperature measurements do occur for physical reasons, but detecting them automatically is risky. For example:

  1. A station can move location, causing a shift, especially if its altitude changes.
  2. New instrumentation may be installed.
  3. The time of observation (TOBS) can change; this is mainly a US problem.

The NCDC algorithm is based on pairwise comparisons with nearby sites to ‘detect’ such shifts in values. However, if this process is too sensitive then small shifts will be applied regularly, and this seems to be the case in the ~2000 stations I have looked at. The shifts are systematically more negative in the early periods. A priori there is no reason why temperatures 100 years ago could not have been as warm as today, yet the algorithm systematically shifts such cases down. One exception is some large cities, such as Tokyo, where a strong warming trend is flagged as unusual. As discussed above, the net effect on anomalies is still to cool the early periods.
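The idea behind pairwise detection can be illustrated with a toy example. This is emphatically not the NCDC algorithm, just the basic principle: form the difference series between a station and the average of its neighbours, and look for a step change in its mean.

```python
import numpy as np

def detect_step(target, neighbour_mean, min_seg=10):
    """Toy breakpoint detector: find the split point of the difference series
    (target minus neighbour average) that maximizes the jump between the
    means of the two segments. Returns (index, step size)."""
    diff = target - neighbour_mean
    best_i, best_step = None, 0.0
    for i in range(min_seg, len(diff) - min_seg):
        step = diff[i:].mean() - diff[:i].mean()
        if abs(step) > abs(best_step):
            best_i, best_step = i, step
    return best_i, best_step

# Synthetic 120-year example: a station record that jumps by +0.5 C at index 60.
rng = np.random.default_rng(1)
neighbours = 0.01 * np.arange(120) + 0.1 * rng.standard_normal(120)
station = neighbours + 0.1 * rng.standard_normal(120)
station[60:] += 0.5
print(detect_step(station, neighbours))   # index near 60, step near +0.5
```

A homogenization scheme would then subtract such a step from one side of the breakpoint; the danger discussed above is what happens when these shifts are applied too readily.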

Some examples are shown below. Vestmannaeyja is an Icelandic station. The graphs shown are green for CRUTEM4, red for V3C and blue for V3U.

The algorithm has detected a ‘shift’ around 1930 lowering measured temperatures before that date.

The graphs in each figure are ordered as follows. The bottom graph shows the monthly average temperatures. The next two graphs show the monthly anomalies, followed by the yearly anomalies. Finally, the top graph shows the adjustments made to the yearly anomalies by NOAA (red) and CRU (green).

Santiago, Chile. Apparent warming has been mostly generated by the NCDC adjustments themselves. Is this justified?

Most of the NOAA adjustments are simple step functions that shift a block of measurements up or down. These are the sort of corrections expected for a station move, but they have been applied to all stations. The algorithm searches out unusual trends relative to nearby stations and then shifts them back towards the expected trends. It is mostly the early data that get corrected, and the corrections are predominantly negative. This explains the origin of the ~25% increase in net global warming for V3C compared to the raw data V3U. Some changes are justified whereas others are debatable. One example where the correction algorithm generated an apparent warming trend that is not present in the measurement data is Christchurch in New Zealand.

Christchurch, New Zealand. The top graph shows the shifts introduced by the NCDC correction algorithm. There seems to be no apparent justification for the two upward shifts made in 1970 and 1990. Without these shifts there is no warming trend in the data after 1950.

The pairwise algorithm looks for anomalous changes in temperature data by comparing measurements with nearby stations. Since some of those stations are themselves affected by UHI, there remains the suspicion of a past cooling bias also infecting the correction algorithm.

Conclusions

Measuring global warming based on temperature anomalies is a bit like deciding whether the human population is getting taller based on a random sample of height measurements. Despite this, there can be little doubt that land surfaces have warmed by about 0.8C since 1970, or that since 2000 there has been little further increase.

Data correction and homogenization as applied to station data have suppressed past temperature anomalies and slightly increased recent ones. The net result is to have increased apparent global warming since the 19th century by 25%.

The urban heat effect also lowers apparent past temperature anomalies by ~0.2C while leaving recent values mostly unchanged. This is due to a re-normalization bias of the anomaly zero line.

CRU changed their definition of the land global average in the transition from CRUTEM3 to CRUTEM4. This, together with the addition of ~600 new Arctic stations, explains the increase in recent temperature anomalies.

Data sources:

  1. NOAA GHCN V3 : https://www.ncdc.noaa.gov/ghcnm/v3.php
  2. CRUTEM4 : http://www.metoffice.gov.uk/hadobs/crutem4/data/download.html
  3. Cowtan & Way : http://www-users.york.ac.uk/~kdc3/papers/coverage2013/