Carbon Budgets

It started as a nice simple idea: there is a finite amount of carbon that humanity can burn before the planet warms above 2C. This idea was based on AR5 Earth System Models (ESMs) ‘showing’ that the relationship between global temperatures and cumulative emissions was linear. At last the IPCC had something easy for world leaders to understand! This was all neatly summarised in Figure SPM.10, shown below. The Paris Accord is essentially derived from this one figure.

The problem, though, is that it wasn’t really true. After all, how could something so complex end up being a simple linear relationship? The difficulties arise once you try to work back from emissions data to CO2 concentrations. One only needs to look at Millar et al.’s earlier paper to see that there were underlying problems.

A second important feature of ESMs is the increase in airborne fraction (the fraction of emitted CO2 that remains in the atmosphere after a specified period) over time in scenarios involving substantial emissions or warming (Friedlingstein et al., 2006). An emergent feature of the CMIP5 full complexity ESMs appears to be that this increase in airborne fraction approximately cancels the logarithmic relationship between CO2 concentrations and radiative forcing, yielding an approximately linear relationship between cumulative CO2 emissions and CO2-induced warming.

However, their paper then found that AR5 carbon cycle models could not actually reproduce the Mauna Loa CO2 concentrations from the historic emissions data. In short, they needed to use a lower, stable airborne fraction to reproduce historic CO2 levels. I found exactly the same problem when applying the Bern model (see Carbon Circular Logic).
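The airborne-fraction bookkeeping is simple enough to check directly. Here is a minimal Python sketch, with an illustrative emissions and CO2 series standing in for the real Houghton and Mauna Loa data (the ~2.12 GtC-per-ppm conversion factor is standard):

```python
# Sketch: deriving the airborne fraction from emissions and observed CO2.
# The input series below are illustrative, NOT the actual data sets.

GTC_PER_PPM = 2.12  # ~2.12 GtC of airborne carbon corresponds to 1 ppm CO2

def airborne_fraction(emissions_gtc, co2_ppm):
    """Fraction of cumulative emissions remaining in the atmosphere each year.

    emissions_gtc: annual emissions in GtC
    co2_ppm: CO2 concentrations in ppm, starting with the baseline year,
             so len(co2_ppm) == len(emissions_gtc) + 1
    """
    c0 = co2_ppm[0]
    fractions = []
    cumulative = 0.0
    for e, c in zip(emissions_gtc, co2_ppm[1:]):
        cumulative += e
        observed_rise_gtc = (c - c0) * GTC_PER_PPM  # ppm rise back to GtC
        fractions.append(observed_rise_gtc / cumulative)
    return fractions

# Illustrative: constant 10 GtC/yr emissions, CO2 rising 2.1 ppm/yr
emissions = [10.0] * 5
co2 = [400.0 + 2.1 * i for i in range(6)]
print(airborne_fraction(emissions, co2))
```

With emissions held at 10 GtC/yr and CO2 rising 2.1 ppm/yr, every year gives a fraction of about 0.445 – a constant airborne fraction, which is the behaviour the observations show.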

Their new 1.5C carbon budget paper then found that models driven by 2015 cumulative emissions were 0.3C warmer than reality (1.3C as opposed to 0.93C). Naturally this was then reported across the world’s media as showing that climate models were running too hot and that global warming had been exaggerated. The climate science community then reacted in horror, with a string of denials by their peers:

which all culminated in a climb down by the authors: Authors respond to misinterpretations of their 1.5C carbon budget paper

The basic underlying problem is that climate models are CO2 concentration based, not emissions based. ESMs are required to derive atmospheric CO2 concentrations from carbon emissions, and Millar et al. showed conclusively that they fail to do this. That is why the models appear warmer than reality for a given carbon budget: the models run too hot against carbon emissions.

The real result of the Millar et al. paper is that CMIP5 Earth System Models (ESMs) are not handling the carbon cycle correctly. For a given emissions scenario, ESMs predict too high an atmospheric CO2 concentration in any given year. This is because the models produce a higher airborne fraction than reality, and they assume it is slowly increasing (it isn’t).

Temperatures in the Millar et al. paper are all plotted against cumulative emissions, NOT against CO2 concentrations. It is perfectly true that the models are 0.3C warmer than measurements for the same cumulative emissions. Compare the models and temperature measurements against atmospheric CO2 instead, and the climate models appear to be doing fine, albeit still somewhat on the warm side.

This then explains why the authors find that the remaining carbon budget to reach 1.5C is now much larger. We need more emissions than AR5 predicted to bring CO2 levels up to the point where the models reach 1.5C. As a direct result, the remaining budget has risen from ~50 GtC to ~200 GtC: an increase of a factor of 4!
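The budget arithmetic follows directly from the assumed linear (TCRE-style) relationship between cumulative emissions and warming. A minimal sketch, using an illustrative TCRE value rather than the paper’s exact figures:

```python
# Sketch of the linear carbon-budget arithmetic. The TCRE value here is
# illustrative (an assumed 1.5 C per 1000 GtC), not the paper's number.
TCRE = 1.5 / 1000.0  # degrees C of warming per GtC of cumulative emissions

def remaining_budget(target_c, warming_so_far_c, tcre=TCRE):
    """Remaining emissions (GtC) before target warming is reached,
    assuming warming is linear in cumulative emissions."""
    return (target_c - warming_so_far_c) / tcre

# Starting from the model-implied 1.3 C leaves a small budget;
# starting from the observed 0.93 C leaves a much larger one.
print(remaining_budget(1.5, 1.3))   # ~133 GtC
print(remaining_budget(1.5, 0.93))  # ~380 GtC
```

Starting the calculation from the model temperature (1.3C) rather than the observed one (0.93C) shrinks the remaining budget severely, which is the whole dispute in a nutshell.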

About Clive Best

PhD in High Energy Physics. Worked at CERN, Rutherford Lab, JET, JRC, OSVision.
This entry was posted in climate science, GCM, IPCC, Science. Bookmark the permalink.

17 Responses to Carbon Budgets

  1. Ron Clutz says:

    Clear concise summation. Thanks for distilling the recent brouhaha down to the real issue. You may be interested to see that the INMCM4 model which was “best of breed” in replicating GMT fluctuations has been upgraded to INMCM5, and a carbon cycle module is one of the enhancements.

    • Clive Best says:

      I agree that one or two models describe reality better than others. CMIP5 seems to have adopted model democracy – rather like an old boys’ club. All AR5 models are treated as equally valid, and their spread gives a probability distribution of future outcomes. That is why the error on TCR and ECS has not changed in over 20 years.

      There really should be a proper numerical model comparison to select those few which best describe the earth’s climate. That is the only way to make progress.

  2. Eddie Garrity says:


    I remember when computers first came out, the experts were saying that 10kb of disk space is all the space we would ever need. That seems farcical now. I think scientists tend to underestimate, as does society at large, because we don’t know and can’t know all the variables, present and future, and we tend to be optimistic.

    The demise of the billion-plus passenger pigeons is another example. How could a billion pigeons possibly be destroyed? Isn’t a billion infinite? No, definitely finite. It seems better to overestimate the temperature, unless you can be certain you have discovered all the variables, which is not the case. It’s not a modeling competition; time is running out. In a way you’re helping to decide the future of humans, the only life form that knows what’s happening and can possibly do something about it.

    The earth will definitely survive this; we people may not, or, maybe more likely, we’ll be shoved back to an illiterate world. It may begin to happen sooner than we think. What phenomenon in the earth’s past has released as much CO2 into the atmosphere at the rate we’re doing it now? And we already know the earth has been much warmer; it’s just that nobody was living on the coasts back then, and of course nowhere else either. Right now the temperature is being buffered, and perhaps the sinks will continue to absorb, but the buffering and sink capacities are not infinite. I’m reminded of the scene in “Raiders of the Lost Ark” when the Ark of the Covenant was opened. We’ve already lifted the lid.

    I’m a layman expressing my opinion.


    • Clive Best says:

      The first computer I used, aged 18, was an IBM 1130 with 8K of memory and a 1MB disk. I was still able to get it to plot 3-D visualisations of a planned ring road around York Minster, which thankfully was never built.

      I don’t think faster super-computers would really help improve modelling the earth’s climate, because the problem is not computing. The problem is the immense complexity of fluid dynamics on a rotating planet, in orbit through the solar system, interacting with life, the oceans, plate tectonics, and human activity. Parts of the system are chaotic.

      I would agree with you that, ignoring other considerations, it would be best to be over-cautious about long-term effects. However, those other considerations could well lead to the deaths of billions of people if we were to suddenly ban the use of fossil fuels. Right now modern life depends entirely on the availability of cheap energy, particularly oil. We can probably restrict the use of coal, but not oil.

      Oil is the basis of farming: fertilisers, tractors, transport, food processing etc. Oil is used for plastics, roads, construction, heating, shipping etc. Can you imagine world trade switching from container shipping to sail?

      Renewable energy is not going to work for mankind because of land area requirements. Nor is it actually renewable either because it cannot rebuild its own infrastructure. It needs steel, earth moving equipment, transport etc. which depend on oil.

      So we are left with just Nuclear Energy plus electrified heating and land transport, with a hydrogen based economy for air transport and shipping. Probably we will always need a ‘sustainable’ use of Oil & Gas, and also Coal for steel making.

    • David Walker says:

      Currently, it appears that the increased atmospheric CO2 content – of which mankind’s contribution to the natural CO2 cycle is estimated at around 4% – has been an effectively unmitigated benefit to the Earth’s climate and ecosystem.

      Greening of the Earth and its drivers
      Global environmental change is rapidly altering the dynamics of terrestrial vegetation, with consequences for the functioning of the Earth system and provision of ecosystem services. Yet how global vegetation is responding to the changing environment is not well established.
      Here we use three long-term satellite leaf area index (LAI) records and ten global ecosystem models to investigate four key drivers of LAI trends during 1982–2009. We show a persistent and widespread increase of growing season integrated LAI (greening) over 25% to 50% of the global vegetated area, whereas less than 4% of the globe shows decreasing LAI (browning).
      Factorial simulations with multiple global ecosystem models suggest that CO2 fertilization effects explain 70% of the observed greening trend, followed by nitrogen deposition (9%), climate change (8%) and land cover change (LCC) (4%). CO2 fertilization effects explain most of the greening trends in the tropics, whereas climate change resulted in greening of the high latitudes and the Tibetan Plateau.
      LCC contributed most to the regional greening observed in southeast China and the eastern United States. The regional effects of unexplained factors suggest that the next generation of ecosystem models will need to explore the impacts of forest demography, differences in regional management intensities for cropland and pastures, and other emerging productivity constraints such as phosphorus availability.

      Here’s NASA.
      Carbon Dioxide Fertilization Greening Earth, Study Finds

      From a quarter to half of Earth’s vegetated lands has shown significant greening over the last 35 years largely due to rising levels of atmospheric carbon dioxide, according to a new study published in the journal Nature Climate Change on April 25.

      An international team of 32 authors from 24 institutions in eight countries led the effort, which involved using satellite data from NASA’s Moderate Resolution Imaging Spectrometer and the National Oceanic and Atmospheric Administration’s Advanced Very High Resolution Radiometer instruments to help determine the leaf area index, or amount of leaf cover, over the planet’s vegetated regions. The greening represents an increase in leaves on plants and trees equivalent in area to two times the continental United States.

      It is very likely that a degree or two of extra temperature – incidentally, the data show that the recent global increase in temperature mostly affects nocturnal temperatures in the higher and lower latitudes – would be an almost unmitigated benefit to humanity: increased CO2 not only enhances plant growth but also decreases water requirements.
      On the other hand, a two degree decrease in temperature would be an unalloyed catastrophe and be responsible for the death of billions, especially in the Third World.

      It is worth noting also that the climate scientists are now admitting that the climate models are over-estimating the likely increase in Global temperature and are rowing back their doom and gloom prognostications quite substantially.

    • Ron Graf says:

      “I think Scientists tend to underestimate as does society at large, because we don’t know and we can’t know all the variables, present and future and we tend to be optimistic.”

      Eddie, I assure you that scientists per se have mostly the same tendencies as the rest of the members of educated society. They tend to inflate their own importance and that of their work. They are biased:
      1) By the bandwagon, also known as groupthink.
      2) By being staked to earlier work claims and suggested hypotheses.
      3) By what is perceived will please their superiors, funders and colleagues.
      4) By political and religious implications of their conclusions.

      The best antidote to bias is to follow the scientific method and to promote open debate free of vilification or mind-reading. (For example, any scientist who publicly attempts to label an opponent as a “denier”, or as being in the pocket of funders, is doing a disservice to truth-finding and progress.)

      Blogs like this one are a great place to debate. It is a shame that more authors of the papers being discussed do not stick around for more than a quick snipe. There are exceptions: Zeke Hausfather and Robert Way come to mind. I presume some climate scientists are reluctant to open their work to such scrutiny in case they might shift their views, as Judith Curry did. One can witness her debating for the “team” and then her break from them on old Climate Audit threads circa 2005–2009.

      Eddie, I am optimistic that any problem can be solved if debate is valued and respected. The best ideas come from challenging claims.

  3. Myles Allen says:

    Deeply depressing that, when we make a strenuous effort to counteract some irresponsible coverage that bore no relation either to the article itself or to the associated briefings, this is characterised as a “climb down”.

    • Clive Best says:

      I think the paper is honest and correct. You said on the Today program “that a bunch of models developed in the mid 2000s indicate that we should have got to 1.3C by the time emissions reached what they are today (cumulative emissions), whereas we have only reached 1C”.

      That was interpreted as saying that the models are running 30% too hot, which is true when compared against emissions. However, if you plot the CMIP5 models against actual atmospheric CO2 levels then the models look much better, especially if, like Zeke, you choose to use Berkeley Earth plus blending.

      Still, the models are not fine, because the ESM components predicted higher atmospheric CO2 levels than reality. Your earlier paper already showed this –

      I apologise for using the word ‘climb down’, but I got the distinct impression you had been forced into this “clarification”. All you really needed to say is that the carbon cycle components of the models have caused us to underestimate the carbon budget for 1.5C. A lower proportion of our emissions ends up in the atmosphere than previously thought.

  4. Eli Rabett says:

    You say “This is because the models result in a higher airborne fraction than reality, and they assume it is slowly increasing (it isn’t).”

    Given the measurements of the airborne fraction, can you justify what is in the parentheses?

    • Clive Best says:

      I used RJ Houghton’s emission data combined with land use data, and then integrated the Bern Model with annual pulses to give atmospheric CO2. This is the result.

      Millar et al. find the same.

      The derived airborne fraction needed to match the Mauna Loa data is ~0.45 and pretty much constant.
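      For anyone wanting to reproduce this kind of calculation, the pulse integration can be sketched as below. The impulse-response coefficients are the Joos et al. (2013) values used in AR5; the flat 10 GtC/yr emissions series is purely illustrative, not the Houghton data:

```python
# Sketch: convolving annual emission pulses with a Bern-type impulse
# response function to get atmospheric CO2. Coefficients are the Joos
# et al. (2013) multi-model values; the emissions series is illustrative.
import math

def bern_irf(t):
    """Fraction of a CO2 pulse still airborne after t years."""
    a = [0.2173, 0.2240, 0.2824, 0.2763]
    tau = [394.4, 36.54, 4.304]  # years; the a[0] term never decays
    return a[0] + sum(ai * math.exp(-t / ti) for ai, ti in zip(a[1:], tau))

def co2_from_emissions(emissions_gtc, c0_ppm=280.0):
    """Atmospheric CO2 (ppm) by superposing annual emission pulses."""
    out = []
    for year in range(len(emissions_gtc)):
        airborne_gtc = sum(e * bern_irf(year - y)
                           for y, e in enumerate(emissions_gtc[:year + 1]))
        out.append(c0_ppm + airborne_gtc / 2.12)  # ~2.12 GtC per ppm
    return out

print(co2_from_emissions([10.0] * 5))
```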

    • Jim Bouldin says:

      That’s easy, Eli. Just download and plot the latest (circa 2016) GCP data. Plot the atmospheric retention with or without the land flux included. Linear models fitted to either show almost zero slope from 1959 to 2015 – 5+ decades.

      There is no evidence that the atmospheric retention rate is increasing and there is likewise no evidence that either the land or oceanic sinks are even remotely near anything like saturation, or showing any tendency towards same.
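      The zero-slope check is easy to replicate. A sketch with a synthetic retention series standing in for the actual GCP data (plain ordinary least squares, no libraries needed):

```python
# Sketch: testing for a trend in the annual airborne fraction with an
# OLS slope. The series below is a synthetic wiggle around 0.45,
# standing in for the real GCP-derived retention series.
def ols_slope(x, y):
    """Ordinary least-squares slope of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) /
            sum((xi - mx) ** 2 for xi in x))

years = list(range(1959, 2016))
# synthetic data: constant 0.45 plus a small periodic wiggle
fractions = [0.45 + 0.02 * ((i % 7) - 3) / 3 for i in range(len(years))]
print(ols_slope(years, fractions))  # close to zero: no trend
```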

  5. Jim Bouldin says:

    Of course, there are two potential dynamics involved in any exponential increase in atmospheric CO2. One is the simple exponential increase in emissions themselves, which, the last few years notwithstanding, has indeed occurred since 1850 or so. And not just that, but the emission *rate* has also steadily increased, up to about 0.5% per year over the last decade or so.

    So you can still get an exponentially increasing concentration, which approximately offsets the well-accepted logarithmically increasing RF to produce an approximately linear T increase, without invoking an increasing atmospheric retention rate.
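    That cancellation is easy to verify numerically, using the standard simplified forcing formula F = 5.35 ln(C/C0) W/m2 and an assumed exponential concentration growth rate:

```python
# Numeric check: exponentially growing CO2 concentration combined with
# logarithmic forcing gives forcing that is exactly linear in time.
# F = 5.35 * ln(C/C0) W/m^2 is the standard simplified CO2 forcing formula;
# the 0.5%/yr concentration growth rate is an assumption for illustration.
import math

C0, k = 280.0, 0.005  # baseline ppm and assumed exponential growth rate
forcing = [5.35 * math.log(C0 * math.exp(k * t) / C0)
           for t in range(0, 101, 25)]
print(forcing)  # evenly spaced values: linear in t
```

Because ln(C0·e^(kt)/C0) = kt, the forcing collapses to 5.35·k·t, linear in time with no change in the retention rate required.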

    Note that I haven’t read either of Millar et al.’s papers yet so I don’t know exactly what they say therein. Just making a point. But having said that, I definitely think there is something wrong with the way the Revelle buffer effect is described, estimated, or invoked/modeled, generally. Something is wrong there but I don’t know exactly what it is.

  6. Lars P. says:

    “CO2 fertilization effects explain most of the greening trends in the tropics”

    How vegetation responds to increased CO2 content has been shown in a great deal of studies:

    Over 90% of plants respond positively to more CO2 with increased production. This increased CO2 uptake by nature would help explain why “the models are running too hot with carbon emissions”.
    Even worse, the models assume an increased desertification that is not happening. We know plants withstand drought better with more CO2 in the air – the contrary may happen, with semi-desert areas turning green.

    On the other side, the warming in the temperature ‘data’ is very much a result of adjustments. In principle the adjustments may be right; however, we have known since Climategate that reasons other than scientific ones may lie behind them.
    It is very disconcerting to see the adjustments correlate directly with CO2 emissions.

  7. michael hammer says:

    Clive, I have a question regarding CO2 distribution in the stratosphere. GHGs absorb incident energy at the resonant wavelengths and also emit energy at the same wavelengths according to their temperature. The net effect, as I understand it, is to absorb radiation from the surface (at the resonant wavelengths) and replace it with radiation from the top of the GHG column. Since the top of the GHG column is cooler than the surface, overall OLR is reduced, making the Earth warmer.

    To consider the impact of a GHG one is better working in absorbance, not ppm, since the ppm concentration tells us nothing about the strength of the energy absorption. In terms of abs, 90% of the OLR (at the GHG wavelength) comes from the last 1 abs of the GHG column. Now the Nimbus data shows the emission temperature over the R and P branches of CO2 is about 220K, which corresponds to an altitude of between 11km and about 20km. Heinz Hug claims to have measured CO2 absorptivity, and his data suggests the absorbance of the entire CO2 column is around 800 abs over the R and P branches (about 2000 abs at the centre of the Q branch).

    The pressure at 20km is 5.5 kPa, meaning 0.055 of the atmospheric column is above this altitude. Other sources also claim CO2 is well mixed in the stratosphere up to 80km. That would mean 0.055 of the CO2 is above 20km, or about 800 * 0.055 = 44 abs for the R and P branches. If true, there is no way radiation over the R and P branch wavelengths could escape to space from an altitude of 20km (let alone 11km). The emission altitude would have to be more like 46km, but that would imply a far higher emission temperature.

    Is CO2 well mixed in the stratosphere or is it pooled in the lower stratosphere? Are Heinz Hug’s measurements correct? Is the apparent conundrum due to a reduction of pressure broadening in the rarefied stratosphere? What is the answer to this apparent paradox?
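    The pressure-ratio arithmetic in the question can be checked in a few lines (the 800 abs column absorbance is Hug’s claimed value, not an established figure):

```python
# Sketch of the arithmetic in the question: fraction of the CO2 column
# above 20 km from the pressure ratio, and the implied absorbance there.
# The 800 abs column value is Heinz Hug's claim, taken at face value.
P_SURFACE = 101.325  # kPa, standard surface pressure
P_20KM = 5.5         # kPa, pressure at 20 km as stated in the comment

frac_above = P_20KM / P_SURFACE       # ~0.054 of the column is above 20 km
abs_total_rp = 800.0                  # claimed column absorbance, R/P branches
abs_above = abs_total_rp * frac_above # ~43 abs remaining above 20 km
print(frac_above, abs_above)
```

Roughly 43 abs above 20 km is indeed far too opaque for the R and P branch radiation to escape from that altitude, which is the stated paradox.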

    • Clive Best says:


      You are right. The absorption in the stratosphere is a paradox. The central lines actually increase radiative cooling with increasing CO2 concentrations – i.e. an inverse GHE. I calculated this effect here

      Sorry for the short reply, but I am right now in Patagonia on a very slow connection!


    • Clive Best says:

      Sorry for long delay in replying. I have been travelling in Chile.

      I agree with everything you write above as being correct. The complication is in the hundreds of vibrational quantum excitation states of CO2. The central lines in the 15micron band have very large excitation cross-sections, and are saturated already at pre-industrial CO2 levels way up in the stratosphere.

      The radiative energy to space from CO2 depends on the effective emission height for each wavelength. Convection maintains the lapse rate, not radiation, so CO2 IR emission follows black body radiation at the local temperature T of the emission height. I calculated this a few years ago using HITRAN, where I simply took the effective height to be the level at which more than half the IR photons emitted by CO2 escape to space.

      See Radiative Forcing of CO2
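      The effect of a higher, colder emission height can be illustrated with the Planck function, comparing the 15 µm radiance at a surface-like 288 K with the ~220 K quoted above (a sketch, not the HITRAN calculation):

```python
# Sketch: black-body spectral radiance at 15 microns for the surface
# (~288 K) versus the CO2 emission height (~220 K), illustrating why a
# higher, colder emission height reduces outgoing radiation.
import math

def planck(wavelength_m, T):
    """Planck spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    h, c, kB = 6.626e-34, 2.998e8, 1.381e-23
    return (2 * h * c**2 / wavelength_m**5) / \
           (math.exp(h * c / (wavelength_m * kB * T)) - 1)

lam = 15e-6  # 15 micron CO2 band
print(planck(lam, 288) / planck(lam, 220))  # roughly 2.9
```

The surface emits nearly three times more at 15 µm than the 220 K emission height does, so raising the emission height (by adding CO2) reduces OLR at those wavelengths.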

    • Hans Erren says:

      I received the absorbance measurements from Heinz Hug in 2007, and compared them with standard EPA data.

      I also calculated absorption values

      So the obvious conclusion is that Hug’s bandwidth sits completely in the saturated part of the CO2 spectrum, so that the CO2 doubling effects cannot be accurately measured.

Leave a Reply