The Bern model has been used by the IPCC for emission scenarios since the SAR in 1995. We will now use it to extrapolate CO2 levels 500 years into the future with annual emissions fixed at 2013 levels (~10 GtC/year). All values are expressed as an equivalent atmospheric CO2 concentration in ppm for convenience.
To do this we integrate emissions over time and calculate atmospheric CO2 concentrations:

$$C(t) = C(t_0) + fac \sum_{t'=t_0}^{t} Em(t')\left(a_0 + \sum_{i=1}^{3} a_i\, e^{-(t-t')/\tau_i}\right)$$

where $fac$ converts GtC to ppm, $Em(t')$ is the emissions in year $t'$, and the $a$'s and $\tau$'s are the parameters of the Bern model. For the integration I use this dataset for the emissions data.
The Bern model simply describes the time decay of an annual pulse of CO2 added to the atmosphere. It is parameterised as follows:

$$PRF(t) = a_0 + \sum_{i=1}^{3} a_i\, e^{-t/\tau_i}$$
where for AR4:
and for TAR:
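The decay function and the forward integration can be sketched in Python. The parameter values below are the commonly quoted AR4 (Bern2.5CC) and TAR impulse-response fits; treat them, together with the 2013 baseline of 400 ppm, as assumptions to be checked against the report tables:

```python
import math

# Commonly quoted Bern impulse-response parameters (assumed values -
# check against the AR4 and TAR report tables before relying on them).
PARAMS = {
    "AR4": {"a": [0.217, 0.259, 0.338, 0.186], "tau": [172.9, 18.51, 1.186]},
    "TAR": {"a": [0.152, 0.253, 0.279, 0.316], "tau": [171.0, 18.0, 2.57]},
}

FAC = 1.0 / 2.12  # ppm per GtC (~2.12 GtC raises atmospheric CO2 by 1 ppm)

def prf(t, p):
    """Fraction of a CO2 pulse still airborne t years after emission."""
    a, tau = p["a"], p["tau"]
    return a[0] + sum(ai * math.exp(-t / ti) for ai, ti in zip(a[1:], tau))

def concentration(year, p, em=10.0, c0=400.0, start=2013):
    """CO2 level (ppm) at `year`, holding emissions at `em` GtC/yr from `start`."""
    n = year - start
    return c0 + FAC * sum(em * prf(n - k, p) for k in range(n))
```

Note that the coefficients of each parameter set sum to 1, so a pulse is fully airborne in the year it is emitted and only the $a_0$ share survives indefinitely.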
Here are the results.
Levels do not reach an equilibrium value in the Bern model because a fixed fraction (a0) of each year's emissions is assumed to remain in the atmosphere for ever. However, even in this case levels rise (just) to ~1270 ppm with AR4 parameters and ~950 ppm using TAR values. The plot of the airborne fraction shows how it falls asymptotically to the fixed retention rate (a0).
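That asymptote can be made explicit: for constant emissions the airborne fraction has a simple continuous-time closed form that tends to a0 as the run length grows. A minimal sketch, using the commonly quoted AR4 parameter values (an assumption):

```python
import math

# Assumed AR4 Bern parameters.
a = [0.217, 0.259, 0.338, 0.186]
tau = [172.9, 18.51, 1.186]

def airborne_fraction(T):
    """Airborne fraction after T years of constant emissions.

    Continuous-time approximation: the integral of PRF from 0 to T is
    a0*T + sum_i a_i * tau_i * (1 - exp(-T/tau_i)), divided by T.
    """
    retained = a[0] * T + sum(ai * ti * (1.0 - math.exp(-T / ti))
                              for ai, ti in zip(a[1:], tau))
    return retained / T
```

The exponential terms saturate at fixed constants while the a0 term grows linearly, so the ratio falls monotonically towards a0 = 0.217.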
We can also ask how well the Bern model describes the measured CO2 levels between 1950 and 2016 based only on the emissions data. Here is that comparison.
The Bern model agrees with the current value of CO2 (~400 ppm), but it does not give a good description of the trend: the actual CO2 growth is slower than that of the model. This supports the hypothesis that a0 is actually very small and that CO2 levels would approach stability much faster.
Let’s consider two estimates of equilibrium climate sensitivity (ECS): a median value of 2.5 °C (preferred by Gavin Schmidt) and a low value of 1.5 °C (preferred by, say, Nic Lewis).
| ECS | Model Version | Net Warming in 2620 |
|-----|---------------|---------------------|
When you consider that we are assuming annual emissions held constant at ~10 GtC/y for 500 years into the future, these final results are not really that scary at all.
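The warming figures follow from the standard logarithmic relation ΔT = ECS × log2(C/C0). A minimal sketch; note that the choice of baseline C0 (pre-industrial ~280 ppm versus today's ~400 ppm) is an assumption not spelled out here, and it changes the numbers:

```python
import math

def warming(ecs, c_ppm, c0_ppm):
    """Equilibrium warming (deg C) for CO2 rising from c0_ppm to c_ppm."""
    return ecs * math.log(c_ppm / c0_ppm, 2)

# 2620 levels from the Bern runs above: ~1270 ppm (AR4), ~950 ppm (TAR).
# Baseline of 400 ppm (roughly today's level) is an assumption.
for ecs in (2.5, 1.5):
    for label, c in (("AR4", 1270.0), ("TAR", 950.0)):
        print(f"ECS={ecs}  {label}: {warming(ecs, c, 400.0):.1f} C")
```

By construction, a doubling of CO2 yields exactly the ECS: `warming(2.5, 560, 280)` returns 2.5.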