The decadal trend in global temperatures shows a clear, almost linear increase since the 1970s. We can compare this increase to that of CO2 in the atmosphere.
One sees a nearly proportional agreement between temperature and CO2 after ~1970. However, CO2 levels have recently been rising faster than linearly, as highlighted by the green arrow. So the question is: can we isolate the climate response to CO2 and measure the climate sensitivity?
Each decadal temperature measurement has an associated average CO2 value for that decade. Is the global temperature a simple function of CO2, and if so, how does that compare to theory? One problem we have is that the early data, from 1850 to 1960, shows a non-linear trend. There are two possible reasons for this: either the effect is real and there was natural variability before 1960, or the early data coverage was not reliable enough to infer trends.
However, for argument's sake we will ignore this and assume that the trends are correct. So next we plot the temperature rise versus CO2 levels and compare this to the logarithmic radiative formula I derived earlier here. For each decade I calculate the corresponding average CO2 level. In this way I can then plot temperature versus the concurrent CO2 level and compare this to the logarithmic temperature dependence.
Unfortunately there is still a normalisation problem, because global temperatures are always calculated as anomalies relative to some 30-year baseline period (1961–1990 in my case). CO2 levels in 1970 were 330 ppm, so I simply normalise the result to this value. Then we can derive a "rough and ready" climate sensitivity to increasing CO2 levels.
With these assumptions a doubling of CO2 would lead to a 1.7C net increase in global temperatures relative to what appears to have been a cooler period in the 19th century.
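For readers who want to reproduce this kind of fit, here is a minimal sketch. The decadal numbers below are invented placeholders (not the data plotted above), so the fitted value illustrates the method rather than the post's 1.7C result:

```python
import numpy as np

# Invented decadal values for illustration only (NOT the post's data):
# mean CO2 (ppm) and temperature anomaly (C) vs the 1961-1990 baseline.
co2 = np.array([330.0, 339.0, 354.0, 369.0, 389.0, 410.0])
anom = np.array([0.0, 0.18, 0.32, 0.55, 0.75, 0.95])

# Fit anom = (S / ln 2) * ln(C / 330): a line through the origin in
# x = ln(C/330), normalised so the anomaly is 0 at 330 ppm (CO2 in 1970).
x = np.log(co2 / 330.0)
slope = np.sum(x * anom) / np.sum(x * x)  # zero-intercept least squares
S = slope * np.log(2.0)                   # warming per CO2 doubling
print(f"sensitivity per doubling: {S:.2f} C")
```

With real decadal averages the same two lines of algebra give the sensitivity quoted in the post; here the number only reflects the made-up inputs.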
Correlation isn’t causation.
It’s hard to figure out exactly what CO2 is doing to the atmosphere when we have no solid proof that it’s doing anything.
Consider: we're told that more CO2 raises the emission height, and that we then count down from this same temperature, but at a higher elevation, using the lapse rate to get a warmer surface.
But if the emission height is raised in the troposphere, we're necessarily radiating from a colder temperature. If we then count down using the environmental lapse rate (6.5 C/km) we get the same surface temperature: nothing changes. To say "at the same temperature" is assuming the conclusion.
If, on the other hand, the emission temperature is the same but the height is increased, this means we're emitting from the tropopause. But then we can't count down using the lapse rate, can we? The lapse rate stops before the tropopause begins.
The argument is that if you were suddenly to double CO2, the troposphere would increase in height, so the surface then has to warm enough to rebalance energy through convection.
It looks like the radiative forcing model is inadequate to represent the plotted data. That’s not surprising given the model’s simplicity. Are you not concerned how rapidly the curves are diverging?
I suspect that the early temperature data has systematic problems. Ocean temperatures were measured by ships on trade routes rather than satellites and buoys. There were also almost zero temperature measurements in Africa and South America.
What allowance have you made for lag?
None. Yes I know that ECS means the temperature change after waiting 100 years or so for stabilisation following a doubling of CO2.
That’s why I think ECS is a non-physical parameter since it is unmeasurable. It also doesn’t explain the seemingly cold climate in the 19th century.
The reason I ask is that I once calculated the temperature curve expected if it were due to CO2 using the conventional forcing equation. I used climate sensitivity 3 and a forcing response of 3.7W/m^2/C.
ΔT = 5.35 ln(C/C0) × 3/3.7
Plotted against the GISS temperature curve, there was about a 20 year lag.
The 1.2C warming currently experienced corresponds to the warming from the CO2 concentration in the late 1990s, some 395 ppm.
The current 420ppm would be expected to cause 1.75C, which at current rates would appear in the 2040s.
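That calculation can be written out explicitly. A small sketch using the conventional simplified forcing expression F = 5.35 ln(C/C0) quoted above (the function name and defaults are mine):

```python
import math

def expected_warming(ppm, ppm0=280.0, sensitivity=3.0):
    """Warming implied by F = 5.35*ln(C/C0) W/m2, scaled so that the
    3.7 W/m2 forcing of one doubling produces `sensitivity` degrees."""
    forcing = 5.35 * math.log(ppm / ppm0)
    return forcing * sensitivity / 3.7

print(expected_warming(420.0))  # ~1.76 C, matching the figure above
print(expected_warming(560.0))  # ~3.0 C at a full doubling
```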
I think your analysis highlights the issues with working with anomalies.
If we are trying to measure a gradual heat build-up in the atmosphere, we need to have a factual baseline (actual temperature) rather than one that cannot easily be reproduced.
We have enough real-time automated temperature data to provide a very good estimate of average global temperature.
I have just been looking at the variation that resulted from the relocation of the Sydney Observatory. For the overlapping period of the new and old sensors there is a 0.3 degree increase in average daily maximum temperature. The variation is from -5.2 degrees to +3.4 degrees on any single day.
I don't know how they manage to work out anomalies when there is so much variation between two sensors that are meant to be measuring the same thing.
Yes, anomalies are used to overcome the ever-changing selection of weather stations. If you were to calculate the average temperature based on the actual temperature measurements, then the result would show huge variations depending on the spatial distribution of available data.
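The effect of a changing station mix is easy to demonstrate with a toy example (two invented stations with identical warming but very different climatologies):

```python
import numpy as np

# Toy demonstration of why anomalies are used: two made-up stations with
# the same 0.01 C/yr warming, where the cold one only reports from 1980.
years = np.arange(1950, 2021)
trend = 0.01 * (years - 1950)
station_warm = 25.0 + trend              # tropical site, ~25 C climatology
station_cold = -5.0 + trend              # polar site, ~-5 C climatology
base = slice(1961 - 1950, 1991 - 1950)   # 1961-1990 baseline indices

abs_mean, anom_mean = [], []
for i, y in enumerate(years):
    stations = [station_warm] if y < 1980 else [station_warm, station_cold]
    abs_mean.append(np.mean([s[i] for s in stations]))
    anom_mean.append(np.mean([s[i] - s[base].mean() for s in stations]))

# The absolute mean shows a spurious ~15 C step when the cold station
# appears; the anomaly mean shows only the real 0.01 C/yr trend.
print(abs_mean[29] - abs_mean[30])    # ~15 C step (1979 vs 1980)
print(anom_mean[30] - anom_mean[29])  # ~0.01 C, the true trend
```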
The repositioning problem, which shows up as steps in temperature, has been "solved" using the pairwise homogenisation algorithm. Of course, it increases overall warming trends.
https://clivebest.com/blog/?p=10224
"Is the global temperature a simple function of CO2 and if so how does that compare to theory?"
Most unlikely. CO2 produces a heat flux. Temperature would be more like an integral of the flux. If you turn on a kettle, you don't expect the temperature to immediately match the flux you put in.
In fact it will depend on flux history, via some kind of transfer function. That is why proper definitions relate a temperature change to a prescribed flux history.
The flux history of ECS is a step change, with flux thereafter constant. The ΔT is the asymptotic temperature.
The flux history of TCR is a geometric doubling of ppm CO2 over 70 years. The ΔT is the temperature after 70 years.
You can't get any sensible sensitivity without defining a flux history, and a point in time for measuring ΔT.
?T is Delta T. The page needs a UTF-8 specification.
I really mean CO2 radiative forcing, while CO2 levels in the atmosphere continue to increase. The surface temperature increases to re-balance energy at the higher effective emission height.
Adding CO2 changes the effective radiation height in the atmosphere in the emission lines for CO2. That produces a “forcing” to increase surface temperature because the new height is initially colder.
That is all I mean.
“Adding CO2 changes the effective radiation height in the atmosphere in the emission lines for CO2. That produces a “forcing” to increase surface temperature because the new height is initially colder.”
This argument has been bugging me for several years. Seems like an idea someone came up with to avoid confusion around backradiation and the 2LOT.
It’s my understanding that a doubling of CO2 would warm the surface via increased back-radiation. Simple as that. A warmer surface would, in turn, raise the effective radiation height by producing stronger thermals, transporting surface warmth to a higher altitude. Also, a doubling of CO2 means of course a greater concentration of CO2 at high altitudes, and therefore more energy would radiate to space from those high altitudes – even if there was no warming at all.
Here’s a more specific rebuttal to the emission height narrative. Hopefully someone can correct where I’m wrong –
Let's assume the effective radiation layer (ERL) has an average temperature of 255 K, emits 240 W/m2 to space, and is in balance with the incoming absorbed solar radiation. No imbalance, so no warming.
Now, imagine if we could place an additional and identical layer of atmosphere on top of the ERL – same temperature, same rate of emission to space. Following the above argument there still wouldn’t be any warming because an imbalance has not been created.
Hold on. If the new layer is emitting 240 W/m2 to space then it's also emitting 240 W/m2 downwards towards the surface, a total of 480 W/m2. But the only radiative input to the new layer, coming from below, is 240 W/m2. More out than in means it would start to cool and would no longer emit the earlier 240 W/m2 upwards to space.
The surface and lower levels of the atmosphere would necessarily warm until the upward flux to the new layer is 480 W/m2, and the layer reaches a temperature where it radiates 240 W/m2 both upwards and downwards, same as the original ERL.
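The slab arithmetic in this thought experiment can be checked with the Stefan-Boltzmann law (an idealised blackbody layer, not a real atmospheric model):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m2/K4

# The original balance: a layer at 255 K emits roughly 240 W/m2.
print(SIGMA * 255.0 ** 4)  # ~239.7 W/m2

# An idealised blackbody layer in equilibrium emits equally up and down,
# so to keep sending 240 W/m2 to space it must absorb 480 W/m2 from below.
T_below = (2 * 240.0 / SIGMA) ** 0.25
print(T_below)  # ~303 K: the region below must warm to supply 480 W/m2
```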
If you add a new layer above the CO2 "effective emission height" nothing changes. There is no single "emission height" because it depends on the wavelengths of the various CO2 lines.
The central emission line lies way up in the stratosphere. Adding CO2 there actually increases radiation to space and has a net cooling effect on the atmosphere. Everything depends on the fall-off in pressure with height, thanks to gravity, and the height at which each photon emitted by a CO2 molecule escapes to space.
Thanks Clive,
In my mental picture the ERL is assumed to be about 2 kilometers thick, with the new, identical layer just above and still within the troposphere. I didn’t make that clear and you’re right it’s an inaccurate assumption.
In any case I’ve changed my mind and agree with you, thanks in part to something I read at Science of Doom,
“There is a lot of fascination in the subject of the “average height of emission” of terrestrial radiation to space. If we take a very simple view, as the atmosphere gets more opaque to radiation (with more “greenhouse” gases) the emission to space must take place from a higher altitude. And higher altitudes are colder, so the magnitude of radiation emitted will be a lesser value. And so the earth emits less radiation and so warms up.”
It follows from this view that GHG’s throughout the troposphere contribute to its opacity.
The back radiation argument is complete bollocks!
Complete bollocks, because the temperature of the new layer would be maintained by convection, regardless of the radiative imbalance I described (480 W/m2 output but only 240 W/m2 input).
Which explains why alarmists, via the IPCC, had to reduce the target temperature limit from 2.0C to 1.5C. It would take until the end of the century to reach +2.0C from incremental CO2, and in reality hydrocarbons would have run out by then, as we have witnessed in Europe, which no longer has sufficient gas and now has to ship it in as LNG from North America, Qatar and farther afield.
> Which explains why alarmists via the IPCC had to reduce the target temperature limit from 2.0 C to 1.5 C.
That’s not quite what happened – studies are showing that many ecosystems are more sensitive to temperature changes than previously thought. Tropical coral reefs, for instance, will be substantially damaged by a 2.0C increase in global temperature.
So, if we’re setting the temperature target by the level of damage we’d like to avoid, this means we have to lower the temperature target.
My belief is that if dF is kept constant, then after some time the radiative imbalance at the TOA stabilizes. That is, the rate of heat added into the pipeline equals the rate out of the pipeline. If that point were reached and CO2 were the only forcing then I would believe your estimate for ECS.
I did much the same thing that you did, using NOAA's Annual Greenhouse Gas Index (AGGI), which includes the other WMGHGs. It shows that over the last few decades we have had the equivalent of a CO2 doubling per century. The temperature response over the last few decades is also roughly constant, at a rate of about 2C per century. As such, my estimate for ECS was 2C.
Has the TOA imbalance stabilized? Hard to say. I took a fitted two-box model, applied a constant forcing, and found that the imbalance stabilized after roughly 200 years (IIRC). So I wouldn't be surprised if the imbalance is still growing. Then there is the question of all the other forcings, mostly aerosols, which adds to the uncertainty. If aerosols are causing a cooling trend, then my 2C estimate can be seen as a lower bound on ECS.
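For the curious, here is a minimal two-box model of the kind described, with illustrative (not fitted) parameters chosen so that ECS = 2C:

```python
# Minimal two-box energy balance model with illustrative (not fitted)
# parameters: a shallow mixed layer coupled to a deep ocean.
F = 3.7         # step forcing, W/m2 (one CO2 doubling)
lam = 1.85      # feedback parameter, W/m2/K  -> ECS = F/lam = 2 K
gamma = 0.7     # mixed-layer/deep-ocean heat exchange, W/m2/K
c_fast = 8.0    # mixed-layer heat capacity, W yr/m2/K
c_slow = 100.0  # deep-ocean heat capacity, W yr/m2/K

dt = 0.1  # years
T_fast = T_slow = 0.0
for _ in range(int(500 / dt)):  # integrate 500 years forward (Euler)
    exchange = gamma * (T_fast - T_slow)
    T_fast += dt * (F - lam * T_fast - exchange) / c_fast
    T_slow += dt * exchange / c_slow

imbalance = F - lam * T_fast  # remaining TOA imbalance
print(T_fast, imbalance)      # T_fast nears 2 K; a small imbalance remains
```

The slow (deep-ocean) time constant here is a couple of centuries, so even after 500 years a small imbalance is still draining into the ocean, which is the point about ECS taking a very long time to realise.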
Radiative balance at TOA will probably only stabilise once we let nature itself sort it all out. Left alone plants and forests would eventually reduce CO2 levels.
Alternatively, if we could stabilise emissions at some fixed level, then I strongly suspect a new balance would be reached – albeit at a warmer level than if humans had not industrialized!
By “stabilize” I didn’t mean “equalize”. There would still be an imbalance, it just wouldn’t be growing or shrinking much. If this were to happen, I would expect the GMST trends to remain the same as they have been the past few decades. Roughly 0.2C per decade.
I think it is the same thing.
If our emissions of CO2 remained at some fixed level indefinitely with no new environmental impact such as further deforestation, then the climate would eventually stabilise.
Hi Clive. I am puzzled and am struggling to reconcile your result with the conclusions of this fairly recent paper by Douglas Lightfoot: Earth's Temperature Versus the Sun, Water Vapor and CO2, Journal of Basic & Applied Sciences, 2021, 17, 44-53.
Obviously Lightfoot has used a very different approach, focusing on the warming effect of water vapour, but his estimates are based on measured data for 20 locations around the globe representing different latitudes and climates. To be brief his final conclusions are (copy/paste):
“5 – Warming by water vapor overwhelms that of CO2 and the other non-condensing GHGs and renders their warming ineffective.
6- For practical purposes, the level of carbon dioxide in the atmosphere is at its upper limit for warming the air of the Earth. No additional amount of CO2 can affect the air temperature”.
Is it the case that, as your post above makes no mention of water vapour, what you have calculated (1.7C) refers to the maximum effect a further doubling of CO2 could have in the absence of water vapour, but if we bring water vapour into the picture then the conclusions of Lightfoot apply?
And, would it be worthwhile repeating Lightfoot’s method for a much greater number of stations around the globe for which data must be available ?
Hi Bob,
We just arrived back in Italy! Yes, you're absolutely right that water dominates the earth's climate and stabilises temperatures over the long term for life to flourish. The sun was dimmer a billion years ago, but the oceans maintained conditions for early life to develop, which also had an effect on climate.
Water vapour as a greenhouse gas changes daily. Thunderstorms act to cool the tropics. There is an infinite source of water vapour to act as the ultimate thermostat.
The problem with CO2 is that natural changes such as ice ages happen very slowly, so our burning of coal and oil changes the balance in the short term.
Bob Peckham, interestingly enough, this recounts the scientific arguments about CO2's warming potential in the early 1900s. Arrhenius's landmark paper in 1896 also predicted about 3-4C of warming per doubling of CO2, but it didn't take into account water vapor's effects.
This got hashed out in the scientific community from about 1900-1950. Point #1: there are regions of the Earth where water vapor is low; e.g., deserts and arid regions. Point #2: the maximum absolute humidity of air decreases rapidly with decreasing temperature, which means it also decreases with altitude. CO2's effects become more dominant the higher in the atmosphere you go. Point #3: CO2 has spectral side bands that aren't covered by water vapor, and these always make a small contribution to its warming effect.
(Of these, points #2 and #3 have the biggest effect)
We only discovered the side bands mentioned in #3 in the late 1940s / early 1950s. After WW2, the air force started to fund research into atmospheric spectra in order to figure out which IR frequencies could be used for heat-seeking missiles. This led to better characterization of the spectra of O2, CO2, H2O, etc., which led to the discovery of these spectral side bands, and those spectral side bands directly led in the 1960s to an increased estimate of CO2’s warming potential. That was a major step in how we got to the present scientific consensus.
Here it is !
Thanks to Windchaser for the interesting post outlining the history of our growing understanding of the warming effects of CO2 over the last 120 years. I found the setting out of points #1, #2 and #3 very useful in highlighting and understanding the need to take account of the variations in humidity around the globe, as well as the variation of the effects with altitude.
Still, the fact remains that the conclusions of Lightfoot are at odds with the “current consensus”, so doesn’t this suggest there is still need for further explanation? Is there something wrong with Lightfoot’s method or analysis? Has anyone seen a critique of it anywhere?
Oh! I forgot to also mention the effects of pressure broadening, which makes spectral lines wider and flatter at higher pressures (and narrower at the low pressures found at altitude). So when people say that the spectra of water vapor overlaps with CO2, this overlap would decrease with lower pressure, even ignoring the fact that there's less water vapor with altitude. (And I forget when this was figured out, but it was also somewhere in that 1910-1950 period).
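A sketch of that pressure effect, using a normalised Lorentzian line shape whose half-width scales linearly with pressure (representative numbers, not a specific CO2 line):

```python
import math

def lorentzian(nu, nu0, hwhm):
    """Normalised Lorentzian line profile."""
    return (hwhm / math.pi) / ((nu - nu0) ** 2 + hwhm ** 2)

hwhm_surface = 0.07  # cm-1, a typical air-broadened half-width at 1000 hPa
for p_hpa in (1000.0, 200.0):
    hwhm = hwhm_surface * p_hpa / 1000.0  # width ~ proportional to pressure
    centre = lorentzian(0.0, 0.0, hwhm)   # absorption at line centre
    wing = lorentzian(0.1, 0.0, hwhm)     # 0.1 cm-1 off centre, in the wing
    print(p_hpa, centre, wing)

# At low pressure the line is taller at the centre but much weaker in the
# wings, so overlap with neighbouring (e.g. water vapour) lines shrinks.
```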
> Still, the fact remains that the conclusions of Lightfoot are at odds with the “current consensus”, so doesn’t this suggest there is still need for further explanation?
No, Lightfoot could also be at odds with the current consensus if he misunderstood the science.
> Is there something wrong with Lightfoot’s method or analysis? Has anyone seen a critique of it anywhere?
I haven’t seen a critique of it anywhere, but I’m a few pages into it myself and.. yeah, it’s not good. If this was a college paper, I’d expect him to get some major points off for errors in his scientific and mathematical reasoning.
For instance, in Fig 1, he presents charts of tropospheric temperature and specific humidity, which show a strong correlation. From this, he says:
“Figure 1 shows the remarkable correlation between the two temperature plots and the specific humidity plot. It appears water vapor has a profound influence on the Earth’s temperature based on the empirical evidence presented.”
Now.. one of the most basic points about specific humidity is that the amount of water that air can hold (or the amount that it’s energetically favorable to hold) changes sharply with temperature. We would *expect*, all else being equal, that the Earth’s atmosphere would hold less water when it cools, and hold more water when it warms. This is some basic thermodynamics, and has nothing to do with any greenhouse gas potential of water vapor.
So when you see a chart of humidity and temperature and see a strong correlation, your first thought should *not* be “oh, clearly the water vapor is controlling the air’s temperature”. We can show, in a lab, that there is a very strong causation going the other way.
Of course, water vapor does also influence air temperature via a GHG effect. But that’s what you have to parse out; how much is temperature affecting water vapor and how much is water vapor affecting temp. You can’t simply make a claim, based on the strong correlation, that water vapor’s GHG effect is strong.
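The lab-verifiable direction of causation mentioned here is just Clausius-Clapeyron. A quick sketch using the Magnus approximation for saturation vapour pressure:

```python
import math

def sat_vapour_pressure(t_celsius):
    """Saturation vapour pressure in hPa (Magnus approximation)."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

# Warming air by 1 C raises the moisture it can hold by roughly 6-7%,
# independent of any greenhouse effect of the vapour itself.
for t in (0.0, 15.0, 30.0):
    ratio = sat_vapour_pressure(t + 1.0) / sat_vapour_pressure(t)
    print(t, round((ratio - 1.0) * 100, 1), "% per degree")
```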
That bit I quoted from Lightfoot's paper? That wording wouldn't have passed review in any half-decent journal, and really, I'm sorry, but I'm not being hyperbolic.
I see other errors as I continue reading, but this comment is already too long.
In the sciences, generally papers that are just bad are ignored by the mainstream community. There’s no need to write a rebuttal; they just ignore the work and move on. So.. you shouldn’t expect to see a rebuttal paper published. If someone is getting confused about the basics in your field, there’s no point in arguing with them in the scientific literature.
Nice, clear exposition.
It’s a compelling number. Within the 1979 Charney report’s range, but right at the low end. Corroborated by the superb Nic Lewis’s paper (linked) where he concludes “values between 1.5 °C and 2 °C are quite plausible”.
Moreover, I still don't understand the lack of focus on the TLT hotspot. The trendline for UAH TLT 12/78 to 9/22 spans 0.518C (endpoints: -0.339, 0.179). In that time Mauna Loa seasonally adjusted CO2 ppm rose from 336.09 to 418.56, delta = 82.47.
That’s 0.518/82.47 = 0.00628C/ppm. For +280, we get 1.76C.
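The arithmetic above, spelled out (a purely linear extrapolation of the satellite trend, not a logarithmic fit):

```python
delta_t = 0.179 - (-0.339)   # UAH TLT trendline span, C
delta_co2 = 418.56 - 336.09  # Mauna Loa rise over the same period, ppm
per_ppm = delta_t / delta_co2
print(per_ppm * 280.0)       # ~1.76 C for a further +280 ppm
```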
I’m going with 1.75C as the most likely 2x sensitivity, based on the data we have and the strength of work by independent analysts such as Nic and your good self.
https://link.springer.com/article/10.1007/s00382-022-06468-x
This is a fascinating result that you have obtained. It is almost exactly the same as the result you get if you examine the problem from the viewpoint of quite straightforward quantum physics. Basically, the number of infrared photons emitted due to the Earth's 'black/grey body' cooling is essentially constant. The number arises because the Sun is the sole source of energy for the Earth and is essentially constant. The resultant infrared photons are absorbed by one of the greenhouse gas molecules. This molecule has two possible routes to attaining thermodynamic equilibrium after excitation:
(1) Spontaneous emission of a photon
or
(2) Collisional deactivation.
The half-life for spontaneous emission is about 10^-8 s, but there are approximately 5×10^13 collisions per second, so collisional deactivation predominates up to 60 km above the Earth. The atmosphere is warmed by this energy, which increases the kinetic energy/temperature of the atmospheric molecules. This occurs throughout the troposphere and above, and the excess heat gradually dissipates via convection etc. Once all the available infrared photons are intercepted, increasing the number of greenhouse gas molecules further will not increase the greenhouse effect. William Happer describes this position as saturation. He has performed the necessary calculations, taking into account several hundred absorption lines, and the curve he obtains is almost identical to yours. The knee in the graph, which you describe as the intersection of two straight lines, is at about 480 ppm of CO2, or the year 2000. He published his results in a beautiful paper about two years ago.
Thanks,
I used the Hitran databases of all CO2 transition cross-sections. For each frequency in the CO2 band I calculated the height in the atmosphere where >50% of photons emitted by CO2 molecules escape to space. This is the emission height. Below this level I assume that all CO2 molecules are thermalised in collisions with all the other air molecules and temperatures simply follow the lapse rate.
See: The CO2 GHE demystified
I think this is essentially the same as William Happer’s method.
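To illustrate the idea (though not the actual Hitran calculation), here is a toy single-line sketch, with an assumed scale height and lapse rate:

```python
import math

# Toy single-line sketch of the emission-height idea (NOT the full
# Hitran line-by-line calculation described above).  For a well-mixed
# absorber, the overhead optical depth falls off as exp(-z/H).
H = 8.5            # atmospheric scale height, km (assumed)
lapse = 6.5        # environmental lapse rate, C/km
T_surface = 288.0  # K

def emission_height(tau_surface):
    """Height (km) where the remaining overhead optical depth drops to
    ln 2, i.e. where >50% of upward photons escape to space."""
    return max(0.0, H * math.log(tau_surface / math.log(2)))

# Doubling the absorber doubles tau, raising this line's 50%-escape
# height; the emission temperature then follows the lapse rate.
for tau in (1.0, 2.0):
    z = emission_height(tau)
    print(tau, round(z, 2), round(T_surface - lapse * z, 1))
```

For this single line each doubling raises the escape height by H·ln 2 ≈ 5.9 km; the real band-averaged shift is far smaller because the band is a sum of lines of wildly different strengths, which is exactly why the line-by-line sum matters.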
“the Sun is the sole source of energy for earth”
What about the Earth's molten core?
Thanks
Isn’t this just the “transient climate sensitivity”, which is a parameter that the IPCC assesses separately from the “equilibrium climate sensitivity”?
It is "transient climate sensitivity" in the sense that CO2 is still always changing. Equilibrium climate sensitivity is the idea that the earth/ocean system takes centuries to settle down after a doubling of CO2. However, it is just a theoretical concept which cannot be measured, so I am not sure it is useful.
It’s still hard to follow you.
In your 2nd figure, 10y average global temperature anomaly, if I calculate the relationship between the CO2 levels shown and deg.C, the formula is deltaT = 3.98161 x Ln(ppm CO2)-23.107.
This leads to +2.76°C for a doubling of CO2. How can you indicate only +1.7°C?
What is your calculation formula?
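For what it's worth, the +2.76°C above follows directly from the fitted slope; the intercept cancels when comparing C with 2C:

```python
import math

a = 3.98161  # slope of the fit deltaT = a * ln(ppm CO2) - 23.107
# deltaT(2C) - deltaT(C) = a * ln(2C) - a * ln(C) = a * ln 2
print(a * math.log(2.0))  # ~2.76 C per CO2 doubling
```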
In your 3rd figure, in addition to the error in placing the bars for a doubling or quadrupling of the CO2 level, there is a concern with the placement of your ten-year points.
I redid the calculation based on the publication of your monthly anomalies and the CO2 rate. Every +5ppm, I look at the corresponding anomaly estimates, averaging over 6 to 22 months depending on the slope of the CO2 rate. Base 300 ppm between 1905 and 1907.
Of course, there are fluctuations depending on whether we come across a strong El Niño, a volcanic eruption, or a strong La Niña.
This gives a better view of the real evolution, without bias with averages over 120 months.
We find ~+2.7°C during a CO2 doubling period.
For me, this is only an index, because in reality, in addition to the forcing due to CO2, there is in particular the positive forcing of the other greenhouse gases and the negative forcing of aerosols.
I can’t post the curve. It’s a shame because it’s very explicit.
@gpiton, looking at Clive’s previous post which describes his methodology, it looks like, for this model, Clive assumes that the temperature at 0ppm CO2 will be 284K, rather than letting this be a free-floating parameter to be determined by fitting.
The result of this assumption is a poorer fit in the last figure on this post, where it does indeed look like the slope of temperature-CO2 is too shallow.
For everyone reading along, go look at this last figure in the post above. Does the slope of the orange line (observations) match the slope of the black line (Clive’s model)? Or is the orange line a lot steeper than the black line?
That’s the bad fit. A better fit would show a steeper slope for the black line, matching the orange line better, which would also mean stronger CO2 sensitivity.
If we want to get mathy, we can also mathematically demonstrate that this is a poorer fit by calculating the RMS of the difference.
Nice!
Note that no one else plots temperatures against CO2. My argument was the following: in the stable decadal trend between the 1970s and the 2000s we have a linear temperature increase with CO2, amounting to a 0.8C rise in temperature. Therefore the extrapolation to reach 560 ppm implies a temperature increase of (160/72) × 0.8 ≈ 1.78C. So I agree the data implies TCR is more like 1.8C.
Graph to be seen at:
http://www.dropbox.com/s/6k7rrg6lo2l7fv4/CO2.jpg?dl=0
I hope it won't be rejected in this form.
Doesn’t look like my decadal data though !
What are you using? My annual data ?
Tamino, on his site "Open Mind", had a post on May 1, 2022 comparing temperature and CO2, supplemented on May 5 by a comparison between temperature and the radiative forcing of CO2.
For the graph I sent, I used your monthly anomalies for temperature and seasonally adjusted Mauna Loa estimates or Law Dome ice cores for CO2 level.
I looked for, every + 5ppm of CO2, your corresponding anomaly estimate, averaged over 6 to 22 months depending on the slope of the CO2 rate to limit the influences of volcanoes or Enso. Base 300 ppm between 1905 and 1907.
Based on measurement estimates from 1880 to 2022, I obtained ~ + 2.7°C during a doubling of the CO2 rate (see curve previously provided).
More simply, if I do the same calculation by directly comparing each monthly anomaly evolution with the corresponding CO2 rate, I obtain + 2.68°C.
If I want a ten-year view, I use the Loess function with smoothing over a period of 120 months to limit background noise; the correlation coefficient is much higher. I get +2.67°C for a doubling of CO2 with a logarithmic regression.
If I understood your explanation correctly, the figure of 1.7°C, which you indicated on your curve, is the additional warming expected after 2022, when the CO2 level will reach 560 ppm.
My misunderstanding came from your last paragraph: "With these assumptions, a doubling of CO2 would lead to a net increase of 1.7°C in global temperatures compared to what appears to have been a colder period in the 19th century."
It must therefore be added to the ~+1.2°C of warming already achieved since the pre-industrial period. This would therefore be ~+2.9°C over the period in which the CO2 level will have doubled.
Slightly more than the values obtained by regression above.
But this does not represent the TCR or the ECS.
For example, in the period from 1970 to 2022, we must not forget that the positive radiative forcing of other greenhouse gases has also increased. It now represents more than half of that of CO2 and adds to it. At the same time, the negative aerosol forcing increased until 2011 and then stabilized.
OK. It all really depends on the definition of the original baseline from which we estimate net warming.
These are my annual temperature anomalies back to 1880 with a baseline of 1961-1990
https://clivebest.com/data/V4S4C-annual-triang-sphere-anoms.txt
You can always use a URL to view the image.
Thank you for the link. It is easier to calculate the warming trend in the period when the CO2 goes from 280 to 560 ppm, without going through a smoothing or measurement averages.
Starting from your annual anomalies, year by year, from 1880 to 2021, I find a trend of +2.62°C, very close to my previous calculations. See the attached curve on:
http://www.dropbox.com/s/g1g4oeeekzv8zc9/curve%20CO2%20annuel.jpg?dl
Hi,
I want to raise a question about the dispersion of our weather stations and their numbers around our cities and industrial areas. We have built many more buildings and used a lot more concrete and asphalt compared with 50 years ago. I assume this change will influence weather stations near affected areas like cities and industrial zones, and more or less remote stations as well. It should influence the average values too.
Is there any compensation factor for this?
As I have also read in this forum, night-time temperatures contribute much more to our measured global warming than daytime temperatures. That would be logical if we take into account our increasingly sealed surfaces, even if that is not the only reason for it.
I think we do not have comparable conditions in terms of our weather stations if we consider average values over decades or centuries.
A recent Canadian study found that 20% of the daytime temperature trend is UHI effect, and as much as 50% of the night-time trend.
It does not mean a lot for a long list of reasons.
1. Earth is warming and has not reached its new equilibrium temperature. ECS is what you get in the long term.
2. Consensus science claims about 1/3 of GHG forcing is cancelled by anthropogenic aerosols (aka pollution). The actual warming would thus represent the impact of only two-thirds of GHG forcing.
3. There are other anthropogenic GHGs contributing significantly to this GHG forcing (methane, ozone, halogens, N2O..). They would be responsible for roughly 45% of it.
And that is just what the “consensus” side would want to add. If we go beyond that, there are even more issues..
4. Forcing and feedback parameters commonly used are just totally wrong. CO2 forcing is only about 2W/m2 once you include clouds (overlaps!) and actual surface emissivity. For the same reasons WV is tiny, and actually dominated by negative lapse rate feedback. ECS is accordingly small, about 0.5K only.
5. “global warming” is extremely one sided, located in the NH. From that alone it follows aerosols will not be the negative forcing the orthodoxy conveniently claims. You can’t have most warming where it should rather have been cooling.
6. A far more significant anthropogenic forcing is notoriously ignored: aviation-induced cirrus. One can find many hints in the literature that this is likely the dominant driver of "global warming". Of course, that would explain why it started in the 1970s and is mainly restricted to the NH.
Lots of issues..
https://greenhousedefect.com/