The latest global temperature measurements are available both for satellite data and for the Hadley CRU temperature data, so I thought it would be interesting to compare these with the predictions made in 1990 by the first IPCC report. There is now sufficient data to test whether the GCM modeling of greenhouse gases used by the IPCC really matches up to reality. The result is shown below.
This comparison is based on the analysis described below.
Predictions from the IPCC Report 1990 
“Based on the IPCC Business as Usual scenarios, the energy-balance upwelling diffusion model with best judgement parameters yields estimates of global warming from pre-industrial times (taken to be 1765) to the year 2030 between 1.3°C and 2.8°C, with a best estimate of 2.0°C. This corresponds to a predicted rise from 1990 of 0.7-1.5°C with a best estimate of 1.1°C.”
Prediction: 1990 to 2030 -> 0.7 to 1.5 degrees C
T = T(1990) + 0.0275*deltaY
Assuming a linear extrapolation to May 2011:
T(2011) = T(1990) + 0.58 (maximum of 0.79 and minimum of 0.37)
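As a sanity check, the extrapolation above can be reproduced in a few lines of Python; the 0.0275 degrees/year rate is just the 1.1 degree best-estimate rise spread linearly over the 40 years from 1990 to 2030 (a sketch of the arithmetic only, not the IPCC's own method):

```python
# Linear interpolation of the IPCC (1990) predicted rise: the quoted
# totals for 1990-2030 are 0.7 C (low), 1.1 C (best) and 1.5 C (high).
def predicted_rise(years_since_1990, total_rise_by_2030, period=40.0):
    """Rise after `years_since_1990` years, assuming a linear ramp."""
    return total_rise_by_2030 * years_since_1990 / period

# 21 years takes us from 1990 to 2011:
best = predicted_rise(21, 1.1)  # ~0.5775, the ~0.58 quoted above
low = predicted_rise(21, 0.7)   # ~0.37
high = predicted_rise(21, 1.5)  # ~0.79
print(low, best, high)
```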
The data I have used are the Hadley/CRU data, an IPCC reference set based on global surface temperature measurements, and the UAH data, a NOAA satellite-based microwave measurement of lower-atmosphere temperatures. Both sets are available as global averages. For the UAH data I calculated yearly averages to allow a direct comparison with HadCru. The 2011 values are the averages as of May 2011. Both datasets actually publish “temperature anomalies” rather than absolute temperatures: these are merely offsets from a long-term average temperature. The two use different baseline intervals for the anomaly, which causes an offset between them. The actual data compared to the IPCC predictions are shown below in Figure 2. The IPCC curves are based on a linear increase from the 1990 temperature value of HadCrut. The curves through both datasets are least-squares smoothing fits.
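The yearly-averaging step described here can be sketched as follows (a hypothetical illustration; the post does not show its actual code, and the function name and data layout are my assumptions):

```python
from collections import defaultdict

# Collapse monthly anomaly records into calendar-year means, so a monthly
# satellite series can be compared directly with an annual surface series.
def yearly_means(monthly):
    """monthly: iterable of (year, anomaly) pairs -> {year: mean anomaly}."""
    totals = defaultdict(lambda: [0.0, 0])
    for year, anomaly in monthly:
        totals[year][0] += anomaly
        totals[year][1] += 1
    return {year: s / n for year, (s, n) in totals.items()}

# A partial year (e.g. Jan-May 2011) simply averages the months available:
print(yearly_means([(2011, 0.25), (2011, 0.75)]))  # {2011: 0.5}
```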
Both datasets agree rather well in shape (ignoring the normalisation offset), and the long-term trends are remarkably similar. The two datasets are independent of each other: HadCrut is based on worldwide meteorological data, and the satellite microwave data is calibrated without using any surface temperature data. This gives us confidence that they represent an accurate record of global temperatures over the last 21 years. The comparison with the IPCC predictions is made by normalising the UAH trend data to that of HadCru and then normalising the IPCC predictions to the 1990 HadCru trend value. The result is shown above in Figure 1.
Following a gradual rise of about 0.2 degrees from 1990 to 2000, global temperatures have stopped increasing and have actually fallen slightly. The only IPCC prediction which remains consistent with the current data is the lower prediction of a 0.7 degree rise from 1990 to 2030. The “Best” IPCC estimate and the higher 1.5 degree rise are ruled out by the data.
CO2 levels in the atmosphere have continued to rise over the last 10 years (see overlay to temperature comparison below in Figure 3) but temperatures have not risen since 2000. This implies that CO2 is not the main driver of global temperatures on these time periods and that other natural mechanisms are at least as important. No evidence of any positive temperature feedback with increasing CO2 levels is found.
Pingback: Comparing IPCC 1990 predictions with 2011 data | Watts Up With That?
??? There are now 4 IPCC reports … of what significance is this post? Not much, as best I can tell. Seriously.
If the forecasts are unreliable, then you simply can’t make policy based upon them. That is the ENORMOUS significance of this post.
May I suggest that you send this information to any member of the HST (hockey stick team), who will subject it to intensive and creative manipulation in order that we might better understand how you have come to the erroneous conclusion that CO2 is not guilty as charged /sarc
Thank you, sir, for this post. It is extremely important.
This is definitely something which will not be included in the next IPCC fantasy report.
“This implies that CO2 is not the main driver of global temperatures on these time periods and that other natural mechanisms are at least as important.”
And this is a real problem for the warmers. They are unable to identify any elements of natural variation that could have countered the effect of rising CO2 if climate sensitivity is really what they claim it is. This means one of two things, either their models are not including some very strong elements of natural variation, or their climate sensitivity numbers are much too high.
Thanks for your efforts, Clive. We always need people like yourself to publish the reality rather than the hype of the warmers.
Pingback: Ten Years And Counting: Wheres The Global Warming?
Pingback: Climate Conversation Group » Guess, meet fact
Can someone please explain how CO2 at about 0.06 % of the atmosphere by weight with a specific heat of less than 1 joule/gram can absorb and re-radiate enough energy to heat the rest of the atmosphere or earth continually ?
It seems that if this theory is correct we should have the solution to the world’s energy needs.
Let’s build a closed system – unlike the atmosphere, which is open – feed in all the emissions from a coal-powered power station, raise the CO2 and water vapour levels to millions of ppm, add water and earth and sunlight, and the runaway thermal greenhouse effect ought to really kick in. We will soon have a perpetual energy source, and eventually we can shut down the power station and let this little beastie power us till eternity.
Am I wrong here or isn’t that what they predict ?
The answer is that you don’t understand the theory. CO2 absorbs infrared radiation emitted by the earth, which partly prevents radiative cooling and causes heat from the sun to accumulate. That’s what we call the greenhouse effect. Gases like nitrogen and oxygen, on the other hand, do not have the capacity to absorb infrared radiation, so they don’t participate directly in the greenhouse effect. Therefore, what you should be looking at is not the percentage of the total atmosphere represented by CO2 but the percentage of the total greenhouse effect caused by CO2. Also keep in mind that CO2 is not the only greenhouse gas emitted by humans; CH4 is also an important contributor.
Kamizushi, you feign intellectualism while neglecting reality.
YOU keep in mind that water is 49 times more abundant in the atmosphere than CO2. YOU keep in mind that water absorbs seven times as much infrared radiation as CO2. YOU keep in mind that the Keeling Curve is a non-zero-based graph solely to exaggerate the slope of the graph, which is more than 96% naturally produced CO2 anyway.
It takes humans 22 years to increase atmospheric CO2 by 1 part per MILLION, and so we are to plunge the world into permanent global depression for your outlandish fearmongering and lies?
You first, and Al Gore and Barack Obama with you.
Kamizushi, it is you who doesn’t know what is going on. The “total greenhouse effect caused by CO2” is trivial. The dominant greenhouse gas is water vapor. Not only does water vapor absorb seven times the infrared radiation that carbon dioxide does, given equal parts per million volume, but water vapor also constitutes 37 times the number of parts per million that CO2 does. Divide 15,000 for water by ~400 for CO2.
Moreover, humans account for less than 4% of CO2. It takes 22 years for humans to add 1 ppm to the atmosphere. And for this you want to return us to the stone age? Cast civilization into a permanent depression, worldwide? While Obama flies Air Force One to Florida for a round of golf?
Why is the 2011 data already provided? Was that data weighted, and if so by how much?
PS – Why is no one calculating the latent heat of melting? Ice in a glass of water will stop a temperature increase of the water until it has mostly gone, especially if there is a lot of “stirring” from wild weather.
Be patient; I am confident that my property will be beachfront before I die.
A timely and vivid illustration of the existence of perpetual lie-telling in the climate wars. In Australia we have just been honoured with the release of the second Garnaut Report. Garnaut is an economist who has now proven himself to not be independent. In fact, you could say that he is a shill for the Labor-Greens government. He has just got through insisting that it is a lie that global temperatures have flat-lined since 1998. Here is the incontrovertible proof that he is lying. Garnaut actually insists that not only are global temperatures not static, but that they are rising at an ever increasing rate, a rate greater than previously indicated by the IPCC. And our Labor-Greens government is going to take advice from him!?
Incidentally, it’s Mauna Loa, not Maona Loa.
Why don’t you have a trend line through the UAH and HadCrut figures? You are not showing all the data needed to allow a fair comparison.
ps: You are incorrect to say there has been no warming since 2000, both HadCrut and UAH show warming since then.
The following is from Andrew Bolt’s website. Keith is first; I post as Brian S. Please note my question to you at the end:
More spin and idiocy from the data denying lefty trolls is expected.
The data deniers will be in full scream today.
Always good for a laugh.
I’ve linked to similar charts for some time on this and other blogs. All they can do is reflexively deny that Hansen’s and other’s models continue to diverge from reality.
So, data deniers, please provide some more merriment. More expressions of blind faith welcome.
Keith of Canberra (Reply)
Fri 10 Jun 11 (06:58am)
Brian S replied to Keith
Fri 10 Jun 11 (10:36am)
Happy to oblige Keith.
In the first place, if you click on the link, a better overall picture is Fig 1, which clearly shows that temperatures are rising and the data is entirely consistent with the IPCC predictions.
From the link:
IPCC Report 1990 
“Based on the IPCC Business as Usual scenarios, the energy-balance upwelling diffusion model with best judgement parameters yields estimates of global warming from pre-industrial times (taken to be 1765) to the year 2030 between 1.3°C and 2.8°C, with a best estimate of 2.0°C. This corresponds to a predicted rise from 1990 of 0.7-1.5°C with a best estimate of 1.1°C.”
What we have here is another case of cherry picking. Specifically, 3 cooler years (2006-2008) allegedly invalidate a linear trend line not scheduled to end until 2030.
(I actually have some other questions concerning whether the data as presented has taken zero point offsets between different data sets into account but am contacting Clive Best with those.)
Clive, there are some serious flaws in your graph:
1) If you’re comparing temperature prediction with observation from 1990, then blue and green lines should be moved up to meet red lines AT 1990. This also corrects the HadCrut and UAH offsets (they use different reference temperatures).
2) You should not “assume a linear extrapolation to May 2011”, as no IPCC modeling predicts a linear temperature increase. It is very clear from the 1990 graphs that temperature was predicted to accelerate mid-century, meaning slower temperature increases now and lower-gradient red lines.
3) You should not use 2011 data as it will have a cooling bias because the Northern Hemisphere summer temps are not yet in.
Please publish new graphs after correcting errors, it may show modeling wasn’t too bad after all.
Sorry Clive, on my point 1) I had not seen your top graph, which does what I’ve asked.
But the blogosphere has picked up on Figure 2, which I think is misleading to those who have a cursory glance, and opens it up to criticism.
Nice work. I’m currently looking at James Hansen’s predictions. They seem equally accurate.
What was IPCC’s prediction for 2010, if any? Since the temperature cannot be expected to rise linearly it is obviously absurd to take half the increase from 1990 to 2030!
Question. You stated that, “[t]he curves through both datasets are least square smoothing fits.” Did you use the daily, monthly or yearly data to calculate the least square smoothing fit? — John M Reynolds
Let me clarify what my aim was in this study. The first IPCC report from 1990 and John Houghton’s (excellent) book both make specific predictions over a timeframe short enough that we can actually compare with measurement. It is no good predicting the climate in 2100 since we will all likely be dead by then. Good science makes predictions which experimenters can test (i.e. forecasts, not hindcasts). A simple doubling of CO2 in the atmosphere (calculated for example with MODTRAN) forecasts a temperature rise of only about 0.8-1 degree. Current models assume positive feedbacks with H2O leading to enhanced warming, but these assumptions also need testing.
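The back-of-envelope doubling figure mentioned above can be reproduced with the widely used logarithmic forcing approximation dF = 5.35 ln(C/C0) W/m2 and a no-feedback (Planck) response of roughly 0.3 K per W/m2; both numbers are assumptions of this sketch rather than values taken from the post:

```python
import math

# No-feedback warming estimate from the logarithmic CO2 forcing
# approximation dF = 5.35 * ln(C/C0) W/m^2 (an assumed formula here),
# combined with a Planck-response sensitivity of ~0.3 K per W/m^2.
def no_feedback_warming(c_ppm, c0_ppm, sensitivity=0.3):
    forcing = 5.35 * math.log(c_ppm / c0_ppm)  # radiative forcing, W/m^2
    return sensitivity * forcing               # temperature response, K

# Doubling CO2 (280 -> 560 ppm) gives ~3.7 W/m^2, i.e. roughly 1.1 K,
# in line with the 0.8-1 degree figure quoted above.
print(no_feedback_warming(560, 280))
```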
The data show a warming effect of around 0.2 degrees since the prediction was made back in 1990. Since 2000, however, both trends for HadCru and UAH show no further rise. Only the IPCC (1990) lower-limit prediction of a 0.7 degree rise by 2030 remains compatible with the data. Furthermore, the increasing rate of change expected from any positive feedback mechanism is also not evident.
I hope we won’t have to wait another 20 years until 2030 before this is settled.
I used the monthly values with an effective smooth width of 4. That is why the last point of the trend is 2009 and not 2011, although that last point is positioned by the other two years.
The value for 2011 is actually just sum(jan-may)/5 for both datasets. The trend curves could also be calculated using rolling averages.
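A minimal version of the rolling-average alternative mentioned here might look like this (a sketch only; the least-squares smooth actually used for the figures is a different, heavier procedure):

```python
# Centred rolling mean over a monthly or yearly anomaly series; endpoints
# are averaged over however many neighbours are available, so the output
# has the same length as the input.
def rolling_mean(series, width=4):
    half = width // 2
    out = []
    for i in range(len(series)):
        window = series[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

print(rolling_mean([0.0, 1.0, 2.0, 3.0], width=2))  # [0.5, 1.0, 2.0, 2.5]
```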
Pingback: Global warming since 1995 'now significant'
Pingback: IPCC Predictions mega fail (again)
Regarding the use of a linear interpolation for the period from 1990 to 2030: the relevant graph taken from the 1990 report is shown below. I agree that over 250 years the graph is more exponential, but a linear fit over the period in question, 1990 to 2030, is a fair approximation.
These graphs seem to be taken only from models run back to 1850. The prediction made in 1990 is based on the current temperature then – hence the starting point must be the current temperature in 1990.
Mat L. “3) You should not use 2011 data as it will have a cooling bias because the Northern Hemisphere summer temps are not yet in.”
This doesn’t matter, since the data represents monthly anomalies, not temperature.
Dr Phil Jones stated that there has been no statistically significant warming in the last ten years. Also, another member of the Fiddlestick Team said they couldn’t account for the lost heat.
In response to Mat L. 3) I have repeated the same procedure now excluding the 2011 average up to now. The result is shown below.
The conclusion is the same.
Tilo Reber makes a good point however. If it really were to be true that the monthly temperature anomaly depends on the particular season in the Northern hemisphere, then it would appear that the procedure must produce a biased sine-wave in the data.
Steve Mosher has made some interesting comments concerning this graph over at WUWT.
The point he raises about the scenario you use appears to be valid. For example, the BAU scenario assumed rapidly rising levels of CO2, which has not been the case. He suggests creating the graph again using the CO2 total closest to what has actually occurred.
I see you have been willing to update the information based upon specific criticisms. I, for one, would be very interested to see the observed temperatures compared to the closest scenario.
For the record, I am quite a bit skeptical of CAGW, but I do like to see accurate information.
Thank you for listening.
Mat L. “It is very clear from 1990 graphs that temperature was predicted to accelerate mid century, meaning slower temperature increases now and lower gradient red lines.”
Do you suppose that the IPCC forgot that the effect of CO2 is logarithmic? In other words, whatever temperature increase results from adding 280 ppm to the pre-industrial level, we will have to add 560 ppm beyond that to get the same temperature effect again.
Thanks Clive for the updates, interesting stuff.
“I used the monthly values with an effective smooth width of 4. That is why the last point of the trend is 2009 and not 2011, although that last point is positioned by the other two years.”
So, you used the monthly data. As your width was 4 and the last trend point was 2009, that means that the 4 represents years. I guess that means that you took the monthly data, got a yearly average, then fitted those data to a curve. I recall some statisticians, like W Briggs and his son WM Briggs among others, suggesting that you should not fit averaged data.
The IPCC and their supporters too often neglect to include error bars with their data, is there any way for you to add the level of uncertainty to your graphs?
John M Reynolds
Ross. The outgoing energy is simply slowed down. It just takes longer for the earth to cool down compared to a scenario where greenhouse gases are not present. As well, you cannot have millions of ppm. As soon as you reach a single million ppm, the concentration has hit 100%. — John M Reynolds
Pingback: The Sound of Settled Science « SGTreport.com
Jack Greer. The fourth graph from your link proves Clive’s data is correct. Thanks. — John M Reynolds
Pingback: 1990 IPCC Predictions Confront the Data
Pingback: Gone Skiing | Be Responsible – Be Free!
I’ve been somewhat successful at putting this in its place on forums. I simply point out that the IPCC is using a legal standard of proof instead of using the Scientific Standards. The Scientific Standard is the Scientific Method. It is impossible to apply the scientific method to climate change theories because we cannot repeat experiments, since we have access to only one climate. The IPCC has chosen to use a consensus of elites to justify political action. Then, by implying the consensus is a consensus of climate experts, they suggest that those opinions have some sort of scientific basis.
Scientific Theories, Scientific Opinions, etc have a specific meaning, they are theories and opinions that can be tested using the scientific method.
When I use the IPCC’s own wording, even the most faithful fanatic will admit that man’s contribution to global warming can not be proved with Science. This exposes man made global warming as a political scheme disguised as a Scientific Theory.
Consensus was obsoleted by the Scientific Method.
Where are the error bars on the measured data? Why do we never see them? Does anyone really believe we can measure and track world temperature to 0.01 degC resolution or accuracy???
But the IPCC figures include positive feedbacks. Without feedbacks, just using the CO2 molecule emissivity, how does that jibe with the data?
Wouldn’t that be funny if it jibed well? Then the warming profiteers would really have to explain why global warming is scary and why we are in a “crisis.” The CO2 hypothesis “proven”, yet also destroying the global warming industry, based on feedback fears.
Pingback: 1990 IPCC Predictions Confront the Data - US Message Board - Political Discussion Forum
The IPCC makes temperature predictions conditional on greenhouse gas levels. Business as usual did not happen (CFC and methane concentrations stopped rising) so the red prediction lines are not a correct representation of what the IPCC predicted in 1990.
Here is the relevant graph from the IPCC 1990 report showing the expected temperature rise from the “best case” sensitivity under different emission scenarios:
Here it is compared to UAH:
Pingback: Dinocrat » Blog Archive » More of the same
There are two elements to this. Firstly we have two completely independent measures of the average global temperature for each year. Therefore we can estimate the error on each single year by comparing the divergence between the two. The error estimated this way works out at approximately ±0.07 degrees per data point. The data also clearly show that there exist systematic year-to-year variations caused by natural variation – such as El Nino, volcanoes and the like.
Therefore, to identify longer-term trends, we prefer to smooth out these short-term variations in the data by using a rolling 2- or 4-point average or a least-squares fit. When we do this we see that there is rather good agreement in trends between the two datasets. The divergence of these independent averages leads to an estimated error of about ±0.03 degrees in trend over the 21 years.
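The per-point error estimate described here can be sketched as follows; it treats the two independent series as measurements of the same quantity, so each series contributes half the variance of their (offset-corrected) difference. This is my reading of the procedure, not code from the post:

```python
import math

def per_point_error(a, b):
    """Estimate the one-sigma error of a single annual value, given two
    equal-length independent series of the same quantity (any constant
    normalisation offset is removed via the mean difference)."""
    diffs = [x - y for x, y in zip(a, b)]
    mean = sum(diffs) / len(diffs)
    var = sum((d - mean) ** 2 for d in diffs) / (len(diffs) - 1)
    return math.sqrt(var / 2.0)  # each series carries half the variance

# Two series that disagree by +/-1 about a constant offset:
print(per_point_error([1.0, 2.0, 3.0, 4.0], [0.0, 3.0, 2.0, 5.0]))
```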
There remains also the possibility of longer (20-30 year) time scale systematic (physical) variations in the temperature data due to as yet unidentified natural causes.
I do agree that there is a need for a better statistical analysis so that confidence levels can be set on model predictions.
The sun has more to do with the rise and fall in global temperature than anything man can do. Natural disasters are another factor (Krakatoa). Back in the ’70s the “experts” were blowing hot and cold, gloom and doom, claiming that world temperatures were falling and we would all freeze. The only thing that is heating up is the political lies. They didn’t get the lie across in time before the sun cycled to lower radiation levels. Wait another eleven years and try again, or change your gloom report to “We are all gonna freeze”.
Pingback: TaJnB | TheAverageJoeNewsBlogg|| 1990 IPCC Predictions Confront the Data
I don’t even pretend to fully understand all of the math and science discussed here. However, I would like to point out that there is a remarkable difference in how Clive Best answers any criticism of his work. He responds with respect and facts, trying to answer and explain all of the questions about his data.
Just for fun, try questioning the data and theories of the alarmists and see what kind of response you get from those narrow-minded ideologues.
If you want to test the capability of the model used for the simulations to reproduce the globally averaged temperature change in Nature your approach is scientifically flawed. You use climate simulations, for which idealized emission scenarios and derived forcing are prescribed. But the real emission scenarios have deviated from the prescribed ones. Even or particularly for a highly skilled model, the simulated temperature change would deviate from the observed one in such a case. Thus, you can’t draw a valid conclusion about the capability of the model from a discrepancy between simulation and reality.
1. I haven’t found any statement in the IPCC report from 1990 where it is claimed those temperature “predictions” were what was going to happen in the future with high certainty. I found many ifs and buts instead, and long elaborations about what should be done to decrease the uncertainty and to better understand the climate system back then.
2. You say, the “only IPCC prediction which remains consistent”… But you only compare to the “business-as-usual” scenario, which is the most extreme emission scenario in the report. Why don’t you mention that there are three more scenarios, scenario B, C, and D? Why don’t you compare to a less extreme scenario? As I said, a test of the capability of the model to simulate is scientifically flawed, if already the prescribed forcings differ from the real world.
Yes, CO2 is not the main driver on a time scale of 10 years. Or 5 years. Or 2 years. Or 2 days. The global warming deniers here may be very excited about this, believing this refutes global warming. Like they perceive any temporary wobble against the long term trend in any climate variable as confirmation of their beliefs. But where do climate scientist actually claim that CO2 was the main driver on such short time scales, which would be refuted by this? And besides climate drivers, the shorter the time-scales, the more natural variability determines the temperature change compared to the global warming trend that plays out over many decades and centuries instead. And CO2 is not the only climate driver anyway, even on longer time scales. There are other greenhouse gases, aerosols, land use, and also solar variability to some degree. Add that there is also natural variability on longer time scales, the temperature won’t just linearly follow the CO2 increase, even on longer time-scales.
How would you see such evidence just by looking at the temperature curve over the 20 years? The observed temperature change is due to the combined effect of all climate drivers, positive and negative feedbacks, and natural variability.
You make some valid points. However, I would ask you to read my latest post regarding historic temperature data since 1850 for a longer time view of the feedback issue.
Logarithmic dependence of temperature on CO2 levels
This assumes that all long-term effects are caused just by the radiative forcing of CO2. It then compares the data to a very simple possible model. Perhaps the change in that model of the constant from 1.6 to 2.5 may be caused by ignoring other greenhouse gas effects and a little feedback. Perhaps the model is wrong – and you will be able to point out the errors to me? I would be happy to concede if you can fault the argument.
Right now I am on a boat so have very little internet access – so please be patient for a proper reply to your other points.
However, I believe the evidence clearly shows that the IPCC may have exaggerated the effects of CO2 emissions for non-scientific reasons. This is my main point.
What are you saying? Last decade was the warmest decade since 1880, because the sun has cycled down, which had been, according to you, the major cause of the observed temperature change over the last decades? Somehow, I’m missing the logic in this.
This statement is an exaggeration, a distortion of the historical facts, which, I suspect, serves the purpose to discredit what climate science says today about global warming.
predictions are only just warnings.
virtual studies may or may not prove them ,
yet the significance of prediction
is not out of place.
While I appreciate the attempt, you do not peg them to the 1990 starting point used in the original graph. This makes it difficult to compare.
Are you kidding? Hadley CRU temperature data?!?!
THERE’S NO SUCH THING!!!!!
Go ask Phil where his raw data is!
Do you even read the news?!
Clive, I was surprised to see that the 1990 IPCC models were being tested today, two decades after they were developed, and four IPCC assessments ago. These were very early models for which little was claimed and for which many factors were omitted that have since been included. At the least it strikes me as unusual, scientifically, to make such a comparison.
But beyond that, your corrected graph does not support the statement that “the conclusion is the same.” Both the UAH and Hadley data sets fall within the limits of the (assumed linear) model curves except for 2-3 years at the end. As is well known, one cannot draw any conclusions on the basis of spans of less than 10-15 years. This is too bad for impatient us, but the data are too noisy for anything else. There has been a lot of hyperventilating over short-term variations that don’t seem to follow the global warming model – the claimed cooling since the El Nino year of 1998, e.g. – and it works for those who like their daily dose of confirmation bias. But it’s lousy science. Well, it’s not even that… it’s not science.
In response to LA Coleman.
The reason to test the first IPCC report from 1990, rather than say the latest one, is simple: it can be tested at all precisely because it was made 21 years ago. Current forecasts are untestable because the time periods involved to test them are unreasonably long. In 1990 a strong prediction was made over a measurable time limit, and without the benefit of hindsight. CO2 emissions have continued to increase as assumed back then, but the temperature rise has apparently been smaller than the model predictions. IPCC’s latest report still forecasts rises of about 3°C, ranging up to 6°C, by 2100. The simplest physics model possible for the enhanced greenhouse effect gives a logarithmic dependence on CO2 levels and only modest temperature increases even assuming CO2 levels continue increasing (see https://clivebest.com/blog/?p=2241 – just 0.7-1°C by 2100).
Yes it is true that El Nino effects, volcanic eruptions, and solar variability have short to medium term effects on climate. Good science is about developing theories and models which can then be tested by experiment. The IPCC models which assume positive feedbacks could be correct – but they still need testing over reasonable timescales. They cannot just be accepted as an act of faith.
It is rather like a preacher preaching that unless we repent now and mend our immoral ways, God is going to destroy the world in 100 years time. Should we ban Rap music now, close down the Internet and cable TV as an insurance policy in case he is right? The preacher knows full well that if the world doesn’t end in 100 years time then he won’t get the blame, because he will be long dead by then.
I was wrong! In fact we can compare the IPCC 2007 predictions to the same data. I am sorry if I implied otherwise in the comment above. The report contains the following figure for short-term predictions based on various scenarios for CO2 emissions.
The scenario restricting levels to 2000 values has not occurred, and the others give more or less the same predictions. Therefore I have just used a linear prediction through them to compare to the same data as before, also plotting the year-to-year variations of HadCrut with the fits. The result is
Pingback: Clive Best » Blog Archive » IPCC Predictions (2007 report) compared to data
Where’s the “global warming”?
It’s over at places like REALCLIMATE.com, where they still are pursuing the fantasy that humans have usurped natural forces and now control weather, temperature and climate – and they’ve got the doctored data and the flawed mathematical models to prove it!
Meanwhile, scientists, the real kind, report that the sun is showing clearer signs of entering into a “Maunder Minimum” which suggests that we’re looking at significant global cooling, if not another mini ice age (or worse) over the next few decades.
I’m sure Hansen, Jones and the confused over at REALCLIMATE.com will still be whining about carbon dioxide as the glaciers begin to advance to cover Canada, Scotland and the rest of Northern Europe.
@ Jan Perlwitz:
The claims about the last decade, based on proven doctored data and proven cherry-picking of sites to generate that data, were that it was hotter than any since 1880.
Those of us who actually lived through it and paid attention know that was not true.
Yes, the “significance of the prediction” is indeed out of place. The sun is showing clearer signs that we can expect major COOLING in the coming decades, so even if the claim that anthropogenic CO2 is causing significant warming were true, and make no mistake it is not, the truth is that it would be a good thing, not a bad thing, with the probability that we’re about to see another ice age growing.
Yes, very astute, for it is certain that the IPCC absolutely DID exaggerate the role of CO2, particularly of anthropogenic CO2, for other than scientific reasons exactly as you suggest.
I’m going in reverse order here, so once again,
@ Jan Perlwitz:
The problem we don’t seem to be communicating to you is that when models are consistently, significantly incorrect time and time again, when they diverge substantially from reality, when they cannot even get temperature and precipitation correct at the same time, it’s time you stopped suggesting that models are something we can have so much faith in until you figure out why they’re so consistently wrong. Your excuses are unacceptable. For they’re wrong because they start from, and are programmed to include, completely false assumptions. Chief among these is the idea that anthropogenic CO2 is having a measurable effect on weather, temperature and climate.
It simply is not.
Clive, would you do an updated version of this thread? I suspect newer evidence will make your point even more strongly! (No need to publish this comment)
Pingback: $billions on ball gazing | JunkScience Sidebar
Pingback: The IPCC's track record
Pingback: Why worry about global warming? - Page 37 - Christian Forums
Thank you very much for the article.
Pingback: Climate change
Pingback: Is There Really A Need To Panic Over a .29C Rise In World Temperatures? | Right Wing News
Pingback: Is There Really A Need To Panic Over a .29C Rise In World Temperatures? : Stop The ACLU
Pingback: Is There Really A Need To Panic Over a .29C Rise In World Temperatures? » Pirate's Cove
Pingback: 1930s photos show Greenland glaciers retreating faster than today - Page 47
Pingback: Even more Republican nutiness - Page 8
Pingback: #ScienceSaidSo: Iowahawk, others offer historical perspective to @BarackObama’s settled science
Any model that attempts to predict the planet’s (climate) dynamics must necessarily fail. It is a fundamental property of the universe that certain types of phenomena (what we call “fundamentally unstable phenomena”) have “unpredictability” as part of their raison d’etre.
For example, the climate, financial markets and even apparently simple systems like lava lamps and Hele-Shaw cells are fundamentally not predictable. It makes no difference how many math geeks or supercomputers you throw at the problem.
Notice, even if you could derive the exact correct model equations, you still would not be able to make any practical forecasts. These types of phenomena generally have extreme sensitivity to initial conditions and to parametrisation. Moreover, as they are non-linear, those errors are magnified dramatically/exponentially as they feed back on themselves with each time step.
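This sensitivity is easy to demonstrate. Here is a minimal Python sketch using the logistic map, a standard toy chaotic system (not a climate model, just an illustration of the principle): two runs started a mere 1e-10 apart become unrelated within about 50 steps.

```python
# Two runs of the logistic map at r=4 (its chaotic regime), started
# 1e-10 apart. The separation roughly doubles each step, so ~50 steps
# amplify 1e-10 to order one before the error saturates.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x_a, x_b = 0.4, 0.4 + 1e-10
for step in range(50):
    x_a, x_b = logistic(x_a), logistic(x_b)

# The two trajectories are now typically a large fraction of the
# whole [0, 1] interval apart, despite the near-identical start.
print(abs(x_a - x_b))
```

The same mechanism is why any error in initial conditions or parameters of a non-linear model compounds with every time step.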
If anyone would like a short pedestrian explanation with lots of pictures and not too many equations, send me an email address and I’ll post a pdf.
HOWEVER, a most ASTONISHING aspect of climate modelling is that one need NOT know any maths whatsoever to prove conclusively that the IPCC-style models must (necessarily) be rubbish. You can do this with just one word: volcanoes.
In AR4 Fig 8.1 or the equivalent AR5 Fig 9.8 they show you their back-testing against a century or more of real data. Their models show excellent agreement with the real data.
Of particular interest should be that their models track the planet’s sudden cooling just after large volcanoes, with extreme precision.
But HOW is that POSSIBLE?
Since no one can predict volcanoes, the short answer is that they cheat.
Notice that during 1900–2000 the net warming was about 0.7 C per century, but the volcanic cooling was 1–2 C per century … i.e. volcanoes are very important, with a big impact.
The only way to “intervene” in a proper forecast would be to use your time-machine to discover when, how big, and what type of volcanoes will occur, say, in the next 100 years.
Without that time-machine/cheating in the forecasts, one can reasonably expect forecasting errors on the order of 1–2 C per century … i.e. so large as to confirm the models’ rubbish status.
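A toy Monte Carlo can illustrate the point. Every number below (eruption frequency, cooling size, decay time) is invented purely for illustration; the sketch only shows that, without knowing the eruption schedule in advance, different random centuries deliver very different total volcanic cooling, and that spread is irreducible forecast error.

```python
import random

random.seed(42)  # reproducible illustration

def century_volcanic_cooling():
    """Total volcanic cooling over one random century, in C-years."""
    total = 0.0
    for year in range(100):
        if random.random() < 0.1:            # assumed: ~10% chance per year of a big eruption
            peak = random.uniform(0.1, 0.5)  # assumed peak cooling, in C
            total += peak * 3.0              # assumed ~3-year decay, integrated crudely
    return total

# A forecaster with no volcano timetable cannot know which realisation
# the real century will follow; the run-to-run spread is the error floor.
runs = [century_volcanic_cooling() for _ in range(2000)]
mean = sum(runs) / len(runs)
spread = (sum((r - mean) ** 2 for r in runs) / len(runs)) ** 0.5
print(f"mean cooling ~{mean:.1f} C-years per century, spread ~{spread:.1f} C-years")
```

Back-tests that already know when Pinatubo or El Chichon erupted face none of this uncertainty, which is the asymmetry being complained about above.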
… in short, the models are complete nonsense, just on the volcano issue, never mind the many deep mathematical and physical issues that are each “terminal” to this modelling process.
In any case, if the predictions do not conform to observations, then it is the models that are wrong, not the data … unfortunately, the IPCC et al seem (consistently) to insist that the greater the discrepancy to real data, the more “right” they are … that’s just too weird.
On a related, but lesser note: Since much of their modelling revolves around RF, any errors introduced in the RF estimate will be serious, even if an RF approach is sensible.
For example, why is it RF = a Ln(C/Co), why not
RF = a Ln(C/Co) + b Ln(H2O/H2Oo) + c Ln(Volcanic Particulates/VPo)+ …. ??
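For reference, the one-term expression in question is the simplified fit from Myhre et al. (1998), with coefficient a ≈ 5.35 W/m²; the extra H2O and volcanic terms written above are the commenter’s proposal, not part of that published fit. A quick sketch of the standard formula:

```python
import math

# Simplified CO2 radiative-forcing expression, RF = a * ln(C / C0),
# with the Myhre et al. (1998) coefficient and a commonly used
# pre-industrial baseline concentration.
A = 5.35    # W/m^2
C0 = 280.0  # ppm, pre-industrial CO2

def rf_co2(c_ppm, c0_ppm=C0, a=A):
    """Radiative forcing (W/m^2) of CO2 relative to the baseline c0_ppm."""
    return a * math.log(c_ppm / c0_ppm)

print(rf_co2(560.0))  # CO2 doubling: 5.35 * ln(2) ~ 3.7 W/m^2
print(rf_co2(400.0))  # ~2015-era concentration: ~1.9 W/m^2
```

Whether a single logarithmic term in CO2 alone is an adequate summary of the forcing is exactly the question being raised.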
Also, these sorts of RF curve fits are curve fits of curve fits. The HITRAN and related models are just that, models. They may have some tie to idealised laboratory data, but have little if anything to do with the actual atmosphere. Indeed, what could they possibly know about the atmosphere in 1850? (ignoring also the many other huge forces: solar, planetary, geological, etc.)
Indeed, reading Myhre’s paper, I could not find a single reference to “water” or “H2O” or anything like that. If the IPCC begins its AR’s with “water vapour is the most important GHG”, then how can they leave that out of the RF?
There is something I have not been able to confirm, and perhaps someone may comment. I have not been able to find sufficient detail in the RF etc. formulas relating to the effects of altitude vs. density. For example, there is much attention to CO2 at higher altitudes, and in the stratosphere. By 10 km, density is about 30% of sea level, and by 20 km it’s about 8% of sea level … i.e. the entire volume of the stratosphere might contain all of 3 molecules :-). Does it really matter if it’s 400 ppm, when there are virtually no “p’s”? In any case, if someone knows of some place where that is handled, I would be grateful.
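The density figures quoted here can be sanity-checked with the isothermal barometric formula, rho(z) = rho0 * exp(-z/H). The scale height H ≈ 8.5 km is an assumed round number (the real atmosphere is not isothermal, so this is only an approximation):

```python
import math

# Isothermal barometric approximation: density falls off exponentially
# with altitude, with an assumed scale height of ~8.5 km.
H_KM = 8.5

def density_fraction(z_km, scale_height_km=H_KM):
    """Air density at altitude z as a fraction of the sea-level value."""
    return math.exp(-z_km / scale_height_km)

print(f"10 km: {density_fraction(10):.0%}")  # roughly 30% of sea level
print(f"20 km: {density_fraction(20):.0%}")  # roughly 10% of sea level
```

So the 30%-at-10-km figure checks out, and the 20 km value lands near the quoted 8%.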
CO2 is termed as being “well mixed” meaning that concentrations in general are constant way up into the stratosphere. However there are regional and seasonal variations.
Yes, density and pressure fall sharply with height. For each wavelength there is a height at which the density becomes low enough for IR radiation to escape directly to space. That is the effective emission height for that wavelength. In fact the greenhouse effect depends only on the temperature at that height. As greenhouse gas concentration increases, that height moves up a bit to a cooler level, and so slightly less energy is radiated. This is essentially the origin of radiative forcing, and everything is really determined high up in the atmosphere. The argument about back radiation warming the surface is a diversion and incorrect.
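A rough numerical sketch of this mechanism, using the Stefan-Boltzmann law with Earth’s effective emission temperature (~255 K) and an assumed typical lapse rate of 6.5 K/km; the 150 m shift is an illustrative number only:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
LAPSE = 6.5       # assumed tropospheric lapse rate, K/km
T_EMIT = 255.0    # Earth's effective emission temperature, K

def olr(t_kelvin):
    """Outgoing longwave flux (W/m^2) from a blackbody at t_kelvin."""
    return SIGMA * t_kelvin ** 4

# Raising the effective emission height by ~150 m moves emission to a
# level ~1 K cooler, cutting the outgoing flux by a few W/m^2, the
# same order as the forcing usually quoted for a CO2 doubling.
base = olr(T_EMIT)
raised = olr(T_EMIT - 0.150 * LAPSE)
print(f"OLR at 255 K: {base:.1f} W/m^2")
print(f"Reduction for +150 m: {base - raised:.1f} W/m^2")
```

The planet then warms until the (now higher) emission level reaches the temperature needed to restore the radiative balance.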
Many thanks for your prompt response.
Not trying to be funny or anything, but I may not have posed my thoughts with sufficient clarity, so I will try again:
1) The “volcano issue” (and related modelling matters) may be sufficient to provide a full and final end to the current IPCC climate forecasts. If so, then that puts an abrupt END to the climate POLICY questions, since without predictability, there can be no (sensible) policy.
Amazingly, then, one may need not know anything at all about climate science or mathematics to make a sensible decision on climate policy.
I have put a slightly easier (lots of pictures) and slightly more comprehensive version of this point in a PDF that can be downloaded here (http://www.thebajors.com/climategames.htm).
Perhaps you might comment on that as and when, and perhaps make that or some updated version available on your pages.
a) What I am looking for is not so much a general discussion of the plausibility of altitude/density effects. Rather, I am looking for papers/proof, one way or the other, of which (if any) such effects are actually accounted for anywhere in the IPCC modelling.
If you, or anybody, has documentation concluding this one way or the other, I would be grateful.
For example, if it turns out the (effective) emission boundary is well into the troposphere, then there are significant ramifications for the credibility of various “models”.
Also, just because there is more “stuff” up there, that does not necessitate the type of increase in the effective emission altitude some may imagine. The relationship between “stuff” and altitude is sufficiently complex to warrant a proper calculation.
Also, if the atmosphere is warming and the emission boundary is expanding, what amount of the new warmth in the atmosphere goes into that emission layer (i.e. does it actually get pushed to colder levels, and by how much, etc.)? Again, I would like to know the extent to which these mechanisms are addressed in the actual calculations.
The “well mixed” matter is a little further down my list of priorities, but engineers had been solving the equivalent “Continuously Stirred Tank Reactor” (CSTR) problems for decades prior to these climate controversies, so I may return to this later. It’s tedious, but well understood.
b) Not sure what prompted the “back radiation” comment; I don’t recall bringing that up.
c) I am interested, though, in why the RF formula as used by the IPCC et al. includes only CO2. If there has been a change from 1850 to the present (or whenever), why must all of that necessarily be associated with, and only with, CO2?
There are many other GHG things in the atmosphere, and plenty of other huge forces impacting/causing the changes, so ????
Why, as some say, is it not at least RF = a Ln(C/Co) + b Ln(H2O/H2Oo), etc.?
… aside: I am not sure if it was one of your bloggers, but at my end of “the business”, this would not be a “canonical form”.
… and perhaps for another day: I am not a fan of summary-measure-based balances such as RF. In my experience, it is much better to solve the full and detailed system of coupled PDEs for momentum, heat and mass transfer, and while that is much more work, it is much more reliable.
… though on the point of “much more work”, as they have spent billions over decades to fund all the armies of math-geeks and mountains of supercomputers, one might think that budget is not an issue.
Actually, in connection with this, and on the “lighter side”, one of the visiting seminars at my graduate school was given by the head of software development for the space shuttle. The thrust of his presentation was “how do you check code with 1 million lines?” … but he had an excellent opening remark (re software development): he said, “We had an unlimited budget, and essentially we exceeded it” … hope that puts a smile on someone’s face 🙂
Every comment here is loaded with emotion and emotion is inimical to reason and science. So you all should take a chill pill and stop thinking with your mouths. If, indeed, Planet Earth’s climate really is warming, it’s because of all the hot air coming out of your pie holes.
Pingback: 550 SCIENTIFIQUES GIECOSCEPTIQUES – belgotopia
Pingback: Monte Rosa and the Comets | MalagaBay
With 2020 coming in over 0.4 C warmer than 2010, this really didn’t age well, did it? Temperatures are now slightly above the “Best” line on that chart.