3D visualisation of Earth Science data

WebGL is a JavaScript API based on OpenGL which runs in your web browser, directly on your local graphics hardware. Nick Stokes has developed an interface to visualise 3D global data such as temperatures on the earth. Here is an example for my March 2017 data. Click and drag on the sphere below to rotate it. Toggle buttons to add/remove triangular and point meshes.

To view the complete interface in a new window click here

The great thing about such a 3D presentation is that you view distributions as they actually are, rather than through some distorted 2D map projection. Often such projections make polar regions appear far larger than they really are. Here are some more examples which I hope will show how useful 3D visualisations can be. To view Nick’s WebGL interface click on each title.

1. Super El Nino January 2016

2. Cold winter of 2010

Europe experienced its coldest winter for 47 years. Cold Arctic air spread south over Europe and US under  a meandering Jet Stream.

3. Cold winter of 1963

The coldest winter in Europe for 100 years saw a similar pattern to 2010. There were twice as many weather stations active in 1963 as there are today. Compare the number of nodes in the grid with those visible in 2017, and you will see a huge reduction in stations.

VRML was originally proposed in 1995 as the 3D visualisation standard for the web, but you always needed a plugin or an SGI workstation. X3D grew out of VRML but has not been widely taken up. WebGL, however, has the advantage that it is supported by all modern HTML5 browsers.

However, WebGL and JavaScript are not for the faint-hearted, as programming and debugging are rather difficult. Credit therefore goes to Nick Stokes for doing most of the hard work to make a generic Earth Science interface, which is then relatively easy to adapt to new sets of data.

This entry was posted in AGW, Climate Change, Internet, web services. Bookmark the permalink.

23 Responses to 3D visualisation of Earth Science data

  1. Lance Wallace says:

    The first link (Super El Nino) returns a 404 for me.

  2. Hans Erren says:

    Shame WebGL won’t run on my iPad.

    • Clive Best says:

      You really need a high-end PC or a MacBook Pro with a decent graphics card. Strangely enough, though, it half works on an iPad Pro. You get the interface OK and can toggle the mesh on and off, but you can’t spin the earth as far as I can tell.

  3. Nick Stokes says:

    Congratulations, Clive. It looks great, and they are very interesting cases.

    I think you need more significant figures on the lat/lon data.

    You’re right that WebGL really needs a good graphics card.

  4. Steven C Crow says:

    Great images Clive.

    Your work suggests to me a geophysical origin of warming. It seems not “global” but rather concentrated in a ring around the North pole, in Northern Canada and Siberia. It reminds me of efforts to explain the Earth’s magnetic field in terms of a dynamo arising from subsurface magma flows. The polarity of the dynamo changes every 100 thousand years or so, which exposes the surface to intense cosmic radiation as the net field strength passes through zero. Do you think your beautiful thermal images could be the imprint of subsurface magma flow?

  5. Steven C Crow says:

    I was a little sloppy in my message above. The average rate of field reversal has been 4-5 per million years over the past 10 million years, so the duration of a polarity is 200-250 thousand years. The last reversal was 780,000 years ago, so we are overdue for a change.

    • Clive Best says:

      The reversal of the earth’s magnetic field is a fascinating subject. It gave the first real evidence for continental drift when magnetic stripes were found symmetrically either side of the Mid-Atlantic ridge. Several years ago I was asked to propose a feasible plot for a novel based on a worldwide failure of computer and electronic systems. All I could think of was a sudden flip of the earth’s magnetic field, because no-one really knows either the cause or how fast the transition is.

      However, it can’t affect short term Arctic warming. There is no evidence for any increase in seismic activity say in Iceland. The main reason (I suspect) is that the earth spins around its polar axis. Heat flows from the equator to the poles asymmetrically with the seasons. Any small changes in energy balance are concentrated at the poles. Except Antarctica !

  6. Ron Graf says:

    Clive, Nick, very nice work! Is it possible to set up a tool for user control of monthly or yearly renderings? This would allow visual pattern searches.

    I notice that 4 out of 7 continents show significant warming anomaly. This looks like the expected AGW signal. But three out of four of those are in the NH, showing a preference, perhaps, for economic development and urban heat island effect. On the quest to eliminate the potential for AGE and UHIE confounding I would think one should expect to see a higher contrast between land and sea anomaly in La Nina years than in El Nino for AGW. A lower SST would create a higher radiative imbalance which should provide greater contrast between the lower land heat capacity and the sea. The AGW signal should also display a land/sea contrast in all the continents in a statistically even distribution. This is opposed to UHIE, which should not be affected by ENSO but instead should show a preference for continents/periods in economic development.

    • Clive Best says:


      Don’t forget these are individual months shown above. It certainly is true that most temperature changes occur over the Northern Hemisphere, especially at high latitudes. This changes from month to month and with the seasons, so part of what we observe is natural variation (weather and El Nino). There is an underlying warming trend of ~0.8C since pre-industrial times. The recent El Nino enhancement to 1C will likely reduce over the next 10 years. The land warming trend is larger (1.1C), but that can’t continue indefinitely. Some component of that land warming is definitely due to human UHI and land change over the last 150y. I suspect this is the real cause of the difference between ocean and land.

      The problem with climate change is that the whole subject has become toxic. Every possible pressure group is exploiting it, or the lack of it, for their own ulterior motives. I would also include some climate scientists in this group. Rational discussion has become almost impossible.

      Is climate change the most pressing problem for mankind? No it isn’t
      Can we decarbonise the world economy by 2030? No we can’t
      Can we decarbonise the world economy by 2100? Probably – if we don’t kill each other first!

    • Ron Graf says:

      Clive, I understood that the image represents just one month. My hope was that further programming sweat could produce a multi-month clickable tool, bringing the ability for everyone who has a computer (like school kids) to make observations. The more easily the data can be processed, visualized and tested, the faster discoveries can be made and accepted. I suspect Nick developed his programming with just this objective. By the way, realizing that Nick’s hypothesized ECS, and the political stance its implications demand, is different from yours and mine, I find it awesome that you and Nick are working together. If only all investigators could follow your example.

      Nick, Clive, what is your opinion of being able to differentiate UHI signal from AGW using the continent’s anomalies versus the sea’s? My understanding is that theory predicts that positive radiative imbalance would warm the land first since it has little heat reservoir, where the sea’s heat capacity is 10X higher. The larger the land mass and the further from sea influence the higher the predicted anomaly for a given imbalance. However, with UHI the anomaly would not be affected by radiative imbalance or proximity to sea influence.

      • Clive Best says:

        There is a technical problem with combining monthly data to make annual or decadal data. The triangular grid is forever changing month to month, and year to year so averaging on the earth’s surface is not straightforward. It is easy to make a single global average, but to retain spatial resolution means making some assumptions.

        One possibility would be to define a standard hexagonal tessellation of the surface and project each month’s values onto that. Then we can make averages over any time period. I am not sure I will have time to look into that immediately.
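      The projection step could be sketched like this. A minimal Python sketch, using a fixed latitude/longitude cell grid as a simple stand-in for a hexagonal tessellation; the function name and grid sizes are illustrative, not from Nick's code:

```python
import numpy as np

def fixed_grid_average(months, nlat=18, nlon=36):
    """Average scattered monthly samples on a fixed lat/lon cell grid.

    months: list of (lat, lon, value) array triples, one per month;
    the station locations may differ from month to month."""
    total = np.zeros((nlat, nlon))
    count = np.zeros((nlat, nlon))
    for lat, lon, val in months:
        # map each sample to its fixed cell index
        i = np.clip(((lat + 90.0) / 180.0 * nlat).astype(int), 0, nlat - 1)
        j = np.clip(((lon + 180.0) / 360.0 * nlon).astype(int), 0, nlon - 1)
        np.add.at(total, (i, j), val)   # accumulate values per cell
        np.add.at(count, (i, j), 1.0)   # and sample counts
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(count > 0, total / count, np.nan)
```

      Because every month lands on the same cells, averages over any time period (and empty-cell handling) become trivial, at the cost of fixing the spatial resolution in advance.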

        I wish we could separate the politics of climate from the science of climate. Then we could soon all agree on the latter rather easily. Probably ECS is about 2.3C while TCR is about 1.5C. Then we can talk about what, if anything, should be done about it. Random action to curb climate change can be disastrous, causing more harm than good. e.g.

        Promotion of diesel in the UK. Ethanol production from corn. Biodiesel production from palm oil, causing deforestation in Borneo. Biofuel power stations burning imported wood chips.

        UHI has only a tiny effect on global temperatures. However, it does change the trend in land temperatures slightly, although not really after ~1960.

        Short-term global warming will lead to small shifts in climate zones, but that has always happened. The problem is just the rate of change. In the long run we are all ‘doomed’ anyway because another Ice Age is due within 10,000 years. If we learn to manage CO2 levels we might actually be able to avoid it.

        • Ron Graf says:

          >>>”UHI has only a tiny effect on global temperatures. However, it does change the trend in land temperatures slightly, although not really after ~1960.”

          Individual studies of UHI show significant effects. It only becomes difficult to detect after homogenization smears it over the record. UHI and AGW should have different fingerprints, something that climate science generally lacks. If land warming does not follow an AGW fingerprint it must be UHI (instrumental error) or natural.

          Land is the only credible index. Sea surface temperature records are much less reliable, except for the last 11 years, where they are likely better than land thanks to the 2005 deployment of thousands of Argo sensor robots. If the land record is found to be biased by 10-20%, one can be certain the sea indexes will be worse.

          >>>”In the long run we are all ‘doomed’ anyway because another Ice Age is due within 10,000 years. If we learn to manage CO2 levels we might actually be able to avoid it.”

          First, individually, of course we are all doomed much sooner. If we (humanity) survive another 80 years we will have either mastered fusion or achieved a 10X improvement in batteries, fuel cells, etc. If not, our grandchildren will be fighting matches in Thunderdome in a Mad Max world after “the epoci-clipse.”

          Global temperature can be regulated easily by parking mirrors in space at Lagrange Point 1 (L1). Lawrence Livermore Labs already thought of this 16 years ago. https://en.wikipedia.org/wiki/Space_mirror_(geoengineering)

  7. Steven C Crow says:


    I agree with your comments about “climate science”, which I always enclose in quotation marks to indicate an oxymoron. To get my own bearings on the issue, I calculated the temperature increase since 1958 using Mauna Loa for CO2 input, a simple but rigorous radiation propagation model, and the HITRAN database for the CO2 absorption spectrum. The model indicates that the Mauna Loa CO2 increase during that period (315-406 ppmv) would have caused a temperature increase of 0.25 deg C, and a doubling of today’s CO2 concentration would increase temperature by another 0.56 deg C. Those calculations involve no free parameters whatsoever.

    The calculated increase since 1958 is much less than the observed increase of about 0.55 deg C (Berkeley Earth), which suggests to me that the cause of the temperature increase is not primarily CO2, whether anthropogenic or not. That is what led to my speculation that subsurface magma flows could be the cause of the increase, though your objections to that hypothesis are cogent. Large-scale secular changes in oceanic flows are another, and more likely, cause.

    I am entertaining the thought that the CO2 rise measured on Mauna Loa is the result of geophysical temperature increases rather than their cause. The causal chain would be the same as for the annual oscillation in CO2 concentration: increase in T => increase in vegetation => vegetative decay => CO2.

    Steve Crow

    • Clive Best says:


      HITRAN calculates the ‘radiative forcing’ of increased CO2. This follows a logarithmic dependency, DS = 5.3 ln(C/C0), and a doubling of CO2 results in about 1.1C of warming.
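      That arithmetic can be checked directly. A minimal Python sketch; the ~3.3 W/m^2 per K Planck response used to turn forcing into warming is an assumed round number, not a figure from this comment:

```python
import math

def co2_forcing(C, C0, alpha=5.3):
    """Radiative forcing in W/m^2 from the logarithmic fit DS = alpha * ln(C/C0)."""
    return alpha * math.log(C / C0)

def no_feedback_warming(dF, planck=3.3):
    """Convert forcing to warming with an assumed ~3.3 W/m^2/K Planck response."""
    return dF / planck

dF2x = co2_forcing(2.0, 1.0)      # doubling: about 3.7 W/m^2
dT2x = no_feedback_warming(dF2x)  # about 1.1 C
```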

      The argument used is that the earth actually warms more than this due to positive feedbacks. So a slight warming causes more evaporation, which increases the H2O greenhouse effect. Melting ice reduces the albedo, so more heat is absorbed from the sun. However, there are also negative feedbacks: the lapse rate falls because the air is moister, and more clouds will form, which increases albedo. Climate models make assumptions about such phenomena and predict higher but differing rates of warming. The spread of climate models shows just how uncertain these feedbacks are. It is unlikely that matters will improve either.

      So all we can say is that temperatures will probably rise somewhere between 1.5C and 3.5C if CO2 levels reach 600ppm. Geothermal energy is assumed to be an insignificant factor in global warming. However, plate tectonics are essential for maintaining the carbon cycle and recycling the elements that are essential for life. For example, photosynthesis in the oceans needs phosphorus. The consumption rate is about equal to the recycling rate through subduction zones and volcanoes!

      The increased CO2 fertilisation effect leads to faster uptake of CO2 by the world’s biota. One way of reducing CO2 would be to mimic the Carboniferous period: instead of burning or leaving dead vegetation to rot, we should sink it under water or in lakes. After all, that is all that fossil fuels really are in the first place.

  8. Steven C Crow says:


    Sorry Clive, on some of these points we shall have to disagree.

    HITRAN is a database of quantum-mechanical line absorption parameters, including absorption strength and various parameters associated with line broadening. By assuming an appropriate line shape, one can construct a highly resolved absorption spectrum over a selected bandwidth.

    For some molecules, the lines can be inferred from direct measurement, but not so for CO2. The vibrational lines of CO2 are too numerous and packed together for measurement at the needed resolution, so HITRAN generates the line profiles by quantum mechanical calculations. My work on CO2 absorption involves 12,933 of the 22,666 lines included in HITRAN for the isotopologue of CO2 dominant in the atmosphere.

    HITRAN was developed by the Air Force in the 1970s to facilitate the design of infrared missile seekers and is currently maintained by Harvard University. Happily, it is uncontaminated by “climate science”. It does not calculate “radiative forcing of increased CO2”. Any such applications are up to the user, but the user had better be prepared to sum absorption over 12,933 spectral lines with a resolution around 0.1 mu-1.

    The results for ground and air heating bear no resemblance to simplistic “climate science” formulas like DS = 5.3 ln(C/C0). In fact the temperature increase saturates at about 0.94 deg C as CO2 concentration rises above 2000 ppmv.

    It’s all physics and a little mathematics, but I would sooner look for physics and mathematics in Scientology than in “climate science”. Cheers!


    • Clive Best says:


      Sorry, I was thinking you were talking about MODTRAN!

      Yes, you are right, and I also calculated line-by-line radiative transfer myself from scratch using HITRAN. See: http://clivebest.com/blog/?p=4597

      The central lines in the 15 Micron band are saturated up into the stratosphere. As CO2 increases these lines actually ‘cool’ the earth because temperature rises with height.

      However, the net result for ‘radiative forcing’ more or less agrees with that of Myhre et al. See: http://clivebest.com/blog/?p=4697

      I have a suspicion that models simply use the same log dependency for CO2, and a general formula also for continuum H2O forcing, rather than performing a radiative transfer calculation at each time step.



      • Steven C Crow says:


        Aha!! Now we are talking real science without the oxymoronic quotation marks.

        I am not familiar with MODTRAN but will take a Google shot this evening. I imagine it is the HITRAN quantum physics core with some radiation transport accessories. As you may imagine, I am inclined to do the radiation transport math myself, but we are on our way to a great dialogue.

        I want to send you some images but do not know how in the framework of this blog. Let me know if there is a way. Meanwhile, I shall send equivalents of your image via email.


        • Clive Best says:

          You can add an image to a comment but it must be online somewhere. To do this put the URL for the image in a line on its own and it will appear.

          some URL

          More comments. Image appears above.

  9. Steven C Crow says:


    I looked at MODTRAN on the internet and shall look some more. It seems interesting, but I have not yet discovered whether it has any connection with HITRAN or whether it uses quantum physics to compute underlying absorption line strengths.

    MODTRAN means MODerate resolution atmospheric TRANsmission, whereas HITRAN stands for HIgh resolution TRANsmission. HITRAN was developed by the Air Force in the 1960-70s for implementation on mainframe computers. MODTRAN was developed in the late 1980s under an Air Force SBIR grant to Spectral Sciences Inc, no doubt for implementation on PCs. Nowadays, of course, HITRAN can be run on PCs, as I am doing.

    HITRAN uses quantum physics to compute absorption line strengths for vast numbers of modes of each molecule considered, 22,666 for the most common isotopologue of atmospheric CO2. Since the HITRAN bandwidth is 450 cm-1, the average separation between lines is 0.02 cm-1.

    MODTRAN6 uses some kind of statistical bin averaging over 0.1 cm-1 bands, then assumes that each bin acts as an absorber. Nowhere have I found what is averaged over, but it could be HITRAN line strengths.

    Whatever the source of the statistics, I believe that the method is unsound. The reason is that the Lorentzian line profiles used by HITRAN decay much too slowly to be confined to a 0.1 cm-1 band. A corollary is that they fill in the gaps between line centers so that the gaps themselves become saturated.

    My approach is to evaluate the absorption spectrum at each wavenumber kappa by summing over all the lines offered by HITRAN, not just those within a 0.2 cm-1 band around the target wavenumber. I currently use 1500 wavenumbers and 12,933 lines to produce an Excel array with 19,399,500 entries. Those are summed over all the lines and then over all the wavenumbers to produce the final result for warming. Daunting as those numbers may seem, they produce the desired result after about 13 sec of computing time on a Windows Vista computer.
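    The full-summation approach described here can be sketched with synthetic data. A Python sketch assuming Lorentzian line shapes; the line centres, strengths, and halfwidths below are illustrative numbers, not HITRAN values:

```python
import numpy as np

def lorentz(nu, nu0, gamma):
    """Pressure-broadened (Lorentzian) line shape, normalised to unit area."""
    return (gamma / np.pi) / ((nu - nu0) ** 2 + gamma ** 2)

def absorption_spectrum(nu_grid, lines):
    """Sum every line's contribution at every wavenumber, so the slowly
    decaying far wings are kept rather than cut off at a narrow bin."""
    k = np.zeros_like(nu_grid)
    for nu0, strength, gamma in lines:
        k += strength * lorentz(nu_grid, nu0, gamma)
    return k

# Illustrative line list (centre cm-1, strength, halfwidth) -- NOT HITRAN data
lines = [(667.0, 1.0, 0.07), (668.0, 0.6, 0.07), (669.5, 0.3, 0.07)]
nu = np.linspace(660.0, 676.0, 1500)   # wavenumber grid, cm-1
k = absorption_spectrum(nu, lines)     # absorption coefficient
transmission = np.exp(-k)              # Beer-Lambert for a unit column
```

    The point of the all-lines sum is visible in the result: the absorption coefficient is nonzero even far from any line centre, which is exactly the wing contribution a narrow-bin method would discard.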

    Thus there is no reason to accept whatever compromises MODTRAN makes to condense the number of absorbers. My results, moreover, look very different from what you showed me from MODTRAN. I’ll look at that matter more closely but am reasonably sure that it has to do with the extent of the Lorentzian line profiles.


  10. Steven C Crow says:


    A simple question arising from my effort to understand MODTRAN. What exactly is plotted in your figure titled “Compare spectra for different CO2 concentrations”? The top curve may be black-body radiance, and the others diminish for increasing CO2 concentrations, except perhaps near the main spike around 668 cm-1. But what measurements do those curves represent?


    • Clive Best says:


      This is not from MODTRAN. I calculated the IR spectra as seen from space, emitted by CO2 at the effective emission height. This is to be compared with the actual spectra observed by, for example, MODIS. So this was my calculation, not MODTRAN. MODTRAN is a radiative transfer code where you can adjust CO2 levels and observe the spectra and the surface temperature. There are several web interfaces, e.g.



      One exercise you can do is to double CO2 to, say, 800 ppm and see how the spectrum changes. Then increase the surface temperature offset DT until you get the same IR output flux as before. This means that a temperature increase DT offsets the enhanced greenhouse effect and is equal to the expected warming for a doubling of CO2. The last time I looked at it I got an answer of ~1C.
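      The ~1C figure is consistent with a back-of-envelope Stefan-Boltzmann check: the extra outgoing flux from warming the emitting layer by DT is roughly 4*sigma*T^3*DT. A sketch assuming a 3.7 W/m^2 doubling forcing and a 255 K effective emission temperature (both assumed round numbers):

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_EFF = 255.0      # assumed effective emission temperature, K
DF_2X = 3.7        # assumed doubling forcing, W/m^2

planck_response = 4 * SIGMA * T_EFF ** 3   # extra flux per K of warming
dT_offset = DF_2X / planck_response        # DT that restores the flux: ~1 C
```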

      However, to really understand the CO2 greenhouse effect it is a good idea to try to calculate it using HITRAN. The method I used was to calculate the effective emission height for each quantum line. Assuming a fixed lapse rate gives the temperature of the CO2 molecules, which determines the radiance to space.
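      The emission-height method can be sketched numerically. A toy Python model, not a HITRAN calculation: the exponential optical-depth profile, scale height, lapse rate, and tau values are all illustrative assumptions:

```python
import math

H = 8000.0       # assumed absorber scale height, m
LAPSE = 0.0065   # assumed fixed lapse rate, K/m
T_SURF = 288.0   # assumed surface temperature, K

def emission_height(tau0):
    """Height where optical depth seen from space drops to 1, for an
    exponential profile tau(z) = tau0 * exp(-z / H); 0 if optically thin."""
    return max(0.0, H * math.log(tau0))

def emission_temperature(tau0):
    """Temperature at the emission height under the fixed lapse rate."""
    return T_SURF - LAPSE * emission_height(tau0)

def planck(nu_cm, T):
    """Planck spectral radiance per wavenumber (nu_cm in cm^-1, SI units)."""
    h, c, kB = 6.626e-34, 2.998e8, 1.381e-23
    nu = nu_cm * 100.0  # convert cm^-1 to m^-1
    return 2 * h * c ** 2 * nu ** 3 / math.expm1(h * c * nu / (kB * T))

# A strongly absorbed line (tau0 = 5) emits from high, cold air;
# a weak one (tau0 = 0.5) effectively emits from the surface.
T_strong = emission_temperature(5.0)
T_weak = emission_temperature(0.5)
```

      The cold emission temperature of the strong line means it radiates less to space, which is the essence of the greenhouse effect in this picture.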

      There is also a very good book by Pierrehumbert, Principles of Planetary Climate.
