Modelling Coronavirus

One should always be cautious about entering a field one knows nothing about! However, I wanted to understand better how epidemic models work, since these are now the primary driver of government policy during the Coronavirus emergency.

The classic model of infectious diseases dates back to the 1920s and is called the SIR (Susceptible, Infectious & Recovered) model. It is well described in this article on plus.maths.org, on which I base this post. Here is my (probably naive) interpretation of the SIR model:

During any epidemic there are three groups of people within a population:

S – those susceptible to infection

I – those who are sick and infectious to others

R – those who have recovered from infection or died

The rate of change of each group is given by the following differential equations:

\frac{dS}{dt} = B - \beta S I - D S

\frac{dI}{dt} = \beta S I - g I - D I

\frac{dR}{dt} = g I - D R

where

B is the birth rate (births/day)

g is the recovery rate, so that \frac{1}{g} is the infectious period in days

\beta is the contact rate (people/day)

D is the death rate (deaths/day)

One important number describes how serious an outbreak becomes overall – the basic reproduction number R_0 , which is equal to the average number of new infections passed on by each infected person at the beginning of an outbreak. The effective value naturally changes as the epidemic evolves. It can also be changed by measures such as quarantine and social distancing, which reduce the contact rate, as currently imposed in the UK.

R_0 = \frac{\beta}{g}

If R_0 > 1 then the disease spreads rapidly at first, but it peters out once the effective reproduction number falls below 1. Note that as more people get infected, the initially large susceptible group shrinks and, as a consequence, so does the effective reproduction number, until it eventually falls below one, after which the peak quickly decays. That is why epidemics always end eventually.
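In the standard treatment this is made explicit with an effective reproduction number, which scales R_0 by the fraction of the population still susceptible (a textbook result rather than anything specific to this post):

R_{eff} = R_0 \frac{S}{N}

New infections therefore start to fall once the susceptible fraction S/N drops below \frac{1}{R_0} ; for R_0 = 2.5 that happens once about 60% of the population has been infected.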

So let’s try to run this model for the UK under different scenarios. To make life easy we assume that the birth rate B and death rate D can be ignored, as they are small compared to the overall population (60 million). We assume:

  1. The first infected person arrived in the UK on, say, February 1st 2020
  2. The infectious period is 5 days
  3. The contact rate is 0.5/day
  4. Therefore R_0 = 0.5 × 5 = 2.5

I coded up this model and ran the simulation from day 1. I used Python, even though I hate it, so that anyone else can easily run the simulation.
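For illustration, here is a minimal sketch of the kind of day-by-day integration involved. This is not my actual code: it just steps the SIR equations forward one day at a time using the assumed parameters above, with births and deaths ignored and the standard \beta S I / N normalisation.

```python
# Minimal forward-Euler integration of the SIR model (illustrative sketch only)
N = 60_000_000              # UK population (as assumed above)
g = 1 / 5                   # recovery rate: infectious period of 5 days
beta = 0.5                  # contact rate per day, so R0 = beta / g = 2.5

S, I, R = N - 1, 1.0, 0.0   # one infected person on day 1
peak_day, peak_I = 0, 0.0

for day in range(1, 366):
    new_infections = beta * S * I / N   # the beta*S*I term, per day
    recoveries = g * I                  # the g*I term, per day
    S -= new_infections
    I += new_infections - recoveries
    R += recoveries
    if I > peak_I:
        peak_day, peak_I = day, I

print(f"Peak around day {peak_day}; "
      f"{S / N:.1%} of the population never infected")
```

Here are the initial results.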

Fig 1: Coronavirus epidemic curve of infected cases and deaths per day for the UK population without mitigation efforts (days after first case) for the scenario described.

There is a rapid exponential rise in cases following an initially long quiet stage. The peak of the epidemic occurs about 62 days after the first case, but it then rapidly collapses as the number of susceptible people quickly declines. The final number of deaths reaches a total of 271,500. By the end of the epidemic most of the population would have been infected, but some lucky people remain unscathed by the virus: out of a population of 60 million, about 5 million might escape being infected at all.

Fig 3. Change in susceptible and recovered populations. Infections and deaths are per day. About 5 million people would probably escape coronavirus infection, out of a UK population of 60 million.

Now suppose the government imposes a near-lockdown at day 30 in order to suppress the epidemic. This reduces the contact rate to 0.15/day, pushing R_0 below 1 (0.15 × 5 = 0.75). This has a dramatic effect on new cases, but the long tail of infections is extended over several months. It means that any relaxation of lockdown measures will simply push R_0 back above 1 within the following month, triggering a second epidemic, a second lockdown, and so on.
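In the sketch above, this scenario amounts to nothing more than making the contact rate time-dependent (again illustrative; the day and the rates are the assumed scenario values):

```python
def contact_rate(day, lockdown_day=30):
    # Assumed scenario: beta drops from 0.5 to 0.15 when lockdown begins
    return 0.5 if day < lockdown_day else 0.15
```

with beta replaced by contact_rate(day) inside the daily loop.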

Fig 4. Lockdown imposed at day 30 causes a rapid mitigation in cases – but for how long?

Can our economies survive such a series of stop-go epidemic lockdowns until a vaccine becomes available, optimistically 10 months from now?

Let’s hope that our governments’ actions are based on far better scientific advice than mine, because otherwise their only alternative exit strategy is simply to let the epidemic run its course, only yet more painfully slowly, by turning lockdowns on and off.

P.S. I hope I am totally wrong on this!

One possible motivation for lockdowns is to buy time in order to deploy more effective anti-viral drugs to greatly reduce the death rate.


The ACORN2 average December temperature for Australia is 25.9 C

I claimed in the last post that the average December temperature in Australia was 24C, and not the 27C stated by BOM, even though everyone else seems to agree with them (CRU, GHCN, Berkeley) – see for example this graphic. So apparently I must be wrong and should beg for forgiveness on Twitter. However, if you use just the 112 quality-controlled ACORN2 stations and calculate the average temperature in exactly the same way as CRU (5-degree binning rather than triangulation), then you get 25.9C, provided you first exclude off-shore stations. The result therefore depends on your definition of the average. I calculated the spatial distribution for the ACORN2 stations binned and averaged in exactly the same way as CRU (5 degree bins).
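For concreteness, here is roughly what that CRU-style average looks like in code. This is a sketch under stated assumptions: the inputs are arrays of station latitudes, longitudes and December normals, and each 5×5 cell mean is weighted by the cosine of its central latitude to account for cell area.

```python
import numpy as np

def grid_average(lats, lons, temps, bin_deg=5):
    """Bin stations into bin_deg x bin_deg cells, average the stations
    within each cell, then take a cos(latitude)-weighted mean of cells."""
    cells = {}
    for lat, lon, t in zip(lats, lons, temps):
        key = (int(np.floor(lat / bin_deg)), int(np.floor(lon / bin_deg)))
        cells.setdefault(key, []).append(t)

    weighted_sum, weight_total = 0.0, 0.0
    for (ilat, _), values in cells.items():
        lat_centre = (ilat + 0.5) * bin_deg
        weight = np.cos(np.radians(lat_centre))  # cell area shrinks with latitude
        weighted_sum += weight * np.mean(values)
        weight_total += weight
    return weighted_sum / weight_total
```

Excluding off-shore stations simply means filtering the input arrays before calling this.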

Fig1. ACORN2 average December temperatures (1961-1990). Stars show station locations. Values shown are the average temperature within each 5×5 bin.

Figure 1 shows the 30-year average temperatures for December using just the ACORN2 stations. The station normals that I calculated for each month, for each ACORN station overlapping with CRU, agree almost exactly with those calculated by CRU. For instance the Merredin 12 monthly normals are:

CRU: 25.8 25.5 23.1 18.9 14.5 12.1 10.8 11.2 13.6 16.9 20.6 23.9

Clive: 25.8 25.5 23.2 19.0 14.6 12.2 10.8 11.2 13.6 16.9 20.7 24.0

So why do others find even warmer temperatures? Well, one difference to the ACORN analysis is that CRU, GHCN and Berkeley Earth use about 400 Australian weather stations, three quarters of which do not appear in ACORN2, although some overlap. Perhaps including these extra stations boosts the resultant spatial average to 27.1C – or does it?

This paper describes how BOM selected the stations for ACORN. To quote:

Only some of the stations in a network are suitable for use in long term climate change analyses. Most have too little data (less than 30 years), and some have excessive missing data, poor site or observation quality, or are otherwise unsuitable.

So the Australian average temperature depends on whether you include these “unsuitable” stations in the calculation, including those rejected from the ACORN series for data quality reasons. Apparently BOM itself then quotes the CRU average temperature (27.1C for December) resulting from using all stations, and not just those from the quality-controlled ACORN set.

None of this makes any difference to the temperature anomaly results, which are fine, but it demonstrates just how tricky it can be to convert these anomalies to absolute temperatures.

I then looked at the CRUTEM4 results themselves. I simply used the CRU-calculated normals from all the station files within Australia. These have the country codes 94 and 95. There are 360 of them with normals defined between 1961 and 1990. I likewise restricted all of these to lie within the continental land mass of Australia (one is actually located in Antarctica).
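The station selection itself is easy to express in code (a sketch only; the bounding box is my rough approximation of the Australian land mass, not CRU’s definition):

```python
def is_australian_station(station_id, lat, lon, include_tasmania=True):
    """Keep WMO country-code 94/95 stations inside a rough Australian box."""
    if not str(station_id).startswith(("94", "95")):
        return False
    # The mainland reaches about 39.2S; Tasmania extends to about 43.6S
    south = -44.0 if include_tasmania else -39.2
    return south <= lat <= -10.0 and 112.0 <= lon <= 154.0
```

A filter of this kind is also what drops the Antarctic outlier, and setting include_tasmania=False corresponds to the Tasmania-excluded variant discussed below. Here is the result.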

The average December temperature now works out at 26.1C, which is a bit nearer to the quoted 27.1C but still not quite there. If you also exclude Tasmania then the value reaches 26.4C. This is the highest value I can get using CRU’s explicit 1961-1990 normalised values, but it is still 0.8C short of ACORN2’s quoted average December temperature.

All data and results can be found here:

ACORN2

CRU-Normals

P.S. I am actually in Katoomba, Australia right now (6 Feb). The temperature today reached just 14C, but the day at least brought some much-needed rain!



What is the mean temperature of Australia?

ACORN2 is the latest BOM version of station data in Australia. You can plot the average temperature anomaly data for December using their trend tracker. Shown below the graph is the average temperature for the baseline period 1961-1990: the value is 27.2C. Ed Hawkins took this anomaly data and simply added on the average temperature to produce this plot.

Ed’s alarmist plot

I have analysed ACORN1 data before, so I downloaded the latest ACORN2 version and expected to be able to use the same software to check the result. However, the format has changed dramatically. ACORN1 had daily Tmax and Tmin values in a single file, one for each station, covering the full measurement period. These have now been split into two separate files, Tmax and Tmin, and the metadata is no longer directly available (I found it in their Javascript!). Other changes which caused me difficulties were:

  1. Different timescales for Tmax and Tmin
  2. Missing values are now simply left blank, whereas in ACORN1 they were set to -9999.9
  3. Sometimes (e.g. Merredin) there are runs of up to 17 consecutive days of data simply missing
  4. The start dates for the two files are quite often different, which means, for example, that we can have only a Tmin and a missing Tmax (one way to merge the two files is sketched below)
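Here is a sketch of how the two files can be merged robustly with pandas. The file and column names are hypothetical placeholders, since the real ACORN2 layout may differ:

```python
import pandas as pd

# Hypothetical file and column names, for illustration only
tmax = pd.read_csv("tmax.station.csv", parse_dates=["date"]).set_index("date")
tmin = pd.read_csv("tmin.station.csv", parse_dates=["date"]).set_index("date")

# An outer join aligns the two (possibly different) date ranges; blank
# values in the source become NaN, unlike ACORN1's -9999.9 sentinel
daily = tmax.rename(columns={"value": "tmax"})[["tmax"]].join(
    tmin.rename(columns={"value": "tmin"})[["tmin"]], how="outer")

# Daily mean only where both readings exist; gaps (such as Merredin's
# missing runs) stay NaN and drop out of monthly means automatically
daily["tavg"] = (daily["tmax"] + daily["tmin"]) / 2
monthly = daily["tavg"].resample("MS").mean()
```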

Having resolved all these problems I then calculated the anomalies for December and the area averaged temperatures for Australia. I got good agreement for the temperature anomalies.

Compare ACORN2 with my calculation

However I got wildly different values for the average December temperature: 23.8C as the area averaged temperature for Australia for 1961-1990, instead of their 27.2C.

So obviously I must be wrong – or am I? I take the daily average temperature to be (Tmax+Tmin)/2 and then average this value over each of the 12 months between 1961 and 1990, for each of the 112 stations. My values then agree almost perfectly with the averages on various travel/tourist websites for expected monthly temperatures. For example, my monthly average values for Sydney are:

22.965, 23.093, 21.857, 19.548, 16.531, 14.016, 13.068, 14.176, 16.347, 18.671, 20.297, 22.175

Compared with https://www.holiday-weather.com/sydney/averages/

My area averaged value for Australia uses a triangulation method, so I thought that maybe I had screwed that up. To check, I recalculated everything using the CRU 5×5 grid method and got essentially the same result. So I think the difference is instead the following.

Everyone else – GHCN, Berkeley, CRU etc. – uses monthly average temperatures to calculate anomalies, whereas I am using the daily values. So my 1961-1990 averages are based on the 30-year average daily temperature at each station during any particular month. These are then spatially integrated to give the normal climatology. So why are they not the same as ACORN or GHCN?

My suspicion is that ACORN, GHCN, Berkeley etc. use the monthly values over the 30-year period, so that the minimum temperature is the lowest average temperature and NOT the lowest minimum (night-time) temperature for a given month. This could explain why they get higher values.
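Whatever the exact cause, the order of averaging genuinely can matter once data are missing, as with the mismatched Tmax/Tmin files described above. A toy example with made-up numbers:

```python
import numpy as np

# Four days of fake data with one missing maximum (cf. the mismatched
# start dates and gaps in the ACORN2 files)
tmax = np.array([30.0, 31.0, np.nan, 29.0])
tmin = np.array([15.0, 16.0, 14.0, 15.0])

# My method: form daily (Tmax+Tmin)/2 first, keeping only complete days
daily = (tmax + tmin) / 2
print(np.nanmean(daily))                          # 22.67

# Alternative: average Tmax and Tmin separately, then combine
print((np.nanmean(tmax) + np.nanmean(tmin)) / 2)  # 22.5
```

With complete data the two definitions coincide exactly; they diverge only where one reading of the pair is missing.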


My average of daily (Tmax+Tmin)/2
ACORN2 monthly averages

Finally here are my trends for December. The increased temperatures in December are mostly because the minimum (night-time) temperatures have been rising much faster. There is not much change in the average maximum temperatures up until 2018.

December temperatures from 1910 to 2018. Note how the minimum (night-time) temperatures have risen fastest.

P.S. I have no time to check all this as I will soon be on a plane to Australia!
