If the alarm about Anthropogenic Global Warming (AGW) originated in the 1980s, when Dr Hansen saw a powerful correlation between the rise in temperature and the rise in carbon dioxide accumulations in the atmosphere and projected both rises forward almost indefinitely, then the later proliferation of fearful climate scenarios for the future has obtained its rationale through what are called General Circulation Models, or Global Climate Models — GCMs. A ‘model’ is a small version of something much larger, and a GCM is a mathematical model of the planet’s atmospheric system. Here is an example.
The square diagram shows you what is going on at one point on the sphere. The thick covering there is the atmosphere, divided into as many as 20 layers. The grid cells are 100 km across, and you need to imagine that there is a kind of weather station at the corners of each cell, sending information via satellite to the control room. Since there are no such weather stations, especially at sea, the model works from the data that actually exist and estimates values where they do not. In fact there are some real data, while the known physics and chemistry of the atmosphere and the oceans provide the theoretical basis of the models. Time-steps are also important. How frequently is the model to be run? One standard is every 20 minutes. You can see that the computational requirements of a global model are simply enormous, and we would not even be talking about GCMs had parallel computing not become possible. Today’s computers are vastly more powerful than the first generation of parallel computers with which I was involved (in policy terms) in the late 1980s.
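The scale involved can be sketched with simple arithmetic, using the illustrative figures above (a 100 km grid, about 20 layers, a 20-minute time step). The numbers are rough and not taken from any particular model.

```python
# Back-of-envelope estimate of a GCM's computational scale,
# using the illustrative figures from the text. Not any real model.

EARTH_SURFACE_KM2 = 510_000_000   # approximate surface area of the Earth
GRID_KM = 100                     # horizontal grid spacing
LAYERS = 20                       # atmospheric layers
STEP_MINUTES = 20                 # model time step

cells = EARTH_SURFACE_KM2 / (GRID_KM * GRID_KM)    # ~51,000 columns
boxes = cells * LAYERS                             # ~1 million grid boxes
steps_per_year = 365.25 * 24 * 60 / STEP_MINUTES   # ~26,000 steps a year

updates_per_century = boxes * steps_per_year * 100
print(f"{boxes:,.0f} grid boxes, {steps_per_year:,.0f} steps per year")
print(f"~{updates_per_century:.2e} box-updates per simulated century")
```

Each of those box-updates involves solving the equations of atmospheric physics for that box, which is why parallel supercomputers were a precondition for GCMs at all.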
Nonetheless, there are other areas which are important but for which there is little useful information. One such is the domain of clouds, which we all recognise are most significant in affecting how hot it is wherever we are. Clouds form and re-form, and have a short life — fair-weather cumulus clouds, for example, can last from 5 to 45 minutes — and they are smaller than a 100 km grid cell. For these reasons they are not easily measured or modelled directly, and by and large they are given a standard, ‘parameterised’ value in GCMs. Whether or not there are cyclical patterns in climate, of short or long phase, remains an issue in climate science, but as far as I know such patterns are not included in the parameters of the models.
Temperature is the standard datum that goes into averages, and a day’s figure is the average of the highest and lowest temperatures recorded at that location. For each grid cell, therefore, the temperature data represent an average of averages. In Canberra, where I live, the average for today will be somewhere around 15 degrees C, from a range of 8 to 23 degrees C. Within a radius of 100 km there will be some rather different averages. Thredbo, for example, will be somewhere around 4 degrees C, from a range of 1 to 9 degrees C. Coastal areas to the east of both Canberra and Thredbo will have higher averages. How much sense does it make, given the brevity and artificial construction of the observations, to average these averages? Oh, and the models don’t include El Niño and La Niña conditions, mostly because no one is yet able to predict them more than a few months ahead. These conditions have great consequences for the weather in the USA, Asia and Australia, as we have seen recently, and they are not caused by carbon dioxide accumulations — or, more cautiously, no one has yet been able to show that they are, and there are excellent reasons for supposing that they are natural perturbations of the ocean/atmosphere link.
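The ‘average of averages’ construction can be shown in a few lines. The Canberra and Thredbo ranges are the ones above; the coastal station’s figures are invented purely for illustration.

```python
# Illustration of the 'average of averages' described in the text.
# A station's daily mean is the midpoint of its extremes, and a grid
# cell's value is the mean of its stations' means. The coastal
# station's range is hypothetical.

def daily_mean(t_min, t_max):
    """Conventional daily mean: midpoint of the day's extremes."""
    return (t_min + t_max) / 2

stations = {
    "Canberra": (8, 23),               # range from the text
    "Thredbo": (1, 9),                 # range from the text
    "Coast (hypothetical)": (12, 21),  # invented for illustration
}

means = {name: daily_mean(lo, hi) for name, (lo, hi) in stations.items()}
cell_mean = sum(means.values()) / len(means)

for name, m in means.items():
    print(f"{name}: {m:.1f} C")
print(f"Grid-cell 'average of averages': {cell_mean:.1f} C")
```

A single grid-cell number of about 12 degrees C describes none of the three places well, which is the point of the question about averaging averages.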
I’ve gone on at some length in the last paragraphs, because for most people what happens in the models is mysterious. One more theoretical comment: the IPCC, in its Third Assessment Report, said that climate was essentially chaotic, and that linear models could not grasp its complexity. That is another debate, and I am not competent to decide whether or not that is the case. In any case, the IPCC no longer makes the same point. If you want to read further, here is a most useful and readable summary, by an AI and neural networks expert. If you do read it, make sure you read the comments too, because there are some sustained objections to his presentation. I guess you could say that here too, the science is not settled. Above all, there is, and has to be, great uncertainty about the accuracy of model outcomes as representations of future reality.
Models need to be tested to determine how good they are. The standard processes are ‘verification’ and ‘validation’. If a computer model has been validated, it provides a satisfactory range of accuracy consistent with its intended application. If it has been verified, that tells us that the model’s internal program, its theory of relationships, seems to be accurate, and that it works as it should.
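The distinction can be made concrete with a toy example. The ‘model’ here is a deliberately trivial linear trend and the ‘observations’ are invented; nothing here resembles a real GCM.

```python
# Toy illustration of verification versus validation.
# The model and the observations are both made up for the demo.

def model(year, base_temp=14.0, trend=0.02):
    """Predicted global mean temperature: base value plus a linear trend."""
    return base_temp + trend * (year - 2000)

# Verification: does the code do what its internal theory says it should?
assert model(2000) == 14.0                 # no trend at the base year
assert model(2050) == 14.0 + 0.02 * 50     # trend applied correctly

# Validation: do predictions match observations well enough for the
# intended application? The tolerance is set by that application.
observations = {2000: 14.0, 2010: 14.3, 2020: 14.4}
errors = [abs(model(y) - t) for y, t in observations.items()]
validated = max(errors) < 0.5
print("largest error:", max(errors), "validated:", validated)
```

A model can pass verification (the code faithfully implements its theory) and still fail validation (the theory does not match the world), which is the distinction the paragraph above turns on.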
Model-building is done to try to understand the relationships among variables in some kind of dynamic way. That knowledge might then help in producing a better aeroplane, a better car, or a good prediction of the behaviour of the Australian economy under, say, GFC-like conditions. To the best of my knowledge, the Australian Government does not rely on Treasury’s econometric models when it is formulating economic policy. One reason is that there is more than one of them.
Model-derived information has also been used to present ‘projections’ or ‘scenarios’ about the world’s climate in the future under certain conditions. I have much the same feeling about GCMs, of which there are dozens. All of them, except one Russian model, assume not only that carbon dioxide is a powerful ‘forcer’ of climate, a dynamic that pushes climate out of some kind of equilibrium, but also that there is something called ‘climate sensitivity’ that multiplies the effect of a doubling of CO2. In consequence, the most recent display of those models shows them projecting temperatures that are much hotter than those found by observation. I’ve shown versions of this graph before, and it is to me the most powerful testimony that climate models have a long way to go before they can provide us with useful information on which to base policy. It is the Russian model whose track lies below the observations. There are several different examples of this comparison, and this one is the most recent I can find. It is by Dr Roy Spencer. Others use radiosonde balloons. They show the same poor fit between model predictions and observations.
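The ‘climate sensitivity’ idea can be sketched as follows. The logarithmic dependence of warming on CO2 concentration is standard physics; the sensitivity values below are simply the low, central and high figures usually debated, not estimates of my own.

```python
# Minimal sketch of 'climate sensitivity': assumed equilibrium warming
# scales with each doubling of CO2. Sensitivity values are illustrative.
import math

def warming(c_new_ppm, c_old_ppm, sensitivity_per_doubling):
    """Equilibrium warming for a CO2 change, in degrees C per doubling."""
    doublings = math.log2(c_new_ppm / c_old_ppm)
    return sensitivity_per_doubling * doublings

for ecs in (1.5, 3.0, 4.5):   # low, central, high sensitivity (deg C)
    dt = warming(560, 280, ecs)   # one full doubling from 280 ppm
    print(f"Sensitivity {ecs} C/doubling -> {dt:.1f} C for 280 to 560 ppm")
```

The whole argument between the models turns on that sensitivity parameter: a tripling of its assumed value triples the projected warming for the same CO2 path, which is why the model spread is so wide.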
It is fair to say that it will be a long time before these models are validated, because we simply don’t have a lot of experience with them, despite the billions of dollars that have been spent internationally on developing them — and on building newer, more powerful and faster computers. In its most recent report, the IPCC said that models had shown a ‘modest improvement’ in modelling clouds and aerosols, and accepted that none of them had been able to predict what was then a 15-year hiatus in global temperature. So you either accept them as they are, or wash your hands of them altogether. The IPCC remains highly confident in their worth, despite all the limitations which it acknowledges. In my humble opinion, neither the models nor the average of them (!) is valid, because they cannot model what has actually occurred. The orthodox argue that they do, they do, and that it is those pesky aerosols that somehow prevent the effects of CO2 from being plain and obvious. The trouble is that, observationally, those pesky aerosols can’t be found in the necessary quantity, and measuring them runs into the same problems that come with trying to measure clouds, as the IPCC admits.
In sum, the models are an ambitious exercise that so far has not produced results that ought to be taken seriously, by governments or anyone else. Maybe they will improve, and we might hope so, because of the great expenditure on them. But at the moment, I don’t take any of their projections seriously. Even if one of them turns out to be realised in some sense, that will likely be more the result of luck than of predictive skill.
Next: Why do so many people believe in all this?
Later: A recent essay on this subject escaped my notice when I was writing my own piece. You can read it here. If you do read it, please read the comments too, which contain some objections. I’m not as confident about all this as is the writer, but he does have some useful data.
Later still: I found this long explanation most enlightening.