As a former historian, I see the emergence of the world from the last ice age as the starter’s gun for the advance of humanity (see here), and I’ve read a lot about the ice ages and the pattern the recent ones have shown. From time to time you get people telling us that the ‘interglacial’ we are in is about to end, while others have suggested that this one will last for 50,000 years, which would mean that I wouldn’t be around to see the next glacial come in. Still others suggest that the more CO2 we pump out the better, because it will delay the arrival of the next ice age.
No one is quite sure why we have had, in the geologically recent period, the Pleistocene, a pattern of glacials and inter-glacials, with the glacials lasting much longer. As to the glacials’ greater length, it takes a lot of energy to melt ice, but it doesn’t take so long for ice to form if there is no heat. As to the pattern, the Milankovitch cycles and the theory built on them seem to be the conventional wisdom as to why ice ages occur, but they don’t explain everything, in particular why they started when they did.
In the last half million years or so there have been a number of ice ages, which you can see on the following chart.
You can see that the interglacial we are in (at the right-hand end of the chart) seems to be slightly different to the others. The other four have had higher temperatures, but the last three peaked and declined rapidly; ours seems to have held up longer. I’ve wondered a bit how different these patterns are, and someone has done the work for me. It appears in the next chart, which relies on the EPICA ice core data.
Here the quick declines from the high temperatures of the earlier inter-glacials are more obvious. Ours has already lasted longer than two earlier ones, and is about four thousand years shorter than the last interglacial, which began 130,000 years ago. Why? What has caused the difference?
You can encounter the man who did this nice work at <http://oz4caster.wordpress.com>. His name is Bryan, he is a Texan with forty years in meteorology, and he has normalised the beginning of each interglacial to the year where the estimated global temperature first reached the level of our modern “normal” climate. The approximate year where each interglacial episode first reached the modern “normal” temperature is shown in the legend. There is no doubt that our interglacial, the Holocene, is very different to the others.
A few months ago I wrote a piece on ‘the Ruddiman hypothesis’, the essence of which was that humans have been altering the climate for a long time, through burning, clearing and maintaining domesticated animals in large numbers — not just in developing heavy industry from the late 18th century. Well, you can see why Ruddiman might have thought so given Bryan’s chart above. But as with so much else in climate science, it’s a hypothesis that can’t really be tested in any way.
Look at the declines again. In every case but ours the drop from the interglacial peak temperature is more or less symmetrical. The peak lasts a little while — a couple of thousand years — and then down the temperature goes. The Holocene, however, is chugging along some 12,000 years after temperature reached its peak. It is most like the interglacial 400,000 years ago, which lasted perhaps 28,000 years.
So, how much time do we have left before the ice returns? Bryan argues that we have no better data than those you see above, and that there is no reason to expect that the Holocene will last indefinitely. Equally, he sees no sign of its imminent decline. It’s worth noting that the difference between today’s ‘modern normal’ temperature and that of the glacial standard is about 5.5 degrees C. Much of the present northern hemisphere would be only sparsely populated at average temperatures so cold.
And a final thought. Bryan’s graph truncates the long ice ages, and you might get the impression that the interglacials and the glacials are of much the same length. It ain’t so. Very roughly, the glacial lasts for about a hundred thousand years, and the average of the interglacials so far is about 16,000 years. The ice ages build up a colossal amount of ice — it was two kilometres thick over the site of modern Chicago, for example — and the more ice the higher the albedo, meaning that more sunlight is reflected away, so the planet remains cold.
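The albedo point can be illustrated with a back-of-envelope energy-balance calculation. This is a sketch only: it ignores the greenhouse effect entirely, so the absolute temperatures come out well below real surface values, and the two albedo figures are illustrative assumptions rather than measured ones — but it shows the direction and rough size of the effect.

```python
# Zeroth-order energy balance: effective temperature as a function of albedo.
# T_eff = ((S * (1 - albedo)) / (4 * sigma)) ** 0.25
S = 1361.0        # solar constant, W/m^2
sigma = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4

def t_eff(albedo):
    """Effective (no-greenhouse) planetary temperature in kelvin."""
    return ((S * (1 - albedo)) / (4 * sigma)) ** 0.25

print(t_eff(0.30))  # ~255 K with a roughly present-day albedo
print(t_eff(0.33))  # a few degrees colder for a more ice-covered planet
```

The point is simply that raising the albedo lowers the equilibrium temperature, which grows more ice, which raises the albedo further — the self-reinforcing loop described above.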
The colder it is the less life there is. Our modern civilisation is simply unworkable in an ice age, so the arrival of one — and the arrival can occur within a century or so — is a threat of great moment, much greater, in my opinion, than that of a warmer world, where we know that life flourishes. Much too little attention has been given, by those crying doom, to the possibility of the return of serious cold.
I recognise that the recent icy snap in North America is simply weather, as are early hot temperatures here, but it ought to be a reminder that human, animal and plant life all welcome summer, and go rather dormant in winter. A hundred-thousand-year winter really would be the end of civilisation as we know it.
30 Comments
Hmm, the very woolly mammoth in the room, Don?
When I look at those graphs I think the Industrial Revolution came just in time. The Earth was at the depths of the Little Ice Age: the Thames frozen, years without a summer, glaciers growing and even destroying villages in the Alps. It appears the Earth could have been on its way to another ice age.
Increasing atmospheric CO2 concentration, together with increased solar activity (particularly from 1948) may have saved us – for now.
When I look at the graphs the steepest descent into an ice age is about 0.1C per century, hardly anything to worry about for many generations. I wouldn’t bet that increasing CO2 will warm us either, as it is possible natural feed-backs may negate any warming. The so-called Little Ice Age is probably a minor climatic aberration, as possibly is the warming during the 20th century.
“….as it is possible natural feed-backs may negate any warming.”
Agree. But let’s define “natural feed-back”.
One possibility is that humanity fails to cope with global warming, population declines, and so does CO2 production. Eventually vegetation re-absorbs CO2 and temperature declines.
dlb those “natural feed-backs” as you quaintly describe them might not be so benign. 🙂
David, that sounds like a natural feedback operating over a very long period, relying on the premise that CO2 is the control knob.
I would define natural feed-backs as much more immediate, working over a time span of decades to perhaps a century, with cloud cover being the prime candidate. The ocean would also act as a giant temperature buffer.
If I had to bet money on the sign of natural feed-backs to any temperature change I would say it would be the opposite.
Well, there has not been any “natural feedback” over the last 150 years. Temperature has risen fairly steadily since the Industrial Revolution.
And just a vague belief in “cycles” without at least nominating some physical process that can be tested is just a faith based argument.
It’s not that different to arguing that God will take care of it.
Any correlation of global temperature with CO2 is only evident from 1910 to the 1940s and from the mid 1970s to 1998. Also, the slope of these two periods is almost identical, despite CO2 increasing faster in the later period.
I base my belief in nature over anthropogenic influence on repeated evidence. Examples:
1. Loss of frog species (a fungus, not pollution)
2. Degradation of the Great Barrier Reef (starfish, not bleaching)
3. Stomach ulcers (a bacterium, not stress)
4. Human personality (a high genetic component rather than just environment)
Yes, I know there are theories saying that the starfish numbers are due to nutrient enrichment or the frog fungus has been transported by humans. But as far as I know they are only theories.
Would you at least concede that, notwithstanding a few flat spots in the temperature record, no feed-back mechanism has prevented global temperature from increasing between 1900 and 2014?
Overall, temperature has increased, hasn’t it?
Whatever feedback mechanisms might have been operating in the last 100 years, they have not been very influential, have they?
Well, at least not strong enough to stop the warming.
Just like the feedback mechanisms were not strong enough to stop the Medieval Warm Period, the Roman Warm Period and the Minoan Warm Period (to mention just three), all of which were warmer (so far) than the current modern warm period.
A minor typo I think where you say “You can see that the glacial we are in” – don’t you mean “inter-glacial”?
Otherwise, an interesting and thought-provoking post as usual. Unfortunately the AGW crowd won’t take any notice of course, because they are not interested in the science, just in promoting their “noble” cause. If there were to be drastic cooling heralding an imminent glacial period, I wouldn’t be surprised if they applauded and claimed it as proof that it was caused by humans! I am almost 70 and have never seen such institutionalised economic (ZIRP) and environmental (CAGW) stupidity in my life. I dearly hope that the world shrugs off these twin madnesses, and soon, or I fear for the future of our children. And not because of any ice age!
Many thanks! Correction made. I agree with you, too.
Hi BfT – There is a science fiction novel, Fallen Angels, by Larry Niven, Jerry Pournelle and Michael Flynn, that envisions just such a future as you surmise in your second paragraph – a radical environmentalist government that brings on a new glacial epoch. The novel is mostly an homage to hard sci-fi fans and an attack on scientific illiteracy in media and politics. I found the characters cardboard cutouts, but parts are entertaining.
Yes there are glacial and inter-glacial periods, which may run counter to any warming trends. But you need to look at the rate at which temperature is changing due to glacial cycles. Looking at the graphs you have posted a 2 degree change might take 5,000 years, if it is rapid, or perhaps 10,000 years if slower. AGW is hypothesising 2 to 4 degree temperature increase in 100 to 200 years. That’s a 50-fold difference in the relevant time scale (e.g. 100 years to 5000 years).
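The arithmetic behind that 50-fold figure is easy to check, using the round numbers quoted above: roughly 2 °C over 5,000 years for a rapid glacial transition, against roughly 2 °C over 100 years at the fast end of the AGW hypothesis.

```python
# Rapid glacial-cycle change quoted above: ~2 C over ~5,000 years
glacial_rate = 2.0 / 5000   # degrees C per year
# Hypothesised AGW change at the fast end: ~2 C over ~100 years
agw_rate = 2.0 / 100        # degrees C per year

ratio = agw_rate / glacial_rate
print(round(ratio))  # 50 -- the "50-fold difference" in time scale
```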
Are you suggesting that there is no need to worry about AGW because we can all wait 4,500 years for the next ice age to cool the planet?
These types of arguments are not convincing.
I note that you steadfastly remain convinced of the validity of the AGW arguments. So here is a genuine question for you: what credible information would you need to find, for you to change your mind about AGW?
The inclusion of a new variable (e.g. sunspot activity) into the best available climate model that
(i) improved the R squared of the model, and
(ii) rendered the coefficient on CO2 statistically insignificant.
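The mechanics of that test can be sketched with a toy regression. Everything here is synthetic — the “CO2”, “solar” and “temperature” series are made up purely to show how one would compare the two fits, not to say anything about real observations.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
co2 = np.linspace(300, 400, n)                 # made-up CO2 series (ppm)
solar = np.sin(2 * np.pi * np.arange(n) / 11)  # made-up ~11-year solar proxy
temp = 0.01 * co2 + rng.normal(0, 0.1, n)      # synthetic "temperature"

def ols(regressors, y):
    """Ordinary least squares: coefficients, standard errors, R squared."""
    X = np.column_stack([np.ones(len(y)), *regressors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return beta, se, r2

b1, se1, r2_base = ols([co2], temp)          # CO2-only model
b2, se2, r2_full = ols([co2, solar], temp)   # CO2 plus the solar proxy

print(f"R^2 without solar: {r2_base:.4f}, with solar: {r2_full:.4f}")
print(f"t-statistic on CO2 in the full model: {b2[1] / se2[1]:.1f}")
```

In this toy case the added solar term fails the commenter’s test — the R squared barely moves and the CO2 coefficient stays highly significant — because the synthetic data were built that way; with real data the outcome is exactly what is in dispute.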
What is wrong with comparing model projections with observations and making a decision as to the utility of the models from that exercise?
Nothing. And we do test models the way you describe. Each year new temperature data become available, which either strengthen or weaken the AGW hypothesis.
But one of the purposes for creating a model is to generate predictions. If we make a temperature prediction for 2100, we don’t necessarily want to wait until the observations for 2100 become available, before we make policy decisions.
But you can see that analysing trends the way you describe is subjective. A change in trend (+/-) may or may not be dismissed as a statistical variation, depending on one’s preconceptions.
That last paragraph should begin “And you can see”…
Very interesting suggestion, David, that I think a number of sceptics have been making for some time. But what it would indicate to me would be that at last the modelling appears closer to reality – especially if it was successful in hindcasts. For me, it would be at last a positive reflection of climate modelling capability.
If those who consider that solar activity and the Earth’s position (distance, tilt etc.) in relation to its Sun are the major determinants of changing climates are right, the models would need to use significantly reduced climate sensitivity values. I suspect much of the problem with the climate models has been that, to begin with, carbon dioxide was set up as the villain of the piece. The butler did it, and all we have to do is prove it! That’s not the way to play Cluedo.
Some few years back, I put that question on skepticalscience. Only one person replied: I recall that he said he would change his mind if it was shown there was no tropospheric warming. Well, no predicted hotspot was found, as you know. Interesting.
Here is a simple spreadsheet model that tracks southern ocean temperatures better than any IPCC model I have seen. The model and parameters are shown in the inset. It is based on a solar radiance model published by Nicola Scafetta. There is no need for any reference to CO2.
ah. graphic didn’t appear
maybe this time
Looks like curve fitting, which is the point Nova cautions about.
1. If you think it is such a simple model, could you explain the rationale behind the formula?
2. At any rate, from 2000 the temperature of the Southern Ocean (blue line) continues to increase slowly, but the model’s prediction (red line) is decreasing sharply! What’s up with that?
There is a rationale but it is a confronting one for the modern mind – so much so that discussion of it is apparently banned on the WUWT climate sceptic site. The fitting isn’t totally arbitrary. It is based on the hypothesis that planetary motions influence, perhaps control, the internal circulation of plasma in the sun. The cycle periods used in the model are related to planetary orbits – Jupiter and Saturn mainly, as I remember. So you see – not something that is going to catch on any time soon but to me what matters is results.
The sun and planets have evolved together from a cloud of matter and the planets move the sun around their common gravitational centre by about half the sun’s radius so tidal forces are significant. It is a resonant system – child on a swing – small impulses maintaining it.
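To make the idea concrete, here is a purely illustrative sketch of that kind of harmonic model. The periods are real astronomical figures — the orbital periods of Jupiter and Saturn and their roughly 20-year synodic cycle — but the offset, trend, amplitudes and phases below are hypothetical placeholders, not the parameters of Scafetta’s published model or of the spreadsheet shown above.

```python
import numpy as np

# Fixed periods in years: Jupiter's orbit, Saturn's orbit, and their
# ~20-year synodic (conjunction) cycle, often cited in this literature.
PERIODS = [11.86, 29.46, 19.86]

def harmonic_model(year, offset, trend, amps, phases):
    """Linear trend plus a sum of sinusoids with fixed astronomical periods."""
    t = np.asarray(year, dtype=float) - 1900.0
    y = offset + trend * t
    for period, amp, phase in zip(PERIODS, amps, phases):
        y = y + amp * np.sin(2.0 * np.pi * t / period + phase)
    return y

# Hypothetical fitting parameters, for illustration only
years = np.arange(1900, 2021)
anomaly = harmonic_model(years, offset=0.0, trend=0.005,
                         amps=[0.05, 0.08, 0.06], phases=[0.0, 1.0, 2.0])
```

In a real fit the amplitudes and phases would be estimated by least squares against an observed temperature series; the distinctive claim of the hypothesis is only that the periods are fixed in advance by the planetary orbits rather than tuned freely.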
I’m not pushing the model particularly – too old for that and have other things to keep me occupied in my retirement – writing future fiction. I’m more interested in how people see science than I am in the climate. The few times I’ve visited this site it has impressed me with its calm rational discussion so I thought I’d risk it.
Your second point is well taken. That’s the controversial ‘pause’ since about 1997 or so. The main deviations from the model are around inflection points – 1910 and 1940 – so perhaps the latest is similar. The next few years will tell us if the rise resumes or if there is a decline. Observation and data win in the end.
(login issues again. Apologies in advance if this appears twice)
“Observation and data win in the end.” Yes very true!
I agree with you on your arithmetic, as far as it goes. But even a small decline in average temperature in the temperate zones will have an immediate impact on food production. Long before it is plain that the interglacial is ending there will be real troubles all over the world. There is no persuasive case for the 2 degree C limit – none that I have seen, anyway.
So what I perceive is undue anxiety about a warmer future and a great indifference to the possibility of cooling, even though there are serious arguments that we are in a cooling phase that might last until 2030.
And I repeat that what we are seeing is weather, not climate.
Why default-display these comments by “best”? And also, what is the definition of “best” and who actually decides what is “best”? Makes it very hard to follow a logical conversation from A to Z – and anyway, I particularly liked the “woolly mammoth in the room” one. Yours, help/hopefully.
kvd — I’ve never noticed the default, and in your honour I’ve changed it to ‘newest’!
Thank you Prof Aitken. In your honour I shall now read all books from last page to first; this makes just (almost) as much sense, no?
and also, sometimes, I shall even spell your name rite. Apologies.