Last week I was in Washington for the 30th MIT Global Change Forum. This slightly-more-than-annual event is a core part of the MIT Joint Program on the Science and Policy of Global Change. Shell has been a sponsor of the program for over a decade now, and the forum is always rich in content: a valuable opportunity to learn and to gain a better understanding of the complexities of climate change and of society's potential for dealing with it. The Washington event was no exception.
Given the recent rancour around climate science, it was refreshing to hear Professor Ron Prinn, co-director of the program and a leading atmospheric chemist, give a very down-to-earth assessment of the state of the climate as his opening address. This was not based on any new science or particular recent revelation, but a review of the solid body of work by thousands of scientists all over the world and their use of data from satellites, ocean buoys, ground observing stations and the like to attempt to build a reliable picture of just what is happening on our planet today.
There was some discussion of the fact that the global land-ocean surface temperature record hasn't shown any particular trend over recent years, other than maintaining an elevated level compared to, say, a century ago. Professor Prinn made the point that this is very much the nature of surface temperature, noting that this has happened before and will almost certainly happen again. For example, between 1940 and 1960 global temperatures declined despite increasing levels of CO2 in the atmosphere, before the underlying upward trend reasserted itself.
There is no definitive explanation for the decline over this period. Some scientists have linked it to the post-war industrial boom, which brought with it increased use of coal for power generation, but without the sulphur removal that is common in modern OECD power stations. Sulphate aerosols reflect incoming sunlight and so act as a coolant in the atmosphere, which could also mean that the current period of temperature stability is partly the result of the recent step change in coal use in developing countries, much of it without sulphur removal.
But Professor Prinn also reminded the audience that the ocean plays a huge role in governing climate, with a heat capacity some 1600 times that of the atmosphere. An upwelling of colder deep ocean water could easily shift global surface temperatures for a period of years. Nevertheless, the climate forcing due to increases in greenhouse gases and aerosols remains at 1.6 W/m², or 816 TW globally, some 50 times current global energy consumption.
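As a quick sanity check on those figures, the arithmetic can be reproduced in a few lines. The Earth's mean radius is standard; the ~16 TW figure used here for current global primary energy consumption is my own assumption, not a number from the talk:

```python
import math

# Back-of-the-envelope check of the forcing figures quoted above.
EARTH_RADIUS_M = 6.371e6        # mean Earth radius
FORCING_W_PER_M2 = 1.6          # forcing quoted in the talk
GLOBAL_ENERGY_USE_TW = 16.0     # assumed global primary energy use

# Forcing applies over the whole sphere's surface area.
surface_area_m2 = 4 * math.pi * EARTH_RADIUS_M ** 2   # ~5.1e14 m^2
total_forcing_tw = FORCING_W_PER_M2 * surface_area_m2 / 1e12

print(f"Total forcing: {total_forcing_tw:.0f} TW")    # ~816 TW
print(f"Ratio to energy use: {total_forcing_tw / GLOBAL_ENERGY_USE_TW:.0f}x")  # ~51x
```

So the 816 TW figure follows directly from spreading 1.6 W/m² over the planet's surface, and the "some 50 times" comparison holds for any reasonable estimate of global energy consumption in the mid-teens of terawatts.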
This last point is brought home in James Hansen's latest book, Storms of My Grandchildren. He shows that the shift in forcing over the last million years due to slight changes in the orbit of the earth is only around 1 W/m², yet this has been sufficient to drive the advance and retreat of vast ice sheets over thousands of miles. As such, we should not assume that the current man-made forcing is of no significance to long-term climate.
On the subject of ice, Professor Prinn put up a chart showing the trends in Arctic sea ice coverage since satellite imagery began. As with temperature, it is quite possible to show almost no change between certain individual years, e.g. 1996 and 2007, yet at the same time there is a very discernible decrease in ice extent of 2.5% per decade (March) and 8.9% per decade (September).
The issue of policy objectives was also discussed, and this is where MIT brings great clarity to an otherwise opaque subject. At the end of 2009, the total level of long-lived greenhouse gases in the atmosphere, in terms of CO2 equivalent, was 469 ppm. With 2091-2100 as a future reference point, the following probability analysis emerges:
| Policy objective | Probability of ΔT > 2°C above 1981-2000 | Probability of ΔT > 2°C above 1860 (pre-industrial) | Probability of ΔT > 4°C above 1981-2000 |
| --- | --- | --- | --- |
| No policy, 1400 ppm CO2e outcome | 100% | 100% | 85% |
| Stabilize at 900 ppm | 100% | 100% | 25% |
| Stabilize at 790 ppm | 97% | 100% | 7% |
| Stabilize at 660 ppm | 80% | 97% | 0.25% |
| Stabilize at 550 ppm | 25% | 80% | < 0.25% |
This of course raises the issue of the credibility of a global policy objective to limit warming to 2°C. Professor Prinn went on to offer an alternative way to express the policy objective: consider the level of stabilization that would limit the probability of a very serious degradation of the ice sheets, which is perhaps a more tangible objective and one that may be better understood and accepted by people. The table below (Webster et al., JPSPGC Report 180, 2009) shows the cumulative probability of Arctic surface air warming from 1981-2000 to 2091-2100 (note also that the poles warm much faster than the tropics). A very different picture emerges.
| Policy objective | ΔT > 4°C above 1981-2000 | ΔT > 6°C above 1981-2000 | ΔT > 8°C above 1981-2000 |
| --- | --- | --- | --- |
| No policy, 1400 ppm CO2e outcome | 100% | 95% | 70% |
| Stabilize at 900 ppm | 95% | 30% | 3% |
| Stabilize at 790 ppm | 80% | 9% | 0.25% |
| Stabilize at 660 ppm | 25% | 0.25% | < 0.25% |
| Stabilize at 550 ppm | 0.5% | < 0.25% | < 0.25% |
If the Greenland ice sheet were considered to be under threat once Arctic temperatures rose more than 6°C, then stabilization at 660 ppm CO2-equivalent could be considered a positive outcome in this regard. I should say at this point that this was not a recommendation from MIT, but rather an attempt to show how policy objectives could be formulated. It was certainly an interesting presentation. Professor Prinn also noted that the last time the polar regions were significantly warmer (~4°C) than present for an extended period, about 125,000 years ago, reductions in polar ice volume led to 4 to 6 metres of sea level rise.
The above is just a taster from two days of excellent presentations. The MIT reports are all available on their website.