Archive for the ‘IPCC’ Category

The last of the three IPCC 5th Assessment Reports has now been published, with a final Synthesis Report to come towards the end of the year. “Mitigation of Climate Change” details the various emission pathways open to us, the technologies required to move along them and, most importantly, some feeling for the relative costs of doing so.

As had been the case with the Science and Impacts reports, a flurry of media reporting followed the release, but with little sustained discussion. Hyperbole and histrionics also filled the airwaves. For example, the Guardian newspaper reported:

The cheapest and least risky route to dealing with global warming is to abandon all dirty fossil fuels in coming decades, the report found. Gas – including that from the global fracking boom – could be important during the transition, but only if it replaced coal burning.

This is representative of the general tone of the reporting, with numerous outlets taking a similar line. The BBC stated under the heading “World must end ‘dirty’ fuel use – UN”:

A long-awaited UN report on how to curb climate change says the world must rapidly move away from carbon-intensive fuels. There must be a “massive shift” to renewable energy, says the study released in Berlin.

While it is a given that emissions must fall – and that for resolution of the climate issue at some level, anthropogenic emissions should return to the near net zero state that prevailed for all of human history barring the last 300 or so years – nowhere in the Summary Report do words such as “abandon” and “dirty” actually appear. Rather, a carefully constructed economic and risk-based argument is presented, and it isn’t until page 18 of 33 that the tradeoff between various technologies is actually explored. Up to that point there is quite a deep discussion of pathways, emission levels, scenarios and temperature ranges.

Then comes the economic crux of the report on page 18 in Table SPM.2. For scenarios ranging from 450 ppm CO2eq up to 650 ppm CO2eq, consumption losses and mitigation costs are given through to 2100, with variations in the availability of technologies and in the timing (i.e. delay) of mitigation actions. The centre section of this table is given below:

 IPCC WGIII Table SPM2

Particularly for the lower concentration scenario (430-480 ppm), the table highlights the importance of carbon capture and storage. For the “No CCS” mitigation pathway, i.e. a pathway in which CCS isn’t available as a mitigation option, the costs are significantly higher than in the base case, which has a full range of technologies available. This is still true for the higher concentrations, but not to the same extent. This underpins the argument that the energy system will take decades to see significant change and that therefore, in the interim at least, CCS becomes a key technology for delivering something that approaches the 2°C goal. For the higher concentration outcomes, immediate mitigation action is not so pressing and therefore the energy system has more time to evolve to much lower emissions without CCS – but of course with the consequence of elevated global temperatures. A similar story is seen in the Shell New Lens Scenarios.

Subtleties such as this were lost in the short media frenzy following the publication of the report and only surface later, as people actually sit down and read the document. By then it is difficult for these stories to gain traction, and the initial sound bites make their way into the long list of urban myths we must then deal with on the issue of climate change.

With much anticipation but little more than 24 hours of media coverage, the Intergovernmental Panel on Climate Change (IPCC) released the next part of the 5th Assessment Report, with Working Group II reporting on Impacts, Adaptation and Vulnerability. The report started with the very definitive statement:

Human interference with the climate system is occurring . . . 

But this was immediately followed by a statement that set the scene for the entire assessment:

. . . . and climate change poses risks for human and natural systems.

The key word here is “risk”. This report attempted to explain the risks associated with rising levels of CO2 in the atmosphere and demonstrate how the impact risk profile shifts depending on the eventual change in surface temperature and the response to this through adaptation measures. Unfortunately, the subtlety of this was largely lost in the media reporting.

For example, although the Guardian did use the “risk” word, it chose to open one of its many stories on the new report with the statement:

Australia is set to suffer a loss of native species, significant damage to coastal infrastructure and a profoundly altered Great Barrier Reef due to climate change . . . .

This was under the headline “Climate change will damage Australia’s coastal infrastructure“.

The IPCC report didn’t actually say this; rather, it presented a risk assessment for coral reef change around the coast of Australia under different emission and temperature scenarios. This was summarised, along with a wide variety of other impact risks, in a useful chart form, with the Australian coral extract shown below.

Coral reef risk

Of course it is the job of the media to translate a rather arcane and technical report into something that a much larger number of people can understand, but it is nevertheless important to retain the key elements of the original work. In this case, that is the risk aspect. With very few exceptions, there is no “will” in this subject, only “could”. Some of those “could” events have a very high level of probability (the IPCC use the term “virtually certain” for a probability above 99%), but even this doesn’t mean the outcome is certain.
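The IPCC attach calibrated probability ranges to each of their likelihood terms. As a quick reference, the mapping from the AR5 uncertainty guidance can be sketched as a simple lookup (a reference sketch for readers, not an official artefact):

```python
# IPCC AR5 calibrated likelihood terms and their probability ranges (percent),
# per the IPCC guidance note on consistent treatment of uncertainties.
LIKELIHOOD = {
    "virtually certain":      (99, 100),
    "extremely likely":       (95, 100),
    "very likely":            (90, 100),
    "likely":                 (66, 100),
    "about as likely as not": (33, 66),
    "unlikely":               (0, 33),
    "very unlikely":          (0, 10),
    "extremely unlikely":     (0, 5),
    "exceptionally unlikely": (0, 1),
}

def likelihood_range(term: str) -> tuple:
    """Return the (low, high) probability range in percent for an IPCC term."""
    return LIKELIHOOD[term.lower()]
```

Even “virtually certain” leaves up to a 1% chance of the other outcome, which is exactly the point about “will” versus “could” made above.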

There has been an increasing tendency to talk about climate change in absolutes, such as “stronger hurricanes”, “more violent storms” and “a profoundly altered Great Barrier Reef”, when in fact this isn’t how the science describes the issue. Rather, it is how the media and others have chosen to describe it. This isn’t to say that these risks should be dismissed or ignored; they are real and very troubling, but the outcomes are not a given. Hopefully, as others have time to digest the latest IPCC work, this aspect of the story will become more prominent.

Taking this a step further though, it would appear that even the IPCC have chosen to present the risks with a slight skew. Although they are completely transparent about all the material they have used to build their case, the final presentation in the risk charts doesn’t tell the full story. They have chosen to present only two scenarios in the summary document, the 2 °C case and the 4 °C case. There is much to say about the range between these and arguably, the space between 2 and 4 is where the real risk management story lies.

This was analysed in 2009 by the MIT Joint Program, in their report Analysis of Climate Policy Targets under Uncertainty. The authors demonstrated that even a modest attempt to mitigate emissions could profoundly affect the risk profile for equilibrium surface temperature. In the chart below five mitigation scenarios are shown, from a “do nothing” approach to a very stringent climate regime (Level 1, akin to the IPCC 2 °C case). They note in the report that:

An important feature of the results is that the reduction in the tails of the temperature change distributions is greater than in the median. For example, the Level 4 stabilization scenario reduces the median temperature change by the last decade of this century by 1.7 °C (from 5.1 to 3.4 °C), but reduces the upper 95% bound by 3.2 °C (from 8.2 to 5.0 °C). In addition to being a larger magnitude reduction, there are reasons to believe that the relationship between temperature increase and damages is non-linear, creating increasing marginal damages with increasing temperature (e.g., Schneider et al., 2007). While many estimates of the benefits of greenhouse gas control focus on reductions in temperature for a reference case that is similar to our median, these results illustrate that even relatively loose constraints on emissions reduce greatly the chance of an extreme temperature increase, which is associated with the greatest damage.

 Temperature uncertainty
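The asymmetry the MIT authors highlight is simply that the tail of the distribution moves further than the middle. A quick arithmetic check, using only the figures quoted in the passage above:

```python
# Reduction in median vs. upper-tail temperature change under the MIT
# Level 4 stabilization scenario (figures quoted in the passage above).
median_ref, median_l4 = 5.1, 3.4   # °C, median change by 2091-2100
upper_ref, upper_l4 = 8.2, 5.0     # °C, upper 95% bound

median_cut = round(median_ref - median_l4, 1)   # shift in the middle
tail_cut = round(upper_ref - upper_l4, 1)       # shift in the extreme tail

# The tail shrinks nearly twice as much as the median: mitigation buys
# most of its value by trimming the extreme, highest-damage outcomes.
ratio = round(tail_cut / median_cut, 2)
```

That factor of roughly 1.9 between tail and median reductions is the risk-management argument in miniature: even modest mitigation mostly removes the worst outcomes.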

There is a certain orthodoxy in only looking at 2 and 4 °C scenarios. It plays into the unhelpful discussion that seems to prevail in climate politics that “it must be 2 °C or it’s a catastrophe.” I posted a story on this late last year. As it becomes increasingly clear that the extreme mitigation scenario required for 2 °C simply isn’t going to happen, society will need to explore the area between these two outcomes, with a view to establishing what can actually be achieved in terms of mitigation and to what extent that effort will shift the impact risk. Maybe this is something for the 6th Assessment Report.

What to make of 2013?

It’s difficult to sum up 2013 from a climate standpoint, other than to note that it was a year of contrast and just a little irony. Overall progress in actually dealing with the issue of global emissions made some minor gains, although there were a few setbacks of note along the way as well.

  • The IPCC released the climate science part of their 5th Assessment Report and that managed to keep the media interested for about a day, after which it was back to issues such as health care, economic growth, Euro-problems and assorted regional conflicts. Importantly, the report introduced into the mainstream the much more challenging model for global emissions, which recognizes that it is the long term accumulation that is important, rather than emissions in any particular year.
  • The global surface temperature trend remained stubbornly flat, despite every indication that the heat imbalance due to increasing amounts of CO2 in the atmosphere remains in place and is therefore warming the atmosphere / ice / ocean system somewhere, although where exactly remained unclear. The lack of a clear short-term trend became a key piece of evidence for those who argue there is no issue with changing the concentration of key components of the atmosphere, which further challenged the climate science community to provide some answers.
  • The UNFCCC continued to put a brave face on negotiations that are being outpaced by most of the world’s declining glaciers, while the world’s largest emitter, China, often thought of as blocking progress at the international level, kicked off a number of carbon pricing trial systems in various parts of the country.
  • Australia elected a government that proudly announced on its first day in office that the carbon pricing system, finally in place and operating after eight years of argument, would be dismantled – only to be confronted by the fact that the country sweltered under the hottest annual conditions ever recorded in that part of the world.
  • Several very unusual global weather extremes were reported, including what may be the most powerful storm ever to make landfall, yet there was a distinct lack of desire by scientists and commentators to attribute anything to the rising level of CO2 in the atmosphere, except perhaps for the UNFCCC negotiator from the Philippines, who went on a brief hunger strike in response to the devastation that hit parts of his country.
  • The EU carbon price remained in the doldrums for the entire year, although did show a few signs of life as the Commission, Parliament and various Member States teased, tempted and taunted us with the prospect of action to correct the ETS and set it back on track. In the end, the “backloading” proposal was passed by the Parliament and will likely be adopted and implemented, but the test will be whether or not the Commission now has the backbone to propose and unconditionally support the necessary long term measures to see the ETS through to 2030 as the main driver of change.
  • For the first time that I had seen, a book was released that finally got to grips with the emissions issue, yet somewhat alarmingly failed to find any clear route out of the dilemma we collectively find ourselves in. “The Burning Question”, by Mike Berners-Lee and Duncan Clark, recognized how difficult the emissions challenge has become and questioned those who trivialize the issue by arguing that more renewable energy and better efficiency is all that is needed to solve the problem. Clearly a book for those who designed the hallway posters [Link] at COP19 in Warsaw to read. Closer to home, new Shell Scenarios released in March 2013 [Link] did chart a pathway out of the emissions corner that Mike and Duncan painted themselves into, but the much discussed 2°C wasn’t quite at the end of it.
  • The IEA put climate change back in the headlines of their World Energy Outlook, with a special supplement released in June outlining a number of critical steps that need to be taken to keep the 2°C door open. Unfortunately they hadn’t taken the time to read “The Burning Question” and consequently positioned enhanced energy efficiency as a key step to take over this decade.
  • In North America both the US and Canadian Federal governments continued to head towards a regulatory approach to managing emissions, while States and Provinces respectively continued to push for carbon pricing mechanisms. California and Quebec linked their cap and trade systems to create a first cross border link in the region.
  • The World Bank Partnership for Market Readiness continued its mission of preparing countries for carbon markets and carbon pricing, with numerous “works in progress” to show for the efforts put in to date. But the switch from early trials and learning by doing phases to robust carbon trading platforms underpinning vibrant markets remains elusive.

 These were all important steps, particularly those that tried to broaden or strengthen the role of carbon pricing. On that particular issue, 2013 saw both positive and negative developments, with progress best described as “baby steps” rather than anything substantial. With a change in the European Parliament ahead, mid-term elections in the US and Australia in the process of unwinding its carbon pricing system, it is difficult to see where the big carbon pricing story in 2014 will come from. Perhaps the tinges of orange (see below) now beginning to appear in South America will flourish and green, with COP20 being held in that region towards the end of the year.

 Slide4

Slide3

Slide2

 

Slide1

As expected, and as had been widely leaked, the IPCC 5th Assessment WG1 Report released last week presented a range of evidence that further underpinned the case for anthropogenic warming of the climate system. By contrast with the 4th Assessment Report issued in 2007, the stated chance of a human link shifted from very likely to extremely likely. Pages of supporting evidence were presented.

But there was another important development since the 2007 report: the concept that cumulative total emissions of CO2 and global mean surface temperature response are approximately linearly related. There was only one reference to cumulative emissions in 2007, and that was simply a means of describing the mitigation challenge we face over this century. The 4th Assessment Report noted that:

Based on current understanding of climate-carbon cycle feedback, model studies suggest that to stabilise at 450 ppm carbon dioxide could require that cumulative emissions over the 21st century be reduced from an average of approximately 670 GtC to approximately 490 GtC.

The 5th Assessment Report takes this much further and devotes considerable attention to the subject. On page 20 of the Summary for Policy Makers, the report states:

  • Cumulative total emissions of CO2 and global mean surface temperature response are approximately linearly related (see Figure SPM.10). Any given level of warming is associated with a range of cumulative CO2 emissions, and therefore, e.g., higher emissions in earlier decades imply lower emissions later.
  • Limiting the warming caused by anthropogenic CO2 emissions alone with a probability of >33%, >50%, and >66% to less than 2°C since the period 1861–1880, will require cumulative CO2 emissions from all anthropogenic sources to stay between 0 and about 1560 GtC, 0 and about 1210 GtC, and 0 and about 1000 GtC since that period respectively. These upper amounts are reduced to about 880 GtC, 840 GtC, and 800 GtC respectively, when accounting for non-CO2 forcings as in RCP2.6. An amount of 531 [446 to 616] GtC was already emitted by 2011.
  • A lower warming target, or a higher likelihood of remaining below a specific warming target, will require lower cumulative CO2 emissions. Accounting for warming effects of increases in non-CO2 greenhouse gases, reductions in aerosols, or the release of greenhouse gases from permafrost will also lower the cumulative CO2 emissions for a specific warming target.
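The quoted budgets make the stock framing concrete. A back-of-envelope calculation of what remained as of 2011, using only the SPM figures above (the GtC to GtCO2 conversion is the molar mass ratio 44/12):

```python
# Remaining cumulative CO2 budgets as of 2011, from the AR5 WG1 SPM figures
# quoted above (non-CO2 forcings accounted for, as in RCP2.6).
budgets_gtc = {">33%": 880, ">50%": 840, ">66%": 800}  # allowed since 1861-1880
emitted_gtc = 531                                      # already emitted by 2011

remaining = {p: total - emitted_gtc for p, total in budgets_gtc.items()}

# GtC -> GtCO2: multiply by the molar mass ratio of CO2 to carbon (44/12).
remaining_gtco2 = {p: round(v * 44 / 12) for p, v in remaining.items()}
```

At roughly 10 GtC per year of total anthropogenic emissions, the ~269 GtC left for a better-than-two-thirds chance of staying under 2°C is under three decades of output at current rates – the stock problem in a nutshell.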

The report also featured the chart below.

 IPCC Cumulative Carbon

 

This is important in that it clearly introduces into the mainstream the notion that the atmospheric CO2 issue is a stock problem, which brings with it a number of implications for both the energy system and the solution set.

For the energy system, the key issue this raises is that the amount of carbon already in the pipeline for consumption is considerably more than the remaining stock consistent with a 2°C temperature anomaly goal. This has been picked up by a variety of organizations, both NGO and financial, and is at the core of the recent discussions on a “carbon bubble”.

But it also points to a critical aspect of finding a solution to the CO2 problem, the use of carbon capture and storage (CCS). I have written a great deal about this in previous postings. Sequestration (or removal of atmospheric carbon) is the only reliable mechanism for managing the stock, which means either increasing the permanent bio stock of carbon through forestry and land use or capturing and storing carbon dioxide geologically (CCS). Unfortunately this doesn’t get much of a mention from the carbon bubble proponents, which is a clear shortfall in their analysis. With the mitigation report coming out from the IPCC in the first half of next year, this WG1 finding may be an important placeholder for a more substantial discussion around sequestration.

One area that is left unaddressed, at least for me, is a better discussion of the role of short lived climate pollutants (SLCP) such as methane, particularly in the context of a stock framework for thinking about the climate issue. Although the IPCC say that the effective stock of CO2 must be reduced to account for the warming impact of SLCP, this isn’t the whole story. The difficulty is that while anthropogenic CO2 stays in the atmosphere for a very long time, gases such as methane do not – they break down to CO2. This means that methane isn’t a stock issue but a flow issue, i.e. the impact of methane released today is to change the rate of current warming, but not really the peak warming that we will likely see at some point late this century or early next century; it is methane emissions at that time that will impact peak warming. It also means that current efforts to reduce methane could be undone unless CO2 is also reduced.
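The flow behaviour can be illustrated with a first-order decay sketch. The ~12-year atmospheric perturbation lifetime used here is an assumed round figure; the point is only that a methane pulse emitted today has largely disappeared before end-of-century peak warming, whereas a large fraction of a CO2 pulse persists:

```python
import math

CH4_LIFETIME = 12.0  # years; assumed round figure for illustration

def ch4_fraction_remaining(years: float) -> float:
    """Fraction of a methane pulse still in the atmosphere after `years`,
    assuming simple first-order (exponential) decay."""
    return math.exp(-years / CH4_LIFETIME)

# A methane pulse emitted today is mostly gone well before end-of-century
# peak warming, which is why it shifts the *rate* of warming, not the peak.
for t in (10, 30, 60, 90):
    print(f"after {t:>2} years: {ch4_fraction_remaining(t):.1%} of the pulse remains")
```

Under these assumptions, less than 1% of today's pulse survives 60 years out, while the CO2 it decays into joins the long-lived stock.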

So that is my take on this first release of the 5th Assessment Report. Of course there is a wealth of data to work through and understand, but this critical concept of cumulative carbon is one that needs to filter through policy circles. Once the penny drops on this story, we might actually see some real progress in policy making that will make a difference.

Bracing for impact

At the end of this week the Intergovernmental Panel on Climate Change (IPCC) is due to release the first part of its 5th Assessment Report, which will cover the science behind the climate change issue. Arguably this is the most important part of the whole 5th Assessment process in that it lays out the foundation for all that is to follow. It is also the foundation document for government policy approaches on climate, UNFCCC collective action and public perception and understanding of the climate issue.

Early versions of the report have been widely leaked and discussed at length online, and in recent weeks more and more media stories have been appearing arguing the pros and cons of the current scientific process. Claims of errors abound, such as this story in the UK Daily Express. Much attention is being paid to the recent “hiatus” in global surface temperatures, even as CO2 levels continue to rise. This is claimed by some as grounds for dismissing the entire warming hypothesis. Unfortunately such calls trivialize a complex issue and simply seek to cast doubt on a global problem that needs continued attention.

As the 5th Assessment Report arrives, we shouldn’t forget that the role that CO2 plays in regulating the temperature of the planet has been well understood for over a century, that the physics and chemistry governing the behaviour of our atmosphere have been developed, built on and refined over the decades, and that the impact of a shift in the level of CO2 in the atmosphere has been highlighted for more than 50 years. It’s only in the last few years, as the crunch point approaches, that online amateurs have decided they have a valid voice in a complex area of science and therefore ought to be listened to.

Just to review a very few of the key landmarks in 120 years of careful thought, observation and analysis:

Arrhenius in 1896 – he was an early pioneer in establishing the link between atmospheric CO2 and temperature;

 Arrhenius

Keeling in the late 1950s established that global CO2 levels were rising sharply;

 keeling_tellus_plot

The White House – the Johnson Administration recognised the CO2 issue in 1965; White House 1965

James Hansen in 1988 alerting the US Congress to the issue;

 James_Hansen

Allen et al. in 2009 defining in very clear terms how CO2 is accumulating and what the limits are; Nature Cover - Trillionth Tonne

One of the keys to public acceptance and understanding of the IPCC report will be the way in which the media report on it. As noted above, in recent weeks there has been considerable press focus on the surface temperature “hiatus”, ranging from thoughtful discussion to questionable journalism – i.e. “actually we’re cooling, claim scientists”. Earlier this week, one of Britain’s leading news organizations chose to preview the Friday release of the IPCC report on the main evening television news. They explained the issue quite well, but then offered two contrasting views of the science. One was an interview with a leading UK climate scientist who is also a contributing author to the IPCC report; the other was with a blogger who lacks credible credentials and objectivity on the issue.

My take on all this is that the IPCC have a difficult time in front of them and current media practices, even from mainstream outlets, won’t really help. The public could well be left more confused than ever, despite a very clear warning in the form of the 5th Assessment Report.

Mid-August is never a great time for activity on the climate front, particularly not in holiday-bound Europe. Perhaps the “highlight” of the week was the leak of a late draft of the upcoming Summary for Policymakers of the IPCC WG1 5th Assessment Report. The formal release is near the end of September, but the leak has caused quite a stir. Most mainstream media have reported on at least one aspect of the draft document, in almost all cases highlighting that more than half of the temperature rise seen over the last 100 years is “extremely likely” to have been caused by the release of CO2 into the atmosphere from the human use of fossil fuels. This is a more certain statement than the IPCC has made in the past.

A few media outlets that felt their audience perhaps knew this ventured further, tending to focus on the sea level rise projections that the IPCC have put forward (at least in this leaked draft). The IPCC note that it is “virtually certain” that the rate of global mean sea level rise has accelerated during the last two centuries. For example, ABC (Australia) reported that “sea levels will rise by between 29 and 82 cm by the end of the century”, with scientists fairly confident that will be the upper limit.

The draft Summary Report also picks up on an issue I posted on last week: the sensitivity of sea level to global temperature, and therefore the very substantial rise that could occur over the coming millennia as a result of the temperature rise now under way. The IPCC note that:

There is very high confidence that the maximum global mean sea level was at least 5 m higher than present and high confidence that it did not exceed 10 m above present during the last interglacial period (129,000 to 116,000 years ago), when the global mean surface temperature was, with medium confidence, not more than 2°C warmer than pre-industrial. This sea level is higher than reported in AR4 owing to more widespread and comprehensive paleoclimate reconstructions. During the last interglacial period, the Greenland ice sheet very likely contributed between 1.4 and 4.3 m sea level equivalent, implying with medium confidence a contribution from the Antarctic ice sheet to the global mean sea level.

National Geographic also picked up on sea level in their September issue, with a cover feature and a comprehensive set of charts, graphs and videos (for the iPad version) as only National Geographic can do. 

natgeo_statue_liberty_sea_level

 

Unfortunately the cover has already come in for criticism from one blog site in particular, arguing that NatGeo is scaremongering, as it would take thousands of years for sea level to rise to that level (about 60 metres) on the Statue of Liberty. In fairness, the magazine does point this out in the article, where a graphic shows what the coast would look like if all the ice sheets melted (i.e. both Antarctica and Greenland, over several thousand years). There is no assertion by NatGeo that this will happen; it is just a graphic to give readers some idea of the full scope of the sea level issue.

All of this raises the possibility that sea level rise may become the eventual poster-child for climate change. While floods, droughts, storms, heat waves and forest fires all have the “wrath of God” feel to them, they are unpredictable, uncertain and have always been with us. No single event can be attributed to anything other than a chaotic system, and a complex statistical analysis over a long period of time is required to make the case that a change is underway. This makes it very difficult to use these events to gain public appreciation of the climate risk and therefore build any consensus around a suitable course of action. As is now also apparent, global surface temperature itself is not that reliable an indicator either, and the alternative of trying to get mainstream understanding of a global heat imbalance (oceans, air, ice etc.) is probably just too difficult. But sea level just keeps rising!

All that being said, an observer of the NASA web site on climate indicators will note that sea level took a remarkable dip and recovery over the last two years. Apparently this is due to very heavy and widespread rain in Australia over that period. The continent acts like a basin and “traps” the water to some extent, at least for a while until evaporation and drainage eventually do their job. Extraordinary!

Recent sea level

A year of weather extremes?

Through 2010 and 2011 in particular, weather extremes seemed to dominate the headlines. Extreme drought, rainfall, flood and wind all played a role in making the period one of the most expensive in terms of damage to infrastructure. In some locations there was also significant loss of life. It was also a time that saw the subject of extreme weather events rise up the climate change agenda, with numerous academic papers, blogs, seminars and campaigns focused on the issue.

Certainly, as the atmosphere moves from one steady state to (presumably) another – one which is warmer and therefore has more energy – weather volatility should increase, at least during the period of transition. This is true in any control system where there is a change in set point (not exactly what is happening in the world, but analogous). The picture below is fairly typical, with large swings in response as the system adjusts to the change.
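The set-point analogy can be sketched numerically: an underdamped second-order system responding to a step change overshoots and rings before settling, which is the kind of transient volatility described above. This is purely illustrative – the parameters are arbitrary, not a climate model:

```python
def step_response(zeta=0.2, omega=1.0, dt=0.01, t_end=50.0):
    """Simulate x'' + 2*zeta*omega*x' + omega^2*x = omega^2*u for a unit
    step in the set point u, using simple explicit Euler integration."""
    x, v = 0.0, 0.0
    xs = []
    for _ in range(int(t_end / dt)):
        a = omega**2 * (1.0 - x) - 2.0 * zeta * omega * v  # u = 1.0 (new set point)
        v += a * dt
        x += v * dt
        xs.append(x)
    return xs

xs = step_response()
peak = max(xs)      # overshoots well above the new set point of 1.0
settled = xs[-1]    # close to 1.0 once the transient dies away
```

With light damping (zeta = 0.2) the response overshoots by roughly 50% and oscillates for several cycles before converging – the "large swings in response" of the analogy.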

So we might well expect to see an increase in extreme weather events and many are now pointing to recent events as evidence. The problem here is that there have always been extreme events and there have also been previous periods of bunched extreme events. This may be driven by climate cycles, such as the El Nino Southern Oscillation (ENSO). A period that shows many similarities to the last two years is 1974-75 when there was a very strong La Nina event, such as the one we are currently experiencing.

 

In the timelines above, the near back-to-back La Nina events of the 1970s and 2010s are shown in blue (also visible in the chart above the timeline) and various extreme events are shown in red. Much similarity exists, although the severe droughts experienced in the southern US states didn’t show up at all in the 1970s. In fact the Texas drought has been shown to be exceptional by any standard.

With so much focus on extreme events, and a further focus by many on an apparent plateau seen in global temperatures in recent years, are we perhaps missing some clearer signals buried in the data? One such signal, which got very little media coverage, was published by the WMO at the very end of 2011 and shows last year to be the hottest La Nina year on record (La Nina years are typically cooler). In fact every La Nina year over the past 40 years has been warmer than the previous one.

Over six decades, and taking just the La Nina years (chart above), there has been a temperature movement of 0.7 °C, or about 0.12 °C per decade. This is somewhat less than the warming rate indicated by the IPCC, but equally it may only be indicative of what is probably the bottom edge of the span of temperature change. It is nevertheless an important trend to understand and follow. Extreme weather events also deserve considerable attention, but there needs to be some increased diligence when it comes to immediately associating them all with climate change.

Later this month the IPCC (Intergovernmental Panel on Climate Change) will launch a very substantial report on Renewable Energy and Climate Change. In advance of that, a “Summary for Policy Makers” was released early this week following the 11th Session of Working Group III of the IPCC, held in Abu Dhabi on 5–8 May. In tandem with the Summary document came a press release, which starts out with the words:

Abu Dhabi, 9 May 2011 – Close to 80 percent of the world’s energy supply could be met by renewables by mid-century if backed by the right enabling public policies, a new report shows.

Not surprisingly, this key phrase was repeated in headlines the world over, with much media enthusiasm for the report. But it isn’t what the Summary is actually about, nor does the Summary give any detail on how this outcome might come about.

Instead, the Summary for Policy Makers provides an extensive overview of the current state of key renewable technologies, including wind, solar, hydro, geothermal, ocean and biomass. There is no doubt, based on the information provided, that renewable energy technologies are maturing rapidly and impressive strides have been made in development and deployment.

The view on the ultimate deployment of renewables and their potential to capture much of the world’s energy market by 2050 comes from a review of some 164 scenarios, with an in-depth review of four. Here it should be noted that the four represent a span from a baseline scenario without a specific mitigation target to three scenarios representing different CO2 stabilization levels. Although we will need to wait for the full report to see the specific details, the fact that three of the four have specific mitigation targets is telling. It implies that these are not scenarios in one important sense: they carry an artificial constraint which dictates the outcome. Such a constraint doesn’t exist in the real world, but must be developed over time as part of the societal response to energy and climate change issues.

In the current Shell energy scenarios, Blueprints and Scramble, there are no specific mitigation targets at a global level. Rather, the scenarios track different levels of response to the issue of carbon emissions. Blueprints, the more optimistic in terms of a response to climate change, sees the early development of carbon pricing and carbon markets throughout much of the world, which in turn drives the rapid deployment of a range of technologies, including renewables, carbon capture and storage and vehicle electrification. It is a bottom-up, national policy driven scenario that pushes technology deployment rates beyond historical norms. By 2100, Blueprints sees stabilization of CO2 at some 540 ppm, with other GHGs adding a further 110 ppm CO2 equivalent, some 650 ppm CO2eq in total. This is above the level that equates to a 2°C rise in global temperatures.

This isn’t to say that targeted scenarios are not instructive. By establishing a future goal and modeling a pathway towards it, we can better understand the role of various technologies and the rates at which they need to be deployed. Such a model also gives insight into the future cost development of certain technologies. The scenario should also test the physical feasibility of the necessary rapid change in the energy system. But none of this means that it is actually possible to achieve such a goal. Society must be suitably motivated to do so and be prepared to shoulder the economic costs, particularly where it involves very early retirement of existing infrastructure.

A key chart in the Summary illustrates the nature of the scenarios that have been sampled.

Category I, II and III scenarios represent CO2 stabilization levels below 485 ppm, a level that many observers now, regrettably, see as unattainable. The green Category I scenarios see stabilization at current levels or below, which implies the deployment of air capture technologies, very large scale use of bio-char sequestration, or similar.

Although there isn’t sufficient information in the Summary to extract the underlying numbers, the chart above also implies that the 2050 world in many of the scenarios uses less energy (or at least not much more) than the current world (492 EJ in 2008). This is partly because primary energy is calculated differently for fossil and renewable sources, but it would also appear that tremendous improvements in energy efficiency are made. This was a key feature of the “100% renewable energy by 2050” scenario that WWF released recently. The real story in that scenario was not the renewables per se (total deployment was not that different to the Shell Blueprints scenario), but the transformation in energy use that accompanied them. This almost certainly requires yet another step change in societal response – it is unlikely to be just about better technology.
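The accounting point is worth unpacking. IPCC reports count the primary energy of non-combustion sources such as wind by the “direct equivalent” method (the electricity itself), whereas the alternative “substitution” method counts the fossil fuel that electricity would have displaced, so the same output looks much larger. A minimal sketch of the two conventions, using entirely hypothetical figures (the 38% average fossil-plant efficiency is an assumption for illustration, not a number from the Summary):

```python
# Two conventions for counting the primary energy of renewable electricity.
# All figures here are hypothetical, chosen only to illustrate the gap.

FOSSIL_EFFICIENCY = 0.38  # assumed average fossil power-plant efficiency


def direct_equivalent(electricity_ej: float) -> float:
    """Direct equivalent method: non-combustion electricity is counted
    at its own energy content."""
    return electricity_ej


def substitution(electricity_ej: float,
                 efficiency: float = FOSSIL_EFFICIENCY) -> float:
    """Substitution method: the same electricity is counted as the fossil
    primary energy that would have been burned to generate it."""
    return electricity_ej / efficiency


wind_output = 10.0  # EJ of wind electricity (hypothetical)
print(direct_equivalent(wind_output))        # 10.0 EJ
print(round(substitution(wind_output), 1))   # 26.3 EJ
```

The same 10 EJ of wind electricity thus appears as 10 EJ under one convention and over 26 EJ under the other, which is one reason a heavily renewable 2050 energy system can show a lower primary energy total than today's without any reduction in the energy services delivered.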

As we head towards the IPCC 5th Assessment Report, this contribution from Working Group III is likely to be an important milestone and much referred to in the coming months. But we should recognize it for what it appears to be, and what it is not. It certainly appears to be a very thorough review of the current state of renewable energy technologies and an important guide as to how they may contribute to the energy system of the future. But it isn’t a forecast of what will be, nor does it appear to be a clear guide as to what is actually doable in the limited time available to respond, particularly given the current economic circumstances the world is experiencing and the political stalemate over carbon emissions in some major economies.