At a future energy workshop that I attended recently, the audience heard from Professor Jorgen Randers of the Center for Climate Strategy at the Norwegian Business School. Although Professor Randers has been involved in scenario planning for many years, he introduced his lecture by saying that it was time to just do a realistic forecast of what is “going to happen” and be done with it. He held out little hope for any sort of coordinated global action on emissions, which basically means the world will just have to come to terms with a higher atmospheric CO2 future. What was interesting, though, was the forecast he then proceeded to give: it certainly wasn’t the runaway apocalypse that some would have us believe we are in for. I should say that Professor Randers earnestly believes we need to do better than this; he just couldn’t see how that might come about.
The forecast he presented is available on his website (www.2052.info). He uses a very small number of key metrics to establish his outlook, but takes a different view on how they might develop. The starting point is population, which he sees reaching a plateau of 8 billion in 2040, at the low end of UN forecasts (but not outside their range). This is because of declining fertility rates as women increasingly move into the workforce and seek careers. As the existing population ages, the global death rate increases as well. This population trend is, not surprisingly, a critical assumption for his forecast.
The next key assumption is that global GDP will begin to slow down, linked in part to the population assumption (the number of people in the 15-65 age bracket actually falls after 2035) and a second critical assumption that continuous improvement in labour productivity will eventually end as this metric plateaus (he noted that the slowdown has already started). The result is global GDP also reaching a plateau in the 2050s and beyond. Linked to this is a plateau in consumption, which is dampened by the need to spend a non-trivial share of global GDP on adaptation and reconstruction (coastal cities etc.) as the climate changes and sea levels continue to rise. This latter point is an important self-regulating part of the analysis.
Randers then turned his attention to energy use, shown in the chart below. With energy use per unit of GDP (efficiency) continuing its downward trend, global energy use peaks in 2040 and then declines.
The energy mix also changes over the period, with renewable energy coming on strongly, oil use reaching a long plateau around 2020 then declining quite quickly, and both coal and natural gas use peaking in the 2030s. As a result, CO2 emissions peak in 2030 and then decline, dropping by 10 billion tonnes per annum over the subsequent 20 years.
All of this then feeds through to an eventual plateau in atmospheric CO2 and therefore temperature. Randers clearly recognizes that we shoot through 2°C, but the end point in his forecast is about 3°C, not the much higher levels of 4, 5 or even 6°C that some are concerned about. In effect, this is now a self-regulating system, albeit one that has to deal with significant changes in sea level and other impacts.
There was no attempt to endorse any of this, quite the opposite. Randers also noted that some of his assumptions are seen as politically or socially unacceptable, such as the declining birthrate and an eventual plateau in GDP. As such, the forecast itself becomes something of a political hot potato.
Whether he is right or wrong isn’t really the point, what is interesting about the analysis is that some very small changes in basic assumptions can have a profound effect on the outcome. Pretty much anybody that has constructed even the simplest spreadsheet with built in growth rates recognizes this, but I hadn’t seen it applied in this manner before. Even though they may be outside our normal expectation, all of his assumptions fall within the bounds of credibility, so the forecast is essentially a valid one. A 3°C world is far from where we are today, but it is useful to recognize that our global climate / economic system is now essentially a single entity and that there may be an outcome which is very different to the alternate vision of “meltdown”.
The additional tag line for this book, “Rising Sea Level and the Coming Coastal Crisis”, pretty much describes the story that unfolds even from the very early pages, although this book charts a somewhat different course, at least in the first half. With recent events such as Hurricane Sandy in mind, there is a tendency to focus on sea level rise as a short term or even “now” event. But our descendants may see vast changes in the shape of the coastline and the viability of the great cities for hundreds and perhaps thousands of years to come, as a result of the change in atmospheric CO2 that today’s society will hand on to them. That is what this book is really about.
The author of this book, John Englander, takes the reader on a tour of the past few million years, during which sea level has shifted by tens of metres in response to relatively modest changes in global temperature and CO2 levels. These are changes that sit within the range that CO2 levels have moved over the last century, or are similar to the temperature swing expected to play out over the coming century (even under the hoped-for 2°C limit). Englander builds a strong case that rising sea level is the single aspect of the current warming that will most profoundly impact civilization. This is a message that very few people really get, given the myopic view we tend to have of events.
We hear much about the current 3+ mm rise in sea level per annum and the forecasts of 0.3 metres to 1 or 2 metres of change over this century, but according to Englander this is just the beginning. The slow changes in ice sheet coverage in response to small shifts in global temperature have caused sea levels to rise and fall by 100 metres or more. The last ice age saw sea level some 125 metres lower than today (for a global temperature about 5°C lower), and the warming peak during the previous inter-glacial some 125,000 years ago had sea level topping out some 8 metres above current levels. Estimates vary, but there is good evidence to indicate that sea level and global temperature move together at about 10-20 metres per °C. It’s just that sea level, because of the size of the ice sheets, moves on a millennial rather than annual time scale and takes a very long time to equilibrate.
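The implication of that relation is easy to make concrete. A minimal sketch, using the post's rough 10-20 metres per °C figure (an order-of-magnitude paleoclimate relation, not a modelled result):

```python
def equilibrium_sea_level_rise(delta_t_degc, sensitivity_m_per_degc=(10, 20)):
    """Eventual (millennial-scale) sea level rise implied by the rough
    paleoclimate relation of ~10-20 metres per degree C of warming."""
    lo, hi = sensitivity_m_per_degc
    return delta_t_degc * lo, delta_t_degc * hi

# Even the hoped-for 2 degC limit implies tens of metres, eventually
print(equilibrium_sea_level_rise(2.0))  # (20.0, 40.0)
```

On this crude reckoning, even meeting the 2°C goal would still commit future generations to a coastline tens of metres different from today's, just over a much longer timescale than this century's forecasts cover.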
This is interesting material and Englander does a great job of sharing the evidence and explaining the dynamics, but fifty pages or so into the story, it starts to run out of steam. It’s not a difficult case to make with the wealth of paleoclimate data available, but once made, that is pretty much it. The book then turns its attention back to this century, and the remainder of the story looks at individual cities, immediate government responses (or lack of them) and threats to infrastructure, population bases and low lying countries. It is solid stuff, but not up there with the first half.
There is now something of a religious fervor around the international goal of 2°C, to the extent that it is almost impossible to discuss other trajectories or outcomes. The only contrast that seems possible with 2°C is something that nobody wants, which is the “do nothing” scenario of 4°C or more.
Yet the 2°C pathway is hardly clear cut in itself. A recent series of discussions in a business group I attend has highlighted the range of myth, confusion and misinformation that surrounds the current goal. Given that this is an international goal that most nations subscribe to, exactly where are we headed? The number itself has been around for a while, but it was finally agreed at the Cancun COP16 after first appearing in the text emanating from Copenhagen. The Ad-hoc Working Group on Long Term Cooperative Action agreed the following in Cancun:
Further recognizes that deep cuts in global greenhouse gas emissions are required according to science, and as documented in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, with a view to reducing global greenhouse gas emissions so as to hold the increase in global average temperature below 2 °C above pre-industrial levels, and that Parties should take urgent action to meet this long-term goal, consistent with science and on the basis of equity; also recognises the need to consider, in the context of the first review, as referred to in paragraph 138 below, strengthening the long-term global goal on the basis of the best available scientific knowledge, including in relation to a global average temperature rise of 1.5 °C;
The text itself lays out an intention, but translating this into something tangible is easier said than done. It also turns out to be quite a divisive process and requires a deep dive into some reasonably complex statistics. This was perhaps best highlighted by the paper “Greenhouse-gas emission targets for limiting global warming to 2°C”, Malte Meinshausen, Nicolai Meinshausen, William Hare, Sarah C. B. Raper, Katja Frieler, Reto Knutti, David J. Frame & Myles R. Allen, Nature, Vol. 458, 30 April 2009 (a copy is currently available here). Meinshausen et al. showed that the uncertainty of the climate response, combined with a variety of emission pathways, delivers given probabilities for staying below 2°C depending on the cumulative emissions over the period 2000-2049, with some indication of the eventual outcome also given by emissions in 2020.
Excerpts from the table in the paper, giving probabilities of exceeding 2°C, are shown below:
This is all very well, but the next step is the tough one. The call at Cancun was to “hold the increase below 2°C”, but this means different things to different people. At the meeting I attended recently, some interpreted this as meaning a “reasonable probability”, which was then interpreted as 75%. The table above shows that this means a limit on cumulative emissions between 2000-2049 of 1,000 Gt CO2. But with emissions from 2000-2013 already totalling about 470 billion tonnes, that leaves a remaining budget until 2050 of just 530 billion tonnes. That’s about 14 years of emissions at the current rate or, for example, a trajectory that requires an immediate peak in emissions followed by year-on-year reductions of about 1.2 billion tonnes until emissions are near zero. Delaying the peak until 2020 pushes the required reduction rate up to nearly 3 billion tonnes per annum.
By contrast, accepting a 50% probability gives a very different outcome. Emissions can peak in 2020 and a reduction pace of 1 billion tonnes per annum is then required. Alternatively, should emissions plateau in 2020 and start reducing in 2025, the annual effort rises to 1.5 billion tonnes. These are still very challenging numbers, but almost a world apart from the 75% probability case. The 75% case is most likely unachievable given where the world is today.
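The arithmetic behind these trajectories is simple enough to sketch. If emissions ramp down linearly from a peak to zero, cumulative emissions are the area of a triangle, which pins down the required annual decline. The budget figures are those quoted above; the current emissions rate of roughly 36 Gt CO2 per year is my own round-number assumption:

```python
def required_decline(peak_rate_gt, remaining_budget_gt):
    """Annual decline (Gt CO2/yr, each year) needed so that a linear
    ramp-down from peak_rate to zero stays within the remaining budget.
    Cumulative emissions of such a ramp = peak**2 / (2 * decline)."""
    return peak_rate_gt ** 2 / (2 * remaining_budget_gt)

BUDGET_75PCT = 1000        # Gt CO2 for 2000-2049, ~75% chance of < 2 degC
EMITTED_2000_2013 = 470    # Gt CO2 already emitted
remaining = BUDGET_75PCT - EMITTED_2000_2013   # 530 Gt left

CURRENT_RATE = 36          # Gt CO2/yr, assumed round figure for ~2013

print(int(remaining / CURRENT_RATE))                        # 14 years at full rate
print(round(required_decline(CURRENT_RATE, remaining), 1))  # 1.2 Gt/yr reduction
```

Delaying the peak eats into the budget at the full current rate in the meantime, which is why the required decline climbs so sharply in the delayed-peak cases discussed above.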
What was clear from the meeting I attended was that two people who may both talk about 2°C have very different perspectives on likelihood, usually without any thought as to the reduction implications behind their assumption. The EU is at least clearer on this in its main publication on the 2°C Target, where it notes in the key messages, “In order to have a 50% chance of keeping the global mean temperature rise below 2°C relative to pre-industrial levels . . . . .”.
Last week I attended the official launch in London of a book I reviewed recently, The Burning Question. Both authors were at the launch and they gave a great overview of the energy and climate predicament we have collectively managed to get ourselves into. Key to their message is that carbon emissions are growing exponentially and that no amount of energy efficiency or alternative energy investment is going to change that pathway anytime soon; rather, both approaches may be exacerbating the problem. Of course they did make the point that all exponential systems eventually collapse or at best plateau, but in the meantime emissions continue to rise with no immediate sign of change. As I noted in my initial review, the authors paint themselves into something of a difficult corner and don’t give a great deal of insight as to how to get out, but carbon capture and storage looms large in their thinking. The book follows a line of thought that I have been developing in this blog over the last couple of years, best described here and here.
The morning after the book launch I found myself at a business association meeting where the subject of climate action was top of the agenda for the day. As if in follow-up to the previous evening, we quickly got on to the role of carbon capture and storage (CCS) for mitigation, versus the premise, apparently more attractive to many people, that the focus must be on energy efficiency and renewables, with carbon capture and storage in more of a mop-up role at the end. The efficiency / renewables approach has been played out in numerous scenario exercises, most notably that presented by WWF (with the support of Ecofys) in their 2011 report “100% Renewable Energy by 2050”. In all such cases, and particularly that one, a natural progression of change within the energy system doesn’t feature; rather, a “war-time footing” scenario is advocated. This specific report was also presented to the meeting.
I contrast this with the recent Shell New Lens Scenarios which I discussed in a March posting. These do follow a natural progression forward, driven by social concerns, legislative change and energy economics. The conditions behind the Oceans scenario result in higher uptake of efficiency and much faster renewables deployment. However, these are not strong enough to offset all of the extra pressures for energy demand growth from developing markets in particular. As a result, fossil energy growth is similar to that of Mountains for the next several decades, and so without the strong stimulus for CCS in Mountains, the Oceans scenario results in higher cumulative CO2 emissions over the century and therefore additional warming. The reasons are somewhat similar to those articulated in The Burning Question.
This leads to thinking about climate action in terms of two paradigms. One recognizes the sobering reality of the global energy system as outlined in The Burning Question and seeks to address the issue through a combination of measures, prioritizing a robust carbon price in the energy system and placing a strong emphasis on carbon capture and storage. This tackles the issue from the fossil fuel end, which has the consequence of managing emissions directly (the CCS bit) and drawing in alternatives and reducing demand as pricing dictates (the carbon price bit). The other approach is to tackle the issue from the alternatives end, which results in forced efficiency measures and subsidized renewable energy coming into the mix. Following the logic of The Burning Question, this is like putting the energy system on steroids which pumps up global demand and potentially even forces emissions to rise.
Back then to the business association meeting which, at least in part, was also attended by a prominent official in the global climate process. The inevitable question as to the role of CCS arose and a debate around mitigation priorities got going. Many, including the official present in the room, took the view that efficiency and renewables were critical to the change process required and that this is where the emphasis must be.
Of course the real sweet spot is somewhere in the middle, where there is a strong attack on emissions through carbon pricing and CCS, but in combination with a more rapid displacement of fossil energy with alternatives such as solar and nuclear. This isn’t easy to achieve as the social conditions for one are somewhat counter to those needed for the other. This is one paradox that also comes out of the New Lens Scenarios. Nevertheless, if those in leadership positions are sitting at one end of this spectrum rather than squarely in the middle, will we ever get a solution that actually addresses the problem head on? Perhaps The Burning Question needs to be distributed more widely!
The world is not on track to meet the target agreed by governments to limit the long term rise in the average global temperature to 2 degrees Celsius (°C).
International Energy Agency, June 2013
The International Energy Agency (IEA) is well known for its annual World Energy Outlook, released towards the end of each year. In concert with the WEO come one or more special publications, and this year is no exception. Just released is a new report which brings the IEA’s attention back squarely to the climate issue, Redrawing the Energy-Climate Map. The IEA has traditionally approached the climate issue through its 450 ppm scenario. While that scenario features again this time, the report also goes further with a more pragmatic model for thinking about emissions: the “trillion tonne” approach, which I have discussed at some length in previous posts.
The report looks deeply into the current state of climate affairs and, as a result, fires a warning shot across the bows of current national and UNFCCC efforts to chart a pathway in keeping with the global goal of limiting warming to 2 °C above pre-industrial levels. The IEA argues that we are on the edge of the 2 °C precipice and recommends a series of immediate steps to at least stop us falling in. Under the catchy soundbite of “4 for 2°C”, the IEA recommends four immediate steps in the period from now to 2020:
Rapid improvements in energy efficiency, particularly for appliances, lighting, manufacturing machinery, road transport and within the built environment.
Phasing out older, inefficient coal-fired power stations and restricting less efficient new builds.
Reductions in fugitive methane emissions in the oil and gas industry.
Reductions in fossil fuel subsidies.
These will supposedly keep some hope of a 2°C outcome alive, although the IEA makes it clear that much more has to be done in the 2020s and beyond. However, it didn’t go so far as to say that the 2°C patient is dead; rather, it is on life support.
I had some role in all this and you will find my name in the list of reviewers on page 4 of the report. I also attended a major workshop on the issue in March where I presented the findings of the Shell New Lens Scenarios and as a result advocated for the critical role that carbon capture and storage (CCS) must play in the solution set.
As a contributor, I have to say that I am a bit disappointed with the outcome of the report, although it is understandable how the IEA has arrived where it has. There just isn’t the political leadership available today to progress the things that really need to be done, so we fall back on things that sound about right and are at least broadly aligned with what is happening anyway. As a result, we end up with something of a lost opportunity and, more worryingly, support an existing political paradigm which doesn’t fully recognize the difficulty of the issue. By arguing that we can keep the door open to 2°C with no impact on GDP and by only doing things that are of immediate economic benefit, the report may even be setting up more problems for the future.
My concern starts with the focus on energy efficiency as the principal interim strategy for managing global emissions. Yes, improving energy efficiency is a good thing to do, and cars and appliances should be built to minimize energy use, although always with a particular energy price trajectory in mind. But will this really reduce global emissions and, more importantly, will it make any difference by 2020?
My personal view on these questions is no. I don’t think actions to improve local energy efficiency can reduce global emissions, at least until global energy demand is saturated. Currently, there isn’t the faintest sign that we are even close to saturation point. There are still 1-2 billion people without any modern energy services and some 4 billion people looking to increase their energy use through the purchase of goods and services (e.g. mobility) to raise their standard of living. Maybe 1-1.5 billion people have reached demand saturation, but even they keep surprising us with new needs (e.g. Flickr now offers 1 TB of free storage for photographs). Improvements in efficiency in one location either result in a particular service becoming cheaper, and typically more abundant, or simply make that same energy available to any of the 5 billion people mentioned above at a slightly lower price. Look at it the other way around: which oil wells, coal mines or gas production facilities are going to reduce output over the next seven years because the energy efficiency of air conditioners is further improved? The fossil fuel industry is very supply focused and, with the exception of substantial short term blips (the 2008 financial crisis), just keeps producing. Over a longer timespan lower energy prices will change the investment portfolio and therefore eventual levels of production, but in the short term there is little chance of this happening. This is a central premise of the book I recently reviewed, The Burning Question.
Even exciting new technologies such as LED lighting may not actually reduce energy use, let alone emissions. Today, thanks to LEDs, it’s not just the inside of buildings where we see lights at night, but outside as well. Whole buildings now glow blue and red, lit with millions of LEDs that each use a fraction of the energy of their incandescent counterparts – or it would be a fraction if incandescent lights had even been used to illuminate cityscapes on the vast scale we see today. The sobering reality is that lighting efficiency has only ever resulted in more global use of lighting and more energy and more emissions, never less.
The result of increases in luminous efficacy has been an increase in demand for energy used for lighting that nearly exactly offsets the efficiency gains—essentially a 100% rebound in energy use.
I don’t think this is limited to just lighting. Similar effects have been observed in the transport sector. Even in the built environment, there is evidence that as efficiency measures improve home heating, average indoor temperatures rise rather than energy use simply falling.
The second recommendation focuses on older and less efficient coal-fired power stations. In principle this is a good thing to do and at least starts to contribute to the emissions issue. It is actually happening in the USA and China today, but is it leading to lower emissions globally? In the USA national emissions are certainly falling as natural gas has helped push older coal-fired power stations to close, but much of the coal that was being burnt is now being exported, to the extent that global emissions may not be falling. Similarly in China, older inefficient power stations are closing, but the same coal is going to newer plants where higher efficiency just means more electricity, not lower emissions. I discussed the efficiency effect in power stations in an old posting, showing how under some scenarios increasing efficiency may lead to even higher emissions over the long term. For this recommendation to be truly effective, it needs to operate in tandem with a carbon price.
The third and fourth recommendations make good sense, although in both instances a number of efforts are already underway. In any case, their contribution to the whole is much less than that of the first two. In the case of methane emissions, reductions now are really only of benefit if CO2 emissions are also managed over the longer term. If aggressive CO2 mitigation begins early, and is maintained until emissions are close to zero, comprehensive methane (and other Short Lived Climate Pollutants – SLCP) mitigation substantially reduces the long-term risk of exceeding 2˚C (even more so for 1.5˚C). By contrast, if CO2 emissions continue to rise past 2050, the climate warming avoided by SLCP mitigation is quickly overshadowed by CO2-induced warming. Hence SLCP mitigation can complement aggressive CO2 mitigation, but it is neither equivalent to, nor a substitute for, near-term CO2 emission reductions (see the Oxford Martin Policy Brief, The Science and Policy of Short Lived Climate Pollutants).
After many lengthy passages on the current bleak state of affairs with regard to global emissions, the weak political response and the “4 for 2°C” scenario, the report gets to a key finding for the post-2020 effort: the need for carbon capture and storage. Seventy-seven pages into the document, it finally says:
In relative terms, the largest scale-up, post-2020, is needed for CCS, at seven times the level achieved in the 4-for-2 °C Scenario, or around 3 100 TWh in 2035, with installation in industrial facilities capturing close to 1.0 Gt CO2 in 2035.
Not surprisingly, I think this should have been much closer to page one (and I have heard that at the London launch, which I wasn’t able to attend, the IEA did a better job of promoting CCS in the presentation). As noted in the recently released Shell New Lens Scenarios, CCS deployment is the key to resolving the climate issue over this century. We may use it on a very large scale as in Mountains or on a more modest scale as in Oceans, but either way it has to come early and fast. For me this means it needs to figure in the pre-2020 thinking, not with a view to massive deployment, as it is just too late for that, but at least with a very focused drive to deliver several large scale demonstration projects in the power sector. The IEA correctly notes that there are none today (page 77: “there is no single commercial CCS application to date in the power sector or in energy-intensive industries”).
Of course, large scale deployment of CCS from 2020 onwards will need a very robust policy framework (as noted in Box 2.4), and that will also take time to develop. Another key finding that didn’t make it to page one is instead at the bottom of page 79, where the IEA states:
Framework development must begin as soon as possible to ensure that a lack of appropriate regulation does not slow deployment.
For those that just read the Executive Summary, the CCS story is rather lost. It does get a mention, but is vaguely linked to increased costs and protection of the corporate bottom line, particularly for coal companies. The real insight of its pivotal role in securing an outcome as close as possible to 2°C doesn’t appear.
So my own “2 for 2°C before 2020” would be as follows:
Demonstration of large-scale CCS in the power sector in key locations such as the EU, USA, China, Australia, South Africa and the Gulf States. Not all of these will be operational by 2020, but all should be well underway. At least one “very large scale” demonstration of CCS should also be underway (possibly at the large coal to liquids plants in South Africa).
Development and adoption of a CCS deployment policy framework, with clear links coming from the international deal to be agreed in 2015 for implementation from 2020.
In recent months there has been a renewed look at the idea of a financial carbon bubble, or unburnable carbon reserves. Most recently, a report from Carbon Tracker, with a foreword by Lord Stern of the Grantham Research Institute on Climate Change (London School of Economics), argued that serious risks are accumulating for investors in high carbon assets, such as coal mining companies and the oil and gas industry.
The idea of the “carbon bubble” is based on a concept that I have discussed many times in this blog: that there is a finite limit to the “atmospheric space” for CO2 while still ensuring that warming does not rise above 2 °C. That limit is about one trillion tonnes of carbon.
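A note on units is worth making here, since budgets in this debate are quoted both in tonnes of carbon and in tonnes of CO2 (as in the 1,000 Gt CO2 budget discussed in an earlier post). The two differ by the ratio of molecular weights, 44/12, so the trillion-tonne carbon limit corresponds to roughly 3,700 billion tonnes of CO2:

```python
CO2_PER_C = 44 / 12   # molecular weight of CO2 over atomic weight of carbon

def carbon_to_co2(gt_carbon):
    """Convert gigatonnes of carbon to gigatonnes of CO2."""
    return gt_carbon * CO2_PER_C

# The "trillion tonne" (carbon) limit expressed in CO2 terms
print(round(carbon_to_co2(1000)))  # 3667
```

The two quantities are easy to conflate: the trillion tonnes here is a cumulative all-time carbon total, not a 2000-2049 CO2 budget, and mixing the units (or the accounting periods) is a common source of confusion in these discussions.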
The issue of the bubble arises because the combined proven oil, gas and coal reserves currently on the books of fossil fuel companies (and governments in the case of NOCs) will produce far more than this amount of CO2 when consumed. This implies that in a world where the 2 °C limit is imposed and achieved, most of the future value generation of the companies involved will never be realized, and therefore investors in them today are looking at a financial bubble that may well burst in front of them. According to my analysis and the global reserves data in the BP Statistical Review of World Energy, we get to about 1.6 trillion tonnes of carbon as shown below. This equates to the use of total current fossil energy reserves of about 900 billion tonnes of carbon equivalent (the balance comes from the use of cement and land use change).
The report clearly sets out the global carbon budget, the reserves outlook and the capital currently being consumed to expand those reserves, and comes to the additional conclusion that this part of the global energy system will waste trillions in capex over the coming decade as it develops further reserves that could also become unburnable. The report authors argue that even the massive application of carbon capture and storage will do little to help the situation.
There is really nothing to argue about in terms of the CO2 math itself. It is certainly the case that current proven reserves will take us well past 2 °C if completely consumed and the CO2 emitted. But now comes the reality check!
What is missing in the report is any discussion about the dynamics of the global energy system, the need to meet energy demand and of course the rapid growth we are seeing in that demand. To bring all this math into the equation it is probably best to turn to the new Shell Energy Scenarios, released about two months ago. I discussed these at some length a few weeks back.
In the context of this discussion, the initial focus should probably be on the Oceans scenario, in that it sees the very rapid introduction of solar energy, with eventual large scale displacement of fossil fuels in the second half of the century. Global energy demand rises from 535 EJ in 2010 to 777 EJ in 2030 and 1056 EJ in 2060. Although solar (mainly PV) is the largest single energy source by 2060, total carbon consumed through fossil fuel use amounts to 800 billion tonnes of carbon by the end of the century, just a bit less than current proven reserves (900 billion tonnes as indicated above). The large consumption of fossil fuel is required simply to meet energy needs as renewable energy attempts to catch up with overall demand (which it won’t do until sometime in the 22nd century). This change is purely through the market and social dynamics present in the Oceans scenario, which sees strong growth, improved energy efficiency driven by higher prices and solar eventually dominating. CCS comes in later in the century, removing about 100 billion tonnes of carbon.
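For scale, the demand figures just quoted imply compound annual growth rates of roughly 1.9% to 2030 and 1.0% thereafter; a quick sketch, using Python purely as a calculator on the EJ numbers above:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1 / years) - 1

# Oceans scenario global energy demand (EJ), per the figures above
print(f"2010-2030: {cagr(535, 777, 20):.2%} per year")
print(f"2030-2060: {cagr(777, 1056, 30):.2%} per year")
```

Even these seemingly modest rates compound to a near-doubling of demand by 2060, which is why renewables in this scenario spend decades adding to the energy total rather than displacing fossil fuels from it.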
By contrast, Mountains is a fossil fuel scenario, but with heavy reliance on CCS from about 2030. Total fossil fuel use is over a trillion tonnes of carbon equivalent, which exceeds current proven reserves. However, CCS removes some 300 billion tonnes of carbon, giving an overall accumulation of 1.25 trillion tonnes by 2100 (current accumulation, plus fossil use to 2100, plus land use change and cement). This is still above the trillion tonne limit, but it is the lower of the two emissions outlooks.
The key lesson from the scenarios in this regard is that both a rapid growth in renewable energy and the early use of CCS are required to manage emissions throughout this century. The paradox is that these exist in different scenarios with entirely different underlying economic and social drivers. It’s quite hard to have both – a world that likes fossil fuel readily gives permission to CCS going forward, but doesn’t really see huge segments of the energy market taken by renewable energy. Nuclear is strong though. Conversely, the distributed energy solar world of Oceans doesn’t want to hear about CCS and therefore leaves it until physical climate pressures (e.g. extreme weather events) force action.
The reality check for the “carbon bubble” proponents is that global energy demands still need to be met and that there are limits to the growth rate of fossil energy substitutes, even as climate goals come under pressure.