Archive for the ‘Energy technology’ Category

As we head towards COP21 in Paris at the end of 2015, various initiatives are coming to the fore to support the process. So far these are non-governmental in nature, for example the “We Mean Business” initiative backed by organisations such as WBCSD, CLG and The Climate Group. In my last post I also made mention of the World Bank statement on carbon pricing.

2 C Puzzle - 3 pieces

This week has seen the launch of the Pathways to Deep Decarbonization report, the interim output of an analysis led by Jeffrey Sachs, director of the Earth Institute at Columbia University and of the UN Sustainable Development Solutions Network. The analysis, living up to its name, takes a deeper look at the technologies needed to deliver a 2°C pathway and, rather than come up with the increasingly overused “renewables and energy efficiency” slogan, actually identifies key areas of technology that need a huge push. They are:

  • Carbon capture and storage
  • Energy storage and grid management
  • Advanced nuclear power, including alternative nuclear fuels such as thorium
  • Vehicles and advanced biofuels
  • Industrial processes
  • Negative emissions technologies

These make a lot of sense and much has been written about them in other publications, except perhaps the second last one. Some time back I made the point that solar PV enthusiasts tend to forget about the industrial heartland; that big, somewhat ugly part of the landscape that makes the base products which go into everything we use. Processes such as sulphuric acid, chlorine, caustic soda and ammonia manufacture, let alone ferrous and non-ferrous metal production, often require vast inputs of heat, typically with very large CO2 emissions. In principle, many of these heat processes could be electrified, or the heat could be produced with hydrogen. Electrical energy can, in theory, provide this through the appropriate use of directed-heating technologies (e.g. electric arc, magnetic induction, microwave, ultraviolet, radio frequency). But given the diversity of these processes and the varying contexts in which they are used (the scale and organization of the industrial processes), it is highly uncertain whether industrial processes can be decarbonized using available technologies. As such, the report recommends a much greater RD&D effort in this area to ensure a viable deep emission reduction pathway.

Two key elements of the report have also been adopted by the USA and China under their U.S.-China Strategic and Economic Dialogue. In an announcement on July 9th, they noted the progress made through the U.S.-China Climate Change Working Group, in particular the launching of eight demonstration projects – four on carbon capture, utilization, and storage, and four on smart grids.

Reading through the full Pathways report I was a bit disappointed that a leading economist should return to the Kaya Identity as a means to describe the drivers of CO2 emissions (Section 3.1 of the full report). As I noted in a recent post, it certainly describes the way in which our economy emits CO2 on an annualised basis, but it doesn’t give much insight into the underlying reality of cumulative CO2 emissions, which is linked directly to the value we obtain from fossil fuels and the size of the resource bases that exist.
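For readers unfamiliar with it, the Kaya Identity simply decomposes annual emissions into a product of four factors: population, GDP per person, energy intensity of GDP and carbon intensity of energy. A minimal sketch, using round illustrative numbers rather than figures from the report, shows both what it captures and what it leaves out:

```python
# Kaya Identity: annual CO2 = P x (GDP/P) x (Energy/GDP) x (CO2/Energy).
# All input values below are illustrative round numbers, not data from the report.

def kaya_emissions(population, gdp_per_capita, energy_intensity, carbon_intensity):
    """Return annual CO2 emissions in kg.

    gdp_per_capita in $/person, energy_intensity in MJ/$,
    carbon_intensity in kg CO2/MJ.
    """
    return population * gdp_per_capita * energy_intensity * carbon_intensity

# ~7 billion people, $10,000 GDP per person, 5 MJ per $, 0.07 kg CO2 per MJ
annual_gt = kaya_emissions(7e9, 1e4, 5, 0.07) / 1e12  # kg -> Gt
print(round(annual_gt, 1))  # -> 24.5 Gt CO2 per year
```

A useful snapshot of a single year, but nothing in it tracks the cumulative total that actually matters for warming.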

Finally, Sachs isn’t one to shy away from controversy, and in the first chapter the authors argue that governments need to get serious about reducing emissions:

The truth is that governments have not yet tried hard enough—or, to be frank, simply tried in an organized and thoughtful way—to understand and do what is necessary to keep global warming below the 2°C limit.

I think he’s right. There is still a long way to go until COP21 in Paris, and even further afterwards, to actually see a real reduction in emissions, rather than reduction by smoke and mirrors, which is arguably where the world is today (CO2 per unit of GDP, reductions against non-existent baselines, efficiency improvements, renewable energy goals and the like). These may all help governments get the discussion going at a national or regional level, which is good, but then there needs to be a rapid transition to absolute CO2 numbers and away from the various other metrics.

Two sides to every coin

As we near the middle of the year and therefore have, at least in the Northern Hemisphere (Germany, for instance), long days with lots of sunshine, renewable energy statistics start to appear in the media and the renewables distortion field enveloping much of Europe expands just that little bit more. The first of these I came across was posted by a number of on-line media platforms and highlighted the fact that on Sunday May 11th Germany generated nearly three quarters of its electricity from renewable sources. Given the extraordinary level of solar and wind deployment in recent years, it shouldn’t be a surprise that this can happen. But it’s rather a one-sided view of the story.

The flip side is of course December and January, when the solar picture looks very different. The Fraunhofer Institute for Solar Energy Systems ISE uses data from the EEX Platform to produce an excellent set of charts showing the variability of renewable energy, particularly solar and wind. The monthly data for solar shows what one might expect in the northern latitudes, with very high solar output in summer and a significant tailing off in winter. The ratio between January and July is a factor of 15 on a monthly average basis.

Annual solar production in Germany 2013

But wind comes to the rescue to some extent, firstly with less overall monthly variability and secondly with higher levels of generation in the winter which offsets quite a bit of the loss from solar.

Annual wind production in Germany 2013

The combination of the two provides a more stable renewable electricity supply on a monthly basis, with the overall high-to-low production ratio falling to about 2. One could argue from this that, to get some gauge of the real cost of renewable energy in Germany, monthly production of 6 TWh of electricity requires about 70 GW of installed solar and wind capacity (the 2013 average, split roughly 50:50). By comparison, 70 GW of natural gas CCGT online for a whole month at its rated capacity would deliver 51 TWh of electricity, nearly a factor of 9 more than the same amount of installed solar plus wind. To be fair, some of that 70 GW of natural gas capacity will have downtime for maintenance and the like, but even with a 20% capacity loss, bringing output down to about 41 TWh, the delivery factor is still about 7. For solar on its own it will be closer to 10 in Germany.
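The delivery-factor arithmetic above is easy to reproduce; a quick back-of-envelope check, assuming an average month of about 730 hours:

```python
# Back-of-envelope check of the CCGT vs solar+wind comparison in the text.
HOURS_PER_MONTH = 730  # average month (8,760 h / 12)

ccgt_gw = 70            # same installed capacity as the 2013 solar+wind fleet
solar_wind_twh = 6      # approximate monthly German solar+wind output

ccgt_twh = ccgt_gw * HOURS_PER_MONTH / 1000  # GW x h -> GWh -> TWh at rated capacity

print(round(ccgt_twh, 1))                          # -> 51.1 TWh
print(round(ccgt_twh / solar_wind_twh, 1))         # -> 8.5, "nearly a factor of 9"
print(round(0.8 * ccgt_twh / solar_wind_twh, 1))   # -> 6.8 with 20% downtime
```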

Annual solar + wind production in Germany 2013

But this isn’t the end of the story. Weekly and daily data show much greater intermittency. On a weekly basis the high-to-low production ratio rises to about 4, but on a daily basis it shoots up to 26.

Annual solar + wind production in Germany 2013 by week


Annual solar + wind production in Germany 2013 by day

Fortunately, Germany has an existing and fully functioning fossil fuel plus nuclear baseload generation system, which can easily take up the slack when intermittency brings renewable generation to a standstill. But the cost of this is almost never included in assessments of the cost of renewable power generation. In Germany’s case this is a legacy system and therefore taken for granted, but for countries now building new capacity and extending the grid to regions that previously had nothing, this is a real cost that must be considered.

This is perhaps an anti-leapfrog argument (countering the claim that regions with no grid or existing capacity can leapfrog straight to renewables). The German experience shows that you can shift to renewables more easily when you already have a fully depreciated fossil and nuclear stock and your demand is flat. Otherwise, this looks like a potentially costly story that relies on storage technologies we still don’t have in mainstream commercial use.

____________________________

As a complete aside, but certainly the “flip side” of another issue, I came across this chart, which highlights a counterpart of rising CO2 levels in the ocean and atmosphere due to the combustion of fossil fuels: falling levels of oxygen. This is a very small effect (given the amount of oxygen in the atmosphere) and certainly not a concern in itself, but it is entirely measurable, which is the interesting bit. The chart is produced by Ralph Keeling, son of the originator of the CO2 Keeling Curve.

Falling oxygen levels


In my previous post I responded to an article by environmentalist Paul Gilding where he argued that the rate of solar PV deployment meant it was now time to call “Game over” for the coal, oil and gas industries. There is no doubt that solar PV uptake is faster than most commentators imagined (but not Shell in our Oceans scenario) and it is clear that this is starting to change the landscape for the utility sector, but talk of “death spirals” may, in the words of Mark Twain, be an exaggeration.

In that same article, Gilding also talks about local battery storage via electric cars and the drive to distributed systems rather than centralized ones. He clearly envisages a world of micro-grids, rooftop solar PV, domestic electricity storage and the disappearance of the current utility business model. But there is much more to the energy world than what we see in central London or Paris today, or for that matter in rural Tasmania where Paul Gilding lives. It all starts with unappealing, somewhat messy but nevertheless essential processes such as sulphuric acid, ammonia, caustic soda and chlorine manufacture (to name but a few). Added together, about half a billion tonnes of these four products are produced annually. These are energy-intensive production processes operating on an industrial scale, but largely hidden away from daily life. They are present in, or play a role in the manufacture of, almost everything we use, buy, wear, eat and do. These core base chemicals also rely on various feedstocks. Sulphuric acid, for example, is made from the sulphur found in oil and gas, removed during the various refining and treatment processes. Although there are other viable sources of sulphur, they have long been abandoned for economic reasons.

dow-chemical-plant-promo

The ubiquitous mobile phone (to which everything now seems to get compared when we talk about deployment) and the much talked about solar PV cell are just the tip of a vast energy-consuming industrial system, built on base chemicals such as chlorine, but also making products with steel, aluminium, nickel, chromium, glass and plastics (to name but a few). The production of these materials alone exceeds 2 billion tonnes annually. All of this is of course made in facilities with concrete foundations, using some of the 3.4 billion tonnes of cement produced annually. The global industry for plastics is rooted in the oil and gas industry as well, with the big six plastics (see below) all starting their lives in refineries that do things like convert naphtha from crude oil into ethylene.

The big six plastics:

  • polyethylene – including low density (PE-LD), linear low density (PE-LLD) and high density (PE-HD)
  • polypropylene (PP)
  • polyvinyl chloride (PVC)
  • polystyrene solid (PS), expandable (PS-E)
  • polyethylene terephthalate (PET)
  • polyurethane (PUR)

All of these processes are also energy intensive, requiring utility-scale generation, high temperature furnaces, large quantities of high pressure steam and so on. The raw materials for much of this come from remote mines, another facet of modern life we no longer see. These in turn are powered by utility-scale facilities, huge draglines for digging and vast trains for moving the extracted ores. An iron ore train in Australia might comprise 336 cars moving 44,500 tonnes of iron ore, stretch over 3 km and use six to eight locomotives, including intermediate remote units. These locomotives often run on diesel fuel, although many around the world run on high voltage electric systems, e.g. the 25 kV AC iron ore trains running from Russia to Finland.

The above is just the beginning of the industrial world we live in, built on a utility scale and powered by utilities burning gas and coal. These bring economies of scale to everything we do and use, whether we like it or not. Not even mentioned above is the agricultural world which feeds 7 billion people. The industrial heartland will doubtless change over the coming century, although the trend since the beginning of the industrial revolution has been towards bigger, more concentrated pockets of production, with little sign of a more distributed model. The advent of technologies such as 3D printing may change the end-use production step, but even the material that gets poured into the tanks feeding that 3D printer probably relied on sulphuric acid somewhere in its production chain.

Redrawing the Energy-Climate Map


The world is not on track to meet the target agreed by governments to limit the long term rise in the average global temperature to 2 degrees Celsius (°C).

International Energy Agency, June 2013

The International Energy Agency (IEA) is well known for its annual World Energy Outlook, released towards the end of each year. In concert with the WEO come one or more special publications, and this year is no exception. Just released is a new report which brings the IEA’s attention squarely back to the climate issue, Redrawing the Energy-Climate Map. The IEA has traditionally focused on the climate issue through its 450 ppm scenario. While it continues to do that this time, it also goes further with a more pragmatic model for thinking about emissions, the “trillion tonne” approach. I have discussed this at some length in previous posts.

The report looks deeply into the current state of climate affairs and as a result fires a warning shot across the bows of current national and UNFCCC efforts to chart a pathway in keeping with the global goal of limiting warming to 2°C above pre-industrial levels. The IEA argues that we are on the edge of the 2°C precipice and recommends a series of immediate steps to at least stop us falling in. With the catchy soundbite of “4-for-2°C”, the IEA recommends four immediate steps in the period from now to 2020:

  1. Rapid improvements in energy efficiency, particularly for appliances, lighting, manufacturing machinery, road transport and within the built environment.
  2. Phasing out of older inefficient coal fired power stations and restricting less efficient new builds.
  3. Reductions in fugitive methane emissions in the oil and gas industry.
  4. Reductions in fossil fuel subsidies.

These will supposedly keep some hope of a 2°C outcome alive, although the IEA makes it clear that much more has to be done in the 2020s and beyond. However, it didn’t go so far as to say that the 2°C patient is dead; rather, it is on life support.

I had some role in all this and you will find my name in the list of reviewers on page 4 of the report. I also attended a major workshop on the issue in March where I presented the findings of the Shell New Lens Scenarios and as a result advocated for the critical role that carbon capture and storage (CCS) must play in the solution set.

As a contributor, I have to say that I am a bit disappointed with the outcome of the report, although it is understandable how the IEA has arrived where it has. There just isn’t the political leadership available today to progress the things that really need to be done, so we fall back on things that sound about right and at least are broadly aligned with what is happening anyway. As a result, we end up with something of a lost opportunity and more worryingly support an existing political paradigm which doesn’t fully recognize the difficulty of the issue. By arguing that we can keep the door open to 2°C with no impact on GDP and by only doing things that are of immediate economic benefit, the report may even be setting up more problems for the future.

My concern starts with the focus on energy efficiency as the principal interim strategy for managing global emissions. Yes, improving energy efficiency is a good thing to do and cars and appliances should be built to minimize energy use, although always with a particular energy price trajectory in mind. But will this really reduce global emissions and more importantly will it make any difference by 2020?

My personal view on these questions is no. I don’t think actions to improve local energy efficiency can reduce global emissions, at least until global energy demand is saturated. Currently, there isn’t the faintest sign that we are even close to saturation point. There are still 1-2 billion people without any modern energy services and some 4 billion people looking to increase their energy use through the purchase of goods and services (e.g. mobility) to raise their standard of living. Maybe 1-1.5 billion people have reached demand saturation, but even they keep surprising us with new needs (e.g. Flickr now offers 1 TB of free storage for photographs). Improvements in efficiency in one location either result in a particular service becoming cheaper and typically more abundant, or simply make that same energy available to any of the 5 billion people mentioned above at a slightly lower price. Look at it the other way around: which oil wells, coal mines or gas production facilities are going to reduce output over the next seven years because the energy efficiency of air conditioners is further improved? The fossil fuel industry is very supply focused and, with the exception of substantial short term blips (the 2008 financial crisis, for example), just keeps producing. Over a longer timespan lower energy prices will change the investment portfolio and therefore eventual levels of production, but in the short term there is little chance of this happening. This is a central premise of the book I recently reviewed, The Burning Question.

Even exciting new technologies such as LED lighting may not actually reduce energy use, let alone emissions. Today, thanks to LEDs, it’s not just the inside of buildings where we see lights at night, but outside as well. Whole buildings now glow blue and red, lit with millions of LEDs that each use a fraction of the energy of their incandescent counterparts – or it would be a fraction if incandescent lights had even been used to illuminate cityscapes on the vast scale we see today. The sobering reality is that lighting efficiency has only ever resulted in more global use of lighting and more energy and more emissions, never less.

doha_skyline_560px

An analysis from Sandia National Laboratories in the USA looks at this phenomenon and concludes:

The result of increases in luminous efficacy has been an increase in demand for energy used for lighting that nearly exactly offsets the efficiency gains—essentially a 100% rebound in energy use.

I don’t think this is limited to lighting. Similar effects have been observed in the transport sector. Even in the built environment, there is evidence that as efficiency measures improve home heating, average indoor temperatures rise rather than energy use simply falling.
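The Sandia finding amounts to two lines of arithmetic. A stylised sketch, with hypothetical service and efficacy figures rather than Sandia’s data: if demand for lumen-hours grows in proportion to luminous efficacy, energy use is unchanged.

```python
# Stylised 100% rebound: lighting demand grows in proportion to efficacy,
# so energy use is flat. The efficacy and demand figures are hypothetical.

def lighting_energy_wh(lumen_hours, efficacy_lm_per_w):
    """Energy (Wh) needed to deliver a given lighting service."""
    return lumen_hours / efficacy_lm_per_w

base = lighting_energy_wh(1e6, 15)      # incandescent-era service level
led = lighting_energy_wh(6 * 1e6, 90)   # 6x the efficacy, but 6x the demand

print(round(base), round(led))  # -> 66667 66667: no net energy saving
```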

The second recommendation focuses on older and less efficient coal fired power stations. In principle this is a good thing to do and at least starts to contribute to the emissions issue. This is actually happening in the USA and China today, but is it leading to lower emissions globally? In the USA national emissions are certainly falling as natural gas has helped push older coal fired power stations to close, but much of the coal that was being burnt is now being exported, to the extent that global emissions may not be falling. Similarly in China, older inefficient power stations are closing, but the same coal is going to newer plants where higher efficiency just means more electricity – not less emissions. I discussed the efficiency effect in power stations in an old posting, showing how under some scenarios increasing efficiency may lead to even higher emissions over the long term. For this recommendation to be truly effective, it needs to operate in tandem with a carbon price.

The third and fourth recommendations make good sense, although in both instances a number of efforts are already underway. In any case their contribution to the whole is much less than that of the first two. In the case of methane emissions, reductions now are really only of benefit if CO2 emissions are also managed over the longer term. If aggressive CO2 mitigation begins early and is maintained until emissions are close to zero, comprehensive mitigation of methane (and other short-lived climate pollutants, SLCPs) substantially reduces the long-term risk of exceeding 2°C (even more so for 1.5°C). By contrast, if CO2 emissions continue to rise past 2050, the climate warming avoided by SLCP mitigation is quickly overshadowed by CO2-induced warming. Hence SLCP mitigation can complement aggressive CO2 mitigation, but it is neither equivalent to, nor a substitute for, near-term CO2 emission reductions (see the Oxford Martin Policy Brief, The Science and Policy of Short-Lived Climate Pollutants).

After many lengthy passages on the current bleak state of affairs with regard to global emissions, the weak political response and the “4-for-2°C” scenario, the report gets to a key finding for the post-2020 effort: the need for carbon capture and storage. Seventy-seven pages into the document, it finally says:

In relative terms, the largest scale-up, post-2020, is needed for CCS, at seven times the level achieved in the 4-for-2 °C Scenario, or around 3 100 TWh in 2035, with installation in industrial facilities capturing close to 1.0 Gt CO2 in 2035.

Not surprisingly, I think this should have been much closer to page one (and I have heard from the London launch, which I wasn’t able to attend, that the IEA did a better job of promoting CCS in the presentation). As noted in the recently released Shell New Lens Scenarios, CCS deployment is the key to resolving the climate issue over this century. We may use it on a very large scale, as in Mountains, or on a more modest scale, as in Oceans, but either way it has to come early and fast. For me this means that it needs to figure in the pre-2020 thinking, not with a view to massive deployment, as it is just too late for that, but at least with a very focused drive to deliver several large-scale demonstration projects in the power sector. The IEA correctly notes that there are none today (page 77: “there is no single commercial CCS application to date in the power sector or in energy-intensive industries”).

Of course, large scale deployment of CCS from 2020 onwards will need a very robust policy framework (as noted in Box 2.4), and that will also take time to develop. Another key finding that didn’t make it to page one is instead at the bottom of page 79, where the IEA states:

Framework development must begin as soon as possible to ensure that a lack of appropriate regulation does not slow deployment.

For those who just read the Executive Summary, the CCS story is rather lost. It does get a mention, but it is vaguely linked to increased costs and protection of the corporate bottom line, particularly for coal companies. The real insight, that CCS has a pivotal role in securing an outcome as close as possible to 2°C, doesn’t appear.

So my own “2-for-2°C before 2020” would be as follows:

  1. Demonstration of large-scale CCS in the power sector in key locations such as the EU, USA, China, Australia, South Africa and the Gulf States. Not all of these will be operational by 2020, but all should be well underway. At least one “very large scale” demonstration of CCS should also be underway (possibly at the large coal to liquids plants in South Africa).
  2. Development and adoption of a CCS deployment policy framework, with clear links coming from the international deal to be agreed in 2015 for implementation from 2020.

But that might take some political courage!

There are many books and thousands of reports on climate change, carbon economics, energy transformation and the like, but few encapsulate the issue as well as a recently released book by Mike Berners-Lee and Duncan Clark, The Burning Question. Judging by the recommendation on the cover, even Al Gore liked it.

bqcover

Rather than speculate on the potential severity of climate events or try to convince readers that simple changes in consumer behaviour and green, job-creating investment will solve everything, the book takes a thought provoking but dispassionate look at the global energy system. The authors discuss the role of fossil fuels and the carbon emission limits that we know we should meet, and set out to explain the rock and the hard place that we find ourselves between. The rock in this case is the trillion-tonne carbon limit for cumulative emissions over time, and the hard place is the abundance of fossil fuels, the rate at which we use them and the relative ease with which more becomes available as demand rises.
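The “rock” can be made concrete with some very rough arithmetic. A sketch with round, illustrative numbers (a ~1 trillion tonne carbon limit with roughly 450 GtC remaining, and ~10 GtC per year of emissions growing at ~2% a year; none of these figures are taken from the book):

```python
# How long an illustrative remaining carbon budget lasts if emissions keep
# growing. All numbers are round assumptions for illustration only.

def years_until_budget_spent(remaining_gtc, annual_gtc, growth=0.02):
    """Count whole years until cumulative emissions exhaust the budget."""
    years = 0
    while remaining_gtc > 0:
        remaining_gtc -= annual_gtc
        annual_gtc *= 1 + growth
        years += 1
    return years

print(years_until_budget_spent(450, 10))  # -> 33: decades, not centuries
```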

Berners-Lee and Clark present a compelling set of stories which show how fossil fuels dominate the global energy market, why it is proving almost impossible to displace them (on a global basis) and why strategies such as improving energy efficiency and deploying renewables are not effective approaches for limiting global emissions. In fact they make the point that in some instances the reverse happens – emissions just rise faster.

The tag line on the cover includes the teaser “So how do we quit?” (using fossil fuels). Do they really know? As the book unfolds and the problem they describe mounts in both complexity and difficulty, there is almost the feeling of a thrilling ending around the corner. SPOILER ALERT. Sadly this is not quite the case, but they do give some useful advice for policy makers trying to get to grips with the issue, and the book itself gives the reader a very different perspective on the energy-climate conundrum (hopefully one that readers of this blog have picked up over time, but here it is all in one book).

I assume that for similar reasons to my own line of thinking (but after beating around the bush about it for 181 pages) they do finally land on a key thought:

In the course of writing this book we have come to think that the most undervalued technology in terms of unlocking international progress on climate change is carbon capture – both traditional CCS for point sources such as power plants and more futuristic ambient air capture technologies for taking carbon directly out of the atmosphere.

It would appear that An Inconvenient Truth and CCS are indeed inextricably linked. Clark and Berners-Lee don’t go so far as to argue that CCS is the convenient answer, but the message on CCS is a strong one. Nevertheless, geoengineering makes a surprise entrance at the end!

Overall, this is an excellent discussion which is both easy to read and hugely informative. It is well worth putting on the summer reading list.

I came across an article from the Breakthrough Institute which argues for the benefits of government support for new energy technologies. The story is a few months old, but still highly relevant; in any case a related story is back on their front page this week. The technology in question is hydraulic fracturing (fracking) to extract natural gas from shale formations (shale gas). Breakthrough have come to the conclusion that the boom in shale gas is largely the result of considerable early investment in the technology by the US DOE. The article argues that this technology has transformed the US energy scene, also resulting in a drop in US CO2 emissions. But the crunch point is the comparison with the EU, where the focus on emissions reduction has been on the development of carbon pricing. Breakthrough argues that the US is shifting rapidly to a lower carbon economy on the back of successful technology-push policies, whereas the EU has a failed carbon market which is now even seeing a resurgence in coal use, some of it imported from the USA.

The differing experiences in Europe and the United States illustrate the relative efficacy of direct technology push versus carbon pricing in emissions reduction and advanced technological deployment. As we wrote in a February 2012 article in Yale e360, “the existence of a better and cheaper substitute has made the transition away from coal much more viable economically, and it has put wind at the back of political efforts to oppose new coal plants, close existing ones, and put in place stronger EPA air pollution regulations.”

. . . . .

America’s investments in technological innovation contrast strongly with the European Union’s preference for pricing signals. As Europe follows through on plans to build new coal plants that will burn for decades and America leads recent global decarbonization trends, we continue to find little evidence of success from the ETS or any other major carbon pricing schemes around the world.

There is no doubt that, from an emissions perspective, the US is benefitting from the current gas boom. Back in June the IEA reported:

US emissions have now fallen by 430 Mt (7.7%) since 2006, the largest reduction of all countries or regions. This development has arisen from lower oil use in the transport sector (linked to efficiency improvements, higher oil prices and the economic downturn which has cut vehicle miles travelled) and a substantial shift from coal to gas in the power sector.

However, the story that Breakthrough is telling is more about linking events after the fact than analyzing the real policy drivers. According to both Breakthrough and an analysis by the Associated Press, DOE funding of fracking goes back decades, as does DOE funding for a range of energy technologies. But this funding wasn’t linked to emissions reduction; it was driven by the general need for energy supply diversity, energy security and therefore the cost of energy. I have always argued for technology funding; it is an essential part of the policy landscape, particularly for technologies such as CCS. Canada has been active in this regard, with significant funding for CCS demonstration, such as for the Shell Quest project.

But it wasn’t technology funding on its own that delivered the change in the US. Price signals have played a key role; they are just less transparent than the carbon price in the EU. Although there isn’t a carbon price mechanism operating across the whole US economy today, existing coal fired power stations, and almost certainly any new ones being considered, are still exposed to carbon pricing. This comes from the expectation of carbon pricing in the future, through regulation under the Clean Air Act or a later Congress implementing direct pricing. Shell uses such a price premise in its own projects, including those in the USA; we are on record at $40 per tonne of CO2. There are also further price signals on coal, such as from the new mercury rules.
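To see roughly what such a price premise does to project screening, consider a sketch; the emission intensities below are typical approximate values, not Shell project data:

```python
# What a $40/tonne CO2 expectation adds to generating costs, using approximate
# emission intensities: coal ~0.9 tCO2/MWh, gas CCGT ~0.35 tCO2/MWh.
CO2_PRICE = 40  # $/tonne, the expectation level mentioned in the text

def carbon_cost_per_mwh(intensity_t_per_mwh, price=CO2_PRICE):
    """Added generating cost ($/MWh) implied by an expected CO2 price."""
    return intensity_t_per_mwh * price

print(round(carbon_cost_per_mwh(0.9), 2))   # -> 36.0 $/MWh for coal
print(round(carbon_cost_per_mwh(0.35), 2))  # -> 14.0 $/MWh for gas CCGT
```

A gap of around $22/MWh in expected carbon cost is material for a plant with a multi-decade life, which is why the expectation alone shapes investment decisions.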

What has worked in the USA is the combination of funding for new energy technologies and a price signal in the market which then drives deployment. It also happens that the coal fleet is old, and even the longevity optimists amongst the power producers are starting to count down the number of years before replacement is due. Eventually, the combination of age, the cost of natural gas, the expected cost of emissions and the investment required to keep the coal running delivers the knockout blow.

Turning to Europe, the modest resurgence in coal use comes from a similar set of sums; it’s just that the answer is different. The natural gas prices currently seen in the USA aren’t available, coal is getting cheaper thanks in part to US exports, and the carbon price signal can even be locked in at relatively low and known levels by using the market. The result is less than desirable from the atmosphere’s perspective, but it is the reality of the current pricing signals. Back in June, Bloomberg reported:

Europe is burning coal at the fastest pace since 2006, as surging imports from U.S. producers such as Arch Coal Inc. (ACI) helped cut prices 26 percent in a year and benefited European power companies including EON AG. Demand for coal, the dirtiest fuel for making electricity, grew 3.3 percent last year in Europe while sales of less-polluting natural gas fell 2.1 percent, the steepest drop since 2009 . . .

None of this means that the EU approach to managing CO2 emissions is wrong or that price signals don’t do anything. Quite the reverse. It’s just that the answers coming out are currently giving some unexpected outcomes.

The Energy Mix

The World Business Council for Sustainable Development (WBCSD) held its annual company delegate conference in Switzerland this week. For the WBCSD Energy and Climate team the event marked the launch of the latest WBCSD publication, “The Energy Mix”. This is a document that started life back in the middle of last year, originally as a response to the reaction from a number of governments to the events in Fukushima. The initial aim was to inform policy makers on the implications of sudden changes in energy policy, such as the decision by the German government to rapidly phase out the use of nuclear power. But as the work got going, the document took on a number of additional dimensions. Many have been covered in previous postings on this blog, but the document does a nice job of bringing a lot of information together in a crisp fold-out brochure format (at the moment the PDF is in regular page format, so the fold-out aspect is rather lost through this medium).

Sitting behind this effort is the WBCSD Vision 2050 work which charts the necessary pathway to a world in 2050 which sees “Nine billion people living well within the means of one planet”. A number of key themes are explored in “The Energy Mix” brochure:

  1. The risk of carbon lock-in, in other words current and “on the drawing board” infrastructure and related emissions being sufficient to consume the remaining global carbon budget (related to a 2°C temperature goal) within the normal remaining lifespan of those assets.
  2. The need for a clear energy policy framework to guide the necessary changes over the coming decades.
  3. The importance of carbon pricing within that framework.
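The lock-in risk in point 1 is, at heart, simple arithmetic. A minimal sketch follows; the budget, emissions and asset-life figures are round illustrative numbers assumed for this example, not taken from the brochure.

```python
# Back-of-envelope sketch of the carbon lock-in arithmetic.
# All figures are assumed round numbers for illustration only.

remaining_budget_gt = 1000.0  # GtCO2 remaining for ~2 degrees C (assumed)
annual_emissions_gt = 35.0    # GtCO2 emitted per year, roughly current (assumed)
asset_lifespan_years = 40     # typical power-plant operating life (assumed)

years_to_exhaust = remaining_budget_gt / annual_emissions_gt

print(f"Budget exhausted in ~{years_to_exhaust:.0f} years at constant emissions")
print(f"Exhausted within one {asset_lifespan_years}-year asset life: "
      f"{years_to_exhaust < asset_lifespan_years}")
```

On these numbers the budget is consumed in under 30 years — well inside the normal operating life of plant being built today, which is precisely the lock-in point.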

The document uses some fifteen vignettes to illustrate a variety of points. For example, to illustrate a) that policy can make a difference, b) that it takes a long time, but c) that it's still very hard to reduce emissions by a big amount, take the case of France. Back in the 1970s the government intervened in the energy system and progressively drove the construction of substantial nuclear capacity and a national high-speed rail network, operating in combination with (like the rest of the EU) high transport fuel taxes. While these measures were not originally intended to reduce CO2 emissions, they are nevertheless compatible with such a goal and could just as easily be the route forward for a country. France now gets about 80% of its electricity from nuclear and has one of the best rail systems in the world, yet emissions have only fallen by 28% in 40 years. Economic growth and population growth continue to eat into the gains made, which might argue for yet further measures in the longer term. However, French emissions on a CO2/GDP basis are about 60% less than in the USA. With a very low CO2 per kWh for power generation, France would be in an excellent position to further decarbonize if electric cars entered the vehicle population in significant numbers. Interestingly, the car company with perhaps the world's most progressive electric vehicle production programme also happens to be French.
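It is worth pausing on what "28% in 40 years" means as an annual rate. A quick check, using only the figures from the text:

```python
# Quick check on the France vignette: what average annual rate of decline
# does a 28% fall over 40 years imply? (numbers taken from the text)

total_reduction = 0.28
years = 40

annual_rate = 1 - (1 - total_reduction) ** (1 / years)

print(f"Average decline: {annual_rate:.2%} per year")  # roughly 0.8% per year
```

Less than one percent per year, even with one of the most decisive energy-policy interventions on record — which is point c) in a nutshell.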

 The key message on the required policy framework is a pretty simple one – cover the key sectors and focus on the elements of the technology development pathway (Discover, Develop, Demonstrate, Deploy). The resulting grid looks like this:

 Filling in the boxes results in something that looks like this:

The framework shouldn't be a big surprise; many of the elements are alive in the EU (though not so well in all cases, such as the carbon price).

The new WBCSD Energy Mix document can be downloaded here.

One of the blogs I read from time to time is that of Paul Gilding, an independent writer on sustainability and former head of Greenpeace International. He spoke at TED last week with a talk called “The Earth is Full”. His blog post this week references the talk and argues why we shouldn’t rely on the “techno-optimist” point of view that all will be okay on the night.

 Driven by their optimism bias, people use the clearly huge opportunity of technology to reassure themselves we won’t face a crisis. They believe any serious limits in the system will be avoided because technology will intervene and we’ll adapt.

I discussed this a while back in an earlier post. Two colleagues in the Shell Scenario team published an article in Nature that showed clear historic trends for the deployment of new energy technologies.


 They derived two “laws” from this work, which are:

 Law 1

When technologies are new, they go through a few decades of exponential growth, which in the twentieth century was characterized by scale-up at a rate of one order of magnitude a decade (corresponding to 26% annual growth). Exponential growth proceeds until the energy source becomes ‘material’ — typically around 1% of world energy.

Law 2

After ‘materiality’, growth changes to linear as the technology settles at a stable market share. These deployment curves are remarkably similar across different technologies.

The “laws” show that it can take up to a generation (i.e. 25-30 years) for an energy technology to become material. Gilding also makes the point that we shouldn’t necessarily draw lessons from the spectacular deployment of technologies such as mobile phones and then assume that the energy industry can do likewise.
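The two "laws" can be sketched in a few lines. The 26% growth rate and the 1% materiality threshold are from the article; the starting share of 0.01% of world energy is an assumed illustrative entry point.

```python
import math

# Sketch of the two deployment "laws". Growth rate and materiality threshold
# come from the article; the starting share is an assumption for illustration.

GROWTH = 1.26       # 26% per year during the exponential phase
MATERIALITY = 0.01  # 'material' at roughly 1% of world energy

# Sanity check: 26%/yr is indeed about one order of magnitude per decade
assert abs(GROWTH ** 10 - 10.0) < 0.1

def years_to_materiality(starting_share):
    """Years of exponential growth needed to go from starting_share to 1%."""
    return math.ceil(math.log(MATERIALITY / starting_share) / math.log(GROWTH))

# From an assumed 0.01% of world energy: two orders of magnitude to climb
print(years_to_materiality(0.0001))  # ~20 years, i.e. about two decades
```

Starting any smaller, say three orders of magnitude below materiality, pushes the figure towards the 25-30 years the text describes, which is why "up to a generation" is the headline conclusion.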

But can’t technology drive rapid change? Everyone at TED holds up their smart phones as a wonderful example of such fast, transformational change. This is a good and correct example, but it needs to be put in perspective. This is what I call a “toy technology” – something that makes our lives more convenient and more fun. These technologies are adding real value to our lives and driving change, but they are not transforming the foundations of our current economy.

Unfortunately the deployment of “toy technology” also follows the “laws”, although the time scales are shortened somewhat. The first hand-held mobile phone call was made around 1975 and Finland had a 20,000 person subscriber trial up and running by 1980 (i.e. first adopter), yet it wasn’t until 1995 that the technology became “material”, reaching 1-2% of the global population. Today the global market is approaching saturation (6 billion subscriptions), though the transition from mobile phone to mobile smart device is now underway. So even in the world of fast-paced technological change, materiality still takes 15 or so years and full-scale deployment another 15-20 years.

So should we be techno-optimists?

For the reasons I argued in my November post, “Can global emissions really be reduced”, it will only be a major technology shift that sees emissions fall dramatically. Ideally this should be introduced through a carbon price because that will pull it into the energy economy faster than would otherwise be the case. Carbon pricing was a principal feature of the Shell Blueprints scenario, which saw electric mobility, solar, wind and CCS all playing major roles in the period to 2050. Emissions do fall in that scenario and the level of CO2 in the atmosphere reaches a plateau, albeit above 450 ppm. 

We need to be optimistic about the role of technology, but also realistic about just how fast the transition can take place. Blueprints exceeded the “laws” in some instances yet still didn’t fully deliver on a 2°C ambition. However, natural gas was not as prevalent in that scenario as it now appears to be, which should be a positive development; on the other hand, the Blueprints transition to a global carbon market was already well underway by this point.