Archive for April, 2012

The Energy Mix

The World Business Council for Sustainable Development (WBCSD) held its annual company delegate conference in Switzerland this week. For the WBCSD Energy and Climate team the event marked the launch of the latest WBCSD publication, “The Energy Mix”. This is a document that started life back in the middle of last year, originally as a response to the reaction of a number of governments to the events at Fukushima. The initial aim was to inform policy makers about the implications of sudden changes in energy policy, such as the decision by the German government to rapidly phase out the use of nuclear power. But as the work got going, the document took on a number of additional dimensions. Many have been covered in previous postings on this blog, but the document does a nice job of bringing a lot of information together in a crisp fold-out brochure format (at the moment the PDF is in regular page format, so the fold-out aspect is rather lost through this medium).

Sitting behind this effort is the WBCSD Vision 2050 work which charts the necessary pathway to a world in 2050 which sees “Nine billion people living well within the means of one planet”. A number of key themes are explored in “The Energy Mix” brochure:

  1. The risk of carbon lock-in, in other words current and “on the drawing board” infrastructure and related emissions being sufficient to consume the remaining global carbon budget (related to a 2°C temperature goal) within the normal remaining lifespan of those assets.
  2. The need for a clear energy policy framework to guide the necessary changes over the coming decades.
  3. The importance of carbon pricing within that framework.

The document uses some fifteen vignettes to illustrate a variety of points. Take the case of France, which illustrates that (a) policy can make a difference, (b) it takes a long time, and (c) it is still very hard to reduce emissions by a large amount. Back in the 1970s the government intervened in the energy system and progressively forced the construction of substantial nuclear capacity and a national high-speed rail network, operating in combination with (like the rest of the EU) high transport fuel taxes. While these measures were not originally intended to reduce CO2 emissions, they are nevertheless compatible with such a goal and could just as easily be the route forward for a country. France now gets about 80% of its electricity from nuclear and has one of the best rail systems in the world, yet emissions have only fallen by 28% in 40 years. Economic growth and population growth continue to eat into the gains made, which might argue for yet further measures in the longer term. However, French emissions on a CO2/GDP basis are about 60% lower than in the USA. With a very low CO2 per kWh for power generation, France would be in an excellent position to decarbonize further if electric cars entered the vehicle population in significant numbers. Interestingly, the car company with perhaps the world’s most progressive electric vehicle production programme also happens to be French.

 The key message on the required policy framework is a pretty simple one – cover the key sectors and focus on the elements of the technology development pathway (Discover, Develop, Demonstrate, Deploy). The resulting grid looks like this:

 Filling in the boxes results in something that looks like this:

The framework shouldn’t be a big surprise; many of the elements are alive in the EU (though not so well in all cases, such as the carbon price).

The new WBCSD Energy Mix document can be downloaded here.

Is the first offer the best?

Energy policy development over the last decade has shown one thing for certain: governments the world over are persistent in their desire to alter the energy mix and/or at least begin to manage emissions. Whether this is purely for environmental reasons, out of concern for energy security, or perhaps for long-term fiscal security almost doesn’t seem to matter; energy policy development and emissions management continue to be a high priority. This then opens up the question of how business should best respond to this trend and what role it should play.

Recent developments in Australia present a useful case study. When the CPRS (Carbon Pollution Reduction Scheme – a national cap-and-trade system) was proposed in 2008, an unintended coalition of certain business interests, the Federal Opposition and Green Party opponents eventually managed to see the bill fail. Many businesses actually supported the bill at the time, but seemingly the planets were not suitably aligned for passage. Had things been different, Australia would now be in the late implementation phase of a relatively benign approach to managing emissions, with a carbon price very likely around AU$10 per tonne, trading on the back of the global price for a Certified Emission Reduction (the UNFCCC offset mechanism) and its link to the EU ETS. Instead, events have resulted in a very different outcome. A fixed carbon price of $23 per tonne will be implemented from July, albeit transitioning to a market-related price in a few years’ time. Recent media reports tell of a heated national debate now underway, with many arguing that the price is out of line with the “prevailing global price” and therefore leaves Australia competitively exposed. Not surprisingly, those who first opposed the CPRS and those concerned about the current price are, in many cases, one and the same. The first offer, in the form of the CPRS, was arguably the better deal, yet it was turned down.

At least two offers have been made in the USA. In 2001 the Bush Administration offered a science and technology based approach which has delivered some results, but given a general lack of enthusiasm for implementation, from the NGO community in particular, with some business groups as unintended allies, the initiative failed in key areas such as the development of carbon capture and storage. Had real progress been made, rollout of the technology might have been underway today. Eight years later the second offer came from the Obama Administration in the form of a national cap-and-trade approach in combination with technology incentives, but this was also declined. Both of these were also relatively benign: the first because it represented an early start and would have been largely government funded, and the second because the overall structure of the deal offered significant competitive protection for key industries and included both a long lead time for implementation and a soft start. The Clean Air Act offer now on the table appears to be the least palatable of all these and could well prove to be less effective in terms of actually reducing emissions. Given that it will require specific actions from large emitters, the implied carbon price for some facilities may be very high. In addition, the approach will address individual sources but may not result in a real reduction of national emissions because no overall cap will be in place.

Canada has also followed a fairly tortuous path in recent years. No substantive national programme to manage emissions has emerged, yet various forms of market-based policy have been tested and rejected. Although carbon pricing mechanisms now exist in some provinces, a national standards-based regulatory approach may well emerge, keeping pace with the Clean Air Act developments now underway in the USA. This is bound to be more complex and almost certainly more costly for business than the cap-and-trade approach that was first proposed back in about 2003. In 2005 a North American cap-and-trade approach was even studied by a combined EPA / Environment Canada Task Force.

(Figure: Canada / United States cap-and-trade)

The increasing number of standards-based or fixed-price approaches that are now “on offer” brings into question the wisdom of defeating “cap-and-trade”. The latter offers compliance flexibility through offset mechanisms, banking and limited borrowing, competitive protection through free allocation in the early phases of implementation, and even technology incentives through constructions such as the NER300 in the EU ETS. By contrast, a standard has limited flexibility, no price transparency and potentially onerous penalties. This would appear to represent something of an “own goal”.

The EU faces a related issue today. Despite some initial grumbling, businesses in Europe actually accepted the first offer of the EU ETS (cap-and-trade), but its effectiveness has slowly eroded over time. This is partly due to the recession, but there is also a policy design cause arising from the superimposition of multiple layers of policy, such as specific renewable energy targets, nuclear build rates, efficiency mandates and more. These policies are well meaning but often misaligned. As the ETS has weakened, this process has accelerated, thereby compounding the problem. The business community is split over what to do about this, with various proposals involving the set-aside of allowances favoured by some, while others argue that the system is naturally responding to events and should be left to find its own way. The problem with the latter position is that it could result in an ETS that becomes politically and economically irrelevant, leaving a standards-based approach as the way forward in Europe as well. Another “own goal” in the making!

With a new report from the IPCC on managing the risks associated with extreme weather, and continued weather phenomena attracting media attention, it is important to attempt to get to grips with the science and statistics behind this rapidly emerging field of research. Back in January I posted a story on the current trend to label any, and sometimes all, extreme weather events as symptomatic of climate change. I argued that a much more rigorous approach is required to understand the links between extreme weather and rising global temperatures.

Work along such lines is starting to develop. Some early work was done by Professor Myles Allen of Oxford University following the extraordinary European heat-wave of 2003. His analysis showed that the event lay so far outside the normal 2-standard-deviation band around the historical average that it could be argued the event would never have occurred without a certain level of background warming. The figure below illustrates this phenomenon.

A recent paper by NASA climatologist James Hansen explores the phenomenon in considerable depth and shows with some conviction that extreme heat events should be a cause for concern. As illustrated in the figure above, Hansen has shown that the distribution of seasonal temperature has indeed shifted, leading to an increase in anomalous events. An important change is the emergence of a category of summertime extremely hot outliers, more than three standard deviations (σ) warmer than the 1951-1980 baseline. This hot extreme, which covered much less than 1% of Earth’s surface in the base period, now typically covers about 10% of the land area. He concludes that extreme heat waves, such as those in Texas and Oklahoma in 2011 and Moscow in 2010, were “caused” by global warming, because their likelihood was negligible prior to the recent rapid global warming.

The variability in global temperatures (weather) can be approximated as a normal (Gaussian) distribution, the so-called ‘bell curve’.  A normal distribution has 68 percent of the anomalies falling within one standard deviation of the mean value.  The tails of the normal distribution decrease quite rapidly so there is only a 2.3% chance of the temperature exceeding +2σ, where σ is the standard deviation, and a 2.3% chance of being colder than -2σ.  The chance of exceeding +3σ is only 0.13% for a normal distribution, with the same chance of a negative anomaly exceeding -3σ.
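
These figures follow directly from the Gaussian tail areas and are easy to check. As a minimal sketch (assuming Python with scipy available; the variable names are purely illustrative):

```python
# Check the normal-distribution probabilities quoted above.
from scipy.stats import norm

within_1_sigma = norm.cdf(1) - norm.cdf(-1)   # anomalies within +/-1 sigma
above_2_sigma = norm.sf(2)                    # chance of exceeding +2 sigma
above_3_sigma = norm.sf(3)                    # chance of exceeding +3 sigma

print(f"Within ±1σ: {within_1_sigma:.1%}")    # ~68.3%
print(f"Above +2σ:  {above_2_sigma:.1%}")     # ~2.3% (same chance below -2σ)
print(f"Above +3σ:  {above_3_sigma:.2%}")     # ~0.13% (same chance below -3σ)
```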

Hansen’s analysis of temperature data over the period 1951-2011 (see figure below) showed that the expected shift in the distribution is actually occurring, with the consequent emergence of a new category of “extremely hot” summers, more than 3σ warmer than the base period. +3σ anomalies practically did not exist in the base period, but in the past several years these extreme anomalies have covered on the order of 10% of the land area. The increase, by more than a factor of 10, in the area covered by extreme hot anomalies (> +3σ) in summer reflects the shift of the anomaly distribution over the past 30 years of global warming. One implication of this shift is that the extreme summer climate anomalies in Texas in 2011, in Moscow in 2010, and in France in 2003 almost certainly would not have occurred in the absence of global warming with its resulting shift of the anomaly distribution. In other words, we can say with a high degree of confidence that these extreme anomalies were a consequence of global warming.

Hansen has concluded that the extreme hot tail of the distribution of temperature anomalies has shifted to the right by more than +1σ in response to the global warming of about 0.5°C over the past three decades. He goes on to say that additional global warming over the next 50 years, if business-as-usual emissions continue, is expected to be at least 1°C, and that in that case the further shifting of the anomaly distribution will make +3σ anomalies the norm and +5σ anomalies common.
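
The effect of such a shift on the hot tail can be illustrated with the same simple Gaussian picture. The sketch below treats the anomaly distribution as a standard normal that is simply moved to the right: the roughly +1σ present-day shift is taken from Hansen’s figures, while the further ~2σ shift for an additional 1°C of warming is my own inference from the 0.5°C ≈ 1σ relationship above, and any change in the width of the distribution is ignored, so the numbers are indicative only.

```python
# Illustrative only: exceedance probabilities when the seasonal-temperature
# anomaly distribution (in units of the 1951-1980 sigma) shifts to the right.
from scipy.stats import norm

baseline_3sigma = norm.sf(3)        # +3 sigma summers in 1951-1980: ~0.13%

# Present day: a shift of roughly +1 sigma puts the old +3 sigma threshold
# only 2 sigma above the new mean, lifting its probability to ~2.3%, which is
# already more than a factor of 10 above the baseline.
today_3sigma = norm.sf(3, loc=1)

# A further ~1 C of warming (taken here as roughly another 2 sigma of shift)
# would make the old +3 sigma level the norm and +5 sigma events common.
future_3sigma = norm.sf(3, loc=3)   # ~50%
future_5sigma = norm.sf(5, loc=3)   # ~2.3%

print(f"+3σ summers, baseline:           {baseline_3sigma:.2%}")
print(f"+3σ summers, ~+1σ shift (today): {today_3sigma:.1%}")
print(f"+3σ summers, ~+3σ total shift:   {future_3sigma:.0%}")
print(f"+5σ summers, ~+3σ total shift:   {future_5sigma:.1%}")
```

A pure shift of just over 1σ understates the roughly 10% land-area coverage quoted above; reproducing that figure would need a somewhat larger effective shift or a broader distribution, neither of which this simple sketch attempts to capture.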

The chance of summer falling in the “hot” category of 1951-1980 is now about 80%.  This change is sufficiently large that the perceptive person (old enough to remember the climate of 1951-1980) should recognize the existence of climate change.
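
That 80% figure can be checked with the same picture, assuming (as I understand Hansen’s definition) that the “hot” category is the upper tercile of the 1951-1980 distribution, i.e. anomalies above roughly +0.43σ, and taking the present-day shift as a little over +1σ; the exact shift value used below is an illustrative assumption.

```python
# Rough check of the ~80% figure, assuming the "hot" category is the upper
# tercile of the 1951-1980 distribution (threshold ~ +0.43 sigma) and the
# mean has shifted right by a little over one baseline sigma (illustrative).
from scipy.stats import norm

hot_threshold = norm.ppf(2 / 3)     # upper-tercile boundary, ~ +0.43 sigma
assumed_shift = 1.3                 # sigma; consistent with "more than +1σ"

p_hot_baseline = norm.sf(hot_threshold)                  # 1/3 by construction
p_hot_today = norm.sf(hot_threshold, loc=assumed_shift)

print(f"Hot summers, 1951-1980 baseline: {p_hot_baseline:.0%}")   # ~33%
print(f"Hot summers, shifted climate:    {p_hot_today:.0%}")      # ~80%
```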

While the perceptive person may be starting to recognize that things are not what they were (even the NBC News anchorman recently commented after reading a weather story, “It sure wasn’t like that when I was a kid”), it remains unclear how long it will take the general public to recognize that change is underway. An increased incidence of +5σ events may well trigger such a reaction, although such a change may not be apparent until the 2020s or 2030s (assuming a shift of one standard deviation every 15+ years).