As I have posted in recent weeks, my complete book on climate change is now out, under the title ‘Putting the Genie Back: Solving the Climate and Energy Dilemma’. The publisher, Emerald Publishing Limited, has just made the first chapter available online so that you can try before you buy.
Their advertising for the chapter included a quote from President Barack Obama, which gives a hint as to where the title came from, but isn’t the whole story. In May 2014 the President invited several meteorologists to the White House for exclusive interviews on the climate change report the administration had released earlier that week. He summed up the importance of the situation with one simple phrase: “it’s hard to put a genie back in the bottle.” The quote came from a longer passage referring to the climate itself and the possibility of reaching a tipping point. He had said:
“I want to make sure that my children and my grandchildren are able to enjoy a beautiful day like today; that they’re not living in a more dangerous world. […] This is one of those things that is very difficult to recover from. Once we hit a certain tipping point, we don’t know exactly how certain weather patterns could end up resulting in catastrophe. It’s hard to put a genie back in the bottle.”
By that time my first e-book, which the current book also incorporates, was in the late stages of writing and I had already pencilled in the genie metaphor, although with other suggestions coming from friends and family I wasn’t entirely sure. My use was never in relation to the climate itself, but to the use of fossil fuels. That particular genie has ushered in astounding progress for society, but it is also a genie that is hard to challenge. Nevertheless, it can be challenged: carbon capture and storage offers a method of returning the carbon to the subsurface, effectively putting the genie back.
Even though my intended meaning wasn’t quite the same as the President’s, when he also used the phrase in relation to climate change it seemed the decision was made.
Enjoy the pre-read and I hope you buy the book. My proceeds are going to the Centre for Climate and Energy Solutions (C2ES) in Washington D.C.
With the advent of the Paris Agreement, there is a new focus on net zero emissions. This is largely driven by a better understanding of climate science (the importance of cumulative emissions), but also by a line in the Agreement itself which calls for a ‘balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of this century’. This potentially brings into play a set of technologies known as negative emissions technologies or NETs. A NET is a technology which draws down on atmospheric carbon dioxide; perhaps the simplest implementation of this is planting a tree.
NETs are required for two reasons over the long term:

1. Be it local or global, a requirement for net zero emissions will inevitably mean a balance between remaining sources of emissions and the removal of carbon dioxide from the atmosphere as an offset, rather than a world of no emissions at all. Remaining sources could include some continuing use of fossil fuels without dedicated carbon capture and storage (e.g. aviation), or very difficult to manage emissions such as those from the agriculture sector. This requirement may only need NET deployment on a modest scale, simply to match the remaining emission sources. However, if those sources remain significant, then NET deployment would have to be scaled to match.
2. At a global level, cumulative emissions may have exceeded the desired level for a certain temperature goal, in which case there is a need for an overall drawdown of atmospheric carbon dioxide, beyond that which natural sinks might deliver (e.g. continued ocean uptake). This is likely to require very significant deployment of NETs, certainly on the scale of many gigatonnes per annum.
Even before the Paris Agreement, an in-depth look at the IPCC 5th Assessment report would have shown that many of the scenarios consistent with the 2°C goal included a period in the second half of the century when global emissions were negative to achieve a net drawdown on atmospheric carbon dioxide. The reason for needing such a period is that under these scenarios it doesn’t prove possible to limit emissions sufficiently, given the time it takes to re-engineer the energy system in the face of rising demand and legacy infrastructure.
The Paris Agreement has only strengthened the need for negative emissions technologies. With a goal of somewhere between 1.5 and 1.8°C (‘well below’, as the Agreement states, could be interpreted as at least 10% below 2°C), the cumulative emissions of carbon should be some 175 billion tonnes of carbon lower than for a 2°C scenario, or about 640 billion tonnes of CO2. At current levels, that is the equivalent of some 15 years’ emissions. As I illustrated in a pre-Paris post, decades of NET deployment and use may be required to meet this stringent carbon budget.
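The conversion behind those figures is straightforward arithmetic. A quick sketch; note that the ~42 Gt CO2 per annum figure for current global emissions is my own assumption, not taken from the post:

```python
# Back-of-the-envelope check of the carbon budget figures quoted above.
C_TO_CO2 = 44.0 / 12.0                  # mass ratio of CO2 to carbon

budget_gap_gtc = 175.0                  # Gt carbon: 2°C budget vs 'well below' budget
budget_gap_gtco2 = budget_gap_gtc * C_TO_CO2

annual_emissions_gtco2 = 42.0           # assumption: current global emissions, Gt CO2/yr
years_of_emissions = budget_gap_gtco2 / annual_emissions_gtco2

print(f"{budget_gap_gtco2:.0f} Gt CO2")      # ~642 Gt, i.e. the ~640 quoted
print(f"{years_of_emissions:.0f} years")     # ~15 years, as quoted
```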
DAC: Direct air capture of carbon dioxide from ambient air by engineered chemical reactions. This would then become DACS (or DACCS) if geological storage were involved.
EW: Enhanced weathering of minerals, where natural weathering that removes carbon dioxide from the atmosphere is accelerated and the products stored in soils, or buried on land or in the deep ocean.
AR: Afforestation and reforestation to fix atmospheric carbon in biomass and soils.
OU: Manipulation of carbon uptake by the ocean, either biologically or chemically.
AS: Altered agricultural practices, such as increased carbon storage in soils.
BC: Converting biomass to recalcitrant biochar, for use as a soil amendment.
The article focusses on BECCS (bioenergy with carbon capture and storage), DAC, EW and AR, and gives a detailed breakdown of the global impacts of these technology areas in terms of water, energy needs, land use and so on. It is clear that there is no silver bullet to rely on. While BECCS and DAC can potentially be deployed at scale and make a material difference to atmospheric carbon dioxide (>3 Gt carbon per annum by 2100, or 10+ Gt CO2), BECCS requires significant land and water use (but is a net energy producer), whereas DAC is a big energy user. The latter is also deemed to be very expensive to implement. EW, on the other hand, just doesn’t make the grade in terms of scale. That leaves AR, which is certainly scalable, but only very large-scale deployment occupying huge swathes of land would make a significant difference to atmospheric carbon dioxide.
The paper ends with the rather sobering recognition that a failure of NETs to deliver expected mitigation in the future due to any combination of the biophysical and economic limits examined, leaves the world with no ‘Plan B’. Clearly there is much more to be done to commercialise and deliver a sustainable pathway for this family of technologies.
Two recent and separate articles in Foreign Affairs highlight different routes forward for tackling the climate issue. One, by Michael Bloomberg, argues that the mitigation solution increasingly lies with cities (this isn’t just about city resilience) and the other puts the challenge squarely in front of the business community.
These are just two in a salvo of pre-Paris articles that seek to direct the negotiations towards a solution space, including some by me and other colleagues arguing the case for carbon pricing systems. The articles reminded me of a similar piece from 2010, the Hartwell Paper, in which a group of UK economists cast the climate issue as a ‘wicked problem’, but still went on to propose a very specific solution (a big technology push funded by carbon taxes). That paper also built its argument on the back of the Kaya Identity, which I have argued simplifies the emissions problem such that it can lead to tangential solutions that may not deliver the necessary stabilization in atmospheric carbon dioxide. Nevertheless, there is still merit in focusing on a specific way forward – at least something useful might then get done.
But the description of the climate problem as ‘wicked’ is one that deserves further thought. The use of the word in this context differs from its generally accepted meaning; it pertains instead to the immense difficulty of the problem itself. Wikipedia gives a good description:
A problem that is difficult or impossible to solve because of incomplete, contradictory, and changing requirements that are often difficult to recognize. The use of the term “wicked” here has come to denote resistance to resolution, rather than evil. Moreover, because of complex inter-dependencies, the effort to solve one aspect of a wicked problem may reveal or create other problems.
It is also important to think about which problem we are actually trying to solve. For example, it may turn out that the issue of climate change is immensely more difficult to solve than the issue of carbon dioxide emissions. There is now good evidence that emissions can be brought down to near zero levels, but this doesn’t necessarily resolve the problem of a changing climate. Although warming of the climate system is being driven by increasing levels of carbon dioxide in the atmosphere, the scale on which anthropogenic activities are now conducted can also impact the climate through different routes. Moving away from fossil fuels to very large scale production of energy through other means is a good illustration of this. In a 2010 report, MIT illustrated how very large scale wind farms could result in some surface warming because the turbulent transfer of heat from the surface to the higher layers is reduced as a result of reduced surface kinetic energy (the wind). This is because that energy is converted to electricity. This is not to argue that we shouldn’t build wind turbines, but rather to highlight that with a population of 7-10 billion people all needing energy for a prosperous lifestyle, society may inadvertently engage in some degree of geoengineering (large-scale manipulation of an environmental process that affects the earth’s climate) simply to supply it.
Even narrowing the broader climate issue to emissions, the problem remains pretty wicked. Inter-dependencies abound, such as when significant volumes of liquid fuels may be supplied by very large scale use of biomass or when efficiency drives an increase in energy use (as it has done for over 100 years), rather than the desired reduction in emissions.
An approach to managing wicked problems (Tim Curtis, University of Northampton) first and foremost involves defining the problem very succinctly: locking down the problem definition, or developing a description of a related problem that you can solve and declaring that to be the problem. Objective metrics by which to measure the solution’s success are also very important. In the field of climate change and the attempts by the Parties to the UNFCCC to resolve it, this is far from the course currently being taken. There is immense pressure to engage in sustainable development, end poverty, improve access to energy, promote renewable technologies, save forests, solve global equity issues and use energy more efficiently. Although these are all important goals, they are not sufficiently succinct and defined to enable a clear pathway to resolution, nor does solving them necessarily lead to restoration of a stable climate. The INDC-based approach allows for almost any problem to be solved, so long as it can be loosely linked to the broad categories of mitigation and adaptation. The current global approach may well be adding to the wickedness rather than simplifying or even avoiding it.
The short article referenced above concludes with a very sobering observation:
While it may seem appealing in the short run, attempting to tame a wicked problem will always fail in the long run. The problem will simply reassert itself, perhaps in a different guise, as if nothing had been done; or worse, the tame solution will exacerbate the problem.
In climate change terms, this translates to emissions not falling as a result of current efforts, or, even if they do fall somewhat, to no measurable impact on the continuing rise in atmospheric carbon dioxide levels.
But that is not to say we should give up. The counter to this observation is to define a clear objective related to the wicked problem being confronted, declare that there are just a few possible solutions, and focus on selecting from among them. For me, that comes down to implementing a cost for emitting carbon dioxide through systems such as cap-and-trade or carbon taxation. As such, I am about to release a second book in my Putting the Genie Back series, this one titled Why Carbon Pricing Matters. It will be available from mid-September but can be pre-ordered now.
There are many books and thousands of reports on climate change, carbon economics, energy transformation and the like, but few encapsulate the issue as well as a recently released book by Mike Berners-Lee and Duncan Clark, The Burning Question. Judging by the recommendation on the cover, even Al Gore liked it.
Rather than speculate on the potential severity of climate events or try to convince readers that simple changes in consumer behaviour and green, job-creating investment will solve everything, the book takes a thought-provoking but dispassionate look at the global energy system. The authors discuss the role of fossil fuels and the carbon emission limits that we know we should meet, and set out to explain the rock and the hard place that we find ourselves between. The rock in this case is the trillion-tonne carbon limit for cumulative emissions over time, and the hard place is the abundance of fossil fuels, the rate at which we use them and the relative ease with which more becomes available as demand rises.
Berners-Lee and Clark present a compelling set of stories which show how fossil fuels dominate the global energy market, why it is proving almost impossible to displace them (on a global basis) and why strategies such as improving energy efficiency and deploying renewables are not effective approaches to try and limit global emissions. In fact they make the point that in some instances the reverse happens – emissions just rise faster.
The tag line on the cover includes the teaser “So how do we quit?” (using fossil fuels, that is). Do they really know? As the book unfolds and the problem they describe mounts in both complexity and difficulty, there is almost the feeling of a thrilling ending around the corner. SPOILER ALERT. Sadly this is not quite the case, but they do give some useful advice for policy makers trying to get to grips with the issue, and the book gives the reader a very different perspective on the energy-climate conundrum (hopefully one that readers of this blog have picked up over time, but here it is all in one place).
I assume that for similar reasons to my own line of thinking (but after beating around the bush about it for 181 pages) they do finally land on a key thought:
In the course of writing this book we have come to think that the most undervalued technology in terms of unlocking international progress on climate change is carbon capture – both traditional CCS for point sources such as power plants and more futuristic ambient air capture technologies for taking carbon directly out of the atmosphere.
It would appear that An Inconvenient Truth and CCS are indeed inextricably linked. Clark and Berners-Lee don’t go so far as to argue that CCS is the convenient answer, but their message on CCS is a strong one. Nevertheless, geoengineering makes a surprise entrance at the end!
Overall, this is an excellent discussion which is both easy to read and hugely informative. It is well worth putting on the summer reading list.
We tend to think of climate change as a relatively modern issue, perhaps marked by James Hansen’s testimony before Congress in the summer of 1988. The terms “climate change” and “global warming” hardly appear in the literature before 1975 and didn’t really take off until the mid-1980s.
There is of course Svante Arrhenius, who published on the role of carbon dioxide in 1903, and even some others before that. There was certainly research on the issue throughout the 20th Century, including the work of Keeling and Callendar. But this week I was prompted to read a bit about the Revelle Factor (ocean uptake of CO2, more basic research in the 20th Century) and came across the following publication, endorsed and signed in 1965 by then President Lyndon Johnson and produced by the President’s Science Advisory Committee. It is a review of the then-current state of the environment, with a focus on pollutants. To my surprise, contained within it is a lengthy chapter on the rising level of CO2 in the atmosphere from the use of fossil fuels and its impact on global temperature. Was this the earliest political prompt on the issue from the science community?
In the days before computer models, climate lobbyists, sceptics, warmists and the pseudo-scientists who claim the deep and insightful knowledge of atmospheric physics and chemistry that the atmospheric physicists themselves “apparently don’t have” (a.k.a. a variety of journalists, hobbyists, lawyers, political figures and others), here is a first (??) thoughtful introduction and analysis by the science community, published by the United States Government, on an issue that has become paramount today. It makes for interesting reading.
The paper looks at the atmospheric build-up of CO2, the likely further build-up by 2000 as fossil fuels continue to be consumed, expected temperature rises and the possible impact on global sea levels as ice caps melt, and concludes:
The climatic changes that may be produced by the increased CO2 content could be deleterious from the point of view of human beings.
Perhaps the most surprising aspect of the chapter on Atmospheric Carbon Dioxide is the discussion on a geoengineering solution. The above conclusion goes on to say:
The possibilities of deliberately bringing about countervailing climatic changes therefore need to be thoroughly explored. A change in the radiation balance in the opposite direction to that which might result from the increase of atmospheric CO2 could be produced by raising the albedo, or reflectivity, of the earth…
Nearly fifty years later, not a great deal has been done in response to all this, although climate science has certainly advanced. But in his opening remarks, President Johnson called for the “highest priority of all to increasing the numbers and quality of the scientists and engineers working on problems related to the control and management of pollution”. The fact that some in our society have chosen to demonize these people and even mock their work is a sad state of affairs. Tackling climate change means we need more scientists, with society fully behind young people focusing on science, technology, engineering and mathematics. Governments too can play a bigger role, not only in recognizing the problem as Johnson did, but by enacting enabling policy measures and delivering public funding to support progress in research and development.
I was fortunate to be invited to attend CIGI 10 just outside Toronto, Canada. The annual “deep dive” policy discussion is held by the Centre for International Governance Innovation, a policy think-tank founded by Jim Balsillie, co-CEO of Research in Motion (a.k.a. BlackBerry), and this year the focus was global governance of the climate. While there was much discussion on bilateral vs. multilateral, UNFCCC or G20 and so on, one particular discussion focused on the role of sulphur in the atmosphere.
The discussion started with the current reality of sulphur being artificially pumped into the troposphere through the worldwide use of High Sulphur Fuel Oil (HSFO) in ships (and of course from other sources, such as coal-fired power stations not fitted with scrubbers). The combustion of this fuel powers much of the world’s ocean-going fleet, and the sulphur leaves the ship through the funnel. HSFO contains some 3.5% sulphur, so a modern container ship travelling from Shanghai to Southampton via the Suez Canal will eject about 30 tonnes of sulphur into the atmosphere, along with some 3,000 tonnes of CO2. The CO2 adds to the growing accumulation of this gas in the atmosphere, but the sulphur remains in the atmosphere for just a few weeks in aerosol form before dropping out. Nevertheless, as a result of all the marine activity and other sources of sulphur, there is a net suspension of sulphur in the atmosphere above us. This cools the atmosphere by scattering incoming radiation, offsetting some of the warming impact of CO2 and other greenhouse gases.
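Those voyage figures can be sanity-checked with some simple arithmetic. This is only a sketch: the tonnage of fuel burned on the voyage is my own assumption (not from the discussion), while the CO2 emission factor of roughly 3.11 tonnes of CO2 per tonne of heavy fuel oil is the standard IMO figure:

```python
# Rough consistency check of the Shanghai-Southampton voyage figures.
fuel_burned_t = 950.0          # assumption: tonnes of HSFO burned on the voyage
sulphur_fraction = 0.035       # 3.5% sulphur content, as quoted
co2_per_t_fuel = 3.11          # t CO2 per t heavy fuel oil (standard IMO factor)

sulphur_t = fuel_burned_t * sulphur_fraction
co2_t = fuel_burned_t * co2_per_t_fuel

print(f"~{sulphur_t:.0f} t sulphur")   # ~33 t, in line with the ~30 t quoted
print(f"~{co2_t:.0f} t CO2")           # ~2,950 t, in line with the ~3,000 t quoted
```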
But sulphur also has a negative effect on local and regional air quality, so the International Maritime Organisation (IMO) has moved to limit sulphur in marine fuel. A recent analysis by Winebrake et al. (2009) discusses the climate impact of the marine fuel sulphur specification reducing to 0.5% globally – a potential end goal of the current IMO limits. Whereas the global annual average cooling effect of shipping is currently some -0.6 W/m2 (compared to the additional radiative forcing from post-industrial CO2, now approaching 2 W/m2), this is shown to reduce to -0.3 W/m2 in the case of a global 0.5% sulphur specification – in other words, another 0.3 W/m2 of warming.
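The forcing arithmetic in that paragraph can be made explicit; the values are as quoted above, and the sketch simply takes the difference of the two shipping figures and compares it to the approximate CO2 forcing:

```python
# The warming implied by cleaning up marine fuel, from the quoted figures.
shipping_cooling_now = -0.6      # W/m2, global annual average cooling from shipping
shipping_cooling_low_s = -0.3    # W/m2, with a global 0.5% sulphur specification

warming_from_cleanup = shipping_cooling_low_s - shipping_cooling_now
print(f"+{warming_from_cleanup:.1f} W/m2")   # the extra 0.3 W/m2 of warming

co2_forcing = 2.0                # W/m2, approximate post-industrial CO2 forcing
print(f"{warming_from_cleanup / co2_forcing:.0%} of current CO2 forcing")
```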
But this was just the start of the discussion. The real issue was the potential role of sulphur in deliberately managing the global temperature – a practice more commonly referred to as geoengineering. Injecting sulphur into the troposphere at sea level has far less impact than doing the same in the stratosphere: for the same amount of surface cooling, approximately one twentieth of the amount of sulphur is required at 25,000 metres, because the half-life of the aerosol suspension is some 18 months at that height rather than just the few weeks seen in the lower atmosphere.
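The "one twentieth" figure follows from a simple steady-state argument: the suspended aerosol burden is roughly the injection rate multiplied by the residence time, so the injection rate needed for a given burden scales inversely with residence time. A sketch, assuming ~4 weeks for the tropospheric "few weeks" and treating the quoted half-life as a proxy for residence time:

```python
# Ratio of injection rates needed for the same suspended aerosol burden.
WEEKS_PER_MONTH = 52.0 / 12.0

stratosphere_months = 18.0                    # aerosol half-life at ~25 km, as quoted
troposphere_months = 4.0 / WEEKS_PER_MONTH    # assumption: "a few weeks" taken as 4

# Burden = injection rate x residence time, so for equal burden the
# stratospheric rate is the inverse ratio of the residence times.
injection_ratio = troposphere_months / stratosphere_months
print(f"stratospheric injection needed: ~1/{round(1 / injection_ratio)} of tropospheric")
```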
An indicative calculation has shown that a fleet of 150 aircraft injecting sulphur into the stratosphere on a continuous basis could potentially offset the warming associated with a doubling of CO2 in the atmosphere. The cost of this is estimated to be no more than $10 billion per annum and perhaps quite a bit less.
So began the real debate – the implications of being able to manage atmospheric warming for an amount so small that even some individuals could undertake the experiment, or perhaps a group such as the small island states in defense of their territory. For major emitters this would be a paltry sum, far less than some of the direct mitigation options. But if such a practice were undertaken, what then for the global endeavors to reduce emissions? Would we just give up trying? And while some amount of cooling might be achieved, phenomena such as ocean acidification would continue. Who should decide on such weighty issues and what if one nation or group of nations decided to conduct the practice unilaterally? One participant asked if the practice might even be in breach of Article 2 of the Framework Convention.
In the short time we had there was of course no resolution to the issues raised, but it was suggested that a global aerosol management framework was as important to the climate discussions as the greenhouse gas framework slowly being formulated or the CFC framework that exists under the Montreal Protocol. But no such framework is seriously under discussion. I won’t be so bold as to suggest answers to the questions raised, or even to attempt to list the dozens of other ethical and moral questions raised by this topic. But it certainly did provide a lively start to the Sunday morning portion of the conference!