Tag Archives: IEA

Shale Gas (Part III): A Brave New World?

In this post, we switch from the shale gas outlook in the US to the global picture. Again, the starting point is a forecast of total energy consumption into the future, followed by a discussion of how much gas would be needed to produce a true energy transformation. The latest set of forecasts we have is from BP’s Energy Outlook 2012, released this January. The report can be found here.

Interestingly, there is not that much difference between the aggregate energy numbers produced by the major organisations that predict energy supply and demand into the future (IEA, EIA, OPEC, BP and Exxon Mobil). I think that this is because they generally start with a GDP growth (and energy intensity) assumption and then work backwards to produce supply and demand forecasts. (The question of whether growth drives energy or energy drives growth is a topic for another post.)


The Chief Investment Officer of JP Morgan Comes Out of the Peak Oil Closet

At the heart of the cornucopian view of energy abundance lies the belief that technology will overcome any natural resource constraint. The poster child for this view of the world is Moore’s Law, whereby computing power follows the allegory of a grain of wheat on a chessboard (so promising untold riches for us all). Interestingly, Michael Cembalest, the CIO of JP Morgan—surely the antithesis of The Archdruid—displayed a large dose of scepticism over this technological nirvana in a recent report that commenced by highlighting a few famous predictions of our energy future (in so doing, Cembalest makes the point that Moore’s Law is the technological exception, not the rule):

A Big Number Gets Tweaked

If I had to nominate candidates for the title of the two most important numbers in the world, they would have to be 1) the atmospheric concentration of CO2 (which you can find here) and 2) the climate sensitivity of the global mean temperature to a doubling of CO2.

As esoteric as this discussion may appear, both numbers rank above such economic heavyweights as inflation, GDP growth and government debt-to-GDP ratios in determining the life outcomes of my kids (in my humble opinion). Basically, bad things happen as CO2 jumps and temperature rises (see here, here and here).

Now there is a lot more I would like to say about atmospheric CO2 concentration, but that will have to wait for future posts. Today, I want to focus on climate sensitivity because an academic paper in the journal Science has just been released (here) that claims the numbers we have been using up to now for climate sensitivity have been too high.

But before I quote the abstract of the new paper, it is useful to restate the existing consensus from the Intergovernmental Panel on Climate Change (IPCC)’s Assessment Report 4 (AR4) published in 2007. It can easily be found on page 12 of the Summary for Policy Makers here. The key paragraph is as follows:

The equilibrium climate sensitivity is a measure of the climate system response to sustained radiative forcing. It is not a projection but is defined as the global average surface warming following a doubling of carbon dioxide concentrations. It is likely to be in the range 2°C to 4.5°C with a best estimate of about 3°C, and is very unlikely to be less than 1.5°C. Values substantially higher than 4.5°C cannot be excluded, but agreement of models with observations is not as good for those values. Water vapour changes represent the largest feedback affecting climate sensitivity and are now better understood than in the TAR. Cloud feedbacks remain the largest source of uncertainty.

Now we turn to the new academic paper by Schmittner et al. and—after noting that a temperature change of one kelvin (K) is equivalent to a change of one degree Celsius (°C)—we read this:

Assessing impacts of future anthropogenic carbon emissions is currently impeded by uncertainties in our knowledge of equilibrium climate sensitivity to atmospheric carbon dioxide doubling. Previous studies suggest 3 K as best estimate, 2 to 4.5 K as the 66% probability range, and nonzero probabilities for much higher values, the latter implying a small but significant chance of high-impact climate changes that would be difficult to avoid. Here, combining extensive sea and land surface temperature reconstructions from the Last Glacial Maximum with climate model simulations, we estimate a lower median (2.3 K) and reduced uncertainty (1.7 to 2.6 K 66% probability). Assuming paleoclimatic constraints apply to the future as predicted by our model, these results imply lower probability of imminent extreme climatic change than previously thought.

Very simplistically, the paper reconstructs the temperature record of the Last Glacial Maximum (LGM, the height of the last ice age) 20,000 years ago. The findings suggest that the LGM was between 2 and 3 degrees Celsius cooler than the present, against current consensus estimates of around 5 degrees. The authors then matched this temperature against the greenhouse gas concentrations of that time. In sum, for the given difference in CO2 with the present, they got less bang for the buck in terms of CO2 impact on temperature than what climate models currently suggest for the future.

If we believe the new findings, then the best estimate of climate sensitivity should be reduced from 3 degrees Celsius for a doubling of CO2 to 2.3 degrees—and the uncertainty range also narrows. Just to put things in context, the pre-industrial concentration of CO2 was 280 parts per million and we are now at around 390 ppm, or up about 40%. Now the IPCC’s AR4 also has this to say:

450 ppm CO2-eq corresponds to best estimate of 2.1°C temperature rise above pre-industrial global average, and “very likely above” 1°C rise, and “likely in the range” of 1.4–3.1°C rise.

I’ve highlighted it before in another post, but I will highlight it again here: CO2 and CO2 equivalent are different concepts. However, at the current time, non-CO2 atmospheric forcing effects roughly cancel out (for a more detailed discussion, see here), so we are in the happy position of being able to capture what is happening by looking at the CO2 number alone—for the time being.
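Since warming scales with the logarithm of CO2 concentration, we can translate these sensitivity numbers into implied warming directly. A minimal sketch, using the standard approximation that equilibrium warming equals sensitivity times log2(C/C0); note this gives the eventual equilibrium response, not the warming realised to date:

```python
from math import log2

def equilibrium_warming(sensitivity_c, co2_ppm, baseline_ppm=280):
    """Equilibrium warming in degrees Celsius for a given climate sensitivity.

    Uses the standard logarithmic approximation: sensitivity is the warming
    per doubling of CO2, so warming scales with log2(C / C0).
    """
    return sensitivity_c * log2(co2_ppm / baseline_ppm)

# Warming eventually implied by today's ~390 ppm under each estimate
print(round(equilibrium_warming(3.0, 390), 2))   # IPCC AR4 best estimate
print(round(equilibrium_warming(2.3, 390), 2))   # Schmittner et al. estimate
```

On these numbers, the roughly 40% rise in CO2 so far already commits us to around 1.4 degrees of equilibrium warming at the consensus sensitivity, or around 1.1 degrees at the lower Schmittner et al. figure.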

Moving on, we should note that the international community has decided that 2 degrees Celsius of warming marks the point at which we will experience ‘dangerous’ climate change. This is in the opening paragraph of the Copenhagen Accord:

We underline that climate change is one of the greatest challenges of our time. We emphasise our strong political will to urgently combat climate change in accordance with the principle of common but differentiated responsibilities and respective capabilities. To achieve the ultimate objective of the Convention to stabilize greenhouse gas concentration in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system, we shall, recognizing the scientific view that the increase in global temperature should be below 2 degrees Celsius, on the basis of equity and in the context of sustainable development, enhance our long-term cooperative action to combat climate change.

To recap, we have a best estimate of climate sensitivity of 3 degrees. And based on this number,  atmospheric CO2-equivalent should be capped at 450 ppm to hold temperature rise to around 2 degrees. This, in turn, is because 2 degrees of warming is deemed the level at which ‘dangerous’ climate change develops.

Now what happens if the 3 degree number is incorrect and should be 2.3 degrees? Well, the first reaction is to think that the 450 ppm ‘line in the sand’ for dangerous climate change goes out the window. Further, if this CO2 concentration number goes out the window, so do all the numbers for ‘extremely dangerous’ climate change, and for that matter ‘catastrophic’ climate change. If so, the carbon emissions paths associated with different levels of warming as talked about in my post here also have to be radically revised (click for larger image below, see here for the original article).

In addition, the deadline for the cessation of new fossil fuel based power plant installation, calculated by the International Energy Agency (IEA) and discussed in my last post here, would also have to be reworked.
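To see just how far the line in the sand might move, we can invert the same logarithmic relationship and ask what concentration corresponds to 2 degrees of warming under each sensitivity estimate. A rough sketch only: it treats the 2-degree threshold as an equilibrium response and glosses over the CO2 versus CO2-eq distinction:

```python
def ppm_ceiling_for_warming(target_c, sensitivity_c, baseline_ppm=280):
    """CO2 concentration at which equilibrium warming reaches target_c.

    Inverts delta_T = S * log2(C / C0) to give C = C0 * 2**(delta_T / S).
    """
    return baseline_ppm * 2 ** (target_c / sensitivity_c)

print(round(ppm_ceiling_for_warming(2.0, 3.0)))   # ~444 ppm: close to the 450 line
print(round(ppm_ceiling_for_warming(2.0, 2.3)))   # ~512 ppm: the line moves out
```

In other words, a sensitivity of 3 degrees is roughly what makes 450 ppm the 2-degree ceiling; drop sensitivity to 2.3 degrees and the same arithmetic pushes the ceiling out past 500 ppm.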

However, some caution is in order. First, this is only one paper amongst many that have tackled the question of climate sensitivity from a variety of angles; it should be judged within the context of the total body of work. Further, as with all good science, its assumptions will come under intense scrutiny to check whether the methodology is correct. Unlike much of the climate-skeptic blog commentary, the authors of the paper fully admit the tentative nature of their findings:

“There are many hypotheses for what’s going on here. There could be something wrong with the land data, or the ocean data. There could be something wrong with the climate model’s simulation of land temperatures, or ocean temperatures. The magnitudes of the temperatures could be biased in some way. Or, more subtly, they could be unbiased, on average, but the model and observations could disagree on where the cold and warm spots are, as I alluded to earlier. Or something even more complicated could be going on.

Until the above questions are resolved, it’s premature to conclude that we have disproven high climate sensitivities, just because our statistical analysis assigns them low probabilities.”

The excellent site Skeptical Science has a great post on the Schmittner et al. paper here. After going through the technical challenges in considerable depth, they also note a critical and inconvenient truth if the article’s findings are correct:

In short, if Schmittner et al. are correct and such a small temperature change can cause such a drastic climate change, then we may be in for a rude awakening in the very near future, because their smaller glacial-interglacial difference would imply a quicker climate response to a global temperature change, as illustrated in Figure 4.

As Figure 4 illustrates, although the Schmittner et al. best estimate for climate sensitivity results in approximately 20% less warming than the IPCC best estimate, we also achieve their estimated temperature change between glacial and interglacial periods (the dashed lines) much sooner. The dashed lines represent the temperature changes between glacial and interglacial periods in the Schmittner (blue) and IPCC (red) analyses. If Schmittner et al. are correct, we are on pace to cause a temperature change of the magnitude of a glacial-interglacial transition – and thus likely similarly dramatic climate changes – within approximately the next century.

In the run-up to the publication of the IPCC’s AR5 report in 2013, it will be critical to see whether a new consensus number emerges that is different from that of the AR4 report of 2007—a consensus that takes all the new findings made over the last few years into consideration. As this number changes, so will the world.

The IEA’s Closing Window

One small mercy in the global warming debate is that the International Energy Agency (IEA) has not become a closet supporter of the carbon lobby under the mantra “drill, baby, drill”. The IEA, basically a rich country club that focuses on energy security and research, has come under criticism over the years for downplaying the potential of renewables and underplaying the threat from peak oil, but it can’t be accused of neglecting the issue of climate change (or at least not recently).

Every year, around November, the IEA issues its flagship publication, the World Energy Outlook (WEO). Jump back a decade to see the WEO’s view of the world in 2001 here. That publication mentions both renewables and carbon capture and sequestration (CCS) technology—but there is no explicit mention of climate change. For renewables, an economic case exists for their introduction irrespective of global warming (if you can get down to the right price point). But the WEO was being rather coy with CCS; it is a pretty pointless technology if you don’t already buy into the threat posed by climate change. So at the time, climate change could be characterised as the energy issue ‘that dares not speak its name’.

As the years rolled by, the WEO became rather less reserved about the issue, culminating in a sea change in its approach in the 2006 report, whose Executive Summary kicked off with this sentence:

The world is facing twin energy-related threats: that of not having adequate and secure supplies of energy at affordable prices and that of environmental harm caused by consuming too much of it.

And, at last, we are out of the closet:

Safeguarding energy supplies is once again at the top of the international policy agenda. Yet the current pattern of energy supply carries the threat of severe and irreversible environmental damage – including changes in global climate.

The report then went on to note that its mandate had been revised to “advise on alternative energy scenarios and strategies aimed at a clean, clever and competitive energy future” and that “greenhouse-gas emissions would follow their current unsustainable paths through to 2030 in the absence of new government action”.

The following year, the 2007 report (here) explained the scale of the  challenge in more detail:

According to the best estimates of the Intergovernmental Panel on Climate Change, this concentration would correspond to an increase in average temperature of around 3°C above pre-industrial levels. In order to limit the average increase in global temperatures to a maximum of 2.4°C, the smallest increase in any of the IPCC scenarios, the concentration of greenhouse gases in the atmosphere would need to be stabilised at around 450 ppm.

To achieve this, CO2 emissions would need to peak by 2015 at the latest and to fall between 50% and 85% below 2000 levels by 2050.

We estimate that this would require energy-related CO2 emissions to be cut to around 23 Gt in 2030 – 19 Gt less than in the Reference Scenario and 11 Gt less than in the Alternative Policy Scenario. In a “450 Stabilisation Case”, which describes a notional pathway to achieving this outcome, global emissions peak in 2012 at around 30 Gt.

In the subsequent WEO in 2008, an entire section of the report (the full report can be found here) was devoted to setting out the energy production paths required to hit both 450 ppm and 550 ppm CO2-equivalent targets. By providing such depth of analysis, the IEA positioned itself as a vital source of information for anyone trying to understand the challenge of climate change.

Fast forward to November 2011, and we have another 4 years of data compared to what the IEA had in hand back in 2007. So how well are we doing in terms of achieving their “450 Stabilisation Case” (stabilisation at 450 parts per million of  CO2 equivalent)?

As I stressed in my ‘Odds of Cooking the Kids’ posts, anyone on the planet can answer this kind of question by periodically checking the fossil fuel carbon emission data releases found here. The most up-to-date data we have is the advance estimate for fossil fuel carbon emissions in 2010, which is 9.1 giga tonnes of carbon. Translate this into CO2 (remembering to multiply by 3.67) and we get around 33.5 giga tonnes of CO2 emitted in 2010. So the 2007 report’s projected global emissions peak of 30 giga tonnes in 2012 looks a bit of a stretch goal!
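The carbon-to-CO2 conversion is nothing more than a molecular-weight ratio: a CO2 molecule (molecular mass 44) contains one carbon atom (atomic mass 12). A quick sketch of the arithmetic; the small gap to the 33.5 figure above comes from rounding in the 9.1 estimate:

```python
C_TO_CO2 = 44.0 / 12.0  # mass ratio of CO2 to carbon, ~3.67

carbon_gt_2010 = 9.1               # CDIAC advance estimate, giga tonnes of carbon
co2_gt_2010 = carbon_gt_2010 * C_TO_CO2
print(round(co2_gt_2010, 1))       # ~33.4 giga tonnes of CO2
```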

But if we go back to my post here, we should note that such peaks are not cast in stone: we have a trade-off between early CO2 emission peaks and slow reductions, and late peaks and aggressive reductions.

This brings us to the current World Energy Outlook for 2011, in which some new knowledge is brought to the table in the form of energy production inertia and the price of production. (Note that for past issues of the WEO it is possible to access the full reports for free, but for the most recent issue only the Executive Summary  (here) is available without payment.)

Let’s just accept for the moment that 450 ppm CO2-equivalent is our atmospheric goal, and that this will likely restrict the rise in global mean temperature to below 2 degrees Celsius above pre-industrial levels—the level deemed dangerous. Then, ignoring cost, we might be indifferent between the three pathways in the chart above. But, of course, cost does matter.

Now it could be argued that the cost of renewables will come down as technology advances, suggesting we should hang back on their deployment. However, the IEA pokes holes in this line of reasoning by emphasising the large upfront sunk costs of energy production infrastructure and the long life of such plant once it is built. In short, once you have installed a coal-fired power station, it makes little sense to decommission it before the end of its useful life.

Imagine that you buy a brand-new SUV with the intention of going green at some stage in the future by buying an all-electric Nissan Leaf. Assuming there is no second-hand market (which there isn’t for power stations), the time to switch cars from an economic perspective is when the useful life of your SUV has come to an end. Now you could make the trade three years after buying your SUV, but—remembering there is no second-hand market—that would mean trashing a vehicle with many years of useful life left and considerable economic value. This is the logic the IEA follows. In their words:

Four-fifths of the total energy-related CO2 emissions permissible by 2035 in the 450 Scenario are already “locked-in” by our existing capital stock (power plants, buildings, factories, etc.). If stringent new action is not forthcoming by 2017, the energy-related infrastructure then in place will generate all the CO2 emissions allowed in the 450 Scenario up to 2035, leaving no room for additional power plants, factories and other infrastructure unless they are zero-carbon, which would be extremely costly.

In sum, to hit the 450 CO2-equivalent target, we would have to stop building new fossil fuel based energy capacity completely after 2017. And if we maintain the current rate of fossil fuel based energy production installation, we will head for far higher degrees of warming as can be seen in the chart below.
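The lock-in logic can be made concrete with a back-of-envelope calculation: once a plant is built, its lifetime emissions are effectively committed. The plant parameters below are illustrative assumptions of mine, not IEA figures:

```python
def committed_emissions_mt(capacity_gw, capacity_factor, tco2_per_mwh, remaining_years):
    """Lifetime CO2 committed by an existing plant, in mega tonnes.

    Annual output in MWh = GW * 8760 hours * capacity factor * 1000 (GW -> MW).
    """
    annual_mwh = capacity_gw * 8760 * capacity_factor * 1000
    return annual_mwh * tco2_per_mwh * remaining_years / 1e6

# Hypothetical 1 GW coal plant, 80% capacity factor, 0.9 t CO2 per MWh,
# 40 years of life remaining: roughly 250 Mt of CO2 already "locked in"
print(round(committed_emissions_mt(1.0, 0.8, 0.9, 40)))
```

Multiply that by the hundreds of gigawatts of new coal capacity installed over the past decade and the IEA’s four-fifths "locked-in" figure stops looking surprising.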

The major culprit is, of course, coal, which has been taking the lion’s share of new capacity installation. Indeed, over the last decade new coal plant has been almost equivalent to all other types of energy generation capacity put together.

The gloomy conclusion is that the 450 ppm CO2-eq target, and thus the 2 degrees of warming target, is already almost out of reach. The key question then is: “how far could we overshoot, looking at the cost dynamics of energy installation alone?” This is a topic I will return to in future posts.

Odds of Cooking the Kids: Part 3

This is the final post of this particular series. In my previous post, we set out a framework that allows us to get a sense of whether the world will warm to the degree that our kids’ lives will be very different from ours. And when I say ‘different’, I don’t mean ‘different’ as in access to different technology, but ‘different’ as in living in a different economic and political world.

In many decisions in life, we use simplistic rules of thumb as starting points for decision-making. In the same manner, I have suggested that the Intergovernmental Panel on Climate Change (IPCC)’s benchmark of 2 degrees Celsius of warming (from pre-industrial levels) provides one such rule of thumb: it tells us what degree of warming is likely to have a material impact on food production and economic growth. This has been termed ‘dangerous’ climate change.

In a similar vein, ‘extremely dangerous’ climate change has been associated with 3 or 4 degrees Celsius of warming—a degree of warming that would suggest socio-economic instability; that is, the failure of sovereign states, and possibly a reworking of the post-war international political order.

Rules of thumb are by definition approximations. We don’t exactly know how resilient world food production will be in a 2-degree warmer world; we don’t know whether economies can grow enough to easily adapt to a 2 degree world—or, indeed, whether it is actually possible to inoculate ourselves against climate change through economic growth as economists such as William Nordhaus suggest (I think not, others would beg to differ, we shall see).

What we do know, however, is that a 2-degree world poses an appreciable risk to our welfare. We could blindly go about our everyday lives (as most of us are currently doing) and get lucky: the world economy grows, technology advances, we prove able to cope with climate change, and 2 degrees of warming proves the peak. But then again, maybe not.

But will we be able to get a better handle on how bad things could get as data comes in over the next decade? The answer to this is ‘yes’ —up to a point, but we will never escape uncertainty entirely. Yet, the human condition is one of decision making under uncertainty. This is why the climate skeptic refrain that we don’t have enough certainty to make any decisions over fossil fuel emissions is so absurd. How much certainty do we have of future outcomes when choosing a college, selecting a spouse or applying for a job? The answer is that we take a stab at future outcomes based on the information currently available. As more information comes in, we can then revise our view of what the future will bring—but we will never achieve certainty.

So in the spirit of empowerment, I suggested in my last post that you periodically check to see whether our annual fossil fuel carbon emissions are consistent with a world that will stay within 2 degrees of warming or suggest a world where we overshoot this target. The Carbon Dioxide Information Analysis Center’s advance estimate for 2010 fossil fuel carbon emissions was 9.1 giga tonnes of carbon, a 5.9% rise from 2009. Quite simplistically, if this number does not start declining by 2020, then it is extremely unlikely that warming will be kept below 2 degrees. Further, if this number continues to grow as the current decade progresses, then we will move into the territory of ‘extremely dangerous’ climate change of 3 or 4 degrees Celsius—or more.

Let’s dig a little and see if we can get a better idea of how the CO2 emission number is likely to evolve over the coming decade. In my last post I introduced the idea of a carbon budget, following the work of Meinshausen and other scientists, and suggested that we had already used up 28% of that budget since 2000. Note that the precise number should not be given too much weight; rather, it is a best estimate of where we are in terms of emissions. Moreover, even if we heroically managed to hold emissions constant from now onward, we would exhaust the budget by 2030 if a) we wanted a 75% chance of keeping within 2 degrees of warming and b) emissions dropped to zero immediately after 2030.
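A sketch of the budget arithmetic behind that 2030 date. The 1,000 giga tonne CO2 budget for 2000–2049 is roughly the Meinshausen et al. figure for a 75% chance of staying under 2 degrees, but treat all the inputs here as assumptions:

```python
budget_gt_co2 = 1000.0      # assumed 2000-2049 budget, Gt CO2 (~75% chance of < 2 C)
fraction_used = 0.28        # share of the budget already emitted since 2000
annual_gt_co2 = 33.5        # roughly the 2010 emission rate, Gt CO2 per year

remaining = budget_gt_co2 * (1 - fraction_used)   # 720 Gt CO2 left
years_left = remaining / annual_gt_co2            # ~21 years at a constant rate
print(2010 + round(years_left))                   # budget exhausted around 2031
```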

Two assumptions here are patently unrealistic: first, that near term emission growth will miraculously flat line; and second, that emissions will instantaneously revert to zero at some appointed year.

Why? Firstly, because economic growth is highly energy dependent over the near term (and probably the mid term as well). As such, we have now entered the realm of economic cause and effect, as opposed to physical science cause and effect (for example, the link between atmospheric CO2 and global mean temperatures).

The correlation between economic growth and energy has long been known, but is not always truly appreciated. Indeed, many scholars now believe that the industrial revolution was as much a fossil fuel energy revolution (for example, see here) as a pure technology revolution (hard technology such as James Watt’s steam engine, and soft technology such as that capitalist cornerstone, the joint stock company). If there had been no available coal (and later oil), there would have been no industrial revolution—and no miraculous jump in GDP growth and associated living standards.

The degree of correlation between growth and energy, however, can shift through time. Moreover, the correlation between energy and CO2 emissions can also alter with the years. Thus, we have a three-step process: moving from GDP growth, to energy production growth, to CO2 emissions growth.

More specifically, an advanced service-orientated economy may be able to grow with little increase in energy inputs (although frequently we find that this is because it has, in effect, outsourced its energy needs to less-developed countries that make the solid ‘stuff’ complementing the services the advanced countries provide). In addition, this advanced service economy may be able to fulfill what little additional energy needs it has by developing renewables such as solar and wind that produce minimal (at least explicitly) CO2 emissions.
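The three-step chain from GDP to energy to emissions is often written as a Kaya-style identity: emissions = GDP × (energy/GDP) × (CO2/energy). A sketch with hypothetical growth rates shows how partial decarbonisation slows, but does not stop, emissions growth:

```python
def emissions_growth(gdp_growth, energy_intensity_change, carbon_intensity_change):
    """Annual CO2 emissions growth implied by a Kaya-style decomposition.

    The growth factors multiply because emissions = GDP * energy/GDP * CO2/energy.
    """
    factor = (1 + gdp_growth) * (1 + energy_intensity_change) * (1 + carbon_intensity_change)
    return factor - 1

# Hypothetical: 4% global GDP growth, energy intensity falling 1% a year,
# carbon intensity of energy falling 0.5% a year
growth = emissions_growth(0.04, -0.01, -0.005)
print(round(growth * 100, 1))   # emissions still grow ~2.4% per year
```

The point of the decomposition is that both intensity terms have to fall faster than GDP grows before emissions actually decline.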

If we put the developed countries and less-developed countries together, we get a sense of the dependency of global GDP on carbon emissions. PwC put out a report in 2006 that looked at the relationship between growth and carbon and then mapped out possible future scenarios (here).

As you can see from the chart above (click for a larger image), in all the scenarios carbon emissions grow at lower rates than GDP, but the growth rates differ widely, from the ‘Scorched Earth’ scenario (we increasingly rely on dirtier sources of energy like coal and tar sands to meet our energy needs) to a green/technology nirvana scenario (Green Growth plus Carbon Capture and Storage) in which we have managed to almost completely decarbonise economic growth. A more detailed definition of the various scenarios is given below (click for larger image):

Note that since the PwC report was published in 2006, the link between carbon emissions and GDP has followed a path a little worse than the ‘Baseline’ scenario.

So what happens if we continue on a path of relatively strong global growth (even if there is meagre growth in OECD countries) and relatively little success in decarbonisation (following on from the failure of the Copenhagen talks to come up with hard global goals on carbon emission mitigation)?

The United Nations Environment Programme (UNEP)’s “The Emissions Gap Report” provides some answers. It builds on the carbon budget concepts discussed in my last post and highlights the choice between achieving an early peak in carbon emissions and a slow decline, and a late peak in carbon emissions and a rapid decline.

The report then goes on to look at what emissions need to look like in the year 2020 if we wish to hold global mean temperature rise from pre-industrial levels to 2 degrees Celsius of warming (what they term a ‘likely’ chance of success). They use 2005 as their base year, when a total of 45 giga tonnes of CO2 equivalent was emitted.

Note the use of CO2 equivalent. The term means CO2 emissions plus emissions of other (non-CO2) greenhouse gases translated into equivalent CO2 units. This is a different unit of measure from that found at the Carbon Dioxide Information Analysis Center (CDIAC) web site that I referred to in my last post (they focus on giga tonnes of carbon), and different from the units the International Energy Agency (IEA) uses in its reports (they look at giga tonnes of CO2). We will return to this issue a little later, so let us stay with CO2 equivalent, or CO2-eq.

The first thing to note in the chart above is that the best estimate of having a 66% chance of keeping within the 2 degree limit requires CO2-eq emissions to be actually less than 2005 levels by 2020. How realistic is that? Looking at the alternative scenarios, the answer must be: “it isn’t realistic”. Business as usual sees us a whopping 11 giga tonnes of CO2-eq above the required target (25% above), while even the strict observance of binding pledges sees an overshoot.

So if we are likely to wander into the realm of ‘dangerous’ climate change what is the likelihood of us getting into the even worse situation of ‘extremely dangerous’ climate change where our kids’ lives could be transformed? Building on the UNEP’s analysis, an academic paper by Joeri Rogelj and co-authors in the journal Nature throws some light on this question (see here). They model a series of emission pathways consistent with a 66% probability of staying within a particular temperature increase based on the spread of sensitivity of temperature to CO2.

Remember, if we get lucky, temperature could turn out to be less sensitive to a doubling of CO2 than our best estimate of 3 degrees; if we are unlucky, it could turn out to be more sensitive. In the former case, we can get away with emitting more CO2; in the latter case, less. Unfortunately, we won’t get a better handle on which sensitivity is correct for a decade or two, by which time any CO2 emitted into the atmosphere will stay there for hundreds of years—and warm the earth for hundreds of years.

The Rogelj paper, therefore, provides a road map of where various emission paths of CO2-eq will take us in terms of temperature rise, with signposts for the years 2020 and 2050. For a larger image, you can click on this link to the chart from the original article.

Now 2050 is a long way away, but 2020 is less than a decade off. Moreover, given the inertia in energy production systems (you can’t replace coal-fired power stations with renewables overnight), you can get a sense of where we will be in 2020 just by seeing whether current emissions growth is decelerating, accelerating or staying the same. So let’s look at this signpost in more detail (click on the table for a larger image).

In column three of the table above, the central number is the median estimate of CO2 emissions in the year 2020 consistent with the peak in global mean temperature given in the left-hand column. So for us to stay within 2 degrees of warming with a 66% probability, we would need to limit CO2-eq emissions to 44 giga tonnes in 2020, one giga tonne less than we emitted in 2005. What about ‘extremely dangerous’ climate change of 3 degrees of warming? The median emission given for this limit is 52 giga tonnes, 7 giga tonnes above 2005.

Unfortunately—and this goes back to my frustration with the dog’s dinner of units that the scientific community uses for climate change publications—there isn’t a readily accessible and regularly updated source of CO2-eq data. However, given that we are interested in changes, this doesn’t really matter. So let us use 2005 as our base year and see what fossil fuel carbon emissions were in that year according to the CDIAC (see here). You can find that in 2005 a total of 8.1 giga tonnes of carbon was emitted. For 2010, the CDIAC’s advance estimate has the world emitting 9.1 giga tonnes of carbon. So in the past five years, carbon emissions increased by 1 giga tonne.

Extrapolating that growth rate, we can make a quick and dirty estimate that carbon emissions will rise by another 2 giga tonnes by 2020, making a grand total of 3 extra giga tonnes of carbon emissions over our 2005 base year. Now we have to translate carbon emissions into CO2 emissions by weight, remembering from the last post that the CO2 molecule is 3.667 times heavier than a single carbon atom. That means that CO2 emissions will be 11 giga tonnes higher in 2020 than in 2005.
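That back-of-envelope extrapolation in code (the inputs are the CDIAC figures quoted above; the straight-line extrapolation is, of course, a simplification):

```python
C_TO_CO2 = 44.0 / 12.0          # mass ratio of CO2 to carbon, ~3.67

carbon_2005 = 8.1               # Gt of carbon, CDIAC
carbon_2010 = 9.1               # Gt of carbon, CDIAC advance estimate

growth_per_5yr = carbon_2010 - carbon_2005          # 1 Gt of carbon per five years
carbon_2020 = carbon_2010 + 2 * growth_per_5yr      # linear extrapolation to 2020
extra_carbon = carbon_2020 - carbon_2005            # 3 Gt of carbon above the 2005 base
extra_co2 = extra_carbon * C_TO_CO2
print(round(extra_co2))         # ~11 Gt more CO2 emitted in 2020 than in 2005
```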

Now let’s be conservative and assume that non-CO2 greenhouse gases flatline over this period (in reality, atmospheric concentrations of these gases will likely rise a bit), so the 11 giga tonne increase in CO2 from 2005 to 2020 translates into an 11 giga tonne rise in CO2-eq over the period. Add that to the roughly 45 giga tonnes of CO2-eq emitted in 2005 and we get some 55 to 56 giga tonnes of CO2-eq emissions in 2020. And 55 giga tonnes of CO2-eq emissions in 2020 is consistent with an emissions pathway that leads us to 5 degrees of warming. This, in turn, is in line with UNEP’s baseline scenario.

True, we could get lucky even if we are emitting 55 giga tonnes of CO2-eq in 2020 (although it is unlikely that such luck would allow us to defend the 2-degree target, or even one somewhat higher). Temperature could prove less sensitive to CO2 than our current best estimate. Some miraculous technology could appear within a decade or two that allows us to decarbonise energy production at a breakneck pace. But this is a blog about risk—and unfortunately there is an appreciable risk that our current emissions path will cook our kids. And to come to that conclusion, you don’t need to rely on the word of others. Just look at the data yourself: the key numbers are all here.