
Back to that Big Number

In my post “A Big Number Gets Tweaked” I focused on ‘climate sensitivity’, aka the global mean surface temperature response to a doubling of CO2. It is an important number, and a basic understanding of what it means is an essential part of what I would call ‘climate change literacy’.

Going back to the Intergovernmental Panel on Climate Change (IPCC)’s Assessment Report 4 (AR4), published in 2007, a definition of climate sensitivity can be found on page 12 of the Summary for Policymakers here.

The equilibrium climate sensitivity is a measure of the climate system response to sustained radiative forcing. It is not a projection but is defined as the global average surface warming following a doubling of carbon dioxide concentrations. It is likely to be in the range 2°C to 4.5°C with a best estimate of about 3°C, and is very unlikely to be less than 1.5°C. Values substantially higher than 4.5°C cannot be excluded, but agreement of models with observations is not as good for those values. Water vapour changes represent the largest feedback affecting climate sensitivity and are now better understood than in the TAR. Cloud feedbacks remain the largest source of uncertainty.

The chart below gives a sense of the different sensitivity estimates that provided the background to the IPCC’s final number:

This definition of climate sensitivity dates back to a landmark paper by Jule Charney et al. in 1979 (here). In fact, to avoid confusion, we could call it Charney sensitivity. Now what Charney sensitivity isn’t, surprisingly, is the real-world sensitivity of surface temperatures to a doubling of CO2. This is because Charney sensitivity was a blending of the results of two climate models that held a number of the variables constant. Of course, the Charney sensitivity in its modern version is now backed up by a multitude of models of far greater sophistication, but interestingly the sensitivity number that came out of the 30-year-old Charney report has held up pretty well. Nonetheless, the Charney sensitivity has a somewhat narrow definition. The excellent climate-scientist-run blog RealClimate (www.realclimate.org) explains this in more detail here:

The standard definition of climate sensitivity comes from the Charney Report in 1979, where the response was defined as that of an atmospheric model with fixed boundary conditions (ice sheets, vegetation, atmospheric composition) but variable ocean temperatures, to 2xCO2. This has become a standard model metric (because it is relatively easy to calculate). It is not however the same thing as what would really happen to the climate with 2xCO2, because of course, those ‘fixed’ factors would not stay fixed.

A wider definition is usually termed the Earth System sensitivity, which allows all the boundary conditions held fixed in the Charney definition to vary. As such, ice sheets, vegetation changes and atmospheric composition can provide feedbacks to temperature and thus cause a greater temperature response over the longer term. The Earth System sensitivity is in theory closer to the real world as it tells us at what temperature the system will ultimately return to equilibrium.

The most influential calculation of Earth System sensitivity has been that made by NASA’s Jim Hansen, since it forms the scientific foundation for the 350.org climate change campaigning organisation. As the name suggests, 350.org urges humanity to strive toward a target of 350 parts per million (ppm) of CO2. The rationale for the target can be found here and rests heavily on a paper by Jim Hansen and his coauthors entitled “Target atmospheric CO2: Where should humanity aim?”.

In the abstract of the Hansen article, we immediately see a differentiation between a sensitivity that includes only fast feedback processes (a Charney sensitivity) and an equilibrium sensitivity that includes slower feedbacks (an Earth System sensitivity):

Paleoclimate data show that climate sensitivity is ~3°C for doubled CO2, including only fast feedback processes. Equilibrium sensitivity, including slower surface albedo feedbacks, is ~6°C for doubled CO2 for the range of climate states between glacial conditions and ice-free Antarctica. Decreasing CO2 was the main cause of a cooling trend that began 50 million years ago, the planet being nearly ice-free until CO2 fell to 450 ± 100 ppm; barring prompt policy changes, that critical level will be passed, in the opposite direction, within decades.

The paper then goes on to make a pretty forceful policy recommendation:

If humanity wishes to preserve a planet similar to that on which civilization developed and to which life on Earth is adapted, paleoclimate evidence and ongoing climate change suggest that CO2 will need to be reduced from its current 385 ppm to at most 350 ppm, but likely less than that.

Note that the article does contain a number of caveats over climate variability, climate models and other uncertainties. Further, as is the usual process in science, it has received various critiques, many suggesting that a figure of 6 degrees Celsius is too high for long-term sensitivity. What is not in dispute, however, is that an Earth System sensitivity with long-term feedbacks will carry a higher sensitivity number than a Charney sensitivity with only short-term feedbacks (almost by definition).
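To see why this is almost definitional, consider the textbook feedback-gain arithmetic: the no-feedback (Planck) response to doubled CO2 is roughly 1.2 degrees Celsius, and feedbacks divide that figure by one minus the net feedback gain. Adding slow feedbacks can only raise the gain, and hence the sensitivity. The sketch below uses illustrative round numbers of my own choosing, not Hansen’s actual calculation:

```python
# Textbook feedback-gain arithmetic (illustrative round numbers, not the
# Hansen et al. calculation): amplified warming equals the no-feedback
# warming divided by (1 - net feedback gain).

def equilibrium_warming(no_feedback_warming_c, feedback_gain):
    """Amplified equilibrium warming for a net feedback gain 0 <= f < 1."""
    return no_feedback_warming_c / (1.0 - feedback_gain)

DT0 = 1.2  # deg C for doubled CO2 with no feedbacks (standard Planck response)

charney = equilibrium_warming(DT0, 0.6)             # fast feedbacks only (assumed gain)
earth_system = equilibrium_warming(DT0, 0.6 + 0.2)  # plus an assumed slow-feedback gain

print(round(charney, 1))       # 3.0 -- the Charney / IPCC best estimate
print(round(earth_system, 1))  # 6.0 -- Hansen's Earth System ballpark
```

A fast-feedback gain of 0.6 recovers the 3 degree Charney figure; an extra 0.2 of slow-feedback gain lands at 6 degrees, Hansen’s Earth System ballpark. The point is structural: extra feedbacks push the gain up, so the Earth System number must sit above the Charney number.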

Despite this fact, we see numerous media reports getting tangled up between the two types of sensitivity following the publication of the new Schmittner et al. paper I talked about in a previous post. This from the Houston Chronicle:

To me, the real effect of this paper will be to really impair the credibility of the more extreme environmentalists who have been saying the planet faces certain doom from climate change.

I am thinking about such efforts as Bill McKibben’s 350 campaign, in which he asserts that 350 ppm is the most important number in the world. Such environmentalists assert that the planet will warm as much as 6 Celsius degrees with a doubling of atmospheric carbon dioxide levels.

That’s a big number and doubtless would have catastrophic consequences for the planet. This is not in dispute. But scientists are now telling us this is not going to happen.

Well, ‘no’, actually. Since we are comparing apples and pears, scientists are not now telling us that catastrophic outcomes are not going to happen.

Getting back to the topic of risk, we can now see how an understanding of the different sensitivity concepts gives ordinary people a better idea of the climate risk they and their families face.

To reiterate, we are going from CO2, to temperature (via sensitivity), to impacts. To get a good idea of overall risk we need a sense of how carbon emissions are trending; then we need a feeling for how sensitive temperature is to CO2; and lastly an understanding of how much the Earth changes (and the impact on us of those changes) once the world warms.
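To make the middle step concrete, here is a minimal sketch in Python, assuming the standard logarithmic rule of thumb that equilibrium warming equals the sensitivity multiplied by the base-2 logarithm of the concentration ratio. The function name and the sample concentrations are mine, for illustration only:

```python
import math

# Minimal sketch of the CO2 -> temperature step, assuming the standard
# logarithmic rule of thumb: warming = sensitivity * log2(C / C_preindustrial).
# 3.0 deg C is the AR4 best-estimate sensitivity; 280 ppm is pre-industrial CO2.

def equilibrium_warming_c(co2_ppm, sensitivity_c=3.0, preindustrial_ppm=280.0):
    """Equilibrium warming (deg C) implied by a given CO2 concentration."""
    return sensitivity_c * math.log2(co2_ppm / preindustrial_ppm)

for ppm in (390, 450, 560):
    print(ppm, round(equilibrium_warming_c(ppm), 2))
# 390 ppm -> ~1.43 C, 450 ppm -> ~2.05 C, 560 ppm (a doubling) -> 3.0 C
```

Note that the 450 ppm case lands close to the IPCC’s 2.1 degree figure quoted in “A Big Number Gets Tweaked” below; a proper treatment would use CO2-equivalent concentrations and full models, so treat this as a toy.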

The Charney sensitivity is very useful since it gives a floor to the kind of temperature changes we will experience. If the best estimate of this sensitivity number is found in the future to be smaller than the current consensus of 3 degrees, then that, other things being equal, is a positive thing. However, we are not yet in a position to reduce the consensus based on the Schmittner paper.

The Hansen 6 degree Celsius number is probably a little too high, but if we get anywhere close to it, we are still in the badlands of catastrophic climate change. Nonetheless, the time horizon for the full warming stretches generations into the future; thus, it is probably not the risk metric you would use if your concern goes out only as far as your grandchildren. But I think Jim Hansen receives a lot of undeserved ridicule in certain parts of the blogosphere and American press for championing a number that implies the yet unborn have rights too.

Putting this question of human ethics to one side, those alive today are really interested in a Charney sensitivity ‘plus alpha’ from a climate risk perspective. The components that make up that ‘plus alpha’ are a topic for another post.

A Big Number Gets Tweaked

If I had to nominate candidates for the title of the two most important numbers in the world, they would have to be 1) the concentration of CO2 in the atmosphere (which you can find here) and 2) the climate sensitivity of the global mean temperature to a doubling of CO2.

As esoteric as this discussion may appear, both numbers rank above such economic heavyweights as inflation, GDP growth and government debt-to-GDP ratios in terms of the life outcomes of my kids (in my humble opinion). Basically, bad things happen as CO2 jumps and temperature rises (see here, here and here).

Now there is a lot more I would like to say about atmospheric CO2 concentration, but that will have to wait for future posts. Today, I want to focus on climate sensitivity because an academic paper in the journal Science has just been released (here) that claims the numbers we have been using up to now for climate sensitivity have been too high.

But before I quote the abstract of the new paper, it is useful to restate the existing consensus from the Intergovernmental Panel on Climate Change (IPCC)’s Assessment Report 4 (AR4), published in 2007. It can easily be found on page 12 of the Summary for Policymakers here. The key paragraph is as follows:

The equilibrium climate sensitivity is a measure of the climate system response to sustained radiative forcing. It is not a projection but is defined as the global average surface warming following a doubling of carbon dioxide concentrations. It is likely to be in the range 2°C to 4.5°C with a best estimate of about 3°C, and is very unlikely to be less than 1.5°C. Values substantially higher than 4.5°C cannot be excluded, but agreement of models with observations is not as good for those values. Water vapour changes represent the largest feedback affecting climate sensitivity and are now better understood than in the TAR. Cloud feedbacks remain the largest source of uncertainty.

Now we turn to the new academic paper by Schmittner et al. and, after noting that a temperature change of one kelvin (K) is equivalent to one degree Celsius (°C), we read this:

Assessing impacts of future anthropogenic carbon emissions is currently impeded by uncertainties in our knowledge of equilibrium climate sensitivity to atmospheric carbon dioxide doubling. Previous studies suggest 3 K as best estimate, 2 to 4.5 K as the 66% probability range, and nonzero probabilities for much higher values, the latter implying a small but significant chance of high-impact climate changes that would be difficult to avoid. Here, combining extensive sea and land surface temperature reconstructions from the Last Glacial Maximum with climate model simulations, we estimate a lower median (2.3 K) and reduced uncertainty (1.7 to 2.6 K 66% probability). Assuming paleoclimatic constraints apply to the future as predicted by our model, these results imply lower probability of imminent extreme climatic change than previously thought.

Very simplistically, the paper reconstructs the temperature record of the Last Glacial Maximum (LGM, the height of the last ice age) around 20,000 years ago. Its findings suggest that the LGM was between 2 and 3 degrees Celsius cooler than the present, against current consensus estimates of around 5 degrees. The authors then matched this temperature against the greenhouse gas concentrations of that time. In sum, for the given difference in CO2 from the present, they got less bang for the buck in terms of CO2’s impact on temperature compared with what climate models currently suggest for the future.
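The direction of that logic can be captured with a back-of-the-envelope calculation, which is emphatically not the paper’s actual model-based method: if the planet cooled by a given amount under the total LGM forcing change, the implied sensitivity scales that cooling by the ratio of the doubled-CO2 forcing (about 3.7 W/m2) to the LGM forcing. The 6.5 W/m2 LGM figure below is an assumed round number for illustration:

```python
# Back-of-the-envelope version of the LGM logic (not Schmittner et al.'s
# actual Bayesian, model-based method). If the planet cooled by dT under a
# total forcing change dF, the implied sensitivity to doubled CO2 (forcing
# ~3.7 W/m2) scales as S = 3.7 * dT / dF. The 6.5 W/m2 value is an assumed
# round number for the combined LGM forcing (greenhouse gases, ice sheets, dust).

F_2XCO2_WM2 = 3.7
LGM_FORCING_WM2 = 6.5  # assumption, for illustration only

def implied_sensitivity_c(lgm_cooling_c, lgm_forcing_wm2=LGM_FORCING_WM2):
    """Charney-style sensitivity implied by a given LGM cooling."""
    return F_2XCO2_WM2 * lgm_cooling_c / lgm_forcing_wm2

print(round(implied_sensitivity_c(5.0), 1))  # ~2.8 C: a ~5 C LGM backs the ~3 C consensus
print(round(implied_sensitivity_c(2.5), 1))  # ~1.4 C: halve the cooling, roughly halve the sensitivity
```

This crude scaling will not reproduce the paper’s 2.3 degree median, because the authors match reconstructions against full model simulations rather than a one-line formula, but the lever is the same: a cooler-by-less LGM implies a less sensitive climate.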

If we believe the new findings, then the best estimate of climate sensitivity should be reduced from 3 degrees Celsius for a doubling of CO2 to 2.3 degrees, and the range also has to be narrowed. Just to put things in context, the pre-industrial concentration of CO2 was 280 parts per million and we are now at around 390 ppm, or up about 40%. Now the IPCC’s AR4 also has this to say:

450 ppm CO2-eq corresponds to best estimate of 2.1°C temperature rise above pre-industrial global average, and “very likely above” 1°C rise, and “likely in the range” of 1.4–3.1°C rise.

Now I’ve highlighted it before in another post, but I will highlight it again here: CO2 and CO2-equivalent are different concepts. However, at the current time, non-CO2 atmospheric forcing effects roughly cancel out (for a more detailed discussion of this, see here), so we are in the happy position of being able to capture what is happening by looking at the CO2 number alone, for the time being.

Moving on, we should note that the international community has decided that 2 degrees Celsius of warming marks the point at which we will experience ‘dangerous’ climate change. This is set out in the opening paragraph of the Copenhagen Accord:

We underline that climate change is one of the greatest challenges of our time. We emphasise our strong political will to urgently combat climate change in accordance with the principle of common but differentiated responsibilities and respective capabilities. To achieve the ultimate objective of the Convention to stabilize greenhouse gas concentration in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system, we shall, recognizing the scientific view that the increase in global temperature should be below 2 degrees Celsius, on the basis of equity and in the context of sustainable development, enhance our long-term cooperative action to combat climate change.

To recap, we have a best estimate of climate sensitivity of 3 degrees. And based on this number, atmospheric CO2-equivalent should be capped at 450 ppm to hold the temperature rise to around 2 degrees. This, in turn, is because 2 degrees of warming is deemed the level at which ‘dangerous’ climate change develops.

Now what happens if the 3 degree number is incorrect and should really be 2.3 degrees? Well, the first reaction is to think that the 450 ppm ‘line in the sand’ for dangerous climate change goes out the window. Further, if this CO2 concentration number goes out the window, so do all the numbers for ‘extremely dangerous’ climate change and, for that matter, ‘catastrophic’ climate change. If so, the carbon emission paths associated with different levels of warming, as discussed in my post here, also have to be radically revised (click the chart below for a larger image; see here for the original article).
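To put rough numbers on that first reaction, we can invert the logarithmic rule of thumb (warming = sensitivity times the base-2 logarithm of the concentration ratio) and ask what concentration corresponds to 2 degrees of equilibrium warming under each sensitivity. This is a toy calculation of my own; the real thresholds come from full carbon-cycle and climate modelling:

```python
# Inverting the logarithmic rule of thumb: which CO2-equivalent concentration
# corresponds to 2 C of equilibrium warming for a given sensitivity?
# A toy calculation only; real thresholds come from full modelling.

def ceiling_ppm(warming_c, sensitivity_c, preindustrial_ppm=280.0):
    """Concentration at which equilibrium warming reaches warming_c."""
    return preindustrial_ppm * 2 ** (warming_c / sensitivity_c)

print(round(ceiling_ppm(2.0, 3.0)))  # ~444 ppm -- roughly the 450 ppm line
print(round(ceiling_ppm(2.0, 2.3)))  # ~512 ppm -- where the line would move
```

On this arithmetic, the ‘line in the sand’ would drift from roughly 445 ppm to roughly 510 ppm CO2-equivalent, which is why the policy thresholds built on the 3 degree figure would need revisiting.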

In addition, the deadline calculated by the International Energy Agency (IEA) for the cessation of new fossil-fuel-based energy plant installation, as discussed in my last post here, would also have to be reworked.

However, some caution is in order. First, this is only one paper amongst many that have tackled the question of climate sensitivity from a variety of angles; it should be judged within the context of the total body of work. Further, as with all good science, its assumptions will come under intense scrutiny to check whether the methodology is correct. Unlike the climate-skeptic blog commentary, the authors of the paper fully admit the tentative nature of their findings:

“There are many hypotheses for what’s going on here. There could be something wrong with the land data, or the ocean data. There could be something wrong with the climate model’s simulation of land temperatures, or ocean temperatures. The magnitudes of the temperatures could be biased in some way. Or, more subtly, they could be unbiased, on average, but the model and observations could disagree on where the cold and warm spots are, as I alluded to earlier. Or something even more complicated could be going on.

Until the above questions are resolved, it’s premature to conclude that we have disproven high climate sensitivities, just because our statistical analysis assigns them low probabilities.”

The excellent site Skeptical Science has a great post on the Schmittner et al. paper here. After going through the technical challenges in considerable depth, it also notes a critical, and inconvenient, truth if the paper’s findings are correct:

In short, if Schmittner et al. are correct and such a small temperature change can cause such a drastic climate change, then we may be in for a rude awakening in the very near future, because their smaller glacial-interglacial difference would imply a quicker climate response to a given global temperature change, as illustrated in Figure 4.

As Figure 4 illustrates, although the Schmittner et al. best estimate for climate sensitivity results in approximately 20% less warming than the IPCC best estimate, we also achieve their estimated temperature change between glacial and interglacial periods (the dashed lines) much sooner. The dashed lines represent the temperature changes between glacial and interglacial periods in the Schmittner (blue) and IPCC (red) analyses. If Schmittner et al. are correct, we are on pace to cause a temperature change of the magnitude of a glacial-interglacial transition – and thus likely similarly dramatic climate changes – within approximately the next century.
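The pacing arithmetic behind that claim is easy to sketch with my own rough numbers, which are not Skeptical Science’s figures: at an assumed warming rate on the order of 0.02 degrees Celsius per year, a 2.6 degree glacial-interglacial difference is traversed in roughly 130 years, against roughly 250 years for the consensus 5 degree difference:

```python
# Rough pacing arithmetic (my own assumed rate, not Skeptical Science's
# figures): years needed to traverse a glacial-interglacial temperature
# difference at a steady warming rate of ~0.02 C per year.

WARMING_RATE_C_PER_YEAR = 0.02  # assumed, roughly in line with recent decadal trends

def years_to_traverse(delta_t_c, rate_c_per_year=WARMING_RATE_C_PER_YEAR):
    """Years to accumulate delta_t_c of warming at a constant rate."""
    return delta_t_c / rate_c_per_year

print(round(years_to_traverse(2.6)))  # ~130 years under the Schmittner et al. difference
print(round(years_to_traverse(5.0)))  # ~250 years under the consensus difference
```

That is the sense in which a smaller glacial-interglacial difference implies a faster effective pace of climate change.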

In the run-up to the publication of the IPCC’s AR5 report in 2013, it will be critical to see whether a new consensus number emerges that is different from that of the 2007 AR4 report: a consensus that takes all the new findings made over the last few years into consideration. As this number changes, so will the world.