Open Mind

Two Boxes

October 3, 2007 · 10 Comments

I recently commented for RealClimate on the inapplicability of the simplified model used in Schwartz’s estimate of climate sensitivity. It seems to me that the greatest failure of this model is that it allows only one time scale, or “characteristic time,” for the climate system. But surely the earth has many. The atmosphere reacts quickly to changes in energy input and output; the land heats up a bit more slowly, the ocean more slowly still. Ice and snow grow and diminish. All these systems have their own characteristic time scales, and they all interact to form our climate system.

The model used by Schwartz amounts to this:

C_V ~{dT \over dt} = F - \alpha C_V T,

where T is the temperature, C_V is the heat capacity of the climate system, and F is the climate forcing. The quantity \tau = 1/\alpha is the time scale, and for this model \tau / C_V = 1/(\alpha C_V) gives the climate sensitivity. We can call this a “1-box” model, because it represents earth’s climate as a single entity with a single temperature, and allows only one time scale. We can make the model more realistic by including more “boxes.” Just to get an idea how multiple “boxes” can affect climate change, I’ll use two: one for the atmosphere/land, another for the oceans.
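To make the 1-box behavior concrete, here's a minimal numerical sketch that just steps the equation forward in time. The parameter values are purely hypothetical, chosen so that \tau = 5 yr and the equilibrium response is 3 deg C:

```python
import math

def one_box(F, alpha, C_V, t_max, dt=0.01):
    """Forward-Euler integration of C_V dT/dt = F - alpha*C_V*T,
    starting from T = 0 with constant forcing F."""
    T = 0.0
    for _ in range(int(t_max / dt)):
        T += dt * (F / C_V - alpha * T)
    return T

# Hypothetical values: tau = 1/alpha = 5 yr, equilibrium F/(alpha*C_V) = 3 deg C
alpha = 1 / 5.0        # yr^-1
C_V = 1.0              # arbitrary units
F = 3.0 * alpha * C_V  # chosen so the equilibrium response is 3 deg C

T_100 = one_box(F, alpha, C_V, t_max=100.0)
exact = (F / (alpha * C_V)) * (1 - math.exp(-100.0 / 5.0))
print(round(T_100, 3), round(exact, 3))  # both near 3.0
```

After 100 years (20 time constants at \tau = 5 yr) the numerical solution has settled onto the equilibrium value, matching the exact exponential solution given below.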

In this case we’ll have two temperatures: one for each box. We’ll have a separate heat capacity and characteristic time scale for each box, and we can even allow separate climate forcings. We must also allow the two boxes to exchange heat; otherwise we don’t have a 2-box model, we have two 1-box models. We’ll end up with something like this:

C_1 ~{dT_1 \over dt} = F_1 - \alpha_1 C_1 T_1 + \beta (T_2 - T_1),

C_2 ~{dT_2 \over dt} = F_2 - \alpha_2 C_2 T_2 + \beta (T_1 - T_2),

Note that all the quantities have subscripts “1” or “2” indicating whether they apply to box 1 or 2, except the quantity \beta (the heat exchange coefficient), which is the same for both. It must be, in order not to violate the law of conservation of energy.

These equations can be solved exactly for any forcings F_1,~F_2. What emerges from the solution is that the system has two time scales, \tau_1 and \tau_2. The time scales depend on the constants \alpha_1,~\alpha_2, the heat capacities C_1,~C_2, and the heat exchange coefficient \beta.
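For the curious, here's a small sketch of how the two time scales emerge: write the homogeneous part of the two equations as a linear system and take eigenvalues; each time scale is minus the reciprocal of an eigenvalue. The parameter values below are purely illustrative, not fitted to the real climate:

```python
import numpy as np

# Purely illustrative parameter values (box 1 = atmosphere/land, box 2 = ocean)
C1, C2 = 1.0, 10.0   # heat capacities, with the ocean's much larger
a1, a2 = 0.2, 0.05   # alpha_1, alpha_2 (yr^-1)
beta = 0.5           # heat exchange coefficient

# Homogeneous part of the 2-box equations as a linear system dT/dt = A T:
#   dT1/dt = -(a1 + beta/C1) T1 + (beta/C1) T2
#   dT2/dt =  (beta/C2) T1 - (a2 + beta/C2) T2
A = np.array([[-(a1 + beta / C1),  beta / C1],
              [  beta / C2,       -(a2 + beta / C2)]])

# Both eigenvalues are real and negative here; tau_i = -1/lambda_i
taus = sorted(-1.0 / np.linalg.eigvals(A).real)
print(taus)  # one short time scale, one long one
```

With these (made-up) numbers the two time scales come out to roughly 1.4 yr and 16 yr: one fast, one slow, even though each box individually has only a single \alpha.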

What does it mean to say the system has a “time scale?” Let’s go back, for just a moment, to the 1-box model. Suppose climate forcing has been zero for a long time, and global temperature T is zero. Then, let’s raise the forcing by instantaneously doubling atmospheric CO2 and keeping it at that level. Under this circumstance, temperature will evolve according to

T = {F \over \alpha C_V} ~[1 - e^{-t/\tau}].

From this we see that temperature will rise in response to doubled CO2, approaching a new equilibrium value F/(\alpha C_V) exponentially. If the new equilibrium value is, say, 3 deg C, and the time scale is \tau=5 yr, the response will look like this:

[Figure: temperature response with \tau = 5 yr, approaching the 3 deg C equilibrium]
If, on the other hand, the time scale is \tau=30 yr, it will look like this:
[Figure: temperature response with \tau = 30 yr]

In the first case, with a short time scale, climate adjusts to the new equilibrium value quickly. After 20 years, which is 4 “time scales,” it’s almost reached its new equilibrium value and has only a minuscule amount of warming “still to go.” It’s on this basis that Schwartz concludes that if his estimates are correct, almost all of the warming from the greenhouse gases we’ve already emitted has already been felt; there’s almost no warming left “in the pipeline” (warming that still remains even if we stop emitting greenhouse gases entirely). But in the second case 20 years is only 2/3 of a single time scale, and almost half of the warming is still in the pipeline. Schwartz argues for a short time scale, based on the fact that in the past century climate has responded quickly to forcing changes; with such a short time scale we don’t have to worry about warming still in the pipeline, just about future greenhouse gas emissions.
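The fractions quoted above follow directly from the factor 1 - e^{-t/\tau}; a quick check:

```python
import math

def fraction_realized(t, tau):
    """Fraction of the equilibrium warming already realized after time t
    in a 1-box model with time scale tau."""
    return 1 - math.exp(-t / tau)

short = fraction_realized(20, 5)   # 20 yr at tau = 5 yr: 4 time scales
slow = fraction_realized(20, 30)   # 20 yr at tau = 30 yr: 2/3 of one time scale
print(round(short, 3), round(slow, 3))  # 0.982 and 0.487
```

So with \tau = 5 yr about 98% of the equilibrium warming has arrived after 20 years, while with \tau = 30 yr less than half has.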

Now let’s look at the behavior of our 2-box model, and suppose its two time scales are \tau_1=5 yr and \tau_2=30 yr. Under not-unrealistic conditions, a sudden doubling of CO2 (with no other changes) might lead to temperature behavior like this:

[Figure: 2-box response to doubled CO2, showing atmosphere/land and ocean temperatures]
We can see that the ocean warms slowly, as though it behaved with the longer time scale \tau_2=30. But the atmosphere behaves differently; at first it warms quickly, then more slowly. This illustrates how it’s possible to observe what we’ve observed, despite the fact that there’s warming in the pipeline even if we don’t emit any more greenhouse gases. The short time scale for heating the atmosphere makes climate respond quickly at first to changes in forcing, while the long time scale for heating the oceans means the full effect of climate forcing is realized much more slowly. The fact that atmospheric temperature responds quickly to climate forcings doesn’t mean that the time scale is short; it only means that one of the time scales is short. But because of the other, longer time scale we can still have substantial warming in the pipeline.
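A crude simulation of the 2-box equations shows this fast-then-slow behavior. The parameter values are hypothetical, chosen only to give one short and one long time scale, with a step forcing applied to the atmosphere/land box:

```python
# Forward-Euler simulation of the 2-box equations under a step forcing.
# All parameter values are hypothetical; box 1 = atmosphere/land, box 2 = ocean.
C1, C2 = 1.0, 10.0
a1, a2 = 0.2, 0.05
beta = 0.5
F1, F2 = 1.0, 0.0    # step forcing applied to box 1 at t = 0

dt, t_max = 0.01, 40.0
T1 = T2 = 0.0
T1_5yr = T2_5yr = None
for step in range(int(t_max / dt)):
    dT1 = (F1 - a1 * C1 * T1 + beta * (T2 - T1)) / C1
    dT2 = (F2 - a2 * C2 * T2 + beta * (T1 - T2)) / C2
    T1, T2 = T1 + dt * dT1, T2 + dt * dT2
    if T1_5yr is None and (step + 1) * dt >= 5.0:
        T1_5yr, T2_5yr = T1, T2   # snapshot after 5 years

# Box 1 warms quickly at first; box 2 lags it throughout the adjustment
print(T1_5yr > T2_5yr, T1 > T2)
```

Under these assumptions the atmosphere/land box leads the ocean box both early on and at 40 years, while the ocean keeps slowly catching up toward its (lower) equilibrium value.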

Most climate scientists believe this is actually the case (not that climate follows a 2-box model, but that it has 2 or more characteristic time scales). The atmosphere responds quickly to climate forcing, but the ocean sluggishly; it takes a long time for heat to penetrate deep into the ocean, so the ocean is responsible for a second, longer time scale. Because of this, we haven’t yet come close to the equilibrium response to the climate forcing from the greenhouse gases we’ve already emitted, and there’s quite a bit of warming still in the pipeline; we’ll feel it even if greenhouse gas emissions completely halt today. According to at least one estimate, we still have about 0.6 deg C of warming in the pipeline, and that’s if we stop emitting all greenhouse gases instantly.

Some people get the idea that this means we can’t really do anything about global warming. After all, the planet will continue to warm even if we halt all greenhouse gas emissions instantly and completely. This is a false idea; although halting emissions won’t halt global warming, every bit of greenhouse gas emissions leads to more warming. Halting emissions won’t stop the warming that’s already in the pipeline — but that’s even more reason not to put yet more warming in the pipeline.

Categories: Global Warming · climate change

10 responses so far ↓

  • NU // October 3, 2007 at 3:06 pm

    I think the multiple time scale issue is more devastating to Schwartz’s analysis than the statistical points made in this comment. The importance of ocean heat uptake in diagnosing the climate response goes back at least to the 1985 Science paper of Hansen et al. Wigley et al. highlight a very similar error by Douglass and Knox.

  • Hank Roberts // October 3, 2007 at 3:51 pm

    Something coming mentioned here:


    ____Excerpt below_________

    “Here I will give a summary and personal perspective. For reasons of journal embargo’s, ‘collective’ results of the workshop and expert elicitation cannot be included. The reader should bear in mind that as with any group of scientific experts there was considerable difference of opinion. A different expert would no doubt give a different slant on the importance of various mechanisms. I will try to reflect this in the use of error ranges and qualitative statements about uncertainties.

    Definition and historical examples

    For components of the Earth system that are at least sub-continental in scale (~1000km), they are a tipping element if: The parameters controlling the system can be transparently combined into a single control, and there exists a critical value of this control from which a small perturbation leads to a qualitative change in a crucial feature of the system, after some observation time (a full formalization of this is given in Lenton et al., submitted). This definition is deliberately broad and inclusive. …”

  • Mark Hadfield // October 3, 2007 at 9:28 pm

    Basically I agree with you, Tamino, but (as I may already have said in a comment on your blog) a model of ocean heat absorption with just one time scale is also a gross simplification. A minimal a priori model would have two: one for the top few hundred metres (the maximum mixed layer depth) and a much longer one for the ocean as a whole.

    [Response: Quite right. However, this post is just to illustrate a point, so I've taken the simplest more-than-one-box model. It's also extremely valuable to understand the behavior of simple models, which is one reason that although I disagree with the result of Schwartz's research, his efforts are a valuable exploratory step.

    The general "n-box" model is exactly soluble for arbitrary forcing; I'm currently writing a program to compute the evolution of temperature given a specified forcing function. Perhaps it will shed some light on real-world processes; perhaps not!]

  • ChrisC // October 4, 2007 at 12:14 pm

    I should also state (although you probably already know this Tamino) that the atmosphere itself is highly stratified, and as such, should also be treated as a series of separate “boxes”. For radiative transfer calculations, for example those used in the ECMWF weather model, 91 vertical atmospheric levels are used.

    This is relevant for discussion of CO2 forcing, as different atmospheric levels approach “saturation” at different rates, due to the fact that atmospheric density decreases with increasing altitude, and the presence of water vapour diminishes.

    At lower atmospheric levels, higher temperatures, more water vapour and pressure broadening of spectral lines lead to increasing levels of saturation. This changes at higher altitude, where CO2 becomes (relatively) more important due to the lack of water vapour, and spectral lines become less broad. In short, the atmosphere does not behave as a homogeneous slab, and the upper levels will have longer time scales (i.e. larger tau) than lower levels. RealClimate has a good, non-technical explanation of this effect:

    Once again I mutter a curse at nature for being bloody complicated. But I guess that’s what makes it fun.

  • NU // October 4, 2007 at 1:21 pm

    If you want a more realistic ocean model, you should consider a 1D upwelling/diffusion model such as the one in Raper et al.; U/D ocean models have been used in the past to diagnose climate sensitivity.

  • Hank Roberts // October 4, 2007 at 7:44 pm

    > Raper et al.

    Nice pointer. Interesting line in the abstract, that fits with Hansen’s recent public comments about how climate sensitivity over the current short term horizon is lower than climate sensitivity over the long course of events as things sort out:

    “… the HadCM2 effective climate sensitivity is found to increase from about 2.0 °C at the beginning of the integration to 3.85 °C after 900 years ….”

  • aphriza // October 5, 2007 at 12:18 am

    I appreciate the stripped-down models in this post that illustrate your point, tamino. I had always wondered how people supported the argument that “most of the warming has already happened” - never realizing it was by a specific oversimplification of the climate system. Thanks.

    [Response: Just so everybody knows: this oversimplified model represents the *principle*, but the idea that deep-ocean heating has a very long time scale, and that as a result we still have warming in the pipeline, is part of even complex models.]

  • steven mosher // October 5, 2007 at 3:04 pm

    Hi Tamino!

    I posted this on CA. thought you might like it.

    TAMINO has a nice post on two box models:

    Even though in the past I have given him grief, I found his explanation of a two box model very lucid and enlightening.

    When I looked at his hypothetical temperature chart above, noting the difference in rates of warming, the first thing that came to mind was this: Could one use the differential between atmosphere warming and ocean warming to estimate the ratio of time scales?

    Anyway, I decided to play a bit. So I got land Anomalies from:

    And SST anomalies from:

    And compared the two.

    Difference the two: the result is fascinating. A third-order polynomial fits the result with an r² of 0.66. You can see the land leading the ocean, and then the ocean catches up, and then the land leads the ocean again.

    I cannot post the graphic here. But it’s posted on CA.

    Essentially you see the ocean lagging the land in rate of change, then catching up, overshooting (momentum), and then the land temp starts to diverge again.
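    The exercise can be sketched in code like this, using synthetic stand-in series (the actual land and SST anomaly data aren't reproduced here, so the numbers won't match the r² of 0.66 quoted above):

```python
import numpy as np

# Sketch of the differencing-and-fitting exercise described above, using
# synthetic stand-in series in place of the real land and SST anomalies.
rng = np.random.default_rng(0)
years = np.arange(1880, 2008)
land = 0.005 * (years - 1880) + 0.10 * rng.standard_normal(years.size)
sst = 0.004 * (years - 1880) + 0.05 * rng.standard_normal(years.size)

diff = land - sst                    # land minus ocean anomaly
coeffs = np.polyfit(years, diff, 3)  # third-order polynomial fit
fitted = np.polyval(coeffs, years)

# r-squared of the cubic fit
ss_res = float(np.sum((diff - fitted) ** 2))
ss_tot = float(np.sum((diff - diff.mean()) ** 2))
r2 = 1.0 - ss_res / ss_tot
print(round(r2, 2))
```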

  • george // October 5, 2007 at 4:55 pm

    I guess the main question I have is this:

    Do the “heat capacity” (C) and “time constant” (tau) that Schwartz uses in his equation to determine sensitivity apply to the same thing?

    It appears that his “C” applies to the deep ocean , but it is not clear exactly what his “tau” applies to.

    If they do not apply to the same thing (i.e., if they do not apply to the same “box”, using Tamino’s analogy), it would amount to “mixing and matching C’s and taus” and (regardless of whether each of the values C and tau is “correct” for its corresponding “box”), the “sensitivity” value thus obtained would be questionable, to say the least.

    Schwartz assumes the ocean heat capacity is the one that is most important, which is perfectly reasonable, since the heat capacity of the oceans far exceeds that of the atmosphere (though his estimated C value for the ocean could still be wrong, of course).

    But he uses the time development of the global mean surface temperature anomaly to estimate his time constant.

    Is the time constant thus obtained that for the ocean? (It does not appear to be that for the deep ocean, at any rate)

    Or is it the time constant for the air at the earth’s surface?

    Schwartz assumes that the short time response of surface air temperatures to influences like volcanoes can be used to infer the “system time constant”, which he uses in his equation to estimate sensitivity.

    I’d have to say that assumption is probably the one that deserves to be scrutinized to the greatest degree.


    As Schwartz says in his paper:
    “The view of a short time constant for climate change gains support also from records of widespread change in surface temperature following major volcanic eruptions. Such eruptions abruptly enhance planetary reflectance as a consequence of injection of light-scattering aerosol particles into the stratosphere. “

    “…from the perspective of inferring the time constant of the system, recovery ensued in just a few years. From an analysis of the rate of recovery of global mean temperature to baseline conditions between a series of closely spaced volcanic eruptions between 1880 and 1920 Lindzen and Giannitsis [1998] argued that the time constant characterizing this recovery must be short; the range of time constants consistent with the observations was 2 to 7 years, with values at the lower end of the range being more compatible with the observations. A time constant of about 2.6 years is inferred from the transient climate sensitivity and system heat capacity determined by Boer et al. [2007] in coupled climate model simulations of GMST following the Mount Pinatubo eruption. Comparable estimates of the time constant have been inferred in similar analyses by others [e.g., Santer et al., 2001; Wigley et al., 2005].

    But Schwartz himself recognizes the possible problem with his inference that the time response of the surface air after a volcano (recovery time) is indicative of an overall time constant of the system (ie, the one Schwartz has used to estimate sensitivity)

    “A concern noted by several investigators with inferences of system time constant from GMST following volcanic eruptions is that as the duration of the forcing is short, the response time of the system may not be reflective of that which would characterize a
    sustained forcing such as that from increased greenhouse gases because of lack of penetration of the thermal signal into the deep ocean.”

    Indeed, that is precisely the point raised by some of the very researchers he quoted (eg, Santer, Wigley). Here’s an abstract to a paper by Santer, Wigley et al:

    Krakatoa lives: The effect of volcanic eruptions on ocean heat content and thermal expansion

    “A suite of climate model experiments indicates that 20th Century increases in ocean heat content and sea-level (via thermal expansion) were substantially reduced by the 1883 eruption of Krakatoa. The volcanically-induced cooling of the ocean surface is subducted into deeper ocean layers, where it persists for decades. Temporary reductions in ocean heat content associated with the comparable eruptions of El Chichón (1982) and Pinatubo (1991) were much shorter lived because they occurred relative to a non-stationary background of large, anthropogenically-forced ocean warming.

    The research is also described here:

    Volcanoes helped slow ocean warming trend, researchers find

    “The experiments studied by Gleckler’s team also included the more recent 1991 Mt. Pinatubo eruption in the Philippines, which was comparable to Krakatoa in terms of its size and intensity. While similar ocean surface cooling resulted from both eruptions, the heat-content recovery occurred much more quickly in the case of Pinatubo.

    “The heat content effects of Pinatubo and other eruptions in the late 20th century are offset by the observed warming of the upper ocean, which is primarily due to human influences,” Gleckler said.

    [Response: It seems you've summarized the situation well.

    I will point out that Schwartz explores the consequences of assuming that the entire system acts as a single box; under that assumption, it's appropriate to use observed data to estimate the parameters. I'll also emphasize that it's valuable to understand the behavior of simple models, even when they don't reflect the complexities of the real world very well. So in my opinion, Schwartz's research is a valuable first step in exploring the implications of simplified models, and their relationship to observed parameters. I agree that Schwartz's model is *too* simple, in that it allows only one time constant when in fact we have good evidence for many -- but his treatment is a catalyst for others to explore not-quite-so-simple models (like 2-box and n-box models). It's also important not to forget that Schwartz is not a denialist and his paper is not the "denialist" work it's touted to be by denialist websites (and Sen. Inhofe); that's the *real* slander against him.]

  • Rank Exploits // December 20, 2007 at 9:19 pm

    Time Constant for Climate: Greater than Schwartz Suggests!

    A recent empirical analysis Schwartz (2007) suggests that the time constant for the earth’s climate may be as low as 5 years. If so, the climate sensitivity of the earth may be much lower than suggested by Climate models. That estimate did not a…

Leave a Comment