Open Mind

Cycling Carbon

July 2, 2008 · 104 Comments

Last week James Hansen (NASA’s chief climate scientist) spoke to members of Congress about global warming. It wasn’t his first communication with politicians on the subject; rather, it marked the 20th anniversary of his now-famous 1988 Congressional testimony, in which he told the U.S. government, in no uncertain terms, that global warming was trouble.


This time around, Hansen tried to emphasize the extreme nature of the problem, and the critical consequences of our immediate actions, saying that things have gotten so bad that only drastic action can save the day. He said that the world has long passed the “dangerous level” for greenhouse gases in the atmosphere and needs to get back to 1988 levels. He said Earth’s atmosphere can only stay this loaded with man-made carbon dioxide for a couple more decades before changes such as mass extinction, ecosystem collapse and dramatic sea level rises set in. He also emphasized the urgency of the situation: “We’re toast if we don’t get on a very different path,” Hansen told The Associated Press. “This is the last chance.”

Those are sobering words from one of the world’s foremost climate experts. He’s been warning the world, and in particular the U.S. government, of the danger of global warming for more than 20 years. Over that time period, his warnings have become more stern, more severe, and more certain, while his criticism of those who have attempted to cloud the truth of the matter has become more shrill and more angry. Some think the angry tone of some of his most critical comments works against his own cause; Hansen apparently believes that only the most severe words will arouse public and political action. I don’t pretend to know whether his chosen approach is wisdom or folly.

But this much is clear: James Hansen believes that we’ve already passed the dangerous level of greenhouse gas concentrations. He says the safe level for atmospheric CO2 is a mere 350 ppmv (about its 1988 value), while the present level is 384 ppmv. Worse, levels are still rising and show no sign of slowing down; Hansen’s “safe limit” is not just a thing of the past, it’s rapidly getting further away. And the consequences will be dire.

To get serious about emissions, Hansen is proposing a ban on all new coal-fired power plants that don’t capture carbon, a phase-out of old coal-fired generators, and “a tax on carbon high enough to make sure that we leave tar sands and oil shale in the ground.” He identifies coal as the biggest threat because it’s by far the largest store of readily combustible fossil carbon on the planet. It’s just about inevitable that we’re going to burn all the oil on the planet and dump its carbon into the air (as CO2), but if we can keep our hands off the coal, at least we can keep a lid on the problem.

Hansen says we should start now. From the Huffington Post:


Today I testified to Congress about global warming, 20 years after my June 23, 1988 testimony, which alerted the public that global warming was underway. There are striking similarities between then and now, but one big difference.

Again a wide gap has developed between what is understood about global warming by the relevant scientific community and what is known by policymakers and the public. Now, as then, frank assessment of scientific data yields conclusions that are shocking to the body politic. Now, as then, I can assert that these conclusions have a certainty exceeding 99 percent.

The difference is that now we have used up all slack in the schedule for actions needed to defuse the global warming time bomb. The next president and Congress must define a course next year in which the United States exerts leadership commensurate with our responsibility for the present dangerous situation.

Otherwise it will become impractical to constrain atmospheric carbon dioxide, the greenhouse gas produced in burning fossil fuels, to a level that prevents the climate system from passing tipping points that lead to disastrous climate changes that spiral dynamically out of humanity’s control.

Most of us can appreciate how big a difference it makes when a problem is confronted sooner rather than later; when it comes to climate, few understand it better than James Hansen. Of course, an even better time to act would have been when Hansen originally testified in Congress:


I would like to draw three main conclusions. Number one, the earth is warmer in 1988 than at any time in the history of instrumental measurements. Number two, the global warming is now large enough that we can ascribe with a high degree of confidence a cause and effect relationship to the greenhouse effect. And number three, our computer climate simulations indicate that the greenhouse effect is already large enough to begin to affect the probability of extreme events such as summer heat waves.

However, the climate model simulations indicate that certain gross characteristics of the greenhouse warming should begin to appear soon, for example, somewhat greater warming at high latitudes than at low latitudes, greater warming over continents than over oceans, and cooling in the stratosphere while the troposphere warms.

Conclusion: Global warming has reached a level such that we can ascribe with a high degree of confidence a cause and effect relationship between the greenhouse effect and the observed warming. Certainly further study of this issue must be made. The detection of a global greenhouse signal represents only a first step in analysis of the phenomenon.

It turns out that 1988 was indeed (at the time) globally the hottest year in the history of instrumental measurements. What has happened in the intervening years? The warming already noted in 1988 didn’t stop there; 14 of 19 subsequent years have been even hotter:

At the time of Hansen’s original testimony, global CO2 emissions totalled about 6 GtC (gigatonne carbon), but that too has risen and is now over 8 GtC per year:

As a result the atmospheric concentration of CO2 has steadily increased, reaching 384 ppmv in 2007:

Fossil fuel is the primary energy source for the modern world. The energy is stored in the chemical arrangement of carbon atoms, so extracting the energy requires reorganizing the carbon atoms into a new chemical ordering. The simplest way to do that is to burn the fuel, i.e., combine it with oxygen; the inevitable byproduct is CO2. So the relationship between CO2 emissions and fossil fuel use is straightforward: extracting energy from fossil carbon produces CO2.

According to the EPA, burning one gallon of gasoline releases about 8.8 kg of CO2 into the atmosphere, containing about 2.4 kg of carbon. This is the anthropogenic part of the carbon cycle; fossilized carbon (in this case petroleum refined to gasoline) is burned to produce atmospheric carbon (CO2) and energy (it’s the energy we’re after). If you burn one gallon of gasoline per day for a year, you’re converting 875 kg of fossil-fuel carbon into 3208 kg of CO2; seven eighths of a (metric) tonne of carbon is cycled from fossil fuels to the atmosphere, and the atmospheric CO2 load increases by 3.2 tonnes. If you burn 417 gallons of gas in one year, you’re releasing one tonne of carbon (as 3.67 tonnes of CO2). Back in 1988, humankind emitted just about 1 (metric) tonne of carbon to the atmosphere for every person on the planet.
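
For readers who like to check this sort of arithmetic in code, here is a minimal Python sketch of the conversions above; the only inputs are the EPA figure of about 8.8 kg CO2 per gallon and the 12:44 carbon-to-CO2 mass ratio.

```python
# Gasoline carbon arithmetic, following the numbers quoted above.
CO2_PER_GALLON_KG = 8.8          # kg CO2 per gallon of gasoline (EPA estimate)
C_FRACTION_OF_CO2 = 12.0 / 44.0  # mass of carbon per unit mass of CO2

c_per_gallon = CO2_PER_GALLON_KG * C_FRACTION_OF_CO2    # ~2.4 kg C per gallon

# One gallon per day for a year:
print(f"carbon per year: {c_per_gallon * 365:.0f} kg")        # ~876 kg (the post rounds to 875)
print(f"CO2 per year:    {CO2_PER_GALLON_KG * 365:.0f} kg")   # ~3212 kg (the post quotes 3208)

# Gallons that add up to one tonne (1000 kg) of carbon:
print(f"gallons per tonne C: {1000.0 / c_per_gallon:.0f}")    # ~417 gallons
print(f"CO2 per tonne C:     {44.0 / 12.0:.2f} tonnes")       # ~3.67 tonnes
```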

This is only one part of the carbon cycle; carbon is constantly being cycled between the earth’s systems. The atmosphere, the oceans, the soil, the biosphere, the cryosphere, even the lithosphere (the rock in earth’s crust) all take in and give off carbon in many forms, and each such exchange is part of the carbon cycle. On time scales which are geologically short but long in human terms (particularly for the last 10,000 years) the carbon cycle has been in near enough equilibrium that the carbon load in the atmosphere (mostly CO2) stayed about the same, varying between about 260 and 285 ppmv (parts per million by volume). The “pre-industrial” atmosphere had about 280 ppmv CO2, which amounts to about 596 GtC (gigatonne carbon), contained in 2190 Gt CO2 (gigatonne carbon dioxide), and these numbers had not changed by more than 10% for ten thousand years.
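
Those reservoir numbers follow from a standard back-of-the-envelope conversion between mixing ratio and carbon mass. Here is a rough sketch, using round values for the total mass of the atmosphere (about 5.15 × 10^18 kg) and the mean molar mass of air (about 29 g/mol); neither figure appears in the post itself, so treat them as approximate inputs.

```python
# Converting a CO2 mixing ratio (ppmv) into gigatonnes of carbon (GtC).
# Rough inputs: mass of the atmosphere ~5.15e18 kg, mean molar mass of air ~28.97 g/mol.
ATMOSPHERE_KG  = 5.15e18
AIR_KG_PER_MOL = 28.97e-3
C_KG_PER_MOL   = 12.0e-3

moles_of_air = ATMOSPHERE_KG / AIR_KG_PER_MOL
gtc_per_ppmv = 1e-6 * moles_of_air * C_KG_PER_MOL / 1e12   # kg converted to Gt
print(f"{gtc_per_ppmv:.2f} GtC per ppmv of CO2")            # ~2.13 GtC/ppmv

gtc_280 = 280 * gtc_per_ppmv
print(f"280 ppmv -> {gtc_280:.0f} GtC, {gtc_280 * 44.0 / 12.0:.0f} Gt CO2")
# -> roughly 597 GtC and 2190 Gt CO2, close to the 596 GtC and 2190 Gt CO2 quoted above.
```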

And as a result of the increased atmospheric CO2 concentration, the chemical equilibrium between the atmosphere and other carbon reservoirs has shifted. This means that much of the carbon emitted by human activity has passed from the atmosphere to other reservoirs, including the oceans, soil, and biosphere. Not all the carbon humans emit to the atmosphere remains there; so not only are we increasing the concentration of carbon in the atmosphere, we’re increasing it in the oceans as well. Carbon dioxide dissolves in water to form carbonic acid, which, as its name implies, is somewhat acidic. Hence increased levels of dissolved CO2 will acidify the oceans, which may have drastic consequences for many forms of sea life.

It was in the 18th and 19th centuries that humankind introduced the anthropogenic factor into the carbon cycle. We took fossil carbon, which had been sequestered for a very long time (tens of millions of years or longer), and burned it to produce energy. By doing so, we’ve upset the equilibrium of the carbon cycle. The fossil-to-atmosphere carbon flux is now about 8 GtC/yr, and the atmosphere’s CO2 concentration has increased to about 384 ppmv, its carbon load to about 818 GtC in the form of 3000 Gt CO2. This is a 37% increase over pre-industrial levels — we’ve already had a profound impact on the atmosphere’s CO2 load! If we wish to limit the increase of atmospheric CO2, we have to limit our own emissions. In fact, just to keep emissions constant we have to limit ourselves to about 1 tonne C per person per year. How much energy will a tonne of fossil-fuel carbon (417 gallons of gas) get me?
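
Before tackling that question, here is a quick sanity check of the present-day reservoir figures and the per-person arithmetic just quoted, in the same spirit as the sketch above. The world population of roughly 6.7 billion used below is an assumed round value for 2008, not a number from the post.

```python
# Present-day carbon load and the per-person emissions arithmetic.
GTC_PER_PPMV = 2.13                     # from the conversion sketched above

now, pre = 384 * GTC_PER_PPMV, 280 * GTC_PER_PPMV
print(f"384 ppmv -> {now:.0f} GtC, {now * 44.0 / 12.0:.0f} Gt CO2")        # ~818 GtC, ~3000 Gt CO2
print(f"increase over pre-industrial: {100.0 * (now / pre - 1.0):.0f}%")   # ~37%

# Per-capita emissions; the ~6.7 billion world population is an assumed round figure.
for gtc_per_year in (6.0, 8.0):         # roughly the 1988 and present global emission rates
    tonnes_per_person = gtc_per_year * 1e9 / 6.7e9
    print(f"{gtc_per_year:.0f} GtC/yr -> {tonnes_per_person:.1f} tonnes C per person per year")
# -> about 0.9 to 1.2 tonnes, i.e. "about 1 tonne C per person per year" either way.
```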

Suppose, for instance, you drive a vehicle which gets 20 miles per gallon of gas. Then the allotment of 417 gallons will take you 8340 miles over the course of the year. That’s 22.8 miles per day. If you commute to work daily, and your office is 11.4 miles away from your home, then the daily trip to work uses up all of your allotment.
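
The driving budget reduces to two multiplications; a tiny sketch, with the 20 mpg fuel economy treated as an illustrative input:

```python
# How far does the 1-tonne-C allotment take a car?  (20 mpg is an illustrative figure.)
GALLONS_PER_TONNE_C = 417
MPG = 20.0

miles_per_year = GALLONS_PER_TONNE_C * MPG
print(f"{miles_per_year:.0f} miles per year, {miles_per_year / 365.0:.1f} miles per day")
# -> 8340 miles/year, 22.8 miles/day: an 11.4-mile one-way commute uses the whole allotment.
```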

And you only get those 22.8 miles per day if you don’t use fossil-fuel energy for anything else! If you want electricity, that’s another matter entirely. Burning a gallon of gas releases 130.88 MJ (megajoules) of energy, also known as 36.4 kW-hr. Hence gasoline yields 54.53 MJ, or 15.15 kW-hr, for each kg of carbon emitted. If we could convert all of the energy in our 417 gallons of gas to useful electricity, we’d get about 15,162 kW-hr of electricity. Of course this would have to last us the whole year, so we’d get 41.5 kW-hr/day, for an average consumption rate of 1.73 kW-hr per hour (also known as 1.73 kW).

But then there’s the undeniable fact that electricity generation is far from 100% efficient. In a typical power plant, about one-third of the energy contained in the fuel is converted into electricity, while the remainder is emitted as waste heat. This means we won’t get a steady 1.73 kW from our 417 gallons of gas; we’ll only get about a third of that, roughly 0.58 kW, or 580 watts. However, the average American 2-story home uses electricity at a rate of about 2 kW (about 50 kW-hr/day, or 18,250 kW-hr/yr), so we won’t have nearly enough to power it. And that’s for a home which uses some other energy source for heating! And of course, if we use all our gas for electricity we won’t have any left for the car.

If the electric power comes from coal rather than gas, the 1-tonne emissions allotment limits us even further. Coal doesn’t produce nearly as much energy as gas for each kg carbon emitted, only about 6.7 kW-hr/kgC; gasoline produces roughly 2.25 times as much. Hence our 1-tonne C allotment will really only allow us to consume energy at an average rate of 0.26 kW (260 watts). And again, that leaves no carbon emissions left over for home heating, or powering a motor vehicle.
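
And here is the electricity budget in one place: a short sketch converting the per-kgC energy contents quoted above into average continuous power, assuming the one-third conversion efficiency of a typical plant.

```python
# Average continuous electric power from a 1-tonne-C annual allotment,
# using the per-kgC energy contents quoted above and an assumed 1/3 plant efficiency.
HOURS_PER_YEAR = 365 * 24       # 8760

def average_kw(kwh_per_kgC, efficiency=1.0 / 3.0, kg_carbon=1000.0):
    """Average power (kW) sustained over a year by burning `kg_carbon` kg of fossil carbon."""
    return kwh_per_kgC * kg_carbon * efficiency / HOURS_PER_YEAR

print(f"gasoline (15.15 kW-hr/kgC): {average_kw(15.15):.2f} kW")   # ~0.58 kW
print(f"coal      (6.7 kW-hr/kgC): {average_kw(6.7):.2f} kW")      # ~0.25 kW (the post rounds to 0.26)
# Compare with the ~2 kW average demand of a typical 2-story home quoted above.
```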

It’s abundantly clear that using fossil fuel to power our lives involves much greater carbon emissions than the paltry 1 tonne/person/year limit which would bring us back to 1988 levels. Such a limit would require tremendous sacrifices, investment, and innovation. And it probably won’t be enough. But the consequences of not limiting our emissions will be far worse.

Categories: Global Warming

104 responses so far ↓

  • Paul T // July 2, 2008 at 4:59 am

    I don’t think 1 Gt would be so much paltry…as it is excessive, but 1 tonne might be a bit of a challenge, a one tonne challenge as it were.

  • John Ramming // July 2, 2008 at 5:30 am

    First a nit, last paragraph, 1Gt/p/y should be 1t. [Response: Thanks. Fixed.]
    =
    Hansen’s 350 ppm target is just the first goal, he’s stated the range from 300 ppm to 350 ppm will contain the final goal for a healthy and stable climate. The choice of 350 as the first goal is logical, with our current 384 ppm and accelerating, pick the higher goal first, then the methods we develop to turn around our emissions can then be directed at a lower goal later when more evidence has arrived firming up the eventual goal.

  • John Ramming // July 2, 2008 at 6:24 am

    With our initial goal of 350 ppm, three necessary tactics:

    First, stop making the problem worse. Force coal out of our energy mix ASAP. The FF companies have been heavily advertising “Clean Coal”, perfect, WHEN CCS is a commercial reality new coal plants can be permitted. Let’s leave coal and oil sands in the ground for our future generations that will have the benefit of CCS.

    Second, replace the lost FF energy with non-carbon sources. Wind and solar are so close to mass installation, if we’re to subsidize anything let it be the non-FF technologies. As we force the reduction of coal use, and shift the subsidies from FF to alternative energy sources, enterprising individuals and companies will rush to fill the void where future profits are to be found.

    Third, ramp up Biomass power plants with CCS to draw down the atmospheric CO2 concentrations. Our old rule of thumb was that Biomass from agricultural residue could not be economically transported farther than 50 miles to the power plant. With the current rise in the cost of liquid fuel for transport, I expect the economic distance for ag residue biomass transport is closer to 10 miles. This dictates the need for smaller distributed power plants that have the possibility of waste heat utilization further increasing the efficiency of our energy use.

    Shifting from FF to non-carbon energy sources will increase our cost of electricity and transport fuels. Until now we’ve only paid a portion of the real cost for our FF use, our children are being handed the remainder of the bill. In California, the delta advisory group has sent our governor three recommendations: 1) all state agencies should use a sea level rise of 16 inches by 2050 as a planning requirement, 2) by 2100 a SLR of 54 inches should be used, and 3) these levels need to be frequently reviewed to incorporate the best new scientific recommendations. If we think the new cost of fuel and electricity will be too much to bear, what will be the cost of the dike under the Golden Gate Bridge to protect the inner bay and the Central Valley. If this dike needs to be built in the future, it will have to last much longer than the pyramids, possibly be built up gradually to a few hundred feet tall within our planning horizon (500 yrs) and withstand tens of thousands of years of earthquakes and tsunamis. This possible cost may also be on the bill we are handing our future generations. What will be our legacy? What will our children think of us?

    AFTER we in the US have stepped up and begun to address this very real threat and moral challenge, then we can shift our gaze to the other large players in the world (China and India). We should not be surprised to find them neck-and-neck, racing to the same end goal; they too have children.

  • George Darroch // July 2, 2008 at 8:09 am

    “Hansen apparently believes that only the most severe words will arouse public and political action. I don’t pretend to know whether his chosen approach is wisdom or folly”

    When I worked for a particular organisation, the comment used to be that the “extremists” in another organisation made them look reasonable, and got them to the negotiating table. If the others did not exist, they would be the outliers. The caveat of this is that there must be an intermediary position to engage with, otherwise they’ll go to the other side.

    I think the more strong and credible voices the better. They’ll make the less strident opinions of the IPCC and others seem eminently reasonable (which of course they are).

    I enjoyed your post and would commend you to the 2000 Watt Society, if you’re not already familiar with them. They take a very similar tack to the one you’ve just outlined.

  • Ed D // July 2, 2008 at 10:09 am

    Excellent post but one quibble: your one tonne allotment of carbon emissions needs to include not just your direct personal use for heating, cooking, transport, entertainment, etc, but also the indirect use through the manufacture and transport of any goods (including food) which you use.

    Also 6 GtC is already too much so the allotment needs to be less than a tonne. Half a tonne?

  • inel // July 2, 2008 at 10:58 am

    Dear tamino,

    Thanks for this clear summary of Cycling Carbon. Very few people in positions of power appreciate the seriousness and urgency with which emissions reductions need to happen, and what those reductions would mean in practice. I suggest we make a list of people who need to know this now, and send them your post!

    Not only is there a disturbing disconnect between those who inform and those who act upon that information; just as troubling is the gulf between scientific and economic advisers’ recommendations for safe versus appropriate ‘parts per million targets’.

    Even amongst world-class experts in different fields of discipline—such as Dr. James Hansen on the science of climate change and Sir Nicholas Stern on the economics of climate change—the gulf between their statements about target atmospheric concentrations of carbon dioxide is striking.

    Only last week, the end of June 2008, both advisers gave relevant figures that should, one way or another, feed in to climate policy discussions. Speaking about recommended concentrations of carbon dioxide in the atmosphere, Hansen summed up the science and Stern the economic cost. Look at the difference:

    Top NASA scientist James Hansen said on Monday June 23, 2008 in Global Warming Twenty Years Later: Tipping Points Near:

    “the safe level of atmospheric carbon dioxide is no more than 350 ppm (parts per million) and it may be less. Carbon dioxide amount is already 385 ppm and rising about 2 ppm per year.”

    By contrast, that very same week, in The Guardian on Thursday June 26, 2008 leading economist Nicholas Stern warned that the cost of tackling climate change has doubled:

    “To get below 500 ppm … would cost around 2% of GDP.”

    So, which is it? Below 350 ppm or below 500 ppm?

    And nobody is screaming about that disconnect!

    Personally, I would listen to Hansen for reliable safe ppm and ask Stern to price the investment needed to get us below 350 ppm. But I fear that policymakers will hear Stern, run with 500 ppm, and miss!

  • J // July 2, 2008 at 12:35 pm

    What a great post, Tamino. This just lays it all out nicely and simply.

    I do have one comment, about your conclusion:

    It’s abundantly clear that using fossil fuel to power our lives involves much greater carbon emissions [than the] limit which would bring us back to 1988 levels. Such a limit would require tremendous sacrifices, investment, and innovation. And it probably won’t be enough. But the consequences of not limiting our emissions will be far worse.

    Sometimes people look at the difficulty of limiting total world CO2 emissions to some “target” (say, the levels of 1988 or 2000 or today), and say “That would require immense efforts, and it still might not be enough to prevent the climate from changing” … and then use that difficulty as a justification (consciously or unconsciously) for doing nothing.

    But this really is a case where something is better than nothing. We have to stop somewhere! If not at 350 ppmv, then 380, or 400, or 425. But if we shrug our shoulders and do nothing, we’ll blow past 560 ppmv and god help our children and grandchildren (and the rest of the planet).

  • andy // July 2, 2008 at 1:02 pm

    Do you have a high level of confidence in what Hansen can do himself, e.g. making the GISS temperature calculations and adjustments as accurate and transparent as possible, and not e.g. “rewriting the history”?

  • John L. McCormick // July 2, 2008 at 1:26 pm

    That was the most concise, efficient and effective way to describe our future energy dilemma as I have ever read. Five Stars.

    And, it begs a far more serious consideration of advanced nuclear power and particularly pebble-bed and other passive safety-designed high temperature reactors fueled with thorium

    Ralph Nader is on his last legs politically and we all must look beyond his luddite dictates.

    Our children must have energy options we are not ready to consider or fund but that must also change.

    Running a household on one ton of carbon will be the greatest challenge the next generation - our children - will have to meet and overcome. If nuclear energy is not a part of their mix of supply options for electric power there will be no hope they can achieve that goal.

    John McCormick

  • BBP // July 2, 2008 at 1:32 pm

    If Global Warming is one 800 pound gorilla in the room, Peak Oil may very well be another. We may face dramatic decreases in fossil fuel use (and I think this is fairly likely) whether we want to or not. To me the question seems to be how we react when we realize that we are running out of cheap oil - do we try to shift to renewables and nuclear, or do we try to hang on to our ‘carbon lifestyle’ by any means possible. If we choose the latter I suspect we may be able to keep things going just long enough to ensure dramatic climate changes. Then we’ll be faced with the perfect storm of significant climate change, dramatically declining fossil fuels, and an infrastructure and society that depends on fossil fuels.

  • Patrick Hadley // July 2, 2008 at 2:15 pm

    If you put a moving average on your temperature graph, whether of 10, 20 or 30 years you see that when Hansen made his address to Congress in 1988 there had been a significant positive smoothed trend for about 10 years. There was no long term trend in temperature in the period between 1948 and 1978.

    Yet Hansen was confident enough after just 10 years of warming trend to make his predictions of climate in the future, and to make a great deal of the statistical significance of a ten year long trend in temperatures. Why did someone not tell him that climate change is defined as happening over a 30 to 50 year period, and that 10 years is far too short to make any long term conclusions?

  • Zeke Hausfather // July 2, 2008 at 2:54 pm

    Nice way to put things in perspective.

    The challenge lies in deciding the severity of mitigation demanded by the magnitude of the problem. It’s obvious that there is some point where the cost of mitigation exceeds the societal cost of climate change. For example, paying 1 to 3 percent of world GDP to limit atmospheric concentrations to 450 ppm CO2 (slightly over 500 ppm CO2e) probably makes sense. Spending 10 percent of the world’s GDP (assuming diminishing returns to mitigation expenditures) to reduce concentrations to 350 ppm CO2 (400 ppm CO2e) is less clearcut. It’s really a question of weighing contemporary costs against future risk, which in general is not one of our strengths as a society.

    On an unrelated note, I presume Hansen is referring to 350 ppm CO2 and not CO2e. We really need to start referring to everything in terms of forcings, because the differing baselines and units seem to breed an enormous amount of confusion.

    Oh, and Tamino, welcome back. Hopefully this is the start of another prolific posting season.

  • Joseph // July 2, 2008 at 3:22 pm

    Worse, levels are still rising and show no sign of slowing down

    A third-order polynomial forecast suggests they could slow down a little, but not much. Of course, polynomial forecasts are not terribly accurate.

    It makes sense, though, that if population growth starts to slow down, the price of oil keeps rising, and we have hit or are about to hit peak oil, the trend will slow down.

    It’s still bad, though. I’ve seen the data. The relationship between CO2 concentration and temperature (8 years later) is linear for the last 50 years. It doesn’t look like it could be anything other than linear. My impression is that it’s not that hard to forecast what the situation will be like in a couple decades.

  • Ray Ladbury // July 2, 2008 at 3:48 pm

    Patrick Hadley asks why, based on only 10 years of consistent warming, Hansen was confident enough to make predictions.

    Answer: Because his prediction was not based on the temperature record at all, but rather upon the well understood physics of greenhouse gasses. The 10 years of warming merely provided assurance that something was not horribly amiss with the models. If there is one single robust result in climate science it is this: Add CO2 and it gets warmer than it would have been otherwise. Pretty fundamental physics there.

  • J // July 2, 2008 at 3:54 pm

    Patrick Hadley writes:

    Yet Hansen was confident enough after just 10 years of warming trend to make his predictions of climate in the future, and to make a great deal of the statistical significance of a ten year long trend in temperatures. Why did someone not tell him that climate change is defined as happening over a 30 to 50 year period, and that 10 years is far too short to make any long term conclusions?

    During the 30 years leading up to Hansen’s 1988 testimony, the temperature trend was +0.7C/century. During the past 30 years, it’s +1.7C/century.

    I can’t speak for Hansen, but my guess is that if the only reason for concern in 1988 had been observation of slightly rising temperatures, he never would have gone before Congress. It was the combination of insights from basic physics, results from climate models, AND observational data that raised his concern.

    The evidence is overwhelming. We should have acted on this a long time ago.

  • Ken // July 2, 2008 at 4:09 pm

    Patrick - My own analysis of the data finds your claim in error. The 30-year GISS temperatures (1958 - 1987) show a clear 0.10C/decade rising trend. The 40-year (1948 - 1987) shows a 0.07C/decade rising trend. And a 10-year running average (up to 1982) shows a clear (and sharp) turn up at 1972. Statistically, this isn’t anything at all similar to the past 10 years.

    Hansen was clearly seeing the climate trend rise above the noise by 1988.

  • Hank Roberts // July 2, 2008 at 4:57 pm

    …. ocean pH ….

  • David B. Benson // July 2, 2008 at 7:25 pm

    I’m reasonably sure that Hansen means CO2e with regard to ‘no more than 350 ppm’. I’ll fairly strongly suggest that in the long term it needs to be no more than 260–280 ppm, or else the Greenland Ice Sheet will continue to melt away.

  • steven mosher // July 2, 2008 at 8:31 pm

    99 percent certain. does the IPCC agree with that or is that a non consensus view? Honest question.

    cite frog, hank roberts. hop to it!

  • Hank Roberts // July 2, 2008 at 8:45 pm

    > cite frog, hank roberts. hop to it!
    You’re mistaken. Clean up after yourself.

  • Dave Andrews // July 2, 2008 at 9:13 pm

    Do any of you guys, including Hansen, live in the REAL WORLD?

    China, India and other rapidly developing countries are not going to stop their economic growth because some climate scientist from the US says they should.

    Sure they would like some help from the West. If the UK , for example stopped all CO2 production today that would give China about 2 years breathing space before it more than made up the difference.

    Hansen’s polemic is aimed squarely at the US political process, it is political advocacy and NOTHING to do with rigorous climate science

  • Zeke Hausfather // July 2, 2008 at 9:28 pm

    Dave,

    You seem to be arguing that, because roughly half of current emissions (and increasing rapidly) come from developing countries, working to reduce emissions in developed countries is useless.

    It should be obvious that this is a rather silly argument. Think of it this way: would countries like China or India be more or less likely to adopt binding targets or timetables if the U.S. and the rest of the developed world took the lead?

  • chriscolose // July 2, 2008 at 10:31 pm

    Dave,

    I don’t understand the point about blaming China for anything. What causes climate change is not today’s emissions, but atmospheric concentrations. Most of the CO2 in the atmosphere today is from the UK and United States, and that surpasses China on a per capita basis by an order of magnitude. We actually need to do something, and so do other developed nations.

  • Hank Roberts // July 2, 2008 at 11:52 pm

    http://www.wri.org/stories/2008/02/chinese-cement-companies-account-co2-emissions

  • cce // July 3, 2008 at 12:38 am

    If scientist A is convinced 99.9%, and Nation B is “only” convinced 90.1%, the consensus is “greater than 90%”

  • Duane Johnson // July 3, 2008 at 3:42 am

    Chriscolose said:

    “I don’t understand the point about blaming China for anything. What causes climate change is not today’s emissions, but atmospheric concentrations. Most of the CO2 in the atmosphere today is from the UK and United States…”

    Most of the CO2 in the atmosphere today is from natural sources. To the extent that it matters, only future emissions need be a concern, unless you want to play a blame game and enjoy self loathing.

  • Hank Roberts // July 3, 2008 at 4:11 am

    Duane, you’re right that Chris left off the words ‘from fossil sources’ — which I expect you’re aware of. You’re posting a PR talking point, whether you realize it or not, and one that’s outdated because wilful ignorance really has been dropped from the list of opposition positions except by a very few people. The new PR position is to take responsibility for past behavior. Why? Market forces require it.

    http://www.insnet.org/ins_press.rxml?id=3982&photo=

  • EliRabett // July 3, 2008 at 4:24 am

    There are places and people actually building stuff to reach a 2000 W society. That is about where we have to be per capita to decarbonize, and it need not lead to a loss of comfort in the Western world.

    The truth is that China and India and the rest of the developing world are well below this limit. The truth is that those who complain about China and India’s emissions have no sense of shame for their own behavior.

  • Ray Ladbury // July 3, 2008 at 10:02 am

    cce, You’ve got it wrong. If the relevant scientific community is 99.9% convinced, and they manage to convince 99% of their peers in related disciplines that they are correct, AND nation A is only 10% convinced, the consensus is >99%, because ignorant food tubes should not get a vote on scientific matters.

  • Barton Paul Levenson // July 3, 2008 at 12:09 pm

    Dave Andrews writes:

    China, India and other rapidly developing countries are not going to stop their economic growth because some climate scientist from the US says they should.

    Nobody said they should stop their economic growth. Where did you get that idea? All anybody has said is that we need to switch from fossil fuels to other power sources.

  • dhogaza // July 3, 2008 at 2:25 pm

    Hansen’s polemic is aimed squarely at the US political process, it is political advocacy and NOTHING to do with rigorous climate science.

    He wasn’t asked to do “rigorous climate science”. This was, after all, a committee meeting in Congress, who asked him for his advice regarding policy. Your comment’s so lame it doesn’t even rise to the level of strawman.

  • John Finn // July 3, 2008 at 3:17 pm

    Wouldn’t now be an appropriate time to assess how well the Hansen model is performing? I recall JH (writing in 1998) saying that the model was ok but it would be another decade before a proper evaluation could be done.

    My own view was that the similarity of the trends (observed v model) was an artifact of the timing of the volcanic eruptions (1991 v 1995).

    Anyway I’d like to see a proper analysis and find out how model predictions could or have been
    improved.

    This is slightly tongue-in-cheek but there is a serious point.

  • Hank Roberts // July 3, 2008 at 5:08 pm

    John Finn, try these links for help with that very question:
    http://www.google.com/search?num=100&hl=en&newwindow=1&safe=off&q=%22john+finn%22+Hansen+climate+model&btnG=Search

  • luminous beauty // July 3, 2008 at 6:29 pm

    The question of corporate liability for malfeasance vis~a~vis knowingly promoting disinformation regarding climate change is not just a matter of idle speculation. It is becoming a real legal issue:

    Today, we’re going to look at the rapidly growing field of global warming litigation. I’m joined here in Aspen, Colorado by the attorney Stephen Susman. He’s the founding partner of the law firm Susman Godfrey. Earlier this year, he helped file a groundbreaking lawsuit on behalf of 400 villagers in the Alaskan town of Kivalina. They’re being forced to relocate because of flooding caused by global warming.

    The suit accuses twenty oil, gas and electric companies of being responsible for emitting millions of tons of greenhouse gases, causing the Arctic ice to melt. Companies named in the suit include ExxonMobil, Chevron, BP, ConocoPhillips and Peabody. The suit also accuses eight of the corporations of being involved in a conspiracy to mislead the public about the causes of global warming.

  • David B. Benson // July 3, 2008 at 6:34 pm

    Duane Johnson // July 3, 2008 at 3:42 am — Actually the current excess CO2, due to human activities so far, does matter.

    (1) Oceans continue to acidify.
    (2) Greenland Ice Sheet, etc., continue to melt.

    That’s why Hansen has suggested an upper limit of 350 ppm.

  • Duane Johnson // July 3, 2008 at 7:17 pm

    David B. Benson//July 3, 2008 at 6:32 pm

    My point was that it’s only future CO2 emissions that anything can be done about, if it is necessary. Feeling shame, as Eli suggests, doesn’t remove CO2 from the air. But I forgot, according to Hank Roberts, it’s a matter of PR. I find it sad that PR trumps Science.

  • David B. Benson // July 3, 2008 at 9:22 pm

    Duane Johnson // July 3, 2008 at 7:17 pm — There are understood processes for removing carbon from the active carbon cycle. I estimate this can be done for about $135 per tonne of carbon.

    So far, I have been unable to interest anybody in funding such; about $670 billion per year would lower the CO2 to around 315 ppm before the century is out.

  • Dave Andrews // July 3, 2008 at 9:51 pm

    I wasn’t blaming anything on China, just pointing out that the world is not actually how Hansen would like it to be.

    His recent upper limit of 350ppm is frankly ridiculous as we have already passed that and there is no way, realistically, emissions are going to come down in the near, or even medium term.

    So what is driving him? Well there is a local matter of a US election. Obama’s “Climate Tsar” or Gore’s “Deputy” anyone?

  • David B. Benson // July 3, 2008 at 10:02 pm

    Dave Andrews // July 3, 2008 at 9:51 pm — Hansen is concerned about Antarctica melting sooner than expected:

    http://en.wikipedia.org/wiki/West_Antarctic_Ice_Sheet

    is of the (much) greater concern.

  • Hank Roberts // July 3, 2008 at 10:18 pm

    … ocean pH …

  • Hank Roberts // July 3, 2008 at 10:31 pm

    > no way

    You may very well be correct:
    http://dotearth.blogs.nytimes.com/2008/05/29/nobel-winner-co2-going-to-1000-parts-per-million/

    “… I asked Dr. Rowland two quick questions. The first: Given the nature of the climate and energy challenges, what is his best guess for the peak concentration of carbon dioxide?

    His answer? ‘1,000 parts per million,’ he said.

    My second question was, what will that look like?

    ‘I have no idea,’ Dr. Rowland said. He was not smiling. ”
    ———-

    Peter Ward has looked at the past for that:

    “… to the end of the Triassic, as a guide to what atmospheric carbon of 1,000 ppm (a concentration we will hit within the century if we don’t change our ways) …”
    http://www.worldchanging.com/archives/006386.html

    and

    http://www.astrobio.net/news/article2553.html

    astrophysicist Neil deGrasse Tyson interviewed University of Washington paleontologist Peter Ward. They discussed Ward’s latest book, “Under a Green Sky,” which explores extinctions in Earth’s past and predicts extinctions to come in the future.

    Under a Green Sky, a new book by Peter Ward.
    ‘If you look at the fossil record, it is just littered with dead bodies (from past catastrophes),’ Ward says in the interview. He says that only one extinction in Earth’s past was caused by an asteroid impact – the event 65 million years ago that ended the age of the dinosaurs. All the rest, he claims, were caused by global warming.

    http://www.astrobio.net/podcast/TysonFreeFM971.mp3

    Those of you who believe only in the future, not the past, and only in the free market, not ecology, don’t bother to look or listen. You know what you’re going to think, already.

    Those with ears to hear …. ocean pH is changing faster than global climate, is going far outside what has been livable within this century, and is well established as a risk of the flip into conditions Dr. Ward has described.

    Each past great extinction but one shows up as an anoxic black shale in ocean sediments — an episode worth understanding, if we are to avoid adding another such layer to Earth’s history by our own actions.

    Short answer: don’t be fucking stupid, read the strata and understand why life on Earth has almost died several times from too much warming too fast.

  • David B. Benson // July 3, 2008 at 11:00 pm

    Hank Roberts // July 3, 2008 at 10:18 pm — Yes, but I haven’t noticed Hansen mentioning ocean pH for his ‘no more than 350′ statement.

  • Hank Roberts // July 3, 2008 at 11:11 pm

    And yes, if you’re a ‘Gaia is Mother Earth’ nitwit you are really upset after listening to Peter Ward.

    Look it up. Mother Earth is not your friend.
    http://www.news.cornell.edu/Chronicle/98/8.6.98/Broecker.html

  • Hank Roberts // July 3, 2008 at 11:39 pm

    > I haven’t noticed

    I’m not surprised.

    Avert your eyes, or change your story:

    James Hansen:
    —–excerpt follows——-

    Tomorrow I will testify to Congress about global warming, 20 years after my 23 June 1988 testimony, which alerted the public that global warming was underway. ….

    … Now, as then, frank assessment of scientific data yields conclusions that are shocking to the body politic. Now, as then, I can assert that these conclusions have a certainty exceeding 99%.

    The difference is that now we have used up all slack …. commensurate with our responsibility for the present dangerous situation…..

    Coral reefs, the rainforest of the ocean, are home for one-third of the species in the sea. Coral reefs are under stress for several reasons, including warming of the ocean, but especially because of ocean acidification, a direct effect of added carbon dioxide. Ocean life, dependent on carbonate shells and skeletons, is threatened by dissolution as the ocean becomes more acid.

    … A level of no more than 350ppm is still feasible, with the help of reforestation and improved agricultural practices, but just barely …..

    —–end excerpt—–

  • Hank Roberts // July 4, 2008 at 12:11 am

    http://www.geotimes.org/oct06/feature_Geocatastrophes.html

    Carbon cycling — rate of change — catastrophes:

    ———excerpt follows——-

    … Geologic temperature proxies suggest that this rapid warming at the Paleocene-Eocene boundary occurred over a period of 10,000 to 20,000 years in association with a large change in the global carbon cycle. Surface temperatures increased by as much as 5 degrees Celsius in the tropics, and 10 degrees Celsius in high latitudes, then gradually returned to warm background levels over the next 100,000 years. At no other time during the last 65 million years do we have evidence for such a rapid change in temperature.

    Concern over presently rising levels of greenhouse gases and their effects on future climate has driven researchers to … search of clues to the trigger of the Paleocene-Eocene warming event. They hope that an understanding of the causes and consequences of this ancient warming will provide insight into our future.

    … Coincident with this “excursion” of carbon-12 in the carbon isotope record and the inferred increase in deep-sea temperatures at the PETM, the fossil record suggests there was a large decrease in the diversity and abundance of deep-dwelling creatures. …

    Subsequent sediment cores collected from around the world during the past 15 years have supported the initial data … (see Geotimes, August 2006). The presence of the subtropical dinoflagellate, Apectodinium, across the PETM in the Arctic core, suggests that this region warmed considerably, from a background temperature of 18 degrees Celsius in the Late Paleocene to more than 23 degrees Celsius during the PETM.

    … So far, the Bighorn Basin has yielded the only plant fossils from the PETM, and the fossils suggest that some types of tropical plants could have extended their ranges as far as 1,500 kilometers northward in response to the warming event.

    Methane belches
    So what is the culprit in this ancient global warming event? …

    The light carbon isotope ratio found both in the ocean and on land during the PETM is the key to understanding the causes of the rapid changes. Whatever triggered the warming involved the release of large quantities of light carbon into the ocean-atmosphere system. …

  • Ray Ladbury // July 4, 2008 at 12:59 am

    Dave Andrews, Physics does not depend on politics or even on what is convenient or easy. Hansen is concerned that at levels above 350 ppmv, we could be talking about a very different and unpredictable climate–and one where positive feedbacks such as ghg emissions from natural sources, melting ice caps, etc. ensure we have zero control over the path the climate takes. And ultimately, there is the threat of turning the oceans into a stinking soup of bacteria emitting H2S–which caused a mass extinction the last time it happened. These are all credible threats. They’ve happened before–just not when we had a world already straining under the burden of 9-12 billion humans.

  • steven mosher // July 4, 2008 at 1:32 pm

    Sorry Hank, I’m not finding anything in the IPCC about the 350ppm limit. And I’ve seen the error spread in the 20 or so models, so I’m not seeing how 350 gets justified as a 99% certain fact.
    The fact that the value of 350 happens to coincide with the level in 1988, when the alarm was sounded, makes for an interesting “i told you so”
    but I’m not seeing a consensus forming around that number. So, honest question. cites from other studies? come on froggy jump to it

  • steven mosher // July 4, 2008 at 2:47 pm

    Lilypads for the frog

    http://www.dailymail.co.uk/sciencetech/article-1031438/Pictured-The-floating-cities-day-house-climate-change-refugees.html

  • dhogaza // July 4, 2008 at 3:21 pm

    come on froggy jump to it

    Do you really expect that being insulting will motivate Hank to help you overcome your own ignorance and bias?

  • Ray Ladbury // July 4, 2008 at 3:38 pm

    Steven Mosher, I would liken the 350 ppmv to an engineering judgment–like a maximum weight for a bridge. If you look at the uncertainties on CO2 forcing and then figure out the confidence level you’d like to have on the survival of civilization, it would be risky to go much above this level for an extended period. Keep in mind that Hansen feels there is significant additional warming “in the pipeline” and that significant positive feedbacks, not all in the models may kick in soon. I don’t think that it is possible to do a rigorous determination of a “safe” CO2 level at this point, although a Bayesian analysis of expert opinion would be interesting. Just a guess, but I think Hansen would likely be on the conservative side, though not wildly so. I think the median would be in the 400-425 ppmv range, the low at preindustrial levels (e.g. Lovelock) and the high end 475. I don’t think you find many experts (and I mean people who really understand climate) above 500.

  • Paul Middents // July 4, 2008 at 3:57 pm

    Re mosher:

    I don’t see Hansen claiming 99% certainty for the 350 ppm limit. What he does claim 99% certainty for is “what is understood about global warming by the relevant scientific community “.

    The key word is “relevant”. To some the opinions of Lindzen, the Pielkes, Soon, Baliunas, Singer, Courtney, Monckton, McIntyre, McKitrick, et al are relevant. To others, they aren’t.

  • Ray Ladbury // July 4, 2008 at 5:03 pm

    Steven Mosher, I suppose that the degree to which one appreciates Hank’s posts depends on how well integrated one is to the reality-based community, since facts are the constructs by which we illustrate reality. Since you are not so tightly tethered, you have little need or respect for them. I on the other hand would argue that Hank does an excellent job of anchoring the discussion in some semblance of reality–a good thing in my opinion, if not in yours.

  • David B. Benson // July 4, 2008 at 7:58 pm

    steven mosher // July 4, 2008 at 1:32 pm — Hansen, starting in late 2007 or early 2008 started stating ‘350 ppm’. He recently stated ‘300 to 350 ppm’ and ‘not more than 350 ppm’.

    Even 300 ppm is unsustainably large, IMO. Even if attained tomorrow, in the long term glaciers and GIS would continue to melt (although much more slowly than now). Possibly even 280 ppm is too large. For sure 260 ppm is safe.

    And these are all CO2e levels.

  • Dave Andrews // July 4, 2008 at 10:37 pm

    Guys,

    You’re being very US centric here. CO2 levels are over 380ppm already, they are going to rise a lot further whatever you may wish. (Quite apart from the arguments about whether it matters much at all
    :-) )

    Hansen may even understand this, which is why he is manoeuvring for a political position in a Democrat administration

  • David B. Benson // July 5, 2008 at 12:07 am

    Dave Andrews // July 4, 2008 at 10:37 pm — 387 ppm. Dangerously high:

    http://en.wikipedia.org/wiki/West_Antarctic_Ice_Sheet

  • cce // July 5, 2008 at 2:08 am

    Ray,

    That would be great, but unfortunately, ignorant food tubes approve the wording (along with the lead authors) of the Summary for Policymakers, so the official language of the IPCC has to be made to reflect the opinions of all parties, even if the relevant scientists are more convinced.

  • Ray Ladbury // July 5, 2008 at 2:14 am

    Dave Andrews, the fact that you see everything through a political lens says more about you than it does about what you are seeing. Do you think David Lovelock is angling for a position in the Liberal government? Did it ever occur to you that you are just being told what the evidence says?

  • dhogaza // July 5, 2008 at 7:17 am

    Hansen may even understand this, which is why he is manoeuvring for a political position in a Democrat administration

    This, from a man who claims climate science is, in essence, pure speculation.

  • John Finn // July 5, 2008 at 10:34 am

    Re: Hank Roberts // July 3, 2008 at 5:08 pm

    “John Finn, try these links for help with that very question:”

    Hank

    Many of the posts you link to are several years old. Hansen himself said (in 1998) it would take another decade before his model could be properly evaluated.

    This is not just a dig at the model’s over-estimation of the temps of even scenario C.

    There are other issues such as time lag with respect to scenarios B and C.

  • George Tobin // July 5, 2008 at 1:42 pm

    What is the source data for the temp graph?

    [Response: GISS (Goddard Institute for Space Studies). For a comparison to other estimates of global mean surface temperature, look here.]

  • Hank Roberts // July 5, 2008 at 3:42 pm

    John, those point to other times you’ve posted the same question; the answers are there.

  • Tenney Naumer // July 5, 2008 at 4:11 pm

    Patrick Hadley, gavin over at realclimate answered your question about short-term trends:

    “Hansen did not make his points because of a short-term trend in temperatures but because the long term trends were a match to the expectation he had from the physics. And he was right. - gavin”

  • Tenney Naumer // July 5, 2008 at 4:15 pm

    Dave Andrews,

    You can use BIG CAPS all you want, but that does not change the fact that we need to get below 280 ppm, meaning that we have to become carbon neutral and develop the technology to remove CO2 from the atmosphere.

  • John Lederer // July 5, 2008 at 4:59 pm

    I find it almost impossible to square this graph in this article:
    http://tamino.files.wordpress.com/2008/07/giss1988.jpg

    with this graph:

    http://wattsupwiththat.files.wordpress.com/2008/07/uah_june_082.png

    I realize the data sources are different– one is satellite, and one is (I think) GISS post adjustment land-sea record. They use different bases to calculate the anomaly from.

    But still the difference is so great that it begs for an explanation. Anyone want to try one?

    .

  • Chris Colose // July 5, 2008 at 5:55 pm

    One of the large problems in a conversation like this is the ill-defined and generally subjective use of words like “tipping point” or “too much warming” and when to say enough is enough. Is 1 C rise acceptable? How about 2 C or is that pushing it? Is the loss of summer arctic sea ice a “tipping point?”

    I think most people are agreed that a rise of 2 C would represent a substantial interference with the climate and ecological systems. Assuming a climate sensitivity of 0.75 C/W/m^2 (and working with just CO2 ) then it takes about 360 ppmv relative to a 280 ppmv baseline to give a 1 C rise after equilibrium, and about 461 ppmv to get up to 2 C at equilibrium. Those numbers will be lower when you factor in all the other anthropogenic factors which are net positive, and declining aerosols give an even bigger reason for concern.

    I don’t necessarily see the reason to get back to pre-industrial conditions, but I don’t want to see 2 C of warming either.

  • nanny_govt_sucks // July 5, 2008 at 6:44 pm

    we need to get below 280 ppm,

    Good luck with that one. What is your plan, have everyone move back to caves, use whale oil lamps, and then release velociraptors in China to decimate their population?

    Come back to Earth, Tenney. Look at the greening all around you. Move away from the coast, if you think rising seas will be a problem. Try something more manageable like getting fresh water to most of Africa.

  • Hank Roberts // July 5, 2008 at 7:18 pm

    John Lederer, just a guess, I think you may be comparing the same records discussed here:
    http://lwf.ncdc.noaa.gov/oa/climate/research/2008/may/may08.html

    U.S. Has 36th Coolest Spring on Record …
    Global Temperature Ranked 8th Warmest … for May, 7th Warmest on Record for Spring

  • tamino // July 5, 2008 at 7:42 pm

    John Lederer:

    The graph in this article is of annual average temperature, the graph in your link is monthly average. The graph in this article covers a span of nearly 130 years, the graph in your link covers a bit less than 30 years time span.

    If you want a direct comparison of GISS annual average surface temperature to satellite *annual average* surface temperature, for the period for which we have data for both, it’s here. For a direct comparison of GISS and satellite *monthly* average temperature, for the period for which we have data for both, it’s here.

    And by the way: Anthony Watts and his contributors are grotesquely incompetent at data analysis. Just in case any of his “fans” are reading this, I’ll repeat: as a data analyst, Anthony Watts and his collaborators are the most grossly incompetent data analysts I’ve ever seen anyone take seriously. It’s truly pathetic. If you want to know how trustworthy his analyses are, read this and this and this and this and this and this.

  • Dave Andrews // July 5, 2008 at 7:44 pm

    Ray,

    It is not I who sees everything through a political lens but James Hansen in his recent pronouncements. And it is those pronouncements that I am responding to.

    Tenney,

    280ppm, “you cannot be serious!”

  • Dave Andrews // July 5, 2008 at 8:25 pm

    OK, let’s take a few steps back.

    Hansen et al publish a preprint which says 350ppm is the dire limit.

    Has this paper been accepted for publication? You, with your better connections may know this but I can’t ascertain that it has. So we have a non peer reviewed paper, with no supporting backup in the scientific literature, and people are taking it as gospel. Why is that? Faith in the leader?

  • David B. Benson // July 5, 2008 at 8:26 pm

    Chris Colose // July 5, 2008 at 5:55 pm — The data point I usually cite is for 1958 CE, with CO2 at 315 ppm. At this time the Swiss glaciers were retreating at 4 m/y.

    AFAIK, nobody knows just what GIS was doing then, but if losing mass, then this also increases sea level. After enough sea level rise and with a most modest increase in sea temperature

    http://en.wikipedia.org/wiki/West_Antarctic_Ice_Sheet

    will partially collapse, raising sea levels a few meters in a short time. That’s a catastrophe.

  • mndean // July 5, 2008 at 8:42 pm

    Dave Andrews,
    This coming from a man who uses the term “Democrat administration”. Those who are unaware may not know what you mean, but I certainly do. If you want to be taken seriously as politically impartial, you just blew it big time.

  • Hank Roberts // July 5, 2008 at 9:48 pm

    Nanny, of course, does understand the problem for the oceans. But since the ocean is a communist, it deserves to die. Right?
    http://www.sida.se/sida/jsp/sida.jsp?d=824&a=25763&language=en_us

  • george // July 5, 2008 at 11:09 pm

    John Finn says:

    “Hansen himself said (in 1998) it would take another decade before his model could be properly evaluated.

    This is not just a dig at the model’s over-estimation of the temps of even scenario C.”

    Well,
    Hansen himself re-evaluated the model performance in a paper published in 2006 and Gavin Schmidt also provided an updated assessment of Hansen’s 1988 Projections just a little over a year ago (May 2007)

    Here’s Schmidt’s “bottom line”:

    The bottom line? Scenario B is pretty close and certainly well within the error estimates of the real world changes. And if you factor in the 5 to 10% overestimate of the forcings in a simple way, Scenario B would be right in the middle of the observed trends. It is certainly close enough to provide confidence that the model is capable of matching the global mean temperature rise!

    But can we say that this proves the model is correct? Not quite. Look at the difference between Scenario B and C. Despite the large difference in forcings in the later years, the long term trend over that same period is similar. The implication is that over a short period, the weather noise can mask significant differences in the forced component. This version of the model had a climate sensitivity of around 4 deg C for a doubling of CO2. This is a little higher than what would be our best guess (~3 deg C) based on observations, but is within the standard range (2 to 4.5 deg C). Is this 20 year trend sufficient to determine whether the model sensitivity was too high? No. Given the noise level, a trend 75% as large, would still be within the error bars of the observation (i.e. 0.18+/-0.05), assuming the transient trend would scale linearly. Maybe with another 10 years of data, this distinction will be possible. However, a model with a very low sensitivity, say 1 deg C, would have fallen well below the observed trends.

    As Schmidt points out, the best way to assess the model performance is to compare the projected trends to the observational (measured) trend. As he also indicates, one needs to be very careful about simply comparing specific years:
    Schmidt:

    As mentioned above, with a single realisation, there is going to be an amount of weather noise that has nothing to do with the forcings. In these simulations, this noise component has a standard deviation of around 0.1 deg C in the annual mean. That is, if the models had been run using a slightly different initial condition so that the weather was different, the difference in the two runs’ mean temperature in any one year would have a standard deviation of about 0.14 deg C, but the long term trends would be similar. Thus, comparing specific years is very prone to differences due to the noise, while looking at the trends is more robust.

    From 1984 to 2006, the trends in the two observational datasets are 0.24+/- 0.07 and 0.21 +/- 0.06 deg C/decade, where the error bars (2\sigma) are derived from the linear fit. The ‘true’ error bars should be slightly larger given the uncertainty in the annual estimates themselves. For the model simulations, the trends are for Scenario A: 0.39+/-0.05 deg C/decade, Scenario B: 0.24+/- 0.06 deg C/decade and Scenario C: 0.24 +/- 0.05 deg C/decade.

  • nanny_govt_sucks // July 6, 2008 at 12:17 am

    But since the ocean is a communist, it deserves to die. Right?

    Well, it is the UN which prevents private ownership of most of the ocean, so THEY are the “communists” and yes, the UN should die. Peacefully, of course.

  • cce // July 6, 2008 at 3:15 am

    I have a history of Hansen’s ‘88 scenarios and subsequent controversy here:

    http://cce.890m.com/?page_id=23

  • Petro // July 6, 2008 at 5:54 am

    Nanny, your libertarian fantasyland may work with communities of 50 - 200 puritans. In the real, global world of 6.5 billion people and counting, there is an evident need for organisational governing structures like national governments, the EU and the UN.

  • dhogaza // July 6, 2008 at 8:20 am

    Well, it is the UN which prevents private ownership of most of the ocean…

    That is so laughably ignorant that it deserves to stand by itself.

    As though the ocean were open to private ownership or even national ownership before the UN was created … pffft.

  • dhogaza // July 6, 2008 at 8:22 am

    Having one nation declare sovereignty over the ocean would yield a lot for future military historians to write about, though!

  • John Finn // July 6, 2008 at 1:11 pm

    Re: Hank Roberts // July 5, 2008 at 3:42 pm

    ” John, those point to other times you’ve posted the same question; the answers are there.”

    No, they don’t. I cannot find any which request an evaluation of the Hansen projections. And there’s certainly nothing which relates to the time-constant lag in scenario C (no ghg growth after 2000), which appears to be ~5 years, i.e. exactly the figure Stephen Schwartz obtained in his much-criticised study.

    I can understand that you (and others) may be reluctant to discuss the Hansen projections, and I respect that, but I wish you’d be more honest and admit that they have failed to represent reality.

    Remember, it was Hansen - not me or anyone else - who asked that we give the models 20 years to show their worth.

  • dhogaza // July 6, 2008 at 1:54 pm

    John Finn will undoubtedly reach a sudden understanding of, and belief in, weather variability when the next El Niño kicks in …

  • george // July 6, 2008 at 2:27 pm

    cce:

    That’s a nice summary of the history.

    I would offer only one caveat. I think it is important to be careful about making statements like

    “although scenario B matches warming over land, it exaggerated warming over the land and ocean by about 21%.”

    While you correctly pointed out that some of the apparent difference (based on a comparison of the central trend values alone) could be attributed to the fact that Hansen used a climate sensitivity of 4C rather than 3C, I think one has to be careful about the “overestimate” claim itself.

    The reason for this is that all of the trends (for the scenarios AND the observational data) have uncertainties attached to them.

    Gavin Schmidt gives the trends with the associated (2-sigma) uncertainties (also quoted above):

    From 1984 to 2006, the trends in the two observational datasets are 0.24+/- 0.07 and 0.21 +/- 0.06 deg C/decade, where the error bars (2\sigma) are derived from the linear fit. The ‘true’ error bars should be slightly larger given the uncertainty in the annual estimates themselves. For the model simulations, the trends are for Scenario A: 0.39+/-0.05 deg C/decade, Scenario B: 0.24+/- 0.06 deg C/decade and Scenario C: 0.24 +/- 0.05 deg C/decade.

    The overlap is considerable. Even the central values for both scenarios B and C (0.24C/decade) are well within the 2-sigma observational range for both land-ocean (0.15C/decade to 0.27C/decade) and just land (0.17C/decade to 0.31C/decade). In fact, the central values for the B and C trends are actually within the 1-sigma ranges: (land-ocean) 0.18C/decade to 0.24C/decade and (land) 0.21C/decade to 0.28C/decade.
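
    To make the overlap concrete, here is a minimal sketch in Python; the central values and 2-sigma half-widths are taken from the Schmidt quote above, and the check is just the simple symmetric-interval comparison used here, not a formal significance test:

        # Trend central values and 2-sigma half-widths (deg C/decade),
        # taken from the Schmidt quote above.
        observed = {
            "land-ocean index": (0.21, 0.06),
            "met-station (land) index": (0.24, 0.07),
        }
        scenarios = {
            "Scenario A": (0.39, 0.05),
            "Scenario B": (0.24, 0.06),
            "Scenario C": (0.24, 0.05),
        }

        for obs_name, (obs, two_sigma) in observed.items():
            lo2, hi2 = obs - two_sigma, obs + two_sigma              # 2-sigma range
            lo1, hi1 = obs - two_sigma / 2, obs + two_sigma / 2      # 1-sigma range
            print(f"{obs_name}: 2-sigma {lo2:.2f} to {hi2:.2f}, 1-sigma {lo1:.2f} to {hi1:.2f}")
            for name, (trend, _) in scenarios.items():
                print(f"  {name} ({trend:.2f}): within 2-sigma {lo2 <= trend <= hi2},"
                      f" within 1-sigma {lo1 <= trend <= hi1}")

    Run as written, it reproduces the comparison above: the B and C central values fall inside both ranges for both datasets, while the A central value falls outside both.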

  • luminous beauty // July 6, 2008 at 3:48 pm

    It is amusing to see the reactionary conservative version of Obama’s message of change and hope.
    Change the goalposts and hope no one notices.

    Apparently, in the John Finn universe, the fact that John Finn “cannot find any[?] which request an evaluation of the Hansen projections” rules out the possibility that any of the actual evaluations cited above exist. So I can only conclude that this graph, which illustrates that scenario B and C forcings begin to diverge in 1984 and not at the arbitrarily referenced inflection point in 2000, at which time the divergence is already ~0.2 W/m^2, implying a much longer relaxation time than 5 years, must also be invisible in the John Finn universe.

  • cce // July 6, 2008 at 7:28 pm

    Thanks george. I will add the uncertainty of the trends. I’ll note that there is uncertainty in the measurements also, but I think I have them bounded by the four semi-independent temperature analyses.

    tamino, is equation number 3 on this page the proper way to do this?
    http://science.widener.edu/svb/stats/regress.html

    [Response: Yes, if the noise is white noise (no autocorrelation).

    If there is autocorrelation, try using an "effective number" of data points N_{eff} in place of the actual number N. First estimate the "lag-1 autocorrelation" \alpha. You can do this by doing a linear regression of each y value on the *preceding* y value (not on the x values), i.e., a linear regression of the form y_j = \alpha y_{(j-1)} + \beta. The slope of that regression line is an estimate of the lag-1 autocorrelation \alpha. Then compute the "effective number" as N_{eff} \approx N (1-\alpha) / (1+\alpha).]
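
    For anyone who wants to try the correction described in the response above, here is a minimal sketch in plain Python; the series at the bottom is made up purely for illustration (in practice the residuals from the trend fit would go there):

        # Effective number of data points for autocorrelated noise:
        # estimate the lag-1 autocorrelation alpha as the slope of the regression
        # of each value on the preceding value, then N_eff ~ N (1 - alpha) / (1 + alpha).

        def lag1_autocorrelation(y):
            """Slope of the regression of y[j] on y[j-1]."""
            x, z = y[:-1], y[1:]                    # preceding and current values
            n = len(x)
            mx, mz = sum(x) / n, sum(z) / n
            cov = sum((a - mx) * (b - mz) for a, b in zip(x, z))
            var = sum((a - mx) ** 2 for a in x)
            return cov / var

        def effective_n(y):
            alpha = lag1_autocorrelation(y)
            return len(y) * (1 - alpha) / (1 + alpha)

        # Made-up residual series, for illustration only:
        resid = [0.02, -0.05, 0.01, 0.08, 0.04, -0.03, 0.06, 0.10, 0.02, -0.01,
                 0.07, 0.12, 0.05, -0.02, 0.03, 0.09, 0.01, -0.04, 0.02, 0.06]
        print(f"alpha = {lag1_autocorrelation(resid):.2f}, "
              f"N = {len(resid)}, N_eff = {effective_n(resid):.1f}")

    The effective N then stands in for N in the slope-uncertainty formula being discussed, widening the error bars whenever the noise is positively autocorrelated.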

  • Hank Roberts // July 6, 2008 at 7:33 pm

    But we digress. Let’s not.

    Are y’all following Dr. Le Quéré’s work?

    http://www.realclimate.org/index.php/archives/2005/06/how-much-of-the-recent-cosub2sub-increase-is-due-to-human-activities

    http://scholar.google.com/scholar?hl=en&lr=&scoring=r&q=lequere+climate&as_ylo=2008&btnG=Search

    (You may get different results for “le quere” and “lequere” from both Google and Scholar)

  • Dave Andrews // July 6, 2008 at 9:48 pm

    mndean,

    I’m a UK citizen so am totally impartial when it comes to US politics. Here in the UK, I’ve been on the left all my voting life.

  • David B. Benson // July 6, 2008 at 11:32 pm

    “The basic dynamics of the atmosphere and ocean have been modeled by Richard P. McGehee, a mathematician at the University of Minnesota. His models describe the basic interactions among (1) the atmosphere, (2) the shallow ocean, and (3) the deep ocean, and are accessible to undergraduates. There are only three differential equations, but these vastly simplified models [using a climate sensitivity of 3.3] make predictions within the range of predictions described in the IPCC report [and also agrees well with the twentieth century].”

    from “Climate Change: A Research Opportunity for Mathematicians?” by P.C. Kenschaft in the June/July 2008 issue of the Notices of the American Mathematical Society, p. 695 ff.

    Somebody care to check if there is a web presence?
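
    For a sense of what a three-box model of this kind can look like, here is a generic sketch: an atmosphere, a shallow ocean and a deep ocean exchanging heat under a constant radiative forcing. The structure and every coefficient below are illustrative assumptions, not McGehee’s published equations.

        # Generic three-box energy-balance sketch (illustrative only).
        # Boxes: atmosphere (a), shallow ocean (s), deep ocean (d).
        # F: forcing (W/m^2); lam: radiative feedback (W/m^2/K), chosen so the
        # equilibrium warming F/lam is about 3.3 K for a CO2 doubling;
        # k_as, k_sd: heat-exchange coefficients; C_*: heat capacities (J/m^2/K).

        def run(years=200, dt=0.01, F=3.7, lam=1.12, k_as=30.0, k_sd=0.7,
                C_a=1.0e7, C_s=3.0e8, C_d=3.0e9):
            """Euler-integrate the three coupled ODEs; returns final anomalies (K)."""
            sec_per_year = 3.15e7
            Ta = Ts = Td = 0.0
            for _ in range(int(years / dt)):
                dTa = (F - lam * Ta - k_as * (Ta - Ts)) / C_a
                dTs = (k_as * (Ta - Ts) - k_sd * (Ts - Td)) / C_s
                dTd = k_sd * (Ts - Td) / C_d
                Ta += dTa * dt * sec_per_year
                Ts += dTs * dt * sec_per_year
                Td += dTd * dt * sec_per_year
            return Ta, Ts, Td

        print(run())   # after 200 yr: Ta and Ts most of the way to F/lam ~ 3.3 K; Td lags behind

    The deep-ocean box is what stretches the response out over decades to centuries; everything else here is just a linear feedback and two heat exchanges.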

  • EliRabett // July 7, 2008 at 12:19 am

    David, Eli rather thinks that depends on whether you want a zero-dimensional model.

  • Molnar // July 7, 2008 at 3:34 am

    While cycling carbon may be marginally better than cycling steel or aluminum, I think cycling anything is preferable to most other forms of transportation.

  • Timothy Chase // July 7, 2008 at 6:25 am

    cce wrote:

    I have a history of Hansen’s ‘88 scenarios and subsequent controversy here:

    http://cce.890m.com/?page_id=23

    I have had a chance to look at some of your website, and it appears to have been done by someone quite literate in the science, familiar with the history, and an excellent writer as well.

    I’m impressed!

  • David B. Benson // July 7, 2008 at 6:11 pm

    EliRabett // July 7, 2008 at 12:19 am — Yes, a good one would be nice to have around.

  • David B. Benson // July 7, 2008 at 10:20 pm

    Here is a (short) workshop report on climate sensitivity:

    http://books.nap.edu/openbook.php?record_id=10787&page=1

  • cce // July 8, 2008 at 2:45 am

    Thanks Timothy. I’m always looking for improvements, so if you see anything that is wrong or bad, send comments my way. It is eventually going to be a video presentation available on the site and on Google Video. Before I record the narration, I want to make sure as many problems as possible are fixed, because fixing audio is a lot harder than fixing text.

  • John Finn // July 9, 2008 at 11:33 am

    From luminous beauty // July 6, 2008 at 3:48 pm:
    “…which illustrates that scenario B and C forcings begin to diverge in 1984 and not at the arbitrarily referenced inflection point in 2000, at which time the divergence is already ~0.2 W/m^2, implying a much longer relaxation time than 5 years, must also be invisible in the John Finn universe.”
    What’s 1984 got to do with anything?

    Scenarios B and C are essentially one and the same. They both assume moderate growth of ghg emissions – until 2000. It is at that point that Scenario C has no further growth, i.e. ghg emissions and, by implication, atmospheric ghg concentrations remain constant. Scenario B emissions continue to grow at the previous rate. Around 2005, Scenario C temperature anomalies peak at just over 0.6 deg C and then slowly fall away. Scenario B anomalies continue to rise towards 0.8 and beyond. The divergence occurs in 2005. The change in emission rates happened in 2000.

    Just to re-emphasise: Scenario C ghg growth stopped in 2000 – Scenario C temp peaked in 2005.

    Oh, and one more thing: actual observed temps have been below scenario C temps for the past 3 years.
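
    As a purely illustrative aside on the lag being argued about here, a one-box toy model shows how the assumed relaxation time controls how quickly a difference in forcing shows up as a difference in temperature. The linear forcing ramp, the 0.1 K “noise” threshold and the time constants below are assumptions chosen for illustration, not Hansen’s model or his scenarios:

        # Toy one-box response: dT/dt = (S*F(t) - T) / tau, stepped annually.
        # Two forcing paths are identical until 2000; one keeps growing afterwards,
        # the other is held constant. When does the temperature difference exceed
        # a nominal 0.1 K of "weather noise"?

        def temperature(growth_stops, tau, years=range(1984, 2031),
                        growth=0.04, S=0.8):
            """Return {year: T} for a forcing ramp of `growth` W/m^2 per year."""
            T, F, out = 0.0, 0.0, {}
            for year in years:
                if year <= growth_stops:
                    F += growth
                T += (S * F - T) / tau        # one-year Euler step
                out[year] = T
            return out

        for tau in (5, 30):
            growing = temperature(growth_stops=9999, tau=tau)   # growth never stops
            capped = temperature(growth_stops=2000, tau=tau)
            year = next((y for y in growing if growing[y] - capped[y] > 0.1), None)
            print(f"tau = {tau:2d} yr: difference first exceeds 0.1 K in {year}")

    With a 5-year time constant the two paths separate by more than 0.1 K within about a decade of 2000; with a 30-year time constant it takes roughly twice as long. That is the crux of the disagreement above.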

  • Raphael // July 12, 2008 at 5:16 pm

    Limit the per capita carbon emissions to 1988 levels?

    From 1989 to 2003, the per capita carbon emissions were at or lower than the 1988 levels.

    Fifteen years of testing shows this to be a poor solution to the defined problem.

  • Hank Roberts // July 12, 2008 at 6:08 pm

    Raphael, cite please?

    Countries where population growth has outrun development have lower per capita carbon emissions. Countries with major long-term wars, similarly. What are you talking about?
    http://en.wikipedia.org/wiki/List_of_countries_by_carbon_dioxide_emissions_per_capita

  • Raphael // July 12, 2008 at 6:20 pm

    I’m talking about global emissions per capita.

    As per the last column Here

  • Hank Roberts // July 12, 2008 at 7:14 pm

    Raphael, do you understand what “cherry picking” refers to when looking at a long time series?

    What’s the bottom line on the chart you link to?

    You’re talking about the dip in the increasing rate of CO2 from fossil fuel during the years after the USSR collapsed.

    This is not news. If you noticed it yourself from looking at CDIAC, you fooled yourself. If you’re telling us because you came across this factlet on a blog somewhere, you’ve been fooled.

    Tell us how you came to this particular bit of information?

  • Hank Roberts // July 12, 2008 at 7:20 pm

    Raphael, put these together — notice what happens around 1970:

    http://www.sustainablescale.org/images/uploaded/Population/World%20Population%20Growth%20to%202050.JPG
    (includes projection into future)

    http://cdiac.esd.ornl.gov/trends/emis/glo.htm

  • Raphael // July 12, 2008 at 8:08 pm

    Hank,

    Egads, man. What did all of that have to do with the price of tea in China?

    What is Tamino proposing in this post?

    Reduction of global per capita emissions to the 1988 global per capita level?

    Because that’s the way I read it. Especially in his conclusion when he says, “It’s abundantly clear that using fossil fuel to power our lives involves much greater carbon emissions than the paltry 1 tonne/person/year limit which would bring us back to 1988 levels.”

    If that’s the case, the global emissions as viewed on that chart (regardless of the collapse of the Soviet Union) show that maintaining per capita emissions at that level is ridiculous when it comes to resolving the defined problem.

  • tamino // July 12, 2008 at 8:43 pm

    I completely agree that reducing per capita emissions to 1988 levels is woefully inadequate to bring rising atmospheric CO2 levels under control. I wasn’t advocating such a policy; I was trying to illustrate that relying on fossil fuels for energy supply is not a viable option.

  • Raphael // July 13, 2008 at 1:44 pm

    Tamino,

    Could I get you to clarify what you meant when you said, “In fact, just to keep emissions constant we have to limit ourselves to about 1 tonne C per person per year.”
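
    For what the arithmetic is worth, a per-capita limit is just a total emissions budget divided by a population. A back-of-the-envelope sketch, using round figures assumed here for illustration (roughly 6 GtC/yr of fossil-fuel carbon in 1988, roughly 8.5 GtC/yr now, and populations of about 5.1 and 6.7 billion), and not Tamino’s own calculation, looks like this:

        # Back-of-the-envelope per-capita carbon arithmetic
        # (round figures, assumed for illustration only).
        GTC_1988, POP_1988 = 6.0, 5.1e9     # approx. 1988 emissions (GtC/yr), population
        GTC_NOW, POP_NOW = 8.5, 6.7e9       # approx. current emissions (GtC/yr), population

        print(f"1988 per-capita rate:    {GTC_1988 * 1e9 / POP_1988:.2f} tC/person/yr")
        print(f"current per-capita rate: {GTC_NOW * 1e9 / POP_NOW:.2f} tC/person/yr")
        # Per-capita rate needed to hold TOTAL emissions at the 1988 level today:
        print(f"1988 total / today's population: {GTC_1988 * 1e9 / POP_NOW:.2f} tC/person/yr")

    On those rough numbers, 1988 per-capita emissions were a bit over 1 tC per person per year, while holding total emissions to the 1988 level with today’s larger population works out to just under 1 tC per person per year; holding per-capita emissions at the 1988 rate therefore allows higher total emissions than 1988, simply because population has grown.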

  • Barton Paul Levenson // July 15, 2008 at 12:01 pm

    The article describes James Hansen as “NASA’s chief climate scientists.” I assume you meant “scientist?”

  • Hank Roberts // July 26, 2008 at 4:08 pm

    http://www.nature.com/ngeo/journal/v1/n5/index.html

    Carbon cycle: Checking the thermostat, pp. 289–290
    David Archer
    doi:10.1038/ngeo194

    Atmospheric carbon dioxide levels greatly influence the Earth’s climate. Evidence from ice cores and marine sediments suggests that over timescales beyond the glacial cycles, carbon fluxes are finely balanced and act to stabilize temperatures.

    ———————-

    Close mass balance of long-term carbon fluxes from ice-core CO2 and ocean chemistry records, pp. 312–315

    Richard E. Zeebe & Ken Caldeira

    doi:10.1038/ngeo185

    On geological timescales, carbon dioxide enters the atmosphere through volcanism and organic matter oxidation and is removed through mineral weathering and carbonate burial. An analysis of ice-core CO2 records and marine carbonate chemistry indicates a tight coupling between these processes during the past 610,000 years, which suggests that a weathering feedback driven by atmospheric CO2 leads to a mass balance between CO2 sources and sinks on long timescales.
