Blog Archive


Sunday, August 24, 2014

Climate Code Red: Dangerous climate change: Myths and reality. Part 2

by David Spratt, Climate Code Red, August 23, 2014


Download report (16 pages)
Myth 3: Big tipping points are unlikely before 2°C

Tipping points, often an expression of non-linear events, are difficult to project. But if it is sometimes hard to see tipping points coming, it is also too late to be wise after the fact. Estimated tipping points around or below ~1.5 °C include:

  • West Antarctic Ice Sheet: Current conditions affecting the West Antarctic Ice Sheet are sufficient to drive between 1.2 and 4 metres of sea-level rise, and these glaciers are now in “unstoppable” meltdown at global average warming of just 0.8 °C (NASA, 2014A; Rignot, Mouginot et al., 2014; Joughin, Smith et al., 2014). 

  • Loss of summer Arctic sea-ice: Because climate models generally have been poor at dealing with Arctic sea-ice retreat (see summary of literature at Spratt, 2013), expert elicitations play a key role in considering whether the Arctic has passed a very significant and “dangerous” tipping point, including Steffen (quoted by Cubby, 2012), Livina and Lenton (2013), UWA (2012), Serreze (quoted by Romm, 2012), Wadhams (2012; quoted by Vidal, 2012), Maslowski, Kinney et al. (2012) and Laxon (quoted by McKie, 2012). Duarte, Lenton et al. (2012) find that: “Warming of the Arctic region is proceeding at three times the global average, and a new ‘Arctic rapid change’ climate pattern has been observed in the past decade.” Reductions in the sea-ice cover are believed to be the largest contributor toward Arctic amplification. Maslowski, Kinney et al. (2012) note that: “a warming Arctic climate appears to affect the rate of melt of the Greenland ice sheet, Northern Hemisphere permafrost, sea-level rise, and global climate change.” It is worth noting that one month of sea-ice-free summer conditions in the Arctic each year would add approx. 0.2 °C to global warming (Hudson, 2011), an event that, though credible in the next few decades, is not taken into account in any carbon budget modelling. 
  • Greenland Ice Sheet (GIS): Current-generation climate models are not yet all that helpful on GIS. They have a poor understanding of the processes involved, and acceleration, retreat and thinning of outlet glaciers are not represented (Maslowski, Kinney et al., 2012). The estimated tipping point for GIS is +1.6 °C, with an uncertainty range of +0.8 to +3.2 °C (Robinson, Calov et al., 2012). A recent study finds that deep canyons will contribute to more rapid GIS deglaciation (NASA, 2014B; Morlighem, Rignot et al., 2014). Contrary to previous studies, which estimated it would take centuries to millennia for new climates to increase the temperature deep within ice sheets such as GIS, the influence of meltwater means warming can occur within decades and produce rapid accelerations (Phillips, Rajaram et al., 2013; University of Colorado Boulder, 2013). As well, “rapid iceberg discharge is possible in regions where highly crevassed glaciers are grounded deep beneath sea level, indicating portions of Greenland and Antarctica that may be vulnerable to rapid ice loss through catastrophic disintegration” (Bassis and Jacobs, 2013). Informally, many leading cryosphere scientists say the GIS has passed its tipping point, “is already lost” and similar sentiments (pers. comm.). With Arctic amplification of around three times average global warming, it is hard to conceive that GIS deglaciation will do anything other than continue to accelerate as reflectivity declines and late-summer ocean conditions become ice-free. In 2012, then NASA climate science chief James Hansen told Bloomberg that: “Our greatest concern is that loss of Arctic sea ice creates a grave threat of passing two other tipping points – the potential instability of the Greenland ice sheet and methane hydrates… These latter two tipping points would have consequences that are practically irreversible on time scales of relevance to humanity” (Morales, 2012).  
  • Coral reefs: “Preserving more than 10% of coral reefs worldwide would require limiting warming to below +1.5 °C (atmosphere–ocean general circulation models (AOGCMs) range: 1.3–1.8 °C) relative to pre-industrial levels” (Frieler, Meinshausen et al., 2013). At 10%, the reefs would be remnant, and the ecosystems as we know them today would be a historical footnote. Data suggests the area of reef systems has already been reduced by half around the world.  
  • Permafrost: In February 2013, scientists using radiometric dating techniques on Russian cave formations to measure melting rates warned that a 1.5 °C global rise in temperature compared to pre-industrial was enough to start a general permafrost melt. Vaks, Gutareva et al. (2013) found that “global climates only slightly warmer than today are sufficient to thaw extensive regions of permafrost.” Vaks says that “1.5 °C appears to be something of a tipping point.” In May 2013, Brigham-Grette, Melles et al. (2013) published evidence from Lake El’gygytgyn, in northeast Arctic Russia, showing that 3.6–3.4 million years ago, summer mid-Pliocene temperatures locally were ~8 °C warmer than today, when CO2 was ~400 ppm (a similar level to today). This is highly significant because researchers say the tipping point for large-scale permafrost carbon loss is around a +8–10 °C regional temperature increase (Bitz, Ridley et al., 2009). As well, research from Ballantyne, Axford et al. (2013) finds that during the Pliocene epoch, when CO2 levels were ~400 ppm, Arctic surface temperatures were 15–20 °C warmer than today’s surface temperatures. Soon-to-be-published work by Shakhova and Semiletov, as a follow-up to their 2013 paper on shallow-water, sea-floor sediment cores on the East Siberian Arctic Shelf, finds the ocean-floor permafrost layer at “thaw point” temperature and “slushy” (pers. comm.), suggesting vulnerability of the underlying methane hydrate stability zone, in the area where vast new methane plumes in the ocean are being observed in the 2014 northern summer (Papadopoulou, 2014). 
Figure 2. 2 °C of warming is not a safe target. The temperature reconstruction of Shakun, Clark et al. (2012) and Marcott, Shakun et al. (2013) is combined with the instrumental period data from HadCRUT4 and model average of IPCC projections for the A1B scenario up to 2100.
In summary, there is a very high risk that further significant tipping points will be passed before warming reaches 2 °C. Some of these are irreversible on time scales of centuries to millennia.

Myth 4: We should mitigate for 2 °C but plan to adapt to 4 °C

The failure of international climate negotiations and insufficient national efforts have led many negotiators and commentators to conclude that warming will not be held to 2 °C and that much higher warming is likely. This has resulted in a policy approach of still trying to reduce emissions (mitigate) for 2 °C, whilst also planning to adapt to 4 °C of warming.

Reports from the World Bank (2012) and PricewaterhouseCoopers (2012) complement a range of research that suggests the world is presently heading for 4 °C or more of warming this century. Global average warming of 4 °C means around 6 °C of warming over land, and perhaps 7–8 °C at the extremes. IEA Chief Economist Fatih Birol says that emission trends are “perfectly in line with a temperature increase of 6 °C, which would have devastating consequences for the planet” (Rose, 2012).



The notion that we can reasonably adapt to 4 °C is ill-founded because:

  • Climate researcher Rachel Warren says, “In… a 4 °C world, the limits for human adaptation are likely to be exceeded in many parts of the world, while the limits for adaptation for natural systems would largely be exceeded throughout the world. Hence, the ecosystem services upon which human livelihoods depend would not be preserved. Even though some studies have suggested that adaptation in some areas might still be feasible for human systems, such assessments have generally not taken into account lost ecosystem services” (Warren, 2010).
  • Professor Neil Adger says, "Thinking through the implications of 4 °C of warming shows that the impacts are so significant that the only real adaptation strategy is to avoid that at all cost because of the pain and suffering that is going to cost... There is no science on how we are going to adapt to 4 °C warming. It is actually pretty alarming" (Randerson, 2008).
  • At 4 °C hotter, the world would be warmer than during any part of the period in which modern humans evolved, and the rate of climate change would be faster than any previously experienced by humans. The world's sixth mass extinction would be in full swing. In the oceans, acidification would have rendered many calcium-shelled organisms such as coral and many at the base of the ocean food chain artifacts of history. Ocean ecosystems and food chains would collapse (literature surveyed by Spratt, 2011). 
  • Warming of 4 °C is sufficient to melt the polar ice sheets and produce 70 metres of sea-level rise over a longer period of time (Hansen, Sato et al., 2013).
  • Prof. Kevin Anderson (2011) says there is a widespread view amongst scientists that “a 4 °C future is incompatible with an organised global community, is likely to be beyond ‘adaptation,’ is devastating to the majority of ecosystems and has a high probability of not being stable.”
One question remains: if the world has, practically speaking, given up on holding to 2 °C, and it is not possible for human civilization to survive in a 4 °C warmer world, what’s the plan? Some have suggested that in fact we have a substantial “carbon budget” available for the 2 °C target…

Myth 5: We have a substantial carbon budget left for 2 °C

The carbon budget has come to public prominence in recent years, including in the IPCC’s Fifth Assessment Report in 2013, as being the difference between the total allowable greenhouse gas emissions for 2 °C of warming, and the amount already emitted or spent. 
But this is not as simple as it seems, because 2 °C means different things to different people:

  • The 2 °C cap: A cap is an upper boundary, not to be exceeded. This is implicit in international agreements such as the Copenhagen Accord and Cancun Agreements which aim to “hold the increase in global average temperature below 2 °C, and to take action to meet this objective consistent with science and on the basis of equity” and the position of the European Commission in 2007, to “ensure that global average temperatures do not exceed preindustrial levels by more than 2 °C” and to “adopt the necessary domestic measures… to ensure” this is the case (emphasis added). This language implies a very low probability of exceeding the target. This is consistent with the approach taken in catastrophic risk management, where the risk of failure must be very small (Dunlop, 2011). Climate change with its non-linear events, tipping points and irreversible events – such as mass extinctions, destruction of ecosystems, the loss of large ice sheets and the triggering of large-scale releases of greenhouse gases from carbon stores such as permafrost and methane clathrates – contains many possibilities for catastrophic failure.
  • The 2 °C target: A target can be overshot; in common parlance, we may “miss the target.” This is the language employed for the carbon budget, where misses are part of the target calculations. The IPCC gives carbon budgets only for 33%, 50% and 66% chances of keeping to 2 °C (IPCC, 2013).  Higher probabilities of achieving the target were not reported. The most stringent — at 66% — has a one-in-three chance of exceeding the target, and a range of outcomes from 1 °C to 3.1 °C (with 95% confidence).
Figure 3. The carbon budget and probability of success. The budget (vertical axis) is related to the risk of failure (overshooting the 2 °C cap) (horizontal axis) along the blue curve. Emissions to date are indicated by the grey box, leaving the available budget as the distance between the blue curve and the grey box. As the chance of not exceeding the target increases from 33% (green) to 50% (orange) to 66% (red), the budget decreases. At a 90% chance of not exceeding the target (black), no carbon budget remains.
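To make the cap-versus-target arithmetic concrete, here is a minimal sketch of the relationship Figure 3 describes. The budget figures in it are hypothetical placeholders chosen only to reproduce the shape of the curve; they are not IPCC values.

```python
# Illustrative sketch of the budget-vs-probability relationship in Figure 3.
# All quantities below are HYPOTHETICAL placeholders, not IPCC figures; the
# point is only the structure: remaining = total_allowable(p) - already_emitted.

# Hypothetical total allowable emissions (GtCO2) for a given chance of staying
# below 2 °C; tighter odds mean a smaller total allowance.
TOTAL_ALLOWABLE_GTCO2 = {
    0.33: 3300,  # hypothetical
    0.50: 3000,  # hypothetical
    0.66: 2900,  # hypothetical
    0.90: 1900,  # hypothetical
}

EMITTED_SO_FAR_GTCO2 = 1900  # hypothetical cumulative emissions to date

def remaining_budget(probability: float) -> float:
    """Remaining budget (GtCO2) for a given chance of staying below 2 °C."""
    return TOTAL_ALLOWABLE_GTCO2[probability] - EMITTED_SO_FAR_GTCO2

for p in sorted(TOTAL_ALLOWABLE_GTCO2):
    print(f"{p:.0%} chance of staying below 2 °C: "
          f"{remaining_budget(p):.0f} GtCO2 of budget left")
# With these placeholder numbers, the 90% case leaves nothing left to spend,
# which is the qualitative pattern shown in Figure 3.
```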
With this distinction between “cap” and “target” in mind:
  • For the 2 °C cap, and a risk-averse approach (a less than 10% probability of exceeding it), there is no carbon budget left: "…the combination of a 2 °C warming target with high probability of success is now unreachable" using the current suite of policy measures, because the budget has expired (Raupach, Harman et al., 2011; Raupach, 2013). See Figure 3. "[T]o provide a 93% mid-value probability of not exceeding 2 °C, the concentration would need to be stabilized at, or below, 350 ppmv CO2e, i.e., below current levels" (Anderson and Bows, 2008). If some reasonably optimistic assumptions are made about deforestation and food-related emissions (halving per unit of production) for the rest of the century, then most emission reduction scenarios are incompatible with holding warming to 2 °C, even with a high 50% probability of exceeding the target, and there is no budget left for fossil fuel emissions (Anderson and Bows, 2008).
  • If we make some optimistic assumptions about how soon emissions peak and decline in the developing world (non-Annex 1 nations), there is no carbon budget available for developed nations (Annex 1 countries) (Anderson and Bows, 2011).
  • Accounting for the possible release of methane from melting permafrost and ocean sediments would imply a substantially lower budget, but the IPCC's budget calculations do not include these feedbacks (IPCC, 2013).
The idea of a carbon budget and “allowable” emissions is dangerous, according to climate scientist Ken Caldeira:
There are no such things as 'allowable carbon dioxide (CO2) emissions.' There are only 'damaging CO2 emissions' or 'dangerous CO2 emissions.' Every CO2 emission causes additional damage and creates additional risk. Causing additional damage and creating additional risk with our CO2 emissions should not be allowed. If you look at how our politicians operate, if you tell them you have a budget of XYZ, they will spend XYZ. Politicians will reason: 'If we’re not over budget, what’s to stop us from spending? Let the guys down the road deal with it when the budget has been exceeded.' The CO2 emissions budget framing is a recipe for delaying concrete action now (Caldeira, quoted by Romm, 2013B).
Finally, we need to remember that the current level of greenhouse gases is already enough for more than 2 °C of warming, though some gases such as methane are relatively short-lived in the atmosphere. Ramanathan and Feng (2008) calculated that the observed increase in the concentration of greenhouse gases (GHGs) since the pre-industrial era has most likely committed the world to a warming of 2.4 °C (within a range of +1.4 °C to +4.3 °C) above pre-industrial surface temperatures.
Note: References available at PDF download

Saturday, June 14, 2014

Climate Roulette: Elmar Kriegler et al.

Climate roulette

by James Dacey, physicsworld.com, March 16, 2009

Mankind is playing Russian roulette with the climate, according to a study published today in the Proceedings of the National Academy of Sciences.

Elmar Kriegler, of the Potsdam Institute for Climate Impact Research, and his colleagues sought to find out what leading scientists really think will happen to the climate.

So Kriegler surveyed 43 scientists to gauge the impact of rising temperatures on five major components of the global climate system.

They calculate a 1-in-6 chance that a “tipping event” will occur if the temperature increases by 2–4 °C in the next 200 years.

The five systems concerned are:

  • Major changes in the North Atlantic Ocean circulation
  • The Greenland ice sheet
  • The West Antarctic ice sheet
  • The Amazon rainforest
  • El Niño

They define a tipping point as “the event of initiating the transition, or making its future initiation inevitable.” 

Essentially they are saying that beyond these points the climate will reach a kind of elastic limit -- beyond which, we will feel the wrath of the climate, and there’ll be nothing we can do about it.

Realizing that previous surveys have been met with a fair degree of apathy, they used “imprecise probabilities” -- a part of Bayesian statistics.

This new mathematics has been controversial, but advocates say it can weigh up a given hypothesis in a more rounded way than classical statistics.

Developed in the 1980s and 1990s, these imprecise-probability methods seem to have gained most traction in the field of operations research and economic decision making.

“The currently discussed long-term targets of 50% reduction globally by 2050 (and 80% reduction for the industrial countries), with a continuing reduction after 2050, is an important step in this direction, but does not guarantee the reaching of the 2 degree target,” Kriegler told physicsworld.com.

This may sound like a very gloomy forecast, but Kriegler was a bit more pragmatic about taking coordinated international action:

“Nevertheless, these [targeted reductions] are a useful benchmark to focus the minds of politicians and society. Reaching this goal requires at least the following -- in the order of importance:

  1. A massive decarbonization of the energy system, starting in the electricity sector
  2. A strong increase in energy efficiency
  3. A stop to tropical deforestation, and an increase of the forest area in the tropics in the long run
  4. A massive reduction of CH4 and N2O emissions from the agricultural sector.”


Monday, May 19, 2014

MUST READ: Fugitive Methane Emissions from Fracking Oil and Gas Production Can Cause a ‘Global Catastrophe’ and Point of No Return

by Bobby Magill, Climate Central, May 15, 2014


A Cornell University scientist's claims that oil and gas development is so harmful to the climate that methane emissions and oil and gas production in general need to be cut back immediately to avoid a "global catastrophe" are adding more fuel to the scientific debate over the climate implications of shale oil and gas production. 
Fossil fuel production is the largest methane pollution source in the U.S., and ignoring those emissions will lead to a climate change “tipping point” from which there is no return, Cornell environmental biology professor Robert Howarth said in a statement Wednesday. He was unavailable for an interview.
Excess methane is often burned off from oil and gas production and distribution systems. Credit: Center for Enabling New Technologies Through Catalysis
Though scientists say there are avenues to preventing catastrophe other than curbing methane emissions, Howarth’s previous research with Cornell environmental engineering professor Anthony Ingraffea and others concluded that the climate impact of natural gas produced from shale — most of which involves hydraulic fracturing, or fracking — may be worse than that of coal and crude oil. That's because methane leaks from natural gas production have a greater effect on the climate than carbon dioxide emissions, Howarth said. 
Over a 100-year timeframe, methane is about 34 times as potent a climate change-driving greenhouse gas as carbon dioxide, and over 20 years it is 86 times more potent. Of all the greenhouse gases released by humans globally, methane contributes more than 40 percent of all radiative forcing, a measure of trapped heat in the atmosphere and a measuring stick of a changing climate, Howarth said.
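For readers who want to see how those multipliers are used in practice, here is a minimal sketch of a CO2-equivalence calculation based on the two global warming potentials quoted above; the 5 Tg leak in the example is a hypothetical quantity, not a measured one.

```python
# Sketch of how a global warming potential (GWP) translates a methane release
# into CO2-equivalent terms, using the two horizons quoted above.
# The 5 Tg figure is a HYPOTHETICAL example quantity, not a measured value.

GWP_CH4_100YR = 34  # CO2-equivalence over a 100-year horizon (as cited above)
GWP_CH4_20YR = 86   # CO2-equivalence over a 20-year horizon (as cited above)

def co2_equivalent(methane_tg: float, gwp: float) -> float:
    """Convert a methane mass (Tg) to a CO2-equivalent mass (Tg) for a given GWP."""
    return methane_tg * gwp

leak_tg = 5.0  # hypothetical annual methane leak, in teragrams
print(f"{leak_tg} Tg CH4 ~ {co2_equivalent(leak_tg, GWP_CH4_100YR):.0f} Tg CO2e (100-yr)")
print(f"{leak_tg} Tg CH4 ~ {co2_equivalent(leak_tg, GWP_CH4_20YR):.0f} Tg CO2e (20-yr)")
# The same leak looks roughly 2.5 times more damaging on the 20-year horizon,
# which is why the choice of timeframe matters so much in this debate.
```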
“We have to control methane immediately, and natural gas is the largest methane pollution source in the United States,” Howarth said. “If we hit a climate-system tipping point because of methane, our carbon dioxide problem is immaterial. We have to get a handle on methane, or increasingly risk global catastrophe.”
Howarth's research is controversial, with the energy industry trying to discredit his work and other scientists questioning his methods. Those questions come amid a steady stream of studies released over the past year that strongly suggest either that methane emissions emanating from oil and gas fields are greater than U.S. Environmental Protection Agency estimates or that the impact those emissions will have on climate change is extremely complex and difficult to determine. And even many scientists who agree with Howarth's research say there are other ways to curb methane emissions without shutting down natural gas production.
In other words, Howarth's critics say, methane's effect on the climate is too complicated to demand that emissions be cut dramatically and immediately. 
Howarth's new paper, to be published May 20 in the journal Energy Science and Engineering, reviews much of the oil- and gas-related methane emissions research conducted nationwide in the years since his initial methane research was published in 2011, and places that work in the context of the Intergovernmental Panel on Climate Change’s Fifth Assessment Report, released last year.
Howarth’s conclusion: Producing natural gas of any kind has a worse greenhouse gas footprint than burning coal and crude oil over a 20-year timeframe. In other words, the idea that natural gas is a “bridge fuel” between carbon-producing coal and clean renewable energy sources simply isn’t true, especially if natural gas is used for home heating, the study says.
At best, Howarth said natural gas might lead to a “very modest” reduction in greenhouse gas emissions if it is used in place of coal to generate electricity and only with “unprecedented” investment in natural gas infrastructure and regulatory oversight.
The paper is the latest in a long line of recent studies suggesting that methane emissions from shale oil and natural gas production and distribution equipment are much greater than previously thought.
A study by researchers from Purdue and Cornell universities published in April showed that natural gas drilling could emit up to 1,000 times the methane previously thought.
Just last week, the Cooperative Institute for Research in Environmental Sciences at the University of Colorado-Boulder released a study by National Oceanic and Atmospheric Administration atmospheric scientist Gabrielle Petron showing that an airplane flying over a large northeast Colorado shale oil and gas field measured atmospheric methane concentrations three times greater than U.S. Environmental Protection Agency estimates for the area.
EPA estimates are based on oil industry-reported data. In the EPA’s summary of its latest greenhouse gas emissions inventory, the agency cited one of Petron’s earlier methane emissions studies as evidence that the EPA’s industry-based methane estimates differ from the results of research that involves actual emissions measurements. The summary says the EPA “has engaged with researchers” on how measurements could improve understanding of inventory estimates.
“These discrepancies are substantial,” Petron said in a May 7 statement. “Emission estimates or ‘inventories’ are the primary tool that policymakers and regulators use to evaluate air quality and climate impacts of various sources, including oil and gas sources. If they’re off, it’s important to know.”
But different methods of measuring methane emissions get different results, and it's critical those differences be reconciled, said Robert Jackson, a professor of global environmental change at Duke University whose research has shown methane leaks are a hazard in natural gas distribution systems in the U.S. 
By using an airplane to fly over an oil and gas field to directly measure methane concentrations in the air, Petron's study used a "top-down" approach to estimating oil and gas field emissions. Other scientists have used a "bottom up" approach by measuring emissions from oil and gas facilities on the ground, a method used in a University of Texas study published last year suggesting fracked natural gas wells leak less methane than the EPA previously estimated. 
The simplest explanation for the discrepancy is that a few oil and gas wells emit a lot of methane, while others measured in "bottom up" studies release much less methane, Jackson said. Hundreds or thousands of wells would have to be sampled on the ground for the "bottom up" studies to accurately measure emissions, he said. 
"The key point is the data that have come in in the last couple of years, it's not a huge dataset," Jackson said. "The data that have come in seem to suggest the EPA estimates are too low. Will they turn out that they're high enough that Bob Howarth is right? We don't know that yet, and it may not be the case." 
The overall implications of natural gas production for a changing climate are extremely complicated, a Duke University study published in April by researchers Richard Newell and Daniel Raimi concluded.
Natural gas use can increase overall energy use and alter economy-wide greenhouse gas emissions, but it's unclear whether that means an increase or decrease in those emissions, and without specific emission targets, trends in atmospheric greenhouse gas emissions aren't likely to change even with widespread use of natural gas, Newell and Raimi conclude.
Howarth disagrees, saying there’s enough evidence that the climate implications of methane emissions from oil and gas development could be catastrophic and that it’s important to act now.
Crude oil tanks in northeast Colorado's suburban Wattenberg oil field, where measurements showed atmospheric methane concentrations were three times the levels reported in EPA inventories. Scientists say most of that methane came from the oil and gas operations in the area. Credit: Bobby Magill

If shale oil and gas methane emissions aren’t reined in quickly, the earth could warm a critical 2 °C within 15 to 35 years, he said. In order for the earth to avoid the most serious consequences of global warming, the planet’s average temperature cannot warm more than 2 °C above where it was in the 1800s. Global average temperatures have already warmed 1 °C.
Lawrence Cathles, a Cornell earth and atmospheric sciences professor whose criticism of Howarth's previous research made national headlines along with Howarth's rebuttal, said the science does not support Howarth's claim that immediate curbs on methane emissions are necessary to avoid the 2 °C warming threshold. 
"For methane to be a significant climate driver between now and 2035, its rate of increase in the atmosphere would need to accelerate dramatically, and so far we don't see this happening," Cathles said. "Curbs on methane emissions are desirable, but they will make a small player in climate change even smaller, and reducing emission rates below present levels is not a matter of necessity in controlling global warming." [Yeah, right.]
[snip]
Drew Shindell, a NASA Goddard Institute for Space Studies scientist on whose research Howarth draws but was not involved in Howarth's study, said that Howarth’s research is sound, but slashing methane emissions from natural gas isn’t the only way to keep global warming under 2 °C.
Keeping the earth from warming will involve more than cutting carbon dioxide emissions alone or methane alone. Cutting a combination of some CO2, some methane, some black carbon and anything else that contributes to radiative forcing could keep warming down, too, Shindell said.
Regarding Howarth’s view that natural gas is not a bridge fuel, Shindell said Howarth is pointing out that unless the methane leak rate from natural gas production and distribution is extraordinarily low, there is no reduction in greenhouse gas emissions compared to coal.
If natural gas could be produced with the very lowest possible methane leak rate, natural gas might come out ahead of coal for greenhouse gases, Shindell said.
“Whether that’s feasible, I don’t know,” he said.
In his paper, Howarth is adamant that replacing climate-changing coal with climate-changing natural gas does nothing to slow global warming.
“Society needs to wean itself from the addiction to fossil fuels as quickly as possible,” Howarth said in a statement. “But to replace some fossil fuels — coal, oil — with another, like natural gas, will not suffice as an approach to take on global warming. Rather, we should embrace the technologies of the 21st century and convert our energy systems to ones that rely on wind, solar and water power.”
Jackson said he wouldn't quite go so far as to call for running away from natural gas. 
"We need to do everything we can to cut methane emissions right now," Jackson said. "Using less natural gas might be one approach, but given that we are going to continue to use natural gas, my research (focuses on) how can we detect leaks quickly and fix them cheaply to reduce that leakage term?"  [Would need the states' departments of natural resources to write and enforce regulations for detection, measurement, and remediation, and hire and train thousands of inspectors -- not gonna happen.  For example, at the moment, the Illinois DNR has so few inspectors that it would take over 300 years to inspect wells.  And the Illinois DNR has no will to enforce anything on oil and gas companies -- currently, they are bending over backwards to help the oil and gas companies run roughshod over anything in their way.]
Shifting over completely to renewables would be great for human health and the environment, "but that's not the world we live in," Jackson said. "I want to know if you turn the spigot off for natural gas, do we get wind or do we get a new coal plant? And Bob (Howarth) might say that even if we got a coal plant, that's a good thing. It's not so black and white for me."

Tuesday, December 10, 2013

Wieslaw Maslowski of the US Naval Postgraduate School predicts lower-bound ice-free summer Arctic by 2016

Is conventional modelling out of pace with speed and abruptness of global warming?

Arctic Sunrise among broken floes of Arctic sea ice
Greenpeace icebreaking ship, Arctic Sunrise, among broken floes of Arctic sea ice, photographed from the air. This image was taken in the Fram Strait, in the month that the sea ice coverage receded to the second lowest extent since records began. Photograph: Nick Cobbing
by Nafeez Ahmed, The Guardian, December 9, 2013
An ongoing US Department of Energy-backed research project led by a US Navy scientist predicts that the Arctic could lose its summer sea ice cover as early as 2016, 84 years ahead of conventional model projections.
The project, based out of the US Naval Postgraduate School's Department of Oceanography, uses complex modelling techniques that make its projections more accurate than others.
A paper by principal investigator Professor Wieslaw Maslowski in the Annual Review of Earth and Planetary Sciences sets out some of the findings of the research project so far:
"Given the estimated trend and the volume estimate for October–November of 2007 at less than 9,000 km3, one can project that at this rate it would take only 9 more years or until 2016 ± 3 years to reach a nearly ice-free Arctic Ocean in summer. Regardless of high uncertainty associated with such an estimate, it does provide a lower bound of the time range for projections of seasonal sea ice cover."
The paper is highly critical of global climate models (GCMs) and even the majority of regional models, noting that there are "many Arctic climatic processes that are omitted from, or poorly represented in, most current-generation GCMs," which "do not account for important feedbacks among various system components." There is therefore "a great need for improved understanding and model representation of physical processes and interactions specific to polar regions that currently might not be fully accounted for or are missing in GCMs."
According to the US Department of Energy describing the project's development of the Regional Arctic System Model (RASM):
"Given that the Arctic is warming faster than the rest of the globe, understanding the processes and feedbacks of this polar amplification is a top priority. In addition, Arctic glaciers and the Greenland Ice Sheet are expected to change significantly and contribute to sea level rise in the coming decades."
Such Arctic changes "could have significant ramifications for global sea level, the ocean thermohaline circulation and heat budget, ecosystems, native communities, natural resource exploration, and commercial transportation."
The regional focus of RASM permits "significantly higher spatial resolution" to represent and evaluate the interaction of "important fine-scale Arctic processes and feedbacks," such as:
"... sea ice deformation, ocean eddies, and associated iceocean boundary layer mixing, multiphase clouds as well as landatmosphereiceocean interactions."
The role of the Department of Energy in backing the research is not surprising considering that President Obama's national Arctic strategy launched in May is focused on protecting commercial and corporate opportunities related to control of the region's vast untapped oil, gas and mineral resources.
The model coheres with the predictions of several other Arctic specialists – namely Prof Peter Wadhams, head of polar ocean physics at Cambridge University, and Prof Carlos Duarte, director of the Ocean Institute at the University of Western Australia – who see the disappearance of the Arctic sea ice in the summer of 2015 as likely.
Prof Wadhams is co-author of the controversial Nature paper which calculated the potential economic costs of climate change based on a scenario of 50 Gigatonnes (Gt) of methane being released this century from melting permafrost at the East Siberia Arctic Shelf (ESAS), a vast region of shallow-water covered continental crust. The scenario was first postulated by Natalia Shakhova and Igor Semiletov of the International Arctic Research Centre at the University of Alaska, Fairbanks.
In 2010, Shakhova's team published results showing that 7 teragrammes of methane was bubbling to the surface annually in the ESAS. Last month, she released a new paper in Nature Geoscience updating these findings on the basis of more rigorous measurements using an unmanned underwater vehicle with advanced sonar capability. She found that annual bottom water temperatures have increased over the last 14 years, correlating with a release of about 17 teragrammes of methane a year, accentuated by storms. This conservative estimate is more than double the earlier assessment.
However, the source of these methane emissions remains a matter of dispute, as other scientists investigating the phenomenon point out that while large deposits of methane hydrates could be breaking up, the other possibility is a slow leak of methane that has already gone on for hundreds of years. Christian Berndt, of the GEOMAR/Helmholz Centre for Ocean Research, has speculated that both phenomena could be going on at once, but he admits, "We have no proof."
Despite their latest study uncovering higher levels of methane than previously recognised, Shakhova has also distanced herself from the 'methane bomb' scenario she had once previously posited, noting a lack of direct evidence for the scenario.
Commenting on the study, the US National Snow & Ice Data Centre (NSIDC) observes:
"Ship-based observations show that methane concentrations in the air above the East Siberian Sea Shelf are nearly twice as high as the global average... Layers of sediment below the permafrost slowly emit methane gas, and this gas has been trapped for millennia beneath the permafrost. As sea levels rose at the end of the ice age, the shelf was once again covered by relatively warm ocean water, thawing the permafrost and releasing the trapped methane... In the short-term... methane has a global warming potential 86 times that of carbon dioxide."
Most scientists agree that more research is needed to determine the source and nature of these methane emissions.
But scientists also largely agree that an ice free Arctic in the summer could have serious consequences for the global climate. Some research has pointed out a link between the warming Arctic and changes in the jet stream, contributing to unprecedented weather extremes over the last few years. These extreme events in turn have dramatically impacted crop production in key food basket regions.

A landmark new study in Nature Climate Change finds that the melting of the sea ice over the last 30 years, at a rate of 8% per decade, is directly linked to extreme summer weather in the US and elsewhere in the form of droughts and heatwaves. Lead study author Qiuhong Tang of the Institute of Geographic Sciences and Natural Resources Research in Beijing said:
"As the high latitudes warm faster than the mid-latitudes because of amplifying effects of melting ice, the west-to-east jet-stream wind is weakened. Consequently, the atmospheric circulation change tends to favour more persistent weather systems and a higher likelihood of summer weather extremes."
The new study supplements earlier research published in Geophysical Research Letters demonstrating a link between Arctic sea ice loss and extreme weather in both summer and winter, including prolongation of "drought, flooding, cold spells, and heat waves."
Last year Prof Duarte was lead author of a paper in the Royal Swedish Academy of Science's journal AMBIO warning that the Arctic was at risk of passing critical "tipping points" that could lead to a cascading "domino effect once the summer sea ice is lost." Prof Duarte said at the time:
"If set in motion, they can generate profound climate change which places the Arctic not at the periphery but at the core of the Earth system. There is evidence that these forces are starting to be set in motion. This has major consequences for the future of human kind as climate change progresses."
Dr Nafeez Ahmed is executive director of the Institute for Policy Research & Development and author of A User's Guide to the Crisis of Civilisation: And How to Save It among other books. Follow him on Twitter @nafeezahmed

Saturday, November 16, 2013

Jeff Masters: Haiyan's true intensity and death toll still unknown

by Jeff Masters, wunderblog, November 15, 2013

A full week after one of the strongest tropical cyclones in world history devastated the Philippines, the full extent of the death and destruction wrought by Super Typhoon Haiyan is still not known, nor do we have actual ground measurements of the storm's peak winds and lowest pressure. The Philippines National Disaster Risk Reduction and Management Council estimates 3,432 people were killed, and the U.N. puts this number at 4,460. This makes Haiyan the 2nd deadliest Philippines tropical cyclone in history, behind Tropical Storm Thelma of 1991, which killed 5,081–8,165 people. Damage is estimated at $12–$15 billion, or about 5% of the Philippines' GDP.


Figure 1. Infrared VIIRS image of the eye of Haiyan taken at 16:19 UTC November 7, 2013. At the time, Haiyan was at peak strength with 195 mph sustained winds. Image credit: NOAA/CIRA.

What was Haiyan's lowest pressure?
The Japan Meteorological Agency (JMA) estimated that Haiyan's central pressure was 895 mb at landfall, which would make it the 12th strongest tropical cyclone in world history (by pressure). We now have pressure measurements from Haiyan's second landfall in Tacloban, where a group of storm chasers deployed two high-quality Kestrel pressure instruments in the Hotel Alejandro, in the heart of the downtown district (11.2414 N, 125.0036 E). This location was about 18 miles north of the center of the eye, and did not receive the typhoon's strongest winds, which probably occurred about two miles to the south, judging by the zoomed-in radar image from landfall (Figure 3). 


Josh Morgerman of iCyclone.com was kind enough to send me plots of the data recorded from their instruments. Device 1 measured a minimum pressure of 960.8 mb at 7:12 a.m., and their Device 2 measured 960.3 mb at 7:20 a.m. Josh talked to a source at the Tacloban Airport, located about 1 mile farther to the south, who said that the airport measured 955.6 mb at 7:15 am, before power was lost. These readings suggest that Haiyan had a pressure gradient of about 4 mb per mile. If we assume the airport was 17 miles north of the center of the eye, and there was a 4 mb/mile pressure gradient, Haiyan could have had an 888 mb central pressure. 
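As a quick check on that estimate, here is a minimal sketch of the extrapolation using the figures quoted above; the 17-mile distance and the roughly linear 4 mb/mile gradient are the assumptions stated in the text, not measured values.

```python
# Sketch of the central-pressure extrapolation described above, using the
# airport reading and the assumed distance and pressure gradient from the text.

airport_pressure_mb = 955.6           # measured at Tacloban Airport before power loss
distance_to_center_miles = 17         # assumed distance from airport to eye center
gradient_mb_per_mile = 4.0            # assumed roughly linear pressure gradient

central_pressure_mb = airport_pressure_mb - gradient_mb_per_mile * distance_to_center_miles
print(f"Estimated central pressure: {central_pressure_mb:.0f} mb")  # ~888 mb
# In reality the gradient steepens sharply toward the eyewall (see the Category 5
# examples listed below), so this linear estimate is only a rough approximation.
```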

An email I received from NHC hurricane specialist Dr. Jack Beven documented several cases of Category 5 tropical cyclones with extreme pressure gradients:

Hurricane Andrew, 1992 (South Florida): 60 mb in 14 miles (4.3 mb/mile)
Hurricane Wilma, 2005 (in Caribbean): 94 mb in 14 miles (6.7 mb/mile)
Super Typhoon Megi, 2010 (east of Philippines): 60 mb in 14 miles (4.6 mb/mile)
September 1933 hurricane (ship measurement): 45 mb in 6 miles (7.5 mb/mile)
Hurricane Felix, 2007 (in Caribbean): 63 mb in 14 miles (4.5 mb/mile)

So, it is certainly possible that Haiyan had a pressure below 900 mb, but we will probably never know for certain.


Figure 2. Pressure observed in downtown Tacloban during the passage of Super Typhoon Haiyan on November 8, 2013, by Josh Morgerman of iCyclone.com. This sensor bottomed out at 960.3 mb at 7:20 a.m., at a location a few miles north of the northern edge of the eye. If we look at the Tacloban airport pressure readings in the last 3 hours they sent data on November 8, the readings were 1001.1 mb, 1000.9 mb, and 997.3 mb, at 12 a.m., 1 a.m., and 2 a.m., respectively. The iCyclone instrument recorded 1002 mb, 1000 mb, and 998 mb at those times, so the two instruments agreed to within 1 mb.


Figure 3. Radar image of Super Typhoon Haiyan over Tacloban, on November 8, 2013. Tacloban was in the north (strongest) portion of Haiyan's eyewall, at a time when the typhoon's top sustained winds over water were estimated at 185 mph. Image credit: http://climatex.ph.

How strong were Haiyan's winds at initial landfall in Guiuan?
Haiyan's strongest winds occurred on the south shore of Samar Island and the city of Guiuan (population 47,000), where the super typhoon initially made landfall with 1-minute average winds estimated at 195 mph. This estimate came from the Joint Typhoon Warning Center (JTWC), and was based on satellite measurements. We have no ground-level or hurricane hunter measurements to verify this estimate. The Japan Meteorological Agency (JMA), which uses its own techniques to estimate typhoon strength via satellite imagery, put Haiyan's peak strength at 125 knots (145 mph), using a 10-minute averaging time for wind speeds. The averaging time used by JTWC and NHC is 1 minute, resulting in a higher wind estimate than the 10-minute average winds used by JMA and PAGASA in their advisories. To convert from 10-minute averaged winds to a 1-minute average, one conversion factor that is commonly used is to multiply by 1.14 -- though lower conversion factors are sometimes used. JMA satellite strength estimates are consistently much lower than those from JTWC for high-end Category 5 strength typhoons; JTWC estimates are the ones most commonly used by the hurricane research community. A searchable database of JMA typhoon information going back to 1976, available at Digital Typhoon, reveals that Haiyan is tied for second place as the strongest typhoon that JMA has rated, and was the strongest landfalling typhoon, when measured by wind speed. The only typhoon they rated as stronger was Super Typhoon Tip of 1979, but that storm weakened to Category 1 strength before making landfall in Japan. 
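To see how the agency figures relate, here is a minimal sketch of the averaging-time conversion described above; the 1.14 factor is, as noted, only one commonly used value, and the result remains an approximation.

```python
# Converting JMA's 10-minute average estimate into an approximate 1-minute
# average, using the conversion factor mentioned in the text.

MPH_PER_KNOT = 1.15078
TEN_TO_ONE_MINUTE_FACTOR = 1.14   # one commonly used factor; others are lower

jma_10min_knots = 125
jma_10min_mph = jma_10min_knots * MPH_PER_KNOT
jma_approx_1min_mph = jma_10min_mph * TEN_TO_ONE_MINUTE_FACTOR

print(f"JMA 10-minute estimate: {jma_10min_mph:.0f} mph")            # ~144 mph
print(f"Approximate 1-minute equivalent: {jma_approx_1min_mph:.0f} mph")  # ~164 mph
# Even after conversion, the JMA figure remains far below JTWC's 195 mph
# satellite-based estimate, illustrating the agency-to-agency spread.
```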


Figure 4. The 400-year-old Church of the Immaculate Conception (left) collapsed in Guiuan, Philippines, during Super Typhoon Haiyan. Image credit: J.B. Baylon and http://chuvaness.com/.

Typhoon and hurricane maximum wind speed estimates are only valid for over-water exposure, and winds over land are typically reduced by about 15% due to friction. This would put Haiyan's winds at about 165 mph over land areas on the south shore of Samar Island, equivalent to a high-end EF-3 tornado. 
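A quick sketch of that 15% frictional reduction, applied to JTWC's over-water estimate:

```python
# The ~15% over-land frictional reduction applied to JTWC's over-water estimate.

jtwc_over_water_mph = 195
LAND_FRICTION_REDUCTION = 0.15   # typical reduction due to surface friction, per the text

over_land_mph = jtwc_over_water_mph * (1 - LAND_FRICTION_REDUCTION)
print(f"Estimated 1-minute winds over land: {over_land_mph:.0f} mph")
# ~166 mph, i.e. the roughly 165 mph figure cited above, near the top of the
# EF-3 damage range (136-165 mph) quoted later in the post.
```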


Forty minutes before landfall, the airport in Guiuan reported sustained 10-minute average winds of 96 mph, with a pressure of 977 mb, before contact was lost. Damage photos of Guiuan show at least EF-2 scale damage (111-135 mph winds): Roofs torn off well-constructed houses; foundations of frame homes shifted; mobile homes completely destroyed; large trees snapped or uprooted; light-object missiles generated; cars lifted off ground. The mayor of the city had his car lifted off the ground and slammed into a building, which is consistent with at least EF-2 damage. 

There is possible EF-3 damage (136-165 mph winds) in the Guiuan damage photos, with the 400-year-old stone Church of the Immaculate Conception collapsed, and a bus toppled. EF-3 damage is defined as: Entire stories of well-constructed houses destroyed; severe damage to large buildings such as shopping malls; trains overturned; trees debarked; heavy cars lifted off the ground and thrown; structures with weak foundations blown away some distance. A detailed damage survey would be needed to determine if EF-3 winds really did occur in Guiuan. 

Haiyan Links
Wunderblogger Lee Grenci discusses mesovortices in the eye of Haiyan in his latest post.
Wunderground's weather historian Christopher C. Burt reviews the Philippines' typhoon history.
The University of Wisconsin CIMSS Satellite Blog has a great collection of satellite images of Haiyan.
NOAA's Michael Folmer has a post showing the unusual burst of lightning that occurred at landfall in Haiyan.
Hurricanes and Climate Change: Huge Dangers, Huge Unknowns, my August 2013 blog post.
Storm Chaser James Reynolds on Twitter, from Tacloban, Leyte.
Storm Chaser Jim Edds on Twitter, from Tacloban, Leyte.
Storm Chaser Josh Morgerman (iCyclone) on Facebook

The Philippine Red Cross is appealing for donations.

Portlight disaster relief charity is reaching out to disability organizations in the Philippines to provide durable medical equipment, and welcomes donations.


Figure 5. "Tipping Points" host Bernice Notenboom goes on a sled adventure in Greenland.

New "Tipping Points" episode, "Greenland Ice-sheet Melt," airs Saturday at 9 p.m. EDT, 8 p.m. CDT 


“Tipping Points,” the landmark 6-part climate change TV series that began airing in October on The Weather Channel, airs for the fifth time on Saturday night, November 16, at 9 p.m. EDT. The new episode, "The Greenland Ice-sheet Melt," goes on an expedition to the remote Inuit village of Qaanaaq to explore the rate of Greenland Ice Sheet melt and its effects on global ocean circulation. I make a short appearance 8 minutes into the episode to report on how much ice Greenland has lost in the past decade. The series is hosted by polar explorer and climate journalist Bernice Notenboom, the first woman to perform the remarkable triple feat of climbing Mt. Everest and walking to the North and South Poles. In each episode, Notenboom heads off to a far corner of the world to find scientists in the field undertaking vital climate research to try to understand how the climate system is changing and how long we have to make significant changes before we reach a tipping point--a point of no return when our climate system will be changed irreversibly.


http://www.wunderground.com/blog/JeffMasters/haiyans-true-intensity-and-death-toll-still-unknown