Category Archives: Climate

Missing the point about efficiency rebounds … again

Breakthrough’s Nordhaus and Shellenberger (N&S) spot a bit of open-loop thinking about LED lighting:

On Tuesday, the Royal Swedish Academy of Sciences awarded the 2014 Nobel Prize in Physics to three researchers whose work contributed to the development of a radically more efficient form of lighting known as light-emitting diodes, or LEDs.

In announcing the award, the academy said, “Replacing light bulbs and fluorescent tubes with LEDs will lead to a drastic reduction of electricity requirements for lighting.” The president of the Institute of Physics noted: “With 20 percent of the world’s electricity used for lighting, it’s been calculated that optimal use of LED lighting could reduce this to 4 percent.”

The problem, of course, is that lighting’s share of electricity would only fall from 20% to 4% if there’s no feedback, i.e. if LEDs replace incandescents 1 for 1 (and the multiplier can’t be that big anyway, because CFLs and other efficient technologies already supply a lot of the light).

N&S go on to argue:

But it would be a mistake to assume that LEDs will significantly reduce overall energy consumption.

Why? Because rebound effects will eat up the efficiency gains:

“The growing evidence that low-cost efficiency often leads to faster energy growth was recently considered by both the Intergovernmental Panel on Climate Change and the International Energy Agency.”

“The I.E.A. and I.P.C.C. estimate that the rebound could be over 50 percent globally.”

Notice the sleight-of-hand: the first statement implies a rebound effect greater than 100%, while the evidence they’re citing describes a rebound of 50%, i.e. 50% of the efficiency gain is preserved, which seems pretty significant.
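For intuition, here’s a quick back-of-the-envelope calculation (a sketch in Python; the 20%/4% figures come from the quote above, and the rebound fractions are illustrative, not estimates):

```python
# Back-of-the-envelope rebound arithmetic with illustrative numbers.
# A rebound of r means a fraction r of the technical efficiency saving
# is eaten up by increased use of lighting services.

lighting_share = 0.20      # current share of world electricity used for lighting
no_feedback_share = 0.04   # claimed share if LEDs replace everything 1-for-1
technical_saving = 1 - no_feedback_share / lighting_share   # 80% of lighting energy

for rebound in [0.0, 0.5, 1.0, 1.2]:
    net_saving = technical_saving * (1 - rebound)
    new_share = lighting_share * (1 - net_saving)
    print(f"rebound {rebound:4.0%}: net saving {net_saving:4.0%}, "
          f"lighting at {new_share:.1%} of electricity")
```

With zero rebound you get the quoted 20% → 4% drop; at 50% rebound about half of the technical saving survives (lighting ends up around 12% of electricity); only above 100% rebound (backfire) does lighting energy actually grow.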

Presumably the real evidence they have in mind is http://iopscience.iop.org/0022-3727/43/35/354001 – authors Tsao & Saunders are Breakthrough associates. Saunders describes a 100% rebound for lighting here http://thebreakthrough.org/index.php/programs/energy-and-climate/understanding-energy-efficiency-rebound-interview-with-harry-saunders

Now the big non sequitur:

But LED and other ultraefficient lighting technologies are unlikely to reduce global energy consumption or reduce carbon emissions. If we are to make a serious dent in carbon emissions, there is no escaping the need to shift to cleaner sources of energy.

Let’s assume the premise is true – that the lighting rebound effect is 100% or more. That implies that lighting use is highly price elastic, which in turn means that an emissions price like a carbon tax will have a strong influence on lighting energy. Therefore pricing can play a major role in reducing emissions. It’s probably still true that a shift to clean energy is unavoidable, but it’s not an exclusive remedy, and a stronger rebound effect actually weakens the argument for clean sources.

Their own colleagues point this out:

In fact, our paper shows that, for the two 2030 scenarios (with and without solid-state lighting), a mere 12% increase in real electricity prices would result in a net decline in electricity-for-lighting consumption.

What should the real takeaway be?

  • Subsidizing lighting efficiency is ineffective, and possibly even counterproductive.
  • Subsidizing clean energy lowers the cost of delivering lighting and other services, and therefore will also be offset by rebound effects.
  • Emissions pricing is a win-win, because it encourages efficiency, counteracts rebound effects and promotes substitution of clean sources.

Reflections on Virgin Earth

Colleagues just pointed out the Virgin Earth Challenge, “a US$25 million prize for an environmentally sustainable and economically viable way to remove greenhouse gases from the atmosphere.”

John Sterman writes:

I think it inevitable that we will see more and more interest in CO2 removal. And IF it can be done without undermining mitigation I’d be all for it. I do like biochar as a possibility; though I am very skeptical of direct air capture and CCS. But the IF in the prior sentence is clearly not true: if there were effective removal technology it would create moral hazard leading to less mitigation and more emissions.

Even more interesting, direct air capture is not thermodynamically favored; needs lots of energy. All the finalists claim that they will use renewable energy or “waste” heat from other processes to power their removal technology, but how about using those renewable sources and waste heat to directly offset fossil fuels and reduce emissions instead of using them to power less efficient removal processes? Clearly, any wind/solar/geothermal that is used to power a removal technology could have been used directly to reduce fossil emissions, and will be cheaper and offset more net emissions. Same for waste heat unless the waste heat is too low temp to be used to offset fossil fuels. Result: these capture schemes may increase net CO2 flux into the atmosphere.

Every business knows it’s always better to prevent the creation of a defect than to correct it after the fact. No responsible firm would say “our products are killing the customers; we know how to prevent that, but we think our money is best spent on settling lawsuits with their heirs.” (Oh: GM did exactly that, and look how it is damaging them). So why is it ok for people to say “fossil fuel use is killing us; we know how to prevent that, but we’ve decided to spend even more money to try to clean up the mess after the pollution is already in the air”?

To me, many of these schemes reflect a serious lack of systems thinking, and the desire for a technical solution that allows us to keep living the way we are living without any change in our behavior. Can’t work.

I agree with John, and I think there are some additional gaps in systemic thinking about these technologies. Here are some quick reflections, in pictures.

[diagram: emitting vs. capturing]

A basic point for any system is that you can lower the level of a stock (all else equal) by reducing the inflow or increasing the outflow. So the idea of capturing CO2 is not totally bonkers. In fact, it lets you do at least one thing that you can’t do by reducing emissions. When emissions fall to 0, there’s no leverage to reduce CO2 in the atmosphere further. But capture could actively draw down the CO2 stock. However, we are very far from 0 emissions, and this is harder than it seems:

[diagram: air capture pushback from natural sinks]

Natural sinks have been graciously absorbing roughly half of our CO2 emissions for a long time. If we reduce emissions dramatically, and begin capturing, nature will be happy to give us back that CO2, ton for ton. So, the capture problem is actually twice as big as you’d think from looking at the excess CO2 in the atmosphere.
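Here’s a minimal two-stock sketch of that pushback (all numbers are assumptions for illustration – this is not a calibrated carbon cycle model):

```python
# Two stocks: excess CO2 in the atmosphere, and the excess that natural sinks
# (ocean + land) have already absorbed. Direct air capture removes CO2 from the
# atmosphere, but the sinks then re-release part of what they took up, so the
# net atmospheric drawdown is smaller than the gross amount captured.

dt = 1.0            # years
atm = 240.0         # GtC excess in the atmosphere (assumed)
sinks = 240.0       # GtC of past emissions already absorbed by sinks (assumed)
k = 0.05            # 1/yr exchange coefficient between the stocks (assumed)
capture = 2.0       # GtC/yr of direct air capture (assumed)
horizon = 50        # years simulated; emissions assumed already at zero

for _ in range(horizon):
    sink_flux = k * (atm - sinks)          # positive = atmosphere -> sinks
    atm += (-capture - sink_flux) * dt
    sinks += sink_flux * dt

print(f"gross capture: {capture * horizon:.0f} GtC, "
      f"net atmospheric drawdown: {240.0 - atm:.0f} GtC")
```

Run it longer and the sinks hand back more of what they absorbed, so the total that has to be captured approaches the whole cumulative emission, not just the airborne half.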

Currently, there’s also a problem of scale. Emissions are something like two orders of magnitude larger than potential markets for CO2, so there’s a looong way to go. And capture doesn’t scale like a service running on Amazon’s Elastic Compute Cloud; it’s bricks and mortar.

[diagram: emissions vs. capture scale]

And where does that little cloud go, anyway? Several proposals gloss over this, as in:

The process involves a chemical solution (that naturally absorbs CO2) being brought into contact with the air. This solution, now containing the captured CO2, is sent through a regeneration cycle which simultaneously extracts the CO2 as a high-pressure pipeline-quality product (ready to be put to numerous commercial uses) …

The biggest commercial uses I know of are beverage carbonation and enhanced oil recovery (EOR). Consider the beverage system:

[diagram: the beverage CO2 system]

CO2 sequestered in beverages doesn’t stay there very long! You’d have to start stockpiling vast quantities of Coke in salt mines to accumulate a significant quantity. This reminds me of Nike’s carbon-sucking golf ball. EOR is just as bad, because you put CO2 down a hole (hopefully it stays there), and oil and gas come back up, which are then burned … emitting more CO2.
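A quick stock-flow check (Little’s law: steady-state stock = throughput × residence time) makes the point; the numbers are deliberately absurd and entirely made up:

```python
# Even if the entire world CO2 emission stream were somehow carbonated into
# beverages, the amount parked there at any instant is throughput times the
# (short) residence time of a bottle before it's opened.

throughput = 32.6e9        # t CO2/yr (pretend all emissions go into soda)
residence_time = 4 / 52    # years a bottle sits before it's opened (assumed)

standing_stock = throughput * residence_time
print(f"CO2 held in beverages at any instant: {standing_stock:.1e} t, "
      f"vs. {throughput:.1e} t emitted every year")
```

Once the stock of unopened bottles stops growing, CO2 flows back out of the system as fast as it goes in – hence the salt mines full of Coke.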

Fortunately the biochar solutions do not suffer so much from this problem.

Next up, delays and moral hazard:

[diagram: capture, mitigation and moral hazard loops]

This is a cartoonish view of the control system driving mitigation and capture effort. The good news is that air capture gives us another negative loop (blue, top) by which we can reduce CO2 in the atmosphere. That’s good, especially if we mismanage the green loop. The moral hazard side effect is that the mere act of going through the motions of capture R&D reduces the perceived scale of the climate problem (red link), and therefore reduces mitigation, which actually makes the problem harder to solve.

Capture also competes with mitigation for resources, as in John’s process heat example:

[diagram: John’s process heat example]

It’s even worse than that, because a lot of mitigation efforts have fairly rapid effects on emissions. There are certainly long-lived aspects of energy and infrastructure that must be considered, but behavior can change a lot of emissions quickly and with off-the-shelf technology. The delay between air capture R&D and actual capturing, on the other hand, is bound to be fairly long, because it’s in its infancy, and has to make it through multiple discover/develop/deploy hurdles.

One of those hurdles is cost. Why would anyone bother to pay for air capture, especially in cases where it’s a sure loser in terms of thermodynamics and capital costs? Altruism is not a likely candidate, so it’ll take a policy driver. There are essentially two choices: standards and emissions pricing.

A standard might mandate (as the EPA and California have) that new power plants above a certain emissions intensity must employ some kind of offsetting capture. If coal wants to stay in business, it has to ante up. The silly thing about this, apart from inevitable complexity, is that any technology that meets the standard without capture, like combined cycle gas electricity currently, pays 0 for its emissions, even though they too are harmful.

Similarly, you could place a subsidy or bounty on tons of CO2 captured. That would be perverse, because taxpayers would then have to fund capture – not likely a popular measure. The obvious alternative would be to price emissions in general – positive for emissions, negative for capture. Then all sources and sinks would be on a level playing field. That’s the way to go, but of course we ought to do it now, so that mitigation starts working, and air capture joins in later if and when it’s a viable competitor.

I think it’s fine if people work on carbon capture and sequestration, as long as they don’t pretend that it’s anywhere near a plausible scale, or even remotely possible without comprehensive changes in incentives. I won’t spend my own time on a speculative, low-leverage policy when there are more effective, immediate and cheaper mitigation alternatives. And I’ll certainly never advise anyone to pursue a geoengineered world, any more than I’d advise them to keep smoking but invest in cancer research.

Climate Interactive – #12 climate think tank

Climate Interactive is #12 (out of 210) in the International Center for Climate Governance’s Standardized Ranking of climate think tanks (by per capita productivity):

  1. Woods Hole Research Center (WHRC)
  2. Basque Centre for Climate Change (BC3)
  3. Centre for European Policy Studies (CEPS)*
  4. Centre for European Economic Research (ZEW)*
  5. International Institute for Applied Systems Analysis (IIASA)
  6. Worldwatch Institute
  7. Fondazione Eni Enrico Mattei (FEEM)
  8. Resources for the Future (RFF)
  9. Mercator Research Institute on Global Commons and Climate Change (MCC)
  10. Centre International de Recherche sur l’Environnement et le Développement (CIRED)
  11. Institut Pierre Simon Laplace (IPSL)
  12. Climate Interactive
  13. The Climate Institute
  14. Buildings Performance Institute Europe (BPIE)
  15. International Institute for Environment and Development (IIED)
  16. Center for Climate and Energy Solutions (C2ES)
  17. Global Climate Forum (GCF)
  18. Potsdam Institute for Climate Impact Research (PIK)
  19. Sandbag Climate Campaign
  20. Civic Exchange

That’s some pretty illustrious company! Congratulations to all at CI.

How many things can you get wrong on one chart?

Let’s count:

  1. Truncate records that start ca. 1850 at an arbitrary starting point.
  2. Calculate trends around a breakpoint cherry-picked to most favor your argument.
  3. Abuse polynomial fits generally. (See this series.)
  4. Report misleading linear trends by simply dropping the quadratic term.
  5. Fail to notice the obvious: that temperature in the second period is, on average, higher than in the first.
  6. Choose a loaded color scheme that emphasizes #5.
  7. Fail to understand that temperature integrates CO2.
  8. Fallacy of the single cause (only CO2 affects temperature – in good company with Burt Rutan).

Those crazy Marxists are at it again

“Normally, conservatives extol the magic of markets and the adaptability of the private sector, which is supposedly able to transcend with ease any constraints posed by, say, limited supplies of natural resources. But as soon as anyone proposes adding a few limits to reflect environmental issues — such as a cap on carbon emissions — those all-capable corporations supposedly lose any ability to cope with change.” Krugman – NYT

Geoengineering justice & governance

From Clive Hamilton via Technology Review,

If humans were sufficiently omniscient and omnipotent, would we, like God, use climate engineering methods benevolently? Earth system science cannot answer this question, but it hardly needs to, for we know the answer already. Given that humans are proposing to engineer the climate because of a cascade of institutional failings and self-interested behaviours, any suggestions that deployment of a solar shield would be done in a way that fulfilled the strongest principles of justice and compassion would lack credibility, to say the least.

Geoengineering seems sure to make a mess, even if the tech works.

How I learned to stop worrying and love methane

RealClimate has a nice summary of recent atmospheric methane findings. Here’s the structure:

[causal diagram: recent methane findings]

The bad news (red) has been that methane release from permafrost and clathrates on the continental shelf appears to be significant. At the same time, methane release from natural gas seems to be larger than previously thought, and (partly for the same reason – fracking) gas resources appear to be larger. Both put upward pressure on atmospheric methane.

However, there are some constraints as well. The methane budget must be consistent with observations of atmospheric concentrations and gradients (green). Therefore, if one source is thought to be bigger, it must be the case historically that other natural or anthropogenic sources are smaller (or perhaps uptake is faster) by an offsetting amount (blue).
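Roughly, the bookkeeping works like this (a sketch with round numbers that are approximately right, but not authoritative):

```python
# Simple atmospheric methane budget: sources must balance the sink
# (burden / lifetime) plus the observed growth of the burden.

concentration = 1800.0   # ppb, approximate global mean CH4
tg_per_ppb = 2.78        # Tg CH4 per ppb (approximate conversion)
lifetime = 9.0           # years, approximate atmospheric lifetime
growth = 10.0            # Tg CH4/yr, approximate recent growth of the burden

burden = concentration * tg_per_ppb            # ~5000 Tg CH4 in the atmosphere
total_source = burden / lifetime + growth      # total source implied by observations
print(f"total CH4 source implied by observations: ~{total_source:.0f} Tg/yr")

# So if the natural gas source is revised upward by, say, 30 Tg/yr, other
# sources must be about 30 Tg/yr smaller (or the sink faster) to stay
# consistent with the observed concentration and its growth rate.
```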

This bad-news-good-news story does not rule out positive feedbacks from temperature or atmospheric chemistry, but at least we’re not cooked yet.

Golf is the answer

Lots of golf.

I couldn’t resist a ClimateDesk article mocking carbon-sucking golf balls, so I took a look at the patent.

I immediately started wondering about the golf ball’s mass balance. There are rules about these things. But the clever Nike engineers thought of everything,

Generally, a salt may be formed as a result of the reaction between the carbon dioxide absorbent and the atmospheric carbon dioxide. The presence of this salt may cause the golf ball to increase in weight. This increase in weight may be largely negligible, or the increase in weight may be sufficient to be measurable and affect the play characteristics of the golf ball. The United States Golf Association (USGA) official Rules of Golf require that a regulation golf ball weigh no more than 45.93 grams. Therefore, a golf ball in accordance with this disclosure may be manufactured to weigh some amount less than 45.93, so that the golf ball may increase in weight as atmospheric carbon dioxide is absorbed. For example, a finished golf ball manufactured in accordance with this disclosure may weigh 45.5 grams before absorbing any significant amount of atmospheric carbon dioxide.

Let’s pretend that 0.43 grams of CO2 is “significant” and do the math here. World energy CO2 emissions were about 32.6 billion metric tons in 2011. That’s 32.6 gigatons or petagrams, so you’d need about 76 petaballs per year to absorb it. That’s 76,000,000,000,000,000 balls per year.

It doesn’t sound so bad if you think of it as 11 million balls per capita per year. Think of the fun you could have with 11 million golf balls! Plus, you’d have 22 million next year, except for the ones you whacked into a water trap.

Because the conversion efficiency is so low (less than half a gram of CO2 uptake per 45 gram ball, i.e. about 1%), you need roughly 100 grams of ball per gram of CO2 captured. This means that the mass flow of golf balls would have to exceed the total mass flow of food, fuels, minerals and construction materials on the planet, by a factor of 50.

76 petaballs take up about 4850 cubic kilometers, so we’d soon have to decide where to put them. I think Scotland would be appropriate. We’d only have to add a 60-meter layer of balls to the country each year.
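For the skeptical, here’s the arithmetic in one place (a sketch; the ball diameter, population, packing fraction and Scotland’s area are my own assumptions, not from the patent):

```python
import math

# Sanity-check the petaball arithmetic with rounded inputs.
emissions = 32.6e15          # g CO2/yr (~32.6 Gt world energy CO2, 2011)
uptake_per_ball = 0.43       # g CO2 absorbed per ball (45.93 - 45.5)
ball_mass = 45.5             # g per ball
ball_diameter = 0.04267      # m, regulation golf ball
population = 7.0e9           # people
scotland_area = 78_000e6     # m^2 (~78,000 km^2)
packing = 0.64               # random close packing fraction for spheres

balls = emissions / uptake_per_ball                    # balls needed per year
per_capita = balls / population                        # balls per person per year
mass_flow = balls * ball_mass / 1e6                    # tonnes of balls per year
ball_volume = math.pi / 6 * ball_diameter ** 3         # m^3 per ball
volume = balls * ball_volume / packing                 # m^3 of packed balls per year

print(f"{balls:.1e} balls/yr, {per_capita:.1e} per capita, "
      f"{mass_flow:.1e} t of balls/yr, {volume / 1e9:.0f} km^3/yr, "
      f"{volume / scotland_area:.0f} m/yr over Scotland")
```

Reassuringly (or not), that roughly reproduces the 76 petaballs, 11 million balls per capita, ~4,850 cubic kilometers and ~60 meter layer quoted above.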

A train bringing 10,000 tons of coal to a power plant (three days of fuel for 500MW) would have to make a lot more trips to carry away the 1,000,000 tons of balls needed to offset its emissions. That’s a lot of rail traffic, so it might make sense to equip plants with an array of 820 rotary cannon retrofitted to fire balls into the surrounding countryside. That’s only 90,000 balls per second, after all. Perhaps that’s what analysts mean when they say that there are no silver bullets, only silver buckshot. In any case, the meaning of “climate impacts” would suddenly be very palpable.

Dealing with this enormous mass flow would be tough, but there would be some silver linings. For starters, the earth’s entire fossil fuel output would be diverted to making plastics, so emissions would plummet, and the whole scale of the problem would shrink to manageable proportions. Golf balls are pretty tough, so those avoided emissions could be sequestered for decades. In addition, golf balls float, and they’re white, so we could release them in the arctic to replace melting sea ice.

Who knows what other creative uses of petaballs the free market will invent?

Update, courtesy of Jonathan Altman:

[image: Animal House marbles scene]

Summary for Suckers

The NIPCC critique is, ironically, a compelling argument in favor of the IPCC assessment. Why? Well, science is about evaluation of competing hypotheses. The NIPCC report collects a bunch of alternatives to mainstream climate science in one place, where it’s easy to see how pathetic they are. If this is the best climate skeptics can muster, their science must be exceedingly weak.

The NIPCC (Nongovernmental International Panel on Climate Change, a.k.a. Not IPCC) is the Heartland Institute’s rebuttal of the IPCC assessments. Apparently the latest NIPCC report has been mailed to zillions of teachers. As a homeschooling dad, I’m disappointed that I didn’t get mine. Well, not really.

It would probably take more pages to debunk the NIPCC report than it occupies, but others are chipping away at it. Some aspects, like temperature cherry-picking, are like shooting fish in a barrel.

The SPM, and presumably the entire report that it summarizes, seems to labor under the misapprehension that the IPCC is itself a body that conducts science. In fact, the IPCC assessments are basically a giant literature review. So, when the Heartland panel writes,

In contradiction of the scientific method, the IPCC assumes its implicit hypothesis is correct and that its only duty is to collect evidence and make plausible arguments in the hypothesis’s favor.

we must remember that “the IPCC” is shorthand for a vast conspiracy of scientists, coordinated by an invisible hand.

The report organizes the IPCC argument into 3 categories: “Global Climate Model (GCM) projections,” “postulates,” and “circumstantial evidence.” This is a fairly ridiculous caricature of the actual body of work. Most of what is dismissed as “postulates,” for example, could better be described as “things we’re too lazy to explore properly.” But my eye strays straight to the report’s misconceptions about modeling.

First, the NIPCC seems to have missed the fact that GCMs are not the only models in use. There are EMICs (Earth system models of intermediate complexity) and low-order energy balance models as well.

The NIPCC has taken George Box’s “all models are wrong, some are useful” and run with it:

… Global climate models produce meaningful results only if we assume we already know perfectly how the global climate works, and most climate scientists say we do not (Bray and von Storch, 2010).

How are we to read this … all models are useless, unless they’re perfect? Of course, no models are perfect, therefore all models are useless. Now that’s science!

NIPCC trots out a von Neumann quote that’s almost as tired as Box:

with four parameters I can fit an elephant, and with five I can make him wiggle his trunk

In models with lots of reality checks available (i.e. laws of physics), it just isn’t that easy. And the earth is a very big elephant, which means that there’s a rather vast array of data to be fit.

The NIPCC seems to be aware of only a few temperature series, but the AR5 report devotes 200 pages (Chapter 9) to model evaluation, with results against a wide variety of spatial and temporal distributions of physical quantities. Models are obviously far from perfect, but a lot of the results look good, in ways that exceed the wildest dreams of social system modelers.

NIPCC doesn’t seem to understand how this whole “fit” thing works.

Model calibration is faulty as it assumes all temperature rise since the start of the industrial revolution has resulted from human CO2 emissions.

This is blatantly false, not only because it contradicts the actual practice of attribution, but because there is no such parameter as “fraction of temp rise due to anthro CO2.” One can’t assume the answer to the attribution question without passing through a lot of intermediate checks, like conforming to physics and data other than global temperature. In complex models, the contribution of any individual parameter to the outcome is likely to be unknown to the modeler, and the model is too big to calibrate by brute force. The vast majority of parameters must therefore be established bottom up, from physics or submodels, which makes it extremely difficult for the modeler to impose preconceptions on the complete model.

Similarly,

IPCC models stress the importance of positive feedback from increasing water vapor and thereby project warming of ~3-6°C, whereas empirical data indicate an order of magnitude less warming of ~0.3-1.0°C.

Data by itself doesn’t “indicate” anything. Data only speaks insofar as it refutes (or fails to refute) a model. So where is the NIPCC model that fits available data and yields very low climate sensitivity?

The bottom line is that, if it were really true that models have little predictive power and admit many alternative calibrations (à la the elephant), it should be easy for skeptics to show model runs that fit the data as well as mainstream results, with assumptions that are consistent with low climate sensitivity. They wouldn’t necessarily need a GCM and a supercomputer; modest EBMs or EMICs should suffice. This they have utterly failed to demonstrate.
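To make the challenge concrete, here’s the kind of minimal model that would suffice – a toy zero-dimensional energy balance model (the heat capacity and forcing ramp below are assumptions for illustration; a serious attempt would drive it with a real forcing history and compare against observed temperatures):

```python
import numpy as np

# Zero-dimensional energy balance model:  C dT/dt = F(t) - lam * T
# Equilibrium climate sensitivity is ECS = F_2x / lam, so a very low
# sensitivity requires a large lam - and then the same model has to
# reproduce observed warming with a defensible forcing history.

C = 8.0        # W yr m^-2 K^-1, effective heat capacity (assumed)
F_2x = 3.7     # W m^-2, forcing from doubled CO2
years = np.arange(1850, 2021)
F = 2.5 * (years - 1850) / (2020 - 1850)   # toy forcing ramp to 2.5 W m^-2 (assumed)

def simulate(lam, dt=1.0):
    T, out = 0.0, []
    for f in F:
        T += dt * (f - lam * T) / C
        out.append(T)
    return np.array(out)

for lam in [1.2, 3.7]:   # roughly ECS ~3 K vs ECS ~1 K
    T = simulate(lam)
    print(f"lam = {lam}: ECS = {F_2x / lam:.1f} K, "
          f"simulated warming by 2020 = {T[-1]:.2f} K")
```

The fitting exercise is then to find a feedback parameter (and hence a sensitivity) that reproduces observed warming and other constraints with a defensible forcing series – which is exactly what the low-sensitivity camp hasn’t produced.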

 

Pindyck on Integrated Assessment Models

Economist Robert Pindyck takes a dim view of the state of integrated assessment modeling:

Climate Change Policy: What Do the Models Tell Us?

Robert S. Pindyck

NBER Working Paper No. 19244

Issued in July 2013

Very little. A plethora of integrated assessment models (IAMs) have been constructed and used to estimate the social cost of carbon (SCC) and evaluate alternative abatement policies. These models have crucial flaws that make them close to useless as tools for policy analysis: certain inputs (e.g. the discount rate) are arbitrary, but have huge effects on the SCC estimates the models produce; the models’ descriptions of the impact of climate change are completely ad hoc, with no theoretical or empirical foundation; and the models can tell us nothing about the most important driver of the SCC, the possibility of a catastrophic climate outcome. IAM-based analyses of climate policy create a perception of knowledge and precision, but that perception is illusory and misleading.

Freepers seem to think that this means the whole SCC enterprise is GIGO. But this is not a case where uncertainty is your friend. Bear in mind that the deficiencies Pindyck discusses – arbitrary discounting of future welfare and neglect of catastrophic outcomes – create a one-sided bias toward an SCC that is too low. Zero (the de facto internalized SCC in most places) is one number that’s virtually certain to be wrong.
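To see why the discount rate alone can swing the SCC so much, here’s a toy present-value calculation (the damage stream is entirely made up; real IAMs use elaborate damage functions, but the discounting arithmetic is the same):

```python
# Present value of a flat $1/yr marginal damage stream from one extra ton of
# CO2, over a 200-year horizon, at different discount rates (all numbers are
# placeholders chosen only to show the sensitivity to discounting).

damage_per_year = 1.0   # $/yr per ton emitted today (assumed)
horizon = 200           # years

for r in [0.01, 0.03, 0.05]:
    scc = sum(damage_per_year / (1 + r) ** t for t in range(1, horizon + 1))
    print(f"discount rate {r:.0%}: SCC ~ ${scc:.0f}/ton")
```

With an identical damage stream, moving from a 5% to a 1% discount rate roughly quadruples the estimate – and that’s before considering fat-tailed catastrophic outcomes, whose mostly far-future damages discounting suppresses even more strongly.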