Read Romm on Gates & R&D vs deployment

This post from Joe Romm on where Bill Gates got it wrong in his TED speech is worth a read. Quick summary – Bill Gates took the ‘we need a technology miracle’ line & said we should spend the next 20 years researching & developing new technologies, and the 20 years after that implementing them. Romm argues that while R&D is important, deployment is far more important for preventing runaway climate change, as the majority of the emissions reduction ‘wedges’ will come from mass commercial deployment of existing technology, not new silver bullets fired from carbon-neutral guns. Like the US, Australia has taken important steps like the R&D tax credit & ‘Commercialisation Australia’, but we’re a long way away from anything really significant on the deployment front:

WHY DEPLOYMENT, FAR MORE THAN R&D, IS THE KEY TO BOTH INNOVATION AND STABILIZING AT OR BELOW 2°C.

I was acting assistant secretary (and principal deputy assistant secretary) of energy for energy efficiency and renewable energy from 1995 to 1998, helping to run the billion-dollar federal office in charge of research, development, demonstration, and deployment of most low-carbon technologies, including three of Gates’ would-be miracles.  For much of that time I was in charge of technology and market analysis for the office.  Since then, I have written a number of books on low carbon technology development and deployment.

So I have thought a lot about whether Gates is right that we need multiple “energy miracles” developed through a $10 billion-a-year government R&D effort to stabilize at 350 to 450 ppm.

Put more quantitatively, the question is — What are the chances that multiple (4 to 8+) carbon-free technologies that do not exist today can each deliver the equivalent of 350 Gigawatts baseload power (~2.8 billion Megawatt-hours a year) and/or 160 billion gallons of gasoline cost-effectively by 2050? [Note — that is about half of a stabilization wedge.] For the record, the U.S. consumed about 3.7 billion MW-hrs in 2005 and about 140 billion gallons of motor gasoline.
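Romm's equivalence between 350 GW of baseload and ~2.8 billion MWh/year can be checked with a few lines of arithmetic. The capacity factor below is an assumption on my part (the post doesn't state one), chosen at ~90% as is typical for baseload plants:

```python
# Sanity check on Romm's half-wedge figures.
# Assumption (not in the post): "baseload" implies a ~90% capacity factor.
gw = 350                   # installed baseload capacity, GW
hours_per_year = 8760
capacity_factor = 0.90     # assumed; typical for nuclear/coal baseload

gwh_per_year = gw * hours_per_year * capacity_factor
mwh_per_year = gwh_per_year * 1_000

print(f"{mwh_per_year / 1e9:.2f} billion MWh/yr")  # ~2.76, matching Romm's ~2.8
```

That ~2.8 billion MWh is roughly three-quarters of the 3.7 billion MWh the U.S. consumed in 2005, which gives a sense of the scale each half-wedge demands.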

Put that way, the answer to the question is painfully obvious: “two chances — slim and none.” Indeed, I have repeatedly challenged readers and listeners over the years to name even a single technology breakthrough with such an impact in the past three decades, after the huge surge in energy funding that followed the energy shocks of the 1970s. Nobody has ever named a single one that has even come close.

Yet somehow the government is not just going to invent one TILT (Terrific Imaginary Low-carbon Technology) in the next few years, we are going to invent several TILTs comparable to the microprocessor. Seriously. Hot fusion? No. Cold fusion? As if. Space solar power? Come on, how could that ever compete with solar baseload (aka CSP)? Hydrogen? It ain’t even an energy source, and after billions of dollars of public and private research in the past 15 years — including several years running of being the single biggest focus of the DOE office on climate solutions I once ran — it still has no chance whatsoever of delivering a major cost-effective climate solution by midcentury if ever (see “California Hydrogen Highway R.I.P.”).

I don’t know why the energy miracle crowd can’t see the obvious — so I will elaborate here. I will also discuss a major study that explains why deployment programs are so much more important than R&D at this point. Let’s keep this simple:

  • To stabilize below 450 ppm, we need to deploy by 2050 some 12 to 14 stabilization wedges (each delivering 1 billion tons of avoided carbon) covering both efficient energy use and carbon-free supply (see here).  The technologies we have today, plus a few that are on the verge of being commercialized, can provide the needed low-carbon energy [see “How the world can stabilize at 350 to 450 ppm: The full global warming solution (updated)“].
  • Myriad energy-efficient solutions are already cost-effective today.  Breaking down the barriers to their deployment now is much, much more important than developing new “breakthrough” efficient TILTs, since those would simply fail in the marketplace because of the same barriers.  Cogeneration is perhaps the clearest example of this.
  • On the supply side, deployment programs (coupled with a price for carbon) will always be much, much more important than R&D programs because new technologies take an incredibly long time to achieve mass-market commercial success. New supply TILTs would not simply emerge at a low cost. They need volume, volume, volume — steady and large increases in demand over time to bring the cost down, as I discuss at length below.
  • No existing or breakthrough technology is going to beat the price of power from a coal plant that has already been built — the only way to deal with those plants is a high price for carbon or a mandate to shut them down. Indeed, that’s why we must act immediately not to build those plants in the first place.
  • If a new supply technology can’t deliver half a wedge, it won’t be a big player in achieving 350-450 ppm.

For better or worse, we are stuck through 2050 with the technologies that are commercial today (like solar thermal electric) or that are very nearly commercial (like plug-in hybrids).

I have discussed most of this at length in previous posts (listed below), so I won’t repeat all the arguments here. Let me just focus on a few key points. A critical historical fact was explained by Royal Dutch/Shell, in their 2001 scenarios for how energy use is likely to evolve over the next five decades (even with a carbon constraint):

“Typically it has taken 25 years after commercial introduction for a primary energy form to obtain a 1 percent share of the global market.”

Note that this tiny toe-hold comes 25 years after commercial introduction. The first transition from scientific breakthrough to commercial introduction may itself take decades. We still haven’t seen commercial introduction of a hydrogen fuel cell car and have barely seen any commercial fuel cells — over 160 years after they were first invented.

This tells you two important things. First, new breakthrough energy technologies simply don’t enter the market fast enough to have a big impact in the time frame we care about. We are trying to get 5% to 10% shares — or more — of the global market for energy, which means massive deployment by 2050 (if not sooner).

Second, if you are in the kind of hurry we are all in, then you are going to have to take unusual measures to deploy technologies far more aggressively than has ever occurred historically. That is, speeding up the deployment side is much more important than generating new technologies. Why? Virtually every supply technology in history has a steadily declining cost curve, whereby greater volume leads to lower cost in a predictable fashion because of economies of scale and the manufacturing learning curve.
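The "predictable fashion" Romm describes is usually modeled as an experience curve (Wright's law): cost falls by a fixed fraction, the learning rate, with each doubling of cumulative production. A minimal sketch, with an illustrative 20% learning rate and starting price chosen by me for the example (neither figure comes from the post):

```python
import math

def experience_curve_cost(c0, cumulative, x0, learning_rate):
    """Cost per unit after `cumulative` volume, given cost c0 at volume x0.

    Each doubling of cumulative volume cuts cost by `learning_rate`.
    """
    b = -math.log2(1 - learning_rate)     # progress exponent
    return c0 * (cumulative / x0) ** (-b)

# Illustrative numbers: $4/W at 1 GW cumulative, 20% learning rate.
c0, x0, lr = 4.0, 1.0, 0.20
for doublings in range(5):
    vol = x0 * 2 ** doublings
    cost = experience_curve_cost(c0, vol, x0, lr)
    print(f"{vol:5.0f} GW cumulative -> ${cost:.2f}/W")
```

The point the curve makes is Romm's point: the cost reduction is driven by cumulative volume, so the only lever is deploying more, sooner.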

WHY DEPLOYMENT NOW COMPLETELY TRUMPS RESEARCH

How do we achieve rapid innovation in existing technologies, as Gates suggests he wants?

A major 2000 report by the International Energy Agency, Experience Curves for Energy Technology Policy has a whole bunch of experience curves for various energy technologies. Let me quote some key passages:

Wind power is an example of a technology which relies on technical components that have reached maturity in other technological fields…. Experience curves for the total process of producing electricity from wind are considerably steeper than for wind turbines. Such experience curves reflect the learning in choosing sites for wind power, tailoring the turbines to the site, maintenance, power management, etc, which all are new activities.

Or consider PV:

Existing data show that experience curves provide a rational and systematic methodology to describe the historical development and performance of technologies….

The experience curve shows the investment necessary to make a technology, such as PV, competitive, but it does not forecast when the technology will break-even. The time of break-even depends on deployment rates, which the decision-maker can influence through policy. With historical annual growth rates of 15%, photovoltaic modules will reach break-even point around the year 2025. Doubling the rate of growth will move the break-even point 10 years ahead to 2015.

Investments will be needed for the ride down the experience curve, that is for the learning efforts which will bring prices to the break-even point. An indicator for the resources required for learning is the difference between actual price and break-even price, i.e., the additional costs for the technology compared with the cost of the same service from technologies which the market presently considers cost-efficient. We will refer to these additional costs as learning investments, which means that they are investments in learning to make the technology cost-efficient, after which they will be recovered as the technology continues to improve.
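The break-even timing in the IEA passage follows from compound growth: break-even arrives once cumulative volume has grown by some fixed factor (funded by the learning investments), and the time to reach that factor shrinks as the growth rate rises. The ~33x volume multiple below is my illustrative assumption, chosen so the dates roughly reproduce the report's 2025-vs-2015 example; it is not a figure from the report:

```python
import math

def years_to_reach(volume_multiple, annual_growth):
    """Years for cumulative deployment to grow by `volume_multiple`
    at a constant `annual_growth` rate (e.g. 0.15 for 15%/yr)."""
    return math.log(volume_multiple) / math.log(1 + annual_growth)

# Suppose break-even needs cumulative PV volume to grow ~33x (illustrative).
target = 33
t_slow = years_to_reach(target, 0.15)   # ~25 yrs: 2000 -> ~2025
t_fast = years_to_reach(target, 0.30)   # ~13 yrs: 2000 -> ~2013
print(f"15%/yr: {t_slow:.0f} years; 30%/yr: {t_fast:.0f} years")
```

Doubling the growth rate roughly halves the time to any given cumulative volume, which is why the IEA found it pulls break-even about a decade closer.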

Here is a key conclusion:

for major technologies such as photovoltaics, wind power, biomass, or heat pumps, resources provided through the market dominate the learning investments. Government deployment programmes may still be needed to stimulate these investments. The government expenditures for these programmes will be included in the learning investments.

Obviously government R&D, and especially first-of-a-kind demonstration programs, are critical before the technology can be introduced to the marketplace on a large scale — and I’m glad Obama has doubled spending in this area. But, we “expect learning investments to become the dominant resource for later stages in technology development, where the objectives are to overcome cost barriers and make the technology commercial.”

We are really in a race to get technologies into the learning curve phase: “The experience effect leads to a competition between technologies to take advantage of opportunities for learning provided by the market. To exploit the opportunity, the emerging and still too expensive technology also has to compete for learning investments.”

In short, you need to get from first demonstration to commercial introduction as quickly as possible to be able to then take advantage of the learning curve before your competition does. Again, that’s why if you want mass deployment of the technology by 2050, we are mostly stuck with what we have today or very soon will have. Some breakthrough TILT in the year 2025 will find it exceedingly difficult to compete with technologies like CSP or wind that have had decades of such learning.

And that is why the analogy of a massive government Apollo program or Manhattan project is so flawed. Those programs were to create unique non-commercial products for a specialized customer with an unlimited budget. Throwing money at the problem was an obvious approach. To save a livable climate we need to create mass-market commercial products for lots of different customers who have limited budgets. That requires a completely different strategy.

The vast majority — if not all — of the wedge-sized solutions for 2050 will come from technologies that are now commercial or very soon will be. And federal policy must be designed with that understanding in mind. The IEA report concluded:

A general message to policy makers comes from the basic philosophy of the experience curve. Learning requires continuous action, and future opportunities are therefore strongly coupled to present activities. If we want cost-efficient, CO2-mitigation technologies available during the first decades of the new century, these technologies must be given the opportunity to learn in the current marketplace. Deferring decisions on deployment will risk lock-out of these technologies, i.e., lack of opportunities to learn will foreclose these options making them unavailable to the energy system.

… the low-cost path to CO2-stabilisation requires large investments in technology learning over the next decades. The learning investments are provided through market deployment of technologies not yet commercial, in order to reduce the cost of these technologies and make them competitive with conventional fossil-fuel technologies. Governments can use several policy instruments to ensure that market actors make the large-scale learning investments in environment-friendly technologies. Measures to encourage niche markets for new technologies are one of the most efficient ways for governments to provide learning opportunities. The learning investments are recovered as the new technologies mature, illustrating the long-range financing component of cost-efficient policies to reduce CO2 emissions. The time horizon for learning stretches over several decades, which require long-term, stable policies for energy technology.

Deployment, deployment, deployment, R&D, deployment, deployment, deployment.

In light of the problems with the roof insulation grant program, it’s clear that what Romm describes as the need for ‘unusual measures to deploy technologies far more aggressively than has ever occurred historically’ demands not only policy of unparalleled boldness, but policy implementation of unparalleled skill. The next few decades will be an interesting time to be a public servant.
