Learning Curves and Enhanced Geothermal (Part 1 of 2)

Why policies to support wind and solar power will save us money.

Blue Mountain Geothermal Plant, Nevada Geothermal Power (NGP), Humboldt County, Nevada, May 17, 2017

Credit: Dennis Schroeder / NREL

I recently listened to two podcasts by Dave Roberts, both of which are well worth checking out. The first is on learning curves and their implications for renewable energy policy; the second is on enhanced geothermal power. In both, Roberts interviews authors of recent studies on the respective topics. After listening to them, I wondered how learning curves apply to enhanced geothermal power. To learn more, I dug into the studies covered in the podcasts and then turned to my colleague Yeh-Tang Huang to get the latest on geothermal. This, then, is the first in a two-part series on learning curves, enhanced geothermal power, and what learning curves might tell us about the future cost of enhanced geothermal power.

Part 1: Learning curves: why policies to support wind and solar power will save us money

Even though wind and solar power are already the lowest-cost sources of new electricity generation in the U.S., continued policy support for these technologies will help lower energy costs for households, businesses, and industry. This may seem counterintuitive. If wind and solar are already the cheapest options, why should we continue to support them? In the past, I’ve argued that we should support them because we’re in a race against climate change, and while that is still true, continued support has other benefits. According to a recent study, the faster we deploy these technologies and move away from fossil fuels, the more money we will save: the study finds that a global transition away from fossil fuels by 2050 would save $12 trillion globally (net present value) over the next three decades compared to a “no transition” scenario.

Why does accelerating the transition toward clean energy save more money on energy costs? The answer comes down to learning curves. Technically, a learning curve describes the relationship between the cumulative amount of a given technology that has been deployed and the cost of that technology. The goal, however, is to capture how processes improve over time: experience with deploying a technology can improve the efficiency and proficiency of building and installing it, helping lower the cost of production and installation for each new unit. In this way, as cumulative deployment of a technology grows, the cost of deploying it in the future generally drops, because we become more efficient and develop better, cheaper processes to build and install that technology.
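The standard empirical form of a learning curve is often called Wright’s law: each doubling of cumulative deployment cuts unit cost by a fixed fraction, the “learning rate.” As a minimal sketch (the numbers below are illustrative, not taken from the study, and the study’s exact functional form may differ):

```python
import math

def wright_cost(cum_deployed, c0, x0, learning_rate):
    """Projected unit cost under Wright's law.

    c0 is the unit cost at a reference cumulative deployment x0;
    learning_rate is the fractional cost drop per doubling.
    All values here are illustrative.
    """
    # cost(x) = c0 * (x / x0) ** -b, where the learning rate LR = 1 - 2 ** -b
    b = -math.log2(1 - learning_rate)
    return c0 * (cum_deployed / x0) ** -b

# One doubling at a 20% learning rate: cost falls from 100 to 80.
print(wright_cost(200.0, c0=100.0, x0=100.0, learning_rate=0.20))
# Two doublings: 100 -> 80 -> 64.
print(wright_cost(400.0, c0=100.0, x0=100.0, learning_rate=0.20))
```

The power-law form is why learning curves look like straight lines when cost and cumulative deployment are both plotted on logarithmic axes.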

However, not all technologies show the same level of learning, or rate of improvement. Some show steep reductions, or “high” learning rates, where costs fall quickly as deployment increases, while others show much slower learning (see the figure below). According to the authors, the study applies more extensive and recent data and more robust computational methods to calculate learning curves for a range of technologies, and then applies those curves to the energy system as a whole, globally.

Notably, the study claims oil and gas show no learning. While the price of oil and gas fluctuates wildly, the authors conclude there is no obvious correlation between the amount of oil and gas we have extracted from the earth and the price of oil and gas. (Note that price is different from cost, but the authors rely on price, assuming it’s a reasonable proxy for marginal cost. Of course, the worldwide manipulation of oil, and increasingly gas, prices raises some questions for me.) Meanwhile, the study shows a remarkably consistent and steep relationship between the cost of wind, solar, and storage technologies and the amount deployed. As an example of how this plays out, here in the U.S. we’ve seen a 64 percent, 69 percent, and 82 percent reduction in the cost of residential, commercial-rooftop, and utility-scale solar, respectively, over the last decade, as solar capacity in the U.S. has increased 28-fold over the same period. The cost reductions have been driven largely by hardware cost reductions.
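Those figures imply rough per-doubling learning rates. As a back-of-the-envelope check, assuming the quoted cost declines and the 28-fold capacity growth apply over the same period and treating installed capacity as a proxy for cumulative deployment (the study’s own estimates may differ):

```python
import math

def implied_learning_rate(cost_decline, capacity_growth):
    """Per-doubling learning rate implied by a total fractional cost
    decline over a given multiple of capacity growth (illustrative)."""
    doublings = math.log2(capacity_growth)            # 28x growth ~ 4.8 doublings
    per_doubling = (1 - cost_decline) ** (1 / doublings)
    return 1 - per_doubling

# Quoted U.S. solar cost declines over a 28-fold capacity increase:
for segment, decline in [("residential", 0.64),
                         ("commercial rooftop", 0.69),
                         ("utility-scale", 0.82)]:
    lr = implied_learning_rate(decline, 28)
    print(f"{segment}: ~{lr:.0%} learning rate per doubling")
```

On these assumptions, the implied learning rates land roughly in the 19 to 30 percent per-doubling range, in line with the steep solar learning curves the study describes.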

As the study notes, the dynamics behind this relationship are not fully understood. Learning curves appear to be more pronounced for technologies that are manufactured in bulk. However, the cost reductions are not simply the result of increased efficiency of workers or manufacturing techniques, though those play a role. Learning curves instead appear to capture learning-by-doing throughout the supply chain.

Why, then, don’t extractive technologies such as oil and gas show nearly the same level of learning? It may be that as easier-to-reach deposits are used up, the technology innovation that might otherwise reduce costs has to be used to extract the harder-to-reach supplies. The study speculates that these industries may be stuck “running to stand still.” Of course, wind and solar have their own easier and harder sites to develop as well.

Learning curves are based on observed correlations and, as such, do not establish causation. While many energy system models rely on some version of learning curves to project future technology costs and performance, they generally limit how much costs or deployment can change during certain periods. The study documents how many global integrated assessment models relying on these constraints have consistently been wrong about the costs of renewables. (See figure below. Note how cost forecasts and cost floors have consistently been wrong compared to historical trends.) Nevertheless, I’d argue that some of these constraints make sense at the national level and in the near term, where models can only capture a portion of the global supply and cost picture. Indeed, the models that NRDC uses include these types of constraints.

This brings us to the second part of the study: applying the learning curves, without such constraints, in a model representing global energy markets. As noted earlier, the study finds that a fast transition away from fossil fuels by 2050 would save the world $12 trillion compared to a no-transition scenario. The results are eye-popping and hold across a range of conservative assumptions: the faster we switch to wind and solar, the more quickly these technologies become cheaper, and the more money we save by deploying these low-cost solutions.

The lessons from this study are even more pertinent given the passage of the Inflation Reduction Act. Many, especially in the fossil fuel industry, will argue that the incentives for clean energy in the IRA are sufficient and that cities, states, and the federal government shouldn’t take further actions to support clean energy. But what history is teaching us through learning curves is that for the right technologies, such as wind and solar, the faster we go, the more money we can save.

While wind and solar show a consistent rate of learning by doing, what about other, new or innovative technologies? What about a technology such as enhanced geothermal power, which promises to be carbon-free, widely available, and maybe even dispatchable? If we knew enhanced geothermal systems would follow a consistent learning curve, would we be smart to start deploying the technology as quickly as possible? We turn to this in part 2 of this series.
