On April 15th, the California Public Utilities Commission’s Energy Division staff released a draft report estimating the energy savings of the California investor-owned utilities’ 2006-2008 energy efficiency programs. When the programs were approved, they were by far the largest utility investment in energy efficiency ever. The efficiency programs are designed to save consumers money and reduce the environmental impacts of energy use through the use of energy efficient products. When the 2006-08 efficiency programs were completed, they had helped Californians purchase, install and use even more energy efficient lighting, appliances, and equipment than was expected when the programs were approved in 2005. Energy Division staff reports that verified savings, measured using the savings assumptions from program planning, came to 110% of what was expected!
Given this apparent success, there are significant questions about the April 15th draft staff report, which indicates that the California utilities missed the savings goals the programs were designed and approved to achieve. The discrepancy is the result of evaluation studies completed by various consultants who estimated that nearly every efficiency program saved less energy than was expected. Ducts were sealed, air conditioners and appliances were installed, and light bulbs were replaced with more efficient models, but according to the evaluations, when the savings were counted they didn’t add up to what was expected. (Although, as my colleague Devra Wang notes in her blog, even the most conservative estimates show the efficiency programs still provided enormous benefits for utility customers, the state’s economy and the environment.)
Some variation in savings should be expected: For instance, everyone knows that a CFL produces light using 75-80% less energy than an incandescent, but it’s harder to know how many hours the efficient bulb will be used, or how quickly after purchase it will be installed. My colleague Peter Miller discusses the Energy Division’s Upstream Lighting Program evaluation report in his blog post. But the changes in estimated savings aren’t minor tweaks. The estimates indicate very significant reductions in savings across a wide range of efficiency programs, which raises the question: Are these new estimates reasonable? How much energy was really saved?
NRDC’s review of the savings studies indicates that some new estimates are based on reasonable methods and rigorous analysis. These reports indicate that some programs could be improved in future years to deliver substantially more savings. For example, a report on a duct-sealing incentive program suggests the program could deliver more savings if contractors were paid based on energy saved, instead of by the job, thereby rewarding thorough and complete work. But other reports appear to vary widely in methodology, timing, sample size and even, according to the conclusions of the evaluators themselves, success at estimating the savings and attribution of the efficiency programs. Of particular concern are a number of the estimates of savings attribution (the estimate of what portion of the efficiency upgrades incentivized by the utilities would have happened without utility support). Attribution is a very hard thing to measure after a program is completed, and many of the studies don’t appear to have produced estimates with any substantial degree of confidence, yet many of the most significant reductions in savings estimates come from new attribution assumptions.
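To see why attribution assumptions drive the totals so strongly, consider a simplified net-to-gross calculation. The figures below are entirely hypothetical and are not taken from the draft report; they only illustrate the arithmetic by which a lower attribution estimate shrinks credited savings even when the measured installations are unchanged:

```python
# Hypothetical illustration of how attribution (net-to-gross) assumptions
# scale a program's credited savings. All numbers are made up for the
# sake of the arithmetic; none come from the CPUC draft report.

gross_savings_gwh = 100.0  # hypothetical energy saved by all installed measures

# Attribution: the share of installations assumed to have happened
# only because of the utility program (the net-to-gross ratio).
planning_ntg = 0.80    # hypothetical assumption used at program approval
evaluated_ntg = 0.50   # hypothetical lower estimate from a later evaluation

planning_net_gwh = gross_savings_gwh * planning_ntg    # savings credited at planning
evaluated_net_gwh = gross_savings_gwh * evaluated_ntg  # savings credited after evaluation

# With identical installed measures, the lower attribution assumption
# alone cuts credited savings from 80 GWh to 50 GWh.
reduction = (planning_net_gwh - evaluated_net_gwh) / planning_net_gwh
print(planning_net_gwh, evaluated_net_gwh, reduction)
```

The same ductwork gets sealed and the same bulbs get installed in both scenarios; only the after-the-fact estimate of what would have happened anyway changes, which is why the confidence behind those attribution estimates matters so much.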
The evaluation reports represented a significant undertaking: the approved 2006-2008 Energy Division savings evaluation budget was almost $120 million. But the full Commission has not yet reviewed the savings studies, instead leaving this important function essentially in the hands of consultants, with staff review and only informal party comments. The wide variation in the certainty of the evaluation reports, combined with the impact of their results on savings estimates for past, current and future efficiency programs, merits consideration by the full Commission.
These estimates matter. Hanging in the balance are an upcoming decision by the Commission on financial rewards or penalties for the utilities’ performance at delivering energy efficiency programs; the savings that can be expected from the current 2010-2012 and future programs; and the savings from efficiency programs around the country, since many states look to California’s evaluation studies to determine savings from their own programs. Where the draft report provides good, new and useful information, it should be used to adjust the current programs being deployed by the utilities. But assessment information that is not based on sufficient data or sound analysis should not encumber future efficiency investment opportunities in California.
Before the Commission approves the draft report, it should dig into the detailed evaluations, decide what level of certainty it requires before changing savings estimates, and ensure it has confidence in the final savings estimates.