Thought the Hollywood award season was over? Not quite! The Climate Change Business Journal (CCBJ) just awarded the Project Merit award for Smart Grid & Energy Management to the recent cloud computing study conducted by NRDC in partnership with WSP Energy and Environment.
The groundbreaking report aimed to help IT managers assess the environmental impacts of Internet-based “cloud” computing relative to on-premise computing, and to understand the key factors that drive the environmental sustainability of their business computing operations, wherever they choose to locate them.
Although cloud computing providers have been touting the environmental benefits of their platforms, until NRDC’s and WSP’s study there was no independent analysis to validate that cloud computing is indeed the most eco-friendly choice.
And the Winner Is…
Our study found that running a computer application in the cloud is generally more energy- and carbon-efficient than running it on-premise because cloud computing can serve more customers with fewer resources.
But It Ain’t So Simple!
However, we also found that not all clouds are created equal: there are “green” clouds and “brown” clouds, just like there are green and brown on-premise server rooms. And green server rooms can be much more energy- and carbon-efficient than brown clouds.
Green clouds and server rooms are those that employ energy- and carbon-efficiency best practices, from high levels of utilization of their servers and other equipment, to efficient cooling and power distribution, to sourcing electricity from cleaner sources such as renewable energy.
Much of industry’s focus has been on the efficiency of the facilities that house servers, including power and cooling equipment. While this is important and progress in that area needs to continue, we found that server utilization (how much of the server’s capacity is actually used) and the carbon intensity of the power source (the emissions from electricity generation and distribution) could have an even larger impact. The following chart illustrates the potential reduction of the carbon footprint of an office productivity application from each of these three factors individually:
Carbon intensity has the largest impact: a data center located in the U.S. region with the cleanest power mix would have less than one-third the carbon footprint compared to the same data center located in the region with the dirtiest power mix.
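The arithmetic behind that comparison is simple: a data center's footprint is its energy consumption multiplied by the carbon intensity of its power supply. A minimal sketch, using hypothetical intensity values chosen only to illustrate the "less than one-third" ratio (they are not eGRID or study figures):

```python
# Back-of-envelope illustration of the carbon-intensity effect described above.
# All numbers are hypothetical placeholders, not figures from the study.

def annual_footprint_kg(annual_kwh, kg_co2_per_kwh):
    """Carbon footprint = energy consumed x carbon intensity of the supply."""
    return annual_kwh * kg_co2_per_kwh

ANNUAL_KWH = 10_000_000  # a mid-sized data center's yearly draw (assumed)

clean = annual_footprint_kg(ANNUAL_KWH, 0.25)  # cleaner regional power mix (assumed)
dirty = annual_footprint_kg(ANNUAL_KWH, 0.85)  # dirtier regional power mix (assumed)

print(f"Clean-grid footprint is {clean / dirty:.0%} of the dirty-grid footprint")
```

Because the energy term is identical for the same facility, the footprint ratio is just the ratio of the two grid intensities.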
The factor with the next largest impact is server utilization: typical servers in millions of U.S. data centers are working at less than 10 percent of capacity on average. Think of an airplane with less than 10 percent of seats filled: no airline could afford to run so inefficiently. But that’s what is happening in U.S. data centers: most servers spend the vast majority of their time idle, doing little or no work while still drawing significant power. Running fewer servers at higher levels of use, and putting unused ones to sleep until they are needed, represent two of the largest energy-saving opportunities in data centers.
In fact, increasing average server utilization from 5 percent to 70 percent would reduce the data center’s carbon footprint by over 60 percent, and that savings estimate does not even include increasing the utilization of other data center equipment!
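The reason consolidation saves so much is that an idle server still draws a large fraction of its peak power, so the idle floor is paid on every machine in the fleet. A minimal model makes this concrete; the idle fraction, peak wattage, and resulting percentage are illustrative assumptions, not the study's inputs (which used more conservative assumptions to arrive at the 60 percent figure):

```python
# Illustrative model of how server utilization affects energy per unit of work.
# Parameters here are hypothetical, not taken from the NRDC/WSP study.

def power_per_server(utilization, idle_fraction=0.6, peak_watts=400.0):
    """Server draw rises roughly linearly from an idle floor to peak power."""
    return peak_watts * (idle_fraction + (1.0 - idle_fraction) * utilization)

def fleet_power(total_work, utilization):
    """Servers needed scale inversely with utilization, and each one
    still pays the idle-power floor even when mostly doing nothing."""
    servers = total_work / utilization  # work units one fully-used server handles = 1
    return servers * power_per_server(utilization)

low_util = fleet_power(total_work=100.0, utilization=0.05)
high_util = fleet_power(total_work=100.0, utilization=0.70)
reduction = 1.0 - high_util / low_util

print(f"Energy reduction from 5% -> 70% utilization: {reduction:.0%}")
```

The same amount of useful work gets done in both cases; the savings come entirely from retiring (or sleeping) the machines that were mostly idle.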
The Path Forward
Here are two initiatives that could go a long way toward making the nation’s data centers much more energy- and carbon-efficient:
- All data center operators should report their fleet-average operational Power Usage Effectiveness (PUE, a measure of facility energy efficiency) and Carbon Usage Effectiveness (CUE, a measure of facility carbon efficiency); and
- The industry should define a standard IT Asset Utilization metric that enables disclosure of this key data center efficiency measure and drives behavior to optimize it.
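For readers unfamiliar with the two reporting metrics named above, here is a sketch using The Green Grid's standard definitions: PUE is total facility energy divided by IT equipment energy, and CUE is total facility CO2 emissions divided by IT equipment energy. The sample inputs are hypothetical:

```python
# The two facility metrics proposed for reporting, per The Green Grid's
# standard definitions. Sample inputs below are hypothetical.

def pue(total_facility_kwh, it_equipment_kwh):
    """Power Usage Effectiveness: total facility energy / IT equipment energy.
    1.0 is the theoretical ideal, where every watt goes to computing."""
    return total_facility_kwh / it_equipment_kwh

def cue(total_facility_co2_kg, it_equipment_kwh):
    """Carbon Usage Effectiveness: facility CO2 emissions / IT equipment energy."""
    return total_facility_co2_kg / it_equipment_kwh

# A facility drawing 1.8 GWh overall to deliver 1.0 GWh to IT gear,
# on a grid emitting 0.5 kg CO2 per kWh (all assumed values):
facility_kwh = 1_800_000
it_kwh = 1_000_000
grid_kg_per_kwh = 0.5

print(f"PUE: {pue(facility_kwh, it_kwh):.2f}")
print(f"CUE: {cue(facility_kwh * grid_kg_per_kwh, it_kwh):.2f} kg CO2/kWh")
```

Note that when all power comes from one source, CUE is just PUE multiplied by the grid's carbon intensity, which is why the two metrics complement each other: one captures facility overhead, the other captures how clean the supply is.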
The Climate Change Business Journal award will be presented next Wednesday (March 6) in Coronado, California.