America's Data Centers Consuming and Wasting Growing Amounts of Energy

Critical action is needed to save money and cut pollution.

Data centers are the backbone of the modern economy -- from the server rooms that power small- to medium-sized organizations to the enterprise data centers that support American corporations and the server farms that run cloud computing services hosted by Amazon, Facebook, Google, and others. However, the explosion of digital content, big data, e-commerce, and Internet traffic is also making data centers one of the fastest-growing consumers of electricity in developed countries, and one of the key drivers in the construction of new power plants.

In 2013, U.S. data centers consumed an estimated 91 billion kilowatt-hours of electricity, equivalent to the annual output of 34 large (500-megawatt) coal-fired power plants. Data center electricity consumption is projected to increase to roughly 140 billion kilowatt-hours annually by 2020, the equivalent annual output of 50 power plants, costing American businesses $13 billion annually in electricity bills and emitting nearly 100 million metric tons of carbon pollution per year.
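
As a rough sanity check on these plant-equivalence and emissions figures, the sketch below recomputes them under two assumptions that are not stated in the source: a 500-megawatt coal plant running at roughly a 60 percent capacity factor, and an average emission factor of about 0.7 kilograms of CO2 per kilowatt-hour. Under those assumptions, the 91 and 140 billion kilowatt-hour totals work out to roughly 35 and 53 plant-equivalents and just under 100 million metric tons of carbon pollution for 2020, broadly consistent with the figures cited above.

```python
# Back-of-envelope check of the plant-equivalence and emissions figures above.
# ASSUMPTIONS (not from the source): a "large" plant is 500 MW running at a
# ~60% capacity factor, and the emission factor is ~0.7 kg CO2 per kWh.

HOURS_PER_YEAR = 8760
PLANT_CAPACITY_MW = 500
CAPACITY_FACTOR = 0.60      # assumed capacity factor
CO2_KG_PER_KWH = 0.7        # assumed emission factor

def plant_equivalents(annual_kwh: float) -> float:
    """Number of 500 MW plants needed to generate annual_kwh in one year."""
    kwh_per_plant = PLANT_CAPACITY_MW * 1000 * HOURS_PER_YEAR * CAPACITY_FACTOR
    return annual_kwh / kwh_per_plant

def carbon_megatonnes(annual_kwh: float) -> float:
    """Emissions in million metric tons of CO2 for annual_kwh of generation."""
    return annual_kwh * CO2_KG_PER_KWH / 1e9   # 1 Mt = 1e9 kg

for label, kwh in [("2013", 91e9), ("2020 (projected)", 140e9)]:
    print(f"{label}: ~{plant_equivalents(kwh):.0f} plant-equivalents, "
          f"~{carbon_megatonnes(kwh):.0f} million metric tons CO2")
```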

While most media and public attention focuses on the largest data centers that power so-called cloud computing operations -- companies that provide web-based and Internet services to consumers and businesses -- these hyper-scale cloud computing data centers represent only a small fraction of data center energy consumption in the United States. The vast majority of data center energy is consumed in small, medium, and large corporate data centers as well as in the multi-tenant data centers to which a growing number of companies outsource their data center needs.

These data centers have generally made far less progress on energy efficiency than their hyper-scale cloud counterparts, held back by persistent market barriers such as a lack of metrics and transparency and misaligned incentives.
