Grid Hogs

As data centers consume an ever larger share of available electricity, critical operations might be at risk.

Unless you work in the information technology department, you probably haven't been paying much attention to your agency's data centers: those huge, superchilled rooms filled with racks of servers, cables and mysterious boxes with blinking lights. If you're a senior manager, you might want to start paying attention now.

Last month, the Environmental Protection Agency reported to Congress that the growing demand for computer resources over the last five years has doubled the energy consumption of servers and the power and cooling infrastructure that supports them. That has increased business costs and greenhouse gas emissions and strained the power grid. In 2006, data centers used more electricity than all the nation's color televisions, EPA found.

"Over the past 12 months, I have talked to literally hundreds of customers, and I have found no one who says this isn't an issue," says Dick Sullivan, director of enterprise solutions marketing for EMC Corp. in McLean, Va., which helps organizations, including a number of federal agencies and Congress, protect and manage data.

"I think it is of varying degrees of severity, but as the size of the data center grows, the severity of the issue grows as well," Sullivan says. The issue is especially critical for global financial institutions, data centers in major metropolitan areas where the demands on the electrical grid are greatest, and national security agencies, which have seen an explosion in the amount of data they are expected to manage, Sullivan says.

In 2006, the nation's data centers consumed about 61 billion kilowatt-hours of electricity, or 1.5 percent of total U.S. electricity consumption, at a cost of about $4.5 billion. Federal data centers accounted for about 10 percent of that consumption. Under current trends, national energy consumption by servers and data centers could nearly double by 2011 to more than 100 billion kilowatt-hours, representing $7.4 billion in annual electricity costs, according to EPA's Aug. 2 report, which draws on research by the Energy Department's Lawrence Berkeley National Laboratory.

"The peak load on the power grid from these servers and data centers is currently estimated to be approximately 7 gigawatts, equivalent to the output of about 15 baseload power plants. If current trends continue, this demand would rise to 12 GW by 2011, which would require an additional 10 power plants," EPA reported.

There are a number of reasons for the growth in power consumption, chief among them the rising demand for electronic data in both the public and private sectors. In the private sector, increased reliance on electronic transactions for banking and trading, the growth of Internet communication and entertainment, the shift to electronic medical records, the expansion of global commerce, and the adoption of satellite navigation and electronic shipment tracking all have increased the demand for data.

Federal agencies also have added to the data load by publishing more government information on the Internet, retaining more digital records, expanding disaster recovery requirements, and providing additional online services, such as electronic tax filing and shipment tracking through the U.S. Postal Service. High-performance scientific computing and the data demands of information security and national security also have contributed to the growth of government data centers.

Among EPA's recommendations: "The federal government and industry should work together to develop an objective, credible energy performance rating system for data centers."

Performance metrics and efficiency targets are needed, says John Sawyer, director of mission critical facility services for the Controls Group of Johnson Controls Inc. in Milwaukee, which sells products and services to optimize energy use in buildings. The EPA report will draw shareholders' attention to the issue, which will help prod industry into finding solutions, he says.

One of the challenges organizations face, both in the public and private sectors, is that data center managers typically aren't the ones paying the utility bills, so they may not even be aware of the rising costs. Facilities managers, who are aware of the costs, often have little input into the IT budget or operations. But increasingly, IT managers and facilities managers are realizing that they have common interests, Sawyer says.

"Capital planning, maintenance, operational efficiencies all relate to the total cost of ownership of a given facility. It can go up so high that the overall cost is very near to affecting the overall budget. When that happens, something has to take a back seat," he says.

That point was illustrated last month by Kenneth G. Brill of the Uptime Institute, an education and consulting services firm for facilities and information technology organizations in Santa Fe, N.M. In a white paper titled "The Invisible Crisis in the Data Center: The Economic Meltdown of Moore's Law," Brill examines the 1965 prediction by Intel Corp. co-founder Gordon Moore that manufacturers would double the number of transistors on a piece of silicon every 18 months, and its ramifications for data centers.

"A largely unseen cost crisis is looming within data centers, fueled by a growing divergence between rapid server computing performance gains and a far slower improvement in energy efficiency," Brill concluded.

"This trend is largely invisible until an economic crisis occurs that attracts the attention of C suite executives," usually when a data center runs out of power or cooling capacity and a cash infusion is needed, the report found. The paper cites the case of a company that put $22 million into blade servers without including the site facility representative in the decision. Had the company done so, it would have understood that the new servers would require a $54 million upgrade to the facility. When all the costs were factored in, the true cost of the blade servers over three years was $97 million.

Short of building new data centers, organizations can do a lot to improve efficiency in existing centers and lower their electric bills, EMC's Sullivan says. "Typical server utilization is around 10 percent to 15 percent. For some entities, that's even high. How many [chief financial officers] would gladly have assets on the floor being used with only a 10 percent to 15 percent utilization rate? None." So how have IT departments gotten away with this level of inefficiency? Part of the problem, Sullivan says, is that data center managers are understandably risk averse and loath to experiment with reducing capacity when they don't know what the outcome will be.

EPA recommends that government data centers become models of efficiency and commit to publicly reporting on energy performance once standardized metrics are available. In addition, government should conduct efficiency assessments in all data centers within two to three years and implement all cost-effective operational improvements.

It's a recommendation Sullivan supports: "The government ought to be setting an example. I would hazard a guess that there aren't many government agencies, including in the EPA or Energy Department, that would offer up their own data centers as shining examples."