U.S. servers use 1.2% of U.S. electricity - more than TVs

Posted on Thursday, February 15, 2007 @ 20:36 CET by Thomas De Maesschalck
AMD sponsored a study by Jonathan Koomey, a staff scientist at Lawrence Berkeley National Laboratory, to find out how much electricity U.S. computer servers consume every year.

The study found that U.S. servers on their own accounted for 0.6 percent of the country's overall electricity consumption, and once you add in the cooling equipment that figure doubles to 1.2 percent of all electricity used in the U.S. That's roughly the same amount of electricity all the TVs in the country use.

Between 2000 and 2005, server electricity use grew at a rate of 14 percent per year, meaning it roughly doubled over those five years. For 2005, the study estimates that servers and their associated equipment drew an average of 5 million kW (5 GW) of power, adding up to an electricity bill of roughly $2.7 billion for US businesses.

Koomey notes that this is equivalent to the output of five 1 GW power plants. Or, to put it another way, it's 25 percent more than the total possible output of the Chernobyl plant, back when it was actually churning out power rather than sitting there irradiating its surroundings.
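
For the curious, here's a quick back-of-the-envelope check of those figures in Python. The electricity price is our own assumption (roughly $0.06 per kWh for commercial customers in 2005), not a number from the study, so treat the cost result as a sanity check rather than an exact reproduction.

```python
# Back-of-the-envelope check of the figures quoted above (not taken from the study itself).
# Assumption: an illustrative average 2005 commercial electricity price of ~$0.06/kWh.

AVG_POWER_GW = 5.0       # "5 million kW" = 5 GW of continuous draw
HOURS_PER_YEAR = 8_760
PRICE_PER_KWH = 0.06     # assumed $/kWh, not a figure from the study

# 14 percent annual growth compounded over five years is roughly a doubling
growth_factor = 1.14 ** 5
print(f"Growth 2000-2005 at 14%/yr: x{growth_factor:.2f}")  # ~1.93

# Energy and cost implied by 5 GW of continuous draw over a full year
energy_kwh = AVG_POWER_GW * 1e6 * HOURS_PER_YEAR             # kW * h = kWh
cost_billion = energy_kwh * PRICE_PER_KWH / 1e9
print(f"Energy: {energy_kwh / 1e9:.1f} billion kWh, cost: ${cost_billion:.1f} billion")  # ~43.8 billion kWh, ~$2.6 billion

# Chernobyl comparison: the plant ran four reactors of roughly 1 GW each
chernobyl_gw = 4.0
print(f"5 GW is {100 * (AVG_POWER_GW / chernobyl_gw - 1):.0f}% more than Chernobyl's 4 GW")  # 25%
```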

If current trends continue, server electricity usage will jump 40 percent by 2010, driven in part by the rise of cheap blade servers, which are pushing up overall power use faster than larger machines. Koomey notes that virtualization and server consolidation will work against this trend, though, and it's difficult to predict what will happen as data centers increasingly standardize on power-efficient chips.
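
To put that 40 percent figure in perspective, here's a rough projection sketch. The assumption that total U.S. electricity demand stays flat through 2010 is purely an illustrative simplification, not something the study claims.

```python
# Rough projection of the 40 percent growth scenario mentioned above.
# Assumption: total U.S. electricity demand stays flat through 2010 (illustrative only).

servers_2005_gw = 5.0   # average draw in 2005, per the study
share_2005 = 1.2        # percent of total U.S. electricity in 2005
growth = 1.40           # projected 2005-2010 growth if current trends continue

servers_2010_gw = servers_2005_gw * growth
share_2010 = share_2005 * growth
implied_annual_rate = growth ** (1 / 5) - 1

print(f"2010 average draw: ~{servers_2010_gw:.0f} GW")                 # ~7 GW
print(f"2010 share of a flat total: ~{share_2010:.1f}%")               # ~1.7%
print(f"Implied annual growth: ~{implied_annual_rate * 100:.0f}%/yr")  # ~7%/yr
```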




