Intel: Datacenters can save millions in cooling costs
Posted on Monday, September 22 2008 @ 0:15 CEST by Thomas De Maesschalck

The experiment ran for 10 months, from October 2007 to August 2008. Server units with more than 900 blades, used for production design work, were split into two compartments. One compartment was cooled with outside air at temperatures ranging from 64.4 to 89.6 degrees Fahrenheit (18 to 32 degrees Celsius). The other compartment was cooled with conventional air conditioning and served as a control.

The chipmaker used a standard air filter that removed larger particles from the air but not fine dust. Although the 32 servers and racks became coated in dust, and humidity was monitored but not controlled, the failure rate was 4.46 percent, compared with a 3.83 percent failure rate in Intel's main data center over the same period.

Intel estimated an annual cost reduction of approximately $143,000 for a small, 500-kilowatt data center, based on electricity costs of 8 cents per kilowatt-hour. For a larger 10-megawatt data center, the estimated annual cost reduction was $2.87 million.

"Servers...were subjected to considerable variation in temperature and humidity, as well as poor air quality; however, there was no significant increase in server failures," the paper said. "If subsequent investigation confirms these promising results, we anticipate using this approach in future, high-density data centers."

Source: CNET
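The two savings estimates are internally consistent: dividing each figure by the corresponding annual IT electricity bill (capacity × 8,760 hours × $0.08/kWh) gives roughly 41 percent in both cases. A minimal sketch of that back-of-the-envelope check, assuming the back-calculated 41 percent fraction (an illustration derived from the article's numbers, not a value Intel published):

```python
HOURS_PER_YEAR = 8760          # hours in a non-leap year
RATE_USD_PER_KWH = 0.08        # electricity price used in the article

def annual_cooling_savings(it_load_kw, savings_fraction=0.41):
    """Estimate annual savings from economizer-style cooling.

    savings_fraction (~41%) is back-calculated from the article's
    $143,000 / 500 kW and $2.87M / 10 MW figures; it is an assumption
    for illustration, not a number stated in Intel's paper.
    """
    annual_it_bill = it_load_kw * HOURS_PER_YEAR * RATE_USD_PER_KWH
    return annual_it_bill * savings_fraction

print(f"${annual_cooling_savings(500):,.0f}")     # close to the $143,000 figure
print(f"${annual_cooling_savings(10_000):,.0f}")  # close to the $2.87M figure
```

Plugging in 500 kW reproduces the small-data-center estimate to within about half a percent, and 10 MW reproduces the larger one, which suggests Intel applied a single savings ratio across facility sizes.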