Intel conducted a ten-month experiment in the New Mexico desert to investigate ways of reducing data center cooling costs. The chip giant ran a split test: half of the servers were housed in a compartment with air conditioning and stable humidity, while the other compartment was cooled simply by piping in outside air. Contrary to popular belief, Intel found that dust, large temperature swings, and humidity variation did not significantly affect the equipment:
"Servers...were subjected to considerable variation in temperature and humidity, as well as poor air quality; however, there was no significant increase in server failures," the paper said. "If subsequent investigation confirms these promising results, we anticipate using this approach in future, high-density data centers."
Intel estimated an annual cost reduction of approximately $143,000 for a small, 500-kilowatt data center, based on electricity costs of 8 cents per kilowatt-hour. In a larger 10-megawatt data center, the estimated annual cost reduction was $2.87 million.
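The two estimates scale linearly with capacity, which is easy to check from the figures in the article. A minimal sketch, assuming savings are simply proportional to data center size (the per-kilowatt rate and the annual power-bill context line are derived values, not figures from the article):

```python
# Sketch: how Intel's two savings figures relate, using only the
# numbers reported in the article ($143,000 for 500 kW; 8 cents/kWh).
SAVINGS_500KW = 143_000                 # annual savings, 500 kW facility
savings_per_kw = SAVINGS_500KW / 500    # ≈ $286 per kW per year (derived)
savings_10mw = savings_per_kw * 10_000  # scale to a 10 MW facility

# Context (assumption: 500 kW refers to the IT load): the annual
# electricity bill for that load at 8 cents/kWh.
annual_power_cost = 500 * 8760 * 0.08   # ≈ $350,400

print(f"10 MW savings: ${savings_10mw:,.0f}")  # $2,860,000, matching ~$2.87M
```

The linear scale-up lands within rounding of the article's $2.87 million figure for the 10-megawatt case.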
The chipmaker used a standard air filter that removed larger particles from the air, but not fine dust. Although the 32 servers and racks became coated in dust, and humidity was monitored but not controlled, the failure rate was 4.46 percent, compared with a 3.83 percent failure rate in Intel's main data center over the same period.
The experiment ran for 10 months, from October 2007 to August 2008. More than 900 blade servers, used for production design work, were split between two compartments. One compartment was cooled with outside air, with temperatures ranging from 64.4 to 89.6 degrees Fahrenheit (18 to 32 degrees Celsius); the other was cooled with conventional air conditioning and served as a control.
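The paper's claim of "no significant increase in server failures" can be illustrated with a standard two-proportion z-test on the reported rates. A hedged sketch, assuming the 900-plus blades were split roughly evenly at about 450 per compartment (the article does not give exact counts per side):

```python
# Sketch: two-proportion z-test on the reported failure rates.
# Assumption: ~450 servers per compartment; the article only says
# "more than 900 blades" split into two compartments.
import math

def two_proportion_z(p1, n1, p2, n2):
    """Z-statistic for the difference between two sample proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 4.46% in the outside-air compartment vs 3.83% in the main data center
z = two_proportion_z(0.0446, 450, 0.0383, 450)
print(f"z = {z:.2f}")  # z = 0.47, well below the 1.96 cutoff at the 5% level
```

With samples of this size, a 0.63-point difference in failure rates is well within normal variation, consistent with the paper's conclusion.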