June 8, 2013

It's not nice to fool with Mother Nature - Physics is a bitch

Heh -- from The Register comes this story of feel-good alt.energy going horribly wrong:
Facebook's first data center DRENCHED by ACTUAL CLOUD
Facebook's first data center ran into problems of a distinctly ironic nature when a literal cloud formed in the IT room and started to rain on servers.

Though Facebook has previously hinted at this via references to a "humidity event" within its first data center in Prineville, Oregon, the social network's infrastructure king Jay Parikh told The Reg on Thursday that, for a few minutes in the summer of 2011, Facebook's data center contained two clouds: one powered the social network, the other poured water on it.

"I got a call, 'Jay, there's a cloud in the data center'," Parikh says. "'What do you mean, outside?'. 'No, inside'."

There was panic.

"It was raining in the datacenter," he explains.

The problem occurred because of the ambitious chiller-less air conditioning system the data center used. Unlike traditional facilities, which use electricity-intensive, direct-expansion cooling units to maintain a low, steady temperature, consumer internet giants such as Google, Facebook, and others have all been on a tear building facilities that use outside air instead.

In Prineville's first summer of operation, a problem in the facility's building-management system led to high-temperature, low-humidity air from the hot aisles being endlessly recirculated through a water-based evaporative cooling system that sought to cool the air down – which meant that when the air came back into the cold aisle for the servers it was so wet it condensed.
Condensation is a killer for electronics -- circuit boards and hard disk drives especially. Drives are sealed, but there is usually a port with a filter on it to let them equalize pressure for shipping. Not good to let a drive get warm in a humid environment and then cool down... My last post at MSFT was in a large lab (over 1,000 computers) with special HVAC systems, but these were a one-off design and failed from time to time (a certain 'S' company); when they did, the temperature in the lab would climb from shirtsleeve to around 100°F in short order. I can see the political cachet of using environmental methods to cool a datacenter, but I would be sure to have a full conventional HVAC system as backup.
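To put some rough numbers on how little margin there is, here is a minimal dew-point sketch using the standard Magnus approximation -- the supply temperature and humidity below are made-up illustrative values, not anything Facebook has published:

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Approximate dew point in °C via the Magnus formula."""
    a, b = 17.62, 243.12  # Magnus coefficients, valid roughly -45..60 °C
    gamma = math.log(rel_humidity_pct / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)

# Made-up numbers for illustration: cold-aisle supply air after the
# evaporative system has been fed hot-aisle air for a while.
supply_temp_c = 26.0   # hypothetical supply temperature
supply_rh_pct = 95.0   # hypothetical relative humidity
dp = dew_point_c(supply_temp_c, supply_rh_pct)
print(f"Dew point: {dp:.1f} °C")
# -> Dew point: 25.1 °C
```

With near-saturated supply air, the dew point sits barely a degree below the air temperature, so any chassis, heatsink, or power supply even slightly cooler than that starts collecting water.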
A bit more:
Some servers broke entirely because they had front-facing power supplies and these shorted out. For a few minutes, Parikh says, you could stand in Facebook's data center and hear the pop and fizzle of Facebook's ultra-lean servers obeying the ultra-uncompromising laws of physics.

Facebook learned from the mistakes, and now designs its servers with a seal around their power supply, or as Parikh calls it, "a rubber raincoat."
A patch, not a fix. Rotsa Ruck...
Posted by DaveH at June 8, 2013 11:14 PM