Facebook, it seems, has turned ‘quantum leaping’ into a way of life. Having changed the web forever with its open API (which made possible the now-ubiquitous little blue “Like” button), the social networking behemoth wants to do the same for the “green IT” sector.
For more than a year, a team of three very smart engineers worked to reinvent the server farm for the 21st century. The result is a new data center in Prineville, Oregon, that is revolutionary at every scale of its development – from site selection and construction (which dispenses with costly refrigerated air conditioning and plumbing) all the way down to the design of the server racks, the power supply and distribution, and the circuit boards.
This innovative approach has set a new bar for power usage effectiveness – a 1.07 rating compared to the industry average of 1.50 (with 1.00 representing 100% efficiency, meaning every watt goes to computing). During CEO Mark Zuckerberg’s introduction yesterday at the soon-to-be-relocated Facebook headquarters in Menlo Park, he mentioned the environmental benefits of reducing energy consumption, but made a point that those benefits come as a result of one simple mission: “to provide the most efficient compute at the best cost.”
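For readers who want to see what those ratings mean in practice, here is a minimal sketch of the standard PUE arithmetic (the kilowatt figures are illustrative assumptions, not Facebook's actual numbers): PUE is total facility power divided by the power delivered to IT equipment, so a lower number means less overhead.

```python
# Illustrative sketch of the Power Usage Effectiveness (PUE) metric.
# The kilowatt figures below are made-up examples, not Prineville's data.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE: total facility power divided by IT equipment power."""
    return total_facility_kw / it_equipment_kw

def overhead_fraction(pue_value: float) -> float:
    """Fraction of total facility power spent on non-IT overhead."""
    return 1.0 - 1.0 / pue_value

# A facility drawing 1,070 kW to power 1,000 kW of servers has a PUE of 1.07.
print(round(pue(1070, 1000), 2))             # 1.07
print(round(overhead_fraction(1.07) * 100))  # ~7% overhead
print(round(overhead_fraction(1.50) * 100))  # ~33% overhead at the industry average
```

In other words, at the industry-average PUE of 1.50, roughly a third of every watt a facility draws is spent on cooling and conversion losses rather than computing; at 1.07, that overhead shrinks to about 7%.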
It was a “light green” message designed to walk the line between environmental advocates like Greenpeace (who have been berating the internet giant for its heavy reliance on fossil fuel power) and the shareholders who so often believe that environmental benefits come at a cost to their bottom line. Zuckerberg made it clear that Open Compute offers the IT industry a win-win strategy. Facebook’s new data center is 38% more efficient while saving the company 24% on total costs – an achievement nicely captured by the bright green slogan that greeted attendees: “Efficiency is Profitable.”
To greatly simplify Open Compute, there are three main scales of engineering that resulted in Facebook’s efficiency gain – the server building, the server housings, and the server circuits.
The building’s desert location was carefully selected to take advantage of cool, dry nights, which allow for mostly passive cooling – no ducts, no refrigerated air, no internal plumbing. In the summer, misters add humidity and evaporative cooling, while fans pull the chilled air through a series of washable mesh filters. Engineering firm Alfatech led the innovative building design, which was matched by an equally inventive power distribution system, dreamt up by lead engineer Jay Park, that eliminated a huge amount of energy waste by routing power through a single transformer stage. Typically, data centers lose 11-17% of their energy between the power grid and the uninterruptible power supply; the Prineville data center limits that transmission loss to 2%.
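The intuition behind that single-transformer design can be sketched with simple arithmetic: end-to-end efficiency is the product of each conversion stage's efficiency, so every stage removed is loss avoided. The per-stage figures below are hypothetical illustrations chosen to land in the 11-17% and 2% ranges the article cites, not Prineville's actual measurements.

```python
# Hypothetical illustration of why fewer power-conversion stages waste less
# energy. Stage efficiencies are assumed examples, not Facebook's figures.

from functools import reduce

def delivered_fraction(stage_efficiencies: list[float]) -> float:
    """Multiply per-stage efficiencies to get end-to-end delivered power."""
    return reduce(lambda acc, eff: acc * eff, stage_efficiencies, 1.0)

# A conventional chain: utility transformer -> centralized UPS -> distribution
conventional = delivered_fraction([0.97, 0.92, 0.97])  # ~87% delivered
# A streamlined chain with a single transformer stage
streamlined = delivered_fraction([0.98])               # ~98% delivered

print(f"conventional loss: {(1 - conventional) * 100:.0f}%")  # ~13%
print(f"streamlined loss:  {(1 - streamlined) * 100:.0f}%")   # 2%
```

Because the losses multiply, even three individually respectable stages compound into double-digit waste, which is why collapsing the chain pays off so dramatically.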
Similar ingenuity went into the tall, thin server chassis design, which uses 22% less material and allows for leaner heat sinks, bigger fans, and custom power supplies with 94.5% efficiency (also unheard of in the industry). Each rack of servers has a backup battery column and hundreds of sensors that constantly report on the performance of the servers.
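To put that 94.5% figure in perspective, here is a back-of-envelope sketch of the heat a power supply wastes at a given conversion efficiency. The 300 W server load and the 90% comparison efficiency are assumptions for illustration, not figures from the article.

```python
# Back-of-envelope sketch (assumed load and comparison figures): power a
# server's supply wastes as heat at a given conversion efficiency.

def watts_wasted(load_watts: float, efficiency: float) -> float:
    """Input power drawn from the wall minus useful output power."""
    input_watts = load_watts / efficiency
    return input_watts - load_watts

# For a hypothetical 300 W server load:
print(round(watts_wasted(300, 0.945), 1))  # ~17.5 W at Open Compute's 94.5%
print(round(watts_wasted(300, 0.90), 1))   # ~33.3 W at an assumed typical 90%
```

Across tens of thousands of servers, roughly halving the per-supply waste also halves the heat the cooling system must remove, so the savings compound.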
It wouldn’t have been a Facebook presentation, however, without a big open source announcement – all the specs, schematics, and CAD drawings for the entire data center will be made publicly available so these efficiencies can be replicated around the world. The panel of top IT brass – Dell, HP, Intel, Rackspace, Zynga, and the U.S. Dept. of Energy – all agreed that open sourcing these energy efficiency innovations would help to reduce costs and energy consumption, particularly in rapidly developing countries where massive inefficiencies are common.
While some in the audience questioned whether a spec written so precisely to deliver on Facebook’s specialized computing needs would translate to other applications, most of the panel seemed to be along for the ride, pledging to build on the new Open Compute standard, which Rackspace Chairman Graham Weston called “the biggest cost reduction in running servers in a decade.”
You can watch a local Oregon news piece about the Facebook data center on KTVZ.
The opinions expressed by MNN Bloggers and those providing comments are theirs alone, and do not reflect the opinions of MNN.com. While we have reviewed their content to make sure it complies with our Terms and Conditions, MNN is not responsible for the accuracy of any of their information.