How Facebook Upgraded the Outside Air Cooling System at Their Prineville Data Center

by Drew Robb | Apr 3, 2024 | Blog

Facebook is very much on the front lines of data center innovation. About a decade ago, it designed the Prineville data center in Oregon from the ground up to achieve the highest levels of efficiency.

Prineville boasts a Power Usage Effectiveness (PUE) of 1.15, far better than the industry average of around 1.6. That means that for every 100 watts delivered to the computing equipment, only 15 additional watts go to cooling, lighting, UPS and power distribution. Some months, the PUE has gone as low as 1.06.
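As a rough sketch of what those numbers mean, PUE is simply total facility power divided by IT power. The values below are illustrative, not Facebook's actual metering:

```python
# Power Usage Effectiveness: total facility power divided by IT equipment power.
# A PUE of 1.0 would mean every watt goes to computing; lower is better.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

it_load_kw = 100.0   # power delivered to computing equipment
overhead_kw = 15.0   # cooling, lighting, UPS, power distribution
print(pue(it_load_kw + overhead_kw, it_load_kw))  # → 1.15
```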

Instead of the traditional approach of using energy-hogging computer room air handling (CRAH) units, Facebook uses outside air to minimize energy costs. The facility sits in Oregon's high desert, where summer temperatures typically range from 80 to 90°F, and Facebook decided to use one system to cool and humidify the entire building as well as the IT equipment racks.

The upper floor of the facility is devoted to pulling in outside air, filtering it, cooling it and humidifying it before sending it down to the lower floor where the servers are located. No ductwork is needed. Instead, the air handling system uses a wall of high-efficiency 5 hp variable-speed fans to create positive air pressure in a 14-foot plenum above the cold aisles of the data center. This minimizes the work required at the server to pull air past the components and keeps energy consumption low. The upper air handling deck has a series of louvers that control how much air is pulled into the building and how much hot air from the servers is exhausted outside or recirculated back through the servers.
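The louver arrangement works like an air-side economizer: on cold days, recirculated server exhaust is blended with outside air so the supply stream lands near the setpoint. The toy calculation below is not Facebook's actual control logic, and the temperatures are illustrative assumptions:

```python
def outside_air_fraction(t_outside: float, t_return: float, t_supply: float) -> float:
    """Fraction of outside air to blend with recirculated server exhaust so the
    mixed stream hits the supply setpoint (temperatures in °F; ignores humidity
    and fan heat)."""
    if t_outside >= t_supply:
        return 1.0  # outside air alone is at or above setpoint; fog handles the rest
    frac = (t_return - t_supply) / (t_return - t_outside)
    return max(0.0, min(1.0, frac))

# A cold winter day: 40°F outside, 95°F server exhaust, 80.5°F supply target.
print(round(outside_air_fraction(40.0, 95.0, 80.5), 2))  # → 0.26
```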

Cooling System Specifics at Prineville

The cooling and humidification of the dry desert air is done using a fogging system by Mee Industries of Irwindale, California that serves the entire 147,000 sq. ft. data center. As a result, Facebook Prineville can operate year-round without mechanical cooling, even when summer temperatures reach as high as 110°F.

Facebook favored a custom design for the MeeFog system. It consists of 28 fogging units and more than 6,600 nozzles that deliver cooling and humidification in several stages. The goal was one easy-to-maintain system that cooled everything, rather than a general building cooling system combined with CRAH and rack-level cooling. The entire facility is brought to the desired temperature while eliminating fans in the racks and servers – Facebook engineers seek such designs to take as much maintenance labor out of the equation as possible.

The data center operates at 80.5°F in accordance with American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) standards. That temperature is maintained even when summer heat soars above 100°F.

The original MeeFog system design consisted of 56 x 7.5 hp positive displacement fog pump units from CAT Pumps, each with a variable frequency drive. Each pump provided 7.62 gpm of fog flow at a pressure of 1,000 psi. Two pumps serve each air handling unit (AHU), one active and one on standby, and there are 28 AHUs in the data center. The pumps send water through stainless steel tubing to an array of specially designed impaction-pin nozzles, which convert the water into a micro-fine fog that rapidly evaporates. As the fog evaporates in the air stream, it brings the air down to the desired temperature.
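The pump spec alone lets you estimate each AHU's evaporative cooling capacity. This back-of-the-envelope sketch assumes complete evaporation and uses a standard latent-heat value for water; neither figure comes from the article:

```python
# Rough cooling capacity of one AHU's active fog pump.
GAL_TO_L = 3.785           # liters per US gallon
H_FG_KJ_PER_KG = 2450.0    # latent heat of vaporization of water near room temp

flow_gpm = 7.62                            # per-pump flow from the article
flow_kg_s = flow_gpm * GAL_TO_L / 60.0     # ~0.48 kg/s (1 L of water ≈ 1 kg)
cooling_kw = flow_kg_s * H_FG_KJ_PER_KG    # heat absorbed as the fog evaporates
print(round(cooling_kw))  # → 1178
```

Roughly 1.2 MW of evaporative cooling per AHU at full flow, which suggests why no mechanical chillers are needed.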

Humidity Control

As well as managing temperature, fogging also provides the desired level of humidity. Prineville is very dry, with rainfall averaging 10 inches per year. Low humidity poses a real risk to sensitive IT equipment through static discharge. Adding moisture to the air via fog prevents static electricity from building up on the floor and on equipment. But you can’t add too much humidity; otherwise, condensation may form, posing a further risk.

“With too much humidity, condensation builds up, but with too little humidity, static electricity occurs,” said Emily Vernon, Senior Content Manager at Metrikus. “Both scenarios can cause costly damage to the systems, so the data center humidity range should always stay between 40% – 60% relative humidity.”

Uptime Institute data indicates that the suitability of a data center environment is primarily judged by its effect on the long-term health of IT hardware. Without proper temperature and humidity control, Prineville would be far from the ideal location.

“Facility operators define their temperature and humidity set points with a view to balancing hardware failure rates against the associated capital and operational expenditures, with the former historically prioritized,” said Daniel Bizo, Research Director, Uptime Institute Intelligence.

Accordingly, Facebook designed its system with a view to maintaining humidity levels in accordance with industry standards while keeping capital, energy and maintenance costs very low. Ambient relative humidity is maintained in the 45% to 55% range through fogging.
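The simplest way to hold such a band is a deadband (hysteresis) controller. The sketch below is purely illustrative (the article does not describe the actual MeeFog staging logic):

```python
# Toy hysteresis controller for a 45-55% relative humidity band.
def fog_command(rh_percent: float, fogging_on: bool) -> bool:
    """Return True if fogging should run, given the current RH reading."""
    LOW, HIGH = 45.0, 55.0
    if rh_percent < LOW:
        return True        # too dry: risk of static discharge, add fog
    if rh_percent > HIGH:
        return False       # too humid: risk of condensation, stop fogging
    return fogging_on      # inside the band: hold the current state
```

Holding state inside the band keeps the pumps from rapidly cycling on and off around a single setpoint.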

Recent Upgrades

Never satisfied with efficiency levels and always seeking ways to improve them, Facebook decided to further upgrade its cooling system. The original fog cooling and humidification system mainly used oil-lubricated pumps.

“As Facebook wanted to further reduce system maintenance, we changed out the original oil-lubricated pumps and replaced them with water-lubricated units from Danfoss Pumps,” said Elliot Sloane, humidification manager at Mee Industries. “There is very little maintenance now required on the pumps as we have eliminated the need for regular oil changes.”

Sloane noted that further modifications were needed to accommodate the new pumps.

“The Danfoss units cannot turn down as well as the original pumps, yet the wide range of stages Facebook operates means they sometimes need to be turned down to under 10% of total capacity,” he said.

Because the new pumps cannot slow down as far as the previous models, they sometimes deliver more water than required, which can heat up the pumps.
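The effect of limited turndown can be seen with illustrative numbers. The 30% minimum-flow figure below is a hypothetical assumption, not a Danfoss specification:

```python
# Why limited turndown causes excess flow at low load (illustrative numbers).
rated_gpm = 7.62       # per-pump capacity from the article
min_turndown = 0.30    # assumed lowest fraction the pump can run at
demand_gpm = 0.60      # a deep low-load stage, under 10% of capacity

delivered_gpm = max(demand_gpm, rated_gpm * min_turndown)
excess_gpm = delivered_gpm - demand_gpm  # surplus flow that ends up heating the pump
print(round(excess_gpm, 2))  # → 1.69
```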

“We completed further modifications at Facebook Prineville such as adding heat exchanger assemblies to all of the pump skids to cool down the pump during those low load conditions,” said Sloane. “At Prineville and many other sites, high-pressure fog-based cooling and humidification are cost effective in terms of energy usage.”


Drew Robb

Writing and Editing Consultant and Contractor

Drew Robb has been a full-time professional writer and editor for more than twenty years. He currently works freelance for a number of IT publications, including eSecurity Planet and CIO Insight. He is also the editor-in-chief of an international engineering magazine.


