Airflow Management Considerations for a New Data Center – Part 5: Server Corrosion versus Moisture Levels

by Ian Seaton | Jun 28, 2017 | Blog

[This continues from Airflow Management Considerations for a New Data Center – Part 4: Climate Data vs. Server Inlet Temperature]

This is the fifth part of my seven-part series on Airflow Management Considerations. My earlier articles deal with various ways to benefit from being able to control airflow volume and temperature – the activity of airflow management, and the key to exploiting both efficiency and effectiveness opportunities in the data center. Of course, I’m making the assumption that your data center is already fully compliant with ASHRAE best practices.

Airflow management considerations inform the degree to which we can take advantage of excellent airflow management practices to drive down the operating cost of our data center. In the first installment of this seven-part series, I explored the question of server power versus server inlet temperature, presenting a methodology for assessing the trade-off between mechanical plant energy savings and increased server fan energy at higher temperatures. I suggested that for most applications, a data center could be allowed to encroach into much higher temperature ranges than many industry practitioners might expect before server fan energy penalties reverse the savings trend. In the second piece, I presented data from two well-conceived and well-executed experimental research projects suggesting that data centers can run hotter than conventional practice dictates without adversely affecting server throughput. In the third piece, I suggested price premiums for servers that operate effectively at higher temperatures may not be that significant and appear to be trending toward equilibrium. In the previous piece, I considered climate data in light of the first three topics and suggested that chiller-free data centers are much more realistic than conventional wisdom might purport. Today we will look at the role of humidity management in the overall strategy of exploiting excellent airflow management practices.

The threat of corrosion on server components and printed circuit boards was well on its way to being dismissed: ASHRAE had raised allowable relative humidity thresholds to 85% and 90% for different server classes and set 75˚F as a maximum dew point, while OEMs almost universally expanded their humidity envelopes. Then all this enthusiasm crashed into regulatory obstacles requiring that lead-based solders be replaced by silver-based solders for PCB component attachment. Silver, in the presence of high humidity, is reactive when exposed to gaseous contaminants such as hydrogen sulfide, chlorine, hydrogen chloride, and sulfur dioxide. Because of this risk, the folks at ASHRAE TC9.9 back-pedaled significantly from promoting allowable limits and stressed the advantages of living within the recommended humidity envelope, i.e., 60% maximum RH.1 At this level, the hazards associated with humidity shrink even further as air traveling through the server picks up heat and its relative humidity drops accordingly.
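To make that last point concrete, here is a minimal psychrometric sketch of how relative humidity falls when air is heated at constant moisture content on its way through a server. It assumes the Magnus approximation for saturation vapor pressure and sensible heating only; the 75˚F/60% inlet and 95˚F outlet figures are illustrative, not drawn from any particular server.

```python
import math

def saturation_vapor_pressure_kpa(temp_c: float) -> float:
    """Approximate saturation vapor pressure (kPa) via the Magnus formula."""
    return 0.6112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def rh_after_sensible_heating(inlet_c: float, inlet_rh: float, outlet_c: float) -> float:
    """RH at the server outlet, assuming the vapor pressure (moisture content)
    is unchanged as the air is heated through the chassis."""
    vapor_pressure = inlet_rh / 100.0 * saturation_vapor_pressure_kpa(inlet_c)
    return 100.0 * vapor_pressure / saturation_vapor_pressure_kpa(outlet_c)

# Illustrative example: 75F (23.9C) inlet at 60% RH, heated to 95F (35C).
print(f"{rh_after_sensible_heating(23.9, 60.0, 35.0):.0f}% RH")  # roughly 32% RH
```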

The risks of corrosion inside server equipment from the interaction of high humidity and these gaseous contaminants are obviously going to be more prevalent in data centers in industrial areas with air pollution problems, and data centers taking advantage of everything we have discussed regarding free cooling will be especially vulnerable. I recall sitting through meetings where the corrosion metrics were being debated: one presenter claimed relevant contaminant levels had only been measured in Beijing and Delhi, while another asserted there had been random occurrences recorded in isolated U.S. geographies with much higher than average coal-burning activity. While that debate remains unresolved, for data centers employing free air cooling in locations where extreme gaseous contaminants are suspected, ANSI/ISA-71.04-1985 has been offered as a monitoring protocol, specifying less than 300 Angstroms per month reactivity for both copper and silver coupons. Ironically, since the soft underbelly here is silver-based solder on server PCBs, the consensus appears to be that the copper coupons provide the more reliable screen for a healthy data center environment.
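For those wiring coupon readings into a monitoring system, a minimal sketch of mapping copper reactivity to a severity call might look like the following. The G1 through GX bands are the commonly published ISA-71.04 copper reactivity levels, quoted here from memory of the standard; verify the thresholds against your edition before relying on them.

```python
def isa_7104_severity(copper_reactivity_angstroms_per_month: float) -> str:
    """Classify copper coupon reactivity into ISA-71.04 severity levels.

    Bands as commonly published: G1 (mild) < 300 A/month, G2 (moderate)
    300-999, G3 (harsh) 1000-1999, GX (severe) >= 2000. Verify against
    the current edition of the standard."""
    reactivity = copper_reactivity_angstroms_per_month
    if reactivity < 300:
        return "G1 (mild): generally considered safe for ICT equipment"
    if reactivity < 1000:
        return "G2 (moderate)"
    if reactivity < 2000:
        return "G3 (harsh)"
    return "GX (severe)"

print(isa_7104_severity(240))  # G1 (mild): generally considered safe for ICT equipment
```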

Interestingly enough, while the normally aggressive ASHRAE TC9.9 community has reined in its enthusiasm for a wider humidity envelope for servers, the manufacturers themselves seem quite comfortable with the wider envelope. I did a random sampling of servers previously analyzed in earlier installments on performance and acquisition costs at higher inlet temperatures and checked their user documentation for humidity specifications, shown below in Table 1. That community has clearly embraced the higher thresholds and, for the most part, has adopted wide overall relative humidity ranges.

Table 1: Sampling of Server Humidity Specifications

| Server Model | Humidity Range | Dew Point Min | Dew Point Max | Comments/Restrictions |
|---|---|---|---|---|
| Dell | 10% – 80% | | | 10% excursions to 5% – 90% |
| HPE DL360 | 8% – 90% | 10.4˚F | 77˚F | |
| HPE DL460 | 10% – 90% | | | |
| HPE DL20 | 8% – 90% | | | |
| HPE DL60 | 8% – 90% | 10.4˚F | 75.2˚F | Higher of min RH or DP and lower of max RH/DP |
| HPE ML30 | 10% – 90% | | | |
| HPE ML110 | 8% – 90% | | | |
| HPE ML350 | 8% – 90% | | | |
| HPE Apollo 4500 | 10% – 90% | | 82.4˚F | |
| IBM S822LC | 8% – 80% | | 75˚F | |
| IBM x3650 M4 | 20% – 80% | | 69.8˚F | 115 to 135 watt processors |
| IBM x3650 M4 | 8% – 85% | | 75.2˚F | 60 to 95 watt processors |
| IBM 8335-GTB | 8% – 80% | 10.4˚F | 75˚F | Recommended ranges = ASHRAE recommended |
| IBM 8408-44E | 8% – 80% | 10.4˚F | 75˚F | Recommended ranges = ASHRAE recommended |
| IBM S812LC | 20% – 80% | | 70˚F | |
| IBM x3550 M4 | 20% – 80% | | 69.8˚F | 115 to 135 watt processors |
| IBM x3550 M4 | 8% – 85% | | 75.2˚F | 60 to 95 watt processors |

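As an illustration of how a row of Table 1 might be applied in practice, here is a minimal sketch that checks a measured inlet condition against a published envelope. The HumidityEnvelope helper is a hypothetical convenience, not a vendor API; the example values are the HPE DL60 row from Table 1, and requiring both the RH and dew point tests to pass reflects the "higher of min / lower of max" restriction noted for that row.

```python
from dataclasses import dataclass

@dataclass
class HumidityEnvelope:
    """One row of Table 1 (hypothetical helper, not a vendor API)."""
    min_rh: float           # %
    max_rh: float           # %
    min_dew_point_f: float  # degrees F
    max_dew_point_f: float  # degrees F

def within_envelope(rh: float, dew_point_f: float, env: HumidityEnvelope) -> bool:
    """True when the measured inlet condition satisfies both the RH range and
    the dew point range, i.e., the more restrictive limit governs."""
    return (env.min_rh <= rh <= env.max_rh
            and env.min_dew_point_f <= dew_point_f <= env.max_dew_point_f)

# HPE DL60 row from Table 1: 8-90% RH, 10.4F to 75.2F dew point.
dl60 = HumidityEnvelope(8.0, 90.0, 10.4, 75.2)
print(within_envelope(55.0, 60.0, dl60))  # True
print(within_envelope(92.0, 60.0, dl60))  # False: RH above the envelope
```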
Storage solutions, on the other hand, have a reputation for being somewhat more sensitive to environmental conditions than servers. Most notable of these differences is the temperature rate-of-change specification, with a 4:1 difference between tape storage and solid state storage. As for humidity and contamination, the literature cites hard disk drive susceptibility to long exposure to higher humidity2 and the general susceptibility of spinning media to contaminants, exacerbated by high humidity. Tape storage is generally cited as requiring a more stable environment, including an ASHRAE-recommended maximum 5% per hour rate of change for relative humidity3 (a simple monitoring sketch for that limit follows Table 2). When we look at the manufacturers’ specifications, shown below in Table 2, they do not appear to make much distinction between the different media types, though the relative humidity envelope is narrower than it is for servers and the maximum dew point is generally below the server threshold. Nevertheless, with the exception of archival tape storage, the minimum-to-maximum relative humidity range for storage equipment is wide enough to leave a lot of flexibility for free cooling strategies.

Table 2: Sampling of Storage Solutions Humidity Specifications

| Storage Product | Type | Humidity Min–Max | Max Dew Point | Additional Information |
|---|---|---|---|---|
| Dell-EMC Isilon NL | HDD | 5% – 95% | | |
| Dell-EMC Unity Flash | SSD | 20% – 80% | 69.8˚F | 10% excursion to 8% – 85% & 75˚F max dew point |
| Dell-EMC VNXe3200 | both | 20% – 80% | 69.8˚F | 10% excursion to 8% – 85% & 75˚F max dew point |
| NetApp FAS2554 | both | 20% – 80% | | |
| NetApp FAS2220 | both | 20% – 80% | | |
| NetApp FAS6200 | HDD | 20% – 80% | | |
| IBM TS4500 Library | Tape | 20% – 80% | 63˚F | |
| IBM tape cartridges | Tape | 20% – 80% | | Operational |
| IBM tape cartridges | Tape | 20% – 50% | | Archival storage |
| HP 3PAR S-Class | SSD | 20% – 80% | | |

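Here is the monitoring sketch promised above: a minimal check of relative humidity samples against the ASHRAE-recommended 5% per hour rate-of-change limit for tape storage. The pairwise one-hour window and the sample readings are illustrative assumptions, not a measurement procedure drawn from the standard.

```python
from datetime import datetime, timedelta

MAX_RH_CHANGE_PER_HOUR = 5.0  # % RH, per the tape guideline cited above

def rh_rate_ok(samples: list[tuple[datetime, float]]) -> bool:
    """Flag any pair of RH samples taken within an hour of each other
    whose difference exceeds the 5% per hour guideline."""
    for i, (t1, rh1) in enumerate(samples):
        for t2, rh2 in samples[i + 1:]:
            if t2 - t1 <= timedelta(hours=1) and abs(rh2 - rh1) > MAX_RH_CHANGE_PER_HOUR:
                return False
    return True

# Illustrative readings taken every 15 minutes.
start = datetime(2017, 6, 28, 8, 0)
readings = [(start + timedelta(minutes=15 * i), rh)
            for i, rh in enumerate([45.0, 46.5, 48.0, 49.0, 52.5])]
print(rh_rate_ok(readings))  # False: 45.0 -> 52.5 within one hour exceeds 5%
```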
The open debate regarding the seriousness of the threat of corrosive damage resulting from exposure to gaseous contaminants at elevated humidity levels, particularly with the RoHS solder requirements, should see progress toward resolution sometime in the next eighteen months or so via a research project conducted at Syracuse University under a grant contract from ASHRAE. The objective of this study is to answer this exact question, and the project deliverables will include:

  1. Detailed literature review to understand the importance of field variables
  2. Detailed description of experimental methods used and justification of design of experiments
  3. Results showing the effects of moisture (dew point) combined with varying levels of gaseous pollutants – such as SO2, NO2, H2S, O3, and Cl2 – and voltage bias on RoHS-compliant electronic equipment degradation in data centers
  4. New guidelines for operating data centers at the higher moisture levels, including recommendations for the rate-of-change of moisture within the data center
  5. Technical paper(s) summarizing the results of the research.4

As a brief aside, data center tribal knowledge has long held that humidity represents the first line of defense against ESD damage to ICT equipment. A similar ASHRAE-sponsored study conducted at the Missouri University of Science and Technology found that good grounding practices for both equipment and personnel represent the only viable protection against ESD. The conclusion of the resulting 168-page report summarizes these findings:

The low increase in the ESD risk with reduced humidity indicates that a data center with a low incident rate of ESD-induced damage operating at 25% RH will maintain a low incident rate if the humidity is reduced to 8%. … The concerns raised prior to the study regarding the increase in ESD-induced risk with reduced humidity are not justified. A standard set of ESD-mitigation procedures will ensure a very low ESD incident rate at all humidity levels.5

The humidity envelope for ICT equipment is actually broader in most equipment manufacturers’ specifications than it is in the various standards organizations’ guidelines. The low end is driven by mechanical protection for storage equipment, not by ESD protection for our servers. The upper end is defined by a general suspicion of corrosive damage resulting from outside contaminants being activated by elevated humidity levels, particularly over extended periods. While we await the major research study on those upper thresholds, just remember that in free cooling situations, our electronics can be a much more cost-effective heater, and a legal one in those states and municipalities that have banned reheat as part of their energy savings and global warming initiatives. Consider the psychrometric chart: if we have a nice 55˚F day at 90% RH and we are free cooling with outside air, mixing that incoming ambient air with a healthy amount of return air from the data center to raise the combined air stream to 65˚F leaves the moisture content unchanged, and the warmer mix reads roughly 63% RH. Finally, to use our IT load as a cost-effective heater, we want to maximize the ΔT between supply air and return air, which means instituting all the best practices of airflow management to keep those air streams segregated until they are subject to controlled mixing.
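A minimal sketch of that mixed-air arithmetic, reusing the Magnus approximation from the earlier sketch and assuming no latent load in the room, so the return air carries the same moisture content as the supply air:

```python
import math

def svp_kpa(temp_c: float) -> float:
    """Saturation vapor pressure (kPa), Magnus approximation."""
    return 0.6112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def f_to_c(temp_f: float) -> float:
    return (temp_f - 32.0) * 5.0 / 9.0

# 55F outside air at 90% RH, warmed sensibly to 65F by mixing with return air.
vapor_pressure = 0.90 * svp_kpa(f_to_c(55.0))
rh_at_65f = 100.0 * vapor_pressure / svp_kpa(f_to_c(65.0))
print(f"{rh_at_65f:.0f}% RH")  # roughly 63% RH
```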

Continues in Airflow Management Considerations for a New Data Center – Part 6: Server Reliability versus Inlet Temperature

1 “Creating a Perfect Storm,” Donald Beaty and David Quirk, ASHRAE Journal, December 2014.
2 Data Center Storage Equipment – Thermal Guidelines, Issues, and Best Practices, whitepaper prepared by ASHRAE Technical Committee (TC) 9.9, Mission Critical Facilities, Data Centers, Technology Spaces, and Electronic Equipment, 2015, pages 38-39.
3 Thermal Guidelines for Data Processing Environments, 4th Edition, ASHRAE Technical Committee (TC) 9.9, Mission Critical Facilities, Data Centers, Technology Spaces, and Electronic Equipment, 2015, page 15.
4 “Impact of Gaseous Contamination and High Humidity on the Reliable Operation of Information Technology Equipment in Data Centers” (1755-TRP), ASHRAE, 2016, page 1.
5 The Effect of Humidity on Static Electricity Induced Reliability Issues of ICT Equipment in Data Centers, David Pommerenke and David Swenson, with contributions from Atieh Talebzadeh, Xu Gao, Fayu Wan, Abhishek Patnaik, Mahdi Moradianpouchehrazi, and Yunan Han, ASHRAE Research Project 1499-TRP, October 2014, page 3.
