What’s driving higher rack densities in the data center?
The concept of rack density has changed markedly in recent years. Less than a decade ago, the AFCOM Data Center Institute (DCI) released a whitepaper on data center size and density that classified high-density racks as 9 kW to 15 kW and extreme-density racks as anything above 15 kW. Hyperscalers, meanwhile, are praised for their efficient, rack-dense architectures, yet the racks inside their most modern data centers are generally less than 30 kW.
“Today, high density racks are roughly 40 kW to 125 kW and extreme density racks are up to 200 kW and even beyond,” said Mike Andrea, CEO of Oper8 Global and DCI board member.
This shift of an order of magnitude or more is largely driven by the democratization of high-performance computing (HPC) applications, which are now used across a much broader range of industries. Those in research and development, aerospace, seismic engineering, 3D modelling, autonomous vehicle simulation, Artificial Intelligence (AI), energy, oil and gas production, weather forecasting, analytics, healthcare, and 3D film rendering are among the big users of the technology.
“While HPC used to be the province of businesses and research entities with a billion-dollar turnover, much smaller businesses are now using it for competitive advantage,” said Andrea. “Latency remains a primary driver in HPC along with the ability of data centers to support rack densities in excess of 100 kW per rack.”
His company is working with several customers that are demanding 80 kW to 200 kW per rack. However, he said not to expect entire data centers to be loaded up with 100 racks each averaging 100 kW. That would result in 10 MW HPC data centers, which would keep them in the realm of hyperscalers and cloud providers. A more likely scenario is for an edge or colocation data center to have one extreme density HPC pod of two to twelve racks, or a cluster of several HPC racks alongside more modest density racks.
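The scale distinction Andrea draws can be checked with quick arithmetic. The figures below simply restate the article's numbers; the 150 kW per-pod-rack value is an illustrative assumption within his 80 kW to 200 kW range.

```python
# Back-of-the-envelope check of the article's figures (illustrative only).

RACKS = 100
KW_PER_RACK = 100  # average high-density rack load from the article

total_mw = RACKS * KW_PER_RACK / 1000
print(f"{RACKS} racks x {KW_PER_RACK} kW = {total_mw:.0f} MW")  # 10 MW

# A more typical colocation scenario from the article: one extreme-density
# pod of 2 to 12 racks at an assumed 150 kW each.
pod_kw = [racks * 150 for racks in (2, 12)]
print(f"Pod range: {pod_kw[0]} kW to {pod_kw[1]} kW")
```

Even the top of the pod range stays under 2 MW, which is why extreme density fits in edge and colocation facilities without hyperscale-class power feeds.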
Expect more data centers to introduce highly dense racks in the near future. Hyperion Research noted that HPC growth is surging on demand for the applications that require it.
“AI, machine learning, and deep learning areas are growing at close to 30% a year,” said Earl Joseph, Hyperion’s CEO.
It is one thing to want data centers to provide dense racks and quite another to make it happen. A number of technologies must integrate smoothly to facilitate HPC. These include more powerful chips such as GPUs, advanced cooling technologies, and applications designed to take advantage of all that power such as the latest ones aimed at AI. Further, there is a need for cabling improvements, computational fluid dynamics (CFD), and the ability to scale data centers via modular components and designs.
Advanced CFD and HPC are complementary. Use of CFD has largely been restricted to organizations with enough compute power to produce accurate results, so dense racks and HPC deployments make accurate CFD analyses more attainable. They are also needed: hot racks can cause failures in PDUs due to the volume of rear-rack hot air, especially when racks exceed 35 kW.
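To see why 35 kW is a stress point for air cooling, the sensible-heat relation gives a rough airflow requirement. This is a sketch, not a CFD model, and the 15 °C inlet-to-outlet rise is an assumed, commonly cited value rather than a figure from the article:

```python
# Rough sizing of the airflow a dense rack demands (sensible-heat relation):
#   volumetric flow = power / (rho * cp * delta_T)

RHO_AIR = 1.2    # kg/m^3, air density at typical room conditions
CP_AIR = 1005.0  # J/(kg*K), specific heat of air

def required_airflow_m3s(power_watts: float, delta_t_c: float) -> float:
    """Airflow (m^3/s) needed to carry `power_watts` of heat away
    with a `delta_t_c` temperature rise across the rack."""
    return power_watts / (RHO_AIR * CP_AIR * delta_t_c)

# The article's 35 kW threshold with an assumed 15 C rise:
flow = required_airflow_m3s(35_000, 15)
print(f"{flow:.2f} m^3/s (~{flow * 2118.88:.0f} CFM)")
```

That works out to roughly 2 m³/s of air through a single rack, which is hard to move and contain without the rear-rack hot-air problems described above.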
No wonder alternative cooling technologies such as liquid cooling and immersion cooling are coming more into play to address hot spots and hot air issues.
“Where high-power consumption elements are packed into limited space, liquid cooling may become a necessity,” said Onur Celebioglu, Senior Director of Engineering, HPC, and Emerging Workloads at Dell Technologies. “This area is increasingly important in designing new data centers or retrofitting existing data centers with new systems.”
Vendors are now providing a large range of water-based cooling options. Dell, for example, offers actively cooled rack doors, direct liquid cooling with cold plates, immersion cooling, and combinations of these technologies. Japanese telecom provider KDDI, too, has developed compact, mobile, container-like immersion cooling data centers that are claimed to bring about a 43% reduction in power consumption and take Power Usage Effectiveness (PUE) below 1.07. Giga Computing Technology provided R282-Z93 and R182-Z91 rack servers to KDDI for management of cooling, and Nvidia supplied the GPU computing nodes for the system. The servers are immersed in a bath of dielectric, nonconductive coolant.
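PUE is the ratio of total facility power to IT equipment power, so a PUE below 1.07 means cooling and power distribution together add under 7% overhead. The numbers below are hypothetical and chosen only to show what the KDDI claim implies relative to a conventionally air-cooled facility:

```python
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power.
# Hypothetical 500 kW IT load, for illustration only.

def pue(total_facility_kw: float, it_kw: float) -> float:
    return total_facility_kw / it_kw

it_load = 500.0

# Immersion cooling at the claimed PUE of 1.07: only 35 kW of overhead.
immersion = pue(it_load * 1.07, it_load)

# An assumed air-cooled facility at PUE 1.6: 300 kW of overhead instead.
air_cooled = pue(it_load * 1.6, it_load)

print(immersion, air_cooled)
```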
“New generations of components force major change in server design to dissipate the heat from HPC and reduce the carbon footprint,” said Sean Chen, marketing manager at Giga Computing Technology. “Thus, advanced cooling methods are needed.”
Be aware, though, that liquid-cooled HPC applications impact cabinet size. That is why we are beginning to see cabinets that are 750 mm or 800 mm wide to accommodate extra power feeds and fluid manifolds, and cabinets that are 1200 mm deep to provide the space needed for immersion technologies. These cabinets may also come with centralized power supplies delivering 48 VDC directly to servers.
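The move to wider cabinets follows directly from Ohm's law: at 48 VDC, even moderate rack loads require bus currents in the hundreds or thousands of amps, and the conductors to carry them take up space. The rack loads below are illustrative values drawn from the density ranges quoted earlier, not specific products:

```python
# Why 48 VDC distribution drives wider cabinets: a quick current estimate.
# I = P / V, with illustrative rack loads.

def bus_current_amps(rack_kw: float, voltage: float = 48.0) -> float:
    """DC current a centralized supply must deliver at `voltage` volts."""
    return rack_kw * 1000 / voltage

for kw in (40, 100, 200):
    print(f"{kw} kW rack at 48 VDC -> {bus_current_amps(kw):.0f} A")
```

A 200 kW rack at 48 VDC implies over 4,000 A of distribution current, hence the extra width for busbars alongside the fluid manifolds.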
“There will continue to be an insatiable demand for more processing power to solve even more complicated problems that we cannot solve today,” said Celebioglu. “As AI techniques became commonplace in HPC, AI algorithms are often used to augment modeling and simulation workloads.”