Keep the Data Center Cool – 2017 and Beyond

by Bill Kleyman | Jan 18, 2017 | Blog

It’s not difficult to keep your data center cool. You just need to be informed.

A lot has changed in data center deployments over the past few years. We saw a boom in data center buildouts, and now we’re seeing new initiatives around consolidation and convergence. In fact, many organizations are looking for ways to reduce their data center footprints and are using converged technologies as a direct driver.

Looking at the industry – let’s examine some key trends around cloud, convergence, cooling, and power utilization.

  • Converged Infrastructure: Industry trends show that the pace of converged systems adoption will only continue to grow. According to a recent Gartner report, hyper-converged integrated systems will represent over 35% of total integrated system market revenue by 2019. This makes it one of the fastest-growing and most valuable technology segments in the industry today.
  • Cloud Growth: IDC pointed out that worldwide spending on public cloud services will grow at a 19.4% compound annual growth rate (CAGR) — almost six times the rate of overall IT spending growth – from nearly $70 billion in 2015 to more than $141 billion in 2019.
  • More Data in the Data Center: The latest Cisco Cloud Index report projects that global traffic crossing the Internet and IP WAN networks will reach 2.0 ZB per year by 2019. Data center traffic dwarfs that figure: annual global data center traffic was an estimated 3.5 ZB in 2014 and is projected to triple to 10.4 ZB per year by 2019.
  • Big Data and Analytics: According to Gartner, organizations typically have multiple goals for big data initiatives, such as enhancing the customer experience, streamlining existing processes, achieving more targeted marketing, and reducing costs. As in previous years, organizations overwhelmingly target enhanced customer experience as the primary goal of big data projects (64%); process efficiency and more targeted marketing are now tied at 47%. A recent GE Capital study outlined how demands on ever-more-complex data center assets continue to evolve as mobile broadband expands, big data analytics becomes more prominent, social media displaces basic email as the holy grail for marketers, and cloud services become more entrenched in both consumer and corporate America.
  • Data Centers Are Power Hungry and Growing: A 2015 NRDC report projects that data center electricity consumption will increase to roughly 140 billion kilowatt-hours annually by 2020 – the equivalent annual output of 50 power plants, costing U.S. businesses $13 billion a year in electricity bills. Furthermore, the latest AFCOM State of the Data Center report showed that 70% of respondents have seen power density (per rack) increase over the past three years, and 26% called the increase significant.
  • Organizations Are Focusing on Efficiency and Cost: In recent Green Grid research into European data center usage, energy efficiency and operating costs were the most commonly cited areas of the data center requiring improvement.
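The figures in these bullets are easy to sanity-check. The sketch below reproduces the IDC and NRDC numbers quoted above; the derived values (the 2019 spend and the implied electricity rate) are my own back-of-the-envelope results, not numbers from the reports:

```python
# Back-of-the-envelope checks on the figures cited above.

# IDC: public cloud spending growing at a 19.4% CAGR,
# from nearly $70B (2015) to more than $141B (2019).
spend_2015 = 70e9          # dollars
cagr = 0.194
years = 4                  # 2015 -> 2019
spend_2019 = spend_2015 * (1 + cagr) ** years
print(f"Projected 2019 cloud spend: ${spend_2019 / 1e9:.0f}B")  # ~$142B, consistent with "more than $141B"

# NRDC: ~140 billion kWh/year by 2020, costing ~$13B annually.
kwh_per_year = 140e9
annual_cost = 13e9
print(f"Implied electricity rate: ${annual_cost / kwh_per_year:.3f}/kWh")  # ~$0.093/kWh
```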

So – how is your data center handling all of this? What are you doing to improve data center operational efficiency? New concepts around cooling, efficiency services, and even power management are all changing the way we deploy, build, and update our data center environments. Being tasked with creating greater density while reducing overall footprint does not have to be a nightmare. In fact, new systems specifically aim to reduce data center complexity while supporting more diverse workloads – big data, cloud, virtualization, and more. With all of that in mind, let’s look at some “cool” new ways to help your data center breathe a bit easier.

  • Look at alternate methods of cooling – like liquid. Simply defined, liquid cooling transfers heat from a solid (the electrical component) to a fluid within the server; the cooling system then draws that liquid across the internal components to carry the heat away. New solutions like those from Ebullient absorb heat by vaporizing Novec 7000 (a dielectric, non-toxic, engineered fluid from 3M) within sealed modules mounted directly to the hottest server devices. The cool part, no pun intended, is that this applies to whole servers as well as individual components. For example, if you’re doing big data processing with an NVIDIA Tesla K80 graphics accelerator card, you can cool the card itself with liquid cooling technology. The point is that new cooling options are shaping next-generation data center deployments. Stay agile and ready to take on new business initiatives.
  • Conduct an efficiency analysis. Data center environmental variables are never set in stone – that’s why they’re variables. Cooling, heat, airflow, power management, rack containment, and even capacity management are all fluid, constantly changing operations. With that in mind, how efficient is your data center? Do you have good DCIM tools that help monitor efficiency? Are there blind spots? Sometimes a physical walkthrough is the best way to understand airflow in your data center. Bottom line: with more convergence and cloud impacting the data center, constant efficiency analysis will be critical. Data is important, but so is physical analysis.
  • Cooling as a science, and a service. Picking up from the previous point, many successful data center operators treat data center environment management as a science. With so many new variables impacting power, cooling, heat, and space, understanding the dynamics of the data center has become more important than ever. So some organizations now approach cooling and airflow as a science – or as a service. Cooling as a consumption model makes sense because it lets you see where your cooling and airflow requirements are greatest. To sustain this type of dynamic environment, however, you must continuously evaluate and analyze your ever-evolving requirements.
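To see why the two-phase liquid cooling described above scales so well, consider the heat balance at a single device. The sketch below assumes a latent heat of vaporization of roughly 142 kJ/kg for Novec 7000 – an approximate, illustrative value, not a datasheet quote – and uses the Tesla K80’s 300 W rated TDP:

```python
# Rough two-phase cooling sketch: heat is absorbed as latent heat when the
# working fluid (e.g. Novec 7000) vaporizes on the hot component.
# h_fg is an approximate, illustrative value, not a datasheet figure.

h_fg = 142e3           # J/kg, assumed latent heat of vaporization for Novec 7000
component_power = 300  # W, e.g. an NVIDIA Tesla K80's rated TDP

# Steady state: heat in = mass flow * latent heat  =>  m_dot = Q / h_fg
m_dot = component_power / h_fg  # kg/s
print(f"Required coolant flow: {m_dot * 1000:.1f} g/s")  # ~2.1 g/s per 300 W device
```

A few grams per second of fluid handles a full accelerator card, which is why these sealed modules can sit directly on the hottest devices.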
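For the efficiency analysis above, the usual starting metric is the Green Grid’s Power Usage Effectiveness (PUE): total facility power divided by IT equipment power, with 1.0 as the ideal. A minimal sketch using invented sample readings:

```python
# Power Usage Effectiveness (PUE), the Green Grid's standard efficiency metric.
# The readings below are invented sample values, not real measurements.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power (1.0 is the ideal)."""
    return total_facility_kw / it_equipment_kw

it_load = 500.0             # kW drawn by servers, storage, and network gear
cooling_and_losses = 350.0  # kW for cooling, UPS losses, lighting, etc.
print(f"PUE: {pue(it_load + cooling_and_losses, it_load):.2f}")  # 850/500 = 1.70
```

Tracking PUE over time – ideally from DCIM data rather than one-off readings – is what turns the walkthroughs and spot checks above into a continuous efficiency practice.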

There won’t be a slowdown in the digital revolution we’re experiencing. In fact, many organizations are bringing their analog technologies into the digital world. The automotive, healthcare, and industrial sectors are all working with IoT and new technologies to better deliver their products and services. Remember, at the heart of it all will be your data center – and the dominant model of the data center is a hybrid, distributed architecture. In 2017 and beyond, know how to better control these distributed systems and how to keep them running efficiently.

Bill Kleyman

CTO, MTM Technologies
