
Data Center Cooling – A Couple of Degrees Can Save Thousands

August 2, 2017

Too often we get caught up in the big picture of managing a data center. How do we cool entire racks? What can we do with all those aisles? How do we ensure proper cooling from a raised floor perspective?

Granted, all of these things are important. What’s also important is to constantly analyze ASHRAE requirements, how they’re evolving, and where they apply to your data center. There are a lot of opinions out there about the ideal temperature for server rooms and data centers because there are so many variables to consider, such as server mix, cooling arrangement, and load conditions.

New technologies, like solid-state drives, have different operating temperatures than their spinning-disk counterparts. The latest metrics show that regularly operating in the recommended 64.4°F to 80.6°F range, based on ANSI/ASHRAE Standard 90.4-2016, will keep your data center nice and healthy. But what about efficiency? What about operating modular containment within the data center itself?
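For reference, that recommended envelope is 18°C to 27°C, and it applies to equipment inlet air rather than room ambient, which is why per-rack measurement matters. As a quick, minimal sketch (the rack names and readings below are hypothetical, not output from any real monitoring tool), here is how you might flag inlet temperatures that fall outside it in Python:

    # Minimal sketch: flag rack inlet temperatures outside the ASHRAE
    # recommended envelope of 64.4-80.6 F (18-27 C).
    # Rack names and readings are hypothetical examples.

    RECOMMENDED_F = (64.4, 80.6)

    def f_to_c(temp_f):
        """Convert Fahrenheit to Celsius."""
        return (temp_f - 32) * 5.0 / 9.0

    def check_inlet(temp_f):
        low, high = RECOMMENDED_F
        if temp_f < low:
            return "below range (likely overcooling)"
        if temp_f > high:
            return "above range (thermal risk)"
        return "within recommended range"

    readings = {"rack-a1": 62.0, "rack-b4": 75.2, "rack-c7": 82.5}
    for rack, temp_f in readings.items():
        print(f"{rack}: {temp_f:.1f}F ({f_to_c(temp_f):.1f}C) -> {check_inlet(temp_f)}")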

Furthermore, keeping your data center at a lower-than-necessary temperature may not help with performance. You may simply be keeping the temperature low and paying for it, with no real added value.

So, with all of this in mind, it’s important to ask – where are you using your energy today? A report from EYP Mission Critical Facilities stated the following:

  • IT Equipment – 50%
  • Cooling – 25%
  • Electricity Transformer/UPS – 10%
  • Air Movement – 12%
  • Lighting – 3%

This means that 37% of energy is consumed by moving and cooling air! And those costs add up.
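To put that 37% in rough dollar terms, here is a back-of-the-envelope sketch in Python (the facility load and utility rate are made-up illustrative figures, not numbers from the EYP report):

    # Back-of-the-envelope estimate of annual spend on moving and cooling air.
    # The 25% cooling and 12% air movement shares come from the EYP breakdown
    # above; the facility load and utility rate are hypothetical examples.

    COOLING_SHARE = 0.25
    AIR_MOVEMENT_SHARE = 0.12

    facility_load_kw = 500        # hypothetical total facility draw, kW
    rate_per_kwh = 0.10           # hypothetical utility rate, $/kWh
    hours_per_year = 24 * 365

    annual_kwh = facility_load_kw * hours_per_year
    air_related_cost = annual_kwh * (COOLING_SHARE + AIR_MOVEMENT_SHARE) * rate_per_kwh

    print(f"Annual energy: {annual_kwh:,.0f} kWh")
    print(f"Spent moving and cooling air (37%): ${air_related_cost:,.0f}")

At those assumed numbers, a 500 kW facility spends roughly $162,000 a year just moving and cooling air, which is why even single-digit efficiency gains are worth chasing.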

Cooling continues to be a key priority for data center operators. A recent Cooling Profile Survey from Intel and HPE found that:

  • More than 45% of the respondents who answered the question of whether cooling is a priority were running at least 7 kW of equipment per rack
  • 80% of respondents agreed or strongly agreed that reducing cooling costs is one of their highest priorities
  • 51% of respondents used air containment as part of their data center cooling strategy
  • 55% used CRAC or CRAH units

To that end, there are some great ways to save on both cooling and energy costs. Consider the following:

  • Reduce air mixing via hot/cold air separation throughout the data center
  • Optimize the floor layout using computational fluid dynamics (CFD)
  • Closely couple supply and return air paths to the IT load
  • Provide higher voltage to racks and reduce step-downs
  • Use air-side economizers and cooling towers
  • Deploy virtualization
  • Implement power-save modes
  • Upgrade aging technology
  • Decommission unused servers
  • Bill back power costs to drive behavior

Beyond that, it’s also important to look at the temperature at which your data center is operating. Is it static across the board? Or do you have different temperature points for various use cases within the data center? The point is: know your set temperature and make sure it’s truly optimized.

IT departments often overcool data centers to ensure that their mission-critical equipment will not fail due to overheating.

“Data center managers can save up to 4 percent in energy costs for every degree of upward change in the baseline temperature, known as a set point,” said David J. Cappuccio, Gartner Managing VP and Chief of Research. “The higher set point means less frequent use of air conditioning, which saves the energy used to run cooling systems.”
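Taken at face value, that rule of thumb is easy to model. Here is a minimal sketch (the 4%-per-degree figure is Gartner’s estimate above; the baseline annual cooling cost is a hypothetical example):

    # Estimate cooling-energy spend after raising the set point, using the
    # "up to 4% per degree" rule of thumb quoted above. The baseline annual
    # cooling cost is a hypothetical example.

    SAVINGS_PER_DEGREE = 0.04   # Gartner's "up to 4%" estimate

    def cooling_cost_after_raise(baseline_cost, degrees_raised):
        """Apply the per-degree savings, compounded for each degree raised."""
        return baseline_cost * (1 - SAVINGS_PER_DEGREE) ** degrees_raised

    baseline = 100_000  # hypothetical annual cooling spend, $
    for delta in range(5):
        print(f"+{delta} degrees: ${cooling_cost_after_raise(baseline, delta):,.0f}")

The sketch compounds the savings per degree, which is the more conservative reading of “up to 4 percent”; a linear reading would save slightly more. Either way, a two- or three-degree change on a six-figure cooling bill adds up quickly.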

Therefore, the key to avoiding overcooling and saving on energy costs is to determine your optimal set point. You can base the decision on equipment recommendations, but a better way is to use ASHRAE guidelines to ensure you are operating within the healthy range for temperature and humidity.

The point here is simple: find ways to manage the environmental variables within your data center more effectively, and you’ll find ways to save big on cooling and power. Raising the temperature by just a degree or two doesn’t have to sacrifice the life of your gear. Rather, evaluating your ecosystem to find the right operating set point will not only teach you more about your data center, it’ll also help you increase your efficiency.

Learn the importance of calculating your computer room’s CCF by downloading our free Cooling Capacity Factor white paper.
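As a preview of what the paper covers, CCF compares the cooling capacity you are running to the heat load it actually serves. Here is a minimal sketch, assuming Upsite’s published definition of CCF as running rated cooling capacity divided by 110% of the IT critical load (the 10% uplift approximating non-IT heat from lights, people, and the building envelope); the input values are hypothetical:

    # Minimal sketch of a Cooling Capacity Factor (CCF) calculation, assuming
    # Upsite's published definition: running rated cooling capacity divided by
    # 110% of the IT critical load. Input values are hypothetical examples.

    def cooling_capacity_factor(running_cooling_kw, it_load_kw):
        return running_cooling_kw / (it_load_kw * 1.1)

    ccf = cooling_capacity_factor(running_cooling_kw=700, it_load_kw=350)
    # A CCF far above 1 suggests cooling capacity is being wasted on air
    # mixing and bypass rather than delivered to the IT load.
    print(f"CCF: {ccf:.2f}")  # prints "CCF: 1.82"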



About the Author

Bill Kleyman, CTO, MTM Technologies. Bill is an enthusiastic technologist with experience in data center design, management, and deployment. His architecture work includes large virtualization and cloud deployments as well as business network design and implementation. Bill enjoys writing, blogging, and educating colleagues around everything that is technology. During the day, Bill is the CTO at MTM Technologies, where he interacts with enterprise organizations and helps align IT strategies with direct business goals. Bill’s white papers, articles, video blogs, and podcasts have been published on InformationWeek, NetworkComputing, TechTarget, Wall Street Journal, ZDNet, Slashdot, and many others.

