How the Edge Is Changing Data Center Design
Distributed IT is nothing new, but the concept of the edge is something we should pay attention to. Many organizations are now looking for better ways to deliver rich content to widely dispersed users, and more companies are pushing applications, desktops, and various services out to rural locations. A big challenge here revolves around performance and user experience. After all, just because an application can be delivered doesn’t mean it is being delivered efficiently.
This was the ultimate challenge when it came to cloud computing. “Organizations that have embarked on a digital business journey have realized that a more decentralized approach is required to address digital business infrastructure requirements,” says Santhosh Rao, principal research analyst at Gartner. “As the volume and velocity of data increases, so too does the inefficiency of streaming all this information to a cloud or data center for processing.”
Here’s an important point to remember – according to Gartner, currently around 10% of enterprise-generated data is created and processed outside a traditional centralized data center or cloud. By 2022, Gartner predicts this figure will reach 50%. This means edge services will continue to grow. Moreover, based on current projections, the data center will be at the heart of this distribution.
The Edge Will Consume the Cloud
There was an interesting blog from Gartner stating that the agility of cloud computing is great – but it just isn’t enough. Massive centralization, economies of scale, self-service, and full automation get us most of the way there – but they do not overcome physics: the weight of data and the speed of light.
This is where the edge comes into play. However, to meet the demands of new use-cases around virtualization, data analytics, and pushing applications close to the end-user, you are going to need processing power, storage, and functional networks. The edge will need some serious muscle.
This might be bad news for traditional cloud vendors – but it is excellent news for the data center. The decentralization of cloud has given rise to the remote data center, branch locations, and strategic points of computing located close to the end-user. Unlike large traditional data centers, these are point solutions: smaller, more agile sites designed to meet the requirements of the end-user.
However, this does not mean we are just sticking a rack of gear in a closet. If that edge site goes down, user experience degradation will quickly follow. This is why you need to take extra care when designing your edge data center and deciding what you put inside it.
- Plan around power requirements. I recently helped conduct the 2018 State of the Industry Survey. For full results, you will have to come by the keynote session, where I will be presenting at this year’s AFCOM Data Center World Global Conference. We found that many organizations are already actively deploying edge locations. Specifically, 30% stated that they have 6-10 edge locations implemented, and 57% said that they will have 21-40 potential edge locations within 36 months. When it comes to power, the average power density for edge deployments is currently 6kW-10kW per rack, as indicated by about 34% of our respondents. If you are working with edge solutions, it is imperative that you design around requirements both today and in the near future. The majority of our respondents indicated that there would be new edge deployments for their organizations. This means proper design considerations are critical.
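To make the power-planning point concrete, here is a minimal, illustrative sketch of a site power budget. The 6-10 kW/rack density comes from the survey figures above; the rack count and the 30% growth headroom are assumptions for the example, not survey data.

```python
def site_power_kw(racks, kw_per_rack, growth_factor=1.3):
    """Total design power for an edge site.

    growth_factor adds headroom for near-term growth (an assumed
    30% here, reflecting respondents planning new edge deployments).
    """
    return racks * kw_per_rack * growth_factor

# A hypothetical 4-rack edge site at the survey's upper average
# density of 10 kW per rack:
print(site_power_kw(4, 10))  # 52.0 kW of provisioned power
```

Designing to the provisioned figure rather than today’s draw is the point of the exercise: an edge site sized only for current load leaves no room for the growth most respondents expect.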
- Cooling and airflow are critical. Efficiency around edge solutions is vital, and we are well beyond traditional cooling mechanisms. A great example of this is AisleLok from Upsite. Modular containment allows you to do some pretty amazing things with specific parts of your data center. It also allows you to deploy smaller data centers based on your particular needs. These types of solutions, although perhaps designed for larger data centers, can do amazing things at the edge.
- Focus on your use-case and ensure good user experience. Not every edge is designed for the same purpose. Some use it for healthcare data processing; others are leveraging the edge for content delivery. Some are even using edge services for high-performance computing. Whatever the use-case, make sure your design follows suit. The last thing you would want is an edge platform that’s underpowered or under-cooled. The most successful edge deployments are those with defined use-cases, business requirements, and a clear understanding of how the user will interact with the data center.
- Maintain strong visibility and management. Please make sure your edge data center is not just another island of computing. Honestly, from a data center management perspective, that is the last thing you would want. DCIM solutions are powerful tools that allow you to integrate various points of your data center into a centralized management framework. I highly recommend designing the edge with proper management in mind.
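The rolled-up visibility described above can be sketched in a few lines. This is a hypothetical illustration only: the site names, metrics, and thresholds are invented, and a real deployment would pull this telemetry from a DCIM platform’s API rather than a hard-coded dictionary.

```python
# Per-site telemetry, as it might be collected from edge locations
# (all values invented for illustration).
SITES = {
    "branch-east": {"rack_kw": 8.2, "inlet_temp_c": 24.1, "reachable": True},
    "branch-west": {"rack_kw": 9.7, "inlet_temp_c": 31.5, "reachable": True},
    "clinic-01":   {"rack_kw": 5.4, "inlet_temp_c": 22.0, "reachable": False},
}

def alerts(sites, max_kw=10.0, max_temp_c=27.0):
    """Return the names of sites needing attention under assumed thresholds."""
    flagged = []
    for name, metrics in sites.items():
        if (not metrics["reachable"]
                or metrics["rack_kw"] > max_kw
                or metrics["inlet_temp_c"] > max_temp_c):
            flagged.append(name)
    return sorted(flagged)

print(alerts(SITES))  # ['branch-west', 'clinic-01']
```

The point is the single pane of glass: one check surfaces an overheating branch and an unreachable clinic site, instead of each location being monitored (or not) in isolation.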
If your organization has not looked at the edge yet – it is time to explore some go-to-market options. Innovations around things like IoT, telemedicine, and even education are pushing resources further out. It will be up to the data center to support these new types of initiatives. If you support the edge today, you will be able to cater to some pretty advanced business and user demands – which results in happier users and a more competitive market position.
CTO, MTM Technologies