How IoE Will Redefine Your Data Center
My Pebble watch can connect to my Nest thermostats and my Philips Hue lights at my house. I'm pretty sure my next big appliance purchase will be a refrigerator that can order my favorite yogurt and milk as soon as it senses that I'm running out. Usage statistics and automation processes would, of course, help my fridge make the purchasing decision at the exact moment it needed to be made.
So what does my future IoE-enabled Whirlpool refrigerator have to do with your data center? Everything.
It all starts with how we now process information, the evolution of the user, and how the modern organization has worked to keep up. Before we continue – it's important to understand that we are talking about much more than just BYOD. Examples of IoE range from connected cars to interconnected garbage trucks that use the cloud to optimize their routes.
That said – let's take a look at how all of these interconnected components are actually impacting your data center. The latest Cisco Global Cloud Index report outlines how IoE is generating large volumes of data, and how, today, only a small portion of that data ever reaches the data center. Although data center traffic is anticipated to reach 8.6 ZB in 2018, the total volume of data generated by IoE in 2018 will be more than 400 ZB, nearly 50 times the sum total of data center traffic.
- A Boeing 787 generates 40 terabytes (TB) per hour of flight, half a TB of which is ultimately transmitted to a data center for analysis and storage.
- A large retail store collects approximately 10 gigabytes (GB) of data per hour, and 1 GB of that is transmitted to a data center.
- An automated manufacturing facility generates approximately 1 TB of data per hour, and 5 GB of that is transmitted to a data center.
- A mining operation such as that of Rio Tinto can generate up to 2.4 TB per minute.
- As data center capacity improves, more of the data generated locally by the IoE may be transmitted to a data center, and the remaining data may still be available for analysis through data virtualization, which enables analysis of distributed data sources.
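The gap between generated and transmitted data in the figures above is easy to quantify. The following back-of-the-envelope sketch simply recomputes the ratios from the numbers in the bullets (it assumes nothing beyond those stated figures):

```python
# Fraction of locally generated IoE data that actually reaches the
# data center, using the per-hour figures from the bullets above.
sources = {
    # name: (generated per hour, transmitted per hour), both in GB
    "Boeing 787":    (40_000, 500),  # 40 TB generated, 0.5 TB transmitted
    "Retail store":  (10, 1),        # 10 GB generated, 1 GB transmitted
    "Manufacturing": (1_000, 5),     # 1 TB generated, 5 GB transmitted
}

for name, (generated, transmitted) in sources.items():
    pct = 100 * transmitted / generated
    print(f"{name:13s}: {pct:.2f}% of generated data reaches the data center")

# The macro numbers tell the same story: 400 ZB of IoE-generated data
# against 8.6 ZB of data center traffic.
print(f"IoE-to-data-center ratio: {400 / 8.6:.1f}x")
```

Even the chattiest source here, the retail store, sends only a tenth of what it generates – which is exactly why the Cisco report's 50x gap between generated data and data center traffic is plausible.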
Redefining your data center
At the data center layer – things are really starting to churn. Administrators are working hard to make sure they keep up with all of this new data and capacity demand. Here’s what’s changing.
Data center density. The influx of data and new devices is changing the data center as well as the resource delivery architecture. This means that storage, servers, networking and even the rack design are all changing. One big misconception is that a smaller chassis will consume fewer data center resources. The reality is that cooling, power and management all have to change to make sure new kinds of converged systems stay efficient.
The data center environment. To support new kinds of technologies – your data center will have to change. The design you implement will have to handle new kinds of cooling, power, and environmental methodologies. Rack-based cooling will have to evolve as new converged infrastructure allows for greater application and user density. Also – you may have to isolate certain racks to create cooling silos for custom or especially heavy workloads.
Creating distributed data center designs. Cloud computing has created a much more distributed data center design. New connection points allow for the vast distribution of rich content and data. This also introduces new ways of managing and controlling data center environmental variables. DCIM has been evolving at a very fast pace. New control tools can live in the data center and in the cloud. This kind of visibility becomes truly needed as you begin to control more data center resource points. Furthermore, cloud-ready DCIM tools allow you to integrate intelligent automation technologies to create true data center efficiency.
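The kind of intelligent automation a cloud-ready DCIM tool enables can be pictured as a simple telemetry rule. The sketch below is purely illustrative – the `RackReading` type and the thresholds are hypothetical, not a real DCIM API – though the 27°C inlet ceiling does correspond to ASHRAE's recommended envelope:

```python
# Illustrative sketch of a DCIM-style automation rule: poll per-rack
# telemetry and react when a threshold is crossed. RackReading and the
# power threshold are hypothetical; only the ASHRAE inlet ceiling is real.
from dataclasses import dataclass

@dataclass
class RackReading:
    rack_id: str
    inlet_temp_c: float   # cold-aisle inlet temperature
    power_kw: float       # current rack power draw

ASHRAE_RECOMMENDED_MAX_C = 27.0  # ASHRAE recommended inlet ceiling

def evaluate(reading: RackReading) -> str:
    """Return an action string for a single rack reading."""
    if reading.inlet_temp_c > ASHRAE_RECOMMENDED_MAX_C:
        return f"ALERT {reading.rack_id}: increase cooling to this rack"
    if reading.power_kw > 10.0:  # illustrative per-rack power budget
        return f"WARN {reading.rack_id}: approaching rack power budget"
    return f"OK {reading.rack_id}"

readings = [
    RackReading("rack-01", 24.5, 6.2),
    RackReading("rack-02", 28.1, 9.8),  # hot spot: isolate or add cooling
]
for r in readings:
    print(evaluate(r))
```

In a real deployment the readings would come from sensors feeding the DCIM platform, and the actions would drive cooling controls rather than print statements – but the visibility-then-automation loop is the same.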
Data center 3.0 and beyond. We are creating a data center platform built around enterprise mobility capabilities and true data distribution. Working and living in a world without digital boundaries allows users to traverse networks seamlessly. Back at the data center – user and resource intelligence allows content to bounce through the cloud and be delivered as efficiently as possible. Through it all – your environment must be running as optimally as possible. IoE is a big reason why data center administrators look for ways to reuse server heat to warm offices, optimize ways to deliver cooling directly into the racks, and deploy highly efficient environmental control methodologies. Your future data center will need to be as agile as the technologies evolving around it.
So how do you keep up with all of this? How do you make sure your data center can continue to help you be competitive in your market? The following advice will really help you keep pace with IoE and the evolution of the data center:
- Never, ever become complacent. Always learn about new technologies – including those outside of the data center world. Because, let’s be honest, not many people thought that my future refrigerator would be impacting your data center architecture.
- Don’t be afraid to evolve. Try new solutions, research new data center optimization tactics, and always look for ways to be even more efficient. The best data centers in the world find an amazing rhythm where they can test and deploy new technologies in parallel with production systems. This gives them visibility into future best practices and lets them evolve – quickly.
CTO, MTM Technologies