Navigating Through a Sea of Data: How Cognitive Solutions Will Impact the Data Center

by Bill Kleyman | Jan 9, 2019 | Blog

It’s often said that data is the new ‘oil’ of our digital lives. And it’s not just inert data, either; it’s central to what we do every single day. IDC estimates that by 2025, nearly 20% of the data in the datasphere will be critical to our lives, and 10% of that will be hypercritical. Here’s the other reality: there’s a lot of it, you have to mine for it, and it’s extremely valuable. The better your mining capabilities, the more value you can extract from that data. This is a big reason so many organizations are investing in advanced cognitive systems to gain an edge and create new competitive solutions. For example, IDC estimates that by 2025, two-thirds of global financial firms will integrate cognitive data from third parties to improve the customer experience through targeted product and service offerings and fraud protection.

Cognitive systems can greatly step up the frequency, flexibility, and immediacy of data analysis across a range of industries, circumstances, and applications. In the same research, IDC estimates that the portion of the global datasphere subject to data analysis will grow by a factor of 50, to 5.2ZB, in 2025, and that the amount of analyzed data “touched” by cognitive systems will grow by a factor of 100, to 1.4ZB.

Edge and IoT

Already in the data center world, we’re seeing new solutions emerge to support advanced data models. Data center architectures are striving to bring data, and entire experiences, much closer to the services and users that consume them. Edge is a great example: edge architectures are specifically designed to support data analytics and cognitive systems. Companies are trying to gain more ‘intimacy’ with their customers, and they’re doing it through new types of services, applications, and certainly IoT. IDC predicts that by 2020, spending on edge infrastructure will reach up to 18% of total IoT infrastructure spend. The ‘closeness’ of these solutions helps them deliver value when and where it is needed (right time, right place, right product).

Generating this intimacy requires proximity: being close enough to the data to manage its volume and produce a timely result. Enabling technologies like 5G and edge computing are accelerating this movement, and it’s a major part of an organization’s digital journey: impacting customers while still delivering powerful business value. Remember, the goal of edge computing is to process data and services as close to the end user as possible. Depending on your use case and the market you’re going after, edge might make a lot of sense: you process data closer to the source, reduce the potential for latency, and get results to your customers faster, as the sketch below illustrates.
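To make that concrete, here’s a minimal sketch of edge-style preprocessing: raw sensor readings are summarized right next to their source, so only a compact result (rather than the full stream) travels back to the core data center. Everything in it, the sensor window, the threshold, and the summary fields, is a hypothetical illustration, not a prescribed design.

```python
# Minimal sketch: aggregate IoT sensor readings at the edge so only a
# small summary crosses the WAN, instead of shipping every raw reading
# to a central data center. All names here are hypothetical.

from statistics import mean

def summarize_at_edge(readings, threshold=75.0):
    """Reduce a window of raw sensor readings to a compact summary.

    Processing happens next to the sensors, so only a few bytes (the
    summary) travel upstream rather than the full stream, and any
    latency-sensitive alerting happens locally.
    """
    return {
        "count": len(readings),
        "avg": mean(readings),
        "max": max(readings),
        "alerts": [r for r in readings if r > threshold],  # act locally, fast
    }

# One window of raw readings (e.g., temperatures) captured at the edge.
window = [70.1, 71.4, 76.2, 69.8, 80.5]
summary = summarize_at_edge(window)

# Only this summary is forwarded to the core data center for long-term
# analytics; the time-critical decision already happened at the edge.
print(summary)
```

The design choice being illustrated is simple: the fewer round trips your data makes to a distant core, the faster your customers see a result.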

Data Center Design

Aside from edge architectures, the way you design your data center may need to evolve as well. Remember, working with cognitive systems means taking a new approach to infrastructure design and resource utilization. Let me give you a specific example:

I/O and infrastructure for cognitive solutions must be a serious consideration. Services, functions, and even virtual machines running cognitive systems are not your typical workloads. With data analytics, the expectation is to actually do something with the data in a timely and efficient manner. A data analytics job on Hadoop, for example, may be CPU-, memory-, or I/O-intensive. A machine learning engine may demand serious CPU cycles. Real-time, AI-driven analytics can be extremely memory-intensive. And if you need to transform or prep data for analytics, you may have very I/O-intensive workloads. The secret for the data center leader is sizing and knowing your market. Are you going after a broad market, or just healthcare or retail, for example? Or maybe you’re aiming your services at content and content providers. In designing your data center to support cognitive systems, conduct both a business and a technology study to size your systems properly; a rough profiling pass like the one sketched below is a good starting point.
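As a starting point for that sizing study, you can sample CPU, memory, and disk I/O while a representative analytics job runs, to see which resource the workload actually stresses. This is a rough sketch, assuming the third-party psutil package is installed (pip install psutil); the sampling window and intervals are placeholders you’d tune for your own jobs.

```python
# Rough profiling sketch: sample CPU, memory, and disk I/O while a
# representative analytics job runs elsewhere on the machine, so you can
# see whether the workload is CPU-, memory-, or I/O-bound before sizing.

import psutil  # third-party package: pip install psutil

def sample_system(seconds=30, interval=1.0):
    """Collect coarse CPU/memory/disk-I/O samples for `seconds`."""
    io_start = psutil.disk_io_counters()
    cpu, mem = [], []
    for _ in range(int(seconds / interval)):
        cpu.append(psutil.cpu_percent(interval=interval))  # blocks `interval` s
        mem.append(psutil.virtual_memory().percent)
    io_end = psutil.disk_io_counters()
    return {
        "cpu_avg_pct": sum(cpu) / len(cpu),
        "mem_avg_pct": sum(mem) / len(mem),
        "disk_read_mb": (io_end.read_bytes - io_start.read_bytes) / 1e6,
        "disk_write_mb": (io_end.write_bytes - io_start.write_bytes) / 1e6,
    }

if __name__ == "__main__":
    # Run this alongside a representative analytics job, then compare the
    # resulting profile against candidate infrastructure designs.
    print(sample_system(seconds=10))
```

A profile like this won’t replace a proper business and technology study, but it quickly tells you which of the workload patterns described above you’re actually dealing with.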

Looking Ahead

If you’re a data center operator or provider looking to get into this space, my biggest piece of advice is: don’t take a ‘traditional’ approach. Take a holistic view of the entire data center and the functions you plan to run. Consider your workloads, your tenants, and the future of your design; all of this will impact power, cooling, airflow, and other key design considerations.

I can pretty much promise you that at some point your business, your customers, or your competition will be doing something with a cognitive engine. In some cases, you’ll simply leverage a service in the cloud. In other cases, you’ll be asked to deploy a solution within the walls of your data center. How ready are you for that challenge? It’ll be important to build the right stable of partners and technologies to help you navigate the data-driven future. Most of all, it’ll be critical for you to ensure you really think outside the ‘data center’ when you design around cognitive solutions.

Bill Kleyman

Director of Technology Solutions at EPAM | Industry Analyst | Board Advisory Member | Writer/Blogger/Speaker | Executive | Millennial | Techie

Bill Kleyman brings more than 15 years of experience to his role as Director of Technology Solutions at EPAM. Using the latest innovations, such as AI, machine learning, blockchain, DevOps, cloud and advanced technologies, Mr. Kleyman delivers solutions to customers that help them achieve their business goals and remain competitive in their market. An active member of the technology industry, he was ranked #16 globally in the Onalytica study that reviewed the top 100 most influential individuals in the cloud landscape, and #4 in another Onalytica study that reviewed the industry’s top Data Security Experts.

Mr. Kleyman enjoys writing, blogging and educating colleagues about everything related to technology. His published and referenced work can be found on WindowsITPro, Data Center Knowledge, InformationWeek, NetworkComputing, TechTarget, DarkReading, Forbes, CBS Interactive, Slashdot and more.

