Enter ‘The Mist’: What You Need to Know About Mist Computing
Since you’re reading about ‘The Mist’ here on Upsite, you can safely assume that no, this is not a short-story IT horror blog. But I think a great way to distract people, at least a little, from all of the crazy 2020 things is with a new tech term: The Mist!
To be clear, it’s not quite as new as you might think. Even if you haven’t heard of mist computing, it’s something that’s already been discussed to a certain extent. Seven years ago, I wrote an article on Data Center Knowledge concerning Fog Computing. There, I wrote that the idea of fog computing is to distribute data, moving it closer to the end-user to eliminate latency and numerous hops, and to support mobile computing and data streaming. Sounds pretty edge-y, right?
Even though Cisco coined the ‘fog’ term, the idea quickly merged into what we know as Edge computing today. This is where we pause for a moment.
I’ve been having numerous conversations in which the definition of Edge computing is already getting blurred. Some have argued that end-user devices, like phones, smart cars, and other smaller connected devices, are already a part of the Edge because they’re processing data. However, Edge computing refers to a slightly larger (but still relatively small) type of system with a specific purpose: to aggregate data as close to devices and users as possible, filter and process some of that data, and send the valuable information on to a data center or the cloud. Edge environments can be as large as a small regional data center, or as small as a shipping container or even something attached to a light post.
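To make the Edge’s role concrete, here’s a minimal, hypothetical sketch (the function name and summary fields are my own, not from any particular platform) of how an Edge node might reduce a batch of raw device readings down to the “valuable information” actually worth shipping to a data center or the cloud:

```python
# A hypothetical Edge-node aggregation step: collect raw readings from
# nearby devices, then forward only a compact summary upstream.
def summarize_for_cloud(readings):
    """Reduce a batch of raw sensor readings to a small summary record."""
    if not readings:
        return None
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# Thousands of raw samples can collapse into one small record for the cloud.
summary = summarize_for_cloud([20.0, 21.5, 19.8, 22.1])
```

The point isn’t the arithmetic; it’s the shape of the flow: raw data stays local, and only the distilled result travels upstream.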
But, those that believe our end-point devices are a part of the Edge aren’t wrong either. They just might be using a term that will likely go out of style soon. Many of the devices at the Edge of the Edge – cell phones, connected cars, smart home devices – are becoming a part of ‘The Mist.’ That is, where data is processed at the furthest reaches of the Edge. This concept of The Mist is fundamental because we are in an era of unprecedented and growing connectivity.
A new report from the American Institute of Physics (AIP) is actually talking about an information catastrophe! Here’s what the report states:
Currently, we produce ∼10²¹ digital bits of information annually on Earth. Assuming a 20% annual growth rate, we estimate that after ∼350 years from now, the number of bits produced will exceed the number of all atoms on Earth, ∼10⁵⁰. After ∼300 years, the power required to sustain this digital production will exceed 18.5 × 10¹⁵ W, i.e., the total planetary power consumption today. After ∼500 years from now, the digital content will account for more than half Earth’s mass, according to the mass-energy–information equivalence principle.
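As a quick sanity check (my own back-of-the-envelope arithmetic, not from the report), the ~350-year figure falls out of simple compound growth:

```python
import math

# Starting from ~1e21 bits produced per year and growing 20% annually,
# solve 1e21 * 1.2**n >= 1e50 for n: the number of years until annual
# bit production exceeds the ~1e50 atoms on Earth.
bits_per_year = 1e21
atoms_on_earth = 1e50
growth_rate = 1.20

years = math.log(atoms_on_earth / bits_per_year) / math.log(growth_rate)
print(round(years))  # roughly 366 years, in the ballpark of the report's ~350
```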
Even though the report is super interesting, it’s important to note that no, we’re not about to implode from all of the data we’re creating. Efficiency in the data center, especially at the hyperscale level, has been improving tremendously. Working in the hyperscale space, I can attest to this. We’re continually looking at ways to improve efficiency, leverage greater levels of density, and even deploy massive, utility-scale battery + solar arrays to power the hyperscale ecosystem. A report from Science validates these efforts, as well as those of other leaders in our industry, by showing that despite massive growth around data, Edge, and cloud computing, efficiency improvements have kept energy usage nearly flat.
The report showed us that even though the amount of computing done in data centers increased by about 550 percent between 2010 and 2018, the amount of energy consumed by data centers only grew by six percent during the same period.
The good news is that we can keep up with data demands and the vast amounts of information we’re creating. The challenge, however, is what we do with all of this data as the proliferation of connected devices continues to increase. With Mist computing, we begin to control and manipulate information far more effectively. The power of the devices at the Mist level allows a few key things to happen:
- Localized analytics and decision-making on local data
- Faster response to localized data for applications and services
- Data geo-fencing and access controls that enforce privacy and greater levels of security at a localized level
- Efficient grouping, filtering, and even pattern recognition at the Mist level
- Deeper integration and usability of smarter devices
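To illustrate the filtering and pattern-recognition items above, here’s a hypothetical sketch of Mist-level processing on a smart sensor (the class name, window size, and threshold are assumptions of mine, not from any particular device platform): the device keeps a short window of recent readings and forwards only the anomalies upstream.

```python
from collections import deque

# Hypothetical Mist-level filter running on the device itself:
# only readings that deviate from the local moving average are
# worth transmitting to the Edge/Fog; the rest stay on the sensor.
class MistSensor:
    def __init__(self, window=5, threshold=2.0):
        self.window = deque(maxlen=window)  # recent readings
        self.threshold = threshold          # deviation that counts as an anomaly

    def ingest(self, reading):
        """Return the reading if it should be forwarded upstream, else None."""
        if len(self.window) == self.window.maxlen:
            avg = sum(self.window) / len(self.window)
            if abs(reading - avg) > self.threshold:
                self.window.append(reading)
                return reading  # anomaly: send to Edge/Fog
        self.window.append(reading)
        return None  # handled locally, nothing to transmit
```

Steady readings never leave the sensor; only a spike crosses the threshold and gets forwarded, which is exactly the kind of localized grouping and filtering the list describes.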
The Mist isn’t for simple door sensors or devices that require connectivity to a gateway. Instead, Mist helps smarter devices at the Edge of the Edge (the extreme Edge) harness their own resources by allowing both computation and communication to happen right on the device or smart sensor. Further, you can provision, deploy, manage, and even monitor what’s being done with data directly on the device. From there, these devices leverage their built-in microcontrollers or microcomputers to process data, then send it up to the Fog/Edge, and finally off to a data center or the cloud.
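The device → Fog/Edge → cloud flow just described can be sketched as a simple tier-placement decision. The latency budgets below are illustrative numbers I’ve chosen for the sketch, not a standard:

```python
# Hypothetical three-tier placement: each tier handles what it can,
# and everything else moves up the hierarchy.
def route(latency_budget_ms):
    """Pick the tier for a workload based on how fast it needs an answer."""
    if latency_budget_ms < 10:
        return "mist"      # on the device's own microcontroller
    if latency_budget_ms < 100:
        return "fog/edge"  # nearby aggregation point
    return "cloud"         # central data center: storage and heavy analytics
```

A real deployment would weigh payload size, power, and privacy as well, but the layered shape is the same: the Mist answers first, and only what it can’t handle climbs the stack.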
Here’s why all of this matters to you
There’s a new saying out there that I’ve heard recently:
“Congratulations. You’ve just completed your five-year IT plan in five months.”
Many IT environments have been in overdrive, ensuring support for many new connections and remote users. This is why The Mist is something you’ll be hearing about more often in the coming year or two. A recent article from McKinsey states that the number of businesses that use IoT technologies has increased from 13 percent in 2014 to about 25 percent today. And the worldwide number of IoT-connected devices is projected to increase to 43 billion by 2023, an almost threefold increase from 2018. And, according to IDC, connected IoT devices, or “things,” will generate 79.4 zettabytes (ZB) of data in 2025.
“Humankind is on a quest to digitize the world, and a growing global DataSphere is the result. The world around us is becoming more ‘sensorized,’ bringing new levels of intelligence and order to personal and seemingly random environments, and Internet of Things devices are an integral part of this process,” said David Reinsel, senior vice president, IDC’s Global DataSphere. “However, with every new connection comes a responsibility to navigate and manage new security vulnerabilities and privacy concerns. Companies must address these data hazards as they advance new levels of efficiency and customer experience.”
Mist computing is here to help organizations control what sits at the extreme Edge. Remember, we’re only at about 50% global penetration when it comes to Internet connectivity. This means that billions of people (and even more devices) have yet to come online. Mist computing has some real-world examples as well. These solutions can be leveraged to deliver exceptional and life-saving experiences in healthcare, remote clinics, and other health-related services. Mist can help connect more people and bring them better services, applications, and device experiences.
Even though this might be something new for you – be ready, the Mist is coming. And it’s the cool kind, not the scary kind.
Real-time monitoring, data-driven optimization.
Immersive software, innovative sensors and expert thermal services to monitor, manage, and maximize the power and cooling infrastructure for critical data center environments.
Executive Vice President of Digital Solutions, Switch | Industry Analyst | Board Advisory Member | Writer/Blogger/Speaker | Executive | Millennial | Techie
Bill Kleyman brings more than 15 years of experience to his role as Executive Vice President of Digital Solutions at Switch. Using the latest innovations, such as AI, machine learning, data center design, DevOps, cloud and advanced technologies, Mr. Kleyman delivers solutions to customers that help them achieve their business goals and remain competitive in their market. An active member in the technology industry, he was ranked #16 globally in the Onalytica study that reviewed the top 100 most influential individuals in the cloud landscape; and #4 in another Onalytica study that reviewed the industry’s top Data Security Experts.
Mr. Kleyman enjoys writing, blogging and educating colleagues about everything related to technology. His published and referenced work can be found on WindowsITPro, Data Center Knowledge, InformationWeek, NetworkComputing, AFCOM, TechTarget, DarkReading, Forbes, CBS Interactive, Slashdot and more.