At the recent Schneider Electric International Colo Club meeting in London I had an interesting conversation with one of our customers who noted that we’re raising a generation of kids who will find it quite normal to talk to their houses. “I’m on the way home, set the oven to 350°F, and put an extra chill on a beer.”
It’s not hard to extrapolate from that to an era where we talk to our cars, office buildings and who knows what else (hopefully still to each other).
Just think about how much data we’ll be generating to make such scenarios possible. It all stems from Internet of Things (IoT) technology, in which an awful lot of “things” generate enormous amounts of data.
The reason I brought this up in my talk at the Colo Club is that it’s all about growth: growth in data, which fuels growth in demand for data centers and growth in colocation provider revenue, at least for those providers that can effectively deal with it. Check out my presentation below.
It’s clear to us at Schneider Electric that the world is getting more connected every day, and to keep up, customers are demanding architectures that are more agile, transparent, safe, and interoperable. As I said during my talk, we also see a world where data is the new oil. And the largest consumer of that oil going forward will be analytics.
Here’s just one example, from a large customer of ours in the minerals, mining and metals vertical. Through its various ports, railways and mines, this client produces 2.4 terabytes (TB) of data every minute. Why? To feed its predictive analytics and preventive maintenance systems. The expected return on that data investment is about $200 million per year over the next three years.
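To put that 2.4 TB per minute in perspective, here’s a quick back-of-the-envelope calculation (my own illustration, not the customer’s figures) of what that rate adds up to over longer periods:

```python
# Scaling the cited 2.4 TB/minute data rate to larger time windows.
TB_PER_MINUTE = 2.4

tb_per_hour = TB_PER_MINUTE * 60        # 144 TB every hour
tb_per_day = tb_per_hour * 24           # 3,456 TB per day, i.e. ~3.5 petabytes
pb_per_year = tb_per_day * 365 / 1000   # roughly 1,260 PB (over an exabyte in a year)

print(f"{tb_per_day:,.0f} TB/day (~{tb_per_day / 1000:.1f} PB/day)")
print(f"~{pb_per_year:,.0f} PB/year")
```

That’s petabytes per day from a single industrial customer, which is exactly why the processing has to happen close to where the data is generated.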
Those are some astounding numbers – and a great example of how IoT technology can be used to gain a competitive edge. But to make it work effectively will require lots of edge data centers to help process all that data. And that will put pressure on colo providers to deliver data centers that are agile and reliable enough to meet these ever-increasing demands.
The good news is those same analytics engines that are consuming so much “oil” can also be applied to data centers to help improve their performance, reliability, efficiency and cost-effectiveness. That’s what our EcoStruxure for Cloud and Service Providers is all about.
EcoStruxure uses a three-layer architecture. At the bottom are all the connected “things,” or products, in your data centers: UPSs, PDUs, and power and cooling systems. They don’t all have to be Schneider Electric products, because EcoStruxure is an open architecture, built to reflect the heterogeneous nature of data centers.
In the middle is the edge control layer, where EcoStruxure enables local autonomy for monitoring and acting on alarms, whether at the building level, for IT systems, or for power systems.
The top layer is where the real magic happens. This is where all the data bubbles up to feed applications, analytics and services. EcoStruxure IT Advisor, for example, collects data not just from your data center, but from lots of others. It includes detailed data about building infrastructure, energy use, cooling systems and more. We then apply artificial intelligence and machine learning tools to extract useful information, whether about the life expectancy of various components, ways to increase energy efficiency, or many other areas. And it’s all delivered to you as a cloud-based service.
We believe EcoStruxure will be crucial to helping our colocation provider customers meet the demands that their customers are making, and seize the opportunity that is before them in the IoT age.
To get a taste of what EcoStruxure can do, check out this video about how one of our customers, the large European data center provider Interxion, is using it to drive efficiency in one of its newer data centers in Marseille (which we also helped them build in just two months, but that’s another story).