AI and Machine Learning: Key Challenges in Colocation Data Center Market


In my last blog, I talked about some of the hype going on in the industry over artificial intelligence (AI) and machine learning (ML). That post looked at what current AI techniques are fundamentally capable of today, and I also offered definitions for these often-misunderstood terms in the context of data centers. In this post, I’d like to develop the discussion by describing 3 key challenges the industry needs to address and resolve if AI tools are to be broadly adopted and deliver their full value for colocation providers.

3 Things the Data Center Market Needs To Overcome for Broad AI Adoption


The first challenge is instrumenting the data center. The old adage “garbage in, garbage out” applies here more than ever. Despite their “black box” nature, machine learning algorithms and deep neural networks are not magic. Like any analytics engine, they need large volumes of good data to act on. Those with well-implemented DCIM suites are probably in good shape. But part of this instrumentation challenge also falls on the equipment vendors. Does their hardware collect and report the information necessary to make the algorithms work? Schneider Electric has long been digitizing and instrumenting its UPSs, cooling units, PDUs, switchgear, and so on – ahead of others in the industry. But as we develop AI use cases and the algorithms to support them, we may realize we need new sensors in new places that we don’t have today. For example, we might find that placing a vibration sensor in a different location gives us more proactive visibility into the life cycle of that system than we have today. Things like this will evolve over time.
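To make this concrete, here is a minimal sketch (in Python) of how telemetry from instrumented equipment might be gathered into a uniform record stream that an ML pipeline could consume. The asset names, metrics, and the poll_sensor() helper are hypothetical placeholders, not Schneider Electric’s actual API; in a real deployment these readings would come from the devices themselves or from a DCIM/BMS gateway.

```python
# A minimal sketch, assuming hypothetical device names and metrics, of turning
# equipment telemetry into a uniform stream of records for an ML pipeline.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class SensorReading:
    asset_id: str       # e.g. "UPS-03" or "CRAH-12" (placeholder names)
    metric: str         # e.g. "vibration_mm_s", "fan_speed_rpm"
    value: float
    timestamp: datetime


def poll_sensor(asset_id: str, metric: str) -> SensorReading:
    """Placeholder for a device-specific read; in practice this would come
    from Modbus, SNMP, BACnet, or a vendor gateway."""
    raw_value = 0.42  # stand-in for a real measurement
    return SensorReading(asset_id, metric, raw_value,
                         datetime.now(timezone.utc))


# Poll a few hypothetical points and build the uniform stream.
readings = [
    poll_sensor("UPS-03", "vibration_mm_s"),
    poll_sensor("CRAH-12", "fan_speed_rpm"),
    poll_sensor("PDU-07", "branch_current_a"),
]
for r in readings:
    print(f"{r.timestamp.isoformat()} {r.asset_id} {r.metric}={r.value}")
```

The point is simply that every signal ends up in the same asset/metric/timestamp shape, which is the kind of consistent, high-volume input a learning algorithm ultimately needs.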

The second challenge is that this data has traditionally lived in disparate systems. Facility data lives in the BMS, power quality information in the electrical power monitoring system (EPMS), information about the white space infrastructure in DCIM tools, and IT software/virtual resources in IT operations management tools. For the system to understand all the critical variables and how they connect to and affect each other, this data has to be consolidated and fed into the AI model. Consolidating all this disparate data is a challenge that has not yet been fully solved. The new Schneider Electric EcoStruxure™ System Architecture and Platform, however, goes a long way toward solving it. Without consolidation, AI applications are limited to much narrower functions like air handler optimization or early warning on cooling unit fan failures. These are useful functions, of course, but they are not earth-shattering.
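As a rough illustration of what consolidation means in practice, the sketch below joins small, hypothetical time-series extracts from a BMS, an EPMS, and a DCIM tool into one table keyed on asset and timestamp. The column names and values are invented for the example; the actual data model in EcoStruxure or any other platform will look different.

```python
# A minimal sketch, assuming hypothetical extracts from three monitoring
# systems, of aligning them into a single table a model can train on.
import pandas as pd

bms = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-01-01 00:00", "2024-01-01 00:05"]),
    "asset_id": ["CRAH-12", "CRAH-12"],
    "supply_temp_c": [18.2, 18.4],
})
epms = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-01-01 00:00", "2024-01-01 00:05"]),
    "asset_id": ["CRAH-12", "CRAH-12"],
    "input_kw": [11.8, 12.1],
})
dcim = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-01-01 00:00", "2024-01-01 00:05"]),
    "asset_id": ["CRAH-12", "CRAH-12"],
    "rack_inlet_temp_c": [22.9, 23.1],
})

# Join the three sources on asset and timestamp into one training table.
merged = (bms.merge(epms, on=["timestamp", "asset_id"])
             .merge(dcim, on=["timestamp", "asset_id"]))
print(merged)
```

Once the sources line up in a single table, a model can learn relationships across domains – for example, how cooling unit power draw tracks rack inlet temperature – which is exactly what stays out of reach when each system keeps its own silo.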

The third data challenge is what we refer to as data integrity. All this data needs to be correlated, and it needs context; the model needs to know exactly where the data comes from. For a given dataset from a specific asset, the model might need to know things like: site, room, row, rack, U-space, power path, network port, and policy requirements. The time periods also need to be synchronized in some way. DCIM tools require having all of this mapped out and defined, but it takes a lot of effort and resources to set that up initially and then to maintain it as things change over time. It is largely up to us vendors to simplify all of this and hide the complexity.
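The sketch below illustrates the data-integrity step under the same caveats: raw readings are snapped to a common time grid and joined against an asset registry that records the physical and logical context (site, room, row, rack, power path). The registry fields and values are hypothetical; in practice a DCIM tool would hold and maintain this mapping.

```python
# A minimal sketch, assuming a hypothetical asset registry, of adding context
# and time alignment to raw readings before they reach a model.
import pandas as pd

readings = pd.DataFrame({
    "timestamp": pd.to_datetime(
        ["2024-01-01 00:00:07", "2024-01-01 00:04:58", "2024-01-01 00:09:55"]),
    "asset_id": ["UPS-03", "UPS-03", "UPS-03"],
    "load_kw": [41.0, 43.5, 42.2],
})

# Asset registry supplying the physical and logical context the model needs.
assets = pd.DataFrame({
    "asset_id": ["UPS-03"],
    "site": ["DAL-1"],
    "room": ["Electrical A"],
    "row": ["R4"],
    "power_path": ["A-side"],
})

# Snap timestamps to a shared 5-minute grid, then attach the context.
readings["timestamp"] = readings["timestamp"].dt.round("5min")
contextualized = readings.merge(assets, on="asset_id", how="left")
print(contextualized)
```

The hard part is not the join itself but keeping that registry accurate as assets move and change over time, which is why so much of the burden falls on the tooling and on us vendors to hide the complexity.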

What’s Next for AI in the Data Center Market that Colocation Providers Need to Consider

The point here is that these challenges exist and still need to be worked out before AI use in data centers becomes common practice and colocation providers can fully realize the benefits. Having a well-implemented and maintained DCIM system is a key first step for colo providers. Such a system will provide the necessary metering and contextual data that will make AI tools effective. Stay tuned for my next blog, where I’ll share how we think AI will be applied specifically in colocation data centers in the near term – as well as why AI’s reliability will have a bigger impact on their business growth.


Conversation

  • RISHI JAIN

    6 years ago

Nice article. Are there any customers who are building new data centers and ready to install all sorts of sensors required for AI?

    • Patrick Donovan

      6 years ago

Hi Rishi Jain, yes, I think customers who have well-implemented DCIM, BMS, and EPMS systems are well positioned to take advantage of AI applications as they are developed. Many of the required sensors, however, will be in the UPSs, power meters, breakers, cooling units and so on. So it will be on the vendors to embed many of the needed sensors, I think.

  • Ka Ahmadou

    6 years ago

Very nice article. I would like to know: what kinds of machine learning and AI applications does Schneider have?
Best regards
