The data center industry is sitting on a massive resource which could enable huge financial savings
I’ve taken the title for this blog from a famous English poem, “The Rime of the Ancient Mariner” by Samuel Taylor Coleridge. The verses relate the experiences of a sailor returned from a long sea journey. During the voyage, his ship is becalmed in uncharted waters near the equator and those aboard are tormented to death by thirst. All the while, the ship itself is floating in the solution to the crew’s problems. Water everywhere, but nothing to drink.
I believe this is a potent metaphor for our industry. But this is not a blog about water use in data centers, as important as that is. It is about the ubiquity of a resource that could be instrumental in delivering important productivity and efficiency gains. I’m talking, of course, about data. Despite all the uses we are already putting data to, there has been an ironic reluctance to make the most of what is, for most data centers, a free resource.
The entire data center sector exists to support the communication, processing and storage of data. But besides the data we are handling, we also have data pouring out of the physical infrastructure that provisions a mountain of IT and network equipment. Together with sensors monitoring everything from airflow to temperature and humidity, we already have a potential embarrassment of riches that can readily be tapped.
When Data Center Infrastructure Management (DCIM) software was first defined and mooted as a way to improve how we design, operate and upgrade facilities, a perceived barrier to market adoption was the cost and complexity of instrumentation. Without instrumentation, went the argument, the data required to deliver on the promises of DCIM simply wasn’t available. Less than a decade later, nothing could be further from the truth.
In my opinion, we already have enough temperature sensors installed in data center cabinets around the globe to be making sweeping operational improvements and efficiency-driven cost reductions. What’s more, with very little effort and the aid of a little analytics, this information and guidance could be freely and securely accessed by any data center owner or operator within a very short timescale. I’m talking days or weeks, not months or years.
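To make that concrete, here is a minimal sketch of the kind of “little analytics” I have in mind: it scans cabinet inlet-temperature readings and flags cabinets running well below the ASHRAE-recommended band, a common symptom of overcooling and wasted cooling energy. The file layout, column names and thresholds are illustrative assumptions, not a description of any particular monitoring product.

```python
# Illustrative sketch only: flag cabinets whose inlet temperatures suggest
# overcooling relative to the ASHRAE-recommended band (18-27 degC).
# The CSV layout and column names are assumptions, not a real product API.

import csv
from collections import defaultdict
from statistics import mean

ASHRAE_LOW_C = 18.0   # lower edge of the recommended inlet-temperature band
ASHRAE_HIGH_C = 27.0  # upper edge of the recommended band

def load_readings(path):
    """Read rows of (cabinet_id, inlet_temp_c) from a simple CSV export."""
    readings = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            readings[row["cabinet_id"]].append(float(row["inlet_temp_c"]))
    return readings

def overcooled_cabinets(readings, margin_c=3.0):
    """Return cabinets whose average inlet temperature sits well below the
    recommended band and whose peaks leave headroom to raise set points."""
    flagged = []
    for cabinet, temps in readings.items():
        avg = mean(temps)
        if avg < ASHRAE_LOW_C + margin_c and max(temps) < ASHRAE_HIGH_C:
            flagged.append((cabinet, round(avg, 1)))
    return sorted(flagged, key=lambda item: item[1])

if __name__ == "__main__":
    data = load_readings("cabinet_inlet_temps.csv")
    for cabinet, avg in overcooled_cabinets(data):
        print(f"{cabinet}: average inlet {avg} degC - likely overcooled")
```

Nothing here is exotic: a list of chronically cold cabinets is exactly the sort of guidance that already-installed sensors can surface, and raising cooling set points where it is safe to do so is one of the simplest efficiency levers an operator has.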
In recent years it’s become fashionable to say that anything that appears too good to be true probably is. However, big data and data analytics have already shown their worth in a variety of uses, from disaster response and pest control to crop yields and drug interactions (see 4 Breakthrough Uses for Big Data and Why You’ll Benefit From Them). Researchers at Trinity College Dublin are even using big data to solve challenges created by the demand for big data.
A recent white paper from Schneider Electric’s Data Center Science Center looks at big data analytics alongside six other trends that are influencing data center monitoring: cloud computing, mobile computing, machine learning, improvements in embedded system performance and cost, automation for labor savings, and cyber security. “Digital Remote Monitoring and how it changes Data Center Operations and Maintenance” is free to download.
If the data center industry is going to thrive sustainably, we need to take advantage of the very technology we provision. That’s not exactly rocket science. The commercial and environmental drivers for greater efficiency at lower cost will only harden in the coming decade. We need to make good decisions by getting the right data into the hands of data center professionals. If we don’t respond in the immediate future, I fear we’ll share the fate of the ancient mariner’s crew.