Pushing the Data Center Boundary Blog: Where is the data in data center?

Has anyone ever noticed that an awful lot of data centers have plenty of management systems, yet the poor data center manager never seems to have the data he needs?

I’ve seen it many times – management systems that simply don’t work.  They don’t provide the right information to the right user at the right time.  As a developer of such products and tools, it’s inevitable that we get lots of feedback (some more progressive people call it constructive criticism) on our management tools.

The reality in my view is that most data centers are caught in a vicious paradox of two states:

State 1 (let’s call it Bliss):  “I only need to see something when there’s a problem”.  When the data center manager is in this state of mind we usually receive feedback along the lines of “your tool is too complicated”, or “I need more information, less data”, or “can’t you just send me an email or better yet a Tweet!”

State 2 (let’s call it Terror):  “I just had an event and I can’t get the data I need”.  When the data center manager is in this state of mind we usually receive feedback along the lines of “I can’t find this particular piece of data – why and where are you hiding it?”.

This bipolar disorder of Bliss and Terror, I believe, is the natural state of the data center world.  Most human brains don’t want to process data and information unless they have to – and they really only have to when something has gone wrong.  Once the wrong occurs, the data center manager’s brain becomes like a character on the TV show CSI – digging deep into forensic data to find the events that led to the death of availability.

Most failures are an error chain – a bunch of small things; if any one of them hadn’t happened, the failure wouldn’t have occurred.  Solving these mysteries means you have to have a lot of data in order to piece together the chain of events.  But most of the time we want to (and do) live in Bliss – and we don’t want to be bothered with the details.

This challenge is only getting worse because data centers need to collect more data to provide better information than ever before.  Why you may ask?  Because of what seems to be the standard answer to all data center questions these days:  Virtualization and Efficiency.

Virtualization and efficiency are driving data centers to run at higher power densities and closer to capacity limits, which in turn demands better management systems to cope with the reduced safety margins these trends imply.   This is a big topic I’ll save for another day, but you get the idea, I hope.  Simply put, the chance of making a mistake is higher, and the transformation from Bliss to Terror is that much closer and easier.

It doesn’t help these poor data center manager brains when you have traditional proprietary management systems holding the data in some safe like Fort Knox.  (Maybe Al Gore should have thought of these when he started talking about ‘lock boxes’).  The days of software vendors holding a customer’s data hostage by keeping it ‘safe’ in a lock box are gone.

The future is going to be lots of data, openly available, that can be easily manipulated and analyzed so that it can be delivered to the right person the way that person wants to see it.

You’ll be able to spend your Blissful time receiving reassuring Tweets from your data center saying “I’m OK, are you OK?”.

And when in Terror you will be able to put on the doctor’s jacket, dive deep into the data and perform an autopsy to analyze the sequence of events leading to failure.
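The Bliss/Terror split sketched above maps naturally onto a monitoring design: record every raw reading (Terror will need the whole error chain), but surface only a one-line reassurance while things are healthy. Here is a minimal, hypothetical sketch in Python; the class and method names (`Monitor`, `record`, `status`, `forensics`) and the power-limit check are illustrative assumptions, not taken from any real product.

```python
from collections import deque
from datetime import datetime, timezone

class Monitor:
    """Toy telemetry monitor: terse status in 'Bliss', full history for 'Terror'."""

    def __init__(self, limit_kw, history=10_000):
        self.limit_kw = limit_kw          # assumed per-rack power threshold
        self.events = deque(maxlen=history)  # keep every reading we can afford to

    def record(self, rack, power_kw):
        # Always store the raw reading -- forensics needs the whole error chain.
        self.events.append({
            "ts": datetime.now(timezone.utc).isoformat(),
            "rack": rack,
            "power_kw": power_kw,
            "over_limit": power_kw > self.limit_kw,
        })

    def status(self):
        # Bliss: one reassuring line unless something is wrong.
        problems = [e for e in self.events if e["over_limit"]]
        if not problems:
            return "I'm OK, are you OK?"
        return f"{len(problems)} reading(s) over limit"

    def forensics(self, rack):
        # Terror: pull every reading for the suspect rack, in arrival order.
        return [e for e in self.events if e["rack"] == rack]
```

For example, after `record("A1", 5.2)` the status is the reassuring tweet-sized line; after `record("A1", 9.1)` it flips to a warning, and `forensics("A1")` returns both readings so the sequence of events can be reconstructed.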

Easy to say, hard to do.


Please follow me on Twitter!


About Kevin Brown:

Kevin Brown is Vice President, Data Center Global Offer for Schneider Electric. He leads a team of industry professionals to develop and bring to market solutions for the data center market.  In this role, he is responsible for articulating the vision for Schneider Electric’s data center offer and creating comprehensive data center solutions that solve real customer problems today. Kevin is an experienced professional in both the IT and HVAC industries. He has over 20 years’ experience at Schneider Electric in a variety of senior management roles including product development, product management, marketing, and sales.



  • Hi Kevin,

    Your reference to “data centers need to collect more data to provide better information than ever before” is part of what the software and IT industry is calling “the Big Data Dilemma”.

    Regarding State 1 (Bliss): too many systems do wait to report, in some fashion, only when something goes wrong. This is often by design, because quite frankly there are fewer examples of systems in the world that can proactively identify potential issues as well as potential savings, new markets, and opportunities. The financial and medical fields are good examples of industries utilizing such proactive systems. Data may also be hidden in places that would surprise many people. In an internal paper I recently wrote on the use of Hadoop interoperability with SQL Server, I cited Microsoft’s estimate that 80% of the data in the world is locked away in data stores other than relational databases.

    Regarding State 2 (Terror): data centers are going to see an influx of systems that will have to handle the “Big Data Dilemma” as the Smartgrid, the Cloud, and the general virtualization of systems and platforms become the norm across Schneider’s various lines of business. Information is built from data, and we should trend toward collecting as much data as we can, because we never know just how valuable it may be. If we collect data, and even archive or warehouse 95% of it, we will at least have it for future mining or harvesting needs. Harvesting can be thought of as the act of picking data we know exists; mining is digging into data we are unsure of. This is where we’ll find new markets, cost savings, etc. It is an often overlooked concept in data collection projects the world over. As we know, you cannot report on what you do not have.

    And of course with this data being “openly available, that can be easily manipulated and analyzed…” comes an emphasis on security to ensure “the right person” sees it and no other. As we collect more data, we will do more with it; as we do more with it, we will rely upon it more; and as we rely upon it more, it will become more important to us. This means disaster recovery and cyber security will become more important than ever.

    Stephen Dillon, Data Architect EMIS2 – Schneider Electric

    • Stephen,

      Very insightful thinking. I agree wholeheartedly with your comments on data mining and the need for harvesting.

      Looks like it’s a good time to be a software guy!

      Thanks for the comment.


Comments are closed.