I recently presented at the Gartner Symposium, and during my preparation I revisited some of the concepts from my “Where is the data in data center” blog. Based on the questions I have been receiving, I decided to expand on some of that thinking.
In that blog, I mentioned that the future will be open systems, but I didn’t describe how this should happen.
It seems inevitable that any discussion of open systems will lead to certain people searching for a central planning organization that will divine a standard. While these organizations have noble objectives, they move slowly because they are consensus driven. [I’ll see you 10 years from now for revision 1 (draft)…]
I was thinking about this issue during a Google presentation at the Symposium. Google discussed their view on the future of social networking/cloud computing/chrome books/google docs and the end of Microsoft as we know it [Ok – they didn’t really say that…].
What resonated with me during this presentation is that we should be thinking about management tools in a context of an information flow model. An analogy came to mind with respect to the information flow model of Twitter and/or Facebook (or Google+) versus dealing with the IRS.
I assert that data center management tools should work under an information flow philosophy that:
1) information should be made easily available by the provider
2) information receivers, not providers, should filter ‘important’ from ‘not important’ information
3) industry-wide standardized data structures are nice, but they aren’t a requirement
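The three principles above can be sketched as a simple publish/subscribe loop: the provider publishes every event it has, and each receiver applies its own filter. This is a minimal illustration of the philosophy, not any particular vendor's API; all names here are hypothetical.

```python
from typing import Callable

class Feed:
    """Provider side: publish every event, edit nothing."""
    def __init__(self):
        self.subscribers: list[Callable[[dict], None]] = []

    def subscribe(self, callback: Callable[[dict], None]) -> None:
        self.subscribers.append(callback)

    def publish(self, event: dict) -> None:
        # The provider does not decide what is 'important' --
        # every subscriber sees every event.
        for callback in self.subscribers:
            callback(event)

class Receiver:
    """Receiver side: decide locally what matters."""
    def __init__(self, is_important: Callable[[dict], bool]):
        self.is_important = is_important
        self.inbox: list[dict] = []

    def on_event(self, event: dict) -> None:
        if self.is_important(event):
            self.inbox.append(event)

# One receiver only cares about temperature alarms; another keeps everything.
feed = Feed()
alarms = Receiver(lambda e: e.get("type") == "temperature_alarm")
archive = Receiver(lambda e: True)
feed.subscribe(alarms.on_event)
feed.subscribe(archive.on_event)

feed.publish({"type": "temperature_alarm", "rack": "A7", "celsius": 41})
feed.publish({"type": "heartbeat", "rack": "A7"})
```

Note that no standard schema is required here: the provider ships whatever it knows, and each receiver's filter is the only place where "important" gets defined.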
I think Twitter is a great analogy for this information flow philosophy. When I tweet, I share things that I think somebody out there may be interested in hearing about. I am sure that many times the people following me aren’t interested, but I leave that up to them.
I do the same as a receiver of information from the people I follow. I don’t expect everything they tweet will be interesting or important [a lot of it isn’t!] – but that’s for me to decide.
And Twitter didn’t wait for some standards body to determine how they should be making the information available. They just did it – and now practically everything is integrating with Twitter. It’s like the movie Field of Dreams – build it and they will come.
In contrast, I thought about how the IRS behaves during an audit [from what I hear…really….I’ve never been audited….]. From what I understand, those who have gone through an audit work under a different model of information flow. Under the IRS information flow model:
1) The IRS only gives you the information they want to
2) The IRS decides what information is important, and
3) The IRS works under a bunch of regulations that tells them how they are supposed to behave. Which……….can….slow…..things…..down…..
In other words, one model puts the provider in the position of power (the IRS), while the other puts the receiver in the position of power (social networks).
Most management tool vendors, I argue, tend to think of themselves and behave like the IRS. They view the data they hold as their value. They don’t philosophically want to share that information openly, and they tend to take the position that they will decide what information is appropriate to deliver.
Instead, I believe that all management tools should openly share their information with whoever subscribes. And they should not edit what information is available – let the receiver figure it out.
And I can now hear the loud chorus of central planners getting ready with their response: “we can’t do that until we have a standard data construct – we must have a committee and create a standard.” I believe that’s really just an excuse for vendors not to move. Web services can expose anyone’s data in an open, easily readable format. Ideally, there would be standards in place to make everyone’s life easier, but do we really need to wait?
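To make the web services point concrete: a tool can simply serialize everything it has as JSON and let the consumer filter on the other end. The field names below are invented for illustration, not taken from any real product.

```python
import json

# Provider side: export everything as JSON. No committee-approved schema --
# the field names are simply whatever this (hypothetical) tool uses internally.
def export_all(readings: list) -> str:
    return json.dumps(readings)

payload = export_all([
    {"sensor": "ups-1", "metric": "load_pct", "value": 62},
    {"sensor": "crac-3", "metric": "supply_temp_c", "value": 18.5},
    {"sensor": "ups-1", "metric": "battery_ok", "value": True},
])

# Receiver side: parse the open format, then keep only what this
# receiver happens to care about.
data = json.loads(payload)
ups_metrics = [r for r in data if r["sensor"] == "ups-1"]
```

No standards body had to bless the schema before the two sides could talk; the receiver only needs the format to be open and readable.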
I remember in the 1990s when we were implementing SNMP – everyone wanted the standard MIB support, but we thought it was too limiting, so we supported both the standard and our own expanded version. Market research later confirmed that the vast majority of our customers’ implementations went with our expanded MIB because they wanted more information rather than less. What a disservice if we had decided to wait until the standard was completed….
It’s time for data center management tools to start implementing a social network and stop behaving like the IRS.
Please follow me on Twitter @KevinBrown77
About Kevin Brown:
Kevin Brown is Senior Vice President of Innovation and Chief Technology Officer for the €3.7 billion IT Division at Schneider Electric. In this role, he is responsible for driving innovation and managing the R&D portfolio for the IT Division, as well as driving the overall Schneider Electric portfolio strategy for the Data Center market. Prior to this position, Kevin served as Vice President, Data Center Global Strategy and Technology. Kevin has also held numerous senior management roles in product development, engineering, and software development in the power electronics and HVAC industries. He holds a Bachelor of Science in Mechanical Engineering from Cornell University.