During my visit to Schneider Electric’s European Technology Center, I had a fascinating conversation with Henrik Leerberg, Product Line Director for StruxureWare for Data Centers, about how software may now be changing the dynamics of data center management. Given all the focus and progress being made on virtualization in the data center, I asked Henrik who was looking after the physical servers these days.
Henrik told me that this is a real dilemma in many companies now: no one is as concerned about the servers anymore – the physical equipment on the IT side. IT departments have moved their focus on and are now increasingly concentrated on the applications and services they’re providing to the business via the data center, and as a result are less worried about what’s underneath. As long as they have the compute, the storage and the network they need to deliver their application or service, they’re happy.
From Schneider Electric’s point of view, the groups best placed to take over responsibility for looking after the physical servers are facilities departments and data center managers. To do this successfully, though, the facilities organisation is going to need new skills and new software to understand how a server should be operating rather than what goes on inside it. They also need to know where it should be located and how it should be connected, as well as the types of information already being provided by DCIM applications.
This is an interesting shift, as when DCIM was first introduced there was a lot of talk about how this kind of software would be the piece that finally bridged the gap between IT and facilities. It seems somewhat ironic that in fact it’s the IT equipment that’s moving under facilities departments, rather than the IT teams taking a greater involvement in the physical aspects of facilities.
Henrik agreed that this is a fascinating development, and one that many in the DCIM market thought would be the opposite of what’s actually happening now. So the facilities department is expanding its remit, so to speak, and moving up the technology stack. IT is also moving up the stack, but becoming more abstracted and disconnected from the physical side of the data center.
I was curious to understand Henrik’s view on whether this means we’re stepping closer towards a more Software Defined Data Center, and he told me that he does believe this is a first step of sorts in that direction. For many years now, compute has been getting more virtualised and abstracted away from the physical computer. Today you can also see virtualised or software-defined networks and storage, with DCIM being one of the tools or platforms that’s there to understand the entire topology between the different devices in the physical layer.
So DCIM understands how a server’s connected to the power chain, how it’s being cooled, where it’s located and so on, as well as performance characteristics such as how much memory it has. DCIM can measure and understand these data points and more, but Henrik doesn’t see its role expanding to be that of a hypervisor, for example, when there are partner applications and tools to do this. Henrik does see DCIM as needing to work in an open ecosystem, receiving data from and providing data to the abstracted, virtualised layers and their accompanying software managers and tools. In this situation, DCIM data and software will be able to help hypervisor software make the right decisions about moving workloads around a physical data center, and will be a key part of how the data center of tomorrow operates.
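To make that idea concrete, here is a minimal sketch of how DCIM-style physical data (power-chain headroom, cooling conditions) might feed a hypervisor’s workload-placement decision. Everything here – the `RackStatus` fields, thresholds and function names – is illustrative and hypothetical, not the StruxureWare API or any real DCIM product’s data model.

```python
# Hypothetical sketch: using DCIM-style physical-layer data to rank racks
# that could safely absorb a migrated workload. All names and fields are
# assumptions for illustration, not a real DCIM or hypervisor API.
from dataclasses import dataclass

@dataclass
class RackStatus:
    """A DCIM-style snapshot of one rack's physical conditions."""
    name: str
    power_draw_kw: float      # current draw on the rack's power chain
    power_capacity_kw: float  # breaker/PDU limit for the rack
    inlet_temp_c: float       # cooling health at the rack inlet

def placement_candidates(racks, extra_kw, max_inlet_c=27.0):
    """Return racks that could take on an extra workload of `extra_kw`
    kilowatts without tripping power or cooling limits, listed with the
    most power headroom first."""
    ok = [r for r in racks
          if r.power_draw_kw + extra_kw <= r.power_capacity_kw
          and r.inlet_temp_c <= max_inlet_c]
    return sorted(ok,
                  key=lambda r: r.power_capacity_kw - r.power_draw_kw,
                  reverse=True)

racks = [
    RackStatus("A01", power_draw_kw=9.5, power_capacity_kw=10.0, inlet_temp_c=24.0),
    RackStatus("A02", power_draw_kw=4.0, power_capacity_kw=10.0, inlet_temp_c=25.5),
    RackStatus("A03", power_draw_kw=5.0, power_capacity_kw=10.0, inlet_temp_c=31.0),
]

# A01 lacks power headroom for 2 kW more; A03 is running too hot.
print([r.name for r in placement_candidates(racks, extra_kw=2.0)])  # ['A02']
```

In a real deployment this kind of filter would sit behind an integration between the DCIM platform and the hypervisor’s management layer, which is exactly the open-ecosystem data exchange Henrik describes.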