Somewhere in my Dad’s toolbox you’ll find an ordinary table knife. I suppose at some time it was used for eating meals, but not in my parents’ home. For as long as I can remember, that knife has been on hand whenever my Dad was playing handyman. Before molded electrical plugs, it was used to wire new appliances. It opened tins, stirred paint, and was used for small filling jobs. Its serrated cutting edge removed grouting from between tiles. It was like an ancient progenitor of the Swiss Army knife, with one blade to do it all. Once it seemed to have a talismanic value for DIY tasks; today it is past its best.
Surprisingly, after a number of years in the DCIM (data center infrastructure management) market designing and building specialized tools to manage and optimize today’s data factories, the application I see used most regularly is the spreadsheet. It seems a choice out of step with global megatrends such as urbanization, digitization, industrialization and the IoT, seismic shifts that are driving an ever upward spiral in data production and an even heavier dependence on the factories of the future: the facilities where data is processed, stored, analyzed, turned into useful information and transmitted.
Today we need to address not only the way that we provide infrastructure, but also the way that we manage it. We all know that. It raises questions about the appropriateness of the tools in use, and about whether those looking at day-to-day data center operations from the outside would approve of non-specialized management tools being used for these most vital organizational resources.
During a visit by a group of data center professionals to our Kolding Solutions Center, I asked for a show of hands from those using spreadsheets in their facilities. A lot of hands went up. I asked how many of the spreadsheets were up to date. Far fewer than half. In fact, around 40 percent reckoned the information was inaccurate, and that was before we started looking at human error, one of the hazards of manual data input.
There’s no doubt that spreadsheets have had a place in managing inventory and even network connections. More than that, I’ve been introduced to some very sophisticated data center applications built in Excel. For example, I met a group who had developed complex algorithms and calculations to determine power backup requirements from UPS and gensets. But it was highly specialized work that only those well versed in three-phase circuits could usefully employ.
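For a flavor of the kind of calculation involved, here is a minimal sketch in Python; the load figure, power factor, redundancy scheme and growth margin are all invented for illustration and have nothing to do with that group’s actual model:

```python
# Minimal sketch of a UPS sizing estimate: convert an IT load (kW) to
# apparent power (kVA), add growth headroom, then scale for redundancy.
# All figures are illustrative assumptions, not real design values.

def required_ups_kva(it_load_kw: float,
                     power_factor: float = 0.9,
                     growth_margin: float = 0.25,
                     redundancy_factor: float = 2.0) -> float:
    """Total UPS capacity (kVA) to install for the given IT load."""
    kva = it_load_kw / power_factor    # kW -> kVA at this power factor
    kva *= 1 + growth_margin           # leave room for load growth
    return kva * redundancy_factor     # e.g. 2.0 for a 2N design

# Example: a 400 kW IT load with 2N redundancy
print(f"{required_ups_kva(400):.0f} kVA")  # -> 1111 kVA
```

The arithmetic itself is simple enough; the point is that every input still arrives by hand.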
Talking with the group revealed more insights, from the lack of a reliable way of ensuring that data is correct to the inability to build a relational database within the tool: no dependencies, no information sharing, and no way of using it at IT, facilities and CxO level, since the spreadsheet has to be driven, in effect, by a single person.
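To make the dependency point concrete, here is a minimal sketch, using SQLite purely for illustration and with an invented schema, of the kind of relational link (server to rack to power feed) that a flat spreadsheet cannot enforce:

```python
import sqlite3

# Invented three-table schema: a server depends on a rack, which depends
# on a PDU. Foreign keys make the dependency explicit and enforceable,
# which rows in a flat spreadsheet cannot guarantee.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")
con.executescript("""
    CREATE TABLE pdu    (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE rack   (id INTEGER PRIMARY KEY, name TEXT,
                         pdu_id INTEGER NOT NULL REFERENCES pdu(id));
    CREATE TABLE server (id INTEGER PRIMARY KEY, name TEXT,
                         rack_id INTEGER NOT NULL REFERENCES rack(id));
""")
con.execute("INSERT INTO pdu    VALUES (1, 'PDU-A')")
con.execute("INSERT INTO rack   VALUES (1, 'Rack-01', 1)")
con.execute("INSERT INTO server VALUES (1, 'web-01', 1)")

# One join answers "which power feed does this server depend on?"
print(con.execute("""
    SELECT server.name, pdu.name
    FROM server
    JOIN rack ON server.rack_id = rack.id
    JOIN pdu  ON rack.pdu_id   = pdu.id
""").fetchone())  # -> ('web-01', 'PDU-A')
```

DCIM tools model these relationships natively; in a spreadsheet, the same dependency is just three cells that happen to agree today.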
An early engagement we had after launching StruxureWare for Data Centers was with a major colo in Europe. They were doing something fantastic with spreadsheets to manage their twenty-eight facilities. However, it was highly manual and therefore time-consuming. There was no monitoring and no scope for real-time data and analytics, so no guidance was coming back to help them do their jobs better. It was pretty much the full-time work of a whole team of people, and we could probably argue they would have been more usefully employed elsewhere in the data center.
Interestingly, today we see fewer adds, moves and changes in the white space as a result of higher levels of virtualization. You might think this would swing the balance back towards the spreadsheet, but in fact environments are more dynamic than ever. To spin up new VMs, determine physical or software-defined network connectivity and storage, and do so reliably, efficiently and with sufficient redundancy, both IT and facilities need to understand resource availability in real time.
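As a sketch of the kind of real-time check this implies (the resource figures, field names and thresholds below are invented assumptions, not any particular DCIM product’s API):

```python
from dataclasses import dataclass

# Sketch of a pre-provisioning check: before placing new load, confirm
# the target rack has power, cooling and port headroom right now.
# Field names and figures are invented for illustration.

@dataclass
class RackStatus:
    power_headroom_kw: float    # unused capacity from live PDU metering
    cooling_headroom_kw: float  # unused capacity from cooling telemetry
    free_network_ports: int

def can_place_load(rack: RackStatus, load_kw: float,
                   ports_needed: int = 1) -> bool:
    """True if the rack can absorb the new load."""
    return (rack.power_headroom_kw   >= load_kw and
            rack.cooling_headroom_kw >= load_kw and  # heat out ~= power in
            rack.free_network_ports  >= ports_needed)

rack = RackStatus(power_headroom_kw=1.2,
                  cooling_headroom_kw=0.9,
                  free_network_ports=4)
print(can_place_load(rack, load_kw=0.4))  # -> True
```

The check itself is trivial; what a spreadsheet cannot supply is live, trustworthy values for those headroom figures.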
The more complex the data center, the more obvious the value of DCIM over a spreadsheet. From auto-discovery of assets in the physical and IT layers, to creating a unified view of physical infrastructure from anywhere on the network, DCIM provides a common language for all data center stakeholders. And since DCIM provides insight into what’s going on inside the power and cooling systems and within physical and virtual machines, it steps into the gap to facilitate optimization of increasingly complex environments. But doing this in a way that suits today’s requirements probably means breaking up with some old favorites, like that knife, that no longer have any relevance to the modern toolkit.