I was in Vilnius this week, speaking at an Oracle Day 2011 conference. To me it’s always interesting to hear what my fellow speakers have to say, as well as to get insight from delegates – very often it’s their real-world experiences which we can take and use to make our product and service delivery better. In this case, though, it was what was being said from the platform that caught my attention: a speaker’s use of the phrase “business at the speed of thought”.
It really caught my imagination – a phrase which sums up what I believe we’re helping to facilitate with our DCIM data centre software. Now those of you who are well read probably realise straightaway that the speaker, an old Sun Microsystems guy who attributed much of what he was saying to Larry Ellison, was in fact quoting Bill Gates’ 1999 book, Business at the Speed of Thought.
A decade is a long time in software, but Gates’ book seems remarkably prescient. In his introduction, Gates writes: “If the 1980s were about quality and the 1990s were about reengineering, then the 2000s will be about velocity. About how quickly the nature of business will change. About how quickly business itself will be transacted. About how information access will alter the lifestyle of consumers and their expectations of business.”
If you’ve been around the IT business long enough, you’ll have heard countless rallying calls about the transformational possibilities of IT, about how big and important a thing it is, about how it’s going to make people’s lives really, really much better. Whether or not you subscribe to these ideas, there’s no denying that Bill Gates was onto something: the speed of change which IT brings has never been greater.
But in the data centre world, velocity has been something of a problem for us; the ongoing cycles of IT hardware and software innovation continue to challenge facilities with fixed space, power and cooling capacities. If the first ten years of the new millennium have been about playing catch-up and learning how to reliably deliver the sort of high-density environments suitable for compact and virtualised machines, the next ten years will be about delivering optimised environments for the IT load at the moment it’s required and for as long as it’s required.
And that’s where DCIM comes in. We no longer have the luxury of manually simulating deployment scenarios, poring over spreadsheets for hours on end, making decisions based on data that may not even be current. And we cannot afford to extend data centre changes over days or weeks in order to “feel out” the effects of one change before proceeding with the next.
Velocity dictates that operational decisions, be they introducing new equipment or reallocating capacity to cope with an increased virtual machine load, must be taken then and there. Second-generation DCIM systems can run through dozens of simultaneous “what-if” scenarios and give us instant answers to our data centre questions. DCIM allows us to optimise power, cooling, space and networking capabilities – or even dive into the CPU utilisation of individual servers – in order to maximise both availability and efficiency, all at the speed of thought…
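To make that “what-if” idea a little more concrete, here’s a minimal sketch of the kind of constraint check such a system runs across many candidate placements at once. It isn’t any particular vendor’s API – the class names, racks and figures below are all invented for illustration – but it shows the basic shape of the question: does this rack have the power, cooling and space headroom to take this device?

```python
from dataclasses import dataclass

@dataclass
class Rack:
    name: str
    power_kw: float    # remaining power headroom
    cooling_kw: float  # remaining cooling headroom
    space_u: int       # free rack units

@dataclass
class Device:
    name: str
    power_kw: float    # draw of the new device
    heat_kw: float     # heat it will add to the row
    size_u: int        # rack units it occupies

def fits(rack: Rack, device: Device) -> bool:
    # A placement is only viable if power, cooling AND space all have headroom
    return (rack.power_kw >= device.power_kw
            and rack.cooling_kw >= device.heat_kw
            and rack.space_u >= device.size_u)

def what_if(racks: list[Rack], device: Device) -> list[Rack]:
    # Evaluate every candidate rack in one pass; return the viable placements
    return [rack for rack in racks if fits(rack, device)]

racks = [
    Rack("A01", power_kw=3.2, cooling_kw=2.8, space_u=12),
    Rack("A02", power_kw=0.9, cooling_kw=4.0, space_u=20),
    Rack("B07", power_kw=5.5, cooling_kw=5.1, space_u=2),
]
new_server = Device("blade-42", power_kw=1.4, heat_kw=1.4, size_u=4)

for rack in what_if(racks, new_server):
    print(f"{rack.name} can take {new_server.name}")
# Only A01 qualifies: A02 lacks power headroom, B07 lacks space
```

A real DCIM engine layers far more onto this – live sensor feeds, network ports, failover policies – but the point stands: once the model lives in software rather than a spreadsheet, evaluating a dozen deployment scenarios takes seconds rather than days.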