Is DCIM Ready for the Public Cloud?


I feel as though I’m a bit young to be saying “before the internet, we used to do things this way…”, but as one of my colleagues in Schneider Electric’s smart city business points out, there’s a post-millennial generation of young people who have pretty much been born with smart devices in their hands. Nothing short of an apocalyptic event will ever cause them to think twice about the way they use the internet. It is as natural for them as swimming is for porpoises.

On the other hand, for those of us who grew up with the idea that owning a personal computer was neat, a few reservations still exist. There are trust issues: whilst older generations are cautious about sharing too much personal data with even well-proven services such as banking and payments, a recent study by the Ford Motor Company found that Millennials were nearly four times as likely to make mobile payments.

Privacy and the public cloud

Ford’s research suggests a very advanced level of trust in established internet platforms. The changing of the world’s age demographics may herald, or even encourage, a change in some of the ways in which we use and store data in our data centers. Today, 50% of the world’s population is under 25 years old; they have grown up with the internet, and they are used to living and organising their lives around smartphones. They also have a fundamentally different approach to the way that data is managed.


Millennial generations take a slightly more pragmatic view on data privacy. They seem to intuitively appreciate that quite often some give and take is required to create a balance between privacy and security. However, they also demand that any such trade-off must be completely transparent, and there must be a clear value to them. Millennials are also keener on exploring the opportunities that open data can provide to businesses and end users.

An interesting example of this can be seen where Transport for London (TfL) opened up its bus data to app developers. As a result of making its live bus information API feed public, over 60 smartphone apps have been independently developed. This is innovation on a scale that TfL itself simply did not have the resources to develop and trial internally.

The public cloud presents compelling opportunities to data center owners and operators. In the short term, the cost and time associated with initial deployment can be reduced; longer term, there’s the mitigation of the responsibilities and risks associated with software updates and patching. Add in what could be achieved through the aggregation and analysis of anonymised data, and it starts to look like a no-brainer.

The question for the data center market is whether enough trust can be built between vendors and customers for new services and better information and analytics to be delivered via a DCIM public cloud. If, for example, enough operators were prepared to upload data about IT server energy consumption at load, we could start to benchmark devices for the industry. We could start to plan capacity based upon real numbers rather than nameplate figures.
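To make that idea concrete, here is a minimal sketch, in Python, of how pooled measured-draw data might inform rack sizing more realistically than nameplate ratings. The field names and figures are hypothetical, not drawn from any real DCIM data set.

```python
"""Illustrative sketch only: the numbers below are invented, not real
benchmark data from any DCIM platform."""
from statistics import mean, quantiles

# Hypothetical anonymised submissions: measured wall power (watts) for one
# server model under typical production load, uploaded by many operators.
measured_watts = [312, 298, 305, 340, 287, 330, 310, 295, 325, 301]

nameplate_watts = 750        # rating printed on the PSU label
rack_budget_watts = 6_000    # power available to one rack

avg_draw = mean(measured_watts)
# The 95th percentile leaves headroom for peaks without assuming worst case.
p95_draw = quantiles(measured_watts, n=20)[-1]

servers_by_nameplate = rack_budget_watts // nameplate_watts
servers_by_measured = rack_budget_watts // p95_draw

print(f"Average measured draw        : {avg_draw:.0f} W")
print(f"95th percentile measured draw: {p95_draw:.0f} W")
print(f"Servers per rack (nameplate) : {servers_by_nameplate:.0f}")
print(f"Servers per rack (measured)  : {servers_by_measured:.0f}")
```

Even a toy comparison like this shows the gap between nameplate ratings and real-world draw that shared benchmarking data could expose at industry scale.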

We could look at industry norms for data centers. Heck, we could even look at how El Niño weather patterns influence the Atlantic hurricane season and make provision for potential disruptions along the Eastern seaboard 6 or 12 months down the line. Imagine being able to provide that as a customer service! If we were able to wind the clocks forward a decade, we’d probably see this stuff in operation as a matter of daily facility life. Speaking as a software vendor, my question is: what would it take to get your trust?
