Reporting on the recent “Business of Cloud, Data Center and Hosting Summit” run by analyst firm 451 Research, journalist Martin Banks defined micro data centers as “small systems designed to provide either specific, specialist services or temporary additional resource at points in the enterprise network where they are needed.”
At the recent DatacenterDynamics event, I asked Kevin Brown, Vice President of Data Center Technology and Strategy at Schneider Electric, about the sorts of applications for which customers should be using micro data center solutions. Take a look at the interview here.
“It’s impossible to have this conversation,” he told me, “without talking about the trends of Edge Computing and the Internet of Things. There are certain applications that have always existed and had a lot of data – I think of industrial process control as a good example.
“Process data has tended to utilize proprietary networks, but what’s really changing today is the deployment of internet technologies throughout industry. As a result, all that process information is now being uploaded to the Cloud – whether it’s a private Cloud, a public Cloud or what have you – but to make the best use of the data and analytics to increase productivity and manage costs, these types of applications need to be run very close to the load.
“This is very much being driven by bandwidth and latency considerations. Suddenly the process and control manager is having a new conversation – do I put my application in the Cloud or do I keep it locally? Depending on the amount of data that I’m dealing with, it might be cheaper to deploy physical infrastructure than it is to pay for the bandwidth back and forth.
“So we see some applications like industrial process control where processing is going to have to happen very close to the load; but then others like content distribution networks want to put data close to the point of consumption simply because of the bandwidth considerations.
“Content distribution is itself driving more data towards the edge, but we’re very much in this conversation with customers about what is the cost trade-off between locating physical infrastructure close to the point of consumption versus the bandwidth cost when the data is stored more centrally. Latency and bandwidth are going to be the core drivers in these conversations as we go forward.”
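The trade-off Brown describes can be sketched as a back-of-envelope calculation: the monthly cost of shipping data to a central cloud grows with volume, while the amortized cost of a local micro data center is roughly flat. The figures and function names below are purely illustrative assumptions, not Schneider Electric or cloud-provider numbers.

```python
# Hypothetical comparison of the two options in the interview:
# stream process data to a central cloud vs. deploy infrastructure on site.
# All prices and costs below are made-up, illustrative assumptions.

def monthly_cloud_cost(tb_per_month, egress_per_gb=0.09, storage_per_gb=0.02):
    """Bandwidth plus storage cost of moving data to a central cloud."""
    gb = tb_per_month * 1024
    return gb * (egress_per_gb + storage_per_gb)

def monthly_edge_cost(capex=120_000, amortization_months=60, opex_per_month=1_500):
    """Amortized cost of local (micro data center) infrastructure."""
    return capex / amortization_months + opex_per_month

for tb in (5, 50, 500):
    cloud, edge = monthly_cloud_cost(tb), monthly_edge_cost()
    cheaper = "edge" if edge < cloud else "cloud"
    print(f"{tb:>4} TB/month: cloud ${cloud:,.0f} vs edge ${edge:,.0f} -> {cheaper}")
```

Under these assumed numbers, the cloud wins at low data volumes and the local deployment wins once monthly volume grows, which is the crossover conversation the interview points to.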
Various surveys over the last few years have highlighted the skills shortages in both the data center and IT sectors. I wondered whether the micro data center trend represents a type of simplified infrastructure “appliance”, matching the integrated hardware appliances that have emerged in the IT market. I asked Kevin whether micro data center solutions start to answer some of that requirement.
“I think it’s a very interesting question,” said Kevin, “because the industry appears to be moving towards a place where you’re going to have generic blocks of computing that are getting deployed and a lot of Cloud technologies… it’s really trying to get it to where you just have this generic hardware/compute platform that you can deploy any application on.
“If we do this right, it really should simplify what people need to do. You can design it once; you can deploy these anywhere – they’re very standardized. Hopefully, they will have standardized IT inside of them and therefore that really does simplify testing and maintenance. Instead of having to integrate solutions from many different IT software and physical infrastructure vendors, you can start viewing these just as a consolidated appliance that can be quickly and efficiently deployed.
“Micro data centers are a great example of the big investment that’s being made in prefab data centers. We’re really trying to get things to be standardized blocks of compute that are getting deployed. When done correctly, this will minimize and simplify the training requirements because everything has been pre-manufactured and fully tested.”
Consolidation has been a major trend in the data center industry for a number of years, but all of a sudden analysts and manufacturers are forecasting a proliferation of data centers as edge computing applications start to answer the requirements of the IoT. I asked Kevin what he thought the answer to an increasingly complex infrastructure landscape might be.
“In the industry we always like to talk as if things are going in one direction or the other,” he said. “It’s either consolidating or becoming more distributed! What we think is happening is that we’re going to have both. So there’s going to be a mix of things that you want locally and things that you want centrally.
“Now what’s interesting about this wave of edge computing data centers is that it does introduce the question of how I manage these. The tools that are available now are much more sophisticated than those that existed 10 years ago, or even five years ago. So the progress that’s been made on the ability to control and understand what’s happening locally, and the connection between the physical infrastructure and IT, is much better, and we think it’s a key enabler to allow people to deploy data centers effectively.”
As with anything located remotely, the ability to be able to configure, troubleshoot and update without physically visiting the site is going to save cost and time. For those providing content to choosy customers, it’s also going to be critical to reputation. When it comes to the infrastructure that supports these services, Schneider Electric support will not necessarily end when the equipment is handed over: “As a solutions provider we have the ongoing capability to help customers with additional support to help remotely monitor the data center and manage the compute devices as well,” said Brown.
To learn more about edge computing trends and their potential applications for your business, take a look at this free white paper, “Drivers and Benefits of Edge Computing”.