The use of data science to bring more precision and predictive power to the way we manage data center physical infrastructure (DCPI) is shaping up as the best way to respond to rapidly changing demands brought on by big shifts in information technology (IT) like digital commerce, mobile apps, and cloud computing.
All of these trends can cause the demands on the IT and DCPI layers of a data center to swing wildly. As I’ve been writing about in recent posts, data science is shaping up as the remedy for this volatility because it allows operators to make better sense of what is happening in their data centers and to model, predict, and adapt to change.
Data science can be defined as the extraction of useful information from a data set. It combines computing techniques with mathematical and statistical knowledge to extract insights. In data centers, data science can be applied to questions such as how to adjust power and cooling capacity to meet a rapid shift in demand. When combined with the use of data center infrastructure management (DCIM) software and domain expertise in power and cooling, data science can help a data center adapt to volatility without wasting resources.
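As a toy illustration of "extracting useful information from a data set" in this context, the sketch below flags racks whose recent power draw is approaching provisioned capacity, a signal an operator could use to adjust power and cooling. The rack names, capacity figure, and 85% threshold are all illustrative assumptions, not values from any real DCIM system.

```python
from statistics import mean

# Assumed values for illustration only.
RACK_CAPACITY_KW = 10.0   # hypothetical provisioned capacity per rack
ALERT_THRESHOLD = 0.85    # flag racks above 85% average utilization

def racks_needing_attention(readings_kw):
    """readings_kw: dict mapping rack id -> list of recent power samples (kW).
    Returns racks whose average utilization exceeds the alert threshold."""
    flagged = {}
    for rack, samples in readings_kw.items():
        utilization = mean(samples) / RACK_CAPACITY_KW
        if utilization > ALERT_THRESHOLD:
            flagged[rack] = round(utilization, 2)
    return flagged

readings = {
    "rack-A1": [8.9, 9.1, 9.0],   # running close to capacity
    "rack-B2": [4.0, 4.2, 3.9],   # plenty of headroom
}
print(racks_needing_attention(readings))  # only rack-A1 is flagged
```

A real deployment would pull these samples from DCIM telemetry and combine them with domain knowledge about cooling, but the principle is the same: turn raw readings into a decision-ready signal.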
Trouble is, there aren’t enough data scientists—a trend noted in research from McKinsey Global Institute. And not only is there a shortage of pure data scientists, but also of domain experts adept at interpreting the analytics. As a result, there is a real talent shortage when it comes to applying data science to data centers. That’s why Schneider Electric has developed a services bureau approach to meet this need. The services bureau enables our suite of “Asset Connect Services,” which you can find out about in this press release.
But what should a company look for under a services bureau approach to data-driven decision making? There are numerous capabilities needed, but here are a few of the top ones:
- A team that combines highly skilled data scientists with domain and technical experts in power and cooling systems, DCIM, field services, as well as IT operations professionals who are certified in Information Technology Infrastructure Library (ITIL) practices.
- The ability to closely monitor critical systems using DCIM, and use DCIM to model and manage capacity.
- An established methodology, not only for aggregating and analyzing structured data from systems such as DCIM, virtualization, or building management systems, but also for considering unstructured data such as reports from energy audits or field service calls.
- A proven method for continuous improvement. There should be procedures in place for benchmarking performance, planning improvements or taking corrective actions, followed by review and further assessment.
There are other specific functions a services bureau can carry out. For example, data modeling that employs data science and advanced analytics to predict needed cooling capacity, or to identify the systems most prone to failure. Many companies also need help managing and monitoring a data center, making use of services such as 24/7 remote monitoring. As a result, a services bureau should be modular, allowing a client company to pick what it needs.
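To make the predictive-modeling idea concrete, here is a minimal sketch that fits a linear trend to historical IT load and projects the cooling capacity to plan for next month. The monthly figures and the 1.3 cooling overhead factor are made-up assumptions for illustration; a real engagement would use richer models and actual telemetry.

```python
def fit_trend(xs, ys):
    """Ordinary least squares fit: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Illustrative historical data: average monthly IT load in kW (assumed).
months = [1, 2, 3, 4, 5, 6]
it_load_kw = [100, 104, 108, 112, 116, 120]

slope, intercept = fit_trend(months, it_load_kw)
next_load = slope * 7 + intercept          # projected month-7 IT load
COOLING_FACTOR = 1.3                       # assumed cooling-to-IT-load ratio

print(f"Projected month-7 IT load: {next_load:.0f} kW")
print(f"Cooling capacity to plan for: {next_load * COOLING_FACTOR:.0f} kW")
```

Even a simple trend like this moves an operator from reacting to capacity crunches toward planning for them, which is the core promise of data-driven decision making here.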
A data science offering for the data center should be able to deliver on the skills and capabilities mentioned. Beyond that, the company providing the services needs three core pillars to be world-class: expertise, technology, and data science.
The bottom line is that a new approach is needed. Major shifts such as mobile broadband have a whiplash effect on IT infrastructure that can be difficult to adjust to without better monitoring, predictive analytics, and continuous improvement. If data science isn’t tapped, many operators, even in newer data centers, are stuck in fire-fighting mode and will see the efficiency gains they first realized with a new facility erode. A services bureau offering can deliver the monitoring, measurement, and data-driven decision making needed to keep efficiencies steady, even in the face of rapid change.