Customers often ask me, “What’s the real value of having a Data Center Infrastructure Management (DCIM) solution like EcoStruxure™ IT?” Or “Why should I invest in a management tool like this, what’s in it for me and my business?”
These are fair questions, and depending on where you are in your data center lifecycle or hybrid IT infrastructure journey, the answer will align with your short- and long-term business plans and objectives. The simple answer is information. However, ask any data center or IT manager what information they are using to manage their operations and invariably the responses are many and varied.
Often, customers assume that monitoring their infrastructure and devices is all that is needed to manage their data centers and provide the information needed to make good decisions. This assumption is based on the premise that the data we are collecting, aggregating, and using for reporting is sufficient to make good, quality decisions. However, best practice and experience have taught us that monitoring and collecting data isn't enough; many managers still fall short on making, or more importantly sustaining, consistent, quality, and timely decisions. And sometimes we collect too much data, so what is the right balance for our business and the people who use this data?
In the beginning, there was monitoring
Yes, monitoring is required, but it's only the beginning of a process and a series of planned actions needed in today's highly fluctuating and demanding IT and data center business environment.
Historically, the action of monitoring was a manual task requiring the use of tools to measure performance indicators like temperature and air speed. Monitoring is defined as: “to observe and check the progress or quality of (something) over a period of time” (Source: Dictionary.com – Oxford Languages). The interpretation of the data and what it means was left to the data center or IT manager (based on their skills and experience), as was the task of applying “useful meaning” to the data and deciding how it would be used.
Monitoring in and of itself is a reactive process as the state of the measurement has already happened by the time we receive it or interpret its meaning and usefulness. So, the question is: how do we become more proactive in the ways we manage our IT and physical infrastructure? We started with monitoring a single computer, then we moved that computer into data centers where we now monitor racks of IT compute, associated assets, and the storage of more and more data. This transformation has led to an oversupply of data, which has driven up energy consumption, operational costs, and business risk.
Monitoring was good, then came automation
As the rate of technological change increased across the various IT ecosystems, driven by ever-increasing customer and business compute, storage, and application demands, the need to “monitor” more assets (in data centers and remote sites) and report on more operational KPIs (key performance indicators) became greater. IT metrics and macro data center infrastructure measurements were the first to receive software and system applications that independently (siloed) automated the collection, aggregation, and workflow management of their specific data. Collecting ever-increasing quantities of data meant that we, as humans, could no longer comprehend all of it, nor how it was to be interpreted and used. This, of course, was premised on the need for smart devices, protocols, a network, and software to make it all work.
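To make that idea concrete, the sketch below shows, in very simplified form, what siloed collection automation might look like: a scheduled loop that polls a set of devices, rolls the readings up into site-level figures, and flags anything outside a threshold. The device names, the poll_device stub, and the threshold are hypothetical placeholders for illustration only, not part of any EcoStruxure IT API.

```python
import random
import time
from statistics import mean

# Hypothetical device inventory; a real collector would discover these
# over the network via protocols such as SNMP or Modbus.
DEVICES = ["rack-01-pdu", "rack-01-crac", "rack-02-pdu"]
TEMP_ALARM_C = 27.0  # illustrative threshold, not a vendor default

def poll_device(device_id: str) -> dict:
    """Placeholder for a protocol-specific read (SNMP GET, Modbus, etc.).
    Here we simply simulate a temperature and power reading."""
    return {
        "device": device_id,
        "temp_c": round(random.uniform(20.0, 30.0), 1),
        "power_kw": round(random.uniform(2.0, 6.0), 2),
    }

def collect_once() -> None:
    readings = [poll_device(d) for d in DEVICES]
    # Aggregation: roll individual readings up into site-level figures.
    print(f"avg temp: {mean(r['temp_c'] for r in readings):.1f} C, "
          f"total power: {sum(r['power_kw'] for r in readings):.2f} kW")
    # Simple workflow step: raise an alert for out-of-range devices.
    for r in readings:
        if r["temp_c"] > TEMP_ALARM_C:
            print(f"ALERT: {r['device']} at {r['temp_c']} C")

if __name__ == "__main__":
    for _ in range(3):  # a real collector would run continuously on a schedule
        collect_once()
        time.sleep(1)
```

Even a toy loop like this makes the limitation obvious: each silo produces its own stream of numbers and alerts, and a human still has to interpret what they mean together.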
Ever-increasing dependencies among physical facilities infrastructure availability, IT infrastructure operational workloads, and application performance necessitated a new, holistic approach to monitoring, managing, and reporting. More data was being generated “up the IT stack”, which created more demand for IT compute, power, cooling, energy, space, capacity, people skills, and experience to maintain availability and meet more stringent service levels. In addition, more and more disparate systems were being connected and integrated across domains and geographical locations, adding to the complexity of data gathering and interpretation.
The DCIM management paradigm
We needed a better way of consolidating data sources and interpreting the various data types to provide greater insights to managers and business users. Was there a solution or system that could combine physical facility or remote infrastructure data with that of IT infrastructure and equipment performance data?
Greater analysis of the data was also needed, as the increase in the quantity of data was overwhelming current operational practices and systems. The quality of the data was also an issue, as the divide between relevant data for decision-making and disparate data metrics from macro infrastructure (e.g., facilities power, cooling) and micro IT metrics (e.g., PC power utilization, CPU utilization, rack space) became larger.
While advances in IT technology, and in the systems to manage it, progressed at an ever-increasing rate, advances in information science on the process of transforming data to information to knowledge and then wisdom were also well underway, though not widely recognized. The origins of this thinking are uncertain; many authors believe the idea was first conceptualized by the poet T.S. Eliot in his “Choruses” from the 1934 play The Rock. However, in 1924, Clarence W. Barron addressed his employees at Dow Jones & Company, referring to the “Knowledge, Intelligence, Wisdom hierarchy.” Subsequent studies have defined the Data-Information-Knowledge-Wisdom hierarchy as referring to at least five separate and distinct knowledge models.
So, you might ask, “What does this have to do with DCIM and the value of EcoStruxure IT Software?” To answer, let me refer to Jennifer Rowley’s 2007 study of the various models and their authors’ explanations, “The wisdom hierarchy: representations of the DIKW hierarchy,” published in the Journal of Information Science. Following her review of the DIKW definitions given in textbooks, Rowley characterizes data “as being discrete, objective facts or observations, which are unorganized and unprocessed and therefore have no meaning or value because of a lack of context and interpretation.”
If we think of data as the “raw facts” with no context, and information as the result of applying actions and interpretation to that raw data, then we are transforming data into usable, decision-making value. EcoStruxure IT solutions such as Data Center Expert, IT Expert, and IT Advisor consolidate data from monitored devices and connected disparate systems, then represent that data in the context of the device’s own environment, data center, or hybrid IT domain. In addition, this information can be further transformed into knowledge through the aggregation of data from various sources like facilities power, cooling and space, and IT data such as compute power, rack space and network capacity, expected operational life, and financial management data. Furthermore, with the use of automation and AI, managers can move from basic, reactive monitoring to proactive knowledge management and operational performance improvement.
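As a purely conceptual sketch of that data-to-information-to-knowledge progression (not EcoStruxure IT code), the example below shows a raw reading gaining meaning once it is placed in the context of its device, location, and an operating threshold, and several contextualized readings being aggregated into a higher-level recommendation. All names, locations, and limits here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Reading:              # "data": a raw fact with no context
    device: str
    temp_c: float

@dataclass
class Observation:          # "information": the same fact placed in context
    device: str
    location: str
    temp_c: float
    over_threshold: bool

INLET_LIMIT_C = 27.0        # illustrative operating limit, assumption only
LOCATIONS = {"pdu-7": "Row A / Rack 12", "crac-2": "Row A / cooling"}

def contextualize(r: Reading) -> Observation:
    """Attach location and threshold context to a raw reading."""
    return Observation(
        device=r.device,
        location=LOCATIONS.get(r.device, "unknown"),
        temp_c=r.temp_c,
        over_threshold=r.temp_c > INLET_LIMIT_C,
    )

def recommend(observations: list[Observation]) -> str:
    """'Knowledge': aggregate information across sources to guide action."""
    hot = [o for o in observations if o.over_threshold]
    if not hot:
        return "No action required."
    rows = ", ".join(sorted({o.location for o in hot}))
    return f"Investigate cooling or redistribute load near: {rows}"

readings = [Reading("pdu-7", 29.5), Reading("crac-2", 22.1)]
print(recommend([contextualize(r) for r in readings]))
```

The point of the sketch is the shape of the pipeline, not the specifics: the same reading is worth little on its own, more once it has context, and most when it is combined with other sources to support a decision.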
Value for EcoStruxure IT customers and partners
Schneider Electric’s EcoStruxure IT Software and Services sit at the center of knowledge management, helping customers better understand and operate their IT infrastructure. Our holistic, vendor-agnostic approach ensures that the device performance data captured through monitoring is transformed into valuable, usable, and appropriate information for improved decision-making. Couple this information with advanced dashboards, information portals, and business intelligence systems, and you create a management system that adds real business value and improves operational performance and cost management.
Start your DCIM and knowledge management journey with EcoStruxure IT
Hybrid IT infrastructure and the evolution of “data centers everywhere” are driving the need for more data. Alongside that need is another: the need to respond faster and create better information to manage your IT infrastructure more efficiently and cost-effectively.
Deploying a DCIM solution like Schneider Electric’s EcoStruxure IT Software applications is becoming more critical. Information is at the core of DCIM, but knowledge is better, and improved human understanding should be the goal. Getting the right balance between data, information, knowledge, and understanding is the art.
Contact us today at se.com/dcim to organize a complimentary consultation on your data center or hybrid IT infrastructure management requirements.