Baselining as a Common Denominator in Data Center Improvement

The often-heard response to technology advice—“yes, but we’re different”—does have a lot of truth to it when it comes to data center improvements.

Let’s start with the drivers or motivations for improvement. For many companies, escalating energy costs are a major driver for data center improvement, but that’s not the case for every company. For a company about to acquire another organization, the key improvement concerns are probably around capacity and consolidation issues. A different company might need to add capacity driven by a new business model like mobility, while another needs to modernize its current infrastructure with a retrofit project.

Yet another company might be looking at data center services as a means of better managing its carbon footprint, while another wants to take steps to comply with a government program on energy efficiency, such as NABERS in Australia, or the BCA-IDA Green Mark for Existing & New Data Centres in Singapore.

Similarly, there are a variety of strategies one can take to achieve data center improvement. One can make incremental changes to an existing data center; undergo a major retrofit; offload demand by taking more systems to the cloud; use a colocation data center; consolidate data centers or build a new one; or even outsource the entire data center function. With all those choices, it can be hard for the average data center owner/operator to sort out a common-sense approach to improvement.

But there is one common denominator in the midst of all these differences: the importance of baselining your operations. To decide which path makes sense for alleviating your particular pain points, you need to know your current performance and capacity constraints, and to pinpoint issues such as hot spots or power vulnerabilities.
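To make the idea concrete, one of the most widely used baseline metrics is power usage effectiveness (PUE): total facility power divided by IT equipment power. The short Python sketch below illustrates the basic arithmetic; the readings and the energy-weighted averaging approach are illustrative assumptions, not output from any particular assessment or DCIM product.

```python
# Minimal sketch of a baseline PUE calculation.
# All readings are hypothetical; in practice they would come from
# metered feeds or a DCIM tool, sampled over a representative period.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Example hourly readings (kW): (facility total, IT load)
readings = [
    (950.0, 520.0),
    (1010.0, 560.0),
    (980.0, 540.0),
]

# Take the ratio of summed energy rather than averaging per-sample PUE,
# so heavily loaded hours are weighted appropriately.
total_facility = sum(f for f, _ in readings)
total_it = sum(it for _, it in readings)
print(f"Baseline PUE: {pue(total_facility, total_it):.2f}")
```

A one-time assessment produces a snapshot of numbers like this; a DCIM deployment keeps recomputing them continuously, which is what makes trends and anomalies visible.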

One way to get a comprehensive baseline of a data center is to use an assessment service, such as an EnergySTEP for Data Centers engagement. Another route is to deploy a data center infrastructure management (DCIM) solution, which, when properly implemented, provides continuous, online baselining. An assessment, by contrast, is a one-time baseline, though it can be followed up with further services to gauge progress. Either way, the important thing is to start baselining.

While it’s true that assessment services tend to bring some common benefits such as the potential to cut energy costs, it’s also fair to say that trimming the energy bill isn’t the only reason a company would want an assessment. Again, it all depends on what the core driver for improvement is for a particular company.

In one recent assessment service Schneider Electric performed for a multi-billion-dollar company, the main thing the customer needed from the report was a solid plan for managing its capacity over the next several years to meet expected demand. The company wanted energy efficiency too, but above all it needed to quantify exactly when to implement new technologies to boost capacity.

Using data from baselining and DCIM capacity planning tools, we were able to show them what they needed to phase in to meet the expected demand curve. Initially, it was smaller investments such as additional uninterruptible power supply (UPS) capacity, ramping up to bigger steps such as a new cooling unit, and later transitioning to modular "pod"-style infrastructure. They now have a quantifiable plan to move forward with, though of course it won't suit every company out there.
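As a rough illustration of that kind of phase-in analysis, the sketch below projects a demand curve against installed capacity and reports when each planned upgrade would be triggered. The growth rate, load figures, upgrade sizes, and 80% trigger threshold are all hypothetical assumptions, not figures from the engagement described above.

```python
# Minimal sketch of a capacity phase-in analysis.
# All figures below are hypothetical illustrations.

START_LOAD_KW = 400.0        # current IT load
GROWTH_RATE = 0.15           # assumed 15% annual demand growth
CURRENT_CAPACITY_KW = 600.0  # installed capacity today

# Planned upgrade steps, in order: (description, added capacity in kW)
upgrades = [
    ("Additional UPS capacity", 150.0),
    ("New cooling unit", 250.0),
    ("Modular pod infrastructure", 500.0),
]

load = START_LOAD_KW
capacity = CURRENT_CAPACITY_KW
queue = list(upgrades)

for year in range(1, 11):
    load *= 1 + GROWTH_RATE
    # Trigger the next planned upgrade when projected load
    # exceeds 80% of installed capacity.
    while queue and load > 0.8 * capacity:
        name, added = queue.pop(0)
        capacity += added
        print(f"Year {year}: {name} "
              f"(+{added:.0f} kW, capacity now {capacity:.0f} kW)")
```

Real planning tools account for stranded capacity, redundancy levels, and cooling as a separate constraint from power, but the underlying question is the same: given the demand curve, when does each investment need to land?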

Finding the improvement strategy that fits a particular company takes careful analysis. It really does vary by company, but what everyone has in common is the need to start baselining, so they can determine where they are today and what best suits their needs, business plans, and environment.
