How IBM Helps Companies Build More Cost-effective, Intelligent Data Centers


Schneider Electric has been partnering with IBM to deliver data center infrastructure for quite some time now, so it was great to catch up with some of the IBM principals involved in the effort at the recent Gartner Data Center Conference in Las Vegas.

Steve Sams, for example, is VP of Global Site and Facilities Services for IBM Global Technology Services. He noted that the IBM-Schneider Electric relationship started about 5 years ago with IBM implementing Schneider's scalable modular data center products. “We’ve now installed them for about 500 customers around the world and Schneider is our primary partner for that solution,” Sams says.

Scalable data centers address a couple of key trends that Sams is seeing in the market. One is that customers are realizing that data center expenses extend far beyond the cost of actually building the facility.

“Typically, for every dollar you spend to build a new data center, you’re spending about 5 dollars over the next 20 years to run and maintain that data center,” Sams says. “We’re trying to encourage clients to look at the total cost of ownership of the kind of data center infrastructure they’re considering today.”
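That rule of thumb can be turned into a quick back-of-the-envelope calculation. The sketch below is purely illustrative, using the 1:5 build-to-operate ratio Sams cites; the build cost figure is a made-up example, not IBM data.

```python
def total_cost_of_ownership(build_cost, opex_ratio=5.0, ):
    """Estimate lifetime data center cost: the build cost plus
    roughly 5x that amount in operations and maintenance over
    ~20 years, per the rule of thumb cited in the article."""
    operating_cost = build_cost * opex_ratio
    return build_cost + operating_cost

# Example: a $10M build implies roughly $60M of total lifetime cost
print(total_cost_of_ownership(10_000_000))  # 60000000.0
```

Seen this way, even a modest reduction in operating efficiency compounds far beyond the construction budget, which is why Sams pushes clients toward total cost of ownership rather than build cost alone.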

Scalable, modular infrastructure allows customers to build a data center that fits their requirements today but can be easily expanded to meet future requirements. That means customers don’t have to build and pay for today what they won’t need for 10 or 20 years.

“And those requirements aren’t just about size, they’re also around technology,” he notes. “The technology we had 20 years ago doesn’t look anything like the technology we have today. So designing these environments to really be flexible for not just growth demands but also some pretty dramatic change in technology that will occur over the next 20 years is pretty important.”

For example, as data centers become more dense as a result of virtualization, they require more localized cooling systems – technology that may not have existed 5 or 10 years ago. Modular systems allow for implementation of such newer technologies as they develop.

That’s an important consideration as companies begin moving to cloud technology. “The challenge for most cloud environments is the unpredictability of it,” Sams says. The modular approach again helps in that regard, enabling companies to meet whatever demands cloud may bring.

Another key piece of the puzzle that IBM brings to the table is advanced analytics. At the Gartner event, Sams gave a presentation titled, “Watson and Your Data Center,” in which he discussed the role of the technology behind IBM’s Watson computer – the one that famously bested two former Jeopardy! champions last year.

“Analytics are now at a stage where they can be used to help customers make better decisions, automate tasks that they may have done [manually] before, and we’re bringing that analytics capability into the data center,” Sams says.

When it comes to analytics, IT executives have been like the shoemaker’s children, he says: they provide great analytical tools to business executives but haven’t had great tools for themselves.

“This year we’ve announced 20 new services focused on the data center that have 28 different analytical tools embedded in them,” Sams says. The tools help companies plan a more effective data center and address issues such as capacity planning. “We’re bringing these tools to help IT executives have the same level of intelligence around their own business to drive efficiency, to align their strategies better with the business strategies, to improve their operations the same way analytics have been applied to the core business itself.”
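To make the capacity-planning idea concrete, here is a minimal sketch of the kind of projection such a tool might perform: fitting a linear trend to monthly power readings and estimating when the room hits its limit. This is a hypothetical illustration, not one of IBM's actual analytical tools, and the numbers are invented.

```python
def months_until_capacity(usage_kw, capacity_kw):
    """Fit a simple least-squares trend line to monthly power
    readings and estimate how many months remain before the
    facility's power capacity is reached. Returns None if usage
    is flat or falling."""
    n = len(usage_kw)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(usage_kw) / n
    slope = (
        sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, usage_kw))
        / sum((x - mean_x) ** 2 for x in xs)
    )
    if slope <= 0:
        return None
    return (capacity_kw - usage_kw[-1]) / slope

# Example: usage growing ~10 kW/month toward a 500 kW room
print(months_until_capacity([400, 410, 420, 430], 500))  # 7.0
```

A real planning tool would of course model seasonality, virtualization-driven density changes, and cooling headroom, but the basic question, how long until we run out, is the same.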

To hear more of what Sams had to say, click on the video at the top of this post.
