Lots of companies are now using the Green Grid's Power Usage Effectiveness (PUE) metric to assess their power use and improve efficiency, and I applaud them for it. But eBay demonstrated the gains to be had by including PUE and other Green Grid metrics in the design and construction phases of a data center project, even when that project sits in the middle of the Arizona desert.
In 2010 eBay embarked on Project Mercury, an effort to consolidate 11 data centers into three, not only to save money but to ensure the company can keep up with changing technology cycles and growth for years to come. One of those three was a new data center to be built in Phoenix, Ariz.
Crucial to the effort was the decision to have IT and Facilities work closely together on the project and use a metrics-based approach to data center design and server procurement. One of those metrics was PUE, which was used in the RFP process. eBay took the further step of tying PUE to the total cost of ownership of its servers over their lifetime, to measure the total kilowatt-hours (kWh) of energy they would consume running eBay workloads. That gave eBay another important criterion with which to measure server vendor proposals, one that painted a clear picture of what it could expect to pay for the servers not just up front, but over their lifetime. Pretty smart.
So was having the Facilities and IT units under one organizational structure. When the folks who buy the servers and the folks who pay the power bills sit in the same organization, you're likely to get a more accurate TCO calculation, one that includes everything: the cost of the server itself, its lifetime energy consumption, real estate, and the power and cooling needed to support it. A rough sketch of that kind of calculation is below.
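Here is a minimal sketch of a PUE-weighted server TCO comparison of the sort that reasoning suggests. Every figure in it (purchase prices, power draw, electricity rate, space cost, and the two vendor proposals) is a made-up assumption for illustration; the case study doesn't publish eBay's actual inputs.

# Illustrative sketch of a PUE-weighted server TCO comparison.
# All figures below are hypothetical assumptions, not eBay's numbers.

HOURS_PER_YEAR = 8760

def lifetime_kwh(avg_server_kw, years, pue):
    """Energy the facility draws to run one server over its life,
    including the cooling/distribution overhead captured by PUE."""
    return avg_server_kw * HOURS_PER_YEAR * years * pue

def server_tco(purchase_price, avg_server_kw, years, pue,
               dollars_per_kwh, space_cost_per_year):
    """Lifetime cost: sticker price + PUE-weighted energy + floor space."""
    energy_cost = lifetime_kwh(avg_server_kw, years, pue) * dollars_per_kwh
    return purchase_price + energy_cost + space_cost_per_year * years

# Two hypothetical vendor proposals over a 4-year life at a design PUE of 1.2:
proposal_a = server_tco(purchase_price=5000, avg_server_kw=0.35, years=4,
                        pue=1.2, dollars_per_kwh=0.08, space_cost_per_year=150)
proposal_b = server_tco(purchase_price=4200, avg_server_kw=0.65, years=4,
                        pue=1.2, dollars_per_kwh=0.08, space_cost_per_year=150)
print(f"Proposal A lifetime cost: ${proposal_a:,.0f}")
print(f"Proposal B lifetime cost: ${proposal_b:,.0f}")

The point of the exercise is that the server with the lower sticker price can easily turn out to be the more expensive one once lifetime energy, weighted by PUE, is added in.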
eBay also incorporated principles outlined in the Green Grid's Data Center Maturity Model, which gives data center designers and operators a road map from current best practices to the best practices expected five years out. Using this model, eBay determined that 80 percent of its applications required only Tier II reliability, which freed up capacity in its more stringent Tier IV data centers, essentially extending their lives. It also helped cut the capital and operating expenses of the company's Tier II data centers by more than 50 percent.
One of the most interesting parts of Project Mercury was its goal of using free cooling year-round, with a design PUE of 1.2. Considering that a PUE of 1.0 is essentially perfect, meaning every watt goes to the IT equipment and none to cooling or power-distribution overhead, and that the data center sits in the desert, where you'd expect a hefty cooling bill, that goal pretty much dictated some out-of-the-box thinking.
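To put that 1.2 target in perspective, PUE is simply total facility power divided by the power delivered to the IT gear. The quick comparison below is my own back-of-the-envelope illustration; the 1.8 figure stands in for a conventional facility and is an assumption, not a number from the case study.

# Back-of-the-envelope look at what a PUE of 1.2 means for a 1 MW IT load.
# The 1.8 comparison point is an assumed figure for a conventional facility.

def facility_kw(it_load_kw, pue):
    """PUE = total facility power / IT power, so total = IT load * PUE."""
    return it_load_kw * pue

it_load_kw = 1000  # 1 MW of servers, storage, and network gear
for pue in (1.0, 1.2, 1.8):
    total = facility_kw(it_load_kw, pue)
    overhead = total - it_load_kw
    print(f"PUE {pue}: {total:.0f} kW total, {overhead:.0f} kW of cooling and distribution overhead")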
And that's what eBay got, especially with respect to the containerized data center modules that sit on the roof. Remember, they're in the desert, on the roof of a data center, baking in the sun. Yet the containers can still use a form of free cooling that does the job and contributes significantly to the low overall site PUE.
To learn more about what eBay has done, check out the case study on the Green Grid website. I think you'll agree it's a fascinating project and one that serves as a lesson for anyone constructing a new data center or renovating an existing one.