https://www.youtube.com/watch?v=RpsbSrPlop0
I was talking with Soeren Brogaard Jensen at London’s DatacenterDynamics Converged conference at the end of 2012 about the Data Center Genome project, which was announced during the show. The first and most obvious question to ask him was: what is it?
“The Data Center Genome is a way of describing the energy profiles of all of the equipment in the data center,” Soeren told me. I asked why that should matter, and he pointed to recent publicity that has shown the sector in a less than favourable light. He cited NYT journalist James Glanz, whose piece “Power, Pollution and the Internet” (published September 2012) claimed that “most data centers, by design, consume vast amounts of energy in an incongruously wasteful manner.”
The fact is that the data center sector consumes around 340TWh of energy – about as much as the whole of Italy. Recent research by DatacenterDynamics Intelligence indicates that consumption is only going to rise: between 2011 and 2012, power requirements grew by 63% globally to 38GW, up from 24GW in 2011. The DatacenterDynamics 2012 Global Census on data center trends estimates a further rise of 17% to 43GW in 2013. (The Census also showed that investment in data centers has grown globally by 22% to $105bn, up from $86bn in 2011, with investment projected to increase by a further 14% in 2013.)
The problem is that currently as many as half of the servers hosted in data centers aren’t actually doing anything. And this is where the New York Times definitely has a point – it’s like leaving your car running in the drive, with all of the energy cost, emissions and maintenance that requires, but for no useful purpose.
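To give a rough sense of what that idle half of the fleet might cost, here is a back-of-the-envelope sketch using the 340TWh sector figure from above. The idle power ratio (how much an idle server draws relative to a busy one) is my own illustrative assumption, not a figure from the article:

```python
# Back-of-the-envelope estimate of energy going to idle servers.
# Figures from the article: ~340 TWh/year sector-wide, ~half of servers idle.
# ASSUMPTION (illustrative only): an idle server draws ~60% of an active one.

total_twh = 340.0        # annual sector consumption (from the article)
idle_fraction = 0.5      # share of servers doing nothing (from the article)
idle_power_ratio = 0.6   # idle draw relative to active draw (assumed)

# Share of total energy attributable to idle machines:
# idle draw / (idle draw + active draw), weighted by server counts.
idle_share = (idle_fraction * idle_power_ratio) / (
    idle_fraction * idle_power_ratio + (1 - idle_fraction) * 1.0
)
wasted_twh = total_twh * idle_share

print(f"Idle servers' share of energy: {idle_share:.1%}")   # 37.5%
print(f"Estimated idle energy: {wasted_twh:.1f} TWh/year")  # 127.5 TWh/year
```

Under these assumptions, well over a third of the sector’s energy is keeping the “parked cars” running – which is the scale of waste the Genome project is aiming at.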
“We think that the industry needs to come together to solve this collective problem,” says Soeren Brogaard Jensen. His answer is to get the industry involved in its own social media project, a little like Wikipedia.
Today Wikipedia is the world’s largest encyclopaedia. It has over 15 million pages on almost any subject imaginable – from Lindsay Lohan to string theory. Many of these pages are available in more languages than the United Nations acknowledges exist. But ten years ago it didn’t even exist. Somehow, people from all over the world have collaborated to create a huge repository of knowledge, keeping it current and working together to ensure its veracity.
“What we want to do is apply this methodology to eliminating waste in the data center,” Soeren told me. “We reckon that we could save at least $10 billion every year between us. Right now we want to open the project up, so we’re inviting everybody to visit the website – www.datacentergenome.com – and add their thoughts as to what sort of tools we require in order to make the project a success.”
“We want you to tell us about the features and functions that you’d like to see on this website. Over the next three to four months we’ll start to add applications, open for everyone to use, based on that feedback.” Christmas is a time for giving, but it can also be a time for saving. Visit the website today to add your input – or sign up and follow the project on Twitter @dcgenome.