The term “big data” is all the rage these days, and there is little question that organizations of all stripes have lots of data to deal with. But big data by itself does little good; the trick is making sense of the data you have. I call this challenge data coherency.
Here’s an example of the challenge. Say you’re the chief sustainability officer for a large organization. You have one report that shows your energy consumption has gone down 10% vs. the prior year but another that shows your carbon footprint has gone up. On the face of it, that doesn’t make sense; it isn’t coherent. When you dig into it, there may be a plausible explanation, such as differences in what data was used to create each report, and when and how it was collected.
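To make that concrete, here is a minimal sketch in Python showing one way such a discrepancy can arise. The numbers and the grid emission factors are illustrative assumptions, not real figures: if the electricity grid's emission factor rose between the two reporting periods, energy use can fall 10% while the reported carbon footprint still goes up.

```python
# Illustrative sketch only: figures and emission factors are assumptions.

def carbon_footprint(energy_kwh: float, grid_factor_kg_per_kwh: float) -> float:
    """Carbon footprint in kg CO2e for a given energy use and grid factor."""
    return energy_kwh * grid_factor_kg_per_kwh

# Prior year: 1 GWh consumed on a grid emitting 0.40 kg CO2e per kWh.
prior = carbon_footprint(1_000_000, 0.40)

# Current year: 10% less energy, but a dirtier grid mix (0.50 kg/kWh).
current = carbon_footprint(900_000, 0.50)

print(prior, current)  # 400000.0 450000.0 -> energy fell, footprint rose
```

Both reports can be internally correct; they simply rest on different underlying data, which is exactly why coherency has to be checked before the numbers are trusted.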
What customers want is a single version of the truth: something they can trust and use with confidence when making decisions. To get there, we must start with a framework that produces accurate and timely data. It does no good to pour bad data through an analytics engine, only to get bad analytics out. Yet too often, organizations chase the promise of analytics or regression analysis before ensuring they have a solid stream of data to feed into those tools.
Consider companies like Alcoa and Intel, which are connecting executive and employee compensation to sustainability and energy performance metrics. When that happens, everyone in the firm gets serious about data quality. Would you want your personal income tied to an information stream that was only 60% or 70% accurate? Yet many companies are managing their businesses on energy and sustainability information that is equally flawed.
Avoiding such a situation requires hardware and software working in concert to ensure the accurate collection and delivery of data. The hardware includes sensors and meters strategically located throughout the facility, while the software includes management tools that can raise alerts when things go wrong, such as a broken meter.
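As a rough illustration of that software side, here is a minimal sketch of such a check. The meter names, reading format, and flatline threshold are all hypothetical; a real monitoring tool would be far richer, but the core idea is the same: flag meters that have stopped reporting or are stuck on one value.

```python
# Hypothetical sketch of a software check that flags a broken meter.
# Meter names, reading format, and thresholds are illustrative assumptions.

def meter_alerts(readings, flatline_window=4):
    """Return alert messages for suspicious meter data.

    readings: list of (meter_id, [values...]) sampled at regular intervals.
    A meter that reports no data, or the same value for `flatline_window`
    consecutive samples, is flagged for inspection.
    """
    alerts = []
    for meter_id, values in readings:
        if not values:
            alerts.append(f"{meter_id}: no data received")
        elif len(values) >= flatline_window and len(set(values[-flatline_window:])) == 1:
            alerts.append(f"{meter_id}: flatlined at {values[-1]}")
    return alerts

print(meter_alerts([
    ("chiller-1", [41.2, 40.8, 41.5, 40.9]),   # normal variation
    ("ahu-3", [12.0, 12.0, 12.0, 12.0]),       # stuck sensor
    ("lighting-2", []),                        # dead meter
]))
# -> ['ahu-3: flatlined at 12.0', 'lighting-2: no data received']
```

Catching a dead or stuck meter quickly matters because every downstream report silently inherits the gap; an alert like this keeps the data stream trustworthy before it ever reaches the analytics layer.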
It’s unlikely you’ll be able to get such an all-encompassing solution from a single vendor. That’s why Schneider Electric forms partnerships with various industry players to put our pieces of the puzzle – including sensors and management software – together with solutions from our partners. As has been explained in previous posts such as this one by my colleague Alexis Grenon, we do the integration and validation work ahead of time so customers can be confident the solution will work as promised, and so they don’t have to do the heavy lifting that integration requires.
On top of that, along with our partners we have significant domain expertise in a number of vertical areas. That means we understand the challenges inherent in these verticals and come prepared to meet them, further speeding up the process.
There’s no question companies can extract value from big data. But it takes an interoperable set of hardware and software tools to ensure they can gain accurate, coherent insights from it. To learn a bit about some of the software solutions we have in this arena, click here.