When personal computers first came out, they had no internal storage. The operating system was on a floppy disk, called a boot disk, that you had to load at startup. The next step was to load your application, say WordStar, and any files you created were saved to floppy disks as well. The first hard drives made it easier and faster to boot the computer and store data internally, but they typically held just 10 or 20 MB, so you had to be very selective about which files and applications to keep on the computer.
Today we are facing a new but similar challenge. We are deploying smart, Internet-connected devices all over our buildings, factories, data centers, cities and even vineyards that are monitoring all kinds of systems and generating lots of useful data. This data helps our buildings use less electricity, makes our factories more automated, our data centers more efficient and reliable, the traffic in our cities flow better and our vineyards yield the optimal grapes for the most complex and satisfying wines.
The first challenge in enabling such capabilities is setting up a network to collect all of the Internet of Things data from your application or domain. This could be a single site or multiple sites. You will need to estimate how much data will be generated, whether you plan to store it and, if so, where. You'll also need to decide whether you will access the data constantly, occasionally, or perhaps only for a one-time event such as a lawsuit or a police investigation.
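A quick back-of-envelope calculation can anchor that estimate before you size storage or pick a provider. The sketch below uses entirely hypothetical numbers (sensor count, sampling rate, and record size are assumptions, not figures from any real deployment):

```python
# Back-of-envelope estimate of raw IoT data volume for one site.
# All figures are illustrative assumptions; substitute your own.
SENSORS = 500                 # devices at the site
READINGS_PER_HOUR = 60        # one sample per minute per sensor
BYTES_PER_READING = 200       # timestamp, device ID, value, metadata

readings_per_day = SENSORS * READINGS_PER_HOUR * 24
bytes_per_day = readings_per_day * BYTES_PER_READING
gb_per_year = bytes_per_day * 365 / 1e9

print(f"{readings_per_day:,} readings/day, ~{gb_per_year:.1f} GB/year raw")
# → 720,000 readings/day, ~52.6 GB/year raw
```

Even modest sites add up quickly across years and locations, which is why the keep-it-all versus keep-the-rollups decision matters early.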
Since IoT is relatively new and the analytics that power the real benefits are just getting off the ground, there is a great deal of uncertainty around the quantity and regularity of data flow, as well as what data to keep, for how long, and where geographically. Acquiring and analyzing data from multiple global sites adds further challenges. Given all these issues, it may be a good idea to start your IoT journey with a pay-per-use model through a cloud provider or a colocation provider that is skilled and experienced in setting up this kind of IoT aggregation and analytics point.
Let's look at a couple of examples for clarity. In a vineyard, grapes go through a complex growing cycle to develop their full character. In the past, you logged the rainfall, hours of sunshine and temperature, visually inspected the vines and grapes, and guessed at how much water and fertilizer to use. Today you can install sensors and software to figure it out for you. Think of it as VaaS, or "vineyard as a service." You will need to determine how many years of this data to keep for future reference and at what granularity: you may decide you want all of it, hourly data, daily data or just the rolled-up results.
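The "rolled-up results" option might look like the sketch below: keep only a daily min/max/mean per sensor instead of every hourly sample. This is a hypothetical illustration (the function name, data shape, and sample values are assumptions, not part of any real product):

```python
# Hypothetical sketch: roll hourly vineyard sensor readings up to daily
# summaries, so only min/max/mean survive if raw data is discarded.
from collections import defaultdict
from statistics import mean

def daily_rollup(readings):
    """readings: iterable of (day, value) pairs, e.g. hourly soil-moisture
    samples. Returns {day: {"min": ..., "max": ..., "mean": ...}}."""
    by_day = defaultdict(list)
    for day, value in readings:
        by_day[day].append(value)
    return {
        day: {"min": min(vals), "max": max(vals), "mean": mean(vals)}
        for day, vals in by_day.items()
    }

hourly = [("2024-06-01", v) for v in (18.2, 21.5, 24.1, 22.8)]
print(daily_rollup(hourly))
```

A rollup like this can shrink storage by an order of magnitude or more, at the cost of losing the ability to re-run finer-grained analytics later, which is exactly the trade-off the retention decision is about.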
Another example is the data center. In the past you had to make an educated guess at the temperature set points and cooling airflow settings that would maintain the highest availability and efficiency. Today you can meter your data center to the point that analytics can automate those set points and airflows. Again, you can keep all of the metering data or choose to keep just the high-level results of the computations.
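To make "analytics can automate those set points" concrete, here is a minimal sketch of the kind of rule such a system might apply. The target temperature, step size, and safe band are illustrative assumptions, not recommendations from any standard:

```python
# Hypothetical control rule: nudge a cooling set point toward a target
# average inlet temperature while clamping to a safe band (all values °C).
def adjust_setpoint(current_setpoint, avg_inlet_temp,
                    target=24.0, step=0.5, lo=18.0, hi=27.0):
    """Raise the set point when racks run cooler than needed (saving
    cooling energy), lower it when they run hot, and clamp the result."""
    if avg_inlet_temp < target - 1.0:
        current_setpoint += step      # room is cooler than necessary
    elif avg_inlet_temp > target + 1.0:
        current_setpoint -= step      # room is running hot
    return max(lo, min(hi, current_setpoint))

print(adjust_setpoint(22.0, 21.0))  # cool room, raise set point → 22.5
```

A real system would feed a loop like this from the metering data described above; whether you archive every input reading or only the resulting set-point history is the same retention choice as in the vineyard example.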
As the big data era grows and IoT devices multiply, it's important to understand what you want from this data and what your long-term goals are. It may be perfectly fine to keep a subset of analytic results on IT equipment in-house on a simple network. But with data networks getting more complex and data storage requirements growing, it may be a good idea to seek the help of a service provider early in your planning process.
Big data and the IoT together represent just one of the trends colocation providers are dealing with. To learn more, read the e-guide, “Opportunities and Threats to Colocation Providers from Around the Globe.” Based on a roundtable discussion with executives from colocation providers representing 16 countries, the guide provides insights into seven key trends.