Challenges of Data Consolidation Part II: Choosing the Right Transfer Methodology

You may recall that last time we discussed data consolidation and focused on the importance of network connectivity. Moving into Part II of this series, we’ll cover the data transfer methodologies commonly used in the industry and the advantages and disadvantages of each.

The sheer number of data transfer methodologies reads like a Chinese food menu, and figuring out which “special flavor” works best for you can be a daunting task. Most are chosen based on ease of implementation, which can translate into lower cost, and that ease can vary with network and platform architectures. Here are a few of the typical options used in the industry today.

With web-based technology so prevalent today, web services are becoming the typical standard for sharing data within control and monitoring systems. Web services transfer data over the HTTP or HTTPS protocol, much the way a web browser retrieves text and images from a web site. One advantage of using web services is that the interaction can be closer to “real time”. One disadvantage appears when pushing data to a destination platform that is not ready to process it. This is a simple but critical error condition that needs to be accommodated in the design.
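As a rough illustration of that error condition, here is a minimal sketch of pushing a reading to a web service endpoint, assuming the third-party Python "requests" library. The URL, payload fields, and retry policy are hypothetical placeholders, not a specific product's API.

```python
# Minimal sketch: push a reading over HTTPS and back off if the
# destination platform is not ready to process the data.
import time
import requests

ENDPOINT = "https://example.com/api/readings"   # hypothetical destination
PAYLOAD = {"site": "plant-01", "sensor": "kw_main", "value": 412.7}

def push_reading(payload, retries=3, backoff_seconds=5):
    """Try to POST a reading; retry with backoff if the destination is not ready."""
    for attempt in range(1, retries + 1):
        try:
            response = requests.post(ENDPOINT, json=payload, timeout=10)
            response.raise_for_status()          # raises on 4xx/5xx responses
            return True
        except requests.RequestException as exc:
            # Destination unreachable or not ready: the error condition
            # the design must accommodate.
            print(f"Attempt {attempt} failed: {exc}")
            time.sleep(backoff_seconds * attempt)
    return False

if __name__ == "__main__":
    if not push_reading(PAYLOAD):
        print("Destination never became ready; queue the reading for later.")
```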

A resolution to that specific web services disadvantage is a flat file transfer. Transferring a flat file such as Comma-Separated Values (CSV) via email or FTP is common. This allows a system to export data via a typical reporting export feature to a directory on the internal network. A software process can then watch for new files, perform any translations, format the data for the receiving system, and send it to an internal or external FTP site. The receiving platform can import this data whenever it has spare bandwidth. This methodology also has cybersecurity benefits, since only an outbound path through the firewall is needed.
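The flat-file workflow above could look something like the sketch below, assuming Python's standard ftplib and a scheduled job. The directories, host name, and credentials are hypothetical placeholders.

```python
# Minimal sketch: pick up new CSV exports from a local directory
# and ship them to an internal or external FTP site.
from pathlib import Path
from ftplib import FTP

EXPORT_DIR = Path("/data/exports")       # where the source system drops CSV files
SENT_DIR = Path("/data/exports/sent")    # processed files are moved here
FTP_HOST = "ftp.example.com"             # hypothetical FTP destination

def ship_new_files():
    SENT_DIR.mkdir(exist_ok=True)
    ftp = FTP(FTP_HOST)
    ftp.login("transfer_user", "secret")  # placeholder credentials
    try:
        for csv_file in EXPORT_DIR.glob("*.csv"):
            # Any translation or reformatting for the receiving system
            # would happen here before the upload.
            with csv_file.open("rb") as handle:
                ftp.storbinary(f"STOR {csv_file.name}", handle)
            csv_file.rename(SENT_DIR / csv_file.name)   # avoid re-sending
    finally:
        ftp.quit()

if __name__ == "__main__":
    ship_new_files()   # typically run on a schedule, e.g. via cron
```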

The most basic but still very reliable method is to transfer data via existing open protocols such as Modbus, BACnet or SNMP. Typically, a system that reads data directly from equipment can act as a server and share that same data with a receiving client.

The client system can read that data over the open protocol just as if it were communicating with the equipment itself, and it can read from multiple disparate systems to consolidate them into a single platform. The method that’s right for you may not be right for others; choose an implementation that works best for your system architecture.
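For the client-side read described above, a minimal sketch might look like the following, assuming the third-party pymodbus library (3.x) and a Modbus TCP server at a hypothetical address. The register addresses are illustrative only.

```python
# Minimal sketch: read holding registers from a system acting as a
# Modbus TCP server, just as if it were the equipment itself.
from pymodbus.client import ModbusTcpClient

SERVER_IP = "192.168.1.50"   # hypothetical system acting as a Modbus server

def read_consolidated_points():
    client = ModbusTcpClient(SERVER_IP, port=502)
    if not client.connect():
        raise ConnectionError(f"Could not reach Modbus server at {SERVER_IP}")
    try:
        # Read two holding registers starting at address 0 (e.g. kW and kVA).
        result = client.read_holding_registers(0, count=2)
        if result.isError():
            raise IOError("Modbus read failed")
        return result.registers   # raw 16-bit register values
    finally:
        client.close()

if __name__ == "__main__":
    print(read_consolidated_points())
```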

The long and short of it is, you want to know exactly what combination plate No. 3 is going to deliver for your business before you order it, and ensure that the fortune cookie delivered with it predicts security and prosperity for your facility for years to come.

Next time, we’ll wrap up the discussion with a look at the intricacies of data normalization and the ETL process as a final hurdle to overcome in the quest for data consolidation.
