Insights from the Factory Floor
Matthew C. King | FogHorn Systems
Reprinted with permission from FogHorn Systems:
While the notion of edge analytics is still relatively new even to those in the IIoT community, a lot of noise from a lot of different players is making it difficult to discern how far along companies really are in developing this technology. FogHorn Systems has been pioneering the concept of edge intelligence since 2014. During this time, we have deployed edge analytics in real-world use cases across the globe, and our deployments are successfully influencing industry direction. Last year, we celebrated the release of FogHorn Lightning, our new purpose-built edge platform, and we are now knee-deep in deployments across a variety of factory and other industrial settings.
I’d like to share some insights we’ve come across while deploying our own edge intelligence wherever industrial processes take place.
Industrial “things” produce a ton of data, and most of it is unimportant; however, a small portion of it is extremely important.
The amount of sensor data already being produced today is astronomical. We’re talking terabytes and even petabytes of data. And as the industry continues to enter the IIoT arena, the amount of data will only increase. Sending every individual sensor data point to the cloud is, at best, cripplingly bandwidth-intensive and, at worst, too expensive by many orders of magnitude. In some cases it’s simply impossible. Ironically, the vast majority of these individual data points are uninteresting and unhelpful.
Yet, there are critically important data points buried in these large data sets, including new insights representing abnormal conditions. The trick is finding these rare but important conditions without being overwhelmed by sheer data volume. By summarizing this data into a metadata format (e.g., window averages plus the raw data points captured during relevant events), you can provide all the insights the cloud needs at a fraction of the cost of transport and storage.
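To make the idea concrete, here is a minimal sketch of this kind of edge summarization, assuming a simple stream of numeric readings; the window size, threshold, and field names are illustrative, not a description of any particular product:

```python
def summarize(readings, window=10, threshold=90.0):
    """Reduce raw readings to per-window averages plus any abnormal points.

    Only this compact metadata would be sent upstream; the bulk of the
    raw stream stays at the edge.
    """
    summaries = []
    for start in range(0, len(readings), window):
        chunk = readings[start:start + window]
        summaries.append({
            "window_start": start,
            "avg": sum(chunk) / len(chunk),
            # keep raw data points only for the relevant (abnormal) events
            "events": [r for r in chunk if r > threshold],
        })
    return summaries

raw = [70.1, 71.0, 69.8, 95.2, 70.3, 70.0, 70.4, 70.2, 70.1, 70.5]
print(summarize(raw))
```

Ten raw readings collapse into one average and one flagged event, which is the kind of reduction that makes cloud transport and storage affordable at scale.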
Although many companies are capturing and storing their data, most of it is never used.
IoT is a journey and companies are increasingly reaching the stage where they have installed a wide variety of sensors and are sorting through the best ways to capture this data. Unfortunately, the majority of these companies are still unsure of what they will ultimately do with their data and how they will realize any value from it.
There are more steps to this journey, but if the end goal is to derive insights and apply those findings to increase efficiency and eliminate downtime, the natural place to do this is at the edge, as close as possible to where the data is actually created.
A holistic approach to IIoT derives new insights.
Many companies have made significant investment in control systems that look at individual types of machines. But a single process may use many different machines that speak different languages and have different control requirements. In this case, process optimization isn’t about managing individual machines. It’s about all the machines working together in a common language towards a common goal.
Normalized data ensures that machines can speak the same language and the process is considered in a complete, as opposed to segmented, manner. Then and only then can new efficiency be derived.
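A small sketch of what normalization can look like in practice: two hypothetical machines report the same physical quantities under different field names and units, and a translation layer maps both into one common schema. The machine IDs, payload keys, and unit conversions here are assumptions for illustration only:

```python
def normalize(machine_id, payload):
    """Translate a machine-specific payload into a common schema (metric units)."""
    if machine_id == "press_plc":        # hypothetical PLC reporting Fahrenheit
        return {"machine": machine_id,
                "temp_c": (payload["TempF"] - 32) * 5 / 9,
                "rpm": payload["Speed"]}
    if machine_id == "lathe_opc":        # hypothetical machine, already metric
        return {"machine": machine_id,
                "temp_c": payload["temperature"],
                "rpm": payload["spindle_rpm"]}
    raise ValueError(f"unknown machine: {machine_id}")

print(normalize("press_plc", {"TempF": 212.0, "Speed": 1500}))
print(normalize("lathe_opc", {"temperature": 85.0, "spindle_rpm": 2400}))
```

Once every machine emits the same fields in the same units, process-level rules and analytics can treat the whole line as one system rather than a collection of dialects.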
“Process control systems are great as long as you leave them alone.”
You’ve heard this before. The initial reaction many operations managers have to IIoT is, “we’ve had this technology for years. It’s my PLC.” Indeed, control systems succeed in making localized real-time decisions. Frankly, many factories have relatively well-programmed control systems in place and because of this, the operations folks will beg you not to disrupt them.
These operations people have a right to be conservative. Changing a control system in the tiniest way risks completely breaking the effectiveness of the system which can lead to months and months of testing and re-testing. As you can imagine, most companies have been very slow to integrate new sensors into the system (like vibration and acoustic sensors), or to try to iteratively improve the system by adjusting the programming.
The beauty of edge IIoT however is its ability to make real-time decisions locally or in very close proximity to the devices themselves with a platform that makes it easy to develop and iterate intelligence. In short, edge IIoT can work at the machine level as well as the system level and is both open and flexible. Legacy control systems are not.
Certain industries have opted out of cloud-based IIoT.
Industries like oil and gas and mining have not made huge initial pushes into cloud-based IIoT. While these sectors do employ traditional control systems, connectivity issues have prevented a lot of these companies from relying on the cloud for analytics.
Because oil rigs and mines tend to be in areas with poor connectivity, edge compute becomes a requirement to perform analytics, derive insights and filter the data that will be ultimately persisted in the cloud. When connectivity is sporadic or even non-existent, this notion of distributed intelligence becomes key to any successful IIoT implementation.
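One common pattern for handling sporadic connectivity is store-and-forward: buffer derived insights locally and drain the buffer whenever a link is available. This is a generic sketch, not any vendor's implementation; the connectivity check and `send_to_cloud` callback are stand-ins for a real transport:

```python
from collections import deque

class EdgeBuffer:
    """Local buffer for insights produced while the cloud is unreachable."""

    def __init__(self, maxlen=1000):
        # bounded queue: if an outage outlasts capacity, the oldest entries drop
        self.queue = deque(maxlen=maxlen)

    def record(self, insight):
        self.queue.append(insight)

    def flush(self, connected, send_to_cloud):
        """Drain buffered insights when connected; otherwise keep them local."""
        sent = 0
        while connected and self.queue:
            send_to_cloud(self.queue.popleft())
            sent += 1
        return sent

buf = EdgeBuffer()
buf.record({"rig": "A7", "avg_pressure": 101.3})
buf.record({"rig": "A7", "avg_pressure": 99.8})

uploaded = []
buf.flush(connected=False, send_to_cloud=uploaded.append)  # link down: nothing sent
buf.flush(connected=True, send_to_cloud=uploaded.append)   # link up: buffer drains
```

The analytics keep running at the edge either way; the cloud simply receives a delayed, filtered view once connectivity returns.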