Why are “traditional” data analytics not suitable for process improvement applications?

Data Analytics

Process data is very highly correlated through heat, mass and momentum conservation and reaction kinetics, and is further complicated by variable volumetric hold-up, back-mixing and recycle streams. It is broken into numerous discontinuous segments by events such as a valve being switched between auto and manual; changes of operating practice between operating shifts; variations in the arrival rate of lab analyses of process samples; and on-line analysers drifting out of calibration because of equipment fouling and mechanical faults. Understanding all of this variability, and being able to recognise and differentiate between “normal” and “unusual”, requires not only the ability to break the historical data into its many discontinuous segments, but also considerable knowledge of the process and its local operating practices, and perhaps even of the operating preferences and habits of particular process operators.
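As a purely illustrative sketch of that segmentation step (not PPCL's own method), the Python fragment below splits a made-up slice of historian data into contiguous segments wherever a discontinuity event occurs. The tag names ("fic101_mode", "shift") and the values are invented for the example.

```python
import pandas as pd

def split_into_segments(df: pd.DataFrame, event_cols: list) -> list:
    """Return contiguous segments of df; a new segment starts whenever
    any of the event columns changes value (e.g. auto/manual switch,
    shift change)."""
    # True on the first row and on every row where an event column changes
    changed = df[event_cols].ne(df[event_cols].shift()).any(axis=1)
    segment_id = changed.cumsum()
    return [segment for _, segment in df.groupby(segment_id)]

# Hypothetical historian slice: one controller mode tag and one shift tag.
history = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01", periods=6, freq="h"),
    "reactor_temp": [182.1, 183.4, 181.9, 185.0, 184.2, 183.8],
    "fic101_mode": ["AUTO", "AUTO", "MAN", "MAN", "AUTO", "AUTO"],
    "shift": ["A", "A", "A", "B", "B", "B"],
})

segments = split_into_segments(history, ["fic101_mode", "shift"])
print(len(segments))  # 4: each mode switch and the shift change starts a new segment
```

A real historian would, of course, carry far more event types than this, and deciding which events matter is exactly where process knowledge comes in.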

This implies that the people best placed to do any analysis are the process, control, maintenance and instrumentation engineers who work closely with the unit and its operations staff. These are hands-on engineers who are always busy and are unlikely to have the time, or the wish, to acquire the depth of advanced mathematical knowledge needed to make safe use of “traditional” data analytics, including “black-box” algorithms such as neural nets, where the reasons for a model's outputs are not always easy for the user to understand and explain to others.

Our purely visual method, when properly set up, allows process-knowledgeable end-users to see at a (long) glance a comparison across hundreds of variables. They can compare, for instance, today's and yesterday's operation, or desired or best-ever performance against actual performance judged against KPIs, Operating Limits, DMC Constraints, Alarm Limits, Integrity Limits and even economic objectives, and can investigate and identify the cause of less-than-desirable performance. They can then use the same display for discussion in daily or weekly Unit Operations meetings, where everyone, including operators, can take part in explaining and often correcting an observed anomaly, and in training sessions with process operators to show, for instance, the difference in unit performance that results from putting a particular valve into manual or moving a DMC constraint (some plants allow this) to make operation “easier”.
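As a hedged illustration only, the fragment below draws the kind of parallel coordinate comparison described above using generic pandas/matplotlib tooling rather than PPCL's own software. The tag names and the two-day comparison are invented; a real display would carry hundreds of variables together with limit envelopes.

```python
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

# Invented data: a few operating points from "yesterday" and "today".
data = pd.DataFrame({
    "day":            ["yesterday"] * 3 + ["today"] * 3,
    "feed_rate":      [102, 98, 100, 110, 112, 109],
    "reactor_temp":   [181, 183, 182, 187, 188, 186],
    "column_dp":      [0.42, 0.40, 0.41, 0.48, 0.50, 0.47],
    "product_purity": [99.1, 99.2, 99.1, 98.6, 98.5, 98.7],
})

# Normalise each variable to 0-1 so differently scaled tags share one axis range.
value_cols = [c for c in data.columns if c != "day"]
norm = data.copy()
norm[value_cols] = (data[value_cols] - data[value_cols].min()) / (
    data[value_cols].max() - data[value_cols].min()
)

# One polyline per operating point, one vertical axis per variable.
ax = parallel_coordinates(norm, "day", color=["#7f7f7f", "#d62728"])
ax.set_ylabel("normalised value")
plt.tight_layout()
plt.show()
```

Each row of data becomes one line across the vertical axes, so a shift in operation shows up as a visibly different bundle of lines, which is what makes side-by-side comparison of many variables readable at a glance.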

You need to introduce the new method to senior plant engineers first and let them draw in the less experienced junior engineers after they have had some success. Starting with the junior engineers does not work: they lack the depth of knowledge to recognise what is significant, so they struggle for success, easily get disheartened and become “too busy” with other activities. Plan the rollout with care. Later, give the same tools to the site and HQ statistical groups, so that they start with examples of success from the unit engineers and are less inclined to reject new technology that lies outside their own experience. In a number of places the statistical teams happily use our visual methods as a platform for testing and evaluating traditional analytics and equations, and sometimes for re-expressing results from the visual methods in traditional statistical terms for consumption by others (outside organisations, or even organisationally distant managers) who are not trained to read a parallel coordinate graph.
