Process Plant Computing Limited

Tel: +44 1753 893090



Register below for one of our live webinars that share new approaches to process improvement & optimisation. 


Or watch one of our pre-recorded webinars below.


Reducing Process Energy Use with Improved Operation

Plants rarely operate consistently at their minimum energy input. This is partly because the minimum is not usually known, and partly because it is ‘easier’ to operate with excess energy. Without ongoing monitoring and feedback as part of energy minimisation, operation inevitably drifts back toward higher energy use. Where should you start?

The minimum energy required is not constant. It varies with ambient conditions; operational loads on the plant; changing disturbances such as catalyst activity, furnace efficiency and equipment fouling; and unintentional local optima and sink-holes that are difficult to identify or avoid.

In this webinar we demonstrate our simple no-maths visual method for bringing the many operational factors together. It requires only knowledge of the process and its operating practices and objectives – knowledge your senior process engineers already possess. It uses data you already have in your plant historian and can identify the operational changes needed to reach a low-energy operating window. We also discuss how to maintain low-energy operation into the future, either by monitoring operation against the low-energy operating window or by progressing to real-time use of an operating envelope for minute-to-minute guidance to the operator and/or existing process control systems.

Notable recent successes include increasing product output of a paraxylene plant already at maximum energy input; identifying and steering an LNG refrigeration process to the best of several non-stationary local efficiency optima by adjusting the constraints of existing MPC controls; and providing a hydrocracker operator with a map to avoid several high energy sink-holes that were not previously known, reducing energy usage by up to 40%.




Low Cost Event Prediction That You Can Implement

 

Providing process operators with even a few minutes warning of approaching abnormal events such as column flooding, pump/motor failure, transformer failure, compressor surge or equipment fouling can dramatically reduce production losses. Models of normal operation are necessary. In the past these models were time consuming to create and maintain, but now there is a simple, low-cost method based on process history and understanding that can be implemented by a trained user in a matter of hours.

PPCL has pioneered the development and implementation of the only method for modelling process operations as a multi-dimensional geometric object. While this method is simple, it’s far from simplistic. It can integrate more than 100 process variables over thousands of time points - achieving very high detection sensitivity.
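PPCL's geometric models are proprietary, but the underlying idea can be sketched with a deliberate simplification: learn per-variable limits from fault-free history (an axis-aligned box, far cruder than a true multi-dimensional envelope) and report which variables a new sample violates. Everything below – the data, function names and limits – is illustrative, not PPCL's implementation.

```python
import numpy as np

def fit_box(history):
    """Learn per-variable low/high limits from fault-free history.
    history: (n_samples, n_vars) array. Returns (lo, hi) arrays."""
    return history.min(axis=0), history.max(axis=0)

def deviating_vars(sample, lo, hi):
    """Return the indices of variables outside the learned limits."""
    return np.where((sample < lo) | (sample > hi))[0]

# Illustrative data: 3 variables, 1000 fault-free samples
rng = np.random.default_rng(0)
history = rng.normal([100.0, 5.0, 350.0], [2.0, 0.1, 5.0], size=(1000, 3))
lo, hi = fit_box(history)

normal = np.array([100.0, 5.0, 350.0])     # at the historical centre
drifting = np.array([100.0, 5.0, 400.0])   # variable 2 well above history
print(deviating_vars(normal, lo, hi))      # no deviating variables
print(deviating_vars(drifting, lo, hi))    # flags variable index 2
```

Because the check is per-variable, the same model that detects a deviation also names the key deviating variables, which is the behaviour the paragraph above describes for the real envelope-based method.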

In this webinar, we demonstrate exactly how our method works and answer your questions. We look at a powerful data visualisation tool - C Visual Explorer (CVE) - and how it can be used to investigate events using historical data from the plant historian. Starting with individual occurrences and then expanding the method to multiple similar events, we demonstrate how to pinpoint common causes of faults and spot potential precursor signatures. We show how to use our online process monitoring tool - C Process Modeller (CPM) - to model fault-free operation, providing dynamic real-time warning of developing events and bringing attention to the key deviating variables.




Modern Alarm Rationalization

 

Operator alarms are an ongoing grumble in almost every process plant. They are intended to prompt the operator to take action before a normal situation becomes abnormal. It is not uncommon to find that 50% are false or unnecessary requests - but which 50%?

The problem begins, and can be ended, with correct values of the alarm limits. Alarm limits are not independent of each other, so you cannot set them one at a time, or by repeatedly resetting the limits of each week’s top ten most frequent alarms. In this webinar, presented by our Technical Director Dr Alan Mahoney, you will see that there is a new understanding of how alarm limits should be created and maintained as a set. This is what differentiates Modern Alarm Rationalization from the one-at-a-time methods most people are still using.
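Why one-at-a-time limits fail can be shown with a toy example (generic code, not PPCL's method): with two correlated variables, a sample can sit inside each variable's own limits yet represent a combination never seen in normal operation. The variable names, percentile choices and the simple residual check are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two strongly correlated variables, e.g. a flow and its pressure drop
flow = rng.normal(100.0, 10.0, 5000)
dp = 0.5 * flow + rng.normal(0.0, 1.0, 5000)

# One-at-a-time limits: each variable's own 1st/99th percentiles
f_lo, f_hi = np.percentile(flow, [1, 99])
d_lo, d_hi = np.percentile(dp, [1, 99])

# A combination never seen in history: high flow with low pressure drop
point = (f_hi - 1.0, d_lo + 0.5)

inside_individual = f_lo <= point[0] <= f_hi and d_lo <= point[1] <= d_hi

# A joint check on the residual of the known relationship catches it
residual = point[1] - 0.5 * point[0]
r_lo, r_hi = np.percentile(dp - 0.5 * flow, [1, 99])
inside_joint = r_lo <= residual <= r_hi

print(inside_individual)  # passes both one-at-a-time limits
print(inside_joint)       # but the combination itself is abnormal
```

Treating the limits as a set – here, a limit on the relationship between the variables – raises an alarm for a point that no amount of one-at-a-time tuning would catch.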

It gets better! Not only do you get much better alarm limits to load into your alarm database, but the process takes much less process engineering time, cuts out those tedious weeks of meetings with every discipline present, and provides what is effectively a functional specification for the detailed design and implementation steps (see IEC 62682 or ISA-18.2), so those become faster too.

The same tools are used in the alarm performance monitoring phase to measure how effectively alarms support operation of the process, rather than simply counting alarm occurrences or time in alarm - effects that do not contain enough information to identify causes.


Finding, Understanding and Repeating Best Operation with Operating Windows

 

Setting KPI targets and reporting against them is necessary, but for use in the control room these targets need to be translated into operating windows. It is easy to get this wrong, and difficult to realize when a window has become outdated. This one-hour webinar demonstrates a better way to address these problems, starting from a new understanding of the relationship between KPI targets, operating targets and process objectives. It shows how to use that understanding to find the best operating window for achieving KPI targets and other operating objectives. Providing the best operating window to operators is the essential first step toward repeating and improving best process operation.


Achieving Operational Excellence

Process plants have numerous “Key Performance Indicators” intended to guide everyone from process operators to senior management towards operational excellence. But do they?

KPI targets are set individually, and it is not easy to confirm whether leading KPIs really do contribute effectively to the achievement of lagging KPIs. It is not unheard-of for one KPI to conflict with another, and different managers may disregard some KPIs in favour of others. Operational excellence would suggest that all KPIs, both leading and lagging, should be achieved.

It is easy to say that all KPI target values should be set to be consistent with each other, but until now there has been no way to achieve that, or even to test whether they were. It becomes straightforward when the leading KPIs are positioned using an operating envelope – one you can actually see – defined by the lagging KPIs and containing possibly hundreds of variables. The picture makes performance monitoring much easier for everyone, and performance reporting too. This, amongst other things, provides the feedback to KPI target setting needed for further refinement and for understanding how KPIs interact with each other.

The approach is radically different but, as with all really good inventions, much simpler than what it replaces.

In this webinar, Dr Robin Brooks, Founder and CEO of PPCL, explores how PPCL's innovative software products, CVE and CPM, could help YOUR process plant achieve operational excellence.

The webinar is suitable for anyone who has ever had an involvement in plant operations in any process industry or energy industry segment, or who has wondered if there was a fast, practical, no-maths method to extract the information and greater understanding that they always knew was buried in their process history data. Well, there is now!

PPCL is working across the world to improve business efficiency in process plants. Why not see what we could do for you?




Avoiding Downtime through Event Forensics and Prediction

 

Value lost to downtime and degraded product quality during abnormal events – together with the overcapacity designed in to compensate for them – is among the largest avoidable costs in any process plant. Process events come in the form of disturbances, faults, trips and excursions that cause downtime and lost production. Operational excellence requires reducing both the frequency and the impact of these events.

This webinar explores PPCL’s novel Geometric Process Control (GPC) technology for understanding the course and causes of these events and generating real-time operator alerts for early warning of developing future events.

The starting point is investigating events using historical data from the plant historian in C Visual Explorer (CVE). Building from individual events to discovering similarities between events, CVE can handle significantly larger datasets than most engineering applications – hundreds of variables across thousands of time points. This power lets us quickly see and explore data far beyond what we’d typically use.

PPCL’s online process monitoring tool, C Process Modeller (CPM), goes beyond traditional alerts by implicitly including the relationship between process variables and providing a sensitive detector for changes. By excluding events and event precursors from our model of normal operation, CPM provides a powerful low-cost method of building event prediction models that have been shown to provide hours or days of advance warning of process changes, giving plant personnel more time to react and mitigate the effects of disturbances and faults. We’ll look at examples including a gas turbine-driven generator and surge in a large propylene refrigeration compressor. CPM models can be created in just a few hours, bringing the benefits of condition monitoring to applications where it isn't currently economical. Process availability and efficiency will improve and facilitate the move toward predictive maintenance and operational excellence.
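The "excluding events and event precursors" step can be illustrated with a minimal sketch. The helper name and the window lengths are invented for illustration; the point is simply that samples around each recorded event are dropped before the model of normal operation is fitted, so approaching events show up as deviations from it.

```python
import numpy as np

def fault_free_mask(times, event_times, precursor=2.0, recovery=1.0):
    """Boolean mask of samples outside every event's exclusion window.
    For each event at time t (hours), excludes [t - precursor, t + recovery]."""
    mask = np.ones(len(times), dtype=bool)
    for t in event_times:
        mask &= ~((times >= t - precursor) & (times <= t + recovery))
    return mask

times = np.arange(0.0, 24.0, 0.25)   # 15-minute samples over one day
events = [6.0, 18.5]                 # two recorded trips (hours)
mask = fault_free_mask(times, events)

print(int(mask.sum()), "of", len(times), "samples kept for the normal model")
```

The retained samples (`times[mask]`) would then be used to build the model of fault-free operation; anything resembling the excluded precursor periods later triggers an early warning rather than being treated as normal.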


GPC Technology for Big Data and Predictive Analytics in Process Plants



The Problem: Process plants generate continuous data for thousands of variables at sub-minute frequencies. This is far beyond what process engineers can analyse with their conventional analysis tools. This data has enormous value, containing the records of plant operation and implicitly the relationships between process variables and quality variables. But only a tiny fraction of this data is used today. Many may consider the “Big Data” methods that are the current buzz, but these approaches are challenged by process plant data.

Big Data techniques focus on pulling subtle correlations from largely uncorrelated data, but chemical processes have extensive relationships due to balances and governing physical laws. Predictive Analytics provides generalized answers through simplifications: choosing a small subset of variables, processing or averaging data, and ignoring fundamental complexities such as the nonlinearity of the process. This can destroy much of the richness of the data and reinforce preconceptions. It can also be time-consuming and require a statistician to interpret the results.

The Solution: Geometric Process Control (GPC) is a visual technology: quick and easy to understand and implement by anyone familiar with the process. Hundreds of variables can be seen simultaneously on one graph to gain an overall understanding of your operation, target investigation, test hypotheses and quickly identify improvements.

PPCL presents a Geometric Process Control alternative that reaches the goals sought from Big Data and Predictive Analytics by showing how to operate the process so that objectives are met and values are kept at their desired targets.

In the webinar, delivered in May 2017, Dr Alan Mahoney, PPCL's Technical Director, demonstrated GPC using graphical tools to screen for key variables and eliminate the effects of uncontrolled variables. He showed how to approach big datasets and explore them visually using the parallel plot, a unique graphical technique that puts axes parallel to each other rather than perpendicular, allowing the exploration of interactions between variables with orders of magnitude fewer graphs. By connecting historical data completely across the process – from incoming conditions and initial processing conditions through to final quality variables, KPIs and performance – with the richness of years of data, in a technique that can examine hundreds of variables, the parallel plot enables discoveries and exploration that are not possible with today’s techniques. He also demonstrated online GPC models for achieving quality targets and operational excellence.
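The parallel plot's core trick – many axes sharing one vertical scale – rests on normalising each variable independently. A minimal sketch of that step (generic code, not CVE): min-max scale every column to [0, 1], after which each row of the result is the polyline for one time point drawn across all axes.

```python
import numpy as np

def to_parallel(data):
    """Min-max scale each column to [0, 1] so any number of variables can
    share one vertical scale; row i becomes the polyline for time point i."""
    lo = data.min(axis=0)
    span = np.ptp(data, axis=0)              # per-column max - min
    span = np.where(span == 0, 1.0, span)    # guard constant columns
    return (data - lo) / span

# Three variables with very different units and magnitudes
data = np.array([[100.0, 5.0, 350.0],
                 [110.0, 5.2, 355.0],
                 [105.0, 5.1, 360.0]])
lines = to_parallel(data)
print(lines[0])   # first row happens to be the minimum on every axis
```

With the scaling done, drawing the plot is just one line per row across evenly spaced vertical axes – which is why the technique extends naturally to hundreds of variables where perpendicular-axis scatter plots would need thousands of pairwise graphs.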


 


PPCL © Copyright 2011-2017. All Rights Reserved. | Terms of use | Privacy Statement | P.O. Box 43, Gerrards Cross, Buckinghamshire, SL9 8UX. UK.