The time-scales over which simulation is applied are generally shortening. Traditionally, simulation has been used heavily for designs and revamps, but these occur only occasionally. Simulation use has since expanded to include keeping LP models updated (typically on a monthly cycle), monitoring unit and equipment health, and optimizing operations. These additional uses move simulation closer to real-time and require automated operation with limited supervision and intervention.
Real-time optimization applications unlock additional value by maximizing the performance of the unit, but they also introduce a further set of simulation challenges, including:
- Data quality – wrong input data will produce poor results
- Model maintenance – keeping simulation models accurate requires robust maintenance procedures, just like any other piece of plant equipment
- Work processes – if a model is being used, maintained and acted upon, formal work processes are needed to codify these actions
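The data-quality challenge above is often addressed by screening plant measurements before each simulation run. The sketch below is a minimal, hypothetical illustration of such a gate; the tag names and limits are invented examples, not from any specific plant or simulation product.

```python
# Minimal sketch of an automated data-quality gate that a real-time
# simulation workflow might run before trusting its inputs.
# All tag names and limits below are hypothetical examples.

def validate_inputs(measurements, limits):
    """Range-check measurements against (low, high) limits.

    Returns (ok, issues): the values that passed, and a list of
    human-readable problems (missing or out-of-range tags) that
    should block or flag the simulation run.
    """
    ok, issues = {}, []
    for tag, (lo, hi) in limits.items():
        value = measurements.get(tag)
        if value is None:
            issues.append(f"{tag}: missing measurement")
        elif not (lo <= value <= hi):
            issues.append(f"{tag}: {value} outside [{lo}, {hi}]")
        else:
            ok[tag] = value
    return ok, issues

# Hypothetical limits for a distillation-column feed
limits = {
    "feed_rate_t_h": (50.0, 120.0),
    "feed_temp_C": (150.0, 250.0),
}
# Simulated snapshot with a faulty temperature reading
measurements = {"feed_rate_t_h": 95.2, "feed_temp_C": 480.0}

ok, issues = validate_inputs(measurements, limits)
```

In practice such checks are usually richer (rate-of-change limits, mass-balance closure, sensor-status flags), but even simple range gating prevents a wrong input from silently producing a poor result.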
In the past, addressing these challenges has relied heavily on the skills of individual simulation users and on the voluntary adoption of good practices and collaboration. This has led to wide variation in the efficacy of simulation use across industry. Increasingly, though, simulation technology itself is being designed to address these issues.
To get the most out of simulation technology, it is vital that any issues with data quality and model performance are addressed, so that results are consistent, dependable and add value. For organizations struggling with limited personnel or simulation skills, the cloud now makes it possible for remote experts (from other company sites, headquarters, or the software supplier) to supervise, validate and update the models, keeping the results trustworthy. Once these challenges are addressed, the resulting well-maintained digital twins, connected to real-time data, can be more easily integrated into business processes and decision-support tools. Ultimately, this automates analysis procedures, freeing engineers from firefighting and data processing to do more proactive forecasting and optimization, and enabling more profitable ways of working.