Conference papers and presentations


Forthcoming papers

PSE will be presenting papers at the following events.

2018 AIChE Annual Meeting (Pittsburgh, PA, U.S.A., October 28 - November 2, 2018)

Practical Application of a Mechanistic Model for Twin Screw Granulation for Pharmaceutical Process Development
Dana Barrasso, Leonor Rosa, Sean K. Bermingham, Process Systems Enterprise;
Gavin Reynolds, AstraZeneca

The scientific understanding of twin screw wet granulation processes has advanced in recent years, allowing for the development of mechanistic models that quantify the effects of design decisions on granule attributes. Beyond capturing the scientific understanding of these processes, mechanistic models can be used to solve practical problems in pharmaceutical process development. However, the workflows and best practices for model validation are not always clear, and the model’s capabilities for answering targeted, meaningful questions are not well tested.

In twin screw granulation process development, finding the optimal design and operational space for a new formulation can consume significant time and material due to the highly configurable screw layout in addition to the main process parameters. In this work, a case study will be presented that utilizes a mechanistic model of a twin screw granulation process to improve on the current, experimentally driven process development workflow. Model validation against experimental data will be presented, and data requirements will be discussed. Further, the ability of the model to reduce the experimental burden during process development will be explored. Finally, the incorporation of downstream processing models, such as dryers, mills, and tablet presses, will be investigated.



Model-Based Analysis and Optimization of a Semi-Lean MBC Process for Natural Gas Sweetening
Ven Chian Quek, Imperial College London, Petronas;
Nilay Shah, Benoit Chachuat, Imperial College London;
Javier Rodriguez, Process Systems Enterprise Ltd

Over the past decade, a great deal of research has been devoted to membrane contactors (MBC) for CO2 absorption in natural gas sweetening applications. One key advantage of MBC over conventional absorption towers is their large intensification potential. In this work, the MBC mathematical model presented in Quek et al. [1] is used to carry out global sensitivity analysis and process optimization studies in gPROMS ProcessBuilder. Global sensitivity analysis is used to determine the membrane characteristics and operating conditions that have the largest impact on CO2 removal performance and costs. Next, the standalone MBC unit model is integrated with the standard gPROMS ProcessBuilder unit operation model libraries to model an MBC-based flowsheet of a semi-lean process for natural gas sweetening (Fig. 1). This flowsheet is optimized with the objective of minimizing the annualized cost, subject to a given CO2 specification. The optimization is carried out by applying a simple direct search in Matlab using the gO:Matlab interface, as sketched below.

Fig. 1. Semi-lean MBC process flowsheet.
References:
[1] V. C. Quek, N. Shah, and B. Chachuat, Modeling for design and operation of high-pressure membrane contactors in natural gas sweetening, Chem. Eng. Res. Des. 132:1005-1019, 2018.
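To illustrate the kind of derivative-free optimization loop described above, the following is a minimal Python sketch. The actual study drives the gPROMS flowsheet from Matlab via gO:Matlab; here `annualized_cost` is a hypothetical stand-in for the flowsheet evaluation, and the cost coefficients and CO2 specification are purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def annualized_cost(x):
    """Hypothetical stand-in for the gPROMS flowsheet evaluation, which would
    return cost and product CO2 content for a candidate design."""
    membrane_area, solvent_rate = x
    cost = 120.0 * membrane_area + 45.0 * solvent_rate             # illustrative cost terms
    co2_out = 8.0 * np.exp(-2e-3 * membrane_area * solvent_rate)   # mol% CO2 in product
    return cost + 1e6 * max(0.0, co2_out - 2.0) ** 2               # penalise spec violation

# Derivative-free direct search (Nelder-Mead), analogous to the gO:Matlab loop
result = minimize(annualized_cost, x0=[50.0, 10.0], method="Nelder-Mead")
print(result.x, result.fun)
```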



Utilizing Mechanistic Modelling to Assist in the Process Development of Pharmaceutical Drug Substance Processes
Rosario Porrazzo, GlaxoSmithKline;
Sam Wilkinson, Niall Mitchell, Process Systems Enterprise

The development of a pharmaceutical API/drug product requires a large number of experiments, the design and interpretation of which is often based on statistical modelling tools. This approach achieves the definition of parameter ranges that deliver products of high quality in a robust manner. However, its drawback is that the data only cover the processing conditions investigated and cannot be extrapolated outside of these conditions. Nor does this approach guarantee a full understanding of all the phenomena involved in a given process; hence, it may yield a viable but not an optimised process or sequence of operations. For these reasons, mechanistic modelling approaches, although sometimes not the most practical, would be the preferred choice, yielding a higher degree of understanding of the chemical and physical phenomena occurring in a given pharmaceutical process. This approach can be applied to the development of both single unit and multi-stage operations, for both batch and continuous processes. Coupling models across multiple unit operations allows optimisation not only of a single operation but of the entire process, thus reducing costs and maximising product yields.

In this context, the work presented models batch unit operations, including reactions and separations, and links them together following a common pharmaceutical process structure. In this case, gPROMS FormulatedProducts software was applied to meet this mechanistic modelling need. This tool includes a wide array of mechanistic models for drug substance applications. The kinetic parameters for the reaction mechanisms considered were validated using a small number of data-rich experiments, allowing for reaction model and mechanism discrimination to achieve the most descriptive mechanistic model for the batch reaction step. In the future, the validated process models will be utilized to describe the integrated drug substance processing steps. This end-to-end mechanistic modelling approach will help to both optimize a full pharmaceutical manufacturing process and determine its overall robustness against disturbances.



Application of Mechanistic Models for the Digital Design and Online Control of Pharmaceutical Processes
Niall Mitchell, Process Systems Enterprise;
John Mack, Furqan Tahir, Perceptive Engineering Ltd.

Mechanistic models are becoming more commonly applied for Research and Development in the pharmaceutical sector. The output from this activity is typically a validated mechanistic model capable of quantitatively predicting the behaviour of the various Critical Quality Attributes (CQAs) of typical batch or continuous pharmaceutical processes over a wide range of Critical Process Parameters (CPPs). However, these tools are currently employed almost exclusively in an offline manner, primarily aimed at assessing process robustness and variability, with very little subsequent online application of the model for control or soft-sensing purposes.

Traditionally, online control companies utilize statistical models that require a significant level of tuning and verification against the full-scale plant, using Pseudo-Random Binary Sequences (PRBS) to vary the process parameters and observe the process responses. This requirement commonly reduces the appetite for uptake of rigorous Model Predictive Control (MPC) based on statistical models in the pharmaceutical sector. The online trials for tuning the MPC may result in off-specification production, potentially leading to significant losses in productivity, as the controller pushes the process to observe responses at extreme points. However, most of these drawbacks can be overcome by integrating mechanistic models, developed using laboratory-scale data, with an MPC system such as Perceptive Engineering's PharmaMV.
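For reference, a generic form of the receding-horizon problem solved at each MPC step is (our notation, not specific to PharmaMV):

$$\min_{u_0,\dots,u_{N-1}} \; \sum_{k=1}^{N} \left\| y_k - r_k \right\|_Q^2 \;+\; \sum_{k=0}^{N-1} \left\| \Delta u_k \right\|_R^2 \quad \text{subject to} \quad x_{k+1} = f(x_k, u_k),\;\; y_k = g(x_k),\;\; u_{\min} \le u_k \le u_{\max}$$

In the statistical approach, $f$ and $g$ are identified from PRBS plant trials; in the approach described here, they come from the validated mechanistic model, so most of the tuning can be done offline in simulation.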

In this work we outline the application of an advanced process modelling tool, namely gPROMS FormulatedProducts, to describe a number of pharmaceutical processes. The process model and its mechanistic kinetic parameters were validated using process data gathered from the literature and from lab-based experiments. The lab-scale process model was subsequently used to predict behaviour at full production scale. In order to make the model predictive, some refinement of the kinetic parameters for secondary nucleation was required, using minimal experimental data from typical plant runs.

The mechanistic model was integrated with PharmaMV to develop and tune the MPC against the mechanistic simulation of the process. The PharmaMV platform was subsequently transferred to the physical process. With this approach, the MPC derived from the mechanistic model was utilized to accurately control the defined CQAs, such as final particle attributes, potency or moisture.



Global System Analysis of Twin Screw Granulation Using Population Balance Modelling in gPROMS
Li Ge Wang, James D. Litster, University of Sheffield;
Dana Barrasso, David Slade, Process Systems Enterprise Limited

Twin screw granulation (TSG) is becoming increasingly popular as a wet granulation method within integrated continuous manufacturing for the pharmaceutical industry. Despite many studies of TSG using population balance models (PBM), systematic analysis of the input parameters of such models remains scarce. As a consequence, the influential parameters are not yet fully identified and a holistic understanding of granule growth in TSG remains elusive. This paper presents a global system analysis of a PBM for a ConsiGma™ 25/1 continuous twin screw granulator. As a precursor to understanding the influence of PBM parameters on granule attributes, the model is first formulated compartmentally, with four rate process kernels: nucleation, layering, breakage and consolidation. In particular, a recently developed breakage kernel is implemented in gPROMS, significantly improving the applicability of breakage modelling for TSG; this kernel takes into account the powder feed number, dynamic yield strength and critical breakage size. The input parameters of the PBM are identified and categorized, and reference values for all parameters are specified. By varying the input parameters in a univariate way around these reference values, their influence on the granule size distribution (GSD) is investigated.

Apart from the rate process kernels, two further important quantities, the liquid-to-solid ratio (LSR) and the average residence time in each compartment, are also included in the sensitivity study. Through this global sensitivity analysis of the PBM inputs, the leverage of each parameter on the granule product attributes can be identified, shedding more insight on PBM validation and the model-based design of TSG products. A generic one-dimensional form of the population balance is sketched below.
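With number density $n(v,t)$ over granule volume $v$, a balance combining the nucleation, layering and breakage kernels reads (consolidation acts on an internal liquid/porosity coordinate not shown here):

$$\frac{\partial n(v,t)}{\partial t} \;=\; \underbrace{B_0\,\delta(v - v_0)}_{\text{nucleation}} \;-\; \underbrace{\frac{\partial \big[ G(v)\, n(v,t) \big]}{\partial v}}_{\text{layering}} \;+\; \underbrace{\int_v^{\infty} b(v, v')\, S(v')\, n(v',t)\, \mathrm{d}v' \;-\; S(v)\, n(v,t)}_{\text{breakage}}$$

Here $B_0$ is the nucleation rate, $G$ the layering rate, $S$ the breakage selection rate and $b$ the daughter distribution; in the compartmental model, each compartment carries such a balance plus inter-compartment flow terms.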



Preparing Chemical Engineering Students for the Digitalization of Tomorrow – Integrating Modelling across the Curriculum
Eva Sorensen, UCL;
Pieter Schmal, Process Systems Enterprise Inc.

The advent of increased computational power, availability of more data and improved algorithms is allowing industry to "digitalize" operations. In essence, this means industry is able to use models to represent the plant accurately, use large amounts of data to determine the state of the plant, and use algorithms to optimize plant performance on the fly. To prepare students for such a working environment, it is important to train them in the fundamentals of modelling. At present, modelling often does not go much beyond a flowsheeting exercise, which provides little understanding of the fundamentals underlying a model and has often become a simple enter-data-and-push-button exercise. Modelling plays an important underlying role in almost all chemical engineering areas, from reaction engineering and transport phenomena to thermodynamics and of course process systems engineering, because the core laws and chemical and physical phenomena are expressed in equations. Introducing modelling as an integral part of a curriculum is therefore not only logical, but crucial in order to prepare students for the industry of tomorrow.

The Department of Chemical Engineering at University College London has successfully implemented an Integrated Engineering Program (IEP) where modelling is integrated across the entire curriculum, rather than being covered in a single course in the junior or senior year. To support the integration, PSE has developed and provided training material, exercises and tests through the Process systems engineering Academic Teaching Highway (PATH) initiative. The material is modular in nature, covers the typical fundamental topics from a modelling perspective and is supplementary to existing course material. We will reflect on the implementation and delivery of the new curriculum over the past four years, and how the PATH material has supported the students' learning, ultimately preparing them for a digital future.



IRPC Americas (Houston, TX, U.S.A., September 25-26, 2018)

Using Digital Process Twin technology to drive Operational Excellence in downstream operations
Steve Hall, Costas Pantelides and Mark Matzopoulos, Process Systems Enterprise, London, UK

Digital replicas of operating assets that combine plant data with high-fidelity process models are bringing a new level of decision support to operations in refineries and petrochemicals plants. They provide many benefits, from monitoring and ‘soft-sensing’ to advice on optimal set-points, diagnostics and prognostics such as maintenance interventions. At the heart of such shadowing systems is an always-current digital simulation model of the asset which updates itself periodically based on real-plant performance.

In the context of petrochemical plants, digital twin technology is now being implemented. The associated benefits include: equipment monitoring to determine the state and rate of, for example, coking in cracking furnaces and catalyst deactivation; real-time ‘soft-sensing’ to provide up-to-date yield and other key information which is either difficult or impossible to measure; forecasting to determine future performance, such as run-length prediction based on current furnace or reactor state and anticipated operation; and operational optimisation to give advice to operators regarding set-points, diagnostics, prognostics and performance benchmarks.

Similarly for refineries, next-generation digital operations tools are now being implemented for soft-sensing, forecasting and optimisation in areas as diverse as monitoring and managing fouling in heat exchanger networks through to operations of whole refinery units to maximise profitability and adapt to constantly changing feedstocks.

Successful digital operations systems use high-fidelity predictive mathematical models of the assets to exploit redundancy between model prediction and plant data and maintain themselves in an always-current state. This means that ‘drift’ in key parameters such as degree of furnace coking, catalyst deactivation, heat exchanger fouling and so on is taken into account in all monitoring, forecasting and optimisation calculations, to provide reliable information for decision support.
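As a minimal illustration of the ‘always-current’ updating described above, the scalar sketch below tracks a slowly drifting parameter (e.g. an effective fouling or deactivation factor) by recursive least squares with exponential forgetting. Actual digital-twin systems perform dynamic parameter estimation over the full model; the function name and data here are purely illustrative.

```python
def rls_update(theta, P, phi, y, lam=0.99):
    """One recursive least-squares step with forgetting factor lam < 1:
    nudges the drifting parameter estimate theta towards consistency with a
    new plant measurement y, given the model regressor phi (all scalars)."""
    k = P * phi / (lam + phi * P * phi)    # update gain
    theta = theta + k * (y - phi * theta)  # correct by the prediction error
    P = (P - k * phi * P) / lam            # forgetting keeps the gain alive, so drift is tracked
    return theta, P

theta, P = 1.0, 1.0                        # initial estimate and covariance
for phi, y in [(0.90, 0.85), (1.10, 0.99), (1.00, 0.88)]:   # illustrative daily data
    theta, P = rls_update(theta, P, phi, y)
```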

This paper describes how the combination of equation-based general-purpose modelling technologies and next-generation digital application frameworks provides an environment for easy construction and application of fast, robust online solutions based on high-fidelity models. A real industrial example is presented where the digital twin of a crude distillation unit preheat train is used to improve the actual unit’s performance and reliability. The twin is linked to plant data systems, updating itself through machine-learning capabilities and dynamic parameter estimation techniques, validating actual performance and, where appropriate, identifying departures from normal operation. Beneficial process changes are highlighted. Visualisation of the twin is seen to be critically important in giving operators greater insight and the confidence to operate the process safely at the optimum point, thereby making a major contribution towards Operational Excellence.



ACHEMA (Frankfurt am Main, Germany, June 11-15, 2018)

1. Detailed Modeling of LDPE Autoclave Reactors
Alejandro Cano, Shashank Maindarkar, Process Systems Enterprise, Cedar Knolls, NJ, USA;
Thomas Lafitte, Process Systems Enterprise, London, U.K.;
In-Seon Kim, Process Systems Enterprise, Daejeon, South Korea

Highlights

  • Detailed model combining rigorous thermodynamics, full molecular weight distribution modeling, and accurate flow patterns derived from CFD simulation
  • Validation against commercial reactor operating data achieved by adjusting a small set of kinetic parameters
  • The validated model has been applied to increase production in existing commercial reactors while preserving desired product properties.

1. Introduction

Developing new polymer grades typically presents a number of challenges – for example, determining the operating conditions necessary to produce the desired polymer molecular weight distribution (MWD) in reactor systems with imperfect mixing and complex kinetics. This presentation describes how an integrated modeling approach that combines the advantages of SAFT physical property predictions, detailed kinetic modeling, and characterization of flow and mixing patterns using Computational Fluid Dynamics (CFD) captures the effect of changes in operating conditions on the shape of MWDs. This type of modeling work helps in reducing development time for new polymer grades and in evaluating new reactor designs. It has also been applied in the determination of operating conditions that preserve polymer properties when changing reactor throughput.

2. Methods

Detailed models of industrial low-density polyethylene (LDPE) autoclave reactors have been built by integrating several advanced modeling concepts:

    1. An advanced thermodynamic model based on the Statistical Associating Fluid Theory (SAFT) γ-Mie Equation of State, which represents molecules as chains of distinct functional groups [1-3]. Each group is characterized by its own size, and the self-interactions of a functional group (i.e. interactions with identical functional groups) and its interactions with other groups are characterized by their own Mie potentials (a generic form is sketched after this list). This approach is ideally suited to the modeling of polymers.
    2. Kinetic modeling in perfectly stirred reactor zones that considers the elementary reaction steps of chemically initiated free radical polymerization [4,5] and solves for the evolution of the full molecular weight distribution (MWD) using the fixed pivot technique (FPT) [6], illustrated in the short sketch below.
    3. A multi-zonal approach [7] to represent the complex flow pattern in industrial-size autoclave reactors. The reactor is represented as a network of perfectly stirred reactors, and the flows among the various reactors are calculated by processing the results of a “cold-flow” Computational Fluid Dynamics (CFD) simulation of the reactor.
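The Mie potential mentioned in item 1 has the generic form

$$u^{\mathrm{Mie}}(r) \;=\; C\,\varepsilon \left[ \left( \frac{\sigma}{r} \right)^{\lambda_r} - \left( \frac{\sigma}{r} \right)^{\lambda_a} \right], \qquad C \;=\; \frac{\lambda_r}{\lambda_r - \lambda_a} \left( \frac{\lambda_r}{\lambda_a} \right)^{\lambda_a / (\lambda_r - \lambda_a)}$$

where $\sigma$ is the segment diameter, $\varepsilon$ the interaction energy, and $\lambda_r$, $\lambda_a$ the repulsive and attractive exponents, each fitted per pair of functional groups.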

The integrated reactor model is built on Process Systems Enterprise’s (PSE) gPROMS® advanced process modeling platform using PSE’s gSAFT thermodynamic engine and PSE’s Hybrid Multizonal gPROMS-CFD tool for automatic processing of results of CFD simulations performed with ANSYS Fluent®.
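The fixed pivot technique of item 2 discretizes the distribution onto a grid of "pivot" sizes; its central device is to split each newly formed chain or particle between the two neighbouring pivots so that two moments (here number and mass) are conserved. A minimal sketch, with an illustrative function name:

```python
import numpy as np

def fixed_pivot_split(v, pivots):
    """Split a newly formed particle (or chain) of size v between its two
    neighbouring grid pivots so that both number and mass are conserved --
    the core device of the fixed pivot technique."""
    i = np.searchsorted(pivots, v) - 1     # index such that pivots[i] <= v < pivots[i+1]
    x_lo, x_hi = pivots[i], pivots[i + 1]
    f_lo = (x_hi - v) / (x_hi - x_lo)      # fraction assigned to the lower pivot
    return {i: f_lo, i + 1: 1.0 - f_lo}    # fractions sum to 1; f_lo*x_lo + f_hi*x_hi = v

# Example: a size-1.3 particle on a geometric grid of pivots
print(fixed_pivot_split(1.3, np.array([1.0, 2.0, 4.0, 8.0])))   # {0: 0.7, 1: 0.3}, up to rounding
```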

3. Results and discussion

The integrated modeling approach has several advantages that enable more accurate modeling of LDPE autoclave reactors:

  • The use of rigorous thermodynamics allows us to incorporate the effect of temperature and pressure on the heat of reaction, improving temperature predictions. The approach also allows accurate prediction of the onset of two-phase behavior of reacting mixtures, accounting for polydispersity and branching (Figure 1).

  • Detailed kinetic modeling allows prediction of the full molecular weight distribution, including the high molecular weight shoulder seen in many LDPE grades.
  • Multizonal modeling with a sufficiently large number of zones (200-300) allows the model to respond to changes in design details, such as shaft, paddle, and baffle geometry, and positioning and orientation of injection ports.
  • Implementation on the gPROMS platform allows us to take advantage of standard features such as (i) parameter estimation to adjust kinetic parameters to match plant data, and (ii) optimization to find operational set points that achieve desired product properties, such as Melt Index and resin density.



2. Micro-scale process development and optimization for crystallization processes
N.A. Mitchell, Process Systems Enterprise (PSE) Ltd., London, UK;
C. J. Brown, EPSRC Centre for Innovative Manufacturing in Continuous Manufacturing and Crystallisation, University of Strathclyde, Glasgow, UK

Introduction

Initial experimental phases of crystallization process development are commonly carried out at very small scales, typically using 1-5 mL vessels. The aims of these early phases are to select a solvent based on solubility and crystal solid state. These activities are commonly conducted in high-throughput reactor systems, such as the Crystal16® and Crystalline from Technobis Crystallization Systems [1]. However, for the development, validation and optimization of crystallization process models this data is usually not utilized, and the selected solution system is instead probed experimentally, and more quantitatively, at much larger scales, typically between 100-1000 mL. A more quantitative use of the small-scale data for the development of process models could significantly reduce the number of larger-scale experiments required, helping to address the increasing constraints on time and materials in pharmaceutical development.

Solubility and metastable zone width (MSZW) experiments are routinely conducted with both experimental systems, with clear points (indicating the point of dissolution) and cloud points (indicating the onset of nucleation in solution) utilized to indicate the MSZW for a given cooling rate, agitation rate and solute composition [2]. Through turbidity and temperature measurements, both systems provide the ability to quantitatively determine the MSZW and, therefore, the primary nucleation kinetics of the solution system [2]. In addition, the Crystalline system has the added measurement capabilities of particle visualization, providing a number-based representation of the particle size distribution (PSD), and Raman modules for concentration and solid form monitoring. In-situ sizing via image analysis is also not prone to the sampling issues possible with offline PSD measurement techniques such as laser diffraction, and the in-situ image analysis capabilities can be enormously beneficial in aiding process understanding, providing crucial information for crystallization mechanism and model discrimination activities. The absence of probes inserted into the reaction vessel also avoids cross contamination and interference with the crystallization environment. All of these techniques are integrated into a small reactor with overhead stirring and refluxing capabilities, which can qualitatively mimic the likely vessel configurations at larger scales.
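In quantitative terms (a standard treatment, not specific to these instruments), the clear and cloud points bracket the metastable zone, and a Nývlt-type analysis of MSZW versus cooling rate yields the apparent primary nucleation order:

$$\Delta T_{\max} \;=\; T_{\mathrm{sat}} - T_{\mathrm{cloud}}, \qquad \sigma \;=\; \frac{c - c^*(T)}{c^*(T)}, \qquad \ln \beta \;=\; m \, \ln \Delta T_{\max} + \mathrm{const.}$$

where $\beta$ is the cooling rate, $c^*(T)$ the solubility, $\sigma$ the relative supersaturation and $m$ the apparent nucleation order.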

In this work, data from paracetamol/3-methyl-1-butanol solutions at the micro-scale were utilized to estimate the crystallization kinetics of the model, including crystal growth and primary nucleation, enabling model development as well as mechanism discrimination. The final predictions of the developed process model were compared with those of a previously developed and validated process model based on larger, 1 L scale experiments. Cross-validation with FBRM, laser diffraction and online solute concentration data from the 1 L scale was conducted. Although perfect agreement was not achieved, primarily due to scale- and reactor-dependent kinetic mechanisms such as primary and secondary nucleation, the process model was in general qualitative agreement with the original model developed from larger-scale experimental data. A key outcome of this work is an adapted process development workflow for the design, model validation and optimization of processing models for crystallization systems. This workflow enables crystallization process development with an order-of-magnitude lower demand for materials, in particular raw API, which may not be available in the early stages of process development. In addition, the design space for the process, such as process robustness and the viability of continuous processing, can be assessed early on, with less dependence on larger-scale, more material-intensive experiments. As a result, by utilizing commonly available micro-scale process data, the material demands and requirements for larger-scale experiments can be significantly reduced, leading to a step change in pharmaceutical process development efficiency and productivity.



3. Application of mechanistic models for the online control of crystallization processes
N. A. Mitchell, Process Systems Enterprise (PSE) Ltd., London, UK;
Y. Salman, C. Y. Ma, T. Mahmud, and K. J. Roberts, School of Chemical and Process Engineering, University of Leeds, Leeds, UK;
J. Mack, Perceptive Engineering Ltd, Daresbury, UK

Introduction

Mechanistic models are becoming more commonly applied for Research and Development in the pharmaceutical sector. The output from this activity is typically a validated mechanistic model capable of quantitatively predicting the behaviour of the various Critical Quality Attributes (CQAs) of the crystallization process over a wide range of Critical Process Parameters (CPPs). However, these tools are currently employed almost exclusively in an offline manner, primarily aimed at assessing process robustness and variability, with very little subsequent online application of the model for control or soft-sensing purposes.

Model Predictive Control (MPC) is an established industrial technology which has only recently been applied in the pharmaceutical industry. Existing applications in batch and continuous crystallization processes provide tight control of supersaturation and final particle properties at various scales. Optimized supersaturation control has also been demonstrated to improve batch yield and deliver consistent product. The MPC applications to date have characterized the crystallization process using statistical models and data-driven techniques. The data generated during these experiments are used to develop dynamic process models and calibration models, and to establish Meta-Stable Zone (MSZ) boundaries. The MSZ boundary information is combined with the concentration prediction and MPC to provide closed-loop control of supersaturation.

The drawback of this approach is the time and material cost associated with executing the experimental tests. For batch systems, the product generated during these tests is typically discarded. Furthermore, a subset of the tests must be repeated during scale-up, as the MSZ is both product and process dependent. The experimental testing time during initial development and scale-up could be reduced by incorporating information from validated mechanistic models into the Model Predictive Control system: the MSZ boundaries and growth characteristics could be used to update the controller during plant tests and control system commissioning.

Methodology and Results

In this work, as the first stage of development of an MPC approach, we outline the application of an advanced process modelling tool, namely gCRYSTAL (PSE), to build a model of the seeded batch cooling crystallization of L-glutamic acid from aqueous solutions. The process model, with crystallization kinetic parameters obtained from references [1, 2], was validated using process data gathered from laboratory experiments carried out in 0.5 L and 20 L agitated crystallizers at Leeds [2]. The laboratory-scale crystallization process model was subsequently employed to predict the behaviour of a pilot-scale industrial crystallizer. In order to make the mechanistic model more predictive of the process behaviour observed at the larger scale, some refinement of the kinetic parameters for secondary nucleation was required, using minimal experimental data from typical plant runs.

The validated mechanistic model of the crystallizer will be integrated with the PharmaMV (Perceptive Engineering) Advanced Process Control system. PharmaMV provides the multivariate control and monitoring platform for the online laboratory- and pilot-scale crystallization processes. With this approach, the validated mechanistic model will be utilized to drive the successive control steps to achieve target product crystal size distributions (CSDs), defined by the D10, D50 and D90 of the final product.



4. Next-generation utilities optimisation for large-scale chemical production sites
Penny Stanger, Gerardo Sanchis, Frances Pereira, Alfredo Ramos, Process Systems Enterprise Ltd

Refineries and chemical production sites are major consumers of energy in the form of electricity, steam and hydrocarbon feedstocks. Given that tariffs, costs and demands are constantly changing, there is much scope for optimizing on-site production, conversion and distribution of energy to minimize cost and emissions, by managing the options available in the most cost effective way while meeting all constraints of the system.

This paper describes an advanced optimization platform for managing and optimizing utility operation that not only helps planners rapidly optimize equipment selection and load allocation to improve overall efficiency and reduce emissions and operating costs, but also presents operators with a ranked list of possible actions from which they can choose the best for the current situation.

The approach uses mathematical models of the utilities system and major devices, coupled with plant operating data via data validation and reconciliation facilities. An advanced optimization system capable of both continuous and integer decisions determines the economically optimal operating point, taking into account equipment and operational constraints, including availability; a generic form of the underlying problem is sketched below. The resulting mathematical problem is solved within an equation-oriented framework, providing robustness and speed of execution well ahead of most current systems.
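A generic statement of the mixed-integer problem being solved (our notation; the actual formulation is site-specific) is:

$$\min_{u,\,y} \;\; \sum_i \left( c_i^{\mathrm{fuel}} F_i + c_i^{\mathrm{el}} P_i \right) \quad \text{subject to} \quad h(u) = 0, \quad g(u) \le 0, \quad y_i\, \underline{u}_i \le u_i \le y_i\, \overline{u}_i, \quad y_i \in \{0,1\}$$

where the binaries $y_i$ switch boilers, turbines and other equipment on or off, $h$ collects the reconciled mass and energy balances, and $g$ the operational limits.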

Because of the speed of solution, a key feature is the ability to run multiple optimizations and present operators with a ranked list of potential combinations of changes and their corresponding benefits, within a dashboard tailored for the site. This allows operators clearly to evaluate and discuss which changes are the best to apply when, resulting in advice that is practical and easy to implement and verify. It addresses one of the biggest obstacles to practical realisation of the benefit of optimization systems, which is gaining operator buy-in to proposed changes.

The paper is illustrated with an example of implementation on a major European chemical manufacturing site.



5. Control and design optimization of coupled IPA/NPA distillation columns
Mariana Marques, Process Systems Enterprise Limited, London, United Kingdom;
Jan van Schijndel, VSSC, The Netherlands

Traditional analysis of distillation column design and operation is based on steady state analysis. In reality however, distillation processes are continuously subject to disturbances, caused by variations in feedstock, changes in product requirements, the availability of utilities, etc.

Dynamic modelling is a key technology for optimising column design and operation. It can be used to understand the effects of feed and product quality changes, periodic feed upsets and abnormal operation, and then to design and tune control schemes capable of dealing with such disturbances.

Accurate quantification of column transients allows engineers to minimise off-spec product, energy requirements and purity giveaway, in order to maximise process economics.

This paper provides an overview of recent developments in advanced process modelling that now make dynamic optimization of distillation columns easier than ever before.

For this purpose, an industrial distillation system for the separation of iso-propanol and n-propanol, designed in a traditional manner without considering the effect of feed fluctuations, is introduced and shown to be severely ill-conditioned. A new control scheme is proposed to overcome the operational difficulties. Moreover, using an optimisation approach that takes operability of the design into account, it is shown that significant economic and operability benefits can be obtained if design and control interactions are systematically considered.



6. New tools to reduce time to market and manage risk
Bart de Groot, Edward Close, Costas Pantelides, Process Systems Enterprise Ltd., London, United Kingdom

To date, the main use of process simulation and modelling tools has been to analyse "what-if" scenarios in order to improve process design and operation, and make risk assessments without interfering with the actual process. Such calculations are typically done manually as a point activity, using repeated simulation runs.

A new technology called global system analysis now allows users to systematically explore the behaviour of a system over domains of any user-selected subset of its input and output variables. This provides a quick and easy way to explore the complex process design and operational decision space using high-fidelity models in order to assess risks under uncertainty and screen and rank process design and operation alternatives.

For example, it allows us to take into consideration uncertainty in raw material composition, uncertainty in product pricing, uncertainty in model parameters, etc. and predict what effect this has on uncertainty in e.g. process economics. Such information provides a quantitative basis for process optimization and investment decisions.

This paper describes the underlying technology, which includes the ability to nominate factors, attribute uncertainty to these and quantify the effect of this uncertainty on key performance indicators of interest. It includes state-of-the-art low-discrepancy sampling methods and extensive facilities to visualize and analyse the results, and is designed to take full advantage of modern multi-core hardware. A minimal sketch of such a sampling study is shown below.
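The Python sketch below uses a scrambled Sobol (low-discrepancy) sequence to propagate factor uncertainty to a KPI; the two factors, their bounds and the `profit` model are hypothetical stand-ins for a high-fidelity flowsheet evaluation.

```python
import numpy as np
from scipy.stats import qmc

sampler = qmc.Sobol(d=2, scramble=True, seed=0)
unit = sampler.random_base2(m=10)                  # 2^10 = 1024 low-discrepancy points
factors = qmc.scale(unit, l_bounds=[0.80, 50.0],   # feed purity (mole fraction) and
                    u_bounds=[0.95, 80.0])         # product price ($/unit): illustrative

def profit(x):
    """Hypothetical stand-in for a high-fidelity model mapping factors to a KPI."""
    purity, price = x
    return price * (purity - 0.75) * 1000.0 - 2000.0

kpi = np.apply_along_axis(profit, 1, factors)
print(np.percentile(kpi, [5, 50, 95]))             # KPI risk quantiles under uncertainty
```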

The paper goes on to describe several applications to the pharmaceuticals, minerals, energy and chemicals sectors.



7. Resolving challenges in nuclear waste remediation through advanced process modelling
Felipe Basaglia, Adil Sardardeen, Ravindra Chunilal, Sellafield Ltd, Hinton House, Birchwood Park Ave, Birchwood, Warrington WA3 6GR, UK
Mark Matzopoulos, Mayank Patel, Process Systems Enterprise Ltd, 5th Floor East, 26-28 Hammersmith Grove, London W6 7HA, UK

Sellafield has led the development of the UK’s nuclear industry, from the production of plutonium for the country’s nuclear deterrent programme through to the development of nuclear power generation. Today Sellafield is focussed on completing the spent fuel reprocessing programme and is faced with the challenge of cleaning-up the legacy of the site’s early operations, including some of the most hazardous nuclear facilities in Europe.

Effluent treatment using ion-exchange technology has been used successfully on the site to remove radioactivity from fuel storage pond effluent. The Site Ion EXchange Effluent Treatment Plant (SIXEP) has made a major contribution to the management of radioactivity discharged from the site. The plant is described in full in Refs [1-2]. The main processes include:

    • The removal of suspended solids in the feed by sand filtration;
    • pH adjustment from pH11.5 down to pH8.1 using carbon dioxide;
    • The absorption of caesium and strontium by ion exchange;
    • The interim storage of sludges from filter backwashes and spent ion exchange media.

A number of challenges associated with effluent management arise when trying to meet regulatory expectations through appropriate discharge forecasting, such as minimising the quantity of spent ion exchanger arisings.

This presentation will highlight how gPROMS technology is helping address some of these challenges and optimise future operations.



IDTC - International Downstream Technology & Strategy Conference 2018 (Prague, Czech Republic, 22-25 May 2018)

Using a Digital Process Twin to Drive True Operational Excellence in Refining
Steve Hall, Costas Pantelides and Mark Matzopoulos, Process Systems Enterprise, London, UK

A digital replica of an operating asset can provide great insight into operations, and build confidence in operations personnel that they are making the right decisions for the current state of the plant. It provides many benefits, from monitoring and ‘soft-sensing’ to on-line advice on optimal set-points, diagnostics and prognostics such as maintenance interventions. At the heart of such shadowing systems is an always-current digital simulation model of the asset which updates itself periodically based on the real-plant performance.

PSE has worked with companies in the refinery and petrochemical spaces to apply its equation-based gPROMS technology in online digital twin applications. This paper describes how such equation-based systems provide an environment for easy construction and fast, robust online solution of high-fidelity models. Steady-state and dynamic optimisation models can be used to validate current asset performance, identify the optimal progression through transition states as set-points change and establish the true ‘inevitable versus avoidable’ losses in performance.

A real industrial example is presented where the digital twin of a crude distillation unit preheat train is used to improve the actual unit’s performance and reliability. The twin is linked to plant data systems, updating itself through machine-learning capabilities and dynamic parameter estimation techniques, validating actual performance and, where appropriate, identifying departures from normal operation. Beneficial process changes are highlighted together with the associated steps that operators need to make to bring the unit back onto its optimal performance track. Prognostics such as heat exchanger maintenance/cleaning interventions are dynamically calculated. Visualisation of the shadow system is seen to be critically important. The presence of the digital twin gives operators greater insight into operation and thus much more confidence about their ability to operate the process safely at the optimum point, thereby making a major contribution towards Operational Excellence.



SOGAT - Sour Oil & Gas Advanced Technology 2018 (Abu Dhabi, UAE, 02-03 May, 2018)

Modeling and Optimization of Membrane Contactors in Natural Gas Sweetening
Ven Chian Quek, PETRONAS, Malaysia; Javier Rodriguez, Process Systems Enterprise Ltd; Nilay Shah and Benoît Chachuat, Imperial College London, UK

Membrane contactors (MBC) for CO2 absorption have been widely recognized for their large intensification potential compared to conventional absorption towers. MBC technology uses microporous hollow fibre membranes (HFM) to enable effective gas-liquid mass transfer without the two phases mixing. In this work, gPROMS ProcessBuilder with custom modelling is used to develop a mathematical model that accounts for the effect of membrane pore-size distribution and operating conditions on membrane wetting, for improved understanding of a novel MBC operating at high pressure (50-70 bar). The optimization problem involves minimizing the MBC's annualized cost, which consists of membrane and solvent regeneration costs, subject to a given CO2 specification alongside operational constraints and size limitations of the MBC module.



2018 AIChE Spring Meeting (Orlando Florida, 22-26 April, 2018)

1. Optimising EDC Cracker Operation to Enhance VCM Plant Economics
Sreekumar Maroor, Stepan Spatenka, Renato Wong, Process Systems Enterprise;
Wipada Lertpukpon, Pheeraya Kityanyong, Kritsada Chotiwiriyakun, Nattawat Tiensai, SCG Chemicals Co., Ltd

Ethylene dichloride (EDC) is converted to vinyl chloride monomer (VCM), the key building block for polyvinyl chloride (PVC), by energy-intensive thermal cracking. EDC cracking is carried out in furnaces which typically operate at a conversion of approximately 55-60% and have a run length of more than a year. EDC cracking also produces small amounts of impurities such as butadiene, methyl chloride and ethyl chloride, whose concentrations need to be maintained below specified values to avoid operational issues in the plant.

Coke is another by-product of EDC cracking and its deposition on cracker tube walls leads to increased pressure drop and high tube skin temperatures, with the consequence that eventually the furnace needs to be taken offline for decoking. Even though EDC conversion, yield of VCM and impurities and coking rates are key for profitable plant operation, these are not normally measured directly because of the challenging physical environment.

The paper describes the development of rigorous models for the EDC cracking furnace and their use in monitoring and optimising operation. The furnace model is based on a detailed radical-based cracking kinetic mechanism and includes kinetics for coke deposition in the coils. Such a model, once validated against plant data, serves as a valuable tool to predict key performance indicators (KPIs) such as conversion, yields and coking rates. It can also be used for optimisation of cracker operation to improve the VCM plant economics.
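The full model resolves elementary radical steps, coil temperature profiles and coke deposition; as an illustration only, the overall behaviour can be pictured with a lumped first-order description of the main reaction in an isothermal plug-flow idealization:

$$\mathrm{C_2H_4Cl_2} \;\xrightarrow{k}\; \mathrm{C_2H_3Cl} + \mathrm{HCl}, \qquad k = A\, e^{-E_a / RT}, \qquad X \approx 1 - e^{-k\tau}$$

where $\tau$ is the coil residence time; the 55-60% conversions quoted above correspond to $k\tau \approx 0.8$-$0.9$ in this simplified picture.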

The approach has been used successfully in a project with a Thai petrochemical company to model and optimise their EDC crackers. The cracking and coking kinetic models were tuned to plant data, and the validated model was then used to perform dynamic optimisation of the EDC cracker. The optimisation results identify potential for significant savings through operation at higher conversion, without adversely affecting key impurity yields or run length.



2. Refinery Preheat Train Fouling Management for Enhanced Energy Efficiency
Stephen Hall, Kumar Prashant, Process Systems Enterprise

Heat exchanger fouling in modern refineries continues to be a major issue for both energy efficiency and environmental impact. Fouling is most acute in crude distillation units (CDUs). It reduces preheat temperatures, necessitating higher heater duties which incur increased fuel consumption. This in turn leads to increased fuel costs and higher emissions. It also can reduce throughput and lead to reliability and maintenance issues.

This paper describes a method for managing fouling to minimize its effect on cost, the environment, reliability and maintenance. The approach combines advanced process modelling and optimization techniques with the latest data visualization techniques in an Operational Excellence (OE) platform.

A key development is the use of equation-oriented process modelling tools to model network performance through time. By predicting future performance over an operating window, for example up to the next turnaround, overall costs and environmental impact can be calculated and informed decisions made. The approach results in heat exchanger networks which achieve a practical balance between thermal efficiency and operability, with all fouling effects presented to operators as both costs and environmental impact. An added advantage is that, when implemented online, the tool can immediately detect anomalies in operation that give rise to abnormal rates of fouling.
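One common representation of fouling through time (used here for illustration; the actual models are exchanger-specific) is an asymptotic, Kern-Seaton-type fouling resistance added to the clean overall heat transfer coefficient:

$$\frac{1}{U(t)} \;=\; \frac{1}{U_{\mathrm{clean}}} + R_f(t), \qquad R_f(t) \;=\; R_f^{\infty} \left( 1 - e^{-t/\theta} \right)$$

Propagating $U(t)$ for each exchanger through the network model is what allows costs and emissions to be forecast up to the next turnaround.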

Another key feature of the approach is the visualization of results. The technology is deployed via various dashboards, depending on the user’s role. One dashboard is used by engineering teams to study ‘what-ifs’, another by operations staff to ensure day-to-day performance is optimal, and another by the maintenance department to plan maintenance schedules which tie in to cost and environmental objectives. Current plant data and operational status are displayed in real time.

A real-life case study is presented showing how the technology is used to predict the effects of temperature and flow on fouling, and also the effects of more significant process changes such as throughput increase, bypassing exchangers and the effects of removing heat exchangers. Associated cost and environmental benefits are presented.



3. Advanced Process Modelling for Refinery-Petrochemical Integration
Stephen Hall, Process Systems Enterprise

Oil refineries focus primarily on producing transport fuels; the feedstocks that flow from refineries to petrochemical plants tend to be by-products of that focus. There is now an accelerating trend towards reduced dependency on transport fuels, coinciding with increased demand for petrochemical derivatives. As a result, there is a growing effort to integrate refineries and petrochemical plants, so that refinery products are better aligned with usable petrochemical plant feedstocks and overall economics and environmental performance improve.

There are integration opportunities available today with minimal change to current refinery operating conditions. Refinery streams such as fuel gas, LPG, FCC light ends and naphtha can be fed to petrochemical plants and petrochemical plant streams such as hydrogen and C4s can be fed back to refineries. Energy, in terms of steam and electricity, can also be easily moved between the two, providing significant financial benefit.

The traditional route to generating lighter components from crude oil in a refinery is by feeding VGO or Vacuum/Atmospheric Resid to an FCC unit to generate olefins and aromatics. Extending this technology, crude oil can be cracked directly or the crude can be distilled first and the products cracked via steam or catalytic methods, or even both.

Refinery/petrochemical integration tends to increase the overall capital investment but the improved generation of higher-value products and the reduction in previously ‘stranded’ products makes integration financially very attractive.

When assessing a new design or revamp, one significant challenge is how to adequately evaluate the costs and environmental benefits of refinery/petrochemical integration. The standard approach is to use Linear Programming models with simplified plant feed/product/cost models. However, this approach is fundamentally flawed: although it accounts for flexibility in stream routing, it does not take account of reactor catalyst performance in the units or the wide range of possible operating conditions across the refinery. A suitable overall process model is needed in which stream routing, catalyst performance and operating conditions can all be simultaneously manipulated to achieve maximum profit. The associated model complexity is high, yet it is essential that this be considered, otherwise significant potential benefit may be left unrealized. It could also mean that a profitable project is stopped due to an inappropriately chosen set of operating parameters.

This paper shows a new approach, in which advanced process modelling techniques are used to tackle such complex problems and deliver insights into design parameters and stream allocation. The approach combines equation-based models with advanced model-pruning and equation-solving techniques. These are used to determine optimum stream routing, plant configurations and operating conditions. A case study shows the benefits of the new approach. As a result, new or revamped refinery/petrochemical integrated designs are generated which fully exploit the available operating freedoms and deliver maximum profit.



NAPEC 2018 (Algeria, 25-28 March, 2018)

1. Assessing and maintaining flare and relief systems against a backdrop of evolving industry guidance and standards
Paul Frey, James Marriot & Ryan Goggin, Process Systems Enterprise Ltd., London, UK

As the last line of defence for process and equipment integrity, relief, blowdown and flare systems need to be adequately designed and maintained. Systems need to be periodically re-assessed so that they operate safely throughout the life of the facility that they help protect. This talk highlights the importance of ensuring flare and relief systems remain fit for purpose throughout their lifetime, maintaining accountability for plant modifications, changes to operations, and changes to industry guidance and regulation. Recognizing recent high-profile incidents involving depressurization, relief and flare systems, including Westlake, Grangemouth and Piper Alpha, and others that have led to loss of containment due to excessive vibration and brittle fracture of flare piping, we show that the root cause can be linked to failure to recognize hazards and perform adequate analysis, which is fundamental to Process Hazards Analysis programs and overpressure system verifications.

In recognition of these and other incidents, we describe how relief and blowdown adequacy assessment has tightened in recent years in light of significant changes to API 521, and in particular:

  • the importance of verifying relief and flare systems to ensure they comply with the latest industry standards
  • ensuring the suitability of materials of construction so as to avoid brittle fracture risks during depressurization
  • the adoption of the more rigorous analytical methodology for assessing vessel survivability under fire attack in blowdown system design
  • assessing the likelihood of failure due to acoustic and flow-induced vibration in flare piping

Through a number of case studies we describe a methodology for assessing and analyzing existing infrastructure for these risks and explain how a detailed model-based analysis can ensure an inherently safe relief and blowdown system design that is compliant with the latest industry guidance.



2. Multi-site optimisation of natural gas processing operations to maximise asset utilisation
Costas Pantelides, Maarten Nauta, Bart de Groot, Process Systems Enterprise Ltd., London, UK

Gathering significant volumes of natural gas and supplying processed gas and associated liquids such as liquefied petroleum gas (LPG) and natural gas liquids (NGL) to consumers usually involves connecting wells in different fields with an extensive network of processing and storage facilities that can be spread across vast geographical areas.

The use of network modelling and optimisation technologies for strategic decision-making can yield substantial benefits not only in economic and environmental terms but also in improved understanding of the interaction between the various components of the process and the overall business. Key benefits include increased profitability through better asset utilisation, improved reliability through the ability to rapidly reallocate production on equipment failure, better investment planning to reduce network bottlenecks, and flare reduction in order to minimise the environmental and economic penalties from inefficient operations.

Conventional approaches to the optimisation of large supply networks usually tend to rely on rather simple models of the individual nodes (e.g. production facilities and processing plants) in these networks, often taking the form of simple (often linear) relations between the flowrates of the various materials entering and leaving each node. Whilst this greatly simplifies the solution of the underlying mathematical optimisation problem, it may lead to solutions that are unimplementable in practice; pragmatic adjustments to ensure feasibility almost always lead to sub-optimal solutions which, given the very substantial money flows in such large networks, may translate into significant loss of opportunity.

This paper describes an alternative approach to natural gas supply chain optimisation across distributed sites using a higher level of physical detail in describing the operation of the individual production and processing nodes, thereby ensuring that any solution obtained satisfies all important constraints on the operation of plant equipment. Until recently, this approach was considered to be impractical; however, the combination of modern equation-oriented modelling techniques and the continual evolution of computer processing power now allows large-scale models, comprising detailed models of individual equipment items within wide system envelopes, to be constructed and solved reliably with minimal user intervention. In practical terms, it is now possible to perform optimization of models comprising several hundreds of thousands of nonlinear equations subject to many tens of decision variables and process, equipment and product quality constraints.

The paper is illustrated with an example involving a large-scale Middle Eastern gas processing network, which shows that gains of 5% on normal processing, representing tens or hundreds of millions of dollars annually, are now possible.


