Keeping tabs on the process

EKA Chemicals, Gothenburg, Sweden, uses a simulator as a soft sensor to monitor chlorine emissions at its plants, which make bleaching chemicals for the paper industry. The first system was installed about two years ago, says Nathan Massey, president of Chemstations, Houston. EKA uses Chemstations' ChemCAD simulator and now is adopting the approach at its other plants.

Previously, monitoring required six laboratory analyses per day. Switching to the software approach has saved more than $100,000 per year in analysis costs alone. More than that, though, Massey says, it has allowed the plants to run less conservatively, which has led to 1% to 2% overall energy savings. In addition, the simulator gives operators insight into when to calibrate or clean instruments and equipment.

Peter Henderson, London, Ontario-based product manager of simulation for Honeywell, which just bought the Hysys simulation business from Aspen Technology, foresees models filling a broad online monitoring void left by instruments, such as in the analysis of gas turbines and compressors, catalyst aging and the fouling of complex heat-transfer equipment.

Some properties that are easily measured are being used to infer ones that can't be, says Marco Satyro, chief technology officer of Virtual Materials Group, Calgary, Alberta, which offers VMGSim. For compressor monitoring, simulators enable measurement of water content in gas, he says. More broadly, the software can serve as a pseudo-analyzer: when the specific gravity and the roster of constituents are known, it can determine the actual composition.

At NOVA Chemicals, Peter Dolsinek, leader of process automation and engineering systems, based in Sarnia, Ontario, notes that steady-state models are being used for performance monitoring of equipment such as distillation columns.

A model validated with field data (Figure 2) can help plant engineering staff identify the root cause of problems like equipment degradation and line plugging, and allows quantitative evaluation of options to solve those problems, says Todd Willman, president of EPCON International, Houston. EPCON offers a simulator, System 7 Process Explorer, geared specifically for use by plant personnel.

Figure 2. Field validation.
Readings from an ultrasonic flowmeter are used to verify flow rates predicted from a simulation model. Source: EPCON International
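To make Satyro's pseudo-analyzer idea concrete, here is a minimal soft-sensor sketch in Python. It assumes a binary mixture with ideal volume mixing; the component densities and the measured reading are invented for illustration, not taken from any plant.

```python
# Minimal soft-sensor sketch: infer composition from an easily
# measured property (mixture density). Assumes ideal volume mixing
# of a binary system; all numbers are illustrative.
RHO_A = 1100.0  # density of component A, kg/m^3 (assumed)
RHO_B = 998.0   # density of component B, kg/m^3 (assumed)

def infer_volume_fraction(rho_measured: float) -> float:
    """Back out the volume fraction of A from a measured density,
    assuming rho_mix = phi * RHO_A + (1 - phi) * RHO_B."""
    phi = (rho_measured - RHO_B) / (RHO_A - RHO_B)
    return min(max(phi, 0.0), 1.0)  # clamp to the physical range

# A reading of 1049 kg/m^3 implies about 50% A by volume.
print(infer_volume_fraction(1049.0))  # 0.5
```

A real pseudo-analyzer would replace the one-line mixing rule with a full thermodynamic model, but the principle is the same: measured property in, composition out.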
In developed countries, emphasis has shifted from investing in new plants to making the most of existing assets, and this is spurring more interest in using the models for optimization and productivity improvement, says Gilles Hameury, sales and marketing manager for ProSim SA, Labege, France.
Alastair Fraser, Lake Forest, Calif.-based vice president of the SimSci simulator business of Invensys, sees increasing demand for simulators in predictive maintenance. He predicts that in four to five years the use of models for performance monitoring will be widespread. Massey of Chemstations adds, "I believe that performance monitoring and predictive maintenance [using simulators] are both near the beginning of their product cycles and eventually will become standard for any processing company that wants to remain competitive."
Further in the future, Satyro hopes that such models will be built right into the control chips for process equipment. "Hardware developers must be aware of this capability so they can drive the dream," he says.
Improving operator training
Rigorous models have played a limited role in operator training because they often aren't fast enough, but that may be changing. To be useful for training, a model must be able to run at a minimum of three times, and sometimes up to 10 times, real time, Massey says. Major discontinuities, such as a pump or valve trip, can slow down performance. First-principles simulations can't keep up now, he says, so companies rely on shortcut models that provide qualitative results.
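The speed requirement Massey describes reduces to simple arithmetic: the real-time factor is simulated time divided by wall-clock time, and a training simulator needs that ratio to stay at 3 or above. A sketch, where step() stands in for one integration step of a hypothetical dynamic model:

```python
import time

def real_time_factor(step, dt: float, horizon: float) -> float:
    """Return simulated seconds per wall-clock second; step() is a
    stand-in for one integration step of a dynamic simulator."""
    sim_time = 0.0
    start = time.perf_counter()
    while sim_time < horizon:
        step(dt)        # advance the model by one time step
        sim_time += dt
    return sim_time / (time.perf_counter() - start)

# Training use requires a result of 3 or more; a pump or valve trip
# that forces the solver to take smaller steps drags the ratio down.
```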
Improved numerical techniques plus faster computer hardware will make performance of rigorous models sufficient within five years, he believes, opening up the prospect of providing quantitative as well as qualitative results for training.
NOVA's Dolsinek certainly sees the value of using rigorous models to broaden operator training from qualitative to quantitative. He also points to another strong motivator: increasing the return on the investment in high-fidelity models.
Honeywell also sees potential for maximizing the return on rigorous models by using them for operator training, Henderson says. Indeed, its initial thrust with Hysys will be in that area, he adds. Operator training often is considered only after the completion of control system design, but the same model also could play a role during design, he says, and later in assessing the root cause of operating problems.
Firms also are focusing on how to get more engineers to use models. EPCON's Willman says that only about 30% of site-based engineers use simulators. Many operating companies still consign simulation to specialists, which limits broader application of the models. Dolsinek's goal is to have operating people, not specialists, able to use the system.
Microsoft's Excel may excel in that role. Since most engineers use it, simulator vendors are working to make their tools easier to use through Excel. For instance, Chau-Chyun Chen, a technical fellow at Aspen Technology, Cambridge, Mass., expects Aspen Plus to accept properties through Excel within a year. "More and more tools will be available via Excel," adds AspenTech's chairman, Larry Evans.
Dolsinek aims to develop a process control strategy for NOVA that takes advantage of rigorous process simulation models, and he now has put in place a company-wide structure. "It's 10% technical challenge, 90% people challenge," he says. Specialists have favored specific software systems, but that's not cost-effective, he explains. Instead, he plans to implement consistent, standardized tools. He expects to use the models for advanced process control (APC) within a couple of years, targeting selected APC applications where throughput or product consistency clearly could benefit from the sophistication of first-principles models. After that, he plans to integrate the models into the supply-chain system.

Bill Tchir, research manager for Sclairtech process R&D in Calgary, hopes to implement such models for control of NOVA's large Sclairtech high-density polyethylene units in Joffre, Alberta, within a couple of years. This should provide extra output and smoother grade transitions. He also cites the benefit of having the same physical-properties model for simulation and control.

Honeywell's Henderson definitely sees a role for rigorous Hysys models in APC and expects products to debut during 2005. The company will add tools to enable use of Hysys in its Experion Application Profit Controller, which now uses linear models. This should improve the fidelity of advanced strategies, he explains.

Massey of Chemstations believes that most distributed control system (DCS) vendors are on the path to converting from empirical to first-principles models. By the end of the year he expects at least one DCS firm to sign a pact to use ChemCAD. The first application will be for feed-forward control of a pressure-swing process with fast gas-phase dynamics, he says.

However, Chen cautions that first-principles simulation won't always make sense for APC. While rigorous models can provide additional benefits, these have to be very significant versus what can be achieved with empirical models for control. He adds, "A promising direction is the combination of empirical and first-principles models, which could lead to a merger of advanced control and real-time optimization into a single discipline and the ability to handle steady-state optimization and dynamic optimization/control problems in a consistent way."

Sighting on the site

Expanding the scope of simulation models from analysis of a single process to analysis of an entire site represents a major opportunity, says Suresh Sundaram, Cambridge-based director of product marketing, engineering and innovation for AspenTech. A process plant never operates in isolation. Changes in a process affect other processes and have an impact on the utilities and site-wide infrastructure, he explains. Getting to the overall site optimum requires gauging the impact of one process on another.

It now is possible to simulate and optimize a whole site, says AspenTech's Evans. Being able to do this will bring big benefits, he adds. Fraser of SimSci agrees, noting that integrated refineries are leading the way.

Seiji Terado, senior control engineer in the Process System Technology Center of Idemitsu Kosan Co., Tokyo, feels that site-wide simulation is beneficial, not just for more accurate modeling of existing operations but for assessing refinery-wide product balance when adding or removing units.
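The site-wide coupling Sundaram describes can be illustrated with a toy utility balance: two processes drawing steam from one boiler, where pushing one unit's rate starves the other. All rates and coefficients below are invented for the sketch.

```python
# Toy site-wide balance: two processes share one boiler. All numbers
# are invented for illustration.
BOILER_CAPACITY = 120.0                    # t/h of steam (assumed)
STEAM_PER_TONNE = {"A": 0.8, "B": 1.1}     # t steam per t product (assumed)

def spare_steam(rate_a: float, rate_b: float) -> float:
    """Spare boiler capacity in t/h; negative means the site is short."""
    demand = STEAM_PER_TONNE["A"] * rate_a + STEAM_PER_TONNE["B"] * rate_b
    return BOILER_CAPACITY - demand

# Optimizing process A in isolation might push it to 100 t/h, but
# site-wide that caps process B at roughly 36 t/h:
print(spare_steam(100.0, 36.0))  # ~0.4 t/h of steam left
```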
Axel Polt, director of conceptual process engineering for BASF AG, Ludwigshafen, Germany, says that the company now is working to achieve site-wide simulations at all its plants around the world.

Chemstations' Massey, however, doesn't see a groundswell of interest from clients for site-wide simulation. Their caution is wise, he says, because a site-wide simulation requires a very good understanding of each process and of how to put them together.

Extending to the enterprise

Many operating companies would like to leverage their investment in these models even more broadly, such as for scheduling and planning. Integration with information technology is becoming more of a focus, says Harpreet Gulati, director of product marketing for simulation, optimization and advanced control at SimSci in Lake Forest, Calif. "Agility for a process is the key to success. So, simulation will play a wider role. There is a clear need for faster decision-making and execution based on what is most profitable for the company."
"The process industries utilize numerous other modeling and optimization tools for specific business problems, such as production planning, scheduling and distribution. Each application has been approached from a different viewpoint, using the most appropriate technology and devising a user paradigm specific to the business task at hand, Chen says.
Planning and scheduling now typically rely on linear-programming models. However, Terado of Idemitsu notes that first-principles simulations overcome the limitations of such models, and that rigorous simulations can be used selectively and simplified as appropriate.
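As a reference point for the linear-programming approach Terado mentions, here is a minimal production-planning LP in Python using scipy; the products, margins and constraints are invented for illustration.

```python
from scipy.optimize import linprog

# Maximize margin: $40/t for product A, $30/t for product B
# (linprog minimizes, so the objective is negated).
c = [-40.0, -30.0]

# Constraints: reactor hours (2 h/t of A, 1 h/t of B, 80 h available)
# and feedstock (1 t/t of A, 1.5 t/t of B, 70 t available).
A_ub = [[2.0, 1.0],
        [1.0, 1.5]]
b_ub = [80.0, 70.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("t of A, t of B:", res.x)   # 25 t of A, 30 t of B
print("margin: $", -res.fun)      # $1,900
```

First-principles simulation enters when linear coefficients like these become poor approximations, for example when yields shift nonlinearly with throughput.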
Tying in the supply chain and information technology is a major theme at AspenTech, Evans says. As a long-term objective, the company's new Open Simulation Environment (OSE) will incorporate planning and supply-chain modeling to enable consistent decision-making, Sundaram says. Within a year, it should be able to mix and match first-principles, empirical and statistical models and update one model type with another. The goal is to have data entered only once.
However, SimSci's Fraser notes that companies commonly face workflow barriers: departments have separate data and don't share them.
The ultimate goal, explains PSE's Pantelides, is to develop a general modeling capability that can be used for diverse functions. Specific models tailored for a particular task would be derived automatically from the master model. This would facilitate collaboration and minimize the costs of software acquisition and deployment, he adds. It also would avoid problems due to inconsistencies among models and data for various applications.
"The real dream is an equation-based mathematical model that could do everything, Evans says. This view is seconded by NOVA;s Dolsinek and Willman of EPCON. No one is expecting this anytime soon, though.
Achieving plug-and-play
Process simulation historically has taken place within proprietary modeling environments, which has limited users' ability to mix and match components from different vendors. A recent initiative, the development of the so-called CAPE-OPEN (CO) standard, promises to change that. The standard, which is multiplatform and freely available, aims to allow components from one vendor to be used easily with process-modeling software tools from other suppliers.
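The real CO standard defines middleware (COM/CORBA) interfaces rather than Python classes, but the plug-and-play idea can be sketched as a shared contract; the names and signatures below are invented for illustration, not the actual CO interfaces.

```python
from typing import Protocol, Sequence

class PropertyPackage(Protocol):
    """Illustrative vendor-neutral contract (not the real CO spec)."""
    def compounds(self) -> Sequence[str]: ...
    def calc_property(self, prop: str, temperature: float,
                      pressure: float, z: Sequence[float]) -> float: ...

def report(pkg: PropertyPackage, T: float, P: float,
           z: Sequence[float]) -> None:
    # A simulator coded against the contract can swap in thermodynamics
    # from any vendor that implements it, with no custom coding.
    for prop in ("enthalpy", "density"):
        print(prop, pkg.calc_property(prop, T, P, z))
```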
The initiative is now under the aegis of the CO Laboratories Network (CO-LaN), Rueil-Malmaison, France, which is supported by major operating companies, including Air Liquide, BASF, BP, IFP, Shell International Chemicals and TOTAL.
The thermodynamics part of CO version 1.1 was issued as a draft in September 2002. Minor modifications are under discussion, says Michel Pons, process simulation group leader for ARKEMA, Lyon, France, who will become chief technology officer of CO-LaN in January.
"The key benefit [of CO] will be to be able to use âbest in breed; software tools without having to worry about interoperability problems ⦠It won;t restrict any more end users to one suite provided by a specific vendor.
CO is driven by owner-operators who were concerned about the ability to inject their own expertise and use other software without custom coding, says Willman of EPCON.
Right now, CO-LaN has special interest groups (SIGs) on thermodynamics, unit operations and interoperability, Pons says, and a SIG on solvers may debut in early 2005. The thermodynamics SIG is working on an extension to mixtures involving solids. CO-LaN also has started developing a logger to monitor all communications going through CO interfaces, to provide detailed information to developers and, ultimately, users.
Technical obstacles have been overcome, says PSE's Pantelides, but wider adoption will depend on the continued support of software vendors and end users.
In October, Aspen Technology launched aspenONE, a software portfolio for the chemical industry that features OSE and supports CO. "OSE and CO are synergistic ... OSE is an important step for AspenTech. We want to embrace third parties' modeling tools," Evans says.
A number of operating companies already are telling vendors that they prefer and may even insist upon CO-compliant software from now on, Pons says.
Tchir of NOVA says that CO may fit into the company's strategy for interconnections. However, the company hasn't decided whether to insist on CO, since interconnectivity is only part of the issue.
CO has very strong support in Europe, says Fraser of SimSci. Massey of Chemstations adds that one major U.S. chemical company stated that it would continue to use the ChemCAD thermodynamics package only if it were made compliant. "The package will be compliant no later than the end of the first quarter of 2005," he says.
However, the response to CO at many U.S. companies seems less enthusiastic. Pons is not surprised: "CO is seen as a European initiative ... Then, everything that is coming from Europe, and especially France, is looked upon with some suspicion. That is regrettable, but you can't help it."
Philosophically, CO is doing the right thing, says Satyro of Virtual Materials. The real problem is acceptance. For CO to succeed, the standard has to be widely promoted, and companies with money to spend on simulation must insist on compliant software. These companies also must be willing to accept that, at least in the short term, implementation will give worse performance than a simulator's native environment, Satyro says.
Speed can be an issue, Pons admits. Using the CO interface instead of a simulator's native one typically leads to slightly slower performance; implementations so far show a drop-off of 3% or more. This reflects any inefficiency in the implementation as well as in the interface itself, he adds.
The initiative will open up opportunities but does not represent a step change, Fraser says. Equity in current models will pose some barriers to making existing applications compliant, he notes.
CO should allow users to more easily take advantage of niche software from small, specialized vendors, says ProSim's Hameury. Areas such as unit operations and thermodynamics should be the first to emerge, Pons adds.
Developing new capabilities
The ability of simulators to more accurately handle demanding tasks continues to evolve, both through better methods and via links with other software.
Areas with scope for advances include modeling of larger molecules, like those common in pharmaceuticals and polymers, as well as operations involving solids, says AspenTech;s Evans.
In the pharmaceutical industry, process design often involves selecting the best solvent or solvent mixture from among hundreds of candidates. To speed that screening, AspenTech has developed a model, called NRTL-SAC, for fast, qualitative estimates of the solubility of organic non-electrolytes in common solvents and solvent mixtures, Chen says. It works well not only for small molecules but also for oligomers and polymers. He expects the approach to be extended to organic salts within a couple of years, if not sooner.
The model, which is already being used by several pharmaceutical companies, is accurate enough for phase 1 and phase 2 pharmaceutical trials, Chen says.
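In use, a model like NRTL-SAC slots into a simple ranking loop. A sketch, where predict_solubility is a hypothetical stand-in for the estimator, not a real API:

```python
from typing import Callable, Sequence

def rank_solvents(solute: str, solvents: Sequence[str],
                  predict_solubility: Callable[[str, str], float]):
    """Sort candidate solvents from most to least promising using a
    fast, qualitative solubility estimator (hypothetical here)."""
    scores = {s: predict_solubility(solute, s) for s in solvents}
    return sorted(scores, key=scores.get, reverse=True)

# Hundreds of candidates can be triaged in seconds, leaving only the
# top few for laboratory confirmation.
```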
"There's a huge physical properties data gap," says Chemstations' Massey. Compounds are being developed faster than reliable properties for them. So, Chemstations has just started offering a service to predict such properties for pure compounds via a novel group-contribution method. The company expects to offer the same service for mixtures by mid-2005 and to release a CO-compliant version for use by non-specialists within 18 months.
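Chemstations hasn't disclosed its method, but the group-contribution idea itself is classic: sum tabulated increments over a molecule's functional groups. A minimal sketch using the well-known Joback correlation for normal boiling point (the increments are quoted from memory of the published tables and should be checked before use):

```python
# Group-contribution estimate of normal boiling point via the classic
# Joback correlation: Tb = 198.2 K + sum of group increments.
# Illustrates the general technique, not Chemstations' own method.
JOBACK_TB = {"-CH3": 23.58, "-CH2-": 22.88, "-OH": 92.88}  # increments, K

def boiling_point(groups: dict) -> float:
    """groups maps a Joback group label to its count in the molecule."""
    return 198.2 + sum(JOBACK_TB[g] * n for g, n in groups.items())

# Ethanol = CH3 + CH2 + OH: roughly 337.5 K vs. about 351 K measured,
# which is why reliable measured properties still matter.
print(boiling_point({"-CH3": 1, "-CH2-": 1, "-OH": 1}))
```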
Computational fluid dynamics (CFD) has long played a role in equipment analysis. Now, hybrid approaches that link CFD with process simulation are emerging. PSE's Pantelides, for instance, cites their value in crystallization, where differing flows in portions of a vessel can dramatically affect nucleation in those areas. PSE this year launched a package combining its gPROMS simulator with CFD software from Fluent. The same approach holds promise for reactors and cracking furnaces, he adds.
"Pretty soon," says Polt of BASF, "end users, not the software, will become the limiting factor in simulation."
Mark Rosenzweig is editor in chief of Chemical Processing magazine. E-mail him at [email protected].