Improvements in analytics and automation are proving effective in tackling some difficult challenges in batch processing.
In 2011, CP outlined the results of a six-month field trial by Emerson and Lubrizol of prototype online multivariate batch analytics software at Lubrizol's plant in Rouen, France (Figure 1; see "Online Analytics Improve Batch Operations"). The data analysis led to numerous production benefits, including overcoming a plugging problem caused by batch-to-batch variations in component densities and identifying a problem with the reactor cooling system.
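To see how multivariate analytics can flag problems such as density variation, consider a minimal sketch, assuming a generic principal-component-analysis (PCA) approach rather than Emerson's actual algorithms. It fits a PCA model to per-batch summary features from historical good batches and flags new batches whose Hotelling's T² statistic exceeds an empirical limit; the feature names, data and threshold are all illustrative assumptions.

```python
# Illustrative sketch of multivariate batch monitoring (not Emerson's code).
# A PCA model is trained on per-batch summary features from known-good batches;
# new batches are scored with Hotelling's T-squared and flagged if they exceed
# a simple empirical control limit.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Assumed per-batch features: component density, feed temperature, agitator load
good_batches = rng.normal(loc=[1.02, 85.0, 40.0], scale=[0.01, 1.5, 2.0], size=(60, 3))

scaler = StandardScaler().fit(good_batches)
pca = PCA(n_components=2).fit(scaler.transform(good_batches))

def hotelling_t2(batch_features):
    """Hotelling's T-squared of one batch in the PCA score space."""
    scores = pca.transform(scaler.transform(batch_features.reshape(1, -1)))
    return float(np.sum(scores**2 / pca.explained_variance_))

# Empirical 99th-percentile control limit from the training batches
t2_train = np.array([hotelling_t2(b) for b in good_batches])
limit = np.percentile(t2_train, 99)

new_batch = np.array([1.07, 85.2, 41.0])   # off-spec component density
if hotelling_t2(new_batch) > limit:
    print("Batch flagged for review: unusual multivariate signature")
```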
"These projects were testing a prototype of what has become Batch Analytics software. This will be released fully as part of DeltaV 12, probably in mid-2013," says Dawn Marruchella, DeltaV batch product manager for Emerson Process Management, Round Rock, Texas, and co-author of the CP article.
Three companies now are testing the beta release of Batch Analytics: Lubrizol, Wickliffe, Ohio; MillerCoors, Chicago; and a major U.S.-based pharmaceutical manufacturer.
MillerCoors currently is the only one with the beta release fully installed — and already is reaping process benefits. For example, by comparing one line that uses the software with one that doesn't, the company has identified important control issues with the way a critical feed is being added to the process. It also has found that a trigger initiating a process step led to an overall reduction in line efficiency. Likewise, it was able to pin down the difference in the rate of operation of two similar production units to variations in steam pressure. In addition, MillerCoors determined that a critical process parameter was too low over the course of a run. As a result, it currently is implementing a point-of-use heat exchanger. Finally, the Batch Analytics software revealed that pH meters on the feed water system were out of calibration, which was causing control issues.
"Batch Analytics has really heightened the awareness of batches for these companies and there is a lot of enthusiasm for it. The potential savings as their experience grows and batch failures are reduced could be enormous. In a pharmaceutical process, for example, the loss of one chromatography batch could mean tens of millions of dollars," notes Marruchella.
One of the keys to success for all the users has been an initial focus on simpler small units, such as preparation tanks, that can generate a lot of batch comparison data quickly. "If you pick a fermenter to start with, the process can take two weeks to run a batch. So it takes a long time to start getting results. MillerCoors wanted to do this originally, but then decided that applying the software to a different process unit that runs more than fifty batches per week would give them faster access to useful data."
For one of the beta testers, end-to-end integration with its business systems was a critical part of the project. "There was a lot of work put into integrating the Batch Analytics product in such a way that the company can automatically access and use data stored in their ERP [enterprise resource planning] system and LIMS [laboratory information management system]. However, all of the testers have some data in an external system, be it an enterprise historian, a LIMS or an ERP system," notes Marruchella.
Ease of use is also an important issue. "It is important to make a product that doesn't need a PhD in statistical process control to operate it. The goal is to make it simpler for users to make more-informed decisions and better understand their process," she adds.
BATCH AUTOMATION
Simplicity should play a key role in batch automation, stresses Honeywell Process Solutions, Phoenix, Ariz. "We have done a lot of work on human factors so that, for example, if an operator's attention is drawn to a potential problem, it is very easy to navigate to product information and specification. This is becoming increasingly important as the number of recipes is proliferating. By having a very tight product spec, you get a better price and margin," explains Reading, U.K.-based Chris Morse, batch product manager.
"Operator mobility is another important issue, and how they access process information, for example, via an Ipad. This touches on the importance of usability. It is important to make the job of everyone involved much easier so that they can be more productive. So we are focused very strongly on human factors."
However, the biggest driver for adopting new batch solutions is availability.
In the last couple of years, Honeywell has moved the storage and execution of batch activities from Windows servers to DCS [distributed control system] controllers — a move Morse believes is unique. The motivation is that if a problem arises with the Windows server during operation, a user doesn't really know what happened to the batch, or why, while the server is down, which has implications for the batch's quality status. A DCS is much more robust, particularly in a redundancy role.
"Another reason the server is important is a chemical one. If you lose communications between a controller and a batch server you could have a very hazardous situation. There is also the lifecycle cost of being able to run a batch process in a controller — typically $22,000 per year based on a warm standby pair of servers replaced by a redundant controller," adds Morse.
One other interesting side benefit of the move is that the DCS controller is faster, so the cycle time between batches can be reduced. This can lead to an extra 3% of product per year, he notes. "One PVC plant customer said they were reducing each batch cycle time by one minute. This translates to an extra 1,000 t/y of production. Many PVC manufacturers are now spending a lot of time and effort tackling these incremental cycle times."
Likewise, a manufacturer of medium-value specialty chemicals managed to run 94 extra batches per year, increasing annual revenue by $564,000.
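The throughput arithmetic behind figures like these is simple to reproduce. The sketch below uses assumed batch sizes, cycle times, reactor counts and per-batch values (none supplied by Honeywell or its customers) to show how a one-minute cycle-time saving, or a few extra batches, compounds over a year.

```python
# Back-of-the-envelope throughput arithmetic; the batch size, cycle time,
# reactor count and batch value below are assumptions, not vendor data.

def extra_tonnes_per_year(batch_tonnes, cycle_min, saving_min, hours_per_year=8000):
    """Extra annual output from shaving `saving_min` minutes off every batch cycle."""
    old_batches = hours_per_year * 60 / cycle_min
    new_batches = hours_per_year * 60 / (cycle_min - saving_min)
    return (new_batches - old_batches) * batch_tonnes

def extra_revenue(extra_batches, value_per_batch):
    """Revenue from running additional batches in a year."""
    return extra_batches * value_per_batch

# A 30-t PVC batch on a 420-minute cycle, saving 1 minute per batch:
per_reactor = extra_tonnes_per_year(30, 420, 1)
print(round(per_reactor), "t/y per reactor")                  # roughly 82 t/y
print(round(12 * per_reactor), "t/y for a 12-reactor plant")  # ~980 t/y, near the quoted 1,000

# A specialty chemicals line fitting in 94 extra batches at $6,000 each:
print(extra_revenue(94, 6000), "USD/y extra revenue")         # 564,000, matching the figure above
```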
Control system capabilities are paying other dividends. For instance, Kronos Worldwide, Dallas, achieved a number of important benefits at its Leverkusen, Germany, titanium dioxide plant after installing the Honeywell Experion Process Knowledge System (PKS) automation platform and ACE-T (application control environment). These include: a new recipe system that expands production possibilities; better operator awareness of batch progress; a semi-automatic mode in case of trouble; no more searching for flags and numerics during interruptions; and new user-friendly operator screens.
Avoiding the cost of generating, reviewing and maintaining paperwork can provide big savings in regulated industries like pharmaceuticals. For example, at its new multi-product biopharmaceutical plant in Biberach, Germany, Boehringer Ingelheim uses Experion PKS to support paperless plant operation and documentation through U.S. 21 CFR Part 11-compliant data collection and storage. "For Boehringer and other regulated customers in general we can help to save well over $100,000/y with the Experion PKS paperless solution," reckons Morse.
He also sees a significant move by the chemicals industry to implement the sort of end-to-end control seen in more-regulated environments because of its ability to improve overall process control. Specifically, he cites management of controller configuration, post-execution analysis of batches, auditable operator actions and barcode verification of manual materials additions as key issues — especially for the specialty chemicals sector.
"This gives a lot of value, especially in terms of quality. For example, one of our customers made 300 different products five years ago, 700 today, and is expanding to 1,000 shortly. We make management of this many products a lot easier: a single architecture means more-flexible recipe configuration, for example, calculated variations on values for product variations can be applied to a single master recipe at run time — something that is not normally possible from a server. Secondly, a master recipe can run against any unit of a particular class. Previous generations of batch automation assumed all units of a particular class were identical, which is rarely the case; our products can handle dissimilar equipment. Finally, having tools for the naming, segregation and structure of recipes makes handling large numbers straightforward."
For the future, Morse believes that extending integration even more will be crucial — i.e., having a single automation platform cover everything from the raw materials' loading bay through to regulatory records.
CONTEXTUALIZATION
More powerful analytics generate even more information. The key to handling massive volumes of data is to contextualize them, says Michael Schwarz, Dusseldorf, Germany-based MES [manufacturing execution systems], EMI [Enterprise Manufacturing Intelligence] and batch software product marketing manager for Invensys.
"This puts batch planning and execution information in context with other relevant information such as: quality results managed in LIMS or quality management systems; high resolution process behavior information in historians; and cost information, which is typically only in the business systems and may include non-batch process information such as packaging and shipping of products derived from batches. Of course the information generated by all these plant systems means there is even 'bigger data' to manage," he notes.
Invensys offers context modeling across multiple systems in its Wonderware Intelligence software (MES, InBatch, etc.). This comes with predefined information models and interactive dashboard reports. The model can include inputs from virtually any system needed, such as ERP for planning/scheduling, as well as cost information.
"The manufacturing intelligence approach allows analyses and monitoring of information which did not exist for the plant or enterprise operations before. At the same time, it adds context for batch runs, quality, cost and asset performance — uncovering potential improvements which are not able to be analyzed and understood within the individual applications," he explains.
In the consumer packaged goods, beverage and food sectors, for example, the challenge for end-to-end production control is to properly represent batch operations within the overall manufacturing operations system (Figure 2).
"We are investing in platform application components which will represent batch phases and batch equipment in our system platform plant model — these will be leveraged to auto-generate the model from the InBatch configuration. When this functionality is available, the generated batch model in system platform can be enhanced by MES functions for providing specification data to the batch phase or to collect execution, events, consumption, production and quality data from the batch operation. These platform application components are due to be released in April or May 2013. It's an out-of-the-box solution that I think is still unique with the platform approach," says Schwarz.
In the face of growing demand for integration and standardization of end-to-end production control, the company also is working to develop InBatch and generic batch execution as operations in the Wonderware MES software product. This end-to-end information management would allow passing production requests to the batch operations and collecting all relevant production responses for further production control, end-to-end traceability and product genealogy. Such an offering is roughly two years away, he notes.
"Typically there is not much contextualization between different plant systems, but if you give an operator an integrated environment which covers multiple domains such as asset performance and quality you will gain benefits. This intelligence approach helps to extract information from batch to batch, so that the whole throughput can be improved," he concludes.