
Podcast: Lessons Learned From the Deadly Formosa Explosion

April 30, 2024
The incident at Formosa Plastics in 2004 highlighted critical lessons in process safety. Human factors, communication gaps and inadequate equipment played significant roles in the deaths of five workers.

What Happened Before Deadly Formosa Plastics Explosion

Traci: In today's episode, we're going to extract some lessons learned from an incident that occurred 20 years ago on April 23, 2004. Five workers were killed, and two others were seriously injured when an explosion occurred in a polyvinyl chloride production unit at Formosa Plastics in Illiopolis, Illinois; the explosion followed a release of highly flammable vinyl chloride, which ignited. The US Chemical Safety Board investigated this incident and made several recommendations based on human factors and communication issues. Trish, can you give us a little bit of insight on what happened that day at the plant?

Trish: What was going on at the time was that the operators needed to clean one of the reactors between batches. It was a very old facility; it didn't have a lot of instrumentation or automation associated with it. The facility had several separate reactors, and they went to clean one of them. Now, to clean the reactor, they needed to open not only the top of the reactor but also the bottom. This was spread across several floors of a building because these reactors are very, very tall. Operators were at the base of one reactor, and operators were at the top of a reactor, but they weren't at the same reactor. When the operators at the base bypassed an interlock to open the drain valves, which was normal practice for them, they ended up opening a full reactor rather than the empty reactor they were meant to clean out. That resulted in the massive release that occurred.

Consider Human Factors

A lot of the issues associated with it were around the instrumentation and the human factors of understanding which reactor they were at, and the fact that they couldn't physically see each other because of the floors. Even exacerbating that, they didn't have two-way radios to communicate with each other either. There was a complete lack of communication, so there was no way to verify whether they were at the right reactor. If you imagine you walk down a set of stairs and you've got to turn to the right to do something, but you accidentally turn to the left because the whole system is identical, that's where you can start to see how human factors play out as a significant cause of incidents.

Because if you need to turn to the right, but the last five times you've done it you've gone to the left, you're probably going to turn to the left again, even when you know you need to turn to the right. We get conditioned into what we're used to doing, and that's a challenging part of human factors. There were a lot of design issues associated with the lack of instrumentation, the lack of automation and the fact that they needed to bypass the valve to get it open, and they did this deliberately on so many occasions that it became normal practice. They didn't see that they were bypassing something significant. That was just what you did. That was how the plant worked. That was how you did your job in this particular facility.

Traci: One of the CSB's recommendations was to design systems to take human error into account. So, for example, walking down those stairs, if you go to the left, it looks the same as if you go to the right. So maybe designing it differently so that going to the right is very clearly different... Would it be enough to paint those vessels different colors? Those types of things. What are the best ways to design with human error in mind, and should every system in a facility be designed that way?

Trish: So, it's interesting. Things like labeling, signage and different colors, at first thought, you think, okay, that's really good because people are going to see it, they're going to notice it. However, one of the challenges of human factors is that we stop noticing things. If we see something every day, we stop seeing it. We just don't even see the sign, the label or the color anymore, so that alone is not enough. Labeling is important when you're learning the system, when someone new is in there and when an emergency is taking place, so we have to do labeling, but we can't rely on labeling alone. Really, human factors improvements are more around things like an interlock that could not be bypassed and that would've prevented the opening of that valve while there was pressure on the other side of it.

Design For Process Safety

Hardware design features that prevent the error from taking place; they physically stop the error. Effectively, in this instance, I think from memory, they could connect or disconnect an air hose to swing the valve manually. That was how they bypassed it. Preventing that sort of bypass would've helped, if the interlock said, "I've got pressure on the other side of me as a valve. I can't physically open right now." So we do need to make sure we engineer appropriate hardware so that we can't get it wrong, we can't make that mistake, because the system physically won't let us make it. That does sound like it could be expensive, and in some instances it might be, but we don't need to do everything to that same degree. We need to understand which key controls or barriers in our facility are the ones actually preventing catastrophic outcomes, and this was certainly a catastrophic outcome.
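To make that idea concrete, here is a minimal sketch, in Python, of the kind of non-bypassable pressure permissive Trish describes. It is purely illustrative: the class name, threshold and values are hypothetical and are not taken from the Formosa facility or the CSB report.

```python
# Illustrative sketch of a pressure permissive on a reactor drain valve.
# All names and values here are hypothetical, not from the CSB report.

PRESSURE_PERMISSIVE_KPA = 5.0  # assumed threshold: above this, treat the reactor as not empty


class ReactorDrainInterlock:
    """Refuses to open the bottom drain valve while the reactor is pressurized.

    Deliberately exposes no bypass method, mirroring the point that the
    permissive should be enforced by the system, not by procedure.
    """

    def __init__(self, read_pressure_kpa):
        # read_pressure_kpa: callable returning the current reactor pressure
        self._read_pressure_kpa = read_pressure_kpa
        self._valve_open = False

    def request_open(self) -> bool:
        """Open the drain valve only if the pressure permissive is satisfied."""
        pressure = self._read_pressure_kpa()
        if pressure > PRESSURE_PERMISSIVE_KPA:
            # Permissive not met: the reactor still holds material under pressure.
            print(f"Open request denied: reactor pressure {pressure:.1f} kPa "
                  f"exceeds permissive of {PRESSURE_PERMISSIVE_KPA} kPa.")
            return False
        self._valve_open = True
        print("Drain valve opened: reactor confirmed depressurized.")
        return True


# Usage sketch: the operators at Illiopolis were at a full, pressurized reactor,
# so logic like this would simply have refused the open request.
interlock = ReactorDrainInterlock(read_pressure_kpa=lambda: 750.0)
interlock.request_open()  # denied: the simulated reactor is still pressurized
```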

What are the things we need to do, to maintain, to have in place? They're the things we need good human factors around so that people can't accidentally get them wrong. I like to describe human factors as engineering the system so it's easier for people to accidentally get it right. If they're going to slip up, you want them to accidentally do the right thing, not accidentally do the wrong thing. There's a story about a university that, I think, laid out its grounds but didn't concrete the pathways in until they saw where all the students walked. Then they concreted where the paths naturally formed for the students, making it easy to accidentally get it right. If you try to force someone to walk in a certain area, but that's out of their way and it's going to be a long way around, they'll take a shortcut.

Is that shortcut safe? If the shortcut can be made safe, make it safer and let them use it rather than forcing the long way around, because we'll always try to find the simplest, most efficient way to do what we need to do. And that's, to me, what human factors is all about.

Execute Evacuation Training

Traci: Great way to describe all of that. When I was further reading about this incident, the CSB determined that if all personnel had evacuated at the onset of the release, no lives would've been lost. But a few of the operators really tried desperately to stop the release. They likely thought they were going to save more lives but sadly lost their own. How do you train for these situations?

Trish: Yeah, I think that's one of the biggest tragedies. In a desperate effort to try and correct the situation, they stayed too long. And we do see this happen. We see similar things in confined-space entry fatalities. One person's in a confined space and goes down for some reason, and the person watching jumps in to rescue them and then subsequently goes down as well. So, in confined spaces, we often see multiple fatalities for that very reason. We have a desire to help out and a desire to fix what's wrong, but we actually need to train that out of ourselves, because we need to make sure that we remain safe in any response. If you think about the standard first aid response we talk about, the D, the very first thing you do, is check for danger. You do not go and help the patient if you will be in danger by helping them, because then you're just going to have two patients, and someone else is going to have to come and help.

You need to train and practice and get that concept through to people: getting out alive is the more heroic option. Going in and potentially saving the day, or potentially getting killed, is not the heroic option. When our emergency responders, our fire brigade, our paramedics turn up to a site or a situation, they don't rush in. They stand back, they look, and they check for danger, because their priority is their own safety, and we need it to be. Their second priority is our safety, and their third priority will be assets and the environment. We need to practice and do emergency drills and make sure people understand why they need to get out when they need to get out. It is one of those challenging ones because you always want to fix the situation. You want it to get better. And particularly if you've made a mistake, you want to rectify that, but not at the cost of your life or the lives of your fellow workers.

Traci: And that's the thing with the psychological safety of it, too. If you make a mistake, it's okay to make a mistake and get out of there with your life rather than trying to fix it because you're afraid of the consequences.

Trish: Absolutely. It's always easier to deal with the situation after the event than to get caught up in it while the situation becomes a lot worse.

Communication Bolsters Safety

Traci: Let's dial back a little bit. You talked about the lack of communication the operators had. Can you talk about how that happens and what can be done to improve communications?

Trish: As I said, this was quite an old facility. They didn't even have some of the most basic communication tools that we would expect, and this is not the only incident where this has occurred. In the investigation, it was discovered that there were no two-way radios for operators to communicate with each other from remote areas of the plant. People had to physically see each other to communicate because there was no other way of interacting. That becomes an issue if you are physically separated in your normal day-to-day work. How do you confirm that something is working or not working if you can't actually talk to someone about it? One of the key critical things is to make sure you've got the right infrastructure. And two-way radios are not sophisticated pieces of equipment. We're not talking about something brand new that no one's ever done before.

These are fundamental pieces of equipment; children can get walkie-talkies at the toy store. They're not difficult to obtain. In a chemical plant or a plant with flammable substances, they do need to be intrinsically safe, obviously, but again, they're readily available. So it's around making sure you give your teams the right tools to communicate. How do you expect them to communicate if they can't physically see each other and they're remote from each other, particularly if there's a task that requires two people doing something from separate locations? You've got to make sure that they can communicate with each other. Just because you have radios doesn't mean that you are communicating well, though. That's why it's really important, when we bring people into our organizations, to actually train them on how to use two-way radios and what the correct protocol is, because communications over them can sometimes be garbled or unclear.

You might respond over the radio with, "Yep, it's all good." Well, what does "all good" mean? Does it mean you've done the task? Does it mean you've checked it? Does it mean nothing needs to be done? We need to train people to be very clear about their intent when communicating so that there's little room for misinterpretation of the message. It might be, "I've checked valve 372, and it is closed fully," as opposed to, "I've checked, and it's all good." It's about making sure we send the right message, that we communicate adequately, and that people understand when the message is done. If you look at the military, they have very strict radio protocols for how they communicate. We are less focused on that, I think, in the process industries, to our detriment. I think we should be more focused on teaching people proper radio protocols so that we get clear, concise, accurate messaging.
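As a small illustration of that point, here is a sketch, in Python, of how a status report can be structured so the intent is unambiguous, using the transcript's own valve example. The field names are hypothetical, not an industry standard.

```python
# Illustrative sketch: a structured status report leaves no room for "all good"-style ambiguity.
from dataclasses import dataclass


@dataclass
class StatusReport:
    equipment_tag: str   # e.g. "valve 372"
    action_taken: str    # e.g. "checked"
    state: str           # e.g. "closed fully"

    def as_message(self) -> str:
        # Spell out the equipment, the action and the resulting state.
        return f"I've {self.action_taken} {self.equipment_tag}, and it is {self.state}."


# Compare "It's all good" with an explicit report:
print(StatusReport("valve 372", "checked", "closed fully").as_message())
# -> I've checked valve 372, and it is closed fully.
```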

Traci: That is something that you never really consider. Is there a place to get that kind of training, or is it just something that you sit down and create your own documents for within your own facility?

Trish: Well, as I said, the military does this quite well, so there must be established protocols. We have a phonetic alphabet we can use as well to ensure there's no miscommunication in spelling or activity.

Traci: Right.

Take Safety Recommendations Seriously

Trish: So all these things do exist. It's about rolling them out. I would suggest that there are probably areas where existing training is available, and it's around sourcing what works for you. I'm not suggesting that we need to be as rigorous as military or police radio control necessarily, but certainly, we need to be a lot closer to it than we are today.

Traci: Are there any other lessons learned from this incident that we can apply?

Trish: I think the key ones are really around the human factors of designing appropriately so that it's easier to get it right, making sure we can communicate effectively, and training and training and training in emergency response so that if something goes wrong, our action is second nature. We kick into an almost automated response of what we need to do.

I think they're probably the key big items. The other thing that is important to note about this incident, though, and what the CSB did find, is that historically there were recommendations to improve the interlock system and the equipment at points in the past, and they had been dismissed or not acted on. This is also a very common thing we find in the cold light of day after an incident: chances are there was some audit or inspection at some point in the past that recommended we fix whatever piece of equipment or hardware it was that caused this incident, and we didn't do it. We need to make sure that when we have these audits and inspections, we seriously consider the implications and the actions that are recommended. Why was it recommended? What could we do to make sure that we fulfill that recommendation?

If we can't fulfill it in the way it's recommended, is there something else we can do to reduce or manage that risk? Just because an auditor has recommended something doesn't mean it's automatically right; that's not what I'm saying. But what I am saying is that if someone's recommended something to you and you're not going to do it, you really need to think long and hard about your justification for why you're not going to do it, particularly if, down the track, it's found that you chose to ignore a recommendation or didn't do something and people died as a result.

I certainly know, personally, that's not a situation I would ever want to be in, reflecting back on that. Understand the recommendations of your audits and inspections and make sure you take appropriate action on them, not just, "Oh, we can't afford it, so we're not doing it." That's not a justification. What you need to understand is what you can afford. And if you really can't afford to do it safely, maybe that's not the business for you. Maybe you shouldn't be doing that activity if you can't afford to do it as it needs to be done.

Traci: Trish, thank you for always helping us consider our implications and actions to keep ourselves, our workers and our communities safe. Unfortunate events happen all over the world, and we will be here to discuss and learn from them. Subscribe to this free podcast so you can stay on top of best practices. You can also visit us at chemicalprocessing.com for more tools and resources aimed at helping you run efficient and safe facilities. On behalf of Trish, I'm Traci, and this is Process Safety with Trish and Traci. Thanks, Trish.

Trish: Stay safe.

 

About the Author

Traci Purdum | Editor-in-Chief

Traci Purdum, an award-winning business journalist with extensive experience covering manufacturing and management issues, is a graduate of the Kent State University School of Journalism and Mass Communication, Kent, Ohio, and an alumnus of the Wharton Seminar for Business Journalists, Wharton School of Business, University of Pennsylvania, Philadelphia.
