
Learn from a Safety Guru

Oct. 23, 2014
Take advantage of the insights in the books of a renowned expert

The first anniversary of the passing of Trevor Kletz, who died on October 31, 2013, provides a good opportunity to re-emphasize how much we still can learn from him. Frequently called the father of process safety (see: “Trevor Kletz Bequeaths Better Process Safety”), he wrote many books on the topic; some sit on a shelf in my study. Kletz directed some of his attention to the managerial malaise that explains why past accidents recur. One of my favorite quotes seems to strike at the heart of the problem: “If you think safety is expensive, try an accident.” So, let’s explore some of his comments on process safety. Pardon my paraphrasing.

Let’s begin with his thoughts on design and construction. Changing procedures to improve safety is far less effective than changing the design, which is why design matters so much. Don’t use parts that are interchangeable but not identical, for example, piping with dissimilar ANSI ratings for a compressor inlet and discharge. Make construction easy to get right: use the same rating throughout, or make the parts impossible to substitute for one another. On p. 307 of the 4th edition of “What Went Wrong?,” Kletz suggests a useful mnemonic for selecting materials of construction: SHAMROCK, where S stands for safety; H for history (go with what you know); A for availability (spare parts); M for maintenance/maintainability (check cost savings carefully); R for reparability (training, experience and time to repair); O for oxidizing/reducing nature of the fluid; C for cost (lifetime costs); and K for kinetics of the corrosion mechanism, i.e., understanding of corrosion.

Make a job simple or the people doing it will find shortcuts. Keep isolation as close as possible to the equipment being blocked off. Make maintenance easy by retaining line of sight, e.g., have local instrument readouts. Whenever possible, use instruments that measure parameters directly. Inferred values may not be reliable, an issue that plagues pH meters and some flow meters. A magnetic flow meter is a perfect example: if deposits partially plug the flow tube, the velocity through the restriction rises and, because the meter assumes a full bore, the inferred flow rate reads too high (see the brief sketch below). The same case for simplicity applies to complex controls. Instead of closing valve A and then valve B, why not close them together?

With downsizing and consolidation, too many companies rely on contractors, often hired on low-bid lump-sum contracts, to provide the engineering knowledge of the people they fired. Kletz notes this problem has been going on for over a hundred years; it wouldn’t surprise me if the pharaohs chose the low bidder to build the pyramids! Poor construction management has ruined many good designs, but stellar construction practices have improved lots of mediocre ones.
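
To put the plugged flow meter in numbers, here is a rough sketch in my own notation, not Kletz’s: the meter measures fluid velocity and multiplies it by the nominal bore area, so when deposits shrink the open area, the reading overstates the true flow in proportion:

\[
Q_{\text{inferred}} = v \, A_{\text{nominal}}, \qquad v = \frac{Q_{\text{actual}}}{A_{\text{open}}}
\quad\Longrightarrow\quad
Q_{\text{inferred}} = Q_{\text{actual}} \, \frac{A_{\text{nominal}}}{A_{\text{open}}} > Q_{\text{actual}}
\quad \text{whenever } A_{\text{open}} < A_{\text{nominal}}.
\]

A tube plugged to 80% of its original cross-section would therefore read roughly 25% high even though nothing upstream has changed, which is exactly why Kletz preferred direct measurements.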

Now, let’s consider day-to-day operations. Never start up a system after an accident or serious near-miss, when equipment could be damaged, without a thorough explanation of the cause of the incident and an inspection. Many companies inspect their process pipe but ignore utilities, bypassed process pipe and, especially, hoses and expansion joints. During commissioning, test any process steps or equipment, such as emergency equipment, that must be operated quickly, and retest periodically after that; ideally, walk through the unit with operators before finalizing the design and again after purchasing. Have a backup plan for when such equipment is out of commission and test that plan, too, at least by walking it down. Inspection errors and failures to periodically test emergency equipment and similar safeguards have been reported in incidents going back hundreds of years and, unfortunately, remain common.

Communication problems afflict both design and operations. Be redundant, until it hurts. On p. 105 of the 4th edition of “What Went Wrong?,” Kletz relates how 4,600 calves died because a Dutch company ordered a chemical from a U.K. supplier by number alone; in the U.K., that number belonged to a poison. The Dutch firm should have ordered the chemical by name, number and description. Redundancy also works in communication itself: e-mail and call, and keep calling. Another repeated root cause he identifies is role confusion. This contributed to the explosions at Esso’s Longford, Australia, gas plant in 1998 and BP’s Texas City, Texas, refinery in 2005, as well as to other accidents dating back decades, perhaps centuries. In addition, Kletz warns of the dangers of siloing information. This is particularly true when the engineer who programs the distributed control system isn’t the one who uses it; he suggests the same person should fill both roles. Lastly, he cautions that a good record on minor lost-time accidents doesn’t mean you’re safe from a major catastrophic process accident. How prophetic. Perhaps by reading some of his books you can help us avoid repeating history.

About the Author

Dirk Willard | Contributing Editor

DIRK WILLARD is a former ASBPE award-winning columnist for Chemical Processing's Field Notes column. During his 10+ years as a contributing editor for CP, he wrote hundreds of valuable and insightful pieces on design and operational issues. He retired in 2023. 
