Podcast: Mastering Risk Management in High-Hazard Industries
Welcome to Process Safety with Trish and Traci, the podcast that aims to share insights from past incidents to help avoid future events. I'm Traci Purdum, editor-in-chief of Chemical Processing, and as always, I'm joined by Trish Kerin, the director of the IChemE Safety Centre.
Traci: I'm excited for our conversation today. You've brought along a guest with you, Mark Jarman, who has been working in the risk field for over 35 years, mainly in the petrochemical, upstream, downstream and mining industries. He specializes in risk assessment, process safety management, auditing, development and training, safety cases and due diligence evaluations. You two seem like a well-matched pair. I know you go way back. Tell me a little bit about the relationship.
Trish: Yeah, so we go way back to around about the year 2000 or so, when I was working at a refinery. And the government in Victoria had just introduced their safety case legislation and Mark was helping us work through producing our safety case. And that was where I first came across Mark, and we've crossed paths ever since for the last, what, 23 years?
Mark: Exactly, Trish, and yeah, I first met you when we were doing the South Crude Tank Farm Hazops, et cetera, for the safety case preparation. So yes, it goes back a long way and we've had a lot of involvement since, and we've kept in touch at various points in time.
Traci: Well, thanks for joining us, Mark. I appreciate it.
Mark: It's my pleasure.
Traci: In today's episode, we're going to tap your risk assessment expertise. And right out of the gate, I want to talk a little bit about how risk itself has changed over time and has this impacted risk assessment methodologies?
Mark: I would say that the risk approaches have changed enormously since I first started. I started off as a loss prevention engineer in the late 1970s working with Factory Mutual, which is probably one of the best training grounds in the world for loss prevention engineering. It gave me a really good foundation for understanding risk and especially consequences.
And also the fact that their standards were performance-orientated. And I was probably blessed and cursed at the same time with always inquiring and asking why this is so.
So when we first started doing risk assessments, after I left Factory Mutual I joined Det Norske Veritas, and the only tool really available was the first edition of Lees' Loss Prevention in the Process Industries. So we were basically self-taught and learned a lot about how to do risk assessment from scratch.
And in those days we were doing fault trees and event trees. Computers were only just starting to be available, but they certainly weren't used in the actual risk engineering field, and everything was pretty much black box. That's where I wanted to make a huge difference, and the industry has made a huge difference since then, too. I always railed against and hated black boxes, and I wanted transparency.
And so I left Det Norske Veritas and formed, with two other guys, a company called VRJ Risk Engineers. One of the real thrusts I had there was to develop transparent risk-based assessment methodologies such as Hazops, QRAs, loss prevention, fires, 3D modeling, et cetera. Over the years we developed lots of things, including probably some of the first computer-based technology for recording Hazops, doing all the associated risk assessments and incorporating matrices. And, drawing on my previous experience with Factory Mutual, we were also very keen on importing performance-based approaches into risk assessments, especially into QRAs.
Back in those days, we didn't have bowties as we know them now. We had fault trees and event trees, and they were complicated, difficult to use and very mathematical. The introduction of bowties has made things a lot easier and also very transparent.
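To make the contrast concrete, here is a minimal sketch of the bowtie-style arithmetic that fault trees and event trees used to bury in the maths: an initiating-event frequency multiplied through the failure probabilities of each independent barrier. All the numbers are hypothetical, not taken from the discussion.

```python
# Illustrative bowtie-style frequency calculation; every number here is
# hypothetical, not taken from the discussion.

def outcome_frequency(initiating_freq_per_year, barrier_failure_probs):
    """Top-event frequency: initiating frequency times the probability
    that every independent prevention barrier fails."""
    freq = initiating_freq_per_year
    for p in barrier_failure_probs:
        freq *= p
    return freq

# Hypothetical scenario: an initiating event at 0.1/year with three
# independent barriers (alarm plus operator response, trip, relief).
freq = outcome_frequency(0.1, [0.1, 0.01, 0.01])
print(f"{freq:.0e} per year")  # about 1e-06, i.e. one chance in a million
```

The transparency gain is that each factor maps to a named barrier that everyone in the room can see and challenge, rather than a node buried inside a large fault tree.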
And we've come a long way from just using results based on numerical quantitative approaches such as 10 to the minus 6 and 10 to the minus 7 (one chance in a million or one chance in 10 million), and the much-used and much-abused cost-benefit analysis, through to, I'm pleased to say, the current SFARP approach, which is used right throughout Australia at the moment and is a very good way of addressing whether a risk has been reduced so far as is reasonably practicable.
The previous approach, ALARP, was good at the time, but we've progressed. Under it, it was very easy to write off big risk reductions based on a cost-benefit analysis, which was difficult to argue against. We saw lots of examples where risk reductions were simply not made because of a so-called cost-benefit analysis approach.
As for transparent approaches: if you look at, say, AS1940 back in the mid-nineties and early 2000s, all of the cooling requirements were based on the properties of petrol. If you didn't store petrol, then you had problems. Sometimes it would overestimate the actual cooling water demand, and sometimes it seriously underestimated the cooling water required. That's where transparent methodologies came into play. We developed programs like tank heat, where we could physically model how much heat was being received at different points across the tank surface, and when and how much cooling water to apply. This sort of technology was presented to the authorities, both the fire brigades and the other government authorities, which led to risk-based approaches being adopted in Australian standards like AS1940.
When you have examples of technology being used, it does aid and progress the understanding of risk and consequence modeling, to the benefit not only of the client and the facility, but also of society.
Traci: Trish, what are your thoughts on the progression?
Trish: Yeah, we've certainly seen enormous development over the years as things have changed. And as Mark said in the early days, it was very much black box and I don't think many people really understood what was going on inside those black boxes. And so we were at the mercy of them. I think the transparency that we see, the methodologies that we now use really allow clear discussion of what the risk is that we're trying to manage, what is the consequence, what is that likelihood and what are we going to do about it? And we now have much better tools, I think, and as Mark said, transparent ways to deal with them so that everybody can understand it. And more importantly, we can communicate the risk.
And what I mean by that is there's no point communicating the risk as 10 to the minus seven to a chief executive or a financial director that has to make a decision. That's not going to be helpful. What we need to do is be able to tell the story of the risk. What are we trying to prevent? What is going to go wrong when it goes wrong? And what can we do about it? So the ability to be able to clearly communicate that, I think is really important.
And we've even seen it recently. A couple of years ago, you'd remember, Traci, in the Safety Centre we put out a document called Delta Hazop, which focused on assessing the creeping and cumulative change in facilities. So we are constantly looking at ways we can provide refined tools for people to use. We're currently working on one focused on transient operation risk assessment and how best to do that as well.
Traci: That Delta Hazop podcast that we did was very popular, got a lot of interest. You also wrote an article for us on Delta Hazop, and we did get a lot of traffic from that, very much so. Very poignant.
Let's talk a little bit, you both have touched on it, the technology, the role that technology has played in shaping the evolution of risk assessment. AI, data analytics, modeling techniques. Let's talk a little bit about that.
Mark: Okay. I think the potential there is huge, and I'll speak about it in bits and pieces as I answer the question. There's certainly great potential in preparation for risk studies, but I think it is still very early days. I'm certainly aware of it, but I'd have slight reservations about certain aspects of it, because I still want to see the human input, especially the operators having a major input, with AI as a tool. I would hate to see that change, because that would go against all the knowledge I have. But I can certainly see there being improvements in preparation for Hazops and hazard studies, and also in reviewing the cumulative changes through MOCs and work permit reviews that have occurred.
But I also know there's a lot of information in data loggers within plants which is simply not being used. If you can get access to that information and analyze it, then you have a whole host of other information to determine how the control systems are performing, and when they nearly reach their trip set points as opposed to exceeding them. So there's a whole host of information which can be gathered and used there.
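As a sketch of the kind of analysis Mark describes, you could scan logged values for excursions that approach a trip set point without actually tripping. The tag values, set point and margin below are hypothetical, not from any real plant.

```python
# Count "near-miss" excursions in logged process data: readings that come
# within a margin of the trip set point without exceeding it.
# The readings, set point and margin are hypothetical illustrations.

def near_trip_events(readings, trip_setpoint, margin_fraction=0.05):
    """Return readings within margin_fraction of the trip set point
    (approaching from below) that did not actually exceed it."""
    threshold = trip_setpoint * (1 - margin_fraction)
    return [r for r in readings if threshold <= r < trip_setpoint]

pressure_log = [8.1, 8.4, 9.6, 9.9, 10.2, 9.7, 8.8]  # barg, hypothetical
near_misses = near_trip_events(pressure_log, trip_setpoint=10.0)
print(near_misses)  # readings that nearly reached the 10.0 barg trip
```

A rising count of near-misses over time is exactly the sort of leading indicator of control-system performance that sits unused in plant historians.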
But if I go back a bit, like in Hazop studies, the transformation has been huge from just hand recording things through to computer-based technologies now where everything's transparent. You can actually have all the words agreed on the screen and agreed by everyone around the room. If not, you discuss it until you reach a resolution. And then you can move on to the actual next node or the next guide word.
And so that has been really important. As Trish and I mentioned previously, with early black-box consequence modeling it was really hard to see what was going on, and the only people who really knew were the people who did the work. There's a whole lot associated with, say, 3D modeling, and we certainly did a lot of 3D modeling in the VRJ days, but the CFD modeling for explosions, jet fires and even pool fires, et cetera, is huge. It opens people's eyes when they can physically watch the time progression from the failure of, say, a high-pressure propane or LNG compressor system, through to the consequences of a fire or explosion, and see the damage which can occur. That leads to a lot of understanding by senior management who have had no exposure to that sort of thing. And it's easily explained, as Trish mentioned before.
So the data in plant loggers is rarely used, but it covers a lot of the culture of the way things are being done in plants by operators and by maintenance. Typically I've seen that a lot of maintenance management systems tend to be service-orientated toward the maintenance side, but there's a whole host of rich data in there that can be used by risk analysts if they can get proper access to it. It records everything, and it holds the failure rate data of the plant based on the processes and systems that company uses to maintain its plants.
So when opportunities presented themselves, we were able to harvest that information, and that gave a realistic basis for our modeling that we could justify. Certainly, if the information wasn't available, there's plenty of publicly available failure rate data. But having that cultural picture informed by plant data has always been very good.
So I would say that's the gold standard: if you can get that data, and you can get the culture in place for it to be used in your risk assessments, that is fantastic. Plus, being able to focus attention on the data for each MAE or MI is critical for overall transparency.
So I would say access to AI is going to only advance that actual knowledge that we have and the information that we can have at our fingertips and use for doing better risk assessments.
Trish: Yeah, I'd agree with all of that, Mark. Generative AI has enormous potential for gathering data that, as humans, we don't always have the capacity to bring in and analyze. If we can gather some of that data together, with the AI pulling out what's important to us, we can then move forward with the human review and do something with it. I think that's enormously powerful. It's about the use of the big data that we've got.
Companies have all sorts of information and we need to get better at tapping into it, and some of the AI tools are really quite helpful in doing that. But I do share your concern that we don't want AI making the decisions for us. That requires human intellect, because it's not a simple decision to make. It not only has intellectual aspects, it also has emotional and human aspects as well. Because when we talk about risk in these terms, we are actually talking about people's lives. We're talking about whether people live or die. So we can't just leave that to the AI to do; that's just unconscionable in my mind. We need the ethical aspect of humans being engaged in that conversation.
I know that there are some companies that are doing a lot in terms of how to scrape data and gather it and prepare it and do things like they're talking about, AI driven hazops and things like that. But it's really, again, about gathering of that data so that the assessment can then be done.
Mark: I would like to go back and add a few more things in terms of the practicalities and the detail, Trish. Following on from all that work, we did a lot of reviews at plants like the one you worked at, looking at all the data being used for evaluating the effectiveness of the control systems once they'd been in use, and also at all the MOCs and the permit-to-work systems, et cetera, associated with every MI.
And being able to actually get that data and analyze the effectiveness of the controls and where improvements can be made is really crucial. And if you rely on AI to do that for you and not have the human input, like you said Trish, there are real concerns that you'll miss things.
Trish: Just for some of our listeners who may not be familiar with some of the terms: MI is major incident, and MAE is major accident event. So we're using terminology that's very well known within the world of performance-based risk legislation, but perhaps not so well known within a prescriptive legislation regime.
Mark: Thank you.
Traci: Yes, thanks for that clarification there. Let's talk a little bit about some historical events and incidents and how they've driven change in risk assessment regulations and practices. Anything come to mind?
Mark: That depends on the frame of reference. I go back quite a few years, so I have a fairly big frame of reference. But I also think the podcasts and the information you two put out, Trish and Traci, based on the historical events that have occurred and the lessons learned, are gold. Continue to do that, because it's very important.
But in my time, the key incidents are Flixborough, Bhopal, Exxon Valdez, Piper Alpha, FAZEN, Esso Longford, Texas City, Deepwater Horizon, Fukushima, specific mine tailings dam failures, Buncefield and Coode Island. They have all been super important for our learning and gaining knowledge, and for understanding societal expectations about risk and the acceptability of plants near residential areas. It also highlights the issue of risk creep as residential areas get closer and closer to high-hazard industrial areas.
We've all seen that over the years, and there's probably no better example of it than in the States, in the UK and in Australia. That's where part of the experience I've had acting as an expert witness for various big events in Australia comes in: you really get an appreciation of what the law says is acceptable and not acceptable, based on the actual outcomes.
So what I would say is that these historical incidents are very important; they're gold for learning. And we can get a better understanding of the causal mechanisms, including the politics, which is always interesting to learn. It has always concerned me that we still make similar mistakes, with the same events still happening.
If I look at crude oil storage tank fires, we know all the mechanisms which can lead to them, and we should be able to reduce them so far as is reasonably practicable, but they still happen, across the world. In Australia, we have a very, very strong major hazard facilities regime, and hopefully those sorts of things should not happen. But they probably will, because of the lack of corporate memory: the required documentation that has been prepared gets let go. I'll speak about that a little bit later on.
But all the incidents that have occurred and the information Trish and Traci put out, we are all still learning and we can still learn.
Trish: Yeah, to my mind, one of the key historical events you mentioned, Mark, was actually a turning point for me in my career, and that was the Longford incident. If I look at the impact that it then had: for me it was an enormous awakening to process safety and a realization that we can't have these things happen anymore. Two men, Peter and John, died that day. We can't keep doing this, so we need to be doing things differently. And that did lead to the change in the legislative regime that led Mark's and my paths to cross.
But interestingly, that incident also then led to implementation in other places and then followed on. We've now seen, for example, New Zealand have a safety case regime implemented following their coal mine disaster that they had at Pike River. But originally, the documentation, the legislation that New Zealand picked up was actually the Australian one, which was largely based on what happened after Longford.
So it all sort of follows on. And if you go right back into Flixborough, that's when the legislation started to change in the UK, and then Piper Alpha really kickstarted that change again to the sort of regimes we now have that require certain risk assessment practices to be undertaken when you've got certain types of facilities. And so we have seen enormous change.
One of the things I really like about the performance-based regime is that it is always changing, because the standards are always changing. With a prescriptive regime, you have to wait for the legislation itself to change for something to be required by law. But when you've got a duty to manage a risk so far as is reasonably practicable, what is reasonably practicable changes as standards improve, as the state of knowledge improves, and as new technology develops. So you're constantly trying to improve, to get that risk even lower.
Mark: I would say that what you said is a hundred percent correct, Trish. When Longford happened, I was fortunate enough, or unfortunate, whichever way you want to look at it, to be on the frontline of people who went through having a look at what went wrong. And it's very sobering.
When the safety case regime first started in Victoria, it was the first comprehensive one implemented within Australia, but the regulations hadn't been written. So it was really difficult: we had a two-year timeframe to get the safety cases done, but the regulations weren't written, and there was a lot of angst about how the performance-based stuff was going to be done. From my point of view, though, it was pretty straightforward. We had a lot of interaction with the major hazard facilities section of the Victorian government in shaping thinking, as did other people, and we were very fortunate to be at the forefront of that thinking process, helping to develop those regulations. So it's always changing. We can always learn, we can make improvements, and I just love the performance-based stuff. That says it all.
Trish: It was actually quite a fascinating time. I ended up going to work for a company that produced one of the exemplar safety cases during the development. What that means is, as Mark said, from the time of the incident to the time a completely new legislative regime was implemented was only about 18 months. All of a sudden, industry had to do a complete turnaround in 18 months in how they managed risk. And as Mark said, the first safety cases were being developed before the legislation had been written to say what a safety case looked like in Victoria. So there was a lot of collaboration; it was a fascinating time to be involved with anything that was going on. And whilst it was a very fast timeline, I think it actually worked quite well. We got very high-quality legislation to work to, which I think has improved safety for workers and the community in the state of Victoria, where it was first developed.
Mark: I think it's fair to say, Trish, that compared to other states I've worked in, the Victorian approach, from when it was first developed and contributed to by a lot of people, has stood the test of time.
Traci: We're in the industry, you're in the industry and we are a little more aware of these incidents, but some of these incidents that you're talking about, the public understands and sees these as well. So I want to understand how public perception of risk and safety have influenced the way risk assessments are conducted and communicated. Trish, I'm going to let you start on that one.
Trish: Okay. I think in a lot of ways, the general public doesn't have very much risk education or knowledge, and that makes it very difficult to have a conversation about risk with them because typically people want something just to be, "Well, you can be there, but you have to be absolutely safe." There's no such thing as absolute safety in this world. A meteor could come through and strike my home right now, so you can't even stay in bed and be safe. There is no absolute safety, but people have this perception of you just have to be safe.
The fact is we need to manage the risks that we face so that they are at a tolerable level and we can move on and live with them. But the same people who want absolute safety often also take enormous risks every day without realizing it, because they don't actually understand the controls they've got in place. So they get in their car and they drive down the freeway. Now, let's put that in a little bit of context. If you are driving at 70 miles an hour, you are moving roughly 34 yards every second; that's about 31 meters per second. Think about how fast that is, and then think about the fact that you're doing it in a little tin can hurtling across the surface of the earth, with other little tin cans hurtling at you at the same speed. So the relative speed is double. But we just do that, and we don't think about it.
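Trish flags that she won't get the exact number right; for the record, the conversion works out like this (only standard unit constants assumed, nothing from the episode):

```python
# Convert a highway speed to metres and yards per second, and compute
# the closing speed between two vehicles approaching head-on.

MILE_M = 1609.344   # metres per statute mile
YARD_M = 0.9144     # metres per yard

def mph_to_mps(mph):
    return mph * MILE_M / 3600.0

speed_mps = mph_to_mps(70)
speed_yps = speed_mps / YARD_M
closing_mps = 2 * speed_mps  # two cars approaching at the same speed

print(f"70 mph = {speed_mps:.1f} m/s = {speed_yps:.1f} yd/s")
print(f"closing speed = {closing_mps:.1f} m/s")
```

So 70 mph is about 31 m/s (roughly 34 yards per second), and the head-on closing speed is about 63 m/s.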
So we don't understand risk in society. And I think that is one of the challenges that industry has is that we spend a lot of time understanding risk in our businesses and then we try to talk to communities, but we actually need really good risk education for people so they do understand the risks that they take and that life is not risk-free. It's all about choices and actions that we take. And we need to be very careful about the choices and decisions and actions we take because they have consequences. And we need to do the very best we can and I'm not letting industry off here and saying, "You can't be absolutely safe so you don't have to." That is not my point. My point is we actually need people to understand the risk conversation, and I think that's on industry to influence government, to influence education.
I think it starts at primary schools, it starts in high schools. It starts with children understanding what risk is because that way they'll grow up to be adults that understand what risk is and can have good discussions and make good value-based judgements on the actions or decisions they're going to take.
Mark: I would agree with everything Trish has said there, and I would say there are most definitely issues that need to be addressed. Between the risk levels the public accepts in daily life and those that high-hazard industries have to implement, there's probably two or three orders of magnitude difference in the quantified risk levels.
So like Trish said, the chance of dying in a car accident is probably one chance in a thousand, whereas in numerical terms the levels deemed safe for industry are down to one chance in a million, one chance in 10 million or lower.
So there is a huge perception gap there. But at the same time, as I mentioned before, you gain a lot of knowledge about the law's approach to what's acceptable or not through the Westminster system of government and law that we have here in Australia. For instance, if a major incident occurs and there is a prosecution, it's most likely going to be under common law, and this requires evaluation of causation, foreseeability, preventability and reasonability. Expert witness evaluations provide specific guidance on these, and on the associated liability issues, for risk engineers and management to be aware of. And a legal team can't argue until they have the risk argument available to them.
And there have been some cases where I've gone to the legal team and said, "This is a no-brainer. The client is a hundred percent guilty, for these reasons." And the legal team fashions that into their argument. The outcomes of those judgments influence what society expects.
Some good examples are land use planning criteria. And Trish, you may have to help me here, because I'm not too sure of the latest name, but the New South Wales Department of Planning put out some very valuable guidance in the early eighties, and it is still being used for land use planning and is still exceptionally relevant.
And I mentioned previously about ALARP and SFARP. Thankfully we have gone long past ALARP and the much-abused cost-benefit analysis, which I mentioned before.
However, land use planning laws developed from the 1940s through to the 1990s have not always been favorable to high-hazard industries established 60 to 70 years ago, as residential encroachment has crept closer. But there are also times when straight consequence modeling will give you a great result, first up: go or no-go. There have been plenty of examples I've been aware of, and acted as an expert witness on, where a vulnerable small group sits right at the extremities of a major hazard facility's consequence envelope: a population who do not know how to react, haven't been trained how to react, and could be exposed to a very high-consequence, low-frequency event which, if it occurred, could cause fatalities and serious injuries in that small group. So there are times when the risk argument is good, but there are times when straight consequence modeling determines whether it is a go or a no-go.
That's why expert witness evaluations are prepared. Sensitive population groups, like I mentioned, are really very, very important. Some of the last major projects I worked on used risk approaches and requirements that echo late-1980s and early-1990s thinking. And I have always been concerned about the lack of sustainable corporate memory for risk-related issues, not only in expert witness work but also among clients who manage major hazard facilities, because people leave and it's very hard to retain that corporate memory. And there are always companies out there who see an opportunity and want to take advantage of it. I understand why, but they can't be allowed to expose vulnerable people to high-hazard consequences.
Trish: Yeah, absolutely. One of the big challenges we do see, all over the world, is land use planning. Think about some of the significant incidents that have really demonstrated the issues in land use planning. At Bhopal, the residential encroachment around that facility over the years, the decades it was in place, meant hundreds of thousands of people were living in the exposed area when the incident occurred. Up to 5,000 people were potentially killed in one night while they slept in their beds, and hundreds of thousands were made ill. It was absolutely horrific.
But even fast-forward into this century: the ammonium nitrate explosion at West, Texas. You saw the damage that occurred to the nearby nursing home and the nearby high school. Fortunately there were no children there at the time, and somehow the residents of the nursing home managed to escape injury despite that damage. But these are examples of sensitive populations near higher-risk facilities, where the consequences, if an incident occurs, can well and truly extend outside the facility's boundaries and create some quite significant issues.
Traci: I want to switch a little bit and talk about sustainability, the focus on sustainability and long-term impacts. How does this influence the way risks are assessed and managed?
Mark: I think this is a seriously problematic area, and a lot of it comes down to language, consistency of definitions, and a lack of corporate focus on sustainable management systems. I despair a lot when I see language being used in various international standards where the terminology for risk, hazard and consequence is not correct; it doesn't help maintain a consistent platform of language.
And it comes down to corporate memory, which I mentioned a few minutes ago, or the lack of it, and the vulnerability that creates. For instance, one very large project required not only the evaluation of all major incidents or major accident events and making them safe against international requirements, but also the sustainability documentation required for ongoing use over the facility's remaining lifetime.
That was good and certainly helped for about a five-year period. It worked well until questions were raised as to where risk sits in the hierarchy of an organization, let alone a site. That led to the sustainable risk documentation becoming orphaned and eventually cut off. There are lots of standards, and lots of documentation had been gathered, about how to maintain safety in the specific major incident areas and all the operations and maintenance requirements to keep it there.
We know how to absolutely minimize the potential for large scale fires and explosions and toxic gas releases, but when information gets dismantled, then the corporate memory is lost, and when it gets lost, it is very hard to actually recoup it again. People move on, their documentation's lost. It's been dismantled, and it's hard to repeat it.
Think of it like this. When I was with Factory Mutual, they were very big on maximum foreseeable loss, which is very much like a major accident event, versus material impacts. And the Australian Stock Exchange risk management standard requires you to document and have plans in place for managing every material impact, which is the same thing as a maximum foreseeable loss, an MI or an MAE. They all relate to the same thing but are described in different language.
Language is critical. If you start talking to business people about risk, they will all have different approaches to what risk means, and the language is very different. But risk classically does not sit well under current organizational structures within companies, largely because of language and the challenge of integrating the information in a way that fits the executive paradigm of a corporate structure.
This is the major challenge, I think, in being able to have a clear and continuing focus going forward. The evaluation of risk consequences (people, environment, reputation, business interruption, legal and regulatory impacts) can be summed into an overall value; it can be ranked and it can be managed. But without this approach or something similar, risk reporting invariably finds a home under the Chief Financial Officer's subsection of the Risk and Audit Committee, and that information is then fed up into the corporate, board and executive management levels within organizations.
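As an illustration of the kind of summation Mark describes, here is a minimal sketch in Python. The consequence categories follow his list, but the 1-to-5 scoring scale, the example risks and their scores are invented for illustration only, not taken from any real register:

```python
# Minimal sketch: sum per-category consequence scores into one overall
# value and rank risks by it. Categories follow the list in the text;
# the 1-5 scale and example risks are illustrative assumptions.

CATEGORIES = [
    "people",
    "environment",
    "reputation",
    "business_interruption",
    "legal_regulatory",
]

def overall_consequence(scores: dict) -> int:
    """Sum the per-category scores (each assumed to be rated 1-5)."""
    return sum(scores[c] for c in CATEGORIES)

# Hypothetical risk register entries.
risks = {
    "tank_farm_fire": {"people": 5, "environment": 4, "reputation": 5,
                       "business_interruption": 5, "legal_regulatory": 4},
    "minor_spill":    {"people": 1, "environment": 2, "reputation": 2,
                       "business_interruption": 1, "legal_regulatory": 2},
}

# Rank risks from highest to lowest overall consequence.
ranked = sorted(risks, key=lambda r: overall_consequence(risks[r]),
                reverse=True)
for name in ranked:
    print(name, overall_consequence(risks[name]))
```

In practice each category would likely carry its own weighting and scoring guidance, but the point stands: once consequences are expressed on a common scale, they can be summed, ranked and reported consistently.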
Thus it becomes, and this is a bit controversial I suppose, an auditing tick-box function, and material risk impacts have historically not found their way to board level. In some companies that is changing, but in a lot of companies it is not. And this is a major issue for sustainability, risk assessment and management.
Another major issue I see is the lack of connectivity and management between short-term risk and long-term risk, as per the risk diagram. It is relatively easy to get wins on the high-frequency, low-consequence side of the risk diagram, but connecting the management systems across all sections of the diagram is key and must achieve greater corporate focus for sustainability. That is a fundamental issue that I've seen for a long, long time within companies and corporate structures.
Trish: Yeah, I agree there, Mark. We have this lack of common language and it does present some issues because, as engineers, we keep talking in our risk language and our financial people keep talking in their risk language. We are actually often saying the same thing, but we don't seem to understand each other. And that means that things do get left behind and those material risks or maximum foreseeable loss, major accident events are the sorts of things that we actually need to be aligned on to make sure that corporate auditing does adequately address those.
If we look at the banking Royal Commission that took place in Australia a few years ago, that actually talked about conduct risk as well as compliance risk, and that was an interesting thing to focus on. It really looked at this idea of can you do something versus should you do something. It might be legal, but is it ethical and moral? And so that was an interesting learning to come out of it.
But it also was, I found it quite a fascinating read as I read through talking about operational risk in a banking sector versus financial risk in a banking sector. And the operational risk they were talking about was really the sort of things in processing industries we'd talk about as being major incidents that we need to manage and prevent. So it was interesting to see that even in the financial sectors, they still acknowledge that they have not only financial risk, but they also have operational risk that they need to be aware of and manage.
And it's similar in the processing industries: our operational risk tends to kill people, destroy plants, severely damage company reputations and result in financial risk as well. So there's clear financial risk that companies bear on a day-to-day basis, foreign currency trading, et cetera, but operational risk can carry enormous financial risk for you, too. It has to be managed very effectively, and I think we've got a bit of a way to go there.
We now have a lot more sustainability-type language being used. It's interesting, though: we were actually talking about sustainability 15, 20 years ago. We were talking about corporate social responsibility 15, 20 years ago. We seem to have lost that term and replaced it with another, and we're still trying to get our heads around it. Fundamentally, at the end of the day, we all need to get on board with the United Nations Sustainable Development Goals and try to achieve them. Things like feeding the world, making sure that people have clean water, access to education and healthcare appropriate to what's needed, all of these goals are absolutely critical for us to move together as a species into the future.
And so we all need to come together to work on those and not just keep using the corporate jargon around it, but actually do meaningful things so that we understand the risks of what we're doing as we address the challenges of today.
Mark: I couldn't agree more, Trish. I've been in boardrooms where we've talked about risk, and I've had to hold a discussion with them about what they mean by risk. Sometimes it comes down to basic issues: they put up a risk matrix, and risk matrices are good for certain things but bad in other ways, because they move the actual risk around based on what they think. I've seen that happen on many, many occasions. And what they think often bears no relation to the information within the organization that would tell them what the actual risk is.
Trish: Yep. Agree.
Traci: Trish, do you want to add anything to this conversation before we wrap up here?
Trish: Look, I think it's just, it's incumbent on all of us to make sure that we, one, understand the risks we take in our business and we adequately control and manage those. Fundamentally, that is what this is all about. We work in high-hazard industries. That does not mean they have to be high-risk. The hazard, we can't change unless we are able to eliminate it, which would be the ultimate if we could. But let's face it, if your role is to produce a flammable substance, then you're not going to be able to eliminate the production of a flammable substance if that's your product that you're selling. So whilst elimination is our ultimate goal, if we can't eliminate, just because it's a higher hazard does not mean it has to be high risk. We have to effectively manage it to prevent, and then if prevention fails, mitigate the consequences. Because as I've said a couple of times in this podcast already, people's lives depend on this. If we get this wrong, people die. And we need to get better at that because we can't keep having people die at work.
Mark: And I would also say that as an adjunct to that, Trish, which I agree with everything you said, that there is no such thing as zero risk, but we can actually take a hell of a lot more effective approaches to minimize the potential for those risks being realized.
Trish: Yep, absolutely.
Mark: And be held accountable for them.
Trish: Absolutely.
Traci: Well, thank you both for your dedication to help people mitigate and minimize those risks. And when unfortunate events happen all over the world, we will be here to discuss and learn from them. Subscribe to this free podcast so you can stay on top of best practices. You can also visit us at chemicalprocessing.com for more tools and resources aimed at helping you run efficient and safe facilities. On behalf of Mark and Trish, I'm Traci, and this is Process Safety with Trish and Traci.
Trish: Stay safe.