Podcast: Tips To Manage Cyber Risks

Oct. 10, 2022
Part one of a three-part series aimed at helping you develop a sound cybersecurity plan focuses on assessments and training.

Welcome to the Solutions Spotlight edition of our Chemical Processing Distilled Podcast. Solutions Spotlight -- delving deeper into a topic from an industry perspective.

In this episode Matt Malone – ICS/OT cybersecurity consultant at Yokogawa -- offers insight into sound cybersecurity measures.

After graduating from Texas A&M, Matt embarked on his eight-year career in the U.S. Navy as a salvage diver and bomb disposal technician. He completed two combat deployments and returned home to Texas to begin a career with National Oilwell Varco, a major manufacturer in the oil and gas industry. Matt moved to Honeywell as an account manager and later became a project manager. During his time as a project manager, he caught the cybersecurity bug and went on to work for a startup company to focus solely on ICS cybersecurity. He has since devoted his career to learning everything possible about how to secure industrial control systems against cyberattacks. Matt joined Yokogawa in April of 2020, earned his GICSP certification and master’s degree soon after, and is passionate about securing customer ICS networks.

Matt and I will be chatting for three podcast episodes on cybersecurity and this is the first in that series.

Transcript

Q: Now, let's get our heads around the risks here. What is at stake for chemical processing facilities in terms of cybersecurity?

A: Well, I don't really like leading off on this question, but that's fine. No, sure. I mean, let's go ahead and get it out of the way, right? We've got to talk about the doom and the gloom when it comes to cybersecurity, and not just in chemical processing plants, but any type of automated manufacturing process where you have caustic materials under high heat and high pressure. And the biggest threat, the biggest risk, is going to be injury and loss of life, followed by property damage. And that's just your first-order effects, okay? Many of my clients comprise the critical infrastructure of North America. So when we start looking at second- and third-order effects, then we have to look at things like supply chain disruption, which we've seen some examples of in the last few years. So like I said, I don't necessarily like talking too much about the doom-and-gloom side of it, but we still have to identify those risks and what could actually happen in a certain type of situation.

Q: Well, I agree with you. The doom and gloom, you don't really want to lead with that, but folks need to know what's at stake here and what they need to combat. And there are a lot of big things at stake, as you mentioned. And so I think it's smart that we're, A, doing this type of podcast series to talk about it, and B, getting our arms around it and understanding where we need to begin. And I know a lot of things can happen, but human factors have a huge impact in terms of cybersecurity issues, and not maliciously so, but accidentally so.

A: Right.

Q: Can we talk a little bit about that?

A: Sure. I'm probably going to get nailed on this one, but I want to say it was CSO Online who had some sort of figure, I think somewhere in the upper 60% range, of accidental intrusions being because of an insider clicking on a phishing link that would bypass the firewall or download some type of EXE file, or something to that effect. And so the human factor is going to be one of the things that we have to work on in the future in terms of reducing the risk on that side. And the good news is that there are already some fantastic plans and procedures in place when it comes to just the initial training part of it: "This is what a phishing email looks like."

Or you have some folks, maybe on the IT enterprise side, who'll conduct fake phishing exercises. I don't know if you've seen any of these yet; they're pretty creative. They will set up a fake email account that mimics a phishing email, send it out to people within the organization, and the employees that are unlucky enough to click on these links find out, "Oh, by the way, you've got a month to finish cyber awareness training," things like that. So there are some really creative ways to go about reducing the risk on the employee side.

Q: And that brings up the... I'm so paranoid now. I'm paranoid of clicking on anything. So how do you get around that? The zero trust, I guess, is what I'm after.

A: Well, sure. And it's going to be with us for a while, unfortunately. There's always going to be some type of weak link in the chain that is going to be easily exploited. And looking at this from the malicious actor side, the hacker side, why wouldn't I want to do that? If I'm looking at the 80/20 rule, the 20% of effort that I put forward is going to bring me back 80% of my return. So why wouldn't I want to maximize that effort? Why would I want to learn how to program a PLC and have to get through a company's DMZ to attack their DCS when all I have to do is send an operator an email? He's going to be shortsighted enough to click on this link by accident, or on purpose not knowing any better. And now I've completely bypassed everything, and I'm sitting there looking at an HIS, right? For the DCS. And now I have free rein to go anywhere I want.

So one of the ways to fight that is the internal training, some of the posters on the wall reminding people to be suspicious of emails, especially if they're coming from outside the organization. Some of the receiving filters need to be tuned up. And then part of that training is more than just, "Hey, beware of suspicious links." If you're getting an email from somebody who maybe even has your company's domain within the email address, but you don't recognize the sender, you might want to take that opportunity to start looking: "Okay. This guy's got a yokogawa.com email address. But all right, are the Os really Os, or is one of them a zero?" So watching for tricks like that is also helpful.
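
As a rough illustration of the lookalike-domain check Matt describes, here is a minimal Python sketch. The yokogawa.com domain comes from his example; the homoglyph map and function names are invented for illustration and are not part of any real mail-filtering product.

```python
# Minimal sketch of the lookalike-domain check described above.
# The homoglyph map and function names are illustrative assumptions,
# not part of any real mail-filtering product.

HOMOGLYPHS = {"0": "o", "1": "l", "rn": "m"}  # common character swaps

def normalize(domain: str) -> str:
    """Lower-case the domain and fold common lookalike characters."""
    d = domain.lower()
    for fake, real in HOMOGLYPHS.items():
        d = d.replace(fake, real)
    return d

def looks_spoofed(sender: str, expected_domain: str = "yokogawa.com") -> bool:
    """Flag senders whose domain only *looks* like the expected one."""
    domain = sender.rsplit("@", 1)[-1]
    if domain == expected_domain:
        return False                        # exact match: not a lookalike
    return normalize(domain) == normalize(expected_domain)

print(looks_spoofed("alice@yok0gawa.com"))  # True  -> zero standing in for an 'o'
print(looks_spoofed("bob@yokogawa.com"))    # False -> the real domain
```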

Q: Those are great tips. And talking about some steps to initiate a cybersecurity program, what are some other steps that folks can take?

A: I deal with this a lot: folks who are in the position of developing an OT cybersecurity program from the ground up. I've seen the IT specialist or IT engineer who has now been, kind of, knighted: "Now you're in charge of all of OT security as well." And so I have the opportunity to tell them and train them about the differences between the OT and the IT side: "This is what you need to focus on." And for the person in that position, I always give them the advice of building a coalition. In some instances they may be put in a position of responsibility with no authority. So you have to build a team. You're not going to be a general who can spout orders and have everything happen. You have to build a team.

And one of the best ways to do that is to make it cross-functional. So like I said, if you're this IT engineer and now all of a sudden you're placed in a position of having to create an OT cybersecurity program, you've got to reach out to your process control engineers and your process control managers. They're going to be intimately familiar with that DCS or that SCADA system, whatever it may be. I know there are some purists out there when it comes to the definitions of these things, so I don't want to turn anybody off with my nomenclature, but reach out to those folks and ask for their help.

Like I said, don't be demanding about it, but use some of those interpersonal skills and go to them and say, "Hey, the plant manager just said I've got to do this. My expertise is SAP, email, ERP, things like that. I'd really appreciate it if y'all could help me with this." And once you build that cross-functional team, then you can start outlining and developing the skeleton of that program. The first thing you want to go to is probably a specific standard. ISA 62443 is probably the gold standard when it comes to cybersecurity for process automation. I always recommend that unless you're in some sort of industry vertical where you have to go off of a specific one.

So the folks in power gen, for BES, they're going to have to go through NERC CIP, especially if you're in the States. But for anybody else that doesn't fall into one of those special circumstances, go by 62443. And that's probably going to need some budget, because just buying a copy of that standard is not going to be cheap. So that would be the first thing to say for somebody developing that program: "Hey boss, I'm willing to do this. I want to do this, but you've got to give me the tools. I've got to get this standard. This is how much it's going to cost. I need an ISA membership in order to get access to this type of stuff." And then you can use that as your roadmap to start building out and building the wings, so to speak, from the framework that you're developing.

Q: You bring up a good point, the cost. You did mention the cost and going to your bosses, but sometimes it's not that cut and dried or easy. How can you make a good case to put manpower and dollars behind a cybersecurity program?

A: You've got to start with a risk assessment. And there are two different flavors out there right now. The more common one is going to be the qualitative assessment, and you can do it internally or externally. On the OT side, you want to make sure it's some sort of passive assessment, especially if you're going to be looking at data. I've seen some types of assessments where people are just looking at some sort of health-monitoring status check, which will just give you, "Okay, these are all my machines, these are my OS revisions, these are my application revisions," for everything that you're running, and kind of go from there. But when you do something that looks at some type of PCAP analysis, that's really going to help you with asset inventory and identification, especially for legacy plants.

I mean, I don't think many folks out there are operating plants that are younger than 15 or 20 years. So for some of these older plants that may have changed hands between companies, there's a good chance something may have been lost on the hardware side. So doing some type of PCAP data analysis can help you reconstitute something that might have been lost along the way. Then look at it as objectively as you can and say, "Okay, all these OSs, they're past end of life, past end of support. We've got all these vulnerabilities lining up with this." And then do the same thing with the applications that you're running.
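
For readers who want a concrete picture of what that kind of passive PCAP analysis can look like, here is a minimal sketch using the open-source Scapy library. The capture file name plant_capture.pcap is a hypothetical placeholder and the port-to-protocol map is illustrative, not exhaustive; a real assessment tool would go much deeper than this.

```python
# Rough sketch of pulling a passive asset inventory out of a packet capture.
# Assumes the scapy library is installed and that "plant_capture.pcap" (a
# hypothetical file name) was recorded from a SPAN/mirror port -- purely
# passive, nothing is ever sent to the control network.
from collections import defaultdict
from scapy.all import rdpcap, IP, TCP

packets = rdpcap("plant_capture.pcap")

# source IP -> set of TCP ports that host answers from; a device answering
# from port 502, for instance, is most likely speaking Modbus/TCP.
assets = defaultdict(set)
for pkt in packets:
    if pkt.haslayer(IP) and pkt.haslayer(TCP):
        assets[pkt[IP].src].add(pkt[TCP].sport)

# Ports commonly associated with ICS protocols (illustrative, not exhaustive).
ICS_PORTS = {502: "Modbus/TCP", 44818: "EtherNet/IP", 102: "S7comm/ISO-TSAP"}

for ip, ports in sorted(assets.items()):
    hits = [name for port, name in ICS_PORTS.items() if port in ports]
    label = ", ".join(hits) if hits else "no well-known ICS port seen"
    print(f"{ip}: {len(ports)} TCP source ports observed ({label})")
```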

And then the same thing even for the hardware side, especially for your OEM providers in the automation space. God bless us, we have the outlook to develop hardware for very long shelf lives, because we know the life of these plants is not going to be five years, right? Some of these guys are going to try to stretch it out to 50 or more, but that doesn't mean we're going to be able to foresee every change coming down the pipe in 10 or 20 years. And so by that force alone, we're going to have to develop better PLCs and things like that. And as that happens, we're going to have to take some older ones off of support.

And that represents another vulnerability. All of these hardware pieces down to level one, maybe even level zero if you're looking at the Purdue model, are going to be old, past end of life and past end of support. They're going to have vulnerabilities that may not be able to be overcome. So once that's done, we can go back and we can say, "Okay, these are all my vulnerabilities. These are the risks that could happen based off of that. This is where I want to start mapping an attack tree. If I put myself back into the shoes of that hacker, then I could say, 'All right, I can exploit this vulnerability and I can exploit that vulnerability, and that's what's going to give me access to maybe an engineering workstation, God forbid.'"
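
As a toy illustration of the attack-tree mapping Matt mentions, here is a short Python sketch. The steps, and the chain from a phishing email all the way to an engineering workstation, are invented for illustration only and don't describe any real plant or product.

```python
# A toy attack tree: each step maps to the follow-on steps it enables.
# All nodes and edges here are invented examples.
attack_tree = {
    "phishing email to operator":        ["foothold on business network"],
    "foothold on business network":      ["exploit unpatched OS on jump host"],
    "exploit unpatched OS on jump host": ["access engineering workstation"],
    "access engineering workstation":    [],   # the goal node in this sketch
}

def chains(tree, node, trail=()):
    """Yield every chain of steps reachable from `node`."""
    trail = trail + (node,)
    followers = tree.get(node, [])
    if not followers:
        yield trail
    for nxt in followers:
        yield from chains(tree, nxt, trail)

for chain in chains(attack_tree, "phishing email to operator"):
    print(" -> ".join(chain))
```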

And then the other flavor of assessment that's out there is the quantitative assessment. These represent two really different price ranges, and I think that's really a matter of the intellectual property that's involved with each. PCAP data analysis has been around a while, but if you look at some of the folks that are doing quantitative analysis, especially in the OT space, that's relatively new. And so that IP is going to be more guarded in how they go and develop those risk assessments. And those risk assessments are going to tell you, in some cases, "Yes, this plant has..." Let's just say, for example, Plant X has a 75% chance of being attacked with ransomware. All right.

Within that attack, there is a 57% chance that the attack is successful. If that attack is successful, then here's the standard loss equivalency, which is going to factor in the plant's daily revenue, production rates, the cost of replacing critical infrastructure within that plant. All of that's going to be tied up within the intellectual property of whoever's doing this quantitative assessment. And they're going to be able to tell you very distinctly and very objectively how much is at risk monetarily due to your current cybersecurity posture.

And if they're really good, they can probably give you a return-on-investment analysis as well, saying, "Okay. Well, if you updated the OS for this workstation, that's going to cost you approximately $50,000, but it's going to reduce your cyber risk by $2 million." And then you can start to prioritize the things on your cybersecurity roadmap. And that's really what's going to make it a roadmap. Identifying all of your vulnerabilities and your risks is just part of it, but prioritizing what we should do, and in what order, is going to be another challenge in its own right.
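
To make that arithmetic concrete, here is a back-of-the-envelope sketch using the figures from Matt's hypothetical Plant X example (75% chance of attack, 57% chance of success, a $50,000 OS update that removes roughly $2 million of risk). The loss-if-successful figure is an invented placeholder, and a real quantitative assessment would model all of this far more carefully.

```python
# Back-of-the-envelope version of the quantitative numbers discussed above.
# The probabilities come from the hypothetical Plant X example; the loss and
# mitigation figures are illustrative placeholders, not real assessment data.

p_attack = 0.75                  # chance the plant is hit with ransomware
p_success = 0.57                 # chance the attack succeeds if attempted
loss_if_successful = 10_000_000  # stand-in for the standard loss equivalency:
                                 # downtime, lost production, replacement costs

expected_loss = p_attack * p_success * loss_if_successful
print(f"Expected loss at current posture: ${expected_loss:,.0f}")

# ROI of a single mitigation, using the figures from the conversation.
mitigation_cost = 50_000         # e.g., updating the workstation OS
risk_reduction = 2_000_000       # estimated drop in expected loss
roi = (risk_reduction - mitigation_cost) / mitigation_cost
print(f"Return on the OS update: about {roi:.0f}x the money spent")
```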

Q: I guess I did not realize that it could get as granular as you're speaking to, and that's fairly impressive to have that type of information. Have you gleaned any great lessons learned from past risk assessments, and do folks not take it seriously sometimes?

A: I don't know. If someone's just been giving me lip service, telling me what I want to hear, I certainly haven't been any the wiser. To be honest with you, I wouldn't want to know if they weren't taking it seriously and telling me otherwise. No. So far, the folks that I've dealt with, they do understand the nature of the beast. And some of these folks, bless their hearts, I've had so many conversations, Traci, where it'll be this process control manager, process control engineer, chemical engineer, and they'll say something to the effect of, "I'm so glad we're finally having this meeting. I have been complaining about this, carrying on about this, screaming, hopping up on my desk about this for the last five years, and we're finally doing something about it."

And so those are the conversations I love. Those are the times when I can really knuckle down with the client, and we can get down to business really quick. And there's no explaining or helping adjust the expectation level of the client at that point. Usually that person has done their homework; they know what to expect and things like that. It's the person who is more or less, "All right. I was kind of thrust into this position. Yes, I understand it's important, but I've got other priorities going on right now because I'm also wearing five other hats. I've got to do process safety along with the engineering side of it, and now I've got to do cybersecurity too. So what can we do?"

And it's not going to be short, and it's not going to be a quick fix. This is not a break-fix model when we're dealing with cybersecurity. This is a risk management model that we need to think about. This is the kind of thing that is going to be on our shoulders just like safety and HSE. This is one of the things I also say a lot: I think the OT cybersecurity space is where HSE was probably about 25 years ago. I grew up in Odessa, Texas. My dad roughnecked for a little bit. All the kids who lived on my street, my friends growing up, their dads were roughnecks.

And yeah, I knew grownups that were missing fingers, that were missing hands. I'm not that old, I was born in the eighties, but this was still one of those things that happened back then, before there was this radical change. And then I went from that to when I got out of the Navy and worked for a company called NOV doing sales, chasing oil rigs. And I remember seeing a roughneck getting written up because he had on the wrong type of gloves. Not that he didn't have gloves, but he had on the wrong type for a certain type of job he was doing.

I'm not sure what it was now, looking back on it. And I just remember thinking in the back of my mind, "All right, so-and-so's dad down the street didn't have a finger because he was slinging chain or doing whatever else, and now this guy's getting written up for not even having the right PPE. This is a really dramatic change." And so I think that we're at a similar watershed moment with cybersecurity in the OT space, where the industry has really woken up and is saying, "Right now, this is serious. We've got to do something about this."

Traci: Excellent points, and an excellent illustration of that point, too. Matt, I want to thank you for laying the groundwork for us in terms of organizing cybersecurity initiatives. In our next episode, we're going to talk about the tools necessary to execute these initiatives. On behalf of Matt and the team at Yokogawa, I'm Traci Purdum, and this is Solutions Spotlight.