This podcast was first published by Oliver Wyman.
Healthcare organizations and those who lead them must think carefully about the efficacy of their security programs to assess how attractive they could be to hackers.
So says Jason Crabtree, Chief Executive Officer of QOMPLX, who joins the Oliver Wyman Health Podcast to talk about cybersecurity with Paul Mee, Partner and Head of Oliver Wyman’s Cyber Platform.
Consumers want the confidence to know the healthcare they receive is high quality, but also that it's timely and won't be manipulated or used against them, says Jason. Someone probably wouldn't tell their doctor their deepest secrets, for example, if they knew the information would be blasted across the Internet. In time, says Paul, consumer awareness of what's happening to their medical data, the cumulative effect of breaches, and so on will increase. Industry leaders must recognize that more compliance and more standards don't necessarily create more security.
Episode Highlights
- "You shouldn't get mad at a security team for having a breach. You should be mad if when you have the breach, they don't have DNS logs stored, information about their active directory environment, end-point tools installed, visibility on their external security posture, and they had RDP facing the internet. I mean, these kinds of things are unforgivable, right?"
- "The value of healthcare records is far higher these days than financial services records. And they also have a more meaningful impact on me as an individual. I don't necessarily want information about my very personal health being out there. And it's a valuable commodity in the dark web space."
- "People in cybersecurity often give terrible advice, right? They say things like, 'Well, patch everything well.' Okay. First of all, it's literally not how it works for an FDA-certified medical device. There are other constituents and certifications. We have to have a more rational conversation."
Transcript:
Jason Crabtree:
We are now in a very digital world, and the quality of care that I get now is highly linked to the availability of the information systems that contain information about me and information about what the medical staff servicing me or my family should do. And so there's just no way that's not true anymore.
Jacqueline DiChiara:
That's Jason Crabtree, CEO of QOMPLX. Jason's here to talk about why healthcare leaders must prioritize risk management and cybersecurity efforts and we're exploring some top strategies. The Oliver Wyman Health Podcast is brought to you by the global management consulting firm, Oliver Wyman. For more, visit our online healthcare publication, Oliver Wyman Health at health.oliverwyman.com, and follow us over on Twitter @owhealtheditor. I'm editor Jacqueline DiChiara. Enjoy the show.
Paul Mee:
Hello and welcome. My name's Paul Mee, and I have the pleasure today of being with Jason Crabtree to talk about cyber risk management in the healthcare sector.
Jason Crabtree:
Paul, it's great to be here and really looking forward to speaking with you.
Paul Mee:
I think we live in interesting times. As people are returning to work, we've got different ways of working together and sharing information. And if I look at some of the amazing innovations we're seeing across the healthcare sector, we've also got some challenges as regards how data gets used, and how this connectivity and seamlessness of data is both exciting but also, I think, changes the attack dynamics and the threats that are out there. So I'd really appreciate your views. I know you live and breathe this stuff in terms of what we should be looking for next.
Jason Crabtree:
Oh, thanks, Paul. It's great to be here. Organizations are facing a more and more active threat environment, and I think one of the challenges everybody's realizing is that it's not just that there are really active adversaries out there, but also that there are more and more tools that expose vulnerabilities in a way that makes them highly visible to those attackers. You see this in the ransomware scourge and other areas, where it's become so quick and easy to profile the kinds of vulnerable targets out there that look like big juicy meals. And it turns out that means organizations have to care not only about the efficacy of their security program, but also about how attractive a target they are.
Paul Mee:
This idea of bad actor motivation is coming to the fore, because as you and I have discussed in the past, the value of healthcare records is far higher these days than financial services records. They also have a more meaningful impact on me as an individual. I don't necessarily want information about my very personal health being out there, and it's a valuable commodity in the dark web space. So I'm kind of interested in your perspective. What do we do about this? Because we're going to see the prices for this kind of information rise and also the motivation for bad actors to get this very sensitive information, whether it's in the research space like we saw with vaccines or whether it's down to individual patients. This is a rich goldmine for bad actors.
Jason Crabtree:
No, it's absolutely the case. QOMPLX recently acquired a partner of ours, Hyperion Gray, that has one of the largest databases of breached records in the world. And one of the things we've seen from that work, looking at historical breach records covering everything from passwords to medical record data sets, is that, as you noted, medical information provides so much detail that it supports bypassing a lot of the knowledge-based identity verification that used to be so popular. Where did you go to school? What did you eat for breakfast? All these fun questions that we've embedded in a lot of historical legacy applications and services. And that's part of why you see the price differentiation between healthcare records and financial services records.
And one of the challenges for defenders, and something healthcare organizations have to pay particular attention to, is that because they're sitting on this vast treasure trove of information, and there are a lot of different counterparties that need to access it for different purposes, we have to go back to the tried-and-true CIA triad. It's not just about the confidentiality of this data, which is certainly an aspect of it and historically what most of the breach discussions have focused on. It's also about integrity: do you trust the information that's in my health record? Are you sure it hasn't been modified and isn't missing something? Something missing from it, like an allergy, can have real consequences. And then, and we've seen this again with ransomware, is it available? Is it available when I need it if I'm a medical provider, so that I'm actually able to get the care that I need as a patient?
Paul Mee:
And I think as we've moved more and more seamless technology across the healthcare system, where we're not only sharing patient records but also have more and more devices informing that view of patients and how they're being looked after, protecting and designing cyber defenses in right down to the device level is going to be more and more of an imperative. And I'm not sure the industry is where it needs to be as regards getting that defense into the devices. I've seen examples where we have pacemakers that can be hacked. We have scanning devices that can be hacked. Parts that are in the human body are more vulnerable than they should be. And I guess that's worrying to me: we're running really fast, top speed, when it comes to innovation, but maybe just not giving enough due care and attention to the vulnerabilities and potential cyber risks that are out there.
Jason Crabtree:
Oh, absolutely the case. And I think there are a couple of key differences when you start thinking about cybersecurity in health, especially on the provider side or on the connected medical device side, when you're looking at embedded devices that are part of your body. One of the challenges is that people who are in sort of normal cybersecurity often give terrible advice. They say things like, "Well, patch everything." Well, okay, first of all, that's literally not how it works for an FDA-certified medical device. There are other constituents and parts and certifications. So we have to have a more rational conversation that says, hey, there are different aspects of patient care, but medical devices that are embedded in a human need a different kind of security model and update model than medical devices like diagnostics. They're subject to some of the same kinds of challenges, and both can lead to bad outcomes in patient care if they're manipulated or compromised, but the way you handle a security risk for a diagnostic device, where you're concerned about data integrity or confidentiality or availability, is different than something where the concern is the performance impact on a pacemaker. I think we've got to start to have a really, really detailed discussion around this. In some of the work that we or our partners do with both government and commercial healthcare providers, folks like Defense Health and others, where we're actively supporting these kinds of kinetic medical device and diagnostic capabilities, it turns out that a lot of the vulnerability management programs that are pushed on the IT side of security are just not appropriate. They can't just be lifted and shifted into the kinetic medical device market.
Paul Mee:
I think, to your point earlier on motivation, this is becoming the potential gold rush for bad actors. From my own experience, we saw a 40% increase in hacking incidents, with well over 30 million patient records compromised during COVID. We've seen fake contact tracing, where people are being spoofed and think they're giving important information when actually they're just feeding the bad actors. We saw hospital and patient data being put up for sale on the dark web just this past March, literally going to the highest bidder paying in Bitcoin. We have devices that are now sending information to China about facilities. So I guess, are there just too many doors that are open here, or does the industry just need to play catch-up more aggressively than it is right now?
Jason Crabtree:
Yeah, I think some of this is understanding that security typically is as bad as it can afford to be. And you're starting to see meaningful financial and reputational impacts from things like ransomware. You can look at the September 2020 UHS ransomware attack; I think we were looking at something that was widely reported as a $67 million direct loss. But a lot of people now associate UHS with this significant ransomware attack. That's not positive from a consumer perspective or from a brand perspective.
So I think as organizations grapple with real-world impacts, it is driving home that there are real-world costs and consequences. One of the reasons why it's really important for healthcare to become a leader in many cases in cybersecurity, back to why these medical records have such tremendous value, is ultimately that consumers want to have confidence that the healthcare they're going to receive is quality, that it's timely, and that it's not going to be used or manipulated against them. You're not going to tell your doctor your deepest secrets that you need help with if you think that it's going to be blasted across the internet.
And I do believe that at some point consumers are going to become more aware of this, and more aware of the cumulative effect of these breaches, where your medical record can be combined with a historical breach of a credit ratings agency or some of these other things. The totality of that builds in terms of the susceptibility individuals or businesses have to other people being able to commit fraud against them or do other things that harm them economically, quite apart from any of the privacy or confidentiality considerations.
Paul Mee:
And then where do you think that's going to go? Because to my mind, that prompts almost the need for accreditation, in the same way that when I go to a restaurant, I know that it's been approved by a local inspector and the food's not going to give me food poisoning. Do you think we're going to get to a position whereby when I use a clinic, when I use a service, when I actually use a doctor or a surgeon, there'll be an expectation that this person has some degree of qualification and attestations that they are caring for my information?
Jason Crabtree:
Yeah, I mean, I think one of the key things here is that it's got to be about more than claiming standards or audits. Cybersecurity's so dynamic, and you can see this in the kinds of massive breaches that are out there. Look at all the post-exploitation activity that happened in SolarWinds, right? After you gain the initial entry, it's all about taking over active directory and expanding throughout the enterprise to gain domain dominance, and then either taking out information, creating fake users, or broadcasting ransomware. DarkSide at Colonial Pipeline was another active directory-centric ransomware attack.
So we're seeing those kinds of elements, where you have to really understand what the critical links in the attack chain are. We've got to focus on what matters. And I think one of the challenges with this is that more compliance and more standards don't actually create security. But compliance is a really useful tool, because there are a lot of businesses that frankly won't invest in these capabilities if they're not required to, and they need a standard to comply against.
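One concrete way to watch for the "creating fake users" step Jason mentions is to correlate account-creation events with additions to privileged groups. The sketch below is purely illustrative: it assumes events have already been normalized into simple records with event_id, member, and group fields (real Windows Security exports use event IDs 4720 for account creation and 4728/4732/4756 for group additions, with different field names), so treat the data shape as hypothetical.

```python
# Minimal sketch: flag newly created accounts that land in privileged groups.
# Assumes events are pre-normalized dicts; adjust field names to your pipeline.
ACCOUNT_CREATED = 4720                  # Windows: a user account was created
GROUP_ADDED = {4728, 4732, 4756}        # Windows: member added to a security group
PRIVILEGED = {"Domain Admins", "Enterprise Admins", "Administrators"}

def suspicious_new_admins(events):
    created = {e["member"] for e in events if e["event_id"] == ACCOUNT_CREATED}
    return [
        e for e in events
        if e["event_id"] in GROUP_ADDED
        and e.get("group") in PRIVILEGED
        and e["member"] in created
    ]

sample = [
    {"event_id": 4720, "member": "svc-update"},
    {"event_id": 4732, "member": "svc-update", "group": "Administrators"},
]
for e in suspicious_new_admins(sample):
    print(f"Review: new account {e['member']} added to {e['group']}")
```

A rule like this doesn't prove compromise on its own, but it surfaces exactly the domain dominance behavior described above for a human to review.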
Paul Mee:
More smoke alarms don't stop fires; they just mean you'll detect them. And you raise a good point: isn't active directory effectively the gateway into your entire organization if you don't get it right? Because active directory is the spine of the connectivity across your organization, if you're not protecting it, then you're basically leaving the doors open for the bad actors to walk around your organization and potentially do harm.
Jason Crabtree:
Yeah, so I think one of the challenges is that it's really easy to put trite statements out there like, "Well, we're going to implement zero trust." What the hell is zero trust? And it turns out that as you dig into it, zero trust really means 100% trust in your identity provider.
Paul Mee:
Right.
Jason Crabtree:
Well, what's the identity provider for on-premise environments? For most organizations in the world it's active directory, and in the cloud there's a whole bunch of different SAML-based providers. What does that mean? It means that the Kerberos protocol, which is what's supposed to be used in your on-premise active directory environment, and the SAML protocol, which is what's mainly used in the cloud, are actually the core of whether or not you can trust that an authentication event is really true. Is someone who they claim to be?
And the funny part about this, in a lot of ways, is that all the role-based access and all the other access control schemes that we have, even things like two-factor and multifactor authentication, all rely on those authentication services, which are fundamentally Kerberos- or SAML-based, not being a lie.
Paul Mee:
Right.
Jason Crabtree:
Turns out, look at SolarWinds as an example. Why were the Russians controlling the Department of Justice or the Department of the Treasury? It's because they could exploit protocols like Kerberos and SAML. They could spoof traffic and basically forge tickets for active directory Kerberos, forge SAML tokens, and create their own accesses. And once they had those administrative rights, they could do lots of other bad things in the environment, whether as a ransomware actor or as someone conducting espionage to get information on people or on entities. And I think this is the challenge for folks in risk: really great risk managers always go back and ask, what are the fundamental things that must be true for the rest of my security or risk management program to hold? And it turns out that in enterprise security, authentication is the apex security control, and you have to watch the authentication providers. If authentication is spoofed or manipulated, all the rest of your security tools and appliances and IT controls assume that it hasn't been. So you've got to actually spend the time and attention to make sure that's real.
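To make the forged-ticket idea concrete, here is a minimal, hypothetical sketch of one classic heuristic: a forged ("golden") Kerberos ticket is often minted with a lifetime far beyond what domain policy would ever issue. The ticket records, field names, and 10-hour policy value below are illustrative assumptions, not a description of any particular product.

```python
from datetime import datetime, timedelta

# Assumed domain policy: maximum TGT lifetime of 10 hours.
# Forged tickets are frequently created with much longer lifetimes.
MAX_TGT_LIFETIME = timedelta(hours=10)

def flag_suspicious_tickets(tickets):
    """Return tickets whose lifetime exceeds the assumed domain maximum.

    `tickets` is a hypothetical list of dicts with ISO-8601 'start' and 'end'
    fields plus a 'client' name, e.g. parsed from your own ticket telemetry."""
    suspicious = []
    for t in tickets:
        start = datetime.fromisoformat(t["start"])
        end = datetime.fromisoformat(t["end"])
        if end - start > MAX_TGT_LIFETIME:
            suspicious.append(t)
    return suspicious

sample = [
    {"client": "jdoe", "start": "2021-06-01T08:00:00", "end": "2021-06-01T18:00:00"},
    # A multi-year lifetime is a strong hint the ticket was forged offline.
    {"client": "svc-backup", "start": "2021-06-01T08:00:00", "end": "2031-05-30T08:00:00"},
]
for t in flag_suspicious_tickets(sample):
    print(f"Suspicious ticket for {t['client']}: lifetime exceeds domain policy")
```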
Paul Mee:
No, I totally agree. Because otherwise you're giving away the keys to the kingdom and allowing people to wander around.
Jason Crabtree:
Right.
Paul Mee:
Cause whatever trouble they want, and give themselves back doors and entry points to come back later. And I've seen this in other situations where, in certain circumstances, they're testing your abilities because they're going to come back later. They want to see how you respond. This is a multi-move game. And if they can understand, "Okay, we did this thing as a smokescreen or diversion. How did they respond? Then next time we can be smarter and better informed and have different ways in." And I think you raise a really good point: once somebody has access, and active directory is the spine that lets them move, you're just creating a theme park that the bad actors can learn from.
Jason Crabtree:
Well, and I think one of the challenges for folks is that everybody often wants to fixate on the initial entry point. Was this a phishing attack? Was this an insider? Was this malware that was downloaded after somebody went to some malicious watering hole site? Or whatever it's going to be. But it turns out that while all those are different ingress points, the lateral movement and privilege escalation stages are all the same.
Paul Mee:
Right.
Jason Crabtree:
They're all exploitation of fundamental authentication protocols, gaining administrative rights, and then using that kind of domain dominance on domain-joined active directory networks, which is, again, the majority of corporates. And most people in the corporate environment have peered their cloud environment to their active directory environment. When they do that through something like Active Directory Federation Services, which is really typical, and you saw this again in the SolarWinds post-exploitation stages, now you can compromise the cloud too.
And so it's so critical for folks to really understand what must be true. AD and identity are at the center, but you actually have to go beyond log management and do real-time validation of the protocols themselves, because those protocols are what's called stateless. That means effectively multiple domain controllers in active directory, or multiple SAML providers, can all issue or authenticate tickets and tokens. And on very large networks we have clients where, in their active directory environment, we're validating 14 to 170,000 identity transactions per second, on a single corporate network spanning dozens of countries around the world.
And it turns out that kind of capability means an organization is now able to have confidence that logon events are real ones. You can't do it with just Windows event logs. You can't do it with just logs off of domain controllers. You actually have to look at the Kerberos traffic, at another protocol called DRS, and at the SAML token exchanges. It's super important and something we spend a lot of time getting right with enterprises. The real advantage of doing that is that now you can actually do things like behavioral analysis. But if someone's telling you they're doing behavioral analysis and they don't know that the authentication events are real, then they're analyzing behavior that can easily be misattributed to the wrong entity. So it's just a garbage in, garbage out problem. And that's a problem as old as time.
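Because the protocols are stateless, the defender has to rebuild the state: record every issuance actually observed on the wire, and flag authentications that present a ticket or token no issuer was ever seen creating. The toy model below is a simplified sketch of that idea only; real implementations parse live Kerberos and SAML traffic rather than neat in-memory records, and the class and field names here are hypothetical.

```python
class AuthLedger:
    """Toy stateful validator: track observed ticket issuance and flag
    authentication events that reference a ticket we never saw issued,
    one symptom of a forged ticket or token."""

    def __init__(self):
        self.issued = set()  # ticket identifiers observed at issuance time

    def record_issuance(self, ticket_id: str) -> None:
        # e.g. observed in issuance traffic from a domain controller or IdP
        self.issued.add(ticket_id)

    def validate_use(self, ticket_id: str, principal: str) -> bool:
        # e.g. observed when a ticket or token is presented to a service
        if ticket_id not in self.issued:
            print(f"ALERT: {principal} presented ticket {ticket_id} "
                  "that no issuer was observed creating")
            return False
        return True

ledger = AuthLedger()
ledger.record_issuance("tgt-001")
ledger.validate_use("tgt-001", "jdoe")        # fine: issuance was observed
ledger.validate_use("tgt-999", "svc-admin")   # flagged: never observed being issued
```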
Paul Mee:
And the industry already has enough of a challenge dealing with false positives without there being even more noise in the system. I'm interested, and I know you live and breathe this stuff: in layperson's terms, if you were to meet somebody at a cocktail party and say, "This is what you need to do about it," what are the three or four things that you'd say organizations need to get right in this regard?
Jason Crabtree:
I think the main thing, Paul, is you've got to actually map your dependencies. You really need to know what you depend on and what happens if it goes away. You really need to make sure that you understand the fundamental assumptions you're making. Is authentication at the core? It is, so how do you make sure you can trust the identities that you're basing all the rest of your security decisions on? And you've got to really make sure you understand things like asset management. Asset management, though, isn't just about IT assets; it's about how those assets across the business ultimately relate back to the business processes they support. How do they impact patients? How do they impact billing? How do they impact all the practical things that have to happen every day for the organization to serve its customers and meet all of its own obligations?
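One way to picture the dependency mapping described here is a simple map from business processes to the systems they rely on, so that when an asset degrades you can immediately name the patient-facing processes at risk. The process and asset names below are illustrative placeholders, not a prescribed inventory.

```python
# Hypothetical map from business processes to the IT assets they depend on.
DEPENDENCIES = {
    "patient scheduling": ["active directory", "ehr-frontend", "dns"],
    "billing":            ["active directory", "billing-db", "payment-gateway"],
    "diagnostic imaging": ["pacs-server", "active directory"],
}

def impacted_processes(failed_asset: str):
    """Return the business processes that depend on a failed or degraded asset."""
    return [proc for proc, assets in DEPENDENCIES.items() if failed_asset in assets]

# Example: losing active directory doesn't just break "IT", it breaks care delivery.
print(impacted_processes("active directory"))
# ['patient scheduling', 'billing', 'diagnostic imaging']
```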
Paul Mee:
For the board and executives who may not go as technical as you've just described, you're being asked to demonstrate that you are less vulnerable. And I would argue that by doing the kinds of things you've just described, you're effectively buying down the risk. You're reducing the probability of this happening to you. You're reducing the impact even if somebody does get in. And I think without those technical fundamentals, you're either playing chicken with the bad actors or you don't truly understand the risk that you face.
And I think this idea that by making the kind of investments you just described buys down cyber risk is an important concept for the board and senior management to get, because otherwise you don't understand what it is. Imagine that you start off without these kind of things and I've got 1000 units of risk. By implementing the kind of controls and kind of capabilities you describe, I start to buy that down. I'm never going to be risk free. But imagine I'm in a position whereby I go from 1000 units of risk to 600 units of risk because I understand how I'm going to protect and defend the perimeter. And importantly on the other side of it, if something does go wrong, I know what my right response mechanisms are going to be.
Jason Crabtree:
Well, I think that's right, Paul. And a lot of what we've seen is that after an organization gets owned by one of these kinds of issues, or a close competitor or a partner or a friend does, they end up spending a lot more time and attention on it. One of the challenges for people, though, is understanding the difference between frequency and severity, and QOMPLX does work not just in cybersecurity but also in quantitative modeling of insurance risks.
Paul Mee:
Right.
Jason Crabtree:
And one of the reasons why we think that's really, really important is that it's given us some insight into working with clients and communicating about the financial ramifications of these things. One of the things we like to talk about when it comes to things like active directory is what we would call domain dominance attacks, where someone becomes your administrator. Obviously you'd like a foreign entity, whether that foreign entity is a ransomware group or a nation state who wants access to health records or whatever it might be, to stay out of your network. But if you really look at what that means from a financial risk and brand risk perspective, these are tail risk events. These are the big things. I won't call them black swans, because they're actually predictable. These are not unexpected events, but they are nonetheless impactful; they're in the tail.
And so part of why we talk to organizations about getting out of the prevention mindset and into the detection and response mindset is that you really want to think about how you elegantly degrade during a cyber event. How do you think about minimizing impacts to patient care? And by the way, patient care includes things like billing. It includes practical things that relate to running the business so that you can conduct economic activity. And I think for a lot of organizations, when they start to look at things like the tail value at risk and what that looks like over time, getting visibility ensures that when you have to do an incident response, you actually have the data needed for someone to come in and hunt in your environment to help you expel a bad actor.
If you don't even collect some kinds of information, it would be like trying to run your hospital without any of the diagnostic tools that you're supposed to have. It doesn't mean that you can't have talented surgeons or doctors show up, but if they don't have the tools, or they don't have any of the imaging data that they're supposed to have, they're literally going to do the equivalent of exploratory surgery. And I don't think that's what you want inside your organization. So we really encourage people to think about visibility. Am I collecting the raw information? Am I putting it somewhere? They don't need to jump to automation and all the new hotness about SOAR or something else. That's just making bad decisions faster if you don't have the fundamental visibility and process and control that you need in place first.
Paul Mee:
When I started in this industry, a black swan was really a once-in-a-lifetime event. This isn't once in a lifetime.
Jason Crabtree:
No, it's not.
Paul Mee:
Worse than that, it can be a once-in-a-career event, where your career's over if these things hit. So I think you're absolutely right. And also, getting a flesh wound is not a good way to learn, and your neighbor getting attacked is not a good prompt to finally start paying attention. So I think there's a real need for the industry to be more proactive, to look into these risks, and to prepare accordingly. You shouldn't be shaking hands on the battlefield when these things happen. This should be part of your muscle memory and part of management's ability to navigate these situations. And once you trip, you should be able to stand up in a sophisticated way, dust yourself down, and move forward rather than, as we saw with WannaCry, clinics and hospitals still being on pads and pencils three or four months after the event.
Jason Crabtree:
Yeah, well, I think one of the challenges, and you see this in a couple of areas, is that you shouldn't get mad at a security team for having a breach. You should be mad if, when you have the breach, they don't have DNS logs stored, they don't have information about their active directory environment, they don't have other types of perimeter logs, they didn't have endpoint tools installed, they didn't have visibility on their external security posture, and they had RDP facing the internet. I mean, these kinds of things are unforgivable. And they're unforgivable in the sense that we're not even talking about best practices yet. And I think one of the things healthcare can really benefit from is that there are actually established practices to talk about, and in fact really highlight, errors in judgment or errors in process. We have this in the quality assurance council kinds of governance models that most hospitals and healthcare providers already have.
And I think this idea of taking that HQUACK model, which is well known, including to boards in the healthcare industry, and thinking about how you extend it to IT and operational security is important. We've jokingly called it HioQUACK, which is fun to say. But building on the HQUACK function that's already a core competency is, I think, really important, because I think we're past the point where someone can credibly say that a ransomware event that's impacting the availability of health records, when someone's in the middle of a surgery or someone's denied the ability to get care, doesn't have an impact on the quality of patient care.
And I think we've got to be honest with ourselves that one of the reasons why HQUACKs were really successful in reducing negative impacts on patient outcomes from human and process deficiencies was that they forced transparency of reporting. And to your point, these things are not rare, but there are lots of organizations that do some version of, "Well, it wasn't really a breach, because we had no evidence that the data left our environment." Well, someone was in your environment, and you know you don't have any logs to see what went outbound. Having no evidence because you never bought a microscope doesn't mean you don't have germs on your hands. Not looking is not an excuse for saying it's not happening to me.
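The "RDP facing the internet" basic mentioned at the start of this exchange is also one of the easiest to check. The sketch below is a crude, hypothetical exposure test against addresses you own and are authorized to probe; the example IPs are documentation addresses, and a real attack-surface program would go far beyond a single port probe.

```python
import socket

def rdp_reachable(host: str, port: int = 3389, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the RDP port succeeds.

    Only run this against systems you own or are authorized to test; it is a
    crude external-exposure check, not a substitute for proper attack-surface
    management."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical list of an organization's own internet-facing addresses.
for host in ["203.0.113.10", "203.0.113.11"]:
    if rdp_reachable(host):
        print(f"{host}: RDP (3389) is reachable from the internet -- investigate")
```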
Paul Mee:
No. And in my very simple mind, I count doing these kinds of things as basically opening the aperture of duty of care. You're making sure you're in a position whereby duty of care isn't just in that immediate interaction with the patient; it's in the environment in which you're conducting the work that you do as a medical professional.
Jason Crabtree:
I think about the fact that we are now in a very digital world, even compared to a decade ago. Fundamentally, I like to think of risk as a consequence of dependence, and the quality of the care that I get now is highly linked to the availability of the information systems that contain information about me and about what the medical staff servicing me or my family should do. And so there's just no way that's not true anymore. Now that we're more dependent, we have to think about investing differently than we did historically. I think we're to the point where we've got to have those kinds of honest conversations and make sure we're resourcing it appropriately. But this does come down to a duty of care. And duty of care requires transparency about this kind of information with patients, it requires it with regulators, it requires it with administrators and board members, and we've got to have those kinds of conversations.
But I do think in health in particular, in part because of those differences in medical devices and certifications around diagnostic information and all these other things, this isn't the "just patch everything faster" nonsense that is a sideshow distraction from the real issues, which are: do we have visibility? Do we have controls, not just policies? And do we have transparency in reporting and accountability? That is, when we make a mistake, which we will, or when a bad actor does something to us despite good-faith efforts that may even be sufficient against our duty of care, we have an incident in our organization. Are we collectively communicating all those things in a way that allows other people to learn, and allows our own organization to learn, about how we're moving forward?
And I think folks have to be talking to vendors, to insurers, to others that talk that way. If someone tells you they're going to stop everything at the gate, it's a good sign they either don't know what they're talking about or they're just not an honest broker. And I think we've got to start having those kinds of honest conversations.
Paul Mee:
I totally agree. It's like saying, "I'm going to work in this building but I won't ever do a fire drill because, well, the fire might not affect me." If you don't drill for fires, have fire marshals, and understand what you'd do, then you put yourself in jeopardy. And to my mind, that's the analogy here: let's make sure we know what to do when the worst happens, how we'll respond, how we'll communicate, and importantly how we'll coordinate, not just among ourselves but with all the other people we're dependent upon, whether they're suppliers or other actors in our own ecosystem. Because without that, it's like being in the center of a wildfire without a map. You're going to get hurt.
Jason Crabtree:
Yeah. Well, and this is ultimately about shifting it from being an IT issue to an operations issue. These are operational problems. They impact our ability to put people and resources in the right place at the right time to affect an outcome. And when boards and organizations talk about it that way, and they hold operators accountable, we'll be successful. When the IT teams are a stand-in because senior executives don't want to be bothered to learn, then we're not going to be successful, because it's not going to get inspected. And I think it's really important that, just like in zero trust, where you need to trust but verify your identity provider, when it comes to cybersecurity you have to actually go inspect it. You have to actually go run those drills. You have to actually look at it, and you can't just scope your way out of it by saying, "Well, we had a pen test. We also said they're not allowed to touch active directory, which is a common scope exclusion, and they're not allowed to do all these other things."
So organizations that are trying to secure themselves need to think, and let the folks they work with help them gauge, their own security the way an attacker would. If you look at yourself like an attacker looks at you, and you really look at your posture and how you actually appear to an adversary, you're going to have a much more realistic program. If you're looking for a get-out-of-jail-free card with a slip that says we're A-okay? No security program is really A-okay, and good programs look worse before they get better. But you've got to inject the red onto your health chart so you can actually start to deal with the challenges and have those conversations about duty of care. You can't just say everything's green and we're going to make it greener. That's not a solution for success.
Paul Mee:
It's almost like we need the concept of a human firewall, though that's relatively blunt, rather than just thinking that the CSO's going to fix everything when bad stuff starts to happen.
Jason Crabtree:
I think you're correct, Paul. And I think the other thing is that healthcare has a lot to offer here, in part because healthcare organizations, in other parts of what they do, practice degrading elegantly and degrading truthfully. This is a group of people, especially on the provider side, that has to think about: what if X isn't available? What will I do then? What if we took that approach for IT, so that it's really about how operators and providers understand the dependence they have? If you really understand your dependence, and you understand what Plan B and Plan C and Plan D are, where you go from your primary plan of action to your alternate to your contingent to your emergency capabilities, that's actually how you should be talking about your cybersecurity program. And that is not technical. The CSO and the CIO are supposed to be doing that translation, to know about things like active directory and to say, "Hey, do you realize that this machine can authenticate with AES but it's downgrading to RC4?" "Well, that's a downgrade. Is that dangerous?" Well, it can be. Is that an indicator of breach? Well, maybe. But a doctor doesn't need to have those things taught to them by the IT provider.
So we've got to have a much more focused discussion that says the operational goal is exceptional patient care, or the operational goal is delivering gas to all Americans on the East Coast, whatever it's going to be. If you're honest about the business outcome and the patient outcome, and you ask what dependencies must hold true for that outcome to occur, I think you have a better discussion around duty of care and you don't need to get dragged into all the weeds. But it does go back to the business unit owners, it goes back to the folks that actually touch the clients, to say, "Hey, you're involved in this. These people support your business outcome. They don't just support giving you computers tomorrow."
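The AES-to-RC4 aside above maps to a real, checkable signal: Windows Security events 4768 and 4769 record the Kerberos ticket encryption type, and 0x17 indicates RC4-HMAC rather than AES. The sketch below assumes those events have been exported as a JSON list with the named fields, and the file path is hypothetical; adapt the parsing to whatever your SIEM actually emits.

```python
import json

RC4_HMAC = "0x17"              # Kerberos ticket encryption type for RC4-HMAC
AES_TYPES = {"0x11", "0x12"}   # AES128 / AES256 ticket encryption types

def rc4_events(path: str):
    """Yield Kerberos ticket events (IDs 4768/4769) that used RC4 encryption.

    Assumes `path` points to Windows Security events exported as a JSON list
    with 'EventID' and 'TicketEncryptionType' fields; adjust to your SIEM's
    actual export format."""
    with open(path) as f:
        for event in json.load(f):
            if event.get("EventID") in (4768, 4769) and \
               event.get("TicketEncryptionType") == RC4_HMAC:
                yield event

if __name__ == "__main__":
    for e in rc4_events("security_events.json"):
        print(f"RC4 ticket for {e.get('TargetUserName', '?')} from "
              f"{e.get('IpAddress', '?')} -- possible downgrade or legacy client")
```

As the conversation notes, an RC4 ticket isn't proof of a breach; it's a prompt for the CSO and CIO to investigate and translate the finding for the business.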
Paul Mee:
I think that's spot on. So I guess, from your vantage point now, what are the two or three things organizations should focus on as they continue on this journey and continue to have an ambition to buy down the associated cyber risk?
Jason Crabtree:
So I think number one is risk is a consequence of dependence. Map your dependencies. Know them well and that's how you create that cyber culture that you were talking about. Number two, if you don't have visibility, that doesn't mean it's not happening.
Paul Mee:
Right.
Jason Crabtree:
And, again, this is an area where healthcare knows a lot. We spend a lot of money on imaging. Just because you can't see it doesn't mean it's not there. So make sure you have visibility. Don't go focus on fancy automation before you have visibility of all the things you want to be able to measure, and make sure that data is available for that hunt operation if you need it. Then the third thing is to think really hard about the fundamental assumptions you're making. Authentication is at the core, especially as organizations move to a zero trust mindset. People are still grappling with what that means, but it really means 100% trust, not zero trust: 100% trust in your identity provider, which all of your IT and security controls rely on. So you really have to understand those fundamental assumptions you're making about your security architecture, and that's where you have to start with visibility.
The fourth thing to think about is, when organizations are trying to build this operational capability, don't let policy exceed controls. Get visibility and do that really well with more manual processes, and identify the parts that rob your people, your most valuable people, of time. Then start to embrace automation and get fancier and faster. We see organizations all the time that don't have basic process in place and are over here trying to implement fancy SOAR stuff, and the only thing they do is make bad decisions faster. It's not that there isn't a lot of value in automation or in those tools, but know where you are in that life cycle. If you don't have visibility, you shouldn't be talking about automagical stuff. You've got to focus on the basics.
And that finally leads to the last one, which is stuff like asset management matters, but a really good asset management program isn't asset management for the IT leader, it's asset management for the business owner who's dependent on that service being operational. And that kind of ties us all the way back around.
So if you work through those five things and think about what fundamentals you're really driving in a program, I think organizations can get a tremendous amount of improvement, and that's improvement against attackers, not against an auditor who's coming in. Compliance and all that stuff? Yeah, you've got to do that. There are important reasons why those standards exist. But frankly, I think organizations can spend way too much time on standards ahead of asking: do we have meaningful operational capability that links to the duty of care with which we're handling and serving our clients?
Having a tome of paperwork that doesn't relate to how you operate? That's the opposite of duty of care, to be honest. I think that's a fig leaf, and I think there are too many organizations that have spent time and money doing that while neglecting some of the operational basics. And we need to get out of that as a community.
Paul Mee:
No, I totally agree because you're describing the basics to get to what I would term operational resiliency whereby you have the ability to still provide care under challenging circumstances because you understand, as you described earlier, what your alternatives are. So this has been great, Jason. It's always a pleasure.
Jason Crabtree:
Always fun to catch up with you.
Jacqueline DiChiara:
Thanks for listening. Follow us on Twitter @owhealtheditor. You're welcome to subscribe so you'll be notified when a new episode goes live. Thanks for listening, and we'll see you next time.