S2 E11: Reducing the Risk of Internal Threats in Healthcare


Transcript

Jordan Eisner: So welcome to Compliance Pointers. I’m your host, Jordan Eisner, and I’m joined today again by, I think, my most regular guest, Carol Amick.

Hey, Carol. Good to have you back on.

Carol Amick: Hello. Good to be back here.

Jordan Eisner: I guess it’s because you just got all the good content.

So today, I think this will be a relatively short podcast. Who knows? 10, 15 minutes, we’ll see. It depends on how deep we get into some of these topics.

But as a reminder to our listeners, this is Compliance Pointers. This is where we talk about information security, information privacy, and regulatory compliance as it pertains to risk around certain sensitive data sets.

Carol is the director of our healthcare practice group here at CompliancePoint. She has vast experience on the consulting side, but also working inside covered entities, in compliance and risk management in the healthcare industry.

And so we’re going to be talking about PHI. We’re going to be talking about internal security threats, specifically in the healthcare industry, and even more specific within covered entities that could compromise an organization’s protection of PHI.

So let’s dive right in. Carol, there have been some recent examples of breaches that resulted in fines. You and I talked about two that come to mind: one sizable penalty, just under $5 million, and then a smaller one, around a quarter million.

But let’s talk about both of those. Montefiore Medical Center, the first, and then Yakima Valley Memorial Hospital. Give our listeners a little background on these fines, what they were for, what the incident was, and then maybe we’ll talk about how companies can avoid these sorts of issues.

Carol Amick: Thanks, Jordan.

So Montefiore is kind of an interesting scenario. They are a medical center that recently got one of the largest fines in several years from the Office for Civil Rights at the Department of Health and Human Services. They were fined almost $5 million. It’s the largest fine since 2021. And it’s only 12,000 records.

So a lot of times when you look at these really large fines, it’s because there are lots of records. Here, it’s only 12,000 records. So obviously, the Office for Civil Rights felt there was some negligence involved.

The records were accessed by an employee of the medical center who was using that access to facilitate identity theft. And the facility was actually notified by the New York Police Department that their data was on the dark web. That’s obviously not the way you want to find out you have a breach and a problem, because it means you’re already way behind the situation.

The interesting thing about this facility is that this is not the only time it’s happened to them. In 2020, they reported another breach, of 4,000 patient records, by an employee who was using that data to run a billing scam, basically billing insurance companies for fraudulent care.

So obviously, there were some gaps there, and that’s probably why they wound up with a $4.75 million fine, which is fairly significant.

Jordan Eisner: Do you know how much that one cost them? Maybe you don’t have this info top of mind, but the one you mentioned before, with the 4,000 patients, do you know what that fine was?

Carol Amick: There has been no settlement on that one yet. They move slowly.

I mean, with the one we’re talking about, the original breach was reported in May of 2015. So you can see there’s sometimes a long lag with the Office for Civil Rights between the time an issue is reported and when you actually reach a settlement.

The Yakima one was another breach. I don’t know exactly when this one occurred, but they had medical records that were accessible by the security guards. Now, the fine itself is not large, but it’s the concept: there are very few reasons security guards would need much medical record information, possibly just room numbers or something. But obviously, these people were in there looking at more data than that. So there were some problems.

I looked at the Verizon Business Data Breach Investigations Report, which is kind of an industry standard in cybersecurity for breach data. And they pointed out that 35% of healthcare breaches in the last few years have been related to internal threats. We focus a lot on external threats, and you and I have done a lot of podcasts on those, but the internal threat is increasing and something we can’t afford to overlook.

Jordan Eisner: And, you know, when I see different numbers, obviously the $4.75 million is big, right? And anything hurts, anything unexpected hurts, right?

But for a lot of organizations, and I can’t speak with real authority here, just me outside looking in, $240,000 is not crazy.

Carol Amick: You know, it’s not the worst fine you’re going to see, but it’s the tip of the iceberg in your costs, because, of course, there were 419 individuals affected. You had to notify those 419 individuals. So that’s a cost and an expense.

This made national news. It made local news. It then impacts your standing in your community, the trust that your patients, your providers, your doctors, your affiliated facilities have in your organization. And that can have a long-term bad impact on your bottom line.

I mean, depending on the size of the facility, healthcare does not have great margins. So while it doesn’t sound like a huge amount of money, it can actually really impact your ability to do something you were planning, to improve your quality of care, your facilities, your operations, maybe a new product line next year, because you just had to give up some of your excess income, if you even had it, to this problem. So there are a lot of downstream effects of a small fine that you have to think about.

And I think the public relations side sometimes stays behind the scenes. Having been under an investigation, not for a privacy issue but for something else in my prior life, the amount of money we spent dealing with it dwarfed the fine. And at the time, the fine was one of the largest ever issued.

So you can see the problem there.

Jordan Eisner: Yeah, no, I think that’s good context. In some other regulatory areas we cover, maybe the fine is a few million dollars and doesn’t seem like much compared to an organization’s bottom line. But then when you take into account legal fees, the opportunity cost of time, brand reputation, those things start to add up.

So we’re going to get into some strategies organizations can take to prevent these sort of incidents, good monitoring strategies and so on.

But talk a little bit more about something you touched on briefly: there were 419 people, 419 individual patient records, that needed to be notified because of the Yakima Valley Memorial Hospital breach, right? For our listeners, what happens?

There’s a breach, in summary, right? What’s an organization looking at that they’re going to have to do now that PHI has been breached? What onus is on the compliance department and whoever owns that? Obviously, you want to avoid it because you don’t want to breach the information, but what sort of workload does that mean for the compliance folks?

Carol Amick: Well, one of the real challenges is that you have a time limit. Once you report this to the Department of Health and Human Services as a breach, particularly if it’s a significant breach, over 500 people, you have a timeline. So you really have to scramble, to be honest, to figure out exactly how big your breach is once you find it.

So let’s go back to the bigger breach we talked about, with the 12,000 patients. The New York Police Department contacted them in May of 2015. They should have been able to report to the government what was happening by about June or July, given the 60-day clock. It sounds easy, but it’s very hard to figure out everything that was breached and what went wrong. Sometimes at 60 days, as we know from our clients, they’re still trying to figure out what has been breached.
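
To make the clock Carol is describing concrete, here is a minimal sketch of the 60-day arithmetic under the Breach Notification Rule; the discovery date is a hypothetical stand-in, not a date from the case record:

```python
from datetime import date, timedelta

# HIPAA Breach Notification Rule (45 CFR 164.404): individual notice is due
# without unreasonable delay, and no later than 60 calendar days after the
# breach is discovered. The discovery date here is a hypothetical stand-in.
discovery = date(2015, 5, 15)              # e.g., the day law enforcement calls
deadline = discovery + timedelta(days=60)

print(f"Breach discovered:        {discovery}")
print(f"Individual notice due by: {deadline}")   # 2015-07-14
```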

Once you’ve done that, no matter how big the breach, whether it’s one patient or 12,000, you have to notify everybody affected via mail. It’s supposed to be a letter, so you’ve got to put it in the mail.

I got one a couple of weeks ago from a doctor’s office via email. Very nice effort, but that’s not what the regulation says, so they have not complied, and that could come back to cause them problems in a future investigation.

A lot of times, depending on what’s been breached, for example with this billing issue, this identity theft issue, they’re probably also paying for credit monitoring for those 12,000 patients, because what they’re saying is, basically, your identity was used to set up credit cards, et cetera. So you’re going to have to pay for credit monitoring, which is not going to be cheap for 12,000 patients for a couple of years. So it’s a snowball effect.
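
As a back-of-the-envelope sketch of that snowball effect, here is the arithmetic with assumed unit costs; these are illustrative numbers, not figures from the Montefiore case:

```python
# Notification letters plus credit monitoring for every affected patient.
# All unit costs below are assumptions for illustration only.
patients = 12_000
letter_cost = 2.50            # printing + first-class postage per letter (assumed)
monitoring_per_year = 120.00  # per-person credit monitoring (assumed)
years = 2

total = patients * (letter_cost + monitoring_per_year * years)
print(f"Estimated direct cost: ${total:,.0f}")   # about $2.9M, before legal fees
```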

And then of course, you send this in to the Office for Civil Rights, and they send you a letter saying they’re going to investigate you, and there are a lot of expenses there. You can tell how long it takes, too. As I said, this New York thing was 2015. They settled in February of 2024, almost a decade, nine years that they were dealing with this. It takes resources away from everything else you’re doing for nine years. So it’s painful. It’s expensive.

Good lawyers are not cheap, as we all know, and you’re going to have to have one if you get into this situation.

Jordan Eisner: So that’s a good lead into some steps organizations can take to prevent incidents like these. What are policies and procedures you typically recommend?

Carol Amick: Well, a couple of things. You want to make sure that everybody knows what the appropriate use of protected health information is. That needs to be documented and clearly defined for your workforce. What is the appropriate use? What is the inappropriate use?

You want to have a method for people to report anything they’re curious or suspicious about. It’s much better to get this from an internal person and stop it early than to hear about it from the New York Police Department, as I’m sure that hospital would tell you. So you want to have policies and procedures that say that.

And you also want to make it clearly understood by your workforce what will happen if they use PHI in an inappropriate manner. That can include, depending on their role in the facility and their licensing, reporting them to a licensing board and getting their license suspended. I’ve known cases where nurses and doctors have actually had their licenses revoked in certain states because of inappropriate use of PHI. So you want to make sure everybody’s aware of all that, that these are not idle threats.

“I might lose my job.” Well, yeah, and if you lose your job and your license, you’re not going to be able to find another job. So you’ve got to look at that.

So the policies and procedures are good. I’m going to sound like a broken record, because I know Jordan and the people who listen to these podcasts have heard me say this, but this really comes back once again to doing that good enterprise-wide risk assessment, Jordan.

When you’re doing the risk assessment, do not just focus on external risk. And I know in the current environment, as we record this, there’s an organization dealing with a major ransomware attack. We hear about ransomware against healthcare organizations on the news. CompliancePoint, as a lot of you know, is headquartered in the Atlanta area, and Fulton County, where we live, just went through a ransomware situation recently. It’s in the news.

But don’t just focus on those external threats. When you’re doing your risk assessment, you need to start looking at your insider threats, your insider environment. What have you got? Do you have controls to prevent people from doing things inappropriately without collusion? In other words, controls that require a couple of people working together, so that one person can’t just download all the data in this sphere and go set up 12,000 fake identities.

Jordan Eisner: Yep. We talk about the risk assessment a lot. It surprises me how often I talk with organizations, more often business associates, but a lot of covered entities too, that are not regularly doing risk assessments.

Carol Amick: I saw this recently: I went back and looked at the 2023 OCR enforcement actions, and most of those were against covered entities. In 70 percent of those, OCR cited the organization’s failure to do a risk assessment. I think it’s probably even higher in the business associate world. When we talk to them, we consistently see that, or if we do see a risk assessment, it’s sometimes very high level and rudimentary.

But I mean, 70 percent, for a law that’s been in place for going on a couple of decades, is not up to standard. And you can tell from the enforcement actions, and this was called out in the Montefiore $4.75 million fine: no risk assessment. So, once again, they’re saying you didn’t even try to identify the gaps that could have caused this, and therefore you’re more negligent and we want more money.

So you need this. And I know I sound like a broken record on these podcasts, but there we go. 70 percent of us still aren’t getting it right.

Jordan Eisner: I heard recently you got to tell somebody something three times for it to stick. So we’ll just keep doing it. We’re trying to help.

Policies, procedures, understood recommendations around those. Now what about monitoring strategies to spot this?

So I heard a consultant say one time, and I’ve used it a lot because I like it: hackers don’t hack policies, they don’t hack procedures, they get into systems. Not that these were hacking incidents. But how can you monitor where inappropriate access might be happening? Just because you have policies and procedures in place doesn’t mean everybody’s going to follow them. And then, of course, you’re going to have bad actors, too.

So what are some good strategies for monitoring inappropriate access to PHI?

Carol Amick: There are a couple of things. If you are a covered entity, you probably need to be looking at some tools to help you monitor this.

There are tools out there that will help you. If not, your IT shop can probably do some ad hoc monitoring. Where I worked before, we didn’t have a tool, but we did have an IT shop that did ad hoc monitoring. I got a call one day saying, somebody has gone into 100 records today alone. They’re a doctor, but what are they doing? That kind of thing. So you want to have some tools to help you.
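
As a rough illustration of that kind of ad hoc check, here is a minimal sketch. The file name, column names, and threshold are assumptions; real EHR audit-log schemas vary by system:

```python
import csv
from collections import defaultdict

# Flag any user who opened an unusually large number of distinct patient
# records in a single day, from a hypothetical CSV export of the audit log.
DAILY_THRESHOLD = 100  # tune to roles and workflows in your environment

def flag_heavy_access(log_path):
    # (user_id, access_date) -> set of distinct record_ids touched that day
    seen = defaultdict(set)
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            seen[(row["user_id"], row["access_date"])].add(row["record_id"])
    return [(user, day, len(records))
            for (user, day), records in seen.items()
            if len(records) > DAILY_THRESHOLD]

for user, day, count in flag_heavy_access("ehr_access_log.csv"):
    print(f"Review needed: {user} opened {count} records on {day}")
```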

It’s going to be very hard to monitor this manually. It is going to require some system improvement and system integration. For anybody in healthcare, covered entity or business associate, you need a logging and monitoring system with some intrusion detection, one that also notifies you when there’s been a data move or a data copy or something like that. And it’s not that hard.

I will tell you, I got a call from our IT shop when I moved a bunch of files from my system out to a shared drive, so they could make sure I knew it was done and that it was appropriate. So it’s not just PHI. Obviously, we didn’t have any PHI, but that’s the kind of thing you’re looking for: big file move, big file dump.
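
For the "big file move, big file dump" idea, here is a minimal polling sketch. The watch path, interval, and thresholds are assumptions to tune; most shops would lean on their SIEM or DLP tooling rather than a script like this:

```python
import time
from pathlib import Path

# Alert when a burst of new files (or new bytes) appears on a shared drive.
WATCH_DIR = Path("/mnt/shared")        # hypothetical share
POLL_SECONDS = 60
BURST_FILES = 50                       # alert on this many new files per poll
BURST_BYTES = 500 * 1024 * 1024        # ...or this much new data (500 MB)

def snapshot():
    # Map each file on the share to its current size.
    return {p: p.stat().st_size for p in WATCH_DIR.rglob("*") if p.is_file()}

known = snapshot()
while True:
    time.sleep(POLL_SECONDS)
    current = snapshot()
    new = {p: size for p, size in current.items() if p not in known}
    if len(new) >= BURST_FILES or sum(new.values()) >= BURST_BYTES:
        print(f"ALERT: {len(new)} new files ({sum(new.values()):,} bytes) "
              f"appeared on {WATCH_DIR} in the last {POLL_SECONDS}s")
    known = current
```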

The other thing I would recommend on monitoring is restricting the use of USB drives and other portable media. Make people ask for permission, so you know why they’re exporting data outside of your network onto portable media, and so you can confirm that’s an appropriate use of the data they’re exporting.

You don’t want people to be able to, like in this 12,000-record case, just download a whole bunch of patient information onto a USB drive, take it home, and start doing things with it. You want to know what’s going on. And there are going to be legitimate reasons, research, et cetera, why people will need to download data. But you may want ports locked down by default, with your IT shop opening them up, giving passcodes, whatever tools you can buy.
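
As one hedged, Windows-only illustration of auditing that kind of lockdown, the sketch below reads the registry value that, to my understanding, backs the "All Removable Storage classes: Deny all access" Group Policy; verify the key against your own GPO baseline before relying on it:

```python
import winreg

# Check whether removable-storage access is denied by policy on this machine.
# Key and value reflect my reading of the policy's registry backing (assumed);
# per-device-class subkeys with Deny_Read/Deny_Write values also exist.
POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\RemovableStorageDevices"

def removable_storage_denied() -> bool:
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY) as key:
            value, _type = winreg.QueryValueEx(key, "Deny_All")
            return value == 1
    except FileNotFoundError:
        return False  # policy not configured on this machine

print("Removable storage blocked:", removable_storage_denied())
```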

You should have a logging system, and you really should periodically review the logs. The logging system alone doesn’t get you there. I mean, if your logging system has been running for four years and it captures every time anybody reads, writes, or deletes PHI, but you’ve never scanned through it to see if anything looks unusual, or done a deep dive, or run a query on it, you’re only halfway there. So you want to be sure you’re monitoring, and you need to have that review set up to happen periodically.

So that would be a couple of things that I would recommend on that side.

And make sure, as I said earlier, that you’ve given people a way to report unusual behavior or things that look unusual. If at all possible, make it anonymous: use a hotline, use something, but make it anonymous.

And one of the things I do sometimes when we do audits is call the hotline to see if it’s really anonymous. I call the number and don’t leave a message. If somebody calls me back going, hey, Carol, what did you need, which has happened quite often, that means it was not anonymous, because you knew it was me.

That’s going to scare people off. They don’t want to turn in their coworkers and be ostracized. But say they think something suspicious is going on: Carol is suddenly driving a Bentley to work. Where’d she get that money? What’s going on?

You know, and maybe she’s been talking about how she can help me use my current job to make more money. This doesn’t feel good to me. Maybe I should say something to compliance. Make sure they’ve got a way to do that without feeling like they’re at risk of some kind of retaliation.

Jordan Eisner: Well, that’s a good segue into another thing I was going to ask about, right? You talk about people and things they can do. You shared a stat earlier, that 70% of the fines mentioned something about a risk assessment. I’ve heard before, coincidentally also 70%, that 70% of cyber incidents are due to employees being phished, being smished, something to that degree. So how can you strengthen your security training program to make employees more aware?

Carol Amick: Yeah. You know, when I think about insider threat, I think about an employee doing something intentionally. There’s also, as you call it, the insider threat of somebody being phished and not realizing it, not reporting it. This is one where I think training is critical.

One of the things I recently saw, which kind of shocked me, was that in 2021 KnowBe4, which is a big training vendor for healthcare, did a survey, and 21% of the employees said, well, I’ve never been trained on healthcare security and PHI. I was just kind of like, excuse me, where are we?

And so you’ve got to train, and you’ve got to train them to understand what the risks are to the environment. And I would say you need to do phishing training, and phishing training is not just, look, here’s an example. I mean, a lot of the training we get shows you an email that’s so poorly written you’re just laughing. You’re like, well, of course this isn’t real, because there are typos in it. And so they tell you to look for typos.

Well, the phishing people have gone beyond making stupid mistakes like that. Their emails look just like the email you’re getting from your CEO. When I was at a hospital here in Atlanta, I got one from the CEO, and everything was perfect except the ending: it was from the company name, but .net. That was the only thing wrong, that .net. That was not our address.

And you know, I did kind of look at it when it came in, going, why would she contact me directly? That should be your first clue. And we’ve seen that in our company. No, the CEO did not contact you. One of the guys who started working with me, on his first day he sends me a message going, why did the CEO send me an email? I go, I don’t think he did. You put on LinkedIn that you’d started here, and somebody tried to phish you.
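
A small sketch of automating the check Carol did by eye, flagging lookalike sender domains; the domain, threshold, and addresses here are all hypothetical:

```python
from difflib import SequenceMatcher

# Flag mail whose sender domain is close to, but not exactly, your real
# domain. "hospital.com" is a hypothetical stand-in; tune the threshold.
REAL_DOMAIN = "hospital.com"

def suspicious_sender(from_address: str, threshold: float = 0.7) -> bool:
    domain = from_address.rsplit("@", 1)[-1].lower()
    if domain == REAL_DOMAIN:
        return False
    # Near-misses like hospital.net score high; unrelated domains score low.
    return SequenceMatcher(None, domain, REAL_DOMAIN).ratio() >= threshold

print(suspicious_sender("ceo@hospital.net"))   # True: lookalike domain
print(suspicious_sender("ceo@hospital.com"))   # False: the real domain
print(suspicious_sender("news@vendor.org"))    # False: unrelated sender
```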

And so, you know, make sure your phishing campaign is not just a class, but real-life training, where they’re actually doing it.

Jordan Eisner: I’d say for our listeners, and Carol, you can back me up on this: when I’m talking with organizations about their concerns around HIPAA, I usually recommend, if they’re not already doing it, security awareness training and HIPAA awareness training. Both are relatively affordable, a per-user, per-seat sort of thing, whether it’s KnowBe4 or other training partners out there. We partner with KnowBe4 and some others. Cost shouldn’t be an excuse. The ROI, I think, is huge.

Carol Amick: If you’re a small organization, you can even do it in person.

But yeah, you’ve got to do it. Make sure everybody knows the basic stuff. Even the email banner. I can see that most of my clients have this, because when I send an email, they get the banner at the top of the page that says, this email came from an external source, exercise caution. That’s another way.

And there are a lot better email filters out there now to help you with that. But it’s not just emails.

Make sure they don’t just answer the phone and comply, either. I mean, I got a phone call one day from somebody who claimed they were from Microsoft, that my IT shop had outsourced some work to them, and they needed me to do something on the laptop. Now, I hung up. I was like, you’ve got to be kidding. But they were very professional sounding, and I can see where it would be easy for somebody to fall for that. And okay, now what’s your password?

Jordan Eisner: They’re only getting better.

Carol Amick: Yeah. They’ve kind of quit using email as much. They’re texting you. They’re calling you. They’re no longer doing just the traditional phishing, because we’ve started to catch on. So you’ve got to stay on top of that.

And then you’ve got to watch. That’s part of your monitoring, too. All of a sudden, there’s a lot of activity. Somebody may have let in somebody who shouldn’t be there. And it may not be the employee. When you go in to see why Jordan is downloading all this PHI, it may not be Jordan. Somebody could have gotten Jordan’s credentials, and they’re having a good time at Jordan’s expense.

Jordan Eisner: So yeah, look, this wasn’t as short and sweet as I thought. This was sweet. I think this was really good information. I think we covered more than I anticipated, but that’s good.

I think to wrap it up for the listeners, and Carol, thanks again for your time, this was a meaningful podcast, as always. Build policies and procedures. Implement those policies and procedures. Train your staff on those policies and procedures and on the risks, from a security standpoint, from a breach standpoint, from internal mistakes or internal bad actors. And invest in some monitoring for these activities, too, so you can nip it in the bud, get in front of some of these things, and avoid the 10-year wait cycle on a fine, the breach notifications, the legal fees, the opportunity costs, and the brand damage. So very, very insightful stuff from somebody who’s lived it.

Carol Amick: Yeah. And I will say, it’s always better if you’re the one reporting it to the government because you found it and you dealt with it, than having the New York Police Department call you, because at that point you’re way behind it. And I’m sure there are class action lawsuits over this.

Jordan Eisner: So get in front of it.

All right, Carol. Well, thanks again. Have a great rest of your day. To our listeners, same thing to you: have a great rest of your day, or evening, or whatever point in time it is you’re listening to this podcast.

If you have questions about HIPAA, whether it’s what’s a good tool for monitoring, what policies and procedures should I have, how do I make sure they’re being followed, how can I get my employees trained, how can I avoid these things, or we just had a breach and need help with it: reach out. We’re here to help.

Visit CompliancePoint.com, or email our distribution address, connect@compliancepoint.com.

You can book meetings with us online on our website. You can reach out to Carol and me on LinkedIn. We’re here, right? We want to field the questions, and we want to be of service if we can. So use any of those channels to reach out to us.

Thank you for listening and we’ll be back with more content and another episode.
