The Audit - Cybersecurity Podcast
Brought to you by IT Audit Labs. Trusted cybersecurity experts and their guests discuss common security threats, threat actor techniques, and other industry topics. IT Audit Labs provides organizations with the leverage of a network of partners and specialists suited to your needs.
We are experts at assessing security risk and compliance, while providing administrative and technical controls to improve our clients’ data security. Our threat assessments find the soft spots before the bad guys do, identifying likelihood and impact, while our security control assessments rank the level of maturity relative to the size of the organization.
Red Team Warfare: A Navy Cyber Officer's Inside Look at Military Offensive Operations
What if your security team is playing defense while hackers play offense 24/7? Foster Davis, former Navy cyber warfare officer and founder of BreachBits, breaks down why traditional penetration tests become obsolete in weeks—and how continuous red teaming changes the game. From hunting pirates in the Indian Ocean to defending critical infrastructure, Foster shares hard-earned lessons about adversarial thinking, operational risk management, and why the junior person in the room might spot your biggest vulnerability.
What You'll Learn:
- Why red teaming creates psychological advantages penetration testing can't match
- How operational risk management translates technical findings into executive action
- The real cost of point-in-time security assessments (hint: ask St. Paul, Minnesota)
- Military-grade frameworks for continuous threat simulation in civilian organizations
- Why attackers operate 365 days a year—but most organizations test once
Don't let your organization become another headline. Security teams need to think like attackers, not just defenders. Subscribe for more conversations that challenge conventional cybersecurity thinking.
#RedTeam #CybersecurityStrategy #PenetrationTesting #MilitaryCyber #ThreatHunting #InfoSec
You're listening to The Audit, presented by IT Audit Labs. My name is Joshua Schmidt, your co-host and producer. Today we're joined by Eric Brown and Nick Mellum of IT Audit Labs, and our guest today is Foster Davis of BreachBits. Thanks so much for joining us today, Foster. How are you doing? Yeah, thanks for having me, guys. Looking forward to it. Absolutely. So we'd like to hear a little bit of background from you and your experience. You're a naval officer as well as a tech entrepreneur. That's right.
SPEAKER_02:Well, at least in my former life. I recently left the reserve, but before I did, I served for about 15 years active duty, U.S. Navy, as a cyber warfare cryptologic officer, as they were calling it when I left. I spent about half my time in the service fighting pirates on the high seas, and the second half working spooky missions in dark buildings, and red teaming, of course.
SPEAKER_00:Awesome. So we've seen these traditional red team engagements cost a fortune, and you mentioned that a lot of the time they become outdated within weeks.
SPEAKER_02:I think from our end, it's really important that you have an outside look at your security posture and what you're doing. There are, I think, two reasons for that. One is technical, because those things need to be looked at. As we used to say in the Navy, inspect what you expect. The second reason, though, is psychological. There's this blindness you get from looking at your own situation. Of course we're talking about security, but that could be anywhere in life. You've ever had that experience where your best friend is the one who will tell you you've got some mustard on your face? That has fascinated me from the first time I ever did red teaming. So there's the part about red teaming itself, and that it should be done at some point, and then there's the magic of: what if we could do this process continuously, or near continuously, and why would that even matter? And I would say that's because attackers are sure coming after you all the time. So why do something once a year when the attackers are at it the other 364 days? That's kind of how we approach it.
SPEAKER_01:That outside-in perspective really brings to light a lot of the things that the internal teams already know and say. But when you have that essentially certification from an outside resource who says the same thing, packages it up, and puts it into a format that's digestible, it seems to be better accepted and listened to by the people who make decisions in the organization. And we see that too, coming in as consultants. We could be saying the exact same thing that the internal team has. When we've gone in to do these engagements in the past, the internal team, who's usually partnered with us to do the engagement, will just tell us all of the areas they've been trying to get attention on, local admin or whatever it is, so they can make sure those things appear in the report as well.
SPEAKER_00:What is it that's inherent about these red team engagements that makes them go obsolete so quickly? And then what have you both, Eric and Foster, seen that remediates the timeliness of those assessments?
SPEAKER_02:I think what we've seen is the speed with which attackers develop. It could be new tactics, or it could just be variations on tactics that are well tried and true. Call it the life cycle of the attacker's re-attack, their next try. Couple that with the fact that IT infrastructure's attack surfaces are always changing, and there's a challenge to keep up with that. Put those two together, and now you have a situation where, if you aren't looking often, it's almost like you're not looking at all.
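Foster's point about constantly shifting attack surfaces can be made concrete with a toy sketch. Everything below is invented for illustration (the hostnames, ports, and the `surface_drift` helper are all hypothetical); it simply diffs two snapshots of externally exposed services taken weeks apart, which is the gap a point-in-time test never sees:

```python
# Hypothetical sketch: why a point-in-time test goes stale.
# An "attack surface" snapshot is modeled as a set of (host, port) pairs.

def surface_drift(old: set, new: set) -> dict:
    """Compare two attack-surface snapshots and report what changed."""
    return {
        "appeared": new - old,    # services exposed since the last test
        "removed": old - new,     # services taken down since the last test
        "unchanged": old & new,   # the only part the old report still covers
    }

# Snapshot at the time of the annual penetration test (illustrative data)...
january = {("vpn.example.com", 443), ("mail.example.com", 25)}
# ...versus what is actually live a few weeks later.
february = {("vpn.example.com", 443), ("mail.example.com", 25),
            ("dev.example.com", 8080), ("files.example.com", 445)}

drift = surface_drift(january, february)
print(f"{len(drift['appeared'])} new exposures the January report never saw")
```

Running this prints that two exposures appeared after the test; a continuous approach would catch them, while the annual report silently ages out.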
SPEAKER_01:Yeah, and Nick, you and I have been faced with this recently here in the local environment, related to the breach that's going on in Minnesota, in particular the St. Paul area, where there was an encryption event. The localities around here are just scrambling to get a focus on their cyber, because it wasn't just one breach, but a few different breaches at a couple of different municipalities. And it shines a light on the fact that point-in-time hygiene is really not what works. It has to be that continual investment in people, process, and technology, because you can't just say, hey, we're gonna spend a million dollars this year and then forget about it for five years, which unfortunately is what we see all too often. So you do need that continual reminder. And Nick, you've been living in this for a while. What are you seeing?
SPEAKER_03:Well, yeah, and around this whole topic, I feel it's kind of like chasing your tail, right? They're chasing their tail, doing the same thing, sometimes for compliance reasons because they own a set of data, maybe CJIS or PCI, whatever it is, and they're doing just enough to meet those compliance needs. But to Foster's point, they're doing it once a year, or whatever it is. For those other 364 days, they're not doing anything, and then that loop comes back around, they're chasing their tail, and that's when the breach happens. And this is what we're seeing at St. Paul. We're not necessarily hands-on with St. Paul right now, but we've gone into that defensive posture at one of the accounts to make sure there are no issues. So we're well defended, but we're seeing, maybe at St. Paul, that things could have been done differently. And I think that's probably the case for most anybody who goes through a breach. You know, hindsight's 20/20; they could have done some things differently.
SPEAKER_00:So, Foster, I'm curious if you could give us your take on what a red team engagement looks like. Certainly.
SPEAKER_02:I like to start by thinking about why we even call it red teaming. I've heard different stories around this, but one that I like is this: if you're on the football team and you're preparing for the championship, what are you gonna do? You're gonna throw half the players a red pinnie, a red practice jersey. They're gonna play the other team, and maybe even run some of the plays that the other team is known for running, to see how well your team is gonna do against them. And even just making a statement like that, or what I kind of brought up earlier, I see it as two halves: there's the technical part of it, but then there's the psychology. I would say start with the psychology of what we are here to do. The psychology here is that there's a team of people inside who are working very hard every day to accomplish a mission, and what we need is a group of people outside of that who will pretend to be bad guys, nefarious, call it what you want. If you start the engagement by just saying we're gonna have two separate teams, even if you just start with that, and later you can have different levels of technical ability and things like that, but start by keeping it a bit separate. So what you're gonna do with red teaming is have this separate team who's gonna think differently. They're not gonna communicate or trade information with the blue team, as we would say, the good guys, the internal guys. We're gonna have them be separate and essentially plan, you know, spy versus spy, or a chess match, however you want to think about it, and start to plan how they're going to achieve an attack on an organization. And even then, right there, even if you just did it on paper, the goals of the organization might be X, Y, and Z. The posture and protections of that organization may also be here.
But already that red team is thinking about how to skirt around or approach that organization to get what they want. Because what the attackers want isn't always what the defenders think they want. So, in setting up this framing, you allow the red team to go through that independent thought of what needs to be done. And then, when you start the engagement, start the scrimmage, start the clock, you allow each team to put up their best effort. Periodically, you take a break. Each team talks about the things they noticed, from their perspective. And it's one of those engagements where there's no right answer, there's no wrong answer; it just is. It's just what was observed. What each team is seeing is absolutely correct in that it is what they observed, and both teams have to mutually respect that. Then, if you put some programming around it such that you're led through a discussion, you can understand that what somebody saw is what they saw, and also reconcile what is actually correct. And then for the blue team: did that particular issue even matter to the mission of the company? So you put all of those things together, and then of course a lot of the red team folks are going to be technical experts. Maybe they've done some hacking in the past, maybe they've done penetration testing. If you do all that together in a very deliberate way, you are evolving past what I might call a discrete penetration test into this red teaming methodology, this mentality that what the hacker believes could be true is very, very important to know as you're trying to defend.
SPEAKER_03:Yeah, I certainly agree too. And a lot of the way we're thinking about this at IT Audit Labs is shifting focus to not being reactionary; we're playing the offensive game. We're not waiting for a breach to happen. We're training as if it did happen, and getting people to shift into that mindset of not just playing defense. And I think we see that across the board. People just live in that space of waiting for something to happen. They do what they do. I don't know if I necessarily want to say the bare minimum, but they're getting by, right? It hasn't happened yet, or at least it hasn't been reported yet. But a lot of it is a culture shift, you know, getting people to think about these things: how it could happen, where their weak points might be, and to keep putting these efforts forward to find those issues before they become problems.
SPEAKER_01:You know, Nick, we're experiencing it right now, right? Never waste a good breach. That's kind of the internal messaging for the IT and cyber teams to take advantage of, to get the funding or bring on the resources they need to accomplish the items they've identified as areas to improve in the organization. But it's unfortunate that it comes down to that, right? Look at St. Paul as an excellent case study. This thing is taking them offline for weeks, if not months. It's gonna cost tens of millions of dollars, probably upwards of 50 million, to resolve, when, had they just had the right tooling in place, the right financial commitment from the organization, maybe spent a tenth of that on proper tooling, education, and resourcing up front, they wouldn't be in the situation they're in. But we see this time and time again; it's like people just don't learn. Baltimore County School District: huge breach. They weren't investing in cyber beforehand. Then they have this event, and not only do they have the cleanup and the impact to the students, teachers, and community, now they've got to reinvest in tools. And the money's got to come from somewhere. It's unbudgeted, so they've got to pull it from other resources, and the whole thing is disruptive. But having this continual way of talking about cyber, getting it up to the board level, where the board has the responsibility to ask: what are we doing about cyber? Are we doing enough? Do we have the right people in place? Are we having the right conversations? Let's hear from the cyber teams themselves. Let's not filter it through a reporting structure of this person reports to this person, who reports to somebody else who can't even spell cyber, yet is responsible for it without knowing what it is.
Let's get the actual teams to the board level, having those conversations and presenting to the leadership of whatever the entity is about where their true risk lies. And I think tools like we're talking about today can help with that, from a continual perspective of looking at the organization as a whole with that hacker mentality. If we were in college or high school, or even at the professional level, playing another team, we'd be watching game footage in that football scenario. We'd be studying those other players, which is exactly what the threat actors are doing against all of our organizations, especially in the public space. What tools they bought years ago is public information, because it's public funding, so all of those records are available. It's just another one-up for the threat actors: they can learn that environment and specifically focus on breaches that might impact the tools that environment is using. So, just circling back, it is frustrating that we get into this cycle of do nothing, breach, do everything, then do nothing, and the cycle just repeats.
SPEAKER_00:So, how do these engagements look different on the civilian side, in what we see in the private or public sector, versus your experience in the military? I'm really curious to hear how these red team engagements go down, and, if you could share, what some of the goals or takeaways are, or even some of the particulars of those engagements, what those might look like in a naval or military situation. No, that's a very interesting question.
SPEAKER_02:The first thing that popped into my mind when you said that was that there's a stark difference. One of the stark differences we noticed: we thought it was difficult to run red teaming exercises when we were in the military. We thought it was difficult getting people on board. We thought it was tough. And we had admirals commanding other admirals to do it. We had forces of law essentially commanding people to do it, in the real sense of that word, and it was still very tough. You still had to get people together, you had to have mutual empathy and understanding, and you had to do really tough things when people would rather focus on other things. Boy, was it a lot tougher when there was not a literal command from a commanding officer to do something. Now, there are organizations where the chief executives or the C-level or the board understand, perhaps implicitly, or perhaps before we got there, this concept of why red teaming, this adversarial approach to testing, is superior compared to other types of testing, and they knew that's what needed to be done, and people tended to fall in line and we could get things done. But I found that a lot of education was needed in helping people understand. Especially in a market where the concept of penetration testing is well understood, has its place, and is important, but where there are other options to find out what you think is wrong, it became even harder to suggest: well, what we really need to do is the pinnacle practice here, and that's red teaming. So that was a major difference. Before, in the military, when I was conducting red teaming across our different geographic fleets, these were very large exercises with very large command structures.
The reason that was able to happen was that the two admirals I worked with were bought into the idea from the beginning. And it actually came not from cyber; it came from war gaming. The first admiral we did these red teaming exercises with at a fleet level was not a cyber guy. He was actually what we called a ship fighter, a surface guy. And he saw that the wisdom carried over from this other practice. So that's really the major difference, I think, that comes to mind. I think there are ways to incentivize people in the private sector, and that's what we've been really fascinated with: what incentivizes what we security folks know is good behavior, I'll just put it that way. What does incentivize people? And I think it's equally important to understand when business owners don't want to do a test continuously, when they don't want to do a test at all, when they say they have no reason to do it, when they say it can't happen to them, when the money only unlocks because there was a recent breach somewhere nearby. It's really important to understand, because those are very valid ways of thinking. What we've learned since 2018 is how to understand and unlock the ways the private sector needs to operate. A big one we found: compliance, of course, has its place, but also insurance, because it's a completely voluntary scheme, and yet it drives toward the mission of the organization.
SPEAKER_00:Well, other than breaches, what have you seen work in untying, or loosening up, those purse strings to actually create some change in an organization's security posture?
SPEAKER_01:So we often get brought into an organization to perform a leadership role, so we have a seat at that strategic table, which is really helpful. But I view my role coming in as that cyber advocate. So as quickly as we can, we try to move up the organization from an educational standpoint, speaking with the decision makers who are leading that organization, whether it's a board or a council or an executive team, whatever that looks like, and bringing cyber to them in a very tangible way. Because a lot of what we do behind the scenes can be complicated, right? You start talking in jargon about IOCs and threats and things like that that aren't really tangible. But you can bring that to a level that's understandable at the personal level with the decision makers. Maybe it's around email security, or maybe it's around physical security. Likely somebody in their family has had a cyber event in the past. Maybe their credit has had an issue from credit theft, or they know somebody who's been impacted. Being able to translate that personal experience into what's happening in the workplace matters. Just the other day, I had the opportunity to present to a large group at one of our customers, and we were reviewing all of the inbound emails that come into the organization in a month. Yes, millions of emails. But inside that, the filters in place are stripping away all the bad emails, the malware, the phishing, and you're left with a much smaller amount. So diving into that, and using some of these key aha moments to show that we're under attack 24/7, what does that mean to the organization? Who's being phished in the organization? Who are the most-attacked people, and why?
And I think once you bring that to a tangible level, you can actually show: this is what's going on, and a year or two ago, before you had any of this protection in place, this was all hitting your organization. Of course, somebody is going to click on something and enter creds. At that point, it's pretty much a waiting game until the threat actors perform an encryption event or steal some data, and then you're in front of the news camera.
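The email-funnel "aha moment" Eric describes can be sketched in a few lines. The stage names and counts below are entirely hypothetical, invented for illustration; the point is only the shape of the computation, how much mail is stripped away before anyone can click:

```python
# Hypothetical monthly inbound-email funnel (all numbers invented).
# Each stage shows how much mail survived the previous filter layer.
funnel = [
    ("total inbound",                      2_400_000),
    ("passed reputation/connection checks", 1_100_000),
    ("passed anti-malware scanning",          950_000),
    ("passed anti-phishing analysis",         900_000),
    ("delivered to inboxes",                  880_000),
]

total = funnel[0][1]
for stage, count in funnel:
    # Show each stage as a count and a share of total inbound mail.
    print(f"{stage:38s} {count:>10,} ({count / total:6.1%})")

blocked = total - funnel[-1][1]
print(f"\nBlocked before anyone could click: {blocked:,} ({blocked / total:.1%})")
```

With these made-up numbers, roughly 63% of inbound mail never reaches a person, which is the kind of concrete figure that lands with a board far better than raw jargon.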
SPEAKER_00:I guess we all want to hear some of these Hollywood-movie kinds of stories. But if you have anything you'd like to share that's real-world action, I think that'd be really cool for our listeners.
SPEAKER_02:That's a good question, and it actually draws on something Eric brought up, which I really agree with, the way you put that, Eric. So, Joshua, how does the military go about these things? Something we were a part of and witnessed, and that I later realized I had taken for granted when I came to the private sector: in the military, we have a very good process for when you're doing these, let's call them war games for simplicity, when you are thinking about how to go against your adversary. So I'll tell a story from one of the red teaming operations we did. I won't tell you where it was, I won't tell you the specific vulnerabilities; anything specific I say, I'm just making up for illustration. Okay, so all the caveats are out there, right? I've got lifetime obligations, like I'm sure Nick and others do, to protect some of that information. This one, though, I think gets to something Eric in a way alluded to. In this story, one of our most junior people, one of our most junior red teamers, so this would be a hacker, an active duty service member, she was on our red team, and she found one of the biggest issues. So let me start the story. We were in this theater of operations. There were several different military units, probably a dozen Navy units, plus some Army and Air Force units. This was a base where they conducted a lot of operations over a very large area, everything from keeping people fed, to giving people a place to sleep, to resupplying them, doing reconnaissance missions, and also what we would politely call kinetic missions in this area as well.
So, on day one, my red team and I walk in, and I say, Admiral, in about six weeks, my team is going to come and we are going to try to completely take down your network while you're doing live operations. And he says, Commander, what the heck are you thinking you're gonna do? And I say, well, hear me out, Admiral, hear me out. Here's how we're gonna do it. Here's how it's gonna be done safely. Here are the ways we'll communicate. Here's how the referee can blow the whistle if this or that happens. So don't worry, we will not put anybody in particular danger, but what we will do is test the five layers of redundancy you have in your systems to conduct your operations. So that was the situation we were in. Our team was the red team. We did a lot of our preparation work, and we got to, let's call it a halfway point. Let's say it was halftime. At halftime, we paused everything and had a halftime check, and we were telling the admiral about a lot of issues that were going on, some of which were minor, but a couple that we really needed him to pay attention to. And honestly, we were in the room with a lot of the very senior people. And tell me if you've ever seen this: you just see a glaze go over the eyes. We on the red team had done some of the most incredible work that I think had been done in this practice in the military, ever. And that's not necessarily me; I'm talking about the skill of the service members we had, who really knew what they were doing. These were very good white hat hackers. We had done some incredible things, and yet we saw the glaze. And, okay, that was halftime.
We all go back and I say, guys, we've got to figure out what's going on, because they didn't notice this particular issue that we came up with. So then we said, look, we have to describe this, we have to communicate this in a way that the business owner, the admiral, can understand. And that's where we said: what really has to happen here is that we tightly couple red teaming with risk management principles. So the next time we came back, for everything we told them, we said: here's the bad thing that you don't want to happen, and we've quantified this risk in terms of how likely it is to happen and how dangerous it would be if it did. And we used a risk chart. Nick, you may know this as operational risk management. Yep. We used operational risk management, which is the same language; like I told you, we had a ship-fighter admiral, we're talking to pilots, we're talking to submariners, we're talking to the Army, all these guys. They all use operational risk management to know the bad thing that could happen, because it highlights to the leadership: here's something that actually should be resourced right away. We showed, let's say, the same data to the admirals, but now there were these several red dots, and the glaze that was there immediately disappears. Son, what's that right there? What's this big red thing right there? What are you telling me about here? Are you saying there's an issue? Yes, Admiral, there's an issue. And right here I've got Petty Officer Smith, and she's going to tell you what they found. Why is this 20-year-old Petty Officer Smith the one coming to tell me about this? This is the part about the junior person. Why is she here? I said, well, because she's actually the smartest person in the room right now about this particular issue. She's gonna tell you exactly why this was an issue.
And so we were kind of having a bit of fun with it all, but the point was that we got people to break down the walls. As security people, we broke down our wall of just wanting to hand people a table of data, you know, and push our glasses up. We broke that down to present it in a way they like to hear it, and they broke down their walls, understanding that, hey, actually, the 20-something in the room might be the person we need to listen to the most. Why, Admiral? Why? Because this thing that she found ties back to this absolutely critical item that you told the president you were going to make sure got done. Needless to say, she got lots of accolades for that. At the end of the day, I think we even awarded this sailor a medal just for that specific item she was able to find.
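The operational risk management chart Foster describes, where each finding gets a likelihood and a severity and the worst combinations light up red, can be sketched in a few lines. The category names, thresholds, and findings below are illustrative assumptions, not the Navy's actual ORM tables:

```python
# A minimal sketch of an operational-risk-management (ORM) style matrix.
# Likelihood and severity scales are illustrative, not official ORM values.
LIKELIHOOD = {"unlikely": 1, "possible": 2, "likely": 3, "near-certain": 4}
SEVERITY = {"negligible": 1, "moderate": 2, "critical": 3, "catastrophic": 4}

def risk_level(likelihood: str, severity: str) -> str:
    """Map a (likelihood, severity) pair to a coarse risk band."""
    score = LIKELIHOOD[likelihood] * SEVERITY[severity]
    if score >= 9:
        return "RED"    # the "big red dot" that gets the admiral's attention
    if score >= 4:
        return "AMBER"
    return "GREEN"

# Hypothetical red-team findings, phrased the way leadership hears them.
findings = [
    ("default creds on logistics portal", "likely", "catastrophic"),
    ("verbose error pages",               "possible", "negligible"),
]
for name, lik, sev in findings:
    print(f"{risk_level(lik, sev):5s} {name} ({lik} / {sev})")
```

The translation step is the whole trick: the same technical finding, expressed as likelihood times impact, lands on a chart that ship drivers, pilots, and submariners already know how to read.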
SPEAKER_03:It sounds exactly like a military engagement. You get senior leaders who don't like hearing from a younger, maybe even junior, officer, let alone from the enlisted ranks. I've got so many things popping into my head now about this. It has nothing to do with cyber warfare, but I'll recall it really quickly. We did a security engagement, physical security in the kinetic mindset, for the anniversary of the Battle of Iwo Jima, at Mount Suribachi, where, if I recall correctly, it involved one of the last living Japanese military members, I think he was an officer in World War II during the Battle of Iwo Jima. This was 15 years ago. We went out there to provide security, and we went to Mount Suribachi because the dying wish of this officer was to have his ashes spread on the beach, or on the island. So we went out there, and I went to Mount Suribachi and came back one day. I was one of five Marines on this ship with the Navy. I was an E-4 at the time, and it was me and one of my E-3s; we were the only two Marines on the ship. And we were running comms. Comms happened to go down. Of course, it always does; it goes down. I got on the hook with the captain of the ship. I don't recall his rank. And I'm talking to this guy as an E-4, and I'm like, sir, whatever I said, I was like, we gotta move the ship. Whatever is happening right now, we can't get comms. And he's like, son, what rank are you? I'm like, I'm a corporal. And he's like, what rank is that on the rank chart? I was like, E-4. He's like, you've gotta be shitting me. An E-4 is telling me to move my ship. And I was like, I'm sorry, sir. Whatever it was, he laughed, but it just made me remember all these times hearing from lower enlisted, and how that must have felt for her.
SPEAKER_00:Eric, have you seen any of that in in your sector? Just uh maybe some young guns coming in with some information that kind of changes the game.
SPEAKER_01:We've got a young guy on the team now who joined us about seven months or so ago, through a random encounter at a security conference we were at. He came to a couple of our game nights, and he's just a really enthusiastic individual. As we got to know him, it's like, wow, this guy's a genius. The more we threw at him, the more he was able to absorb, and he's bringing in all of these ideas. For me, it was really cool to have somebody with that energy and that ability to absorb information and do something tangible with it. It's kind of like I've gotta rein him back in in some areas, where he could go way off into the toolies with something that might not be 100% relevant to our customers. But it's just really refreshing to interact with the younger people coming up, who don't know what a dial-up modem is, and all of these things we just take for granted, yet here they are, kind of coming out of the womb with AI. So I get really excited about working with people who are maybe a little bit different intellectually, and how do we work with that as a team? Maybe they don't have the best presentation skills, but how do we bring that forward, round out the presentation skills, and get the other team members excited about this new technology? I mean, we're interfacing with customers now who are still pushing back, giving the Heisman to AI. It's like, wow, you guys just don't get it. It is here, it is coming, whether you like it or not; you cannot escape it. And if you continue to try to escape it, you're just gonna get shuffled further and further back. People are gonna leave the organization.
You know, it's everything that we all know. It's just shocking to be in meetings arguing about what sort of data some AI tool has about you that's probably already replicated 500 times somewhere else. Like, nobody gives a shit about your data. And Nick, I know you're going to appreciate this, Josh, too. To me, it seems like, yes, the piracy is a problem, but it seems like a problem that could be solved with some 5.56 NATO and really not be a persistent problem. You know, back with the Tom Hanks movie, where he was a captain and all that sort of stuff. I'm surprised that it's still happening. I'm also surprised at the deterrence. Like, okay, we're spraying them, we're washing them with some hoses, hoping that's going to stop the problem. But pirates in a rubber dinghy? I mean, let's solve the problem permanently.
SPEAKER_02:And depending on whether we want to go down that topic, I actually hadn't thought about it until you brought this up. When I talk to people about piracy, they have a similar reaction: wait, that actually happens? What are we talking about, Jack Sparrow? And I would have said the same thing before I was in the Indian Ocean being briefed by the intelligence operations there. The short of it is this: the same reason people don't understand piracy, like, how could that even happen? It's because you don't understand what the pirates are going through in their lives and their circumstances, and you don't understand how easy it is to carry out certain types of attacks on ships. Which is a very appropriate concept, because these are the same people who might say, well, hackers aren't here to get me. I mean, did you know that with a canoe, an outboard engine, and the flat water of the Indian Ocean after the monsoon season, you can go 50 miles an hour? And all you need is a machine gun, because most vessels, due to the laws of their nations, are not allowed to have guns on board. There you go. That's a CVE right there in terms of piracy. The CVE is: canoe plus machine gun plus calm water equals hack.
unknown:Right.
SPEAKER_02:But once we dug into piracy and understood what these pirates are going through, and obviously there's a whole other element to this as to why people would even engage in it, once you actually see what their lifestyle is like and what's going on there, it actually makes perfect sense. And that's the benefit of the red team mentality and methodology: things that would make no sense to you. Now you have some 20-something using AI, doing something in a completely different way than you would have ever done it, and oh my gosh, all of a sudden that completely destroys the mission you were trying to accomplish.
SPEAKER_03:Yeah, it's a great topic to get into. I think we all appreciate it. I do want to ask one more question about cyber before we keep going on the military thing. I guess a good place to start, Foster, was the engagement you were talking about, where the junior enlisted sailor found the issue. So let's say that's completed, right? You've briefed the commander of the ship. What does the remediation look like after that in the military?
SPEAKER_02:I'll tell you the right way and the wrong way, and there are direct analogies to the private sector. Let me start with the wrong way, which happens occasionally in the military, but certainly we see it in the private sector. Here's the wrong way: here's your report, we've given you the brief, we've told you the issues, we'll see you later, we'll see you in a year. Let's take that report and put it in a filing cabinet.
SPEAKER_01:Yeah.
SPEAKER_02:And it's human nature. It makes sense. There are lots of reasons to do that. We've got a lot of things going on in our lives, and we're just trying to get home to make dinner with our kids or whoever we spend our time with. The reason that's the wrong way is that there's no imperative to do something about it. There's no follow-up, there's no corrective action. The fact that the report is buried and not seen often is, I would say, one of the prime root causes when things go wrong. So what's the stark difference? In the military, when it would go right, it looked like this admiral I was talking about. He said, okay, Davis, you take that operational risk management chart that you made, and we're going to put it in front of everybody every week. I want all the commanders and all the division heads to come tell me every week how we're doing against this picture of Skittles right here. He said, I don't have to be an expert. I just know that Skittles that are red are bad. Get rid of them. But what he did was, every week, every cycle, everybody was up there explaining what did or did not get done, and it was independently verified by the red team: yep, that dot's gone. And it's similar in the private sector. If there is an ethos or strong leadership telling people we cannot just forget about it, that's where we see it go well.
SPEAKER_03:Yeah, it sounds like the remediation is pretty similar. That's really how we're operating at IT Audit Labs, too. We want to do an engagement where we're with you from the start, and when we finish the engagement, we stick with you throughout the remediation process. We don't want to leave a pile of papers on your desk and say, thanks for the check, we'll see you next year. Let's stay with you throughout the year, make sure everything is good, and remediate these processes that have been broken.
SPEAKER_02:Yeah. Continuous. That gets to why we're putting everything we are into the idea of doing this continuously: let's make sure AI is incorporated in the right spots, let's make it easy so that the easy choice is to do the right thing, and so it's continuously brought up. And as long as you eliminate false positives, because those will kill you, as long as you can eliminate false positives, that, we've seen, is the way.
SPEAKER_00:Thank you so much for your time today, Foster Davis with BreachBits. You can check them out on LinkedIn. Is there anywhere else you'd like to point our listeners if they wanted to learn more?
SPEAKER_02:BreachBits.com, or see us when we travel to London and the financial district.
SPEAKER_00:Awesome. We'd love to stay in touch with you guys and stay abreast of what you're working on. You've been listening to The Audit, presented by IT Audit Labs. My name is Joshua Schmidt, your co-host and producer. Today our guest was Foster Davis with BreachBits. We also had Eric Brown, managing director at IT Audit Labs, and Nick Mellum. Thanks so much for joining us today. Please check us out on Spotify; we have video now. Please like, share, and subscribe, and leave a comment and a review on Apple Podcasts if you're so inclined.
SPEAKER_01:You have been listening to The Audit, presented by IT Audit Labs. We are experts at assessing risk and compliance while providing administrative and technical controls to improve our clients' data security. Our threat assessments find the soft spots before the bad guys do, identifying likelihood and impact, while our security control assessments rank the level of maturity relative to the size of your organization. Thanks to our devoted listeners and followers, as well as our producer, Joshua J. Schmidt, and our audio-video editor, Cameron Hill. You can stay up to date on the latest cybersecurity topics by giving us a like and a follow on our socials, and subscribing to this podcast on Apple, Spotify, or wherever you source your security content.