The Audit - Cybersecurity Podcast

Cyber News: Advanced Phishing, ClickFix & AI Wearables

IT Audit Labs Season 1 Episode 81

Microsoft dominates 22% of all phishing attacks, an $800 tool tricks 60% of victims into self-hacking, and Apple is planning a surveillance pin that records everything—welcome to 2025's cybersecurity nightmare. In this episode of The Audit, co-hosts Joshua J. Schmidt, Eric Brown, and Nick Mellem are joined by Jen Lotze from IT Audit Labs to dissect three headlines that prove the threat landscape isn't just evolving—it's accelerating. From brand impersonation scams that exploit your brain's pattern recognition to ClickFix malware that bypasses antivirus by weaponizing copy-paste commands, this conversation reveals how attackers are shifting from breaking through defenses to manipulating humans into opening the door themselves. 

What You'll Learn:

  • Why trusted brands like Microsoft, Amazon, and DHL are irresistible phishing targets, especially during high-traffic seasons when vigilance naturally drops
  • How ClickFix attacks exploit legitimate-looking broken websites to trick users into installing malware through their own command prompts—achieving 60% success rates that evade traditional security
  • Real-world consequences of sophisticated social engineering, including a $116,000 wire fraud loss that proves even tech-savvy professionals aren't immune
  • The privacy and consent implications of Apple's rumored 2027 AI wearable with dual cameras and always-on environmental recording
  • Whether constant surveillance is becoming the unavoidable price of technological convenience—and what that means for building security cultures in organizations today

From training employees to recognize copy-paste scams to navigating the ethics of ambient recording devices, this episode delivers frontline intelligence for security professionals and practical awareness for anyone trying to stay safe online.

#phishing #clickfix #cybersecurity #socialengineering #applewearable #privacy #malware #infosec #brandimpersonation 

Joshua Schmidt:

You are listening to The Audit, presented by IT Audit Labs. I'm your co-host and producer, Joshua Schmidt, and today we're joined by the usual suspects, Nick Mellem and Eric Brown. We also have Jen Lotze with us today, riding shotgun from the IT Audit Labs office in St. Paul. We're bringing you some live news today and talking through some headlines; we each picked out a news article that stood out to us. Eric, how are you feeling? I'm feeling good.

Eric Brown:

It's cold. Really cold. We had a high school student, a senior, come in to see how the vegan sausage is made here at IT Audit Labs. The vegan chorizo. We had pizza. Yeah, it was cool. He had some questions; he's in some cybersecurity classes now in high school, and it was just a good opportunity for him to see what happens in the real world in cyber.

Nick Mellem:

It's cool to see the next wave of operators coming into the office engaged.

Joshua Schmidt:

Jen, how are you doing today? Are you staying warm out there in Wisconsin? How's the commute these days?

Jen Lotze:

It's good. I'm just looking at that beanie we're looking at here. That's a little cool.

Joshua Schmidt:

Well, you know, I had to bring this in since we're talking about some fun stuff today and we're talking about the cold. Eric has injected this into my Instagram and all the marketing timelines across all of my browsers now, thanks to him sharing it. This is an EMF Faraday cage for your noggin. And I did a little research here: this is a $75 beanie, y'all. So you can be extra hipster, stop at the coffee shop, roll it up so it's just barely on the top of your head, just covering the dome.

Eric Brown:

I can't take full credit for this one. My brother sent it to me. He's an artist who used to teach at the Maryland Institute College of Art; he's retired now, but he sends me a lot of different Instagram things, and this was one of them. It's got silver thread in it, and it harkened back to the episode that you, Nick, and I did a while ago where... I don't know exactly what we were talking about, but I know it had to do with UFOs.

Joshua Schmidt:

Well, yeah, we have to have a tinfoil hat section in every episode.

Nick Mellem:

This is the modern tinfoil hat. Jen's rethinking her guest experience today.

Jen Lotze:

Always. Always.

Joshua Schmidt:

We always have a tinfoil hat section, so we're kicking it off with the tinfoil hat today. In case anyone's interested, this blocks the EMF radiation from your dome, so your noodle doesn't get baked, quote unquote. I'll throw this in the show notes, but Prepper Press actually has a very, very thorough breakdown on the EMF situation: health concerns, pregnancy, acceptable exposure levels. We'll put that in the show notes for all our preppers and tinfoil hat enthusiasts out there. Yeah. So we can expense this, right? To the IT Audit Labs account.

Nick Mellem:

Absolutely.

Joshua Schmidt:

Okay, great, great. Yeah, we've got to. If we get ten, I think we'll get a discount; it's only like 700 bucks.

Eric Brown:

I was having some stuff delivered to the house today, and I always love these calls. The driver calls me up when he's 10 minutes away and says, I'm just confirming your address, and then he starts relaying street directions back to me: okay, I'm going to go up this street, over this area. I'm like, yo, dude, just put it into the GPS and it'll take you there. He goes, I'm old school, I don't use GPS. So now I've got to relay directions to him? I said, look, I just moved, I don't know how to get there myself; just put it in the GPS and that's all you've got to do. And then he's like, well, I'm waiting for the day when technology goes away, and I want to be able to get there on my own. I was like, well, okay. And he figured it out, I guess.

Joshua Schmidt:

Sounds like a Luddite is what we'd call that: passing up the supposed technological advantage, not even MapQuest or whatever we used to use back in the day, Copilot.

Eric Brown:

Why are you not using GPS these days?

Joshua Schmidt:

Yeah, I mean he's still relying on technology to call you to get the directions, right?

Eric Brown:

You gotta work hard. Yeah.

Joshua Schmidt:

Without further ado, we have our first article. This one's coming from Nick; he sent it to me from TechRadar Pro, and it's talking about phishing scams: who are the most spoofed brands in phishing scams? Let's be honest, you can probably guess most of them, but there are a few surprises. It goes on to explain how 22% of all brand phishing attempts are tied to impersonating Microsoft, DHL was the only outlier in the top 10 that wasn't a tech firm, and identity is the biggest attack surface for cybercriminals. So, Nick, serve this up for us for discussion. Why did this stick out to you, and how does it relate to what we do at IT Audit Labs?

Nick Mellem:

Yeah, first off, this article stuck out to me because I like to have a little reflection on the previous year: what we did wrong, what we did right, and how we can learn from those two things. It doesn't have to be good or bad. With that thinking in mind, what was the trend of the previous year? Now, this article is relatively low-hanging fruit, because I think all of us on this call would have guessed basically everything that's here, but it resonates with anybody, technical or not. We just got out of the holiday season, so DHL makes sense, Amazon makes sense. We can also look at all of these from a market share standpoint, with Microsoft being the kingpin here. Fortune 500 companies, small organizations, it's pretty easy to guess that they're going to be running Windows, right? So that's going to be the easiest market for bad actors; they're going to go that route because they get a much bigger share. But no matter what you're thinking, this article still resonates with anybody, like I said. Your grandma could read it and be like, wow, I'm going to start watching out for Amazon campaigns or whatever it is. It's an interesting article to dissect from that standpoint, to see if there were any outliers like DHL.

Joshua Schmidt:

Yeah, maybe we'll kick it to Eric and then we'll get Jen in here on this too. But Eric, why do you think Microsoft is such a target? Is it just because it's ubiquitous across a lot of tech platforms and business use cases, or is there something inherently attractive about the Microsoft ecosystem?

Eric Brown:

I think it's what Nick was talking about as well: a lot of companies use Microsoft for their identity provider or email, and you can impersonate those campaigns. We'll talk about that a little bit with the other article we're covering today, on ClickFix, where the technique is to use the more popular things that are going to land in your inbox to entice you to click on something that looks legitimate. So it just makes sense that you would use the type of email people are normally going to expect to see. We talk about the holiday season, where we might see DHL or UPS or USPS, FedEx, what have you, and those types of phishing attacks seem relevant at a time of year when a lot of people are getting packages. I think it's the social engineering side of playing off what people are expecting, so their guard is down a little bit.

Nick Mellem:

You know, we're so ingrained with these brands. We know every one of them. So if you're reading, let's say, a phishing email, you're not going to second-guess the name; you already know who the sender is. So if I'm social engineering or phishing somebody, I could easily misspell the name and still get you to click on it, and you're not going to register it. Your brain's going to fix it and go right through, because you already know Amazon, you know DHL, you know Microsoft, et cetera, down the list we go. You're going to be more apt to just cruise through because you know the brand, versus something you maybe haven't seen, right? There you might take a second pass at it and say, oh, that's not spelled correctly, I'm not going to click on this.

Joshua Schmidt:

And they can get really creative with that, right? There are little tricks in the text, and options for emojis and symbols and things like that, where you can replace letters with look-alikes, like Os with zeros, for example. Yeah, Google.

Nick Mellem:

You added a third one.
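
To make the character-substitution trick Josh and Nick just described concrete, here's a minimal Python sketch. It assumes a short, hypothetical hard-coded brand list and a handful of common look-alike swaps, and simply asks whether a sender's domain only matches a trusted brand once those swaps are undone. Real mail filters do far more (full homoglyph catalogs, punycode handling, sender reputation), so treat this as an illustration of the idea rather than a detection tool.

    # Minimal sketch: flag sender domains that only look like a trusted brand
    # after common look-alike substitutions are undone (0 for o, 1 for l,
    # "rn" for m, and so on). The brand list and substitution table are
    # illustrative assumptions, not a complete catalog.

    TRUSTED_BRANDS = {"microsoft.com", "amazon.com", "dhl.com", "google.com"}

    SUBSTITUTIONS = [
        ("rn", "m"),  # "rn" renders a lot like "m" in many fonts
        ("0", "o"),   # zero for the letter o
        ("1", "l"),   # one for the letter l
        ("3", "e"),
        ("5", "s"),
    ]

    def normalize(domain: str) -> str:
        """Collapse common look-alike characters to their plain-letter form."""
        d = domain.lower()
        for fake, real in SUBSTITUTIONS:
            d = d.replace(fake, real)
        return d

    def looks_spoofed(sender_domain: str) -> bool:
        """True if the domain is not a trusted brand but normalizes into one."""
        if sender_domain.lower() in TRUSTED_BRANDS:
            return False  # exact match: the real brand
        return normalize(sender_domain) in TRUSTED_BRANDS

    if __name__ == "__main__":
        for d in ["micr0soft.com", "arnazon.com", "dhl.com", "example.com"]:
            print(f"{d:15} looks spoofed: {looks_spoofed(d)}")

Run as-is, this flags micr0soft.com and arnazon.com because they normalize into real brand domains, while dhl.com and example.com pass through untouched.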

Joshua Schmidt:

Yep. So, Jen, we know there's a lot of friction in companies around balancing security with implementing some of these controls; we want to balance usability with security. How do you communicate that with clients, and with people in your professional and personal life as well?

Jen Lotze:

I think the underlying foundation always has to be the why. If I just say I'm locking something down for no reason and it doesn't make sense to them, that's unfair, so I always want to make sure I give a little bit of the why. IT is known as the place of no, so we kind of have to combat that a little bit: IT is the place of no, but have you considered this other solution? We just need to be very human about how people can protect the company and protect company-wide data, but also know that all of our employees are humans too. And if we connect it to how do you protect your kids, your parents, et cetera, it goes a lot further in establishing that culture than just sending a bunch of emails.

Joshua Schmidt:

And Eric, where do you see the landscape going, especially in phishing and vishing? I know you just sent Nick and me quite a humorous audio recording of a phishing call you had. Where do you see this going? I mean, the classics will always be around, right? The impersonations or scams on the phone, trying to elicit some panic to create an action on your behalf. But where do you see it going, and how do you help organizations prepare?

Eric Brown:

The first part of that question, Josh: I think it evolves over time. Those of us who go back in tech a ways remember the Nigerian prince scams, and obviously that worked on some people, because they kept doing it. I think that's just evolved into a modern version of the Nigerian prince, and now we see ClickFix, where they're enticing you to copy and paste. I know we'll get into that in a little bit, but the social engineering aspect preys on the human weakness of wanting to believe something that might be too good to be true, like the Elon Musk and free Tesla scams, or the romance scams. These threat actors are going to take advantage of the times and the technology to really try to victimize us. So I don't know where it's going. I just know that with AI, and how convincingly we can have technology impersonate our voices or our writing, the detection side is just going to get more and more difficult. We'll need to leverage both the technology side, tools to help us, and education. That's one of the things that Jen, Nick, and I get out and do: talk to the community about these things that are happening and make people aware, so that when this does land in their inbox, they're at least more aware of it. Because now it's voicemail, it's email, it's texting. Pretty soon it'll probably be robots knocking on your door.

Jen Lotze:

Right. I've been involved in two recently that have just been truly awful. One was with a title company. I'm guessing that title company got phished and someone somehow got access to some information. Long story short, the title company got spoofed and someone actually got taken for $116,000. And just working with this person from the ground floor up, they're submitting it to like 10 different complaint centers, and there's really no collective way to try to get that money back; it is a really hard process. And this person is very tech savvy, and it just goes to show that falling for it doesn't make you weak or anything. It was a really, really, really good scammer.

Joshua Schmidt:

You just mentioned ClickFix, Eric. I'll serve this up to you, and then maybe you can get us into this article a little bit. I had to read through this one carefully; it's a very dense article from InfoStealers. From my understanding, cybercriminals created an $800 tool that tricks 60% of its victims into hacking themselves. It makes legitimate websites look broken, with corrupted text and glitches, then offers a fix button with instructions to paste code into your computer's command prompt. When you do that, it installs malware and steals your passwords and credentials, and attackers use those stolen credentials to inject the scam into more websites, creating self-spreading infection cycles. And it seems like it bypasses security software, because your browser just sees you copying text and your antivirus just sees you opening a command prompt. What brought this to your attention, Eric? Has it been in the news?

Eric Brown:

Yeah, on ClickFix, we see this a lot across industry. It is, unfortunately, a really popular way that threat actors are attacking organizations. Here in St. Paul, we just had a pretty big breach a little while ago with the City of St. Paul. I think the publicly facing attestation was that it was from a drive-by, but they haven't released more details yet around exactly what the indicators of compromise were. ClickFix has become a standard term for a piece of malware where, when you visit a website, it can be embedded as a program on a site that a threat actor gained a foothold on, and we've potentially seen it in maliciously formed banner ads as well. When an individual goes to the website, the attack method is to entice them to copy malicious code, which is downloaded to their browser, and paste it into essentially the command line of their computer. What this does is target somebody who doesn't have to have any technology experience, because it's telling you exactly what to do: hit the Windows key, open the Run command, Ctrl+V to paste, and hit Enter. When you hit Enter, it's running that command, unbeknownst to you, on your computer, which then goes out and potentially grabs additional malware or calls home to a server that's going to send further instructions. And it's a self-replicating cycle, because what they're trying to do is get information about web servers you may have access to; they're stealing credentials to those web servers, then installing this malicious content on them, so that when additional people visit those web servers, they become infected as well, and the cycle just continues to repeat. And this particular piece of software, you can essentially purchase for $800. A little bit of money, but in the grand scheme of things, if you're selling the credentials you capture or other information you're getting from the victim's machine, that $800 could easily pay for itself pretty quickly.
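
One practical way defenders hunt for the paste-into-Run behavior Eric describes: on Windows, the Run dialog (Win+R) keeps a per-user history under the RunMRU registry key, so a pasted one-liner often leaves a trace there. Below is a minimal, Windows-only Python sketch of that idea. The keyword list is an illustrative assumption rather than a vetted indicator set, and a real investigation would correlate this with PowerShell and process logs.

    # Minimal sketch: list the current user's Run-dialog (Win+R) history from
    # the RunMRU registry key and flag entries containing keywords that often
    # appear in ClickFix-style pasted commands. Windows-only; the keyword list
    # is an illustrative assumption.

    import winreg

    RUNMRU_PATH = r"Software\Microsoft\Windows\CurrentVersion\Explorer\RunMRU"
    SUSPICIOUS = ("powershell", "mshta", "curl", "bitsadmin", "-enc", "http")

    def recent_run_commands():
        """Yield entries from the current user's Run-dialog history."""
        try:
            key = winreg.OpenKey(winreg.HKEY_CURRENT_USER, RUNMRU_PATH)
        except FileNotFoundError:
            return  # no Run history recorded for this profile
        with key:
            index = 0
            while True:
                try:
                    name, value, _ = winreg.EnumValue(key, index)
                except OSError:
                    break  # ran out of values
                if name != "MRUList":  # MRUList is just ordering metadata
                    yield str(value)
                index += 1

    if __name__ == "__main__":
        for cmd in recent_run_commands():
            flagged = any(s in cmd.lower() for s in SUSPICIOUS)
            print(("SUSPICIOUS  " if flagged else "            ") + cmd)

Clearing or monitoring that key, and disabling the Run dialog by policy where it isn't needed, are common follow-on mitigations.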

Jen Lotze:

For me, I've been seeing it where we know you can buy a phishing campaign for $25, right? So I think it just makes the ability to do harm so much more accessible to more people. As far as the attack vector, it puts so many more threat actors into that space because it's so much easier. You don't have to be highly, highly technical anymore to run these scams. I mean, a lot of the best scams are predominantly social engineering, right? They call you on the phone and do all these things that don't actually involve much technology at all; they just ultimately end up getting information from you that they can take and use and exploit. So knowing that it's only $800, I mean, that's a lot, I could get some really great shoes for that, but it just makes it way too accessible.

Nick Mellem:

Great point, Jen. I was going to touch on how accessible this is. Here you've got a legitimate-looking website selling something you can just run out of the box. It shows the reach that malicious actors can have and how easy it is to get started. Some of these criminal operations have an HR department, right? You don't need to be at that level. But it really shows, from our side, the training we need to have with our staff, and we've talked about this in great detail over many episodes. From my understanding, and maybe somebody knows differently, I've never seen a legitimate website ask me to copy and paste a command.

Joshua Schmidt:

So traditional training teaches us not to download suspicious files, right? But because this asks you to copy and paste, it might appear to somebody who's untrained or not paying attention that it's actually a legitimate technical fix.

Nick Mellem:

This is a good example of why it's so important to have your technologists, your cybersecurity experts, not sitting in a back room of an organization. They need to be out talking to people. Let's say you work at a factory: go out on the factory floor, talk to the people who might be working on computers, go everywhere and have these discussions, put this in a newsletter, and normalize these kinds of conversations. Show what's happening out in the industry. If you have a company meeting, this would be a great topic to discuss, because I don't think we're normalizing this specific issue. These are the things you should just be trained to watch for. Everybody sitting on this call right now, if we see this, it's not even a second before you know it's BS, right? You just know. Jen is a good one to talk about this with, because she's out at organizations of different sizes talking about these things from a tabletop perspective, right? Practicing these things. So you have the initial training for your staff, but then it's what do you do after, right? Let's say somebody does copy and paste; you practice what happens after. So it's two parts, but I'll leave it there to get input from everybody else.

Eric Brown:

I think we could phish Jen with shoes, Nick.

Nick Mellem:

Shoes, copy, records, concert tickets, anything sparkly, really.

Joshua Schmidt:

Jen is a clicker.

Jen Lotze:

I feel like you guys know me or something.

Joshua Schmidt:

Is "clicker" a cybersecurity pejorative now that I'm not aware of? Is that what we can start calling each other when we fall for phishing scams?

Jen Lotze:

Yeah, I mean that's kind of like, no, yeah, you're a clicker. It's okay. We love you anyway, but you're a clicker.

Joshua Schmidt:

Sorry to interrupt; we have a question here from one of our viewers who's been patiently waiting. I'll throw it up on the screen. I think the answer is just yes: is it still true that even if we put all the security controls in place, a single person can still mess it up? Yes. But this one is even funnier; I love this question. The beard guy is saying that cybersecurity guys should be extroverts, really.

Nick Mellem:

This guy's not really no, it is. I agree. Go ahead. Somebody's talking.

Joshua Schmidt:

I think you're on this one, Nick. You're the beard guy.

Nick Mellem:

No, and I was actually laughing about this the other night, because it is so true, asking people in this field to be extroverts. But I think that's where we actually gain ground in this space. True story: I was a bearded guy 10-plus years ago, working at a manufacturing company, running some help desk tickets. And instead of just remoting into somebody's computer, I would walk out to the third building to sit at that person's computer and fix the problem. The reason I did that is because then that person, if they see somebody they don't know, or they're unsure about a phishing email, whatever it is, they just say, hey, I'm just going to call Nick. Right? So you build that culture. That's what I'm saying. So great question, I appreciate it, and he's correct. That's a gap we're going to have to figure out, slowly but surely.

Eric Brown:

And Nick, in a larger organization it's tough to have 30 Nicks running out there; you've got to have somebody technically savvy at the keyboard doing the work. But you could have people who are representative of the security organization out and about. And just to call back to what Jen was saying before, it's not making security a no, but a no-but, or a how do we do this to help the business?

Joshua Schmidt:

So this is the next article, which Jen brought up, and I thought this was really cool; I hadn't heard about it. This is from 9to5Mac: Apple's working on an AI-powered wearable pin. Now, this is a report, so it's somewhat speculative, and it says the wearable could be launched as early as 2027. Apple's pin, which is a thin, flat, circular disc with an aluminum and glass shell, features two cameras, a standard lens and a wide-angle lens, on its front face, designed to capture photos and videos of users' surroundings, the people said. It also includes three microphones to pick up sounds in the area surrounding the person wearing it, and a speaker, yada yada, you get the idea. So, Jen, why did this stick out to you, and what are the security implications here? Some of them stand out as obvious, but I'd love to hear your thoughts.

Jen Lotze:

Yeah, I come from the school district; I was a teacher, a tech leader, all the things. So I always live in this space of thinking about kids, and privacy is always my number one thing. Actually, I'm super crazy about it. And I even struggle with this a lot too, because we have wearables now, we have AI meeting tools, and we're recording so much more information, I think, than we ever did. In that space, it lowers our sensitivity to some of the things that we're doing. Just because you have this wearable and you want all that information doesn't mean that the people on the other side want you to have that information about them. So I always find these articles and this tech really interesting, because we live in a world where you're supposed to have consent before you record someone; you can't record without their knowledge. And this makes it really interesting: what will the world look like as we move forward regarding privacy? It started with social media and has continued to evolve through influencers and all this stuff, and we have a different insight into people's lives. I think this just continues. And I wonder: who has access to that data? What if something criminal is recorded? It opens up this whole can of worms about privacy. I use AI note-takers, and I appreciate them because then I can focus on the meeting and understand and listen better, but also my voice is recorded, so when I make that dumb joke, it lives on in infamy. So that's really where my head goes around this: what does this really mean for privacy? So that's me.

Nick Mellem:

Yeah, I think these things are really cool. I think most of us just appreciate the technology and where things are going, glasses, et cetera. We already have the Meta glasses, cameras looking everywhere. I think we've become accustomed to it. Do you like it? Probably not. But if you're in a public place, there's not much you can do about it. I just like it from a technical standpoint. I have one of the first ones I heard about, the Bee, I think it's called. I haven't really used it a terrible amount, but the idea is the same as this pin: you can walk around, have your conversations, and it might show you where you had a conversation if you're in a meeting at a coffee shop, et cetera. To me they're handy tools, but I'd be curious to see which way Apple is going to spin this, because if we know anything about Apple, they'll have some way that they made it better, right? They'll wait to get into the space until they can move it along. One of the biggest players right now is the Humane pin, I think, or something along those lines, and I don't think it's going very well for them, from my understanding. But certainly, Jen's already touched on it: for me, the big thing is obviously going to be privacy. Jen actually brought up exactly what I was thinking about. If a crime or something occurs and you catch it, whatever the situation may be, it would be really interesting to see how that's handled. In the past, when an iPhone was involved, Apple denied access to the FBI. So maybe we'd see the same situation here, but time will tell.

Joshua Schmidt:

Eric, you could stick one of these right on your Haven beanie and wear it, and everybody will know. But I wanted to ask you, Eric, with your CISO experience: what if you see people showing up to the office wearing these next year? Let's say this trend just takes off and everybody loves them.

Eric Brown:

Well, one, I think the whole brand's been going downhill since Jobs left in 2011. But other than that, to address your question specifically: I don't know that there's a lot we can do without working with the individual client's legal team to discuss how they view wearables and privacy in their office and working space, and whether they have policy that governs it. Because that's where you'd have to start.

Joshua Schmidt:

They actually mentioned something related to your first comment about the brand going downhill: they've outsourced some of their AI work. They haven't been doing well with AI, reportedly, and they've outsourced that to Google, which creates a whole other set of security implications, right?

Jen Lotze:

Because now we're working with... that's so interesting to me.

Joshua Schmidt:

That's great. With Alphabet, right? Like all of a sudden you're working with Alphabet. So, you know, Apple, who's prided themselves on storing that information and being a secure closed ecosystem, is now outsourcing a lot of AI stuff to Google.

Eric Brown:

The outsourcing of proprietary content kind of reminds me of what happened with 23andMe, where it got bought by private equity. Here's something, our DNA data, right? It was like, okay, this company's going to keep it secure, they're ethical, whatever. And then five years later, whoops, it got sold. Now this new company, the private equity buyer, may not have the same ethos as the original company, and who knows what's going to happen to that data. Same thing here, and Apple's much bigger, but I think as consumers we just have to be conscientious of not only what's going on today but what the future implications are.

Joshua Schmidt:

Do you see this as kind of an AI moment, in the sense that you're trying to hold water in your hands? You're not going to stop this advancement of wearables and things like that? Exactly.

Jen Lotze:

Because I come from that school space, right? We don't put cameras in classrooms; there are certain things that we just don't do. So we put it in both our employee handbook and our student handbook about wearables and recording. Acceptable use policies are super important when it comes to this, but ultimately you have to establish the culture so people know why, right? We have to identify what data we have, where it lives, and how we're protecting it, so that when we say you can't use something like this, it makes sense.

Eric Brown:

Looking forward five, ten years, I don't think we're going to be able to escape the outflow of data. I think it's just going to be ubiquitous. And Jen, I understand what you're saying now, but at some point cameras will be in the classroom, is my prediction; they'll say it's for student safety. And wearables, glasses, pins, what have you, will just be commonplace, just like wearing a wristwatch today. Everything you do from the time you get up to the time you go to bed, and maybe even while you're in bed; we've got monitors recording our sleep patterns and all that stuff now. I think it's just going to be a Black Mirror episode where everything is on camera 100% of the time. So if we wrap our minds around that, think about how we exist in that type of environment, and start planning for it, that's probably the best use of our time, versus trying to hold it like water in your hands, because at some point it's just going to run through your fingers.

Joshua Schmidt:

Thanks so much for your time today, you guys. It's been a pleasure chatting with you about the news, and I'm looking forward to the next one next month. You can always go to itauditlabs.com, where we have a podcast page that shows when our next live stream is coming up. I believe field notes will be coming out in a week or two, and we have a game night coming up in early February, the first Wednesday of every month. Please like, share, and subscribe, and if you have the time, please drop us a little review on Apple Podcasts. And check out our other podcast, Sip Cyber, presented by Jen Lotze here at IT Audit Labs. So lots going on; keep checking in, and we'll see you in the next one. You've been listening to The Audit, presented by IT Audit Labs. I'm your co-host and producer, Joshua Schmidt. Today we had Nick Mellem, Eric Brown, and Jen Lotze. See you soon.

Eric Brown:

You have been listening to The Audit, presented by IT Audit Labs. We are experts at assessing risk and compliance while providing administrative and technical controls to improve our clients' data security. Our threat assessments find the soft spots before the bad guys do, identifying likelihood and impact, while our security control assessments rank your level of maturity relative to the size of your organization. Thanks to our devoted listeners and followers, as well as our producer, Joshua J. Schmidt, and our audio-video editor, Cameron Hill. You can stay up to date on the latest cybersecurity topics by giving us a like and a follow on our socials and subscribing to this podcast on Apple, Spotify, or wherever you source your security content.