Black Hat 2024 Day 2 Recap

Episode 301 –

On this episode of the podcast, we have another recap from the Black Hat security conference in Las Vegas. This time we discuss a new initiative to protect the world from deepfakes, followed by a penetration testing engagement that proved immutable backups don't always mean available backups.


Marc Laliberte  0:00  
Hey everyone, welcome back to The 443 – Security Simplified. I'm your host, Marc Laliberte, and joining me today is Corey,

Corey Nachreiner  0:06  
shallow, real Nachreiner,

Marc Laliberte  0:09  
shallow, real,

Corey Nachreiner  0:11  
opposite of a deep, fake.

Marc Laliberte  0:13  
Ah,

Corey Nachreiner  0:14  
but shallow,

Marc Laliberte  0:15  
okay, and that's,

Corey Nachreiner  0:17  
I don't know. It's the first thing that came to my head. It's been a long day at Black Hat.

Marc Laliberte  0:22  
Well done. As Corey just mentioned, we just finished up day two at Black Hat, like, an hour and a half ago. So today we're coming back to you with another short recap of just our two favorite talks from today and some takeaways on what they mean for our industry. And I guess, without any further ado, let's... I said "hack our way in" last time, so... breach our way in. There you go. Yeah, that sounds great.

Corey Nachreiner  0:48  
It's like a whale out of the water.

Marc Laliberte  0:57  
Let's start today with a quick recap of today's initial keynote.

Unknown Speaker  1:04  
Black Hat, yeah,

Marc Laliberte  1:05  
I think one of your favorites. Maybe not favorite, but someone that at least you seem to talk about a lot. So I imagine, someone obviously

Corey Nachreiner  1:12  
Moxie Marlinspike. Encryption, cryptography, SSL genius, old school hacker, I think from even Dark Tangent's age. Some of the examples he used are pop culture examples from back then, so I like that.

Marc Laliberte  1:25  
And more recently, the founder of Signal, the secure messaging app, which started

Corey Nachreiner  1:29  
as, what was it called? RedPhone? Some red phone thing, yeah. But

Marc Laliberte  1:33  
anyway, so the keynote this morning, to kick things off, started with Moxie going up on stage and giving this long discussion about, basically, how we should improve engineering, was my takeaway from it, and make sure engineering is more aligned, and, well, we'll get into it in a second. But then he went into a little fireside chat with Jeff Moss, The Dark Tangent, about a few takeaways from that too. But I wanted to focus on his initial discussion, because there were some really interesting bits in it. I actually, unfortunately, missed the first five or ten minutes of it for the AI webinar, which is available on the WatchGuard webinars download page on demand right now, if you want to learn how to make an AI security policy for your company. But when I came in, it was super interesting. So maybe you can fill in before me if I'm missing anything key from it. But he was talking about how, in the world of engineering right now, we're going through a phase where we're starting to really focus on abstraction layers for components that we're using. And there's two different ways of handling abstraction layers. Think of it as, like, a library that handles some piece of code so you don't have to go write that. One of the examples he gave was logging for an application. Like, you could go write logging components, or you could go get a logging library and let that take care of it for you. And there's benefits and trade-offs from using these abstraction layers, depending on how you use them. If you're still very engineering-minded, you might use it as a way just to speed up your code while still understanding fundamentally what's going on under the hood. But more often than not, people are treating them as just black boxes where they know it does something, they know how to set it up, but they don't fundamentally understand how they work, and that's causing some issues. And I think

Corey Nachreiner  3:28  
it started even with Dark Tangent talking about the complexity of software in the cloud now. All these mechanisms, software, different clouds, different platforms that interact with each other, but are these abstraction layers. And I think Moxie's main takeaway is, while, like you said, there's benefits to abstraction layers, I mean, at a high level, rather than reinvent the wheel every time we want to pop up a Windows dialog box, we don't have to do that. But he was trying to show the actual innovation from someone that actually digs deeper, that reads the books, that reads all the details about a technology. Suddenly they know so much about the deeper stuff that they can start to do magic. And one of the examples he started with, which might have been before you got there, was he was showing animations from what had to be the 1980s, when we had EGA, the Enhanced Graphics Adapter. I don't know if you even know what an EGA card is. VGA? VGA is the better one. Okay, so when I was a kid, you know, graphics were black and white, or green and black, or amber and black. Then they had CGA, which was the Color Graphics Adapter, which did four colors, whoo. EGA bumped that up to 16, and VGA was 256, if I remember correctly. Dang, look at all those colors. But one of the things you couldn't really do, in the traditional way you could now, is animation with lots of rich colors in motion. And he showed how understanding how each bit was working, really on a bit-by-bit level, where you would have some of those 16 colors mixed in a different way to make a pattern that, from far away, looks a certain way. By doing a simple thing like a bit flip, rather than trying to change all these pixels to create an animation, you could do a bit flip that would just change one of the colors to something else, and it would have a big effect.
Long story short, if you didn't know the underlying technology of how EGA worked, you couldn't figure out this cool way to create animations that were pretty. They looked decent even by today's standards, for something back then where 16 colors is not a lot to work with. And it was really his argument for: if you don't understand the deeper systems, you can't get to the optimization or the neat tricks or the better stuff. Yes, 90% of the time you'll be working in those abstraction layers, but understanding that deep stuff is important. He
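The palette trick Corey describes can be sketched in a few lines. This is a toy model, not actual EGA hardware code: the screen stores palette indices rather than colors, so rewriting a single palette entry recolors every pixel that references it, which is far cheaper than redrawing pixels one by one.

```python
# Toy model of palette-based animation: pixels are palette *indices*,
# so one palette write "animates" every pixel that uses that entry.
PALETTE = ["black", "blue", "red", "yellow"]  # pretend 4-color mode

# A tiny 2x4 "screen" holding palette indices, not colors.
screen = [
    [0, 1, 1, 0],
    [0, 1, 3, 0],
]

def render(screen, palette):
    """Resolve each index through the palette, like the display hardware does."""
    return [[palette[i] for i in row] for row in screen]

frame1 = render(screen, PALETTE)
assert frame1[0][1] == "blue"

# A single palette update recolors every "blue" pixel at once --
# the screen memory itself is never touched.
PALETTE[1] = "cyan"
frame2 = render(screen, PALETTE)
assert frame2 == [
    ["black", "cyan", "cyan", "black"],
    ["black", "cyan", "yellow", "black"],
]
```

Real EGA/VGA palette animation worked the same way conceptually: one register write per frame instead of thousands of pixel writes.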

Marc Laliberte  5:56  
gave a more modern example too, after I arrived on the scene, about video game development. Where, in the world of game development now, there's all these different game engines you can use that let really anyone develop a video game. Some of them, like RPG Maker is one I'm aware of, where you don't even need any coding expertise at all. It's basically all plug and play. Even

Corey Nachreiner  6:17  
Unreal and Unity nowadays are pulling boxes around to do animation. Yeah, definitely

Marc Laliberte  6:21  
help out a ton. So you don't need to know, like, fundamentals about C++ development in order to create a game these days. And what that's meant is, he showed this chart of the volume of games being released on Steam, which, Steam is like the repository for video games, which I'm sure all of you already know. The volume of games being released on Steam has, like, skyrocketed over the last decade because of tools like these game engines making it easy for anyone to get in. But that doesn't necessarily mean that we're releasing quality games. Because on the same chart, um, he added a dot graph showing the volume of critically acclaimed games, basically ones that get really good reviews as high-quality video games, and that's been steady at best, or even dropping in some cases too. So just because these abstraction layers are making things easier doesn't mean they're necessarily making things better. And

Corey Nachreiner  7:17  
you talked about how engineering organizations are starting to do the same thing, where they're siloing teams into, like, modules of, I don't know, you do cloud, you do network layer, you do whatever. But those silos end up being abstraction layers, where the teams may not be communicating as much with each other, whereas when you start to learn all those technologies, you can learn neat ways to mesh them. So it really started where he said that, right now, the vision in engineering has been separated, and it seems, because of that, velocity is down, and there doesn't seem to be a ton of visionary engineering, or game making, or whatever. Where he really believes, to bring vision and engineering together, you have to go into those abstraction layers. But then, I mean, maybe you can talk about how the point of this was security, though. We're the ones, in order to find vulnerabilities, you know, abstraction layers are where you go. That's where all the meat of the complexity and the interactions happen that you can tend to take advantage of. So, yeah,

Marc Laliberte  8:19  
that's where it really hit close to home. And clearly the point of this whole discussion was, like, security professionals, and specifically application security engineers, the ones that are trying to find issues in code, like you said, they have to understand on a fundamental level. They have to think differently and break through that abstraction layer, understand exactly what's going on at a lower level, sometimes even more than the person that actually wrote it, in order to find and resolve these issues in these applications. It's strange

Corey Nachreiner  8:46  
and inspiring too, because he essentially said we have inherited the world, because seeing that deep into those application or abstraction levels is magic. Like, if you only know that if I put these inputs into this abstraction layer I get these outputs, and you never go deeper, you won't find all the magic and interesting things you can do if you understand how the subsystems work. It was

Marc Laliberte  9:07  
actually, the takeaway from it for me was, I'm glad I went into cybersecurity and not software engineering, because we definitely have the cooler job and the more interesting one.

Corey Nachreiner  9:17  
Feels that way. Although, and I say this as someone that does not code much at all anymore, but I did at one point, I actually think, to really be good at cybersecurity, you do need to understand software engineering, and that's something you're diving into yourself,

Marc Laliberte  9:30  
absolutely. So the keynote was pretty cool. They had a little fireside chat that, I mean, it was interesting. There were some interesting takeaways about how, like, software development is fundamentally expensive, because you have to maintain what you put out there. Yeah. Versus, the analogy used was, like, recording an album as a musician. Once you're done

Corey Nachreiner  9:47  
recording, or you make a movie, you have a thing. You don't have to keep updating it; everything around the movie is not changing. Well, except for George Lucas. That's a good point. That's an excellent point. The metaphor breaks down, yeah?

Marc Laliberte  10:00  
But for normal humans and filmographers and musicians, like, once you put out your creative work, you're done. Versus with software, you have to maintain it for

Corey Nachreiner  10:08  
10, 20 years, forever, yeah, as long as you want it there. So

Marc Laliberte  10:11  
that was, I thought, a good way to kick off the overall day two and final day of the Black Hat conference. And I guess, for the sake of time for this episode, let's just each pick out, like, one interesting talk that we saw. Sounds cool. You want to maybe pick one of yours first? Sure, we can dive into it. Cool. Yeah.

Corey Nachreiner  10:28  
Well, I will tell you, I'm not going to talk about my favorite talk. My favorite one, I'll mention, was Living Off Microsoft Copilot. Actually, if you're someone proof-of-concepting Copilot, that one will freak you out. I'm not going to talk about that one today, because for time's sake we want to keep this short, so I'll save that maybe for our recap, if we're allowed to do more than just DEF CON stuff. I think so. Yeah. But a simple one, an easy one I just want to bring up, is a big... it's something anyone in the world can understand and think about. It was called Tracing Origins: Navigating Content Authenticity in the Deepfake Era. And it was, strangely, a dude from Adobe. Like, we don't think about Adobe people coming to a cybersecurity conference as much. So that was interesting, and I realized that

Marc Laliberte  11:13  
they weren't just made up entirely of lawyers at this point, although he

Corey Nachreiner  11:17  
probably is closer to a lawyer than anything else. But the concept was... I think part of why I wanted to bring it up is, you and I have talked about deepfakes a ton, and it's a problem we don't know how to solve. And one of the big worries for me, at least for deepfakes, is misinformation. Not just that deepfakes, obviously, are getting better and better and will get exponentially better, and not only will we not be able to identify something that's fake, but once we're not able to identify fake stuff, we're not going to believe real stuff either. And I think on that podcast, you once asked something like, so what's the solution to this? And I don't think either of us really knew, but I did bring up that it has to be something like watermarks. At the time of content creation, you would have a watermark. And that's essentially what Peleus Uhley talked about. I'm sorry, who? Peleus, P-E-L-E-U-S, Uhley, U-H-L-E-Y. All right. How would you pronounce it?

Marc Laliberte  12:12  
I don't know, no better or worse than you.

Corey Nachreiner  12:14  
So he talked about the deepfake problem. By the way, he started with things I didn't even... I think we both know that there was that deepfake with Biden's voice, the big deal during the campaign. What I didn't realize is they found that guy. He's been charged with 26 criminal charges, and he's been fined $6 million. So, crap, making deepfakes of other people is a big deal, and apparently has some fines behind it. I'm

Marc Laliberte  12:42  
curious, like,

Corey Nachreiner  12:43  
how you can... what law, how it's going to hold up.

Marc Laliberte  12:46  
I understand why it should be illegal. I just struggled to understand what the actual law is.

Corey Nachreiner  12:53  
I don't know the law either. He was showing some headlines of why deepfakes were such a big deal, and that was one of them. Some of the concepts he mentioned: deepfakes are becoming big in politics in all countries. India, Russia, all over the world. One thing he brought up that I didn't realize is deepfakes aren't always bad. There's some politicians, like in India, where they speak, I'm exaggerating, but 26 different languages, right? So there's some Indian politicians that commission deepfakes of themselves, maybe speaking Punjabi, which they don't speak natively. But if they're trying to reach constituents that are native speakers there, they will commission a deepfake of themselves. So it just adds complexity to, you know, how do you differentiate real video or voice from malicious fake video, versus fake video that is benign? You know, it's just becoming a cluster-bad-word, which I won't say. Long story short, he got to: there's actually an organization that Adobe and a lot of other companies have put together called the Coalition for Content Provenance and Authenticity, C2PA, that was founded in February 2021. Really rolls off the tongue, I know. More acronyms, right, and not even great ones. It has almost 1,000 members, all big companies, which are trying to solve this problem. They're trying to figure out standards and ways to basically put some authenticity information, or at least, what did he call it? Assertions. He's not saying it's proof; they're assertions. Putting assertions in image or video that say, this video was created at this time, by this person, on this camera, and it's digitally signed. So, interesting. Long story short, C2PA is like the founding organization that's writing the spec for how you add metadata to images, audio and other content to try to give it some digitally signed authenticity. He also talked about a standard they've created, which is CAI, the Content Authenticity Initiative. And you already know the metadata that's in images.
It's building off that similar metadata, but adding a digital signing part to it, so people will have keys. He talked about the CAs, the certificate authorities, that will be involved. They're going to be different than the browser ones, because it's for a different reason, and other things around that. So, long story short, the takeaway is there is an organization trying to create a standard. And folks like Nikon and Canon are already using some of the SDK that CAI is making. And so when their cameras take a picture, they're embedding this content in the image. And you can even have, like, a journalist can have a certificate that says, I signed this. This is my picture. This is the original. And if anyone tries to take that image and alter it in some way, create a deepfake of it, it wouldn't be signed, right? So you'd know it's fake. If the journalist is in another country and wants to send it home, and the newspaper wants to make sure this is not someone spoofing the journalist, they have the digital certificate. So it's at least a way to maybe add some, not mandatory, but possible data to images and audio and video.
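The signed-assertions idea Corey is describing can be sketched with a toy manifest. Real C2PA manifests use X.509 certificates and asymmetric signatures, not HMAC; the HMAC below is only a stand-in for the creator's key, and the field names and values here are made up for illustration:

```python
import hashlib
import hmac
import json

# Toy stand-in for a creator's signing key. Real C2PA uses X.509
# certificates and asymmetric crypto, not a shared HMAC secret.
CREATOR_KEY = b"journalist-private-key"

def make_manifest(image_bytes: bytes, assertions: dict) -> dict:
    """Bundle assertions with a hash of the content, then sign the bundle."""
    payload = {
        "content_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "assertions": assertions,  # claims like author, time, device
    }
    body = json.dumps(payload, sort_keys=True).encode()
    signature = hmac.new(CREATOR_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}

def verify_manifest(image_bytes: bytes, manifest: dict) -> bool:
    """Check the signature AND that the content itself is unaltered."""
    body = json.dumps(manifest["payload"], sort_keys=True).encode()
    expected = hmac.new(CREATOR_KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, manifest["signature"]):
        return False
    return hashlib.sha256(image_bytes).hexdigest() == \
        manifest["payload"]["content_sha256"]

image = b"\x89PNG...original pixels..."
manifest = make_manifest(image, {"author": "Jane Doe", "device": "Nikon Z9"})
assert verify_manifest(image, manifest)                 # original checks out
assert not verify_manifest(image + b"edit", manifest)   # any alteration fails
```

Note the point Corey makes later: verification only proves which key signed the assertions and that the bytes are unchanged; the assertions themselves could still be lies.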

Marc Laliberte  16:05  
So is the proposal, like, a unique key per individual? Like, as a journalist, it would be tied to my camera, kind of thing.

Corey Nachreiner  16:12  
It's a unique key per individual, and that individual can use that key in devices, in Photoshop, and in other things. But then he's also talking about, like, AI creations. Like, DALL-E is part of this organization. They would agree to put metadata that also identifies this image was created by DALL-E and have a signature there, maybe even for the, uh, the user on their site that created the image. So it adds some accountability. So at the very least, you know, it's not going to be mandatory, because for privacy, and, you know, some things for kids, you can't force this on everything. But if you do do it, you can at least tell this is the person that said all this detail. Again, the reason he called it assertions: the assertion is, I made this image in this program. I made this image at this time, at this date. Maybe those are all lies. Like, you could lie about that and still sign it with your certificate, but at least it would be known that, if it ever comes out that you lied, that was your certificate. You're the liar.

Marc Laliberte  17:16  
That's interesting. So, like, actual non-repudiation protections as well, where you can prove the origin of something,

Corey Nachreiner  17:22  
yeah. He even showed an example where, you know, he made an original image in DALL-E, and it had a flaw in it. It had, like, three strings instead of two for a hoodie. He made a hacker image. So the DALL-E image had all the attestation and digital signature saying this was made that way. But then he went in Photoshop and edited one of the strings out, and it had that chain, so you saw, okay, this original was made in DALL-E, but this is a changed one, by someone else, in Photoshop. That's

Marc Laliberte  17:53  
cool. Maybe this is a potential solution, then it's

Corey Nachreiner  17:56  
the best I can think of right now. But it is, like, voluntary. So it will have to be people deciding that if images or videos don't have this, they don't trust them. And even then, when you are looking at this, you have to realize everything in the metadata is an assertion. The only thing you can know for sure is that this person's private key signed it. That's

Marc Laliberte  18:20  
interesting because, I mean, the end goal of this is, like, how do you get the everyday citizen to give a crap about this? Like, it makes sense where, like, now a news organization can say, oh, you know, that was actually blah, blah.

Corey Nachreiner  18:34  
I think it will happen, though. I mean, if we start to get bombarded with misinformation... the one thing is, a normal user's not just going to go look for this metadata. It has to be the programs we open that will start to show this. Like, imagine you open something in Photoshop and it has a green check mark versus a red one showing it has this information. But it will have to get to the point where normal people are so sick of misinformation that they look for it. So who knows if it will catch on. And if it's only in the metadata, no normal user is going to use it. But I imagine other programs will start to give you, like the green lock thing on a secure website, they'll give you something showing that at least the image has this in it. Maybe you could see

Marc Laliberte  19:16  
that, like, something displayed on YouTube or something. Because, I mean, it has to be real time, in my opinion. It can't be, like, fact-checked after the fact. Because, as we've seen from just, you know, verbal misinformation from certain politicians, it doesn't matter if you fact-check it two days later, because their base has already eaten it up as truth, and they will never see that. And

Corey Nachreiner  19:37  
that's why they're working with content... that's why they're working with camera companies. That's why they're working with image generation or video generation companies, and even chipset makers, you know, the chipsets that go in all this equipment too. So we'll see if it takes off. It seems to have some momentum. And

Marc Laliberte  19:55  
now we've got to worry about, like, protecting cryptographic keys in cameras. Are we going to start throwing TPMs in everything? I

Corey Nachreiner  20:00  
was going to say, I mean, you at least know it was signed by someone's private key. You still don't know if it was signed by them, because what happens if their private key

Marc Laliberte  20:09  
gets stolen? What's to stop someone from stealing a camera and ripping it out of the firmware? Then, although

Corey Nachreiner  20:13  
that's why the chip makers are getting involved. You put it in the TPM, and it should at least be harder to strip out of the camera or whatever chipset device.

Marc Laliberte  20:21  
So this is all just a ruse from big hardware, then, I guess. We'll find out. We'll find out either way. That's really interesting. That is a potential solution to what I thought was a

Corey Nachreiner  20:31  
it's going to be a hard problem. It'd be interesting to see if this works. Yeah, cool

Marc Laliberte  20:35  
talk. The one I wanted to go over was actually the very last one I went to today. It was called Are Your Backups Still Immutable, Even If You Can't Access Them? And it was by Rushank Shetty and Ryan Kane.

Corey Nachreiner  20:48  
If a tree falls in the forest, does anyone actually hear it? Yeah.

Marc Laliberte  20:52  
So it was really interesting. The whole topic starts with the premise that, you know, ransomware operators are trying to go after your backups, because most organizations hopefully have a good backup and restoration process.

Corey Nachreiner  21:05  
A lot of them are, like, automated cloud ones, like Veeam, so they have systems that are pretty standardized that a ransomware author can just kind of go after quickly. Exactly.

Marc Laliberte  21:13  
So if you can disrupt those backups, either make them unrecoverable... you probably can't delete them. In this case, they showed ways that you could not delete them. But if you can at least slow down the recovery, you might still incentivize that company to pay the ransom in order to get back online quickly. And so, I forgot what company they work for, but basically, they're two penetration testers internally for the company they work for, and this started as a targeted penetration testing engagement to evaluate the immutability of their own backups that their company was using. Huh. And so they went after the three backup systems that they're using. One of them was a Dell EMC Data Domain device. Another one was an IBM DS8000 device, and then also cloud backups, using the AWS Backup service as well too. It was interesting: for the Dell one, the EMC one, they had a lower-level environment device that they could just go ham on. The DS8000, though, they only had a production one, and so they were a little constrained in what they could play with, exactly, but they still got some wins on that one. And then for AWS Backup, they got their own account in their AWS organization to, again, just go ham on and see what they could do. So basically, the talk walked through the steps they took in their penetration testing engagement to see, could I, as a ransomware operator, make these seemingly immutable backups unusable? Because most of these tools, in fact all three of these, and most major backup solutions these days, have different features in them, like compliance requirements, or write-once-read-many protections, where basically, as soon as you write to storage, a bit gets flipped or a fuse gets hit or something, and you are now unable to overwrite that at all. And there's no way to undo that without just completely reformatting the whole device or shipping it back to the manufacturer or whatever. I know in AWS it's their compliance hold feature.
Basically, you have to send, like, a legal letter from your lawyer with, like, a court order to turn off some of these features too. That's good. Yeah. So there's a lot of really strong protections where, once you write the data, you can still read it, but you cannot delete it or modify it, and that's the intent. So they were trying to see, how can we still break access to this? So starting with the Dell EMC device, it was really interesting. It has a really stripped-down shell that you can use to access it, similar to what the Firebox has from WatchGuard, where you can run certain commands. It's not an actual bash shell, but you can still, like, do management activity on it. It did have, at the time, something called SE mode, the systems engineering mode, which had a few more commands that you could run, but still not a true bash shell. But Dell does actually have the ability to open a bash shell on the device. You have to go through their support, you have to set up credentials, you have to get a key that only works for four hours in order to do it. And that does open up an actual bash shell. But even with that level of access, you can't just go in and, like, delete the files. There's other protections in there. So they were going through from the premise of, let's say we had access to that, you know, the limited shell. What could we do? And there's actually not a whole lot you can do to modify configurations or gain root access on it. But they did find one undocumented command in this limited shell called reg show config, which basically goes into the registry of the device, like the configuration setup for it, and grabs all of the configuration items out of it. And when they were reviewing that, they found a few configuration items that were effectively cron jobs on there, and the cron jobs were running as root on the device. And
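The write-once-read-many behavior described here can be modeled as a toy class. This is a didactic simplification (the real products enforce this in firmware and at the storage layer, not in application code), but it captures the contract the pen testers were up against: reads always work, overwrites and deletes never do.

```python
class WormStore:
    """Toy write-once-read-many store: once a key is written, it can be
    read forever but never overwritten or deleted -- the property that
    compliance-mode backup storage is meant to guarantee."""

    def __init__(self):
        self._data = {}

    def write(self, key, value):
        if key in self._data:
            raise PermissionError(f"{key} is immutable once written")
        self._data[key] = value

    def read(self, key):
        return self._data[key]

    def delete(self, key):
        raise PermissionError("compliance hold: deletes are never allowed")

store = WormStore()
store.write("backup-2024-08-07", b"snapshot bytes")
assert store.read("backup-2024-08-07") == b"snapshot bytes"

try:
    store.write("backup-2024-08-07", b"attacker data")  # overwrite blocked
except PermissionError:
    pass
```

Note the gap the talk exploited: nothing in this contract guarantees that the system *around* the data (accounts, LDAP, the owning cloud account) stays usable, which is exactly where the attacks landed.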

Corey Nachreiner  25:06  
they left those root cron jobs where you can... Exactly,

Marc Laliberte  25:09  
so they found that, through the SE mode, you could also run a reg update command to update some of these settings in there. And basically, they ran the reg set command for the config crontab, then a certain crontab entry, and basically set it to, when it executes, open up a reverse shell to a server under their control. So that was their way to gain at least root access to the file system. But they still couldn't delete data. Like, it was still protected by the write-once-read-many protections. But what they found they could do was go in and... I guess one last little bit of background info: in this Dell system, there's a bunch of different, like, local user accounts that are responsible for managing the data backups on there. They're not accounts that, like, one of us would log into, but they're accounts locally that other processes can use to authorize themselves to manage configurations or manage the backups. So what they found, though, is they can go in and change the passwords for these accounts in the /etc/shadow file, the Linux password file, and then also remove the LDAP configuration, so that basically the system could no longer manage backups or authenticate users,
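The class of weakness behind that finding, a cron job running as root whose entry an attacker can rewrite, is something defenders can audit for. Here's a hedged sketch of such a check (the crontab format and the writability heuristic are standard; it is not the actual Data Domain registry format, which isn't public):

```python
import os
import stat
import tempfile

# Audit system-crontab-style lines ("min hour dom mon dow user command")
# for jobs that run as root but point at a group- or world-writable
# command: anyone who can edit the target effectively runs code as root.
def risky_root_jobs(crontab_text):
    risky = []
    for line in crontab_text.splitlines():
        parts = line.split()
        if len(parts) < 7 or parts[0].startswith("#"):
            continue  # skip comments and malformed lines
        user, command = parts[5], parts[6]
        if user != "root":
            continue
        mode = os.stat(command).st_mode
        if mode & (stat.S_IWGRP | stat.S_IWOTH):
            risky.append(command)
    return risky

# Demo with a throwaway script that anyone on the box could rewrite.
with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
    f.write("#!/bin/sh\nid\n")
    script = f.name

os.chmod(script, 0o777)  # world-writable: flagged
assert risky_root_jobs(f"*/5 * * * * root {script}") == [script]

os.chmod(script, 0o755)  # not writable by others: fine
assert risky_root_jobs(f"*/5 * * * * root {script}") == []
os.unlink(script)
```

In the Dell case the attackers didn't even need a writable target; the reg update command let them rewrite the cron entry itself, which is strictly worse.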

Corey Nachreiner  26:18  
and it couldn't do anything it's supposed to do. Exactly, internally, yeah. And

Marc Laliberte  26:22  
so basically, it bricks the device. And they predicted that, like, through support with Dell, you could recover. You might have to have someone come in physically, like, remount the hard drive and re-add these credentials. So it's possible to recover, but, I would add, it slows it down a lot, and that might incentivize an organization to pay the ransom. Especially

Corey Nachreiner  26:44  
because they usually have 72-hour or one-week timers, so you don't have a long time. Good news is,

Marc Laliberte  26:49  
Dell fixed a lot of this. They even removed that SE mode entirely, and in the retest, these two individuals weren't able to find a way to replicate it. That's a good thing. Then they moved on to the IBM DS8000, which is like a single piece of hardware with three different services running on it. One of them is, like, the management service, one of them is responsible for the backup management, and the other one's responsible for recovery of the backups. This is the one where they only had production access. They were given some pretty strict guardrails of, basically, nothing that could impact production on it. So they couldn't ultimately, like, prove that they could disrupt it like they did with the Dell one, but they still found some interesting things. So for two of those three applications, first off, they were able to log in with default credentials, which gave them full administrative access to them. For the third one, they were prevented from logging in by the UI. Like, they had credentials to get in, but the log in button was grayed out in the web page. And as you might suspect, you could go edit the HTML, remove the disabled attribute, hit log in, and you gain admin access. Then at that point, wow, um, they found that they could read arbitrary files off of the machine through a series of exploits, or at least read the first line of arbitrary files off the machine. That does include the /etc/shadow file, so they could get the admin password hash out of there. They also found they could delete arbitrary files on the device. But they stopped at proving capabilities there, or, exactly, they could most likely block access.
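That grayed-out-button finding is a textbook case of client-side-only enforcement. We obviously don't have the DS8000 UI, so this is a toy reconstruction of the flaw class: a server that serves a disabled login button but never checks authorization on the endpoint itself, so a client can just send the request directly.

```python
import http.server
import threading
import urllib.request

# A toy server whose only "protection" is a disabled button in the HTML.
# The POST handler itself never checks who is asking -- the flaw class
# the pen testers exploited on the DS8000 management UI.
class FlawedHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        page = b'<button id="login" disabled>Log in</button>'
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(page)

    def do_POST(self):
        # No server-side authorization check at all.
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"admin session granted")

    def log_message(self, *args):
        pass  # keep output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), FlawedHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

# The UI says the button is disabled...
ui = urllib.request.urlopen(base).read()
assert b"disabled" in ui

# ...but nothing stops a client from sending the POST directly.
req = urllib.request.Request(base + "/login", data=b"", method="POST")
granted = urllib.request.urlopen(req).read()
server.shutdown()
```

The lesson generalizes: `disabled` attributes, hidden fields, and JavaScript validation are UI hints, not security controls; the server must authorize every request.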

Corey Nachreiner  28:25  
Now, you probably couldn't overwrite the actual backup, but you might be able to delete a system file that would cause nastiness. Exactly.

Marc Laliberte  28:32  
So they found they could enumerate all the files on the file system, and they figured pairing that with this arbitrary file delete, they could find the important system files that let it run, delete it, and it would basically make it unrecoverable again, without help from the manufacturer. Then the third one was AWS backup. And this one was interesting. It wasn't they didn't find necessarily an issue with AWS backup itself. It was more of a series of issues within their own organizations implementation of it. So they were given a their own account, and their group organization called like, POC backup or something, and then they had like, simulated backup data going to that just like the actual production account was using. So that was their target to go after they started out with no access to it whatsoever. So as a penetration testing engagement, first thing they tried to do was gain any bit of credentials that could like, get them into the system or environment or something related, then they could move, laterally, elevate and ultimately gain access on that. So what they ended up finding was internally at their company, there was a Docker repository that was unauthenticated, and so they were able to go in and pull Docker containers out of this repository, and one of those containers had administrative credentials to their internal git lab source code repository. So GitLab, just like GitHub, have this concept of action. Actions where, basically, you can run CICD automations on like containers that get spun up. So like at WatchGuard, we would use them to, like, build source code and then deploy it to the actual application. So these containers that get spun up, though, you can create, like, a pool of ones that are just permanently available, and then you can attach permissions to them. 
So for example, the AWS infrastructure team at their organization had a set of these runners, as they're called, that were actually EC2 instances in AWS, and they had an IAM role attached to them that gave them some permissions within AWS, because what they're supposed to be used for is building and deploying code to the AWS environment. Yeah.

And so now, using these GitLab runners, they were basically able to assume the role of this slightly more privileged account within AWS, which gave them access into the AWS organization. They found that from that instance they were able to assume yet another role. In AWS it's like an onion of roles: you can have a role attached to, like, a server, an EC2 instance, and that role might have permissions to assume another role under certain circumstances, or under any circumstance.

Long story short, these runners could assume a role that had full admin permissions, literally, like, star on star, so any access on any resource. So from the unauthenticated access to a Docker repository, they were able to get GitLab administrative credentials, which then let them use the GitLab runner to gain access to the infrastructure production account within the AWS organization. They were then able to assume the role that gave them access into that POC account, which they then used to just delete the account.
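The "star on star" admin role at the end of that chain can be illustrated with a toy IAM-style policy matcher. This is a rough sketch, not real IAM evaluation (it ignores explicit denies, conditions, and permission boundaries), and the role ARNs are hypothetical stand-ins for the chain described:

```python
# Sketch: why a role allowing Action "*" on Resource "*" is effectively
# full admin. Simplified wildcard matching only; real AWS IAM evaluation
# also considers explicit denies, conditions, and permission boundaries.
from fnmatch import fnmatchcase

ADMIN_POLICY = {"Effect": "Allow", "Action": ["*"], "Resource": ["*"]}

# Hypothetical hops in the assume-role chain described above:
# runner instance role -> infra deploy role -> org-wide admin role
ROLE_CHAIN = [
    "arn:aws:iam::111111111111:role/gitlab-runner-instance",
    "arn:aws:iam::111111111111:role/infra-deploy",
    "arn:aws:iam::111111111111:role/org-admin",
]

def policy_allows(policy: dict, action: str, resource: str) -> bool:
    # Allow only if some action pattern AND some resource pattern match
    if policy["Effect"] != "Allow":
        return False
    return (any(fnmatchcase(action, p) for p in policy["Action"])
            and any(fnmatchcase(resource, p) for p in policy["Resource"]))
```

With that policy, even a destructive call like `backup:DeleteRecoveryPoint` on any recovery point matches, which is why landing on the end of such a chain lets an attacker wipe the backup account outright.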

Corey Nachreiner  31:58  
Well, there you go. That's an easy way to delete immutable

Marc Laliberte  32:00  
exactly, the backups in the AWS Backup service. You can't delete the backups, but you can delete the containing account, yeah. Now again, this is another one where AWS could probably help you; I assume they keep it, like, still there, yeah, but it would slow it down, yeah. So I thought this was a really interesting talk from the perspective of a penetration tester pretending they were a ransomware operator: here's what we would do in our own company's environment to make sure they could not recover from a ransomware attack. That was kind of cool. Very cool, and terrifying, very terrifying. These are all things that, like, even that last one, I wonder if

Corey Nachreiner  32:37  
Just knowing that, since they were pen testing themselves, they had a little bit of internal knowledge that might help make lateral movement quicker. But a threat actor sitting on networks for a long time could eventually probably learn all that.

Marc Laliberte  32:49  
Exactly, I mean, none of it's out of the realm of possibility. It totally makes sense a threat actor would find an unauthenticated Docker repository.

Corey Nachreiner  32:56  
We've seen privilege escalation in so many ways once you're inside

Marc Laliberte  33:01  
exactly, so it's totally in the realm of possibility. You're right, though, it's like a white box test versus a black box test, which is what this was from the perspective of a,

Corey Nachreiner  33:09  
I mean, for instance, the Docker registry, the place that had all the Docker containers, they had knowledge of that beforehand, so they might have had an idea. But like a black box test, bad guys would just take a little longer to find all this out.

Marc Laliberte  33:23  
Yeah, so either way, I thought that was a really cool talk. And the main takeaway I had was: just because your backups are immutable, and there are actually proven technical controls making it so you can't modify that data, doesn't mean you can't just totally screw up the availability of that backup solution too. So really interesting. And like you said, there were so many other interesting talks today that we'll hopefully cover at least one or two more on the next one.

Corey Nachreiner  33:50  
I'd like the attack on RPKI for BGP. Anyways, we'll talk about it later. That

Marc Laliberte  33:57  
sounds cool, but that's it for Black Hat now. I feel like that went way

Corey Nachreiner  34:02  
quick. It did. I don't know if I have the energy for the more fun summer camp, but it is more fun. Maybe I'll get the energy.

Marc Laliberte  34:08  
We didn't have a booth, so I'm chock full of energy right now.

Corey Nachreiner  34:13  
You're doing so much at the same time. It's pop, pop, Corey, my old man talking.

Marc Laliberte  34:18  
It's the shots of caffeine running through my blood right now. That's probably helping keep me vertical. But either way, next up is DEF CON, which is the more technical, hands-on conference.

Corey Nachreiner  34:30  
It's just funny, interesting characters. People let their hair down, and you get the uncensored version of talks. Yeah, they're usually fun.

Marc Laliberte  34:37  
I'm really looking forward to that, and we will have a recap for that one early next week, with our favorite takeaways from DEF CON, as well as maybe a couple more. Yeah, but yeah, thanks for listening. And yes, that's a good place to end. Yep. Hey everyone, thanks again for listening. As always, if you enjoyed today's episode, keep that to yourself. I mean, rate, review, and subscribe. Thanks again for listening. If you have any questions about today's episode or suggestions for future episode topics, reach out to us on Instagram at WatchGuard

Corey Nachreiner  35:09  
messenger pigeon at this point, underscore

Marc Laliberte  35:12  
technologies or email us, or smoke signals or whatever, but yeah, thanks again for listening, and you will hear from us next week,

Corey Nachreiner  35:21  
Peace. [Applause]