ByBit Says Bye to $1.4 billion

Episode 321 –

This week on the podcast, we cover the largest cryptocurrency heist ever (for now). Before that, we cover Apple's decision to disable Advanced Data Protection (ADP) for its UK customers. We end the episode with a review of Wiz's State of Code Security report for 2025.

View Transcript

Marc Laliberte  0:00  

Hey everyone, welcome back to The 443, Security Simplified. I'm your host, Marc Laliberte, and joining me today is

 

Corey Nachreiner  0:07  

Corey "the spicy, crunchy" Nachreiner. Very spicy today, let's see,

 

Marc Laliberte  0:14  

on today's episode, we will be discussing Corey's favorite company giving up privacy features for its customers. After that, we'll discuss a billion-and-a-half-dollar heist from a cryptocurrency exchange,

 

Corey Nachreiner  0:28  

and we'll secure

 

Marc Laliberte  0:32  

Yes, we'll round it out with a review of Wiz's State of Code Security 2025 report. With that, let's go ahead and, uh, check our way in.

 

Corey Nachreiner  0:46  

We'll pull request our way in, I guess. Approve, fine.

 

Marc Laliberte  0:57  

So let's start this week, Corey, with, uh, a news story that has been brewing for, I mean, honestly, years now, it feels like, but had a bit of an update at the start of February, and a pretty major update most recently, where last week Apple announced that they are disabling the Advanced Data Protection feature, or ADP, for customers in the United Kingdom after the UK government demanded access to encrypted user data. So this kind of started, or most recently started back up, earlier in February, when the UK Home Office issued a demand under the Investigatory Powers Act, which compelled Apple to give access to user data under court order, saying that the end-to-end encryption technology that Apple provides enables criminals to hide more easily. This isn't the first time a government has tried to do this to Apple. The US government, back in 2016, tried to get them to give access to the encrypted data on the San Bernardino shooter's phone. Back then, they ended up withdrawing the request after partnering with a third-party company to effectively break into the shooter's account and access the data through undisclosed means. But Apple has historically resisted any attempts to backdoor or disable this end-to-end encryption feature.

 

Corey Nachreiner  2:22  

Well, actually, I want to just pause there. Like, the backdoor is one thing, but the one thing I'm not sure about is whether this is new. I should know, since you at least call me the resident Apple fanboy. I don't think I agree 100%, but I do like this part of Apple. In the US, I thought there was no Advanced Data Protection feature because it's just on. This says that ADP, at least in the UK, is something that you had to enable as a customer. So is that new because of this, or are they now removing a feature that was UK only, where in the UK you could turn it on and get the same you-own-your-keys-only situation that we have in the US? To be honest, the fact that this article talks about how it's a feature you could have even turned on before the UK thing is a surprise to me, because I thought in the US it was a default, and all customers simply have their keys and Apple doesn't see the data. So, is it currently...

 

Marc Laliberte  3:30  

It's not a default in the US either. I know this because I just pulled it up on my phone, and I still have to turn it on on my device. The reason being, while it is a great privacy feature for end-to-end encrypting all of your user data, it does mean that if you lock yourself out of your iCloud account, there's no recovery, for better or worse. Honestly, the reason I haven't turned it on is because I only just got an iPhone and forgot to set it up, thinking it was enabled by default, apparently. But I think it's a feature that should be, maybe not enabled by default, but heavily pushed by Apple for privacy protections.

 

Corey Nachreiner  4:12  

I would even say... I think there are use cases, when you sell a cloud-based product, where you do want to allow, say, managed service providers to have access to your information to manage things for you. So there may be some B2B products where you shouldn't have it, but I would argue that having the keys in the control of the customer, and the customer only, is something that should happen in every consumer data product. It's, to me, the ideal secure situation. I mean, the customers do have to take on the liability: now it's up to you to maintain your key and not make a mistake when you switch phones or whatever, and Apple or whatever company can't help you if you do. But beyond that, I like that Apple's done this. I wish it were the default, at least for consumer products, knowing that there might be some use cases where you would want to share data, maybe,

 

Marc Laliberte  5:10  

and especially consumer products where, like by design, they have a massive amount of personal data. And beyond, just like personal like metadata, like, you know, email addresses, names, but, like, literally

 

Corey Nachreiner  5:22  

passwords, yes, medical information, your literal biometrics, not just to log in, but your health biometrics, yeah. I mean, my phone has a lot of personal data,

 

Marc Laliberte  5:37  

exactly. Well, unfortunately, if you are a UK Apple customer now, if you try to go enable ADP, you're given a screen that instead says Apple can no longer offer Advanced Data Protection in the United Kingdom to new users, and they do plan to disable the feature for any existing users sometime later this year. And the really crappy thing is, this still may not be enough in the United Kingdom. The original order isn't just for UK customers; they are demanding that Apple disable this across the entirety of their customer base, basically saying that by having any sales or whatever in the United Kingdom, the entirety of Apple falls under their jurisdiction, and they want to be able to go after anyone's data through court order.

 

Corey Nachreiner  6:29  

That's, sorry, that's never gonna... like, countries are different, buddy. Go away, UK.

 

Marc Laliberte  6:34  

I mean, at the same time, if they really wanted to, the UK Government could come back and say, Okay, you're not allowed to sell in the United Kingdom, then how would they do that

 

Corey Nachreiner  6:42  

to Apple? So honestly, well, I don't want to mess with your agenda, but that's what I want to get into: a lot of people are lauding Apple for just removing the feature for the UK but keeping it around the rest of the world. My worry is, and I don't laud Apple on this, that every government in the world is going to see this, and now the US government is going to say, I don't want ADP here either, and certainly all the, you know, censorship states will do the same. I would have rather Apple said, this is our design, it's on by default, then don't buy our phones. Because I feel like that would have been much more likely to get a huge citizen response in the UK, who, from what we're hearing... I don't know about you personally, but a lot of my UK co-workers and friends have reached out. They said, I'm glad you have this on the podcast, because otherwise I would have mentioned it's a topic we should hit, because they're asking, what do you think about this? They're mad. So yes, I know that's a huge risk for a business, because they have the potential to temporarily lose profit in a major market, but I have a feeling the UK citizens would have stepped up the second that happened and forced the government to change their idiocracy. And this is idiocracy, by the way.

 

Marc Laliberte  8:09  

Yeah, I do agree with that, that they probably could have eaten some temporary pain and the UK government threatening to ban them from sale there. And, I mean, crap, just look at what happened with TikTok, with the massive uproar over that getting banned temporarily for 20 hours. You can imagine the uproar over Apple being banned from sale would be even bigger.

 

Corey Nachreiner  8:36  

Yeah. I mean, the good news is there do seem to be, as the article states, lots of... not lots, a portion of US Congress saying that this is dumb, this is going to make the whole world less secure. I think we've talked about this so many times, but we want our investigative authorities to be able to catch bad guys. But this is not the way. Being able to see into everyone's stuff, but also allowing every criminal to now see into everyone's stuff because it's less secure, is not the way. And without getting into politics, the backdoor option of this, giving just the UK government or the US government a backdoor, is equally stupid, because powers change. There are many dystopian novels of democratic governments going to authoritarian ones. We've seen it happen. It's

 

Marc Laliberte  9:27  

not happening at all. And, yeah, this is my

 

Corey Nachreiner  9:31  

attempt to stay non-political, Marc. But you definitely know, like, I feel like we live in the United States in a dystopian novel. So this is a perfect example, at least if you're on my side of this idea, that I don't want even a good-willed government to have a third-party key, because power changes, and governments sometimes become authoritarian and remove rights, and then they can abuse this power far beyond what they should. Meanwhile, just giving them the backdoor gives criminals a potential weakness that they can abuse, which is the entire freaking point. This encryption protects millions of pieces of data. I mean, it's used in business, it's used in banks. It's made so only the people that are supposed to participate can see the traffic. So anyways, yeah, it's sad. And while some people think, hey, at least Apple's taking a stance in the UK, they're not taking a stance. They've allowed a feature to be disabled, and if they're doing it for a Western, supposedly democratic government that's not even known for spying on and censoring their users, what's to stop them when the American government, like, this is a government that would put pressure on Apple to do this, let alone every freaking authoritarian government. So to me, not pushing back against even Apple's choice means this is going to be the straw on the camel's back, or the crack in the foundation, that makes the whole house tumble over time.

 

Marc Laliberte  11:07  

Yeah, I am a little concerned about that too, but I guess we will see exactly how far this spreads outside of the UK. For now, if you are a UK individual, the good news is the data on your phone is still encrypted. It's just that anything you back up to iCloud is no longer end-to-end encrypted. I mean, it is encrypted in there, but now Apple has access to the keys, which means governments, through court orders, have access to your keys too.
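
For anyone who wants to see the difference key custody makes, here is a minimal sketch, not Apple's actual design: with end-to-end encryption the key is generated and held only on the user's device, so the ciphertext the provider stores is useless to anyone who serves the provider with an order.

```typescript
// Minimal illustration of client-held keys (end-to-end) versus provider-held keys.
// This is NOT Apple's actual design; it just shows why key custody matters.
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

// Generated and stored only on the user's device; the cloud provider never sees it.
const deviceKey = randomBytes(32);

function encryptForBackup(plaintext: string) {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", deviceKey, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  // Only this ciphertext blob is uploaded; without deviceKey it is unreadable.
  return { iv, tag: cipher.getAuthTag(), data };
}

function decryptFromBackup(blob: { iv: Buffer; tag: Buffer; data: Buffer }): string {
  const decipher = createDecipheriv("aes-256-gcm", deviceKey, blob.iv);
  decipher.setAuthTag(blob.tag);
  return Buffer.concat([decipher.update(blob.data), decipher.final()]).toString("utf8");
}

const backup = encryptForBackup("messages, photos, health data");
console.log(decryptFromBackup(backup)); // only a device holding deviceKey can do this
```

If the provider holds the key instead, the same decrypt call can be run server-side, which is exactly what makes the data reachable by court order.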

 

Corey Nachreiner  11:36  

Hopefully you're at least in a country that needs a court order to do this,

 

Marc Laliberte  11:41  

yep, and in a country where court orders are transparent and followed correctly, which is not the case in some other ones. Anyways, speaking of countries with authoritarian leaders, last week North Korea's state-sponsored hackers, Lazarus, stole $1.4 billion worth of Ethereum cryptocurrency from the crypto exchange Bybit. This makes it the largest cryptocurrency theft in history, and potentially the largest financial theft in history as well,

 

Corey Nachreiner  12:17  

at least outside of a, you know, traditional bank. By the way, you said 1.4, just for people that are on YouTube: as Marc even said to me earlier, the change in the headline is simply because the value has dropped enough that what they got away with is suddenly worth less.

 

Marc Laliberte  12:33  

Yep, it was originally 1.5 at the time it was stolen, it is now worth 1.4 just another week or so later.

 

Corey Nachreiner  12:41  

But either way, also, I'm surprised... I'm glad, like, I agree with it, and we'll get into Bybit's research, but you didn't even put "allegedly" in front of North Korea. Some people might... well, we'll get into how they believe it's North Korea that's doing it. But

 

Marc Laliberte  12:58  

I saw at least updates from researchers, and one post from the FBI, that seemed to confirm Lazarus was heavily involved in it. Also, if you're not familiar with them, Lazarus is the North Korean state-sponsored attack group that has consistently targeted cryptocurrency exchanges in recent years as a method for funding the North Korean government. Well,

 

Corey Nachreiner  13:21  

to me, it wasn't just them mentioning the Lazarus name, like you'll hear from a couple of people later. I mean, one of the things we might talk about in this hack was the fast money laundering. You can follow every, you know, cryptocurrency blockchain transaction, so the wallets are already kind of known to be associated with Lazarus slash North Korea. So there is some strong evidence in this case, rather than just their usual tools, tactics, and procedures.

 

Marc Laliberte  13:48  

And before we get into the details, and thankfully there are some details to go over now, Bybit is actually offering a 10% bounty on all recovered funds. If anyone is able to hunt down and freeze or recover funds, they'll give you 10% of the returned value, so potentially up to $140 million if they get it all back. But on the 26th of February, so just yesterday at the time of recording this, Bybit actually published the initial forensic reports from two different firms that they brought in to help investigate this incident, and there are some really interesting details in them, and they both came to the same conclusion about how this attack actually went down. So real quick, Bybit uses an application called Safe Wallet, which is a multi-signature wallet application, to manage their cold wallets, cold being the typically offline stores for high values of cryptocurrency that you only extremely rarely interact with, to transfer to a warm wallet to then ultimately use on a regular basis. Safe Wallet is a smart contract application, so in Ethereum it's an application that runs on the blockchain itself that supports multi-signature transactions, meaning in order to send funds from your wallet to another, you need more than one person to approve it. In the case of Bybit, it was three, including their CEO. You can imagine why, if you're going to send a $1.5 billion transaction from your cold wallet, you would probably want at least a couple of people to have to approve that transaction before it's allowed to go. So on February 18, the attackers staged a malicious version of that cold wallet smart contract with a backdoor in it to withdraw funds. It was basically a copy of the original smart contract that runs the cold wallet, with all of its normal functionality, but with some added malicious code that would let them drain the funds of any wallet that used this contract. On February 19, so a couple days before the actual incident, they modified two JavaScript files in Safe Wallet's content delivery network to include malicious code. The code first checks the signer's address and only executes for transactions that are either from Bybit's own wallet, their specific cold wallet in this case, or another one that was probably used for testing by the threat actors. It then copies the original transaction information so it can use it later. It modifies some of the fields in that transaction to ultimately change the recipient field to the attacker's malicious wallet address. And then, once the transaction has been executed and signed, it restores the original transaction information, so if you're looking at it in the UI, it looks like nothing was modified at all. So Bybit initiated a transaction from their cold wallet on February 21, which ultimately used that compromised JavaScript from Safe Wallet's content delivery network and effectively transferred ownership of their cold wallet to the attackers, who then quickly drained it. The attackers must have had some automation in here, where two minutes after that transaction was executed and signed, they replaced those malicious JavaScript files with the original, benign ones on the content delivery network to try and cover some of their tracks.
And the end result of this was, just by trying to execute a transaction and move some funds from their cold wallet to a warm wallet, the attackers gained control over the entirety of that cold wallet and drained it. A couple things there, though. First off, this was highly targeted: it was one test wallet, it looked like, and the other address that could trigger this was the known address of Bybit's cold storage wallet, this one specifically. Now, they have multiple cold storage wallets that they use, but they went after this one specifically. The other thing is, it does seem to indicate that this attack was carried out largely because of a supply chain compromise against Safe Wallet. The attackers clearly had an AWS credential that let them upload this malicious version of the JavaScript.
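
To make those mechanics a little more concrete, here is a rough sketch of the transaction-swapping pattern the forensic reports describe: injected front-end code that only fires for specific signer addresses, swaps the transaction details just before signing, then restores the originals so the UI looks untouched. Everything below, the names, addresses, and payload, is a hypothetical placeholder, not the actual injected code.

```typescript
// Hypothetical sketch of the transaction-swap pattern described in the forensic
// reports. All addresses and payloads are placeholders.

interface SafeTransaction {
  to: string;        // contract or address the Safe will call
  value: string;     // amount in wei
  data: string;      // calldata
  operation: number; // 0 = CALL, 1 = DELEGATECALL
}

// Placeholder stand-ins for the targeted cold wallet and the attackers' test wallet.
const TARGETED_SIGNER_WALLETS = new Set(["0xTARGETED_COLD_WALLET", "0xATTACKER_TEST_WALLET"]);
// Placeholder for the attacker-controlled contract staged ahead of time.
const ATTACKER_CONTRACT = "0xMALICIOUS_IMPLEMENTATION";

let originalTx: SafeTransaction | null = null;

// Runs just before the transaction is handed to the signers.
function tamperBeforeSigning(signerWallet: string, tx: SafeTransaction): SafeTransaction {
  if (!TARGETED_SIGNER_WALLETS.has(signerWallet)) {
    return tx; // every other Safe user sees and signs the genuine transaction
  }
  originalTx = { ...tx }; // keep a copy so the UI can be "restored" afterwards
  return {
    ...tx,
    to: ATTACKER_CONTRACT, // point the call at the attackers' contract instead
    operation: 1,          // delegatecall lets that contract rewrite the wallet's logic
    data: "0x",            // placeholder for the payload that hands over control
  };
}

// Runs after signing, so anything rendered back to the operators matches
// what they *thought* they approved.
function restoreAfterSigning(): SafeTransaction | null {
  const shown = originalTx;
  originalTx = null;
  return shown;
}
```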

 

Corey Nachreiner  18:14  

At least it was a credential, right? It wasn't like an open S3 bucket, because essentially they got access to an Amazon AWS S3 bucket and were able to change some files to do all of this, if I understand right. But at least it was credentialed access. I mean, still a huge hack, but we're so used to unsecured S3 buckets that it's nice there was at least an attempt to secure this one, yeah.

 

Marc Laliberte  18:37  

and for a billion and a half bucks, like, going and compromising an AWS credential for a company seems like a pretty good return on investment, no matter how much that actually took to get. Well,

 

Corey Nachreiner  18:48  

we've said it before: hackers don't break in, they log in. MFA and authentication are the cornerstone of security. Digital security, anyways,

 

Marc Laliberte  19:00  

and yet, I mean, another terrible example of a supply chain attack going after an organization, in this case highly targeted. There was a bit of interesting irony in this, I don't know if you agree, Corey. So the whole point of cryptocurrency is, it's this, you know, libertarian, decentralized system where code is law, and there are no big central regulating banks getting in the way of everything. But in this case, it was an attack against a centralized store of funds, and because this is cryptocurrency and not the real world, there's, like, no recourse or mechanism to get those things back easily, and no regulation for it either. It's just gone.

 

Corey Nachreiner  19:46  

Well, this has been my argument about cryptocurrency forever, and the difference between it and fiat currency. I get why libertarians like the idea, but if you do not have some entity, whether that's a government, a big business, some entity you trust, to validate and regulate things, you know, it's never going to work as fiat currency.

 

Marc Laliberte  20:12  

I agree. And another point I wanted to make: so traditionally, a cold wallet in cryptocurrency is kept offline for a large period of time, and you only transfer funds when you really need access to that large store of funds, maybe when you're spinning up a new warm wallet or whatever. But either way, interacting with that cold wallet is a highly risky action any time you do,

 

Corey Nachreiner  20:39  

and why some people still print out their keys and throw them in a safe, exactly,

 

Marc Laliberte  20:45  

and so, I know it's easy to say in hindsight, but interacting with your cold wallet by using JavaScript files hosted on a content delivery network, meaning dynamically downloaded at the time you make that transaction... yes, in hindsight, it's clear they shouldn't have done that. But it feels like, when you're going to be using that type of application, you should be using, like, local, validated code for how you're interacting with that cold wallet. And I know that is not the case across most of cryptocurrency and the folks that operate in this space, but for that type of really risky activity, it feels like you should have some additional assurances that you're not downloading totally compromised code from some remote location.
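
One way to get that kind of assurance for remotely hosted code is hash pinning, the same idea as subresource integrity in the browser. Here is a small sketch; the URL and the pinned digest are hypothetical, and in practice you would pin the hash of a build you have actually reviewed.

```typescript
// Sketch of hash-pinning a remotely hosted script before trusting it, in the
// spirit of subresource integrity (SRI). URL and digest are hypothetical.
import { createHash } from "node:crypto";

// In practice this would be the sha384 digest of a build you reviewed and approved.
const PINNED_DIGEST = "sha384-REPLACE_WITH_REVIEWED_BUILD_DIGEST";

async function fetchAndVerify(url: string): Promise<string> {
  const response = await fetch(url);
  const body = await response.text();
  const digest = "sha384-" + createHash("sha384").update(body).digest("base64");
  if (digest !== PINNED_DIGEST) {
    // The file the CDN serves today is not the file that was reviewed: refuse to use it.
    throw new Error(`Integrity check failed for ${url}`);
  }
  return body;
}

fetchAndVerify("https://cdn.example.com/wallet-app.js")
  .then(() => console.log("script matches the pinned hash"))
  .catch((err) => console.error(err.message));
```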

 

Corey Nachreiner  21:35  

it's almost like, Marc, the need to have some big entity that you might pay a little bit to, like taxes, to have regulation, which isn't bad: regulation is controls, security standards, and other things, and to have, you know, IGs, inspector generals, make sure people aren't messing around with your most important resource, money. Maybe there's a reason there are things like governments and regulations, and they're not just sitting on their hands doing nothing, huh? It's interesting. Who would have thought? Maybe libertarians live in this utopia land of, yeah, sure, everyone should be free, but what happens if someone's freedom conflicts with yours? Oh yeah, there's no such thing as true freedom either. Libertarians, like, maybe get a clue how people and society work. I love blockchain. I would love digital currency, but it needs to be regulated by an entity people can at least kind of trust.

 

Marc Laliberte  22:39  

Also, let me hop up on my blockchain soapbox one more time, where for the last decade now we've been promised this is going to be life changing, there are going to be so many amazing new things we can build with all this. And so far, it's literally just been meme coins and, like, rug pulls targeting vulnerable people. Oh, maybe

 

Corey Nachreiner  22:58  

the people that are billionaires actually got all their money from a Ponzi scheme that took money from millions of people, who are now mad but somehow trust the billionaires who already took their money, huh? Yeah, I'm sorry, to me, cryptocurrency is still in the Ponzi scheme state. Yes, some people can make money, but you're gambling to be a winner, and in anything where only the 1% succeed but the 95% think, oh, I have another chance to maybe make it into the 1%... guess what, 95, 96, 99%, you're all going to be losers. So either way, that's where I am with cryptocurrency and many other things,

 

Marc Laliberte  23:40  

I agree entirely. Either way, though, for folks that do still work with cryptocurrency, I feel like there's got to be a space now for even more secure mechanisms for transactions. Why do they

 

Corey Nachreiner  23:52  

not want the regulation? To me, it's like, yes, guys, you invented a really cool technology that could help society, but it shouldn't be run by a bunch of monkeys that just want to get rich quick. It should be regulated. And maybe, if you let a little regulation in there and have people you trust, or at least require security frameworks around what you do... oh, I know that might slow you down, but guess what, it might prevent $1.4 billion from disappearing. Maybe getting it officially regulated would allow you to turn cryptocurrency into what it's supposed to be, which is something everyone can use and that could make life easier. I don't think anyone's saying it's stupid; it's a great technology. But this idea of it being controlled by a bunch of nerds with no centralization... keep on throwing your money into the fire, guys. Sorry, I'm spicy.

 

Marc Laliberte  24:44  

It is a very spicy take today. Corey has been eating chilies. But I guess I don't think anything you're saying is necessarily wrong. Like, one of the big issues with this space is it is entirely still the Wild Wild West, and that's why crap like this keeps happening.

 

Corey Nachreiner  25:00  

And I think with the term regulation, no one wants overregulation, no one wants a waste of time. We in the security community know that there's some compliance that is common sense and should be done, and there's some compliance that gets a little onerous and dumb. So I'll agree with anybody that thinks regulation can also be a pain in the ass; that's true sometimes. But there's a reason for some of it, so why don't we just come up with the right regulation together?

 

Marc Laliberte  25:25  

Yep, because otherwise you end up with the likes of FTX losing everyone's money and yes, 1.5 billion

 

Corey Nachreiner  25:33  

And in the Wild West... I know you all own guns and think you're the best gunslinger, but I guarantee you there'll be a better gunslinger than you. So I really don't want the Wild West.

 

Marc Laliberte  25:45  

Yeah, I cannot wait until the entire cryptocurrency ecosystem collapses and hopefully something a little better,

 

Corey Nachreiner  25:52  

or a real one comes along, or we finally adopt one that's actually built on something with regulation, and people can use it, and then all the rest go away. They will: the only reason they have value is because there's no real replacement anyone's using. The second there's one that everyone uses, I predict all the other meme coins and even other cryptocurrencies will quickly lose value.

 

Marc Laliberte  26:14  

Yep, I agree. So anyways, moving on to the last topic for this week. The cloud security company Wiz released their State of Code Security report for 2025 last week and had a couple of interesting findings. One thing to point out immediately is that all the data is coming from Wiz customers, so I'd say there's a little bit of selection bias in terms of some of the stats we're going to get into. But with that in mind, there are still some interesting trends I think are worth talking about, for

 

Corey Nachreiner  26:49  

sure. I mean, it's the same as our ISR, right? It's based on their product, so they can only use the quantifiable metrics they have. Same with our internet security report.

 

Marc Laliberte  26:59  

So first and foremost, a couple of quick stats just about the ecosystem itself. Of the platforms that Wiz supports doing source code analysis on, or at least source code repository security on, GitHub is the most popular with 81% of the market share, followed by GitLab at 13% and Azure DevOps at 6%. I think that makes sense; GitHub is far and away still the most popular source code repository that I see from other stats out there too.

 

Corey Nachreiner  27:33  

By the way, am I stupid in that I knew GitHub and Azure DevOps, but I didn't know GitLab? If I'd heard of it before, I thought it was maybe just part of GitHub.

 

Marc Laliberte  27:44  

I mean, but I

 

Corey Nachreiner  27:46  

always go to GitHub, man. Like, it's the default to me, but I never paid attention to GitLab.

 

Marc Laliberte  27:55  

GitLab is growing in popularity, and it had a pretty big surge after Microsoft bought GitHub and some of the concerns around that. But as you see, they still trail pretty significantly behind GitHub in terms of usage and number of repositories. There was another interesting trend: they looked at, for all of the repositories on these platforms, what percentage of them are public versus private. On GitHub, 35% of the repositories are public, versus less than 10% for GitLab or Azure DevOps. So basically, if you're a GitLab or Azure DevOps user, it's primarily for private repositories, and for open source projects, GitHub is still king.

 

Corey Nachreiner  28:38  

It'd be interesting to see how many of the organizations on these different platforms have no public repositories at all. Yeah,

 

Marc Laliberte  28:49  

something like 80% for Azure DevOps down to like 30 or 40% for GitHub. Yeah,

 

Corey Nachreiner  28:55  

GitHub seems to be very much community, open source level, whereas Azure is commercial IP level development and GitLab, similar to Azure.

 

Marc Laliberte  29:07  

So beyond the ecosystem stuff, which is interesting enough, they went into a few security-specific findings. The first one that stood out to me was they looked at repositories that have secrets saved somewhere within the source code or checked into the repositories. They found 7% of private repositories have a secret saved in the code, 2% for public repos. 86% of organizations have at least one private repository with a secret, though, and 61% have at least one public repository with a secret. And they even noted they suspect the general population numbers are higher, just because, by nature, these are Wiz customers; they're presumably getting alerts and notifications for this and hopefully resolving it. But even with that being the case, 86% of organizations having at least one repo with a secret in it seems pretty high.

 

Corey Nachreiner  30:02  

Yeah, although, to be fair, the 61%, those are the idiots: that means your repo currently has a key sitting in the open right now. For the 86%, the fact that it's a private repo, it's still bad practice to have any keys in your code, but at least for that one you'd have to get access to the private repo to get the key. So it's more of an internal escalation vulnerability. But the key thing is you shouldn't have secrets in your actual repository code,

 

Marc Laliberte  30:38  

because one of the first things adversaries do against software engineering organizations when they compromise an account is go check out what source code repositories they now have access to and scrape secrets out of there, if they can. They did an analysis of what types of secrets were stored in there too. For example, the most common type of secret in private repositories was Anthropic API keys; that was the biggest one. That's the one there in the report that Corey is showing on the screen. Okta secrets were the next biggest, and then GitHub personal access tokens, so credentials that control or give access to source code repositories themselves, was number three. When you look at public repos, the Dropbox API becomes the most common one, GitHub personal access tokens the second, and then Terraform Vault secret tokens the third most common. Actually, it feels like a weird anomaly seeing Anthropic API keys up there. I

 

Corey Nachreiner  31:42  

wonder if that just literally happened in the last year or two, like, with the heavy use of machine learning. And it is kind of weird, Anthropic specifically, versus some sort of OpenAI key. I assume OpenAI has keys too. Anthropic is a big AI company, and kind of a cool one, in my opinion, but yeah, I wouldn't

 

Marc Laliberte  32:02  

This may be a bit of selection bias also, just because these are secrets that Wiz is able to identify and link back to a specific application. So there can be other keys that they just haven't figured out yet, correct. Like, a lot of applications will use just a generic UUID for their keys, and you can't really link that back to a specific application, but some of these other ones, like GitHub personal access tokens, start with a prefix like "ghp" or "pat" or something, which makes it easier to identify.
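
As a rough illustration of why identifiable prefixes matter, here is what prefix-based secret detection can look like. The patterns are a tiny, illustrative subset, not how Wiz's scanner actually works, and the regexes are approximations of the token formats.

```typescript
// Illustrative prefix/pattern-based secret detection. A real scanner ships far
// more provider-specific rules; these regexes are approximations.

const SECRET_PATTERNS: Record<string, RegExp> = {
  // GitHub classic personal access token: "ghp_" followed by 36 characters
  githubClassicPat: /\bghp_[A-Za-z0-9]{36}\b/g,
  // GitHub fine-grained personal access token: "github_pat_" prefix
  githubFineGrainedPat: /\bgithub_pat_[A-Za-z0-9_]{22,}\b/g,
  // AWS access key ID: "AKIA" followed by 16 uppercase alphanumerics
  awsAccessKeyId: /\bAKIA[0-9A-Z]{16}\b/g,
};

function findSecrets(fileContents: string): { type: string; match: string }[] {
  const hits: { type: string; match: string }[] = [];
  for (const [type, pattern] of Object.entries(SECRET_PATTERNS)) {
    for (const match of fileContents.matchAll(pattern)) {
      hits.push({ type, match: match[0] });
    }
  }
  return hits;
}

// Would flag a classic GitHub token pasted into a config file.
console.log(findSecrets('token = "ghp_' + "A".repeat(36) + '"'));
```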

 

Corey Nachreiner  32:35  

Makes sense.

 

Marc Laliberte  32:38  

I wonder, though... like we talked a couple weeks ago about, what the heck was it, default keys in example projects as a mechanism for attacking organizations? I wonder if there's, like, an example Anthropic project where they've got a secret config and companies

 

Corey Nachreiner  33:00  

Either way, I'm actually surprised they don't have a graph on that. Like, if there is a default key, it would be one they would see over and over in multiple different organizations if they were using it. So it seems like something Wiz could track, but that's a good question: how many default ones are used in multiple places, or a config

 

Marc Laliberte  33:21  

file that gets automatically checked into the repository because, like, the .gitignore file isn't correct in the example project. So either way, it was interesting seeing Anthropic as the biggest one; that felt like a bit of an anomaly to me. They also went into an analysis of, like, SDLC practices and deployment practices. They found that only 12% of GitHub organizations are using GitHub Actions, for example. So GitHub Actions is a pretty popular platform or mechanism for running, like, workflows, to, for example, trigger when you make a pull request into a repository, or to deploy your code. I guess 12% of them using GitHub Actions, that's not super surprising to me. Like, internally at WatchGuard, we're a pretty heavy GitHub shop, but we don't use GitHub Actions, at least at the company level; we use a different SDLC mechanism. Our team uses GitHub Actions, mostly because I prefer them, but the rest of our engineering team uses something else. They also looked at GitHub Actions specifically and the permissions that organizations are granting to them. They found that 90% of repositories that have GitHub Actions allow those actions permissions to write back to the repository, and 80% of them allow actions to approve pull requests too. So very powerful permissions are being granted to these workflows at pretty high percentages, like 90% can modify the code within the repository. Now, there is a legitimate need for some of these capabilities. For example, we've got a repository where, when we build a new version of it through GitHub Actions, that action goes back and updates the version number included in a configuration file in that repository, so it needs that level of permissions. But Wiz made some probably accurate assumptions that maybe that isn't required in all cases, and organizations are potentially granting more permissions than are actually necessary for these powerful tools. They also found that only 31% of private repositories have branch protection enabled; thankfully, 66% of public repositories do. Branch protection is a way to add additional controls that only allow checking code into, like, your specific branches if it meets certain requirements, like code reviews or passing validation checks by your CI/CD pipeline, and actually quite a few different options there as well. What do you think

 

Corey Nachreiner  36:01  

about that? Like, it seems the trend we're seeing is, when someone picks private, they do less security: they had more keys in their code, and now they're worrying less about branch protection, because they assume private is going to protect them. Whereas if it's public, yeah, there are still keys in code, but fewer, and more people turn on branch protection because they realize they're public, maybe. So I... I mean, I get the logic; it's definitely more dangerous to not have security things enabled when you're an open, public repository. But I feel like it's just an example of laziness, the crunchy exterior with the soft center, the lack of zero trust. You know, we already know that being private is like the crunchy exterior: you do have a hard shell around you, but everyone has teeth that might crack your shell. So why aren't you turning on the security things you can do internally?
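
For teams that want to close that gap, here is a sketch of enabling branch protection programmatically with the Octokit REST client rather than clicking through the UI. The org, repo, and status check names are placeholders, and the rule set shown is just one reasonable combination, not something prescribed by the report.

```typescript
// Sketch of enabling branch protection via GitHub's REST API using Octokit
// (npm: @octokit/rest). Owner, repo, and check names are placeholders.
import { Octokit } from "@octokit/rest";

const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

async function protectMain(owner: string, repo: string): Promise<void> {
  await octokit.rest.repos.updateBranchProtection({
    owner,
    repo,
    branch: "main",
    // Require these CI checks to pass before merging (names are hypothetical).
    required_status_checks: { strict: true, contexts: ["build", "tests"] },
    // Apply the rules to admins as well.
    enforce_admins: true,
    // Require at least one approving review and drop stale approvals on new pushes.
    required_pull_request_reviews: {
      required_approving_review_count: 1,
      dismiss_stale_reviews: true,
    },
    // No additional push restrictions beyond the rules above.
    restrictions: null,
  });
}

protectMain("example-org", "example-repo").catch(console.error);
```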

 

Marc Laliberte  37:03  

And, like, branch protection specifically is a really powerful and useful feature. We use it in the security operations team here at WatchGuard on just about every single repository we have, because these are tools that are running in production in our organization. We've got automation where, when stuff gets checked into main, the main branch, it gets deployed, and that should only happen after it's been reviewed, validated, and tested. You don't want people being able to just cowboy their new code straight into production. And I'm surprised that more...

 

Corey Nachreiner  37:37  

You might get a supply chain backdoor: someone breaks in, and now there are no checks before it's backdoored. Or

 

Marc Laliberte  37:43  

even beyond, like, security, just someone checking in bad code and breaking the application because there wasn't any testing, validation, or review before it. Like, these are important features. It takes a tiny bit of time to set them up, a non-zero amount of time, but it's totally worth it.

 

Corey Nachreiner  38:02  

is huge. Yeah.

 

Marc Laliberte  38:07  

And then, let's see, the very last stat they went into was that 77% of GitHub Apps have pull request permissions. So GitHub Apps, it's a mechanism similar to, like, Microsoft Azure AD application registrations, where you can give some other third-party component permissions within your GitHub repository or GitHub organization. The overwhelming majority of the permission grants given to these applications are just metadata read, so being able to read information about the repository, probably its security controls. I'm assuming that Wiz's own product requires metadata read in order to access repositories, which is probably why, like, 98% of them have that. But granting pull request permissions could potentially allow these applications to update code within your repository, so it's a very privileged permission. I was surprised seeing 77% of GitHub Apps granted that permission to the repository. I think, if I had to summarize some of the takeaways from this, you already hit one nail on the head: it feels like private repositories have less security than public ones, that's one trend I'm seeing. The other one is, it feels like there may be a lot of excessive permissions being granted in source code repositories beyond what is probably necessary, and I feel like that happens pretty commonly just across the space.

 

Corey Nachreiner  39:33  

The easy button, or just still ignorance? Like, sometimes they skip it because it's just easier to quickly set up and get to coding without going through all the setup steps. It's like making your RPG character: you could spend two hours perfecting it, but you want to get to the game. But if you don't spend a lot of time perfecting it, you may hate it. So do you think it's that, or do you think it's perhaps ignorance of what a repository does in the back end, or how these, you know, different things work?

 

Marc Laliberte  40:06  

I think it's got to be a mix of both. Like, the easy button, for sure, and like I was saying, that's a common trend across security in general: just give it admin, make it work, and then move on, versus actually setting fine-grained access controls like you should. But there probably is a bit of ignorance in it as well too. I would hope that organizations that are using Wiz are probably a little more security conscious than others. You

 

Corey Nachreiner  40:33  

would at least think so. Like, a public one could just be hobbyists, but if it's private, you're trying to protect it, you're probably trying to make money off it. And then you would argue, well, then you should be an organization with mature SDLC habits, and you should know all this stuff. So yeah, it would be weird for organizations not to know some of that.

 

Marc Laliberte  40:57  

Yeah, but either way, like, if you are a software engineering organization, or if you've just got a GitHub organization that you manage, I'd recommend checking out the report and trying to take away your own learnings from it. And really the key takeaways are: use the security mechanisms that are there for you, to help protect your code and protect you from making mistakes that might lead to your stuff getting compromised. And, God, stop checking cloud secrets into source code. Seriously, oops. Because maybe that'll end up with another billion-and-a-half-dollar heist from a cryptocurrency company, depending on who you are.

 

Corey Nachreiner  41:38  

Yeah, I'm going to start putting my house key in my mailbox.

 

Marc Laliberte  41:45  

That sounds like a great plan. Corey, I hope that works out for you. It's like the equivalent

 

Corey Nachreiner  41:49  

of cloud secrets in my code.

 

Marc Laliberte  41:54  

Is it okay? Well, we'll exactly

 

Corey Nachreiner  42:01  

Or on my front door: here, I have this application that's locked, but let me helpfully put my key right here on the mat for you.

 

Marc Laliberte  42:12  

Perfect. That still feels like a stretch of an analogy, but I'm too tired to fight against it.

 

Corey Nachreiner  42:22  

All right. My strategy is working. I'm wearing them down. Hey

 

Marc Laliberte  42:29  

everyone, thanks again for listening. As always, if you enjoyed today's episode, don't forget to rate, review, and subscribe. If you have any questions on today's topics or suggestions for future episode topics, you can reach out to us on, what is it, Bluesky? I'm at itsmarc.me; Corey, is it secadept dot bluesky dot whatever? And you can find both of us

 

Corey Nachreiner  42:50  

I'm bad with social media, Marc. I think you should just throw stones at my window or do it in skywriting. So yes, on Instagram,

 

Marc Laliberte  43:02  

yes, Instagram, WatchGuard underscore Technologies, if you want to, like, share a picture or something, whatever you do on Instagram,

 

Corey Nachreiner  43:09  

what do you have for lunch? Show me exactly.

 

Marc Laliberte  43:13  

Thanks again for listening, and you will hear from us next week.