Join Paul Ducklin and SolCyber CTO David Emerson as they talk about the human element in cybersecurity in our new podcast TALES FROM THE SOC.
In this episode, our insightful duo ask just how human-friendly the cybersecurity industry really is.
Does the industry need as much esoteric jargon and as many complex components as it has?
Or is the burden of “more tools, more tools” weighing us down?
Find Tales from the SOC on Apple Podcasts, Audible, Spotify, Podbean, or via our RSS feed if you run your own podcatcher app.
Or download this episode as an MP3 file and listen offline in any audio or video player.
[FX: PHONE DIALS]
[FX: PHONE RINGS, PICKS UP]
ETHEREAL VOICE. Hello, caller.
Get ready for “Tales from the SOC”.
[FX: DRAMATIC CHORD]
DUCK. Hello everybody!
Welcome back to another episode of [SERIOUS VOICE] “Tales from the SOC”.
I am Paul Ducklin, and I am joined as usual by David Emerson, who is CTO and Head of Operations at SolCyber.
David, good afternoon for you, evening for me.
DAVID. Thank you, good to be here again.
DUCK. David, this week I know that the SOC machinations that you would like to talk about can best be summarized as “Cybersecurity coming down from the ivory tower.”
Why don’t I hand over to you so you can tell us why you think this is really important?
DAVID. It’s really important for a number of reasons.
Maybe I more frequently see it as “Not ascending the ivory tower in the first place.”
The fact of the matter is that a lot of cybersecurity is far more practical and less compartmentalized than it is presented by the industry that is cybersecurity, and than it is imagined by those who are not in that industry in an explicit sense.
Many very pragmatic practices around technology and the management of systems are wildly underestimated, and this results essentially in leaning on esoteric practices of an industry that frankly isn’t capable of offering a comprehensive solution to defense against an asymmetric threat.
DUCK. To summarize, you might say that cybersecurity has become increasingly obsessed with ‘shiny new objects’, and this takes your mind off getting the basics right.
It perhaps even distracts attention and budget from doing things that you could have done five years ago, should have done four years ago, and have been remiss for not doing three years ago.
DAVID. Absolutely.
If you take the recent CrowdStrike incident, for example…
DUCK. [LOUD LAUGHTER] We got a couple of minutes in before you mentioned it!
DAVID. Yes, it’s a big one… [LAUGHING]
But it’s an example.
That’s not a cybersecurity incident in a strict sense, but it’s an incident that springs from the practice of cybersecurity.
That’s why you would have CrowdStrike on a machine.
Was it a bevy of reverse engineers that fixed that?
Was it some kind of a special programming course that you gave your developers to ensure that they could manage memory?
No, not at the affected organizations.
At CrowdStrike, perhaps it would have been helpful to have better memory management, but at the end of the day, what was really necessary was someone in front of a computer following remediation instructions.
And in order to do that, in some organizations, that might take the form of IT that they didn’t have, or presence in an office that they didn’t have.
Or some kind of mitigating control because they recognized that they are spread out all over the place, and if all their machines go down in ways that their users can’t fix themselves, they’ve got a real problem.
That’s super, super practical.
That is not a cybersecurity professional’s typical remit, but it’s something that ultimately mitigates a massive vulnerability in an organization.
DUCK. Yes, as I’ve heard you say before, it’s as though cybersecurity is better approached as a culture.
I think we seem to have lost a lot of the flexibility that perhaps we had 10 or 20 years ago by assuming that these automated tools that are gathering petabytes of telemetry and making instantaneous decisions and pushing them back in the cloud…
…that they should solve everything.
And then when they don’t, as you say, well, somebody has to get the recovery keys out of the safe and has to call someone up who doesn’t know too much about BitLocker…
DAVID. Yes.
It goes both ways.
I don’t see it as strictly a unidirectional culture exchange whereby cybersecurity is infused in the broader practice of technical systems management or even of just running a business.
Honestly, it’s not as compartmentalized as technical systems management in some ways, but I think that what has happened, what has changed, as you alluded to, is that there may have been a time that the practice of systems administration was more flexible or more able to adapt, to ‘flex’ into the practice of cybersecurity, even though it may not have been called that.
I think that, to steal a Joycean term, we’ve become these creatures driven and derided by vanity – in this case, the vanity of the marketing around the notion of cybersecurity.
The notion of it being a ‘compartment’ that can be sold; the notion that when your company is threatened by ransomware, your response should be to buy more things, your response should be to somehow engineer a solution that will preclude you falling victim to this ransomware.
Well, I don’t personally believe you’re going to do that.
I don’t necessarily think that those systems don’t work.
Occasionally they might, but at the end of the day, there could be a novel ransomware that you get hit with, and you’re going to need something more robust as a response to that threat than hoping that you’ll always catch it.
And certainly more robust than once you get hit, relitigating why you didn’t catch it.
Get your backups out.
Did you test them?
Do you have them in the first place?
Are you in a situation where you can respond to having been successfully attacked, which does happen, rather than prevent yourself from ever being attacked, which is unrealistic, I believe?
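[Editor's note: David's "get your backups out… did you test them?" checklist can be sketched as an automated restore test. This is purely an illustrative sketch, not SolCyber tooling: the archive layout (a tar file whose entries are stored relative to the backup root's parent directory) and all paths are assumptions.]

```python
# Illustrative sketch: restore a backup archive to a scratch directory and
# verify every file's checksum against the live originals. Assumes the tar
# archive was created with entries relative to the backup root's parent.
import hashlib
import tarfile
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 in 64 KiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(archive: str, originals: Path) -> bool:
    """Restore the archive to a temp dir; compare each original file's hash."""
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(archive) as tar:
            tar.extractall(scratch)
        for original in originals.rglob("*"):
            if original.is_file():
                restored = Path(scratch) / original.relative_to(originals.parent)
                if not restored.is_file() or sha256_of(restored) != sha256_of(original):
                    return False
    return True
```

The point of the sketch is that "do you have backups?" becomes a yes/no answer a machine can give you on a schedule, rather than a question you first ask mid-incident.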
DUCK. I think we saw that, to go back to the CrowdStrike incident, with people going, “This is outrageous! I have to get all my BitLocker recovery keys out, and I can’t because they’re on my Active Directory server, and that crashed as well.”
DAVID. [LAUGHING] Yes… not CrowdStrike’s fault.
DUCK. [THEATRICAL ASTONISHMENT] You locked all your secret keys in a safe that you can only open under conditions when you don’t need those keys?
How many other ways could this possibly go wrong for you?
What if someone had stolen that server?
What if it fell off a cliff?
What if it caught fire?
What if there was a flood in the data center where you happened to store it?
DAVID. This is something, I think (it’s not all organizations, but many organizations), where someone at that organization probably knew that was a vulnerability.
But they may not have thought that their remit as a professional included calling out the possibility that the cybersecurity software could cause a fault.
And so this is why you test backups, because you don’t find yourself negotiating with ransomware gangs then.
Not because they have some esoteric knowledge of how a ransomware gang operates, or anything like that.
I just see it as really easy to get lost in the marketing, to get lost in the vanity of this sort of abstract achievement where we’ve got this rigid, fragile system that isn’t ultimately going to protect us like practical activities would, such as backups.
DUCK. So you’re not saying we shouldn’t have some specialized cybersecurity tools.
You’re not saying those are bad.
What you’re saying is that the idea that this seems to be our collective response, every time there’s a new type of threat or an old threat that just succeeds against our expectations… that doesn’t really help, does it?
It’s just like, “Hey, more tools, more tools, more tools!”
And as Amos the Armadillo likes to say on the SolCyber website, tools alone don’t cut it.
DAVID. And “more tools, more tools” has a cost.
That isn’t to say that these tools don’t do something.
Many of them are even good at what they do; they’re effective.
The point, however, is that an enterprise that needs, let’s say, a lock on a first-floor door that they know is wide open instead buys itself an anti-aircraft battery on the roof and bars on the 16th-floor window.
But the first floor is still unlocked.
And that’s really what ends up happening in this hidden cost of distraction, this hidden cost of diverted effort.
There are many, many non-academic, so to speak, pursuits that any given organization can undertake that will be aligned with activities that they probably don’t think of as cybersecurity but which will protect them anyway.
Defense in-depth is always going to be more effective than the most effective point solution.
I just think that compartmentalizing too much the notion that cybersecurity is a practice, and that there are tools for cybersecurity, especially ones that are of very limited general-purpose applicability, is hazardous to the practical operation of an enterprise technical system.
DUCK. So, worrying about security on the 16th floor, when there’s no security at street level…
…it’s sort of putting the cart before the horse, isn’t it?
DAVID. It is.
DUCK. And to get that to work is complicated, is time consuming, and it gives you therefore less time and budget to do the things that you should have done first, in fact that you probably should have done a year ago, two years ago, three years ago.
But you’ve now got this convenient reason to push them down the list again because, like we said earlier, ‘new shiny object’.
DAVID. Yes, and listeners might think, “Oh, well, we’ve mentioned backups a lot.”
This is not a limited list of activities that I would consider non-ivory tower.
There are many, many practical activities that fall in this bucket.
Patching!
How many organizations actually patch their stuff?
Well, I can tell you that it’s not a lot.
I’ve seen plenty of organizations with ever-growing numbers of critical vulnerabilities…
…though I’m not a massive believer in any hard-line “every critical vuln disappears within 30 days” rule – I think that’s unrealistic; I know that it doesn’t actually happen.
But what does and should happen at an organization that has its act together is that the critical vulnerabilities are dwindling at a rate that is at least treading water against, or perhaps beating, the rate at which new ones are coming out.
DUCK. I think, David, there’s also the problem that quite a lot of people, if you ask them, go, “Yes, we’ve got patching sorted out, because everything can update automatically these days.”
That would be great if the automatic update process actually worked.
But if you don’t go back and check, you have an ultra-false sense of security.
You could have updates that somehow got skipped because of a bug; you could have updates that happened, then got rolled back because there was some emergency that somebody forgot to record.
Or you’ve got the proverbial forgotten server in the cupboard under the stairs that hasn’t been patched for longer than you can imagine.
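[Editor's note: the "go back and check" step can be sketched as a simple staleness report over a patch inventory. The inventory format here (host name mapped to last successful update time) and the 30-day threshold are illustrative assumptions, not the output of any real tool.]

```python
# Illustrative sketch: flag hosts whose last successful update is older than
# a threshold, so silently failed auto-updates (and the forgotten server in
# the cupboard) show up on a report instead of in an incident.
from datetime import datetime, timedelta, timezone

MAX_AGE = timedelta(days=30)  # assumed policy threshold, not a standard

def stale_hosts(inventory: dict[str, datetime], now: datetime) -> list[str]:
    """Return host names whose last successful update is older than MAX_AGE."""
    return sorted(host for host, last in inventory.items() if now - last > MAX_AGE)
```

Feeding this from whatever source of truth an organization actually has (MDM, config management, even a spreadsheet) is the unglamorous part; the report itself is trivial, which is rather the point.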
DAVID. Appropriate staffing is potentially what stands between you and recovering from an event that requires physical access to your machines.
Those sorts of things are not cybersecurity concerns in a traditional sense; they’re probably better seen as governance concerns.
DUCK. And yet they have a powerful impact on your cyber*IN*security if you don’t do them well.
DAVID. Absolutely.
DUCK. I suppose, as well as all the other things, that lots of patches may fix bugs that could be security exploits, but are more likely perhaps just to cause downtime or crashes that don’t let cybercriminals in.
That’s still a bad look for your business, and it’s still bad for the bottom line of your business, and it could nevertheless get you in trouble with cybersecurity-related regulations.
[ATTORNEY-TYPE VOICE] Well, who had access to your systems?
Did that data get stolen?
Can you show that your data was encrypted before that laptop went missing?
And so forth.
DAVID. Yes.
One of the successes of the cybersecurity profession is getting people to realize that there may be reasons to install updates that are not feature-based.
That’s not an impressive feat culturally, but it is necessary.
It’s necessary that your system administrators aren’t merely considering whether a dot release of a patch that they’re thinking about applying to a firewall will give them any additional functionality, or whether or not they should ever upgrade their accounting software because they don’t need the new functionality, so they don’t need that patch.
That’s not ivory tower.
That’s quite pragmatic.
That is the cultural shift that I’m referring to.
When your sysadmins think beyond, “Do I need additional features?” and consider whether or not they need additional protection.
DUCK. And don’t be afraid of change just because you’ve got terribly used to something.
DAVID. Yes.
DUCK. Because that’s something that the crooks dine out on, isn’t it?
If they can see that you’ve lapsed into a series of behaviors that you would like to change, but you can’t because you’re stuck in the past, that’s where they’re going to focus their attention to try and break into your network and do the next bad thing?
DAVID. Of course.
And this is a systemic ill.
[THINKING] Well, the only anecdote that comes to mind right now…
We once had a customer, a manufacturer.
They had a system that was utterly obsolescent.
I mean, an operating system that simply hadn’t received patches in seven years at least.
They did experience an actual exploit in that system, and when we notified them that this system would continue to experience lapses in security as a result of its unpatchability…
…the response we got was, “Well, you know, that would cost us $15,000 to rewrite.
We hired a developer in Switzerland 20 years ago to write the application that drives this machine on an obsolescent operating system, and it doesn’t run on the new operating system.
And so that would be $15,000 to rewrite.”
That is just a failure of contextualization.
You know, that probably cost them more than that in the one outage that I experienced with them.
I really think that it’s a lack of cybersecurity culture that would drive anybody to make penny-wise and pound-foolish decisions like, “Oh, we can’t possibly rewrite that software for the modern era,” which moved on seven years ago at the very least.
DUCK. Like you say, you need a culture that *can* move when it needs to.
DAVID. Yes!
DUCK. And is also willing to think about moving when it *wants* to.
Instead of just going, “No, we’re going to keep using MD5 and SHA-1 for the next 500 years.
Because that code was hard to write, so clearly it’s hard to understand, and that means it’s impossible to change.”
Isn’t that the warning sign that you need to grasp the nettle right now, and do something about that?
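[Editor's note: as a concrete instance of grasping that particular nettle, here is a sketch in Python's standard hashlib of keeping the digest choice behind one small wrapper, so the MD5/SHA-1-to-SHA-256 move is a one-line change rather than "impossible". The wrapper name is hypothetical.]

```python
# Illustrative sketch: route every digest through one wrapper so the
# algorithm can be upgraded in a single place.
import hashlib

def digest(data: bytes, algorithm: str = "sha256") -> str:
    """Hash bytes with a configurable algorithm, defaulting to SHA-256."""
    return hashlib.new(algorithm, data).hexdigest()

legacy = digest(b"example", "sha1")  # collision-prone; keep only for compatibility
modern = digest(b"example")          # SHA-256: 64 hex characters
```

Code written this way can keep verifying old SHA-1 values during a migration while producing only SHA-256 for anything new.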
DAVID. It’s a warning sign.
It goes both ways, though.
And you can overcook it in the direction of over-influence from the cybersecurity profession just as easily as you can overcook it in the direction of under-influence.
The under-infusion of cybersecurity culture in business decisions happens more frequently… and this is a bias, probably, but I more frequently feel that businesses under-emphasize the risk of not accounting for the cybersecurity implications of their decisions, or lack of decisions.
So the successfully integrated culture, from an organizational perspective, will exhibit behaviors such as accounting not merely for funding, and not merely for budget concerns, and not merely functional concerns, but also cybersecurity and governance concerns when the organization makes business decisions.
It’s incumbent on the successful cybersecurity organization to account for the opposite.
Not everything is cybersecurity.
The business probably does something that isn’t cybersecurity, and you are facilitating that ‘doing of something that isn’t cybersecurity’ by keeping them safe.
To that end, it’s really incumbent on both sides to recognize that they need to meet somewhere in the middle, and that the business needs to start making decisions with cybersecurity in mind… or at the very least governance if you want to bring it up a little bit to something that might be more commonly understood.
And the cybersecurity organization needs to understand that their solutions cannot be so esoteric and expensive and impractical that they are just looked at as an ivory tower, or otherwise distracting.
DUCK. Yes, there are surprisingly many cybersecurity companies, but there are orders of magnitude more that aren’t cybersecurity companies. [LAUGHS]
DAVID. Right.
DUCK. David, if I can sound a bit salesy… I don’t really mean to, but I just want to mention this anyway.
From SolCyber’s point of view, I think what differentiates SolCyber from a traditional service provider is that advising on, bringing along, and buffing up that kind of culture is just as important a part of what SolCyber offers its customers as, “Yes, we know how to run the console; yes, we know how to read reports; yes, we know how to feed this all into a log centralization system.”
There’s a lot more to it than that, isn’t there?
There is building an environment where users feel that when IT say, “Look, I’m sorry, you can’t do that,” they have a reason to accept it.
And likewise, if IT go to users and say, “Look, we’re going to need your help with doing something,” then users will go, “Sure, I’ll participate willingly because it’s for the greater good of all.”
DAVID. Yes, we do advisory work all the time with our customers.
It’s not even something that we necessarily sell or promote, but it is really common, especially among the customers who want to do better and realize that there is a cultural element to this.
It’s really common for them to ask us questions like, “How should we implement, I don’t know, an MDM tool in a way that is not only humane, but also productive and secure – do you have tips?”
That’s not uncommon.
We may not always know the answer for their organization, but we see this a lot; it’s something that we do professionally.
And we’re more than happy to help them along with tips on ways to infuse a cybersecurity culture in their organization without just annoying everybody along the way.
DUCK. David, I think that’s a great place for us to wrap up.
And if I may read out something that you wrote to me earlier in the week…
Many companies would be better served by spending time and money coming up with realistic responses to cybersecurity problems than by trying to build what you call the “rigid fragility of a deterministic defense.”
In other words, “We’ve got this checklist; when something bad happens, we look it up.
Oh, look, it’s threat 964201, and this is what we need to do.”
Because sometimes either you can’t find the item on the list because you don’t know its magic number, or it’s something new and isn’t on the list yet at all.
DAVID. All the time!
[LAUGHING] Always something new!
DUCK. So if people want to get in touch with you and with SolCyber, and they want to find out about things like pricing, and what’s available, and how they can sign up for a demo… where do they go?
DAVID. Amos@solcyber.com is one of our general mailboxes.
And you can check out our website, SolCyber.com.
DUCK. Excellent.
And don’t forget, if you’re looking for community-centered articles where you can learn more about cybersecurity, you can also head to SolCyber.com/blog.
Now, obviously, on the SolCyber blog, there’s stuff about SolCyber, but there is also a wide range of articles about everything from HTTPS to VPNs.
Thanks for listening, everybody, and until next time, stay secure!
Catch up now, or subscribe to find out about new episodes as soon as they come out.