Join Paul Ducklin and SolCyber CTO David Emerson as they talk about the human element in cybersecurity in our new podcast TALES FROM THE SOC.
In this episode, our insightful duo look beyond cybersecurity automation and algorithms.
What if we start out with a focus on cybersecurity culture, not on rules and rote?
Find Tales from the SOC on Apple Podcasts, Audible, Spotify, Podbean, or via our RSS feed if you run your own podcatcher app.
Or download this episode as an MP3 file and listen offline in any audio or video player.
[FX: PHONE DIALS]
[FX: PHONE RINGS, PICKS UP]
ETHEREAL VOICE. Hello, caller.
Get ready for “Tales from the SOC”.
[FX: DRAMATIC CHORD]
DUCK. Hello everybody.
Welcome back to Tales from the SOC.
I am Paul Ducklin.
As always, I am joined by David Emerson, CTO and Head of Operations at SolCyber.
Hello, David!
DAVID. Hey there, how’s it going?
DUCK. Very well, and I hope you think it’s going very well when I tell you what I’d like you to talk about in this episode, David.
It’s based on something you said last time.
You were imagining somebody implementing an MDM, a mobile device management solution.
And you talked about not doing it just automatically or algorithmically, but doing it in what you called a humane way, because you were taking people’s feelings and expectations into account.
So, if it doesn’t sound too pretentious, I’d like you to imagine that we are going under the headline, [DRAMATIC TONE] “Do it for humanity.”
DAVID. Yes, I think that sounds like a fine topic.
I’m down!
DUCK. Obviously, I’m not thinking, “Well, we’ve got to find a way to get all the plastic out of all of the oceans by tomorrow,” as nice as that would be.
[LAUGHTER]
We’re focusing on the cybersecurity angle.
So, how do you apply that, David, to cybersecurity, to build something that’s humane, that’s human-centric, that focuses on cybersecurity culture, not on rote and just implementing policies, as important as they may be?
DAVID. I think that the main distinction to draw in your mind administratively (this is a strategic decision) is to separate rules from culture.
There are the things that are rules: “We are going to have an MDM.”
And then there’s the cultural understanding of why an MDM has been deployed; how it will be used; how to interact with it; what your latitude is when something doesn’t work properly in the MDM for you as the user.
You see this with web filters.
You can make lists by rote, you can make giant lists of banned sites, or signatures, or categories of site…
…but very rarely is anyone making the list really stepping back and thinking, “Do we need to block this thing? Do our users have any sense of why we’re blocking the things that we’re blocking?”
So that they don’t just decide, “OK, well I can’t get to these six gambling sites, but I’m going to go and gamble at the seventh gambling site that hasn’t been blocked yet, because the people that made the rules just haven’t thought of it.”
When someone is antagonistically interacting with the rules that you’ve set, it’s likely that they don’t understand the reasons for the rules.
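To make David’s point concrete, here’s a minimal sketch, in Python with hypothetical domain names, of what rote list-based filtering boils down to: anything not explicitly enumerated is allowed, including the seventh gambling site nobody thought of.

```python
# A rote blocklist: ban the sites you've thought of, and only those.
BANNED_SITES = {
    "casino-one.example",
    "casino-two.example",
    "casino-three.example",
    "casino-four.example",
    "casino-five.example",
    "casino-six.example",
}

def is_allowed(hostname: str) -> bool:
    """Allow anything that isn't explicitly enumerated on the list."""
    return hostname not in BANNED_SITES

# The seventh gambling site, which nobody has listed yet, gets through:
print(is_allowed("casino-seven.example"))  # True -- not blocked, but not safe
```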
DUCK. Yes, it’s almost as though we’re rewinding to 1990s-style IT, isn’t it?
Anyone’s allowed to show up at any time and ask for anything they like, and the job of the IT department is to tell them that they are not going to get it. [LAUGHS]
Regardless of the exigencies of the situation, and the fact that there may be perfectly good reasons for you to do what you want to do without putting the company at risk.
DAVID. By failing to build that culture, the other thing that you preclude is building the sort of social norms that can create what I refer to as social defense.
It’s something that we mention in our live security awareness training sessions at SolCyber.
Social defense is one of the most legitimate, most powerful forms of defense against things like phishing, against things like the disclosure of confidential information, or potentially granting privileges that are inappropriate.
When someone sends you an invoice and asks you to pay it, and you’ve never heard of that vendor, you could be one call away from finding out whether anyone else in the office has heard of them either.
And if no one has, it’s a reason to be suspicious.
That’s social defense.
When all you have is a bunch of rules by rote… maybe someone forgot to train you in the ‘fake invoice scheme’, and now you’re not defending against it simply because no one ever said you should?
Now that’s ridiculous.
The culture really trumps rules almost any day.
The reality is you’re expecting skill and culture and social defense from your users, and you need to start imbuing that in them.
DUCK. Absolutely.
“Hey, the vendor’s got this massive list of links you shouldn’t click, and sites you shouldn’t visit, and email IP numbers that shouldn’t be sending you mail, and so forth.”
If that worked, then phishing would be a minority problem that occasionally caused a little bit of trouble for some people.
But actually, when you dig into the average vendor’s cybersecurity report these days, it seems that phishing attacks still rate as the leading cause of ransomware.
Why is that?
DAVID. Because we haven’t imbued the notion, the cultural notion, in our users.
I mean, it’s one of many reasons… but we haven’t imbued the cultural notion that their subjective analysis of something may actually be relevant, and that they need not perform things on a purely ‘defined rules’ basis.
We are all experts in our jobs.
We are all experts in the information that we handle on a regular basis.
We may not all be experts in cybersecurity, but that’s no different than someone who is the manager of a data center not being an expert in meteorology, but knowing a lot about what happens to a data center when a hurricane hits it.
That is a valid opinion of an expert.
It may not be perfect, and may not be entirely rules-based, and may not be meteorologically correct.
But that person knows something about data centers, and knows something about the intersection of data centers and hurricanes.
Think about the things that you’re expert in.
Think about the data that you handle on a regular basis.
Imbue in your users the culture that allows them to say, “I know when something smells weird. Nobody ever asks for this information in this way. We’ve never wired money to Hong Kong. We’ve never received an invoice as vague as this.”
Just because it is a valid format for an invoice, and it has the right currency, and it’s a number that you can technically approve to be wired…
…that doesn’t mean that any of those things add up to, “This is a valid invoice that I should wire money for.”
DUCK. I don’t think rules and procedures that exist just to protect your network can really do that, can they?
Because they don’t imbue you with that “Do it for humanity” spirit.
They just imbue you with, “Stick to the rules and everything will be OK.”
DAVID. And they imbue you with presumption.
There’s a presumptuousness in forming a litany of comprehensive (or presumed comprehensive) rules, and that presumptuousness bleeds into an organization.
It forms a bureaucracy which is, I would say, disingenuous or incomplete.
If either of us believed that we could comprehensively classify the data handled by a legal department at even a modest-sized company, I think we’d be full of cr*p.
You know, the reality is we probably couldn’t imagine the things that they have in their filing systems.
We probably couldn’t imagine the things that pass by their desks every day.
We could maybe imagine categories.
We could even interview them and get them to give us some of the major categories of things that cross their desk on a regular basis.
But they’d be the things that came to mind, they wouldn’t be comprehensive.
And so the only rational approach is to imbue that culture in the lawyers, in the people who have things crossing their desk, and give them archetypes.
Give them things to look for so they can spot a novel kind of contract that contains sensitive information of a type that was never considered by the IT department… because why would it be?
So, imbue the legal department with the expectation and the authority to say, “This is how we’re going to process this information because it is the safe way.”
And I really just think that’s the only way to do it.
It’s not practical to expect any central authority to have a comprehensive view of so many other jobs.
It makes no sense at all.
DUCK. I think that’s borne out, isn’t it, by the fact that, these days, cybersecurity vendors (well, some of them, anyway) pride themselves on the sheer scale of the protective lists they have.
It’s ironic that for 20, almost 30 years, the new kids on the cybersecurity block have said, “Oh, we’ve got rid of signatures!”
And yet a lot of cybersecurity defense is based on lists, and hash lookups for specific files, and individual URLs.
“This company has 3 billion URLs in its blocklist, but this other company has 3.1 billion URLs so it must be better.”
You’re then doomed to the problem of false positives, because if more is better, “Oh, let’s block more stuff,” then why not block absolutely everything?
Then you’ll be perfectly safe.
DAVID. Ah, you’ve described DLP!
[LAUGHTER]
I don’t think DLP fundamentally works very well, and I think that there are many reasons… most of them are related to exactly what we’re discussing.
DUCK. Yes, either you end up blocking everything, and then you need exceptions, and then the things that go through as exceptions are treated as ‘met the rules’ and therefore must be OK.
Or you try and do cybersecurity purely on the basis of volume.
So, yes, I didn’t want to mention DLP for fear that I might set you off.
[LAUGHTER]
If you have the right culture, then people who are deliberately not playing by the rules should, in my opinion, stand out more.
And you should have a better chance of actually catching them out before something bad happens than if you just rely on some regular expression to detect that something bad is coming in or out, or on whether some hash is on a blocklist or not.
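As a minimal illustration of that brittleness (the “known bad” hash below is hypothetical, derived from a stand-in payload), exact-match hash lookups fail the moment a single byte changes:

```python
import hashlib

# Exact-match detection: flag a file only if its hash is already known bad.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"malicious payload").hexdigest(),  # hypothetical entry
}

def looks_bad(file_bytes: bytes) -> bool:
    """Flag a file only when its exact SHA-256 hash is on the blocklist."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES

print(looks_bad(b"malicious payload"))   # True  -- exact match
print(looks_bad(b"malicious payload!"))  # False -- one byte changed, list is blind
```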
DAVID. You know, the actual attack itself may occur in areas of your business that you aren’t logging at all.
And so you have the fanciest DLP on Earth, but someone’s carrying around a USB stick.
Or you have the fanciest DLP on Earth, but someone’s using paper.
DUCK. Or the camera on their mobile phone.
DAVID. Sure, we can come up with examples all day long, and we still would not have come up with an exhaustive list.
And that’s the fundamental problem.
You have to bring it up a level, and teach your employees, and teach your organizational members, to do the same thing, right?
So that they are not just reading this ‘infinity of lists’ in your policies, but are capable of forming their own.
DUCK. [LAUGHS] Yes!
And I think the flip side of that ‘infinity of lists’, if you’re talking about blocklists, is that because of their propensity for inaccuracy, and the fact that they’re trying to please all of the people all of the time and will therefore displease you some of the time…
…you end up with the other side of the coin, don’t you, the allowlists?
“Yes, that app’s digitally signed by so-and-so. Well, we don’t know when they release new code, so we’re just going to let that through.”
And suddenly you have these authorized exceptions that nobody feels that they can stand up to, or say, “You know what, although this passed all the tests, it didn’t pass my smell test.”
And I don’t see why that should be disparaged by people who go, “Oh, well, user awareness is no use.”
In fact, a little bit of flexibility goes an awful long way!
DAVID. It totally does.
And anyone who says user awareness is of no use just isn’t considering that that’s all you have.
Because the simplistic application of rules by rote has so many holes in it, and is so blind to the manifold circumstances that will be encountered by your data and by the critical information that you’re trying to protect.
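Here’s a minimal sketch, with a hypothetical signer name, of the allowlist logic Duck describes: the decision keys off who signed the code, so brand-new, never-reviewed code passes on the strength of its signature alone.

```python
# Allowlisting by signer: trust attaches to who signed the code,
# not to what the code actually does.
TRUSTED_SIGNERS = {"So-and-So Software Inc."}  # hypothetical vendor

def may_run(app_name: str, signer: str) -> bool:
    """Wave the app through on its signature; app_name is never examined."""
    return signer in TRUSTED_SIGNERS

# Brand-new, never-reviewed code passes purely because of who signed it:
print(may_run("shiny-new-update.exe", "So-and-So Software Inc."))  # True
```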
DUCK. There are some things that users will need to know how to do by rote, because there’s a procedure that has to be followed, and it has to be followed in a certain order or the system won’t work.
But for the rest, there’s that need for people not only to understand what they’re supposed to do, but also to feel a sense of ownership in why they’re being asked to do it.
Otherwise they’re just being treated like automata, aren’t they?
DAVID. Yes, absolutely.
DUCK. So, David, maybe we can finish up by looking at another aspect of this Do it for humanity issue.
And that’s maybe an internal cultural thing, where I think some IT and security teams may struggle with what you might call internal social contracts inside a company.
And that’s something I’ve particularly seen with the attitude that some people take towards web filtering.
It’s very important that you don’t just turn that on without explaining to your users what it means for their expectations of end-to-end security, right?
Otherwise, it’s putting you in a rather invidious position of power over them, isn’t it?
DAVID. It can be.
I think that that’s something that you just have to navigate on an organization-by-organization basis.
There are organizations where that would not be unusual or even draconian, because of the nature of their business… it’s either sensitive, or confidential.
DUCK. Oh, don’t get me wrong: I’m not saying you shouldn’t do it.
But if you are doing it, it’s important that people realize what you’re able to see and what the risks are to them, because it may affect the way that they do things…
…like personal banking on their work computer, which the company may actually say it’s OK to do.
DAVID. Yes, it’s also something that you have to back up a step even then and think, “Why am I doing this?”
Maybe a more accessible, non-technical example is the writing of job descriptions.
We’ve all read job descriptions… if you’ve been a job seeker, every job description ever has just a billion requirements that actually aren’t requirements, qualifications that actually are irrelevant.
[LAUGHTER]
All of it is an indiscretion, an inability on the part of the person writing it to take a step back and say, “Why do I need this?”
Why do I need someone to have a graduate degree in computer science or to have 10 years of experience?
Is any of that relevant?
Does any of that align with the actual requirements of the job, the duties of the job?
Likely the answer is no.
It’s just sort of expectation, right, maybe just something propagated through societal norms or whatever?
It’s the same thing when you’re setting up a system and you’re doing things like deciding to ‘middle’ SSL certs (intercepting and inspecting encrypted web traffic), or deciding to use a little bit more inspection than you might actually require for your environment.
Someone needs to take a step back and say, “OK, what are our requirements?”
What are we required by regulatory bodies to do?
What is a good idea in our industry?
And then be transparent with your users.
DUCK. Yes.
DAVID. You know, “Hey, we’re a bank.”
We’re required to log every single transaction; we’re required to log every packet on our network.
So if that’s a problem for you and your personal life, don’t be sending packets across the network that you don’t want inspected.
DUCK. Absolutely.
DAVID. And likewise, if you’re a web design shop and you don’t have those requirements, take a step back and think about why you would even want to put that system in place.
Because you’re still going to be obligated, I believe, to tell your users that it’s in place.
But at the end of the day, is it really improving your business?
Or is it just helping your culture become a surveillance state?
There has to be a reason that you wanted that information; there has to be a reason that you’re going to analyze it.
Otherwise, it’s a waste of your time, and possibly an intrusion on your users’ privacy and their personal sense of freedom.
DUCK. Yes, if you want to avoid the ‘social contract’ side of it, you need to make sure that you aren’t the kind of person who’s keeping logs that you never ever look at, and don’t know how to look at even if you wanted to.
Because you simply might as well not bother.
You’ll save yourself [A] a lot of time, [B] a lot of money and disk space, and [C] you’ll probably make things easier for yourself on the regulatory front, because you won’t be collecting stuff when you’re not quite sure what’s in it.
And if you haven’t considered that, you could actually be making your cybersecurity compliance worse.
You might be losing on the roundabouts what you don’t get back on the swings.
DAVID. Yes, absolutely.
Basically…
Don’t do the things you don’t need to do.
Offload culture into your user base.
And acknowledge that all the problems you have, and that all the solutions, exist in humans.
I would say that’s really my best advice for keeping the humanity in cybersecurity.
DUCK. [FORMAL TONE] “Do it for humanity?”
DAVID. Do it for humanity!
[LAUGHTER]
DUCK. David, thank you so much for your time, and for your thoughts and insights.
And thanks everybody for tuning in.
Don’t forget, if you would like to get in contact with SolCyber, you can email amos@solcyber.com.
Amos is the SolCyber mascot, the nine-banded armadillo of Texas.
You can simply visit solcyber.com.
And if you’re looking for technical articles that aren’t sales spiel, then please head to solcyber.com/blog.
Thanks for listening, everybody, and until next time, stay secure!
Catch up now, or subscribe to find out about new episodes as soon as they come out. Find us on Apple Podcasts, Audible, Spotify, Podbean, or via our RSS feed if you run your own podcatcher app.