

Waking up from Breach Fatigue
As cybersecurity battle cries go, “Assume breach” isn’t very convincing. Duck and David explain how to adopt a more positive, proactive mindset.
Find TALES FROM THE SOC on Apple Podcasts, Audible, Spotify, Podbean, or via our RSS feed if you use your own audio app. Or download this episode as an MP3 file and listen offline in any audio or video player.
[FX: PHONE DIALS]
[FX: PHONE RINGS, PICKS UP]
ETHEREAL VOICE. Hello, caller.
Get ready for TALES FROM THE SOC.
[FX: DRAMATIC CHORD]
DUCK. Welcome back, everybody, to TALES FROM THE SOC.
I am Paul Ducklin, joined as usual by David Emerson, who is CTO and Head of Operations at SolCyber.
Hello, David.
DAVID. Hey there.
DUCK. David, I’ve prepared a subject that I think is going to entitle you to have what I would like to think of as a rant, but nevertheless with politeness and thoughtfulness.
That topic is: Waking up from Breach Fatigue.
The undercurrent of that is that lately, if you look at Collins Aerospace; or the airports in Europe that got taken out for a week or so; Marks and Spencer in the UK; Jaguar Land Rover, now being billed as the costliest ever cyber-incident in the United Kingdom, with production lines frozen for weeks and weeks…
Why is it that companies seem to be, I wouldn’t say happy, but at least prepared to wait until a breach happens, and then go crazy trying to fix it?
Why aren’t we more proactive?
Is it really that hard?
DAVID. It’s all difficult.
It’s hard to pay a ransom; it’s hard to build a cyber security program.
DUCK. Yes. [LAUGHS]
DAVID. There’s no easy answer.
DUCK. True.
DAVID. It is fundamentally because humans are very bad at risk analysis, and don’t really have a sense of the real economics of delay.
This deferral of risk, this notion that, “Oh, we’ll wait until we’ve been breached and we’ll deal with the breach…”
It’s not neutral, right?
It’s not a one-time thing.
It’s a compounding issue.
Your attack surface expands with every new tool you have, every new partner, every API.
If you postpone mitigation of that, if you postpone maturity, it really means that you’re fighting this sort of implicit “infinite tomorrow” with yesterday’s defenses.
DUCK. [LAUGHS] Yes, I like that way of putting it.
And ironically, in many cases, those tools that you keep adding may well be yet more cybersecurity tools to try and paper over the cracks that last year’s cybersecurity tools claimed to fix, but somehow didn’t.
DAVID. It is analogous to saying, “Oh, well, we have IR firms on hand, and we have lawyers, and we have PR agencies, and stuff…”
The things that you start using after an incident occurs.
And those aren’t security, they’re damage control.
If you take a step back, I think the missing piece for a lot of people in this economic analysis is that the return on investment for proactive controls, things like multi-factor authentication, hardening your endpoints, backup testing, employee training…
…this really is so much less expensive than the sunk cost of post-breach chaos, where you have lawyers on high alert and high bill rates; IR firms on high alert; PR agencies working for you trying to develop a marketing strategy for what is going on; and potentially the long tail of damages as well.
And there have been many studies about this.
There are studies that very clearly delineate the average cost of a breach, and the average amount saved by proactive measures.
And it’s not nothing – it’s millions of dollars on average, which could be saved, essentially, in terms of proactive remediation.
But I think people are missing that economic analysis, and they’re fundamentally very bad at assessing risk.
DUCK. Now, when it comes to employee training, as the kind of very loose category that you mentioned, or human awareness, human responsiveness, a culture of trying to do the right thing… that gets a terribly bad name these days, doesn’t it?
The idea being that, well, PEBKAC, or “Problem exists between keyboard and chair.”
I’ve always thought that that is [A] a very demeaning sort of thing to say, and [B] very defeatist, just like the saying “Assume breach” that CISOs seem to preach about these days.
I get the idea.
What they’re saying is you should be prepared if the worst happens.
But the way we hear that term “Assume breach” these days is almost as though companies are inviting us to accept that they *are* going to get breached, and there’s nothing they can do about it, so we kind of have to suck it up.
DAVID. One of the many sort of psychological constructs behind delay, so outside of the economic analysis, there is this sort of tactical mentality.
It’s almost a short-termism, right?
That security products, the tools that you buy to satisfy an initiative to become cyber-mature, and the people that you hire, don’t show an immediate ROI until an incident occurs.
And training is in that bucket as well, so to your point, people don’t really understand why they’re doing the training, and I think that’s because they don’t show an immediate return.
And even when the incident occurs, it can be quite hard to link the cost of an incident back to the things you could have been doing.
And in fact, it’s probably not your top priority.
Once you’ve been breached, you’re probably just trying to clean the breach up.
DUCK. Absolutely.
DAVID. So I think that that sort of short-termism, that myopic perspective on return, is part of why you see this kind of, “Oh, we’re just checking a box, and, you know, we have this super uninspiring training, and what even does it do?”
And there are other psychologies as well.
Coming to mind, I would say a lot of companies think that they’re too small to be a target; they have this optimism that perhaps they will not become one.
A lot of companies equate compliance or governance with security.
Not to say that SOC 2 or ISO 27001 are bad ideas, but they’re not the same as security.
And then finally, the most common, really, is the cognitive dissonance, essentially, that leadership believes they can defer security, like they defer marketing spend.
To not do security is merely taking a bet on the side of not being breached, and they won’t have to worry about it.
But the reality is that attackers, and your attack surface, compound a lot faster than the money that we’re saving by pretending as though they don’t exist.
It just gets more and more likely by the day that you will be breached, if you write those risks off like that.
DUCK. And this sort of pushback against the idea of employee training, for want of a better term, seems to me a little bit like saying that your employees are never really going to be smart enough, so automatic tools and techniques will just have to be enough.
And exactly the opposite is sometimes true, isn’t it?
I think in all the breaches that I mentioned earlier, the Collins Aerospace one; the Marks and Spencer one; the Jaguar Land Rover one…
We’re talking about people who, very loosely speaking, phoned up and said, “Hello, I would like to get access to the system because I’ve been locked out – will you help me?”
DAVID. There’s a lot to unpack there.
Your point about the problem being between chair and keyboard?
I’ll admit [EMBARRASSED LAUGH] to having thoughts about users now and then myself…
But the reality is that complaining about humans “being weak”?
It’s like complaining that you don’t like the taste of water.
It is life-sustaining!
Your business relies on humans in order to do its work, in order to create its value; your body relies on hydration in order to do its thing.
So water might not be your thing, but you’re going to have to drink it.
And I think that that’s essentially an analogous complaint with technology and humans, and their use of technology.
You’re going to have to train your people, and you’re going to have to create systems that they can work in without exposing you to unnecessary vulnerability, given the people you are inevitably going to have.
I don’t really see a way around that.
In fact, I think in some ways, kicking the can down the road can endanger your workforce in a less literal sense… in a talent sense.
Good engineers don’t want to work for companies that get breached repeatedly.
It signals dysfunction.
Investors don’t want to invest in companies that get breached repeatedly.
It signals dysfunction; there will be a valuation discount.
Cyber insurers will increasingly deny payouts or deny insurance policies for un-remediated vulnerabilities.
DUCK. And I guess, when it comes to something like employee training, you often hear people saying, “Well, the problem is there are just so many bad students.”
The other way to look at that is maybe the teachers aren’t getting the message across properly.
DAVID. Or they’re not creating an environment that is naturally resilient.
I don’t necessarily believe that anyone could teach a group of people how to fend off sophisticated or even modestly sophisticated cyber security threats.
I really don’t believe that teaching in a rote sense will protect them.
I do believe that some amount of ground knowledge, like foundational knowledge, combined with an environment that is fundamentally resilient…
I *do* believe that that can be effective.
DUCK. I agree.
I think you gave an example, in a previous podcast, of an employee who’s been blocked from going to six gambling sites.
When they find the seventh gambling site isn’t blocked…
Depending on the culture of the organization, they’re either going to go, “Whoops, that’s obviously a mistake, I’d better call it in.”
Or they’ll go, “Right! I found the one that’s allowed,” and go ahead and use it anyway.
And that’s a very dangerous hole to fall into, isn’t it, if you’re trying to do everything by rote, or by spreadsheet, or by checkbox.
DAVID. In a more relatable example at a strategic level, think about the people that do wire transfers for your company.
The people that are just trying to do their job in the accounting or finance department, and they have wire transfer authority.
You know what *not* supporting them through a potential breach looks like?
Well, it looks like giving them that wire transfer authority and giving them some security awareness training, and leaving it at that.
Because essentially, at that point, they’re required to make a call on whether or not they should be wiring money to someone, even when it seems like a good idea.
DUCK. Yes.
DAVID. At some point, they’re going to fall for a phishing scam, and they’re going to think, “This is a good idea.”
They’re going to try to get their job done, and they’re going to wire $363,000 to a bad actor.
This is just a hypothetical example.
But that is something that you can provide protection in excess of the traditional training and the hope that they do their job 100% of the time accurately.
And that protection could be social in nature; it could be procedural in nature.
Your systems might not fundamentally allow them to wire money to a novel account, or maybe not allow them to bind a novel account to a new vendor without some kind of verification, or two-man key, or something like that.
There are many ways to ultimately effect that change, but really, *that’s* what it looks like to properly support your people.
Support does not mean, essentially, “Here’s some sort of high-level training, and also we hope that you’re a robot.”
Because people are not going to be robots; they’re not going to do the right thing 100% of the time.
They’re going to forget; they’re going to fall for something, even if they took the training and meant to apply it.
DUCK. So what sort of things can companies do about this, to avoid just pouring more and more money into tools, and more and more money up front into doing things that feel like they’re ticking off one box in a grid?
How about actually investing wisely in a mixture of tools, technology, culture and humans, to make it less likely that they will be breached, and maybe to get them out of that “Assume breach” habit?
Not that I’m saying they should assume that they will always get the better of the cybercrooks.
But maybe that they’ll present a less defeatist front to the world, and to their customers, and to their own staff.
DAVID. The first thing that a company can do, and by a company, I really in this case mean a person in a strategic position to advocate for security…
So it might be a CIO, or CISO – that depends on the company.
The first thing to do is to define the strategic payoff for your company.
Typically, companies that, let’s say, treat security as a strategic asset, their cyber security maturity as a strategic asset…
You might find that in some industries they win better contracts.
Maybe they’re a government contractor and they can win contracts because they’re CMMC-certified.
Maybe they win contracts because their customer trust audits pass all the time, or they have SOC 2, or whatever that looks like.
Think about the value of your brand.
Is your brand the kind of brand that will turn a breach into a credibility moment?
Or is your brand the kind of brand that a breach is catastrophic for, no matter what?
Think about your investors.
Are you leaving VC or private equity money on the table, or even independent investing – public market money – on the table, as a result of your management discipline being a little bit lax in the cyber realm?
If you’re actually trying to fix this, define what the strategic payoff will be, and then start your slightly more tactical work of quantifying, prioritizing and implementing.
DUCK. Do you think that we, as in the cybersecurity community, need to bring what you might call a little bit more mellowness or “co-opetition” to the table?
Because I regularly find new startups coming onto the cybersecurity scene just saying, “We’ve got this brand new thing, which will deal with one part of the problem so well that you can take your eye off the ball with the rest.”
DAVID. We’re part of the problem.
Education, and the propagation of the notion that being breached is a quotidian thing.
It is an ordinary, everyday thing, and it should be unsurprising.
I think that that notion absolutely needs to be more commonly discussed, and I think that that notion is anathema to a lot of marketing departments because I don’t think it’s very sexy.
But, at the end of the day, these breaches, they are not surprises to anyone.
They’re essentially… I guess you could think of them as “invoices for neglected risks.”
DUCK. [LOUD LAUGHTER] I shouldn’t really laugh, should I?
That’s a great way of putting it, and it sounds funny, but that’s not a great position to be in, is it?
DAVID. Well, it isn’t.
But we’ve said it before, right?
“The pain of discipline, or the pain of regret.”
It’s seen as a cost center by a lot of companies that haven’t had it explained to them that *this is actually the cost of being in business* in a lot of cases.
That attitude is not promoting the idea that we buy more tools; that attitude is not promoting the idea that I need a hundred-person team, or anything like that.
That attitude is promoting the idea that this is background radiation, and we need to deal with it because it isn’t going away.
Nobody should be surprised by the efforts necessary to sustain life by drinking water!
It’s just the way it goes.
I really think that the industry does its customers a disservice, and I think that the industry does itself a disservice, in focusing on the flashiness of an exploit.
The reality is that most of these breaches are the least interesting possible thing… they’re just *effective*.
DUCK. So if you were a person who was responsible for guiding cybersecurity in a business (let’s say a small or a medium business, not a massive one), where would you start?
What would be the things that you would do first to try and adjust your culture, and manage your cybersecurity risk better?
DAVID. I mentioned earlier that you have to define the strategic payoff for yourself, for your company, for your industry, for the customers that you have.
Define what it is that “having more cyber-maturity” is going to do for you.
Whether that’s, “We won’t be out of production for a while.”
Or, “We’ve got better uptime,” or whether that’s, “We’re going to win better contracts.”
Or, “Our brand is going to be better protected and more durable as a result of not having constant crises,” or, “We’re going to attract investor funds.”
So define that strategic payoff.
And then I think another thing that you can do… and that’s why we have a product at SolCyber.
Hire somebody to do the day-to-day, because the technical grind that is cybersecurity is likely of very low value to your company, unless you’re in the cybersecurity industry, or in a very niche, high-security industry.
Hiring out the day-to-day will allow you to put resources that aren’t distracted into the gaps between the things that are done by the MSSP, by your managed security team – the day-to-day stuff – and that strategic payoff being realized.
Those are typically specialized things.
So, let’s say you’re in trading, finance, or something like that.
And your day-to-day is handled by a managed security provider, but you have some particularly critical trading systems that have certain compliance checks on them that have to be done.
Or certain kinds of hands-on activities for remediation, or security, or monitoring.
That, probably, is the domain of your team, but at least now because you’ve done this strategic payoff analysis, and because you’ve hired somebody to handle the day-to-day․․․
You know that your engineers need to look at these trading systems; they need to do specialized activity on those trading systems.
So I really think that’s how to approach it.
Describe to yourself what it is you’re doing.
Hire someone to do the day-to-day.
And then fill gaps that are high value to your business, using your own employees and your own effort.
And that’s how you’re going to save money doing all of this, as well.
DUCK. So, to finish up, David, would you agree with the notion that if you’re inclined to think that you need to go for some kind of certification in order to show that you’re cyber-secure…
…that perhaps you need to turn that on its head and say, “Let’s make ourselves cyber-secure so that the certification comes naturally, and that we don’t have to go and retrofit a whole load of things which don’t match well with our company or our culture.”
DAVID. I’d like to say I think that’s possible in some cases.
It really depends on the business.
But there are legitimate legacy reasons that you might have trouble taking that sort of purist approach.
DUCK. Right.
DAVID. It’s a fine aspiration, and it’s a fine way to couch things in your mind, you know, as that aspiration.
But I think, in reality, it’s going to be craggy and complicated.
Your freedoms of movement may not be what you think they are.
That’s OK.
I think, start there, and carefully and honestly defend to yourself where you need to make exceptions.
Doing that alone, you’re doing better than most companies will ever do with this cyber-maturity business.
Most companies really are explicitly deciding to receive that “invoice for a breach” – a neglected risk, essentially – later, after a breach occurs.
And that’s an expensive way to do things.
DUCK. It is.
So David, I don’t want this podcast to finish up feeling all pessimistic.
“Oh, it’s so terribly hard that we’re never going to get there, so we might as well not bother.”
What one thing can you think of that makes you feel upbeat about fighting back, if you like, against cybercriminals and state-sponsored attackers?
DAVID. Well, I think it’s that when you look at the breaches that have occurred, really across all time but especially recently…
Those breaches are of a tremendously unsophisticated nature, while remaining high-impact.
DUCK. Yes.
DAVID. That, to me, is a really good indicator that this is something we *will* one day be able to explain to businesses as an investment in their future.
Because it won’t be wizardry.
They won’t be lost in the conversation, or unclear about what it is that will cause them to lose lots of productivity and therefore lots of money.
It’s going to be really straightforward.
So that actually helps our cause, which is to say that the high impact of these incidents, while they remain fairly unsophisticated, is an indicator of, “We *can* do something about this.”
And you will be able to understand what it is we’re doing, as a business leader.
I think that gives me hope.
Unfortunately, I haven’t really seen that pan out yet, but I do think that people will slowly realize that this is part of the landscape, and that it’s not a wonky thing to achieve cyber-maturity.
DUCK. Maybe the cybersecurity world would be a better place if we truly put humans front-and-center of our efforts, and let the technology and the tools serve us behind the scenes?
Instead of letting the tools force change on our business that doesn’t necessarily give us the results that we want.
DAVID. Yes, sounds fair enough!
DUCK. Excellent.
Thank you very much, David, for your thoughtfulness, and for your careful consideration of these issues.
I’m delighted to say that this didn’t turn into a rant, and you were very conciliatory about the fact that some of these issues are hard to grapple with, and you can’t necessarily solve things in the purest possible way.
So with that, I’d just like to say thanks to everybody who tuned in and listened.
We really appreciate your support, so please Like and Share us on social media, and don’t forget to tell your friends, your family, your colleagues, and especially your boss about us.
While you’re about it, do pay a visit to solcyber.com/blog, where you’ll find a wide range of excellent articles on cybersecurity topics – no sales spiel, just community-centered advice that really helps.
And we’d also love you to pay a visit to solcyber.com/pricing, where you can find out just how easy it is to increase the value of cybersecurity in your business, while at the same time reducing the cost.
Once again, thanks for listening, and…
Until next time, stay secure!
DAVID. Bye, all!
Catch up now, or subscribe to find out about new episodes as soon as they come out. Find us on Apple Podcasts, Audible, Spotify, Podbean, or via our RSS feed if you use your own audio app.
Learn more about our mobile security solution that goes beyond traditional MDM (mobile device management) software, and offers active on-device protection that’s more like the EDR (endpoint detection and response) tools you are used to on laptops, desktops and servers:







