Join Paul Ducklin and SolCyber CTO David Emerson as they talk about mobile security in TALES FROM THE SOC.
In this episode: When it comes to cybersecurity, just how safe is your mobile device? Is it impervious to malware? Can it protect you from yourself?
Find TALES FROM THE SOC on Apple Podcasts, Audible, Spotify, Podbean, or via our RSS feed if you use your own audio app.
Or download this episode as an MP3 file and listen offline in any audio or video player.
[FX: PHONE DIALS]
[FX: PHONE RINGS, PICKS UP]
ETHEREAL VOICE. Hello, caller.
Get ready for TALES FROM THE SOC.
[FX: DRAMATIC CHORD]
DUCK. Hello, everybody, welcome back to Tales From The SOC.
I am Paul Ducklin.
I’m joined by David Emerson, CTO and Head of Operations at SolCyber.
Hello, David.
DAVID. Hey there.
DUCK. David, as we’ve done in the past, I’d like to use some recent articles from solcyber.com/blog as the basis, or the background, to this episode.
One is entitled Patches from Apple fixed privacy, code execution and lock screen bugs.
No zero-days, but a whole range of bugs, even though this is your secure mobile phone.
And another article entitled iPhones aren’t breach-proof – debunking the myth of iOS security.
We’re not saying Apple phones are insecure.
We’re just confronting head-on that myth that’s existed for decades.
First it was “Linux is unhackable,” then it was “macOS is unhackable,” now it’s “mobile phones are unhackable.”
And, of course, the truth is rather different to that, isn’t it?
DAVID. It is, for sure.
I think mobile phones appear unhackable to the extent that their functionality is somewhat more compartmentalized, somewhat more limited than a general-purpose compute device.
But, at the end of the day, they’re a general-purpose compute device with that mask of constrained functionality on top of it for the purposes of convenience.
They’re no less capable of divulging secrets.
And in some ways, because we carry them in our pockets, they actually collect far more about us that might be of practical use to an adversary.
DUCK. One thing that probably makes people assume that mobile phones are fundamentally secure, whereas perhaps their laptop isn’t, is the fact, as you say, that apps are obviously compartmentalized.
Because you can’t, without either the app specially supporting it or going in and turning on some very special settings, let one app read another app’s files.
But it doesn’t magically protect your data from any one of those apps getting into the wrong hands, does it?
DAVID. It does not, no.
At the end of the day, there’s nothing new under the sun, and your operational security is going to rest largely on far, far more practical concerns like human organization and behavior.
The fact that operating systems for mobile devices were developed much later than operating systems for general-purpose computing – you know, for your desktop, laptop, workstation, and so on – has given them an opportunity to do some things in the security realm that we probably wish we’d done in Unix in 1970.
But the reality is that the data is still there.
And the reality is that it’s still being entered at a human/computer interface.
It’s still *you* who is the custodian of that data, not necessarily the developer of that app and their ability to provide its safe transit, for example, or safe storage.
And a lot of users just conveniently forget that, because it’s much more difficult to accept self-responsibility for one’s operational security.
DUCK. [LAUGHS] That was very diplomatically put, David.
DAVID. [LAUGHING] The adversary of the average laptop, desktop, or workstation is probably addressing a more general or abstract target, which is why you see things like ransomware.
On the mobile device, because of the nature of the information it collects, it attracts an adversary that is after your location; it attracts an adversary that is after your iMessages; it attracts an adversary that is after your communications, which you presumed were encrypted in an app like Signal.
Those not only tend to be more sophisticated adversaries, but the attacks also tend to be less generic.
And because of that, they tend to be much more sophisticated attacks as well.
DUCK. At the same time, that can also lead to people going, “Oh, well, the crooks will never be interested in little old me, therefore I don’t have to care.”
And that, sadly, is not true either, is it?
You might be targeted just because you happen to be the person through whom important information passes, like who’s arranging meetings with whom and when; who’s asking legal questions; who’s signed what deals at what time.
It’s not guaranteed that if you’re not the CEO, you can just stand down from blue alert.
DAVID. You could easily be a conduit or proxy for the CEO.
And you should consider that possibility, whatever form or relationship that takes for you.
But the message is the same, and we’ve said it before, that culture is really important in cybersecurity.
The thing that you will do consistently; the thing that you will do well – those are the things that will actually protect you.
It’s no different here.
The operational security is provided by your ability to consistently consider the things that you’re protecting; consistently consider the vulnerabilities and surface area you have.
And I’m not saying that in a technical manner necessarily.
The expression of you considering what you have at risk, of you evaluating your risk, is not a technical process.
We have a list of 23 things that we recommend mobile users do, as part of our mobile defense product.
We actually do not expect that they do all 23 of those things; it would probably not be appropriate for most of our customers to do all 23 of those things.
If some of them were carrying classified information on their phones, now we’re talking – do all 23 of those things.
But most of our customers are not, so it really is going to depend.
DUCK. I’m looking at that list right now, David.
It’s essentially a list that you should at least confront, and decide which bits of it should apply where in your organization.
And I really like that approach.
Because otherwise, as we’ve said in previous podcasts, if you just try and have a one-size-fits-all policy, you’re just going to press the “shadow IT” button, aren’t you?
So it is, as we’ve said before on the podcast, like most things in security, a journey, not a destination.
DAVID. And it has to be informed by your actual risk.
Risk analysis is a practical exercise, not a technical exercise.
It’s the same thing with disabling Bluetooth, or disabling JavaScript, or reviewing app permissions and setting them appropriately.
“Appropriately” depends on your risk profile.
And, really, only you can evaluate that directly.
Now… with the assistance of others; I’m not saying that there’s no one that can help talk you through that.
You don’t have to be a risk expert, but you do have to actually think through it for your particular scenario.
DUCK. And that’s part of the SolCyber service and solution, isn’t it?
“Here are 23 things that you can do.”
“Some of them you almost certainly should do, but there are some that just aren’t going to work for you.”
Let’s work out what the threat and response matrix is across the organization, in a way that people will *want* to comply, and they *can* comply.
And best of all, it means that then, if they realize that their phone isn’t compliant, they won’t feel that they should keep quiet about it because it would be a terrible hardship if they owned up.
You want that last pair of eyes, don’t you, instead of just trying to automate everything?
DAVID. Yes!
As the practitioner, and as the person in charge of the risk, or the stakeholder, you do want the full list of possibilities.
Be ready to dismiss two-thirds of it because it’s not relevant to your particular risk profile, and then take seriously the one-third of it, in some comprehensive way, that will actually be practical for you to maintain.
DUCK. Absolutely.
Because by definition, preparation is only something you can do in advance.
You know, we see this often with ransomware attacks: “If we get hit, it’s going to be so traumatic, we may as well just wear it and we’ll just make it up as we go along.”
It’s very, very hard to do that.
And it’s effectively impossible to do it well.
But if you could have prepared, what a terrible thing if you didn’t?
Because you can’t do it while all your computers are showing flaming skulls on their wallpaper all around you.
DAVID. [LAUGHS] Certainly not then… you can’t do it then!
And then you kind of get into the stuff we’ve discussed in previous podcasts about intentional design.
Maybe if you’re developing that known risk by doing the thing that you’re doing, you ought to reconsider doing that thing?
DUCK. Yes.
DAVID. You know, it’s possible that turning off Bluetooth is too difficult for you.
But if it’s a thing you really think you ought to do still, why do you feel like you ought to do it?
What activity are you participating in that turning off your Bluetooth now is the way you’re going about it?
And maybe you need to rethink doing that activity.
You know, if you’re a company and you think, “Oh, boy, there’s no way that we could apply HIPAA to our processes!”
Well, what are you doing collecting health records?
Let’s ask first why we actually have this sort of feeling that we should be applying HIPAA to our methods or to our business processes.
So, you have to do that reconciliation.
And that is not a technical activity.
It’s oftentimes, in the context of business, a business strategy activity.
DUCK. A lot of IT is stick-wielding, isn’t it, still?
It’s like we’ve almost gone full circle and we’re back to the 1990s where, “Oh, it’s all so hard.”
“The attack surface area is so huge that we’re going to check all the boxes in the matrix and apply it, and that’s it.”
“And if it makes your work harder to do, you’re just going to have to wear it.”
DAVID. Yes.
I think that, you know, whenever I use the carrot-and-stick metaphor, I always think also of the relative spacing of the carrot and the stick.
That’s important too.
You don’t want the carrot to be so elusive, and you don’t want the stick to be whacking the victim because they made the very tiniest mistake.
They have to be appropriately spaced so as to be taken seriously, relatively speaking.
I don’t know if I expect users to have technical investigative ability or interest in a broad sense, but I absolutely do expect them to know generally what they ought to be doing, and what they ought not to be doing.
You see that sort of thing, especially in very sensitive environments where there is a community understanding that certain kinds of communications simply must occur over certain kinds of channels.
Classified information, for example, or maybe controlled unclassified information.
Those communities tend to have understanding, even if it’s not a technical understanding.
They have an understanding that they ought not be emailing something.
And so that’s sufficient awareness.
I think that’s healthy.
And to the extent that the equivalent of that does not exist in your environment, it is incumbent on you to build that as the administrator of your environment, or to build it as the stakeholder of your environment, or the stakeholder in the risk which is at hand.
DUCK. You said this very presciently a month ago in the last podcast, didn’t you?
“The fact that you’re using an app that has the name Signal, which is known to be secure end-to-end and can’t recover what you’ve talked about…
…doesn’t protect you from yourself.”
DAVID. [LAUGHS] We couldn’t have asked for a better example, in the wild [SIGNALGATE], of the stuff that we talk about all the time in this podcast; in our knowledgebase documents; in our suggestions to customers.
Clearly, the stakes are not, for most of our customers, as high as they were in that Signal thread.
At the same time, you know, no matter what you think of the content of that Signal thread, whether it represents a massive breach or not, the facts are that it was a non-technical problem.
It was a human operational problem, and no app of any security level is going to stop you from communicating with someone that you willingly added to that chat.
That really just isn’t how the world works, and it’s a human problem – it is not a technical problem.
That’s the culture you need to build.
It’s a culture that realizes, “Hey, we’re about to talk about something that is potentially sensitive; we probably ought to use an appropriate channel.”
DUCK. When you have a breach of that sort, even if what you’re talking about was not of significant risk, it paints a very poor picture for your customers in the future, doesn’t it?
“What if I need to talk with that company about something much, much more serious?”
“What if they make the same kind of blunder?”
It’s not an easy problem to solve, but it is one that requires human commitment, human involvement, and, like you said, well-positioned and well-spaced carrots and sticks.
You can’t do it with rules; you can’t just do it with regulations.
You actually need everybody to buy into it, and to want to stay bought into it, don’t you?
DAVID. Yes.
And to want to focus on, to maintain the focus on, those human issues, those operational security issues.
DUCK. Having said that there is a human-centric solution to all of this, it’s still worth mentioning the SolCyber mobile protection product, isn’t it, David?
Traditionally, people have figured, “Well, we’re just going to enforce a set of rules that we think is right.”
“And because we can do that with MDM, mobile device management, that’s good enough.”
But there is still a place for more of what you might call EDR, or endpoint detection and response.
There is a place for that on mobile devices, despite the perhaps better security that mobile devices have compared to laptops.
Even if it’s a site that you’re supposed to be allowed to go to, or an app that’s allowed to use your camera, you can scan a QR code and end up in harm’s way.
So there are some technological things you can do, as long as you don’t decide, “Hey, well, I’ll just buy a mobile phone EDR-style product, and then I don’t have to engage my people anymore because it’ll just be all automatic.”
It’s a little bit of both, isn’t it?
DAVID. We developed the product because it *is* a little bit of both.
And the product itself operates in an ecosystem, so it’s defense in depth, like all other things.
The Google Play Store, for example, is really good at catching malware that has been out in the wild for a few days.
It’s not really good at catching malware that came out 10 minutes ago.
DUCK. [LAUGHING] It’s very good at detecting malware that it doesn’t miss.
DAVID. [LAUGHS] Yes.
What I’m really pointing out is that there are novel threats at all times.
Go to MalwareBazaar, which is a website, and you’ll see APKs, for example, to install on an Android phone, of all kinds of different payloads.
If you pick ones that are simply too new, Google Play Store is not going to capture those.
If you have a device that has developer mode on and you’re sideloading this with a USB cord or something, I mean, there are all kinds of ways to circumvent Google Play noting that you’ve installed malware on your device.
And you’re kind of in a similar situation on Apple iOS.
It’s a little bit more locked down, but nonetheless, there are ways to effectively sideload malware if you’re hardened enough, or dedicated enough to the cause.
DUCK. Absolutely.
And, David, there’s also the problem that even for what you might call run-of-the-mill cybercriminals (in other words, not state-sponsored actors), there’s an obvious interest in continually submitting apps to the App Store until you find the magic trick that gets you past the automated tools and into the store (because that’s what they are – you don’t get a human looking at every new app; there are too many of them).
And then you can rush in a whole load of fraudulent apps, knowing that, if they get removed in three days, you’ll still have got victims to sign up and pay a $49 charge that they’re not now going to get refunded, or you could still have stolen information off the device that you can sell on for other crimes later.
Some bunch of people are always going to be the first.
So the idea that, “Oh, well, three days later, Google Play or the Apple Store blocked it.”
To them, that’s very cold comfort, isn’t it?
DAVID. It should be.
Because you’re also relying on the blocking being comprehensive at that three-day time horizon, which it’s not.
That’s just kind of an example, right?
DUCK. Yes, because first somebody has to notice that the automated systems have failed.
DAVID. Yes.
And the MalwareBazaar genre of exploits is meant as a stand-in for most of the less sophisticated ones.
Not nation-state attacks; when you get into the nation-state attacks, then you have an entirely different problem.
There aren’t a lot of people uploading Pegasus to MalwareBazaar.
DUCK. And every build of Pegasus is different anyway, so good luck with that. [LAUGHS]
DAVID. Yes.
And so you’re definitely not going to stop it with that, “Oh, it’s been out three days, and everyone knows the hash, and it’s got a signature that we can predict.”
You’re not going to stop it that way.
And so that’s the other reason to have this kind of defense-in-depth.
The notion that a nation-state might be after you, or that there is a category of attack outside of this three-day time horizon where it just isn’t going to get caught, is really where something like mobile EDR would be helpful to your organization.
It’s something where that defense in-depth comes into play.
Because, especially with those considerations…
Let’s say you have a hundred thousand employees with mobile devices. It wouldn’t be crazy to think that one or two of them are running some kind of sophisticated exploit.
Maybe not even having been directly targeted, but indirectly targeted.
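[As an aside on the hash-matching point above: a one-byte difference between two builds of the same payload produces completely unrelated SHA-256 digests, which is why a blocklist keyed on a known hash says nothing about the next build. A minimal Python sketch – the payload bytes are, of course, made up for illustration:]

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of some bytes as a hex string."""
    return hashlib.sha256(data).hexdigest()

# Two hypothetical "builds" of the same payload, differing only in a
# per-victim identifier patched into the binary.
build_a = b"payload-code..." + b"victim-0001"
build_b = b"payload-code..." + b"victim-0002"

# The digests share no useful similarity, so knowing the hash of build A
# tells a signature-based blocker nothing about build B.
assert sha256_hex(build_a) != sha256_hex(build_b)
print(sha256_hex(build_a))
print(sha256_hex(build_b))
```

[This is the avalanche effect of cryptographic hashes working against the defender: exact-match detection only ever catches the samples that have already been seen.]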
DUCK. Absolutely.
And it doesn’t have to be especially sophisticated.
It just has to be one that hasn’t been blocked by the vendor stores yet.
DAVID. Right.
Exactly.
DUCK. It could be unsophisticated, but just sneakily and sufficiently different because the crooks are trying over and over again until they hit the jackpot.
It’s worth their while to do that, isn’t it, even if they’re not the developers of Pegasus?
They can try over and over and over, and when they get lucky, that luck could hold for quite some time if nobody notices.
DAVID. Yes.
It could be resident on a device that gives them access to things they never even were aiming for, but now they’ve got those things and they’re going to monetize them.
DUCK. And it also absolutely signals to the crooks that this thing really is through the defenses; nobody’s spotted it yet.
So that’s the one that they should put the fraudulent advertising money into, to try and spread it as widely as possible.
DAVID. Yes.
DUCK. So David, I’m conscious of time.
Why don’t you tell our listeners where they can go to find out more about SolCyber Mobile Protection?
DAVID. At solcyber.com/blog, but also just solcyber.com, broadly.
You will see our mobile protection services highlighted on the front page at the moment.
DUCK. Or just go to solcyber.com/mobile-protection if you want to type in the full URL!
DAVID. That’s too much typing for me.
DUCK. [LAUGHS] David, thank you so much for your time.
It’s great to hear someone who’s got the passion both for technological solutions that can really help, and human-centric solutions that can build a better cybersecurity culture for everybody, raising the bar all along its length.
So, thanks so much for your time.
Thanks to everyone who tuned in and listened.
Don’t forget that if you would like to read more articles on mobile phone security, there are plenty available on solcyber.com/blog.
Thanks for listening, and until next time, stay secure.
Learn more about our mobile security solution that goes beyond traditional MDM (mobile device management) software, and offers active on-device protection that’s more like the EDR (endpoint detection and response) tools you are used to on laptops, desktops and servers.