Join Paul Ducklin and SolCyber CTO David Emerson as they talk about the human element in cybersecurity in our podcast TALES FROM THE SOC.
In this episode: When governments insist that we need to regulate encryption algorithms, is that a practical step, or just intellectual laziness?
Co-hosts Duck and David tell it like it is!
Find TALES FROM THE SOC on Apple Podcasts, Audible, Spotify, Podbean, or via our RSS feed if you use your own audio app.
Or download this episode as an MP3 file and listen offline in any audio or video player.
[FX: PHONE DIALS]
[FX: PHONE RINGS, PICKS UP]
ETHEREAL VOICE. Hello, caller.
Get ready for TALES FROM THE SOC.
[FX: DRAMATIC CHORD]
DUCK. Hello, everybody.
Welcome back to TALES FROM THE SOC.
I am Paul Ducklin, joined by David Emerson, CTO and Head of Operations at SolSaber.
DAVID. Yes… SolSaber!
I like it!
DUCK. SolSaber?! [LAUGHS]
DAVID. En garde!
Could be light sabers as well – that’s even cooler. [LAUGHS]
DUCK. [MOCK EMBARRASSMENT] I’ll start again.
[APOLOGETIC COUGH]
I am Paul Ducklin, joined by David Emerson, CTO and Head of Operations at SolCYBER.
David, I’d like to talk about something that’s been all over the news.
We’ve actually covered it on solcyber.com/blog.
And it became an even bigger issue when, in the UK, Apple recently said, “You know what? If you want us to put a backdoor in our iCloud encryption [APPLE’S ADVANCED DATA PROTECTION FEATURE], we’re not going to do that – we’re actually going to make it unavailable for people in the UK.”
So the topic I’d like to dig into is end-to-end encryption.
What is it?
Why do we need it?
How does it work?
And why does it seem to present such a dilemma for governments, and is their way of fixing it going to work?
So let’s start with the idea of end-to-end encryption.
We’re not talking about BitLocker, which just protects your hard disk while it’s turned off.
We’re talking about things like HTTPS that protect your web browsing.
What’s that for?
DAVID. Well, it’s for a number of things.
I think, more to the point, it became possible.
It is computationally inexpensive today, thanks to a number of routines that are built into hardware, and to more efficient software.
At this point, in transit, it’s very inexpensive to encrypt and decrypt strings, or data, or whatnot.
And it is something that therefore becomes effective because of its ubiquity.
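To put a number on “inexpensive”: here is a rough, illustrative benchmark, a minimal sketch that assumes the third-party Python cryptography package. Its OpenSSL backend automatically picks up hardware AES instructions (such as AES-NI) on most modern CPUs, which is the “built into hardware” point:

```python
# Illustrative only: measure bulk AES-GCM encryption speed.
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os
import time

key = AESGCM.generate_key(bit_length=128)
aesgcm = AESGCM(key)
nonce = os.urandom(12)                  # 96-bit nonce, standard for GCM
data = os.urandom(64 * 1024 * 1024)     # 64 MB of random "plaintext"

start = time.perf_counter()
ciphertext = aesgcm.encrypt(nonce, data, None)
elapsed = time.perf_counter() - start

print(f"Encrypted {len(data) >> 20} MB in {elapsed:.3f}s "
      f"(~{len(data) / elapsed / 1e9:.2f} GB/s)")
```

On typical modern hardware, throughput on the order of gigabytes per second makes the cost of encrypting a web request effectively invisible.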
DUCK. If we talk about HTTPS…
DAVID. Yes.
DUCK. That’s just HTTP, the Hypertext Transfer Protocol, with a secure layer [TLS, TRANSPORT LAYER SECURITY] on top.
That means that whatever you generate in your browser (obviously you have to type in, say, the actual username you want to log in as) will always, with HTTPS, be encrypted before it leaves your browser.
And it won’t be molestable, extractable, decryptable, viewable along the way…
…until it gets right to the other end.
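To make that concrete, here is a minimal sketch of what “a secure layer on top” means in practice, using Python’s standard-library ssl module; the hostname and request are placeholders, not anything from the episode. Nothing written after the TLS handshake travels in the clear:

```python
import socket
import ssl

context = ssl.create_default_context()   # verifies the server's certificate

with socket.create_connection(("example.com", 443)) as raw_sock:
    # Wrap the plain TCP socket in TLS *before* sending any HTTP bytes.
    with context.wrap_socket(raw_sock, server_hostname="example.com") as tls_sock:
        # Everything sent from here on is encrypted on the wire.
        tls_sock.sendall(b"GET / HTTP/1.1\r\n"
                         b"Host: example.com\r\n"
                         b"Connection: close\r\n\r\n")
        print(tls_sock.recv(4096).decode(errors="replace"))
```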
And like you said, that used to be considered quite hard.
DAVID. Yes, that used to be expensive.
At the very least, it used to be something that you would not expect every transaction to be burdened by.
DUCK. Right.
DAVID. In my lifetime, certainly, it’s never been intractable, but it is certainly something that has become just transparent to the end user – ubiquitous to the point of assumed.
And that’s a good thing; that’s a very good thing.
I kind of began in the middle here, I think mostly because I don’t really consider end-to-end encryption some kind of singular good.
I think it’s almost a feature that has become so ubiquitous, and it’s good because it’s ubiquitous.
But it isn’t in itself the only solution to any given security problem.
It’s not like it’s some kind of singularly important solution.
It’s just really nice to have as background radiation.
DUCK. And the advantage of having HTTPS all the time, instead of just occasionally, is it means that you can’t realize later, “Oh, no! Of all the packets I sent, that one should have been encrypted and I forgot.”
DAVID. Absolutely.
End-to-end encryption… my point is certainly not that it’s useless, or to downplay it.
But it’s important for cultural reasons, if nothing else.
We should be encouraging people to take practical, inexpensive steps to protect their information.
End-to-end encryption is practical and inexpensive in 2025.
The Signal app doesn’t cost you anything; you should use it.
Everyone should be using HTTPS websites.
The unencrypted website?
It may not necessarily be malicious, but it is absolutely a sign of egregious neglect in 2025 to have an unencrypted website.
HTTPS is also technically important.
It’s a good, honest man’s lock, which is going to prevent a lot of really basic sniffing.
But on its own, it’s not a singular good.
End-to-end encryption is great, but there’s still a physical input layer.
Back in the 1970s, I think it was, Russia was spying on the United States using our Selectric typewriters, and they did so by installing a magnetic pickup in them, essentially a bar that looked like nothing in particular.
These typewriters were subjected to all manner of discovery and analysis, because the Russians regularly seemed to have direct access to orders that were being typed, and memos that were being written, and whatnot, on these Selectric typewriters.
It took years before someone finally realized that there was one extra coil in that typewriter, a passive-looking coil that was basically radiating signals related to the rotation of the ball on a Selectric typewriter.
The point is, end-to-end, there’s still an input layer.
There’s fundamentally this problem of crypto-fantasy versus reality, and that’s kind of the “$5 wrench” problem.
If someone wants something from you that they know you have in your mind, and really wants it… they’re not going to be decrypting your Signal conversations.
You might imagine that you’ve encrypted yourself into oblivion from their perspective, but what they’re going to do is go to Harbor Freight [A US RETAIL OUTLET FOR TOOLS] and buy the heaviest pipe wrench, and hit you until you tell them what they want to know.
DUCK. [LAUGHS] It’s almost like you’ve been reading XKCD.
I forget the number [XKCD 538], but that is a famous XKCD, isn’t it?
DAVID. Right, yes… but it’s not that end-to-end is useless. [LAUGHING]
DUCK. At one end or the other, just before the data goes into a secure process, or just after it comes out of it, before the other end notices, then you have every chance, don’t you?
Which is, I guess, why crooks really love malware that sits on one end of an encrypted link, like on a spyware-hacked phone.
Because then end-to-end encryption is kind of irrelevant, isn’t it?
Because they are one of the ends – they see the data before it goes in, or after it comes out.
DAVID. Yes.
And some of the solutions to those problems can be simpler still than end-to-end, right?
Like the practice of “not knowing things” is perhaps the simplest of all, and mitigates the $5 wrench problem.
It’s fundamentally protective.
But a lot of those solutions are less practical.
Endpoint hardening, for example.
You brought up mobile OSes…
…yes, that could solve some of your input layer issues, but good luck getting everyone at your company to run GrapheneOS.
DUCK. It’s very dark.
DAVID. [LAUGHS]
DUCK. I found it so dark that it made me feel dark, and I’m quite keen on security, so I get your point…
…if you make it too difficult, then people will just find a shadow-IT way of doing something that does a complete end-run around not only the encryption, but also any endpoint security they’ve got in place.
DAVID. Yes.
Also, if you’re using Signal, don’t assume that you’re protected because of the word Signal.
You’re still typing something into a phone before it gets encrypted; you’re still having that thought in your head before it gets typed into the phone.
So the “estate of vulnerability” is sometimes not really fully inventoried.
DUCK. So, go for an environment in which you have built what you might call (and it sounds corny to say it) a cybersecurity culture where humans feel inclined to look after their own data, and after other people’s, on the grounds that they’d expect the same thing in return.
DAVID. Or solve those problems another way.
If you’re working in a culture that is pragmatic, you might ask yourself the question, “Does anyone need to know this data?”
“Why are we even sending this data? Why does this data exist or why is this data contemplated in this system?”
I’m not saying nobody should communicate ever, because obviously the world is built on communication, and the exchange of ideas, and whatnot.
But, at the end of the day, parsimony can go a really long way toward not having to kill yourself with security measures.
And that was our “Data Lake” discussion, right? [PODCAST S1 EP008 – THE ILLNESS OF EXCESS]
“Think about how many fields you can collect…”
Well, great – if you can think of data, you can probably collect it in 2025.
But maybe you just shouldn’t.
Maybe that discretion will save you the knock-on effects of having to secure all that stuff that you collected for perhaps unclear reasons.
DUCK. Yes, because you touched on that in the Data Lakes episode, didn’t you?
Talking about how a company may have collected stuff that they figured, “Hey, we could use this for sales or marketing.”
“Maybe we could even use it for scientific research in the future, so we’ll just put it in the cupboard somewhere and we’ll be reasonably careful with it.”
And then along come upgraded regulations, like HIPAA, or GDPR, or something like that.
Regulations that existed in one form, and then maybe they get a little bit strengthened.
And suddenly, you’ve just acquired this massive cost of having collected that data in the first place.
DAVID. Yes!
You touched on the political aspect earlier as well.
And you actually see that in end-to-end encryption itself, even in the marketing of end-to-end encryption.
In security generally, you won’t find a lot of practitioners taking seriously for-profit end-to-end encryption companies that are based in Australia, or the US, or the UK, or Germany.
And the reason is really simple.
All of those countries have laws to which for-profit companies would be subject, or would face some downside for not complying with in that jurisdiction, and those laws would counteract the advantage of end-to-end encryption.
Politically, I really like that the Signal Foundation, and the OpenBSD Project, and other serious security companies and organizations are very open about how they structure their corporate entities.
They’re very open about being in Canada, or being in Switzerland, or being non-profit, or whatever.
Just explicitly to avoid these kinds of influences and jurisdictions.
Because it does call out the absurdity of trying to stop privacy, which is really what the US, and the UK, and Germany, and Australia are trying to do, while still offering the fundamental service of privacy to those who need it, or think they need it, or want it, or just want to support it.
You know, I donate to Signal.
I’m not a dissident; I’m not a journalist.
But I think it’s really important that they exist.
I love that they’re non-profit.
They are coincidentally, or notionally, based in the United States, but they don’t have the keys, and they make no money off of the product.
So to the extent that the US decides to enforce any kind of middling on them [MITM – MANIPULATOR IN THE MIDDLE SNOOPING], they can just tell them to take a hike.
They have no real skin in the game, which is great.
I think that’s exactly how things should be structured.
DUCK. Yes, because the standard counter argument is, “Well, if you’ve got nothing to hide, you shouldn’t have anything to worry about if the government, or a private company, or anybody, is allowed to insist that you decrypt your data.”
And the problem with that is that it’s not true, is it?
We all have things that we ought to be allowed to hide, simply because they’re ours and they’re nobody else’s business.
And many or most of us have things that we have already entered into a contract with someone else to say that we will never disclose.
For example, the PIN on your bank card.
So this idea that you’ve got nothing to hide… it’s not really a question of hiding, is it?
It’s more a question that there are some things that will be highly disadvantageous – not just to you, but possibly to everybody else, including other law-abiding people – if they get out.
Why should you not be allowed to protect that stuff?
DAVID. Absolutely.
The idea that “you’ve got nothing to hide,” I see almost in a different light, in a practical sense.
I would consider myself, you know, someone that doesn’t have much to hide.
There’s nothing in my Signal that I would have a problem with revealing in a court of law.
But there’s a lot in my Signal that I don’t want in public; there’s a lot in my Signal that I would not want to broadly reveal without a reason.
And so I feel like “nothing to hide” is almost a reference aspiration.
Act as though you have nothing to hide, and you will be free of the concern that you might have something to hide.
It’s good security practice.
But some people do, as part of their jobs, have things to hide.
Some people do, as part of agreements that they’ve made, have things to hide.
So, to your point, it isn’t realistic that people have nothing to hide – it’s just a reference, you know, of how to be less vulnerable.
DUCK. Well, it seems to be a kind of irony, doesn’t it, looking specifically at a country like the UK, which just happens to be where I live…
On the one hand, we have a public sector body like the Information Commissioner’s Office urging people to take data security seriously; urging them to use encryption reliably, strongly, and properly to protect the privacy of other people, as required by law.
And then, on the other hand, we have other public sector bodies saying, “Oh, no, no! We should have encryption, but it shouldn’t be too strong.”
It’s kind of like there’s this fundamental tension inside the public sector itself.
One side is saying, “You’d better start taking your customers’ privacy seriously, because we’ve had far too many breaches.”
And the other is saying, “Yes, but you’d better not do it very well.”
It’s an unresolvable problem, that, isn’t it?
They have this conflict of interest with one another.
DAVID. It’s a misunderstanding of freedom, I think.
It’s a misunderstanding of technology, to some extent.
The same argument is used for cryptocurrencies.
The notion of regulating cryptocurrencies… it’s almost hilarious.
They’re not going to; it’s not possible mathematically.
It isn’t going to be done by any government of any size in any real sense, right?
No one is going to stop me if I want to move my cryptocurrency from one figment of math to another figment of math.
There will be no regulation of that.
So, I find it laughable in the same way that I find the whole debate about, “Oh, well, encryption should be strong, but not so strong.”
It’s ridiculous.
You’re not going to stop people from using an algorithm; you’re not going to stop people from using a method.
DUCK. Exactly.
DAVID. If you truly believe that that is what your investigation hinges on… I’m not saying it never has, but as a system, you need to improve your investigative measures.
Or you need to improve the protections that are in place to prevent the crimes that are occurring that you find yourself unable to investigate.
DUCK. Exactly.
DAVID. It’s just silly.
It’s trying to govern the ungovernable.
DUCK. Yes, and this idea of, “Well, let’s have a weakness; let’s have a backdoor in there, but where the government will keep it to themselves, or where anybody will keep it to themselves.”
Well, it didn’t work last time.
DAVID. [LAUGHS]
DUCK. It didn’t work the time before last.
DAVID. It never works.
DUCK. Because it only takes one blunder, doesn’t it?
That’s the whole problem.
DAVID. Yes.
To the degree that it’s genuine, and not just an attempt to swell one’s laws with impractical enforcement measures, I think it’s just intellectually lazy.
I think what’s really happening is that someone doesn’t understand the technology, or the sources of this information.
In fact, they probably ought to focus their efforts somewhere else.
It may be possible to stop the crime that they perceive occurring, but it’s not going to be possible to do it by making proclamations about the strength of one’s encryption.
I think that, unfortunately, there are plenty of people that are real investigators that are willing to pick apart the blockchain, but at the same time the political tagline is, “Regulate crypto.”
And it makes very, very little sense.
DUCK. Yes, because a backdoor for one is a backdoor for all, isn’t it?
DAVID. Absolutely.
DUCK. As we know, secrets that apply to global systems are impossible to keep.
This idea of regulating weakness into crypto didn’t work last time, or the time before, or the time before.
DAVID. It’s definitely not going to work. [LAUGHS]
DUCK. In fact, you could argue that, in the 1990s, it basically ruined the cryptographic software industry in the US.
You couldn’t write cryptographic software for export, whereas people in the US could legally buy strong encryption technology from companies outside the US and use it inside the US.
It seemed like a bit of an own goal, and finally, in 2000, it was, “Yes, well, let’s not do that.”
But I guess this is one of those things that comes full circle every few years, isn’t it?
DAVID. It is.
It’s not limited to technology.
If you live in a country where it’s technically illegal to encrypt your own data, you should move.
DUCK. [LAUGHING] That may be easier said than done.
DAVID. [LAUGHS] There are many, many countries where the services are backdoored.
The US is one of them – I’m certain that many of the services I use offer a backdoor to the government for the purposes of investigation.
The UK is another; Germany is another; Australia.
There are countries where your data is fundamentally not safe, if it’s hosted by a company in that jurisdiction.
That can be OK, because those countries still fundamentally offer some form of freedom to encrypt.
You just might have to effect it yourself.
So it may mean encrypting the data prior to uploading it, encapsulating your own layer of encryption inside the presumably backdoored one.
That’s one option.
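As a concrete illustration of that first option, here is a minimal sketch of encrypt-before-upload, assuming the third-party Python cryptography package; the sample data is a placeholder. The cloud provider only ever sees ciphertext, and the key never leaves your possession:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this key offline; the provider never sees it
cipher = Fernet(key)

plaintext = b"meeting notes: nothing secret, but nobody else's business"
ciphertext = cipher.encrypt(plaintext)

# Upload `ciphertext` however you like; without the key it is opaque,
# backdoored service-side encryption or not.
# Later, anywhere you hold the key:
assert cipher.decrypt(ciphertext) == plaintext
```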
Another option is not to use quite so much cloud.
Why do you need to have that data uploaded to a service where it is available for investigation?
Why do you need to have that data perhaps transiting a service of unknown configuration?
It’s like the public Wi-Fi problem.
I’m not saying never go into Starbucks and use their Wi-Fi; I’m just saying, maybe don’t go into Starbucks and use their Wi-Fi to transit classified information.
Discretion can go a long way, just as it can for database design.
And so the things you do to protect yourself, they might not be particularly technical.
They might consist of, “I’m not going to transmit this data over this service.”
That’s very, very simple.
That would be my advice – just general awareness.
Don’t get too deep into the techno-babble because it isn’t necessarily a technological problem.
It’s more of a logical issue.
DUCK. You’re right.
Because I suspect that what’s enticing about cloud services like iCloud, for many people, is not whether or not it provides encryption.
It’s a way of having an offsite backup – that’s good.
It’s also a way of, when you need to or want to, just sharing your data further, and wider, and more quickly than you otherwise would be able to.
As you say, if they didn’t bother with the cloud in the first place, they’d have a little less convenience, but a lot more security.
And in cybersecurity, a little inconvenience can go an awful long way, can’t it?
DAVID. Oh, certainly.
It’s all about “a little inconvenience.”
The thing that you will do regularly and consistently – that’s the thing that will keep you secure.
And that is very, very little in this world.
What almost all vulnerability hinges on is that inconsistency; that one time that you didn’t do something correctly.
And if you really do have something to secure, thankfully, there are plenty of movements for you, right?
There’s Signal for messaging, used by all manner of political dissidents.
There’s the self-hosted cloud movement.
A lot of the fediverse is making popular the notion of self-hosting services, rather than using public shared social media.
Well, that goes for file shares too.
If you’re doing something really potentially objectionable in your country, maybe you should self-host it?
It doesn’t solve the $5 wrench attack, and if investigators really believe that you’ve committed a crime, you might still end up divulging those keys.
But if nothing else, for the civilian non-criminal, it’s a vote in favor of, “I actually take privacy seriously, and other people do too, people who are not criminals who aren’t trying to get away with anything.”
DUCK. Well said!
I think that’s a great place to end, David, because we’ve got quite excitable and carried away today.
So, thank you so much for your time.
This is quite an emotive, or an emotional, topic, as well as a political one, not just a technical matter.
If you enjoyed what you heard here, then don’t forget to head to solcyber.com/blog, where we have a long form article about end-to-end encryption ready for you to read.
Thanks for listening, and until next time, stay secure.
Catch up now, or subscribe to find out about new episodes as soon as they come out. Find us on Apple Podcasts, Audible, Spotify, Podbean, or via our RSS feed if you use your own audio app.