Cybercrime first turned into a global problem that affected all of us, directly and indirectly, about 40 years ago.
That’s when computers started to move out of server rooms, where they were guarded by a high priesthood of operators in white lab coats, and showed up on our work desks and in our homes.
The closely guarded computers in secretive server rooms never went away, of course; in fact, in today’s cloud-centric world, server rooms have become so large that we call them data centers.
But those personal computers have spread not only to our desks, but also to our backpacks and briefcases, in the form of laptops; and to our pockets and our purses, in the form of mobile “phones,” as we still rather quaintly call them.
Unsurprisingly, then, cybercrime has followed us, in new-fangled ways, into all these new-fangled places.
Cybercriminals and their country-level counterparts, known in the jargon as state-sponsored actors, are actively attacking our digital infrastructure at all times.
They’re after the cloud servers we rely upon; they’re after the laptops we use; and some of them are zealously focused on our mobile phones, even though those devices are often locked down by the vendor and the network provider for commercial reasons.
The good news, loosely speaking, is that the locked-down nature of mobile phones, combined with their stricter access controls, such as keeping apps away from each other’s data, does make them harder to hack for the average cybercriminal or state-sponsored attacker.
But difficult does not mean impossible.
The bad news, then, is that well-funded attackers who have the time and money to focus on mobile malware and spyware, and who are in the habit of keeping any bugs they find secret instead of disclosing them responsibly to the vendor so they can be fixed, frequently end up with mobile phone exploits they can use to implant malware onto chosen victims’ devices.
From then on, those attackers can keep their victims’ digital lives under constant surveillance, scooping up their emails and messages, tracking their location, listening in to their calls, taking surreptitious screenshots, and even activating the camera or microphone remotely to “bug” meetings invisibly.
As David Emerson put it in a recent Tales From The SOC podcast:
I think mobile phones appear unhackable [because they are] somewhat more compartmentalized, somewhat more limited, than a general-purpose compute device. But, at the end of the day, they’re a [computer] with that mask of constrained functionality on top of it for the purposes of convenience.
They’re no less capable of divulging secrets. And in some ways, because we carry them in our pockets, they actually collect far more about us which might be of practical use to an adversary.
Mobile phone adversaries aren’t always cybercriminals, and they aren’t always state-sponsored actors either, at least not directly.
Over the years, numerous commercial companies have appeared that have openly, and often legally (or so they have claimed), offered mobile spyware for sale.
Brands such as Mobistealth, mSpy, and StealthGenie, for example, have been pitched as tools for keeping track of underage children, typically on the assumption that parents would own and pay for the device, and tell their children that they had locked it down in various ways.
But these apps are also widely known as stalkerware, because they are commonly abused for snooping on spouses suspected of cheating, or for exerting control over partners in abusive relationships.
These vendors generally hide behind disclaimers that the purchaser is responsible for complying with privacy rules in their jurisdiction, and that the software isn’t intended for installation on someone else’s device without their knowledge.
But this sort of application deliberately drops out of sight once activated, and often requires only brief access to a victim’s phone to install without their consent.
No commercial spyware company, however, has created quite as much news as NSO Group, makers of the notorious spyware toolkit known as Pegasus.
The eye-watering fees alone clearly pitch it at governments and state-sponsored actors, but although would-be purchasers have to be approved by NSO Group itself, its customers have apparently ranged from democracies to theocracies, and from privacy-conscious countries to authoritarian states.
Unlike stalkerware vendors, who leave malicious purchasers to sneak the software onto their victims’ devices themselves, NSO Group focuses on so-called zero-click exploits, where an attack can be launched on a device without needing to wait for the user to do anything, or to lure them into doing it.
Phishing attacks, for example, often rely on the victim opening up an email or a text message (at least one click or tap), clicking through to visit a website (a second click), and then entering data or otherwise interacting with the rogue page (a third click).
Even basic phone-based scams rely on the victim answering a call in the first place, or listening back to a voicemail later, requiring some interaction and leaving at least some trace of how the scam was initiated.
But zero-click attacks are literally what the name suggests: they can be triggered without the victim clicking on, or tapping, anything at all.
Back in 2019, NSO Group found (or paid someone else to find and keep quiet about) a security bug in the WhatsApp messaging software that could be triggered just by placing a call to someone else’s device.
While the app was accepting the incoming connection, before the user answered the call or even realized that an incoming call was on the way, the exploit would activate, allowing the NSO Group’s attack code to implant spyware invisibly in the background, and take remote control over the device.
Fortunately, Meta, owners of the WhatsApp brand, managed to spot and inhibit this attack, and to send a warning to affected users (about 1400 in number, Meta said at the time).
In fact, Meta was sufficiently irate at the attack that it decided to take legal action against NSO Group, and that case ground slowly through the courts, even as NSO Group continued to make and sell new variants of its Pegasus spyware that were used to spy on victims around the globe.
NSO Group was ordered by the court in 2024 to hand over its WhatsApp attack code to Meta, as an understandable part of the legal discovery process, but failed to do so, justifying the comments of cynical IT publication The Register that the company was “fighting tooth and nail not to be held accountable for providing surveillance tools to government clients.”
The court was not amused, and issued a summary judgement favoring Meta in December 2024, agreeing with Meta’s argument that the “defendants have failed to produce Pegasus source code in a manner that can be used in [the legal case], failed to produce internal communications (i.e. email), and wrongfully imposed temporal limitations on their production/testimony.”
Earlier this month, the case went to a jury for deliberation; the jury found against NSO Group, and awarded WhatsApp compensatory damages in the curiously precise amount of $447,719.
That amount is supposed to settle the costs and losses incurred by Meta as a direct result of the infringement.
But the jury also decided in favor of hitting NSO Group with so-called punitive damages, which in US court cases frequently exceed the compensatory amount.
And exceed it they did, with the answer to the question, “What is the amount of punitive damages that NSO should pay?” declared as a whopping $167,254,000.
The original amount, in black ink, was evidently not written distinctly enough for the court, as the “1” in the amount was first written over, before being scrawled out and written freshly above in blue:
Meta’s own commentary following its victory admits that the company faces “a long road ahead to collect awarded damages from NSO,” as NSO Group’s unwillingness to comply with the court so far invites us all to infer.
But Meta insists that it plans to go after the money anyway, with the intent, it suggests, of making a “donation to digital rights organizations that are working to defend people against such attacks around the world.” (The percentage of any amount collected that Meta plans to give away is left unsaid.)
Of course, this case goes well beyond WhatsApp and Meta, and beyond those 1400 hapless victims targeted all those years ago.
Whatever you think of Meta’s own record on data collection and privacy, the company’s description of what emerged as a result of this long-running case is well worth reading:
[T]his trial put spyware executives on the stand and exposed exactly how their surveillance-for-hire system – shrouded in so much secrecy – operates. Put simply, NSO’s Pegasus works to covertly compromise people’s phones with spyware capable of hoovering up information from any app installed on the device. Think anything from financial and location information to emails and text messages, or as NSO conceded: “every kind of user data on the phone.” It can even remotely activate the phone’s [microphone] and camera – all without people’s knowledge, let alone authorization.
This trial also revealed that WhatsApp was far from NSO’s only target. While we stopped the attack vector that exploited our calling system in 2019, Pegasus has had many other spyware installation methods to exploit other companies’ technologies to manipulate people’s devices into downloading malicious code and compromising their phones. NSO was forced to admit that it spends tens of millions of dollars annually to develop malware installation methods including through instant messaging, browsers, and operating systems, and that its spyware is capable of compromising iOS or Android devices to this day.
Meta’s observation that attacks of this sort continue to this day (not least because NSO Group is one of several vendors selling spyware-as-a-service) is borne out by the news that, as recently as February 2025, the company was made aware of another successful zero-click compromise via the WhatsApp software.
We provided some detailed advice at the time, which we refer you back to now, including:
Learn more about our mobile security solution that goes beyond traditional MDM (mobile device management) software, and offers active on-device protection that’s more like the EDR (endpoint detection and response) tools you are used to on laptops, desktops and servers:
Paul Ducklin is a respected expert with more than 30 years of experience as a programmer, reverser, researcher and educator in the cybersecurity industry. Duck, as he is known, is also a globally respected writer, presenter and podcaster with an unmatched knack for explaining even the most complex technical issues in plain English. Read, learn, enjoy!