If you’re one of the 1.3 billion people using Facebook Messenger, you need to switch to an alternative. Facebook has suddenly confirmed significant delays to much-needed security enhancements to the platform, enhancements that its own executives say are “essential.” Here’s what you need to know.
“The lessons of the past five years make it absolutely clear that technology companies and governments must prioritize private and secure communication.” So said senior Facebook exec Will Cathcart in a Wired opinion piece this week.
Cathcart currently heads WhatsApp, and his article focuses on the need for end-to-end encryption to be protected. He’s absolutely right. Such encryption is “essential,” there is “serious pressure to take it away,” and it “should not be taken for granted.”
I have warned users before to quit Facebook Messenger for alternatives. Beyond its lack of default end-to-end encryption, the platform is also open to content monitoring by Facebook itself, and I have also reported on other serious issues with its handling of your private data.
Now, this week, we have seen three separate events, all of which give you every reason you need to make that change and quit Messenger. First, Cathcart’s rallying cry for users to use platforms with end-to-end encryption in place. Second, Facebook admitting that such security will not come to Messenger until sometime in 2022, at the earliest. And, finally, another story on Facebook’s data mishandling.
Prior to his WhatsApp role, Cathcart oversaw the development of Facebook’s apps. And so it’s somewhat ironic that the publication of his article coincided with Facebook’s latest data disaster—the online release of 533 million user records. Leaking user phone numbers isn’t really the best ad for “privacy and security.”
The data in this latest breach escaped from Facebook some years ago—it has been documented before, as have other such hyper-scale Facebook breaches. The issue wasn’t so much the data exposure this time, but rather the response. Facebook has been heavily criticized for playing down the seriousness of this data exposure and for not informing all of those impacted users.
Back in 2019, I reported on a vulnerability that allowed user phone numbers to be pulled from Facebook databases at scale. At the time, the company admitted the flaw but nothing more, telling me it was a complex and unlikely exploit. Further irony, then, in Facebook using my 2019 story “as evidence that it publicly acknowledged the 2019 Facebook contact importer breach,” as reported by Wired.
There are echoes of that 2019 “unlikely problem” response from Facebook today. I have just published details of a shocking WhatsApp flaw that would enable an attacker to remotely disable a user’s WhatsApp account, deregistering their phone and then preventing them from getting back in. WhatsApp has not yet confirmed it will issue a fix—right now the vulnerability remains live and users should beware.
Meanwhile, on the privacy front, there’s a battle taking place between the world’s largest tech giants, and they’re fighting over your privacy, or lack thereof. Apple has Facebook in its sights. Its welcome new privacy labels are a stark reminder of just how much data we surrender to use the free apps that run our lives.
Which takes us back to end-to-end encryption. WhatsApp found itself caught up in the Apple privacy label debacle, when it transpired that it was way out of step with its peers on data collection—an issue compounded by a mandatory change of terms to enable Facebook to generate more revenue from WhatsApp.
WhatsApp’s defense against all this has been end-to-end encryption. “Five years ago,” Cathcart wrote this week, “we completed our roll out of end-to-end encryption… This was a technical achievement decades in the making… In the past five years, WhatsApp has securely delivered over 100 trillion messages to over 2 billion users.”
I agree with Cathcart. End-to-end encryption is absolutely critical. Its use should be expanded. “End-to-end encryption is now the way most messages are sent globally,” he pointed out. “Technical as encryption can be,” he asked, “it is really about something at the very core of how we live our lives today: Should people be able to have a private conversation when they are not together in person?”
“I believe the answer must be yes,” Cathcart responded to his own question. “The lessons of the past five years make it absolutely clear that technology companies and governments must prioritize private and secure communication.”
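The principle Cathcart is defending can be sketched in a few lines. The toy Diffie-Hellman exchange below (purely illustrative, with made-up parameters and a throwaway XOR cipher, not anything WhatsApp actually ships) shows why end-to-end encryption locks the relay out: only the two endpoints ever hold the key, while the server in the middle sees nothing it can decrypt.

```python
import hashlib

# Toy sketch of the end-to-end principle: the two endpoints derive a
# shared key via Diffie-Hellman, so a relay that only sees public values
# and ciphertext cannot recover the message. Illustrative only -- real
# apps use vetted protocols (e.g. the Signal protocol), not toy
# parameters or XOR ciphers.

P = 2**127 - 1  # a Mersenne prime; real deployments use far larger groups
G = 3

alice_secret = 123456789   # in practice: large random values kept on-device
bob_secret = 987654321

alice_public = pow(G, alice_secret, P)  # relayed through the server
bob_public = pow(G, bob_secret, P)      # relayed through the server

# Each side combines its own secret with the other's public value.
alice_key = pow(bob_public, alice_secret, P)
bob_key = pow(alice_public, bob_secret, P)
assert alice_key == bob_key  # same key on both ends, never transmitted

key = hashlib.sha256(str(alice_key).encode()).digest()
message = b"meet at noon"
ciphertext = bytes(m ^ k for m, k in zip(message, key))  # toy XOR cipher

# The server sees only alice_public, bob_public and ciphertext --
# none of which reveal the key or the plaintext.
plaintext = bytes(c ^ k for c, k in zip(ciphertext, key))
assert plaintext == message
```

The point of the sketch is the asymmetry: the public values travel through Facebook’s servers, but the key they jointly produce never does, which is exactly the property Messenger’s 1.3 billion users currently lack by default.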
Absolutely right. But WhatsApp isn’t the only hyper-scale messaging platform under Facebook’s roof. Its stablemate Facebook Messenger caters to more than 1.3 billion users—only WhatsApp is larger. But those 1.3 billion users don’t get the benefit of default end-to-end encryption; they don’t have “private and secure communication.” When it comes to Messenger, the answer to Cathcart’s question “should people be able to have a private conversation when they are not together in person?” is currently no.
For more than two years, Facebook has talked about expanding WhatsApp’s end-to-end encryption to include Messenger and even Instagram. Back in 2019, the company’s Jay Sullivan told a Senate committee that “people should be able to communicate securely and privately with friends and loved ones without anyone—including Facebook—listening to or monitoring their conversations.”
Easier said than done. Encrypting Messenger has taken significantly longer and proved much more complex than envisaged. “While we will continue to make progress on our move to end-to-end [Messenger] encryption,” a Facebook spokesperson told me this week, “it’s a big technical project and all of our messaging services won’t be fully end-to-end encrypted until sometime in 2022 at the earliest.”
“End-to-end encryption is more than a fundamental right,” says ESET’s Jake Moore. “It is a vital necessity for all communication tools, and any platform not yet secured with this layer of protection must be treated with caution. It is well documented that non-encrypted forms of communication can be surveilled by law enforcement, app owners and even some third parties, so it is important to treat such apps with care and not to be used for private communication or to transfer sensitive data.”
It’s hard to reconcile Cathcart’s words with Facebook’s continued failure to fully encrypt Messenger. One can’t help but think that it could be done more quickly if it was more of a priority. And then there is always the question as to whether some of this complexity stems from the data harvesting and mining changes that will be needed to work around the limitations brought about by such encryption.
“End-to-end encryption locks tech companies out of particularly sensitive information,” Cathcart wrote. “Will we be able to have a private conversation, or will someone always be listening in? The choice we make will have lasting consequences for future generations.” It’s hard to argue with that—it’s also hard to tally that statement with Facebook’s business model and those stark privacy labels.
As regards the current controversy circling around those 533 million user records, Facebook says that “malicious actors obtained this data not through hacking our systems but by scraping it from our platform prior to September 2019.” Maybe so. But it’s a welcome reminder that Facebook is a data machine, and the safest way to secure your private information is not to give it up in the first place.
As infosec expert Mike Thompson warns, “Frankly, if it has a Facebook fingerprint on it, it’s about monetizing you and your behavior within its ecosystem.”
In reality, Facebook’s response was directly from the Cambridge Analytica playbook. It’s not our fault. We were exploited by bad actors. We did nothing wrong but we’re putting it right. In the immediate aftermath of Cambridge Analytica, Facebook was almost on the ropes. But it’s now bigger and more profitable. Its billions of users didn’t care quite as much about their privacy as many of us thought, or hoped, they would.
“Imagine if your government, or a foreign one, could see every transaction you made,” Cathcart asked, “or if your boss could see every text message you wrote or photo you sent.” Yes, just imagine if an organization could harvest that much of your data and monitor your content and transactions. “That’s the greatest risk of all,” Facebook’s Cathcart said. “No matter how well-meaning the motivation, surrendering our privacy would paralyze us.”
You now have the tools and the information to make informed choices about the apps and services you use, and how you use them. With all that in hand, take a look at those privacy labels and the information on end-to-end encryption and make the right choice. Unless we all “vote with our feet,” choosing services that respect our privacy, how can we expect those service providers, those data harvesters, to change?