Facebook Messenger’s ‘Dangerous’ New Update—Why You Should Be Concerned
Written by ABC AUDIO on April 10, 2022
If you’re one of Facebook Messenger’s 1.3 billion users, Meta’s determination to radically change your app gained serious momentum this week. Despite multiple warnings that this update is a dangerous step in the wrong direction, Meta is not listening.
We are talking end-to-end encryption, of course, and Meta/Facebook’s ongoing program to bring the same level of security that protects WhatsApp to Messenger and even Instagram. These plans, first announced in 2019, have seemingly been plagued by technical challenges, and the global rollout is not expected until late 2023.
Beyond the technical workload, Meta’s uphill struggle with this plan has been regulatory. Lawmakers and security agencies already bemoan their inability to penetrate end-to-end encryption, describing it as “a gift to terrorists and pedophiles,” and they don’t want to see the problem getting any worse.
Meta’s latest foray has been to commission a two-year report from Business for Social Responsibility (BSR) into the human rights impact of such encryption, which found that “expanding end-to-end encryption enables the realization of a diverse range of human rights” and recommended “a range of integrity and safety measures to address unintended adverse human rights impacts.”
This report is right, and yet Meta is wrong. Yes, we should all use end-to-end encrypted messaging apps, but no, not all messaging apps need to be end-to-end encrypted. Meta’s WhatsApp is the largest proponent of such security, with 2 billion users to Messenger’s 1.3 billion. WhatsApp’s penetration of developing and autocratic markets, where the need for such security is more acute than in the West, is near universal. Meta already meets the needs this report advocates.
Furthermore, the report’s finding that expanding encryption enhances “privacy, freedom of expression, protection against cybercrime threats, physical safety, freedom of belief and religious practices and freedom from state-sponsored surveillance and espionage” has already been served by the growth of apps such as Signal and Telegram, albeit, ironically, the latter isn’t end-to-end encrypted by default.
So, what’s the difference? Put simply, while full end-to-end encryption makes absolute sense on dedicated messaging apps like WhatsApp and Signal, it is not the same when linked to social media platforms. I cannot browse WhatsApp, looking at profiles and photos and selecting strangers to contact. I cannot mask my identity behind a fake profile in WhatsApp. And, most critically, WhatsApp is not a sticky platform for kids in the way that Facebook and Instagram (as well as TikTok) are.
As children’s charity NSPCC has warned, this proposed update to Messenger risks “failing to protect children from avoidable harm.” The charity points out that “10% of child sexual offences on Facebook-owned platforms take place on WhatsApp, but they account for less than 2% of child abuse the company reports to police because they can’t see the content of messages,” which makes this new update very high-risk. We saw this last year, when Facebook evidence was critical to the capture of “one of the web’s most dangerous pedophiles,” a capture that, investigators say, would not have been possible with end-to-end encryption.
Facebook’s response is to bulk up metadata AI analysis. “We’re building strong safety measures that are designed to prevent harm from happening in the first place,” the company has told me, “and give people controls to respond if it does. Working together also gives us more information to identify abusive accounts and allows us to introduce safety features … like restricting interactions between adults and minors.”
Unsurprisingly, the new report echoes this, recommending in Meta’s words that “we continue to invest in effective harm prevention strategies such as metadata and behavioral analysis, user education and robust user reporting, among other tools.”
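To make “metadata and behavioral analysis” concrete, here is a minimal sketch in Python of how an abusive account might be flagged without reading a single message; every signal, field name and threshold below is hypothetical and illustrative, not a description of Meta’s actual systems.

```python
from dataclasses import dataclass

@dataclass
class AccountMetadata:
    # Hypothetical signals: none of these require decrypting message
    # content, which is the whole point of metadata-based detection
    # under end-to-end encryption.
    account_age_days: int
    minors_contacted: int
    distinct_recipients_last_day: int
    user_reports: int

def suspicion_score(m: AccountMetadata) -> float:
    """Toy rule-based score; real systems use trained ML models."""
    score = 0.0
    if m.account_age_days < 7:                     # brand-new account
        score += 1.0
    if m.minors_contacted > 0 and m.account_age_days < 30:
        score += 2.0                               # new account messaging minors
    score += 0.1 * m.distinct_recipients_last_day  # mass outreach to strangers
    score += 1.5 * m.user_reports                  # reports from recipients
    return score

# A new account that mass-messages strangers, including minors, and has
# been reported once crosses an (illustrative) review threshold of 3.0.
account = AccountMetadata(account_age_days=3, minors_contacted=2,
                          distinct_recipients_last_day=40, user_reports=1)
print(suspicion_score(account) > 3.0)  # True -> queue for human review
```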
That said, the report does criticize client-side scanning, which it says “would undermine the integrity of E2EE and disproportionately restrict people’s privacy and a range of other human rights.” Bad news for Apple, which has embarked down the client-side scanning route. Ironically, in this regard Meta’s criticism of Apple on the privacy front is well justified. Client-side monitoring is a bad idea. Period.
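For readers unfamiliar with the technique, client-side scanning means checking content on the device before encryption is applied. A toy Python sketch, assuming a simple exact-hash design (real proposals, such as Apple’s, use perceptual hashes so that re-encoded copies of an image still match; the hash list here is a placeholder):

```python
import hashlib

# Placeholder digest list; in a real deployment this would be distributed
# by a child-safety body, and perceptual rather than exact hashes used.
KNOWN_BAD_DIGESTS = {"0" * 64}

def ok_to_send(attachment: bytes) -> bool:
    """Return True if the attachment may be encrypted and sent.

    The defining property of client-side scanning is that this check runs
    on the user's own device *before* end-to-end encryption, which is why
    the BSR report argues it undermines the integrity of E2EE.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    return digest not in KNOWN_BAD_DIGESTS

if ok_to_send(b"example image bytes"):
    print("encrypt and send")
else:
    print("block and report before encryption")
```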
In response to child safety concerns, Meta now says “the impacts of E2EE go far beyond such a simplistic ‘privacy versus security’ or ‘privacy versus safety’ framing.” Again, this is right as regards dedicated messaging, but wrong where social media platforms are concerned. Meta assures that “our machine learning technology will look across non-encrypted parts of our platforms—like account information and photos uploaded to public spaces—to help detect suspicious activity and abuse.” But this will not suffice—it has marginal efficacy where the content itself is “dark.”
As one WhatsApp insider told me, “half my day is explaining to people that WhatsApp isn’t a social network,” and it’s this difference that they’re referring to.
There’s some acknowledgement of this in Meta’s indefinite pause on “Instagram Kids,” and in response to this report it says that while “we have not yet determined how and whether to implement E2EE in Messenger Kids, we are committed to maintaining the same strong parental controls in Messenger Kids as we expand E2EE, including the ability for parents to control who their children can message, and when.”
And this is the crux. End-to-end encrypting Facebook Messenger is unnecessary. It’s not a dedicated messaging app, but a comms tool linked to a social media platform. The trade-off between safety and privacy is fine for WhatsApp, but that’s not the case on Facebook itself. Furthermore, Facebook has in the past admitted to monitoring Messenger traffic, and I don’t see much controversy in a client-server/server-client encryption architecture, exactly as Telegram operates, with some form of automated CSAM or child safety scanning in the middle.
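A rough sketch of that contrast, using the Fernet cipher from Python’s third-party cryptography package purely for illustration: in the end-to-end model only the endpoints hold the key, while in the client-server model the server does, which is what makes scanning “in the middle” possible.

```python
from cryptography.fernet import Fernet  # pip install cryptography

def looks_like_csam(plaintext: bytes) -> bool:
    """Stand-in for whatever automated child-safety scan would run."""
    return b"forbidden" in plaintext

# End-to-end model (WhatsApp, Signal): sender and recipient share a key
# the server never sees; the server relays ciphertext it cannot read.
# (Real E2EE uses key agreement, not a pre-shared key; simplified here.)
endpoint_key = Fernet(Fernet.generate_key())
e2ee_ciphertext = endpoint_key.encrypt(b"hello")
# ...no server-side content scanning is possible on e2ee_ciphertext.

# Client-server model (Telegram's default, and what the author suggests
# for Messenger): the link is encrypted, but the server holds the key.
server_key = Fernet(Fernet.generate_key())
in_transit = server_key.encrypt(b"hello")
plaintext = server_key.decrypt(in_transit)  # the server can read the message...
if not looks_like_csam(plaintext):          # ...so it can scan in the middle
    onward = server_key.encrypt(plaintext)  # then re-encrypt toward the recipient
    print("relayed after scanning")
```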
As the BSR report says, we should promote access to end-to-end encryption for everyone, and its use has made users safer and more secure. But let’s not see this as an absolute, because it isn’t. “If content is being shared and we don’t have access to that content,” warned one Meta exec last year, “if it’s content we cannot see then it’s content we cannot report.”
For Meta, there’s another inherent risk in their plans to fully encrypt Messenger and Instagram. There’s currently an uneasy standoff between tech and regulators on encryption. Bar the EU overstepping or something similar, it’s unlikely that WhatsApp or Signal encryption will be broken down. But if Meta insists on expanding that encryption, then it seems likely that regulators and lawmakers will demand a compromise. And that compromise will likely hit existing encryption as well as any expansion. That might serve Meta’s desire to build an integrated (undivestable) messaging giant, but it won’t serve WhatsApp’s 2 billion users.
Source: www.forbes.com