Meta explains its approach to user safety following delayed rollout of end-to-end encryption

Meta (formerly Facebook) recently announced plans to delay the rollout of end-to-end encryption across its messaging services until sometime in 2023, citing concerns from child safety advocates who warned the change would shield abusers from detection. Today, the company offered more details about how it plans to balance harm prevention with the eventual rollout of end-to-end encryption.

While automated systems can scan unencrypted private messages to detect malicious patterns of behavior, that’s not possible in an end-to-end encrypted environment, Meta explained. Instead, the company plans to use artificial intelligence and machine learning to examine the non-encrypted parts of its platform, like user profiles and photos, for signals that could indicate malicious activity. For example, if an adult set up new profiles and repeatedly tried to reach minors they didn’t know, or began messaging a large number of strangers, the company could intervene, a Meta blog post said.
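
To make that approach concrete, here’s a minimal sketch, in Python, of how behavioral signals drawn from non-encrypted metadata might drive a rule-based flag. Everything here, from the field names to the thresholds, is a hypothetical illustration; Meta has not published its actual detection logic.

```python
from dataclasses import dataclass

# Hypothetical illustration only: Meta has not published its detection logic.
# The idea is that behavioral signals from non-encrypted metadata (how new a
# profile is, who it tries to contact) can flag an account for review without
# ever reading message content.

@dataclass
class AccountActivity:
    account_age_days: int      # how recently the profile was created
    minors_contacted: int      # message requests sent to minors the user doesn't know
    strangers_messaged: int    # message requests sent to unconnected adults

def should_flag_for_review(activity: AccountActivity) -> bool:
    """Return True if metadata signals warrant human review (illustrative thresholds)."""
    new_account = activity.account_age_days < 30
    targeting_minors = activity.minors_contacted >= 3
    mass_messaging = activity.strangers_messaged >= 50
    # Repeatedly reaching minors, blasting strangers, or a brand-new profile
    # already messaging unknown users are the kinds of patterns Meta describes.
    return targeting_minors or mass_messaging or (new_account and activity.strangers_messaged >= 10)

# Example: a week-old profile that has tried to reach five unknown minors.
print(should_flag_for_review(AccountActivity(account_age_days=7, minors_contacted=5, strangers_messaged=12)))  # True
```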

Meta also recently rolled out a series of expanded protections for accounts belonging to minors, including defaulting them to private or “friends only” accounts across Facebook and Instagram. It also introduced features this year that restrict adult Instagram users from contacting teens who don’t already follow them. And on Messenger, it now displays safety notices, developed using machine learning, that offer tips on spotting suspicious activity and on taking actions like blocking, reporting, ignoring or restricting other users. Over the past month, more than 100 million users have seen these tips. Meta says this sort of feature would still work in an end-to-end encrypted environment, too.

The company additionally cited the variety of user-facing features that let people control who can reach them through their inbox. Creators in search of greater reach may want fewer restrictions on who can contact them, but will still want to filter out abuse and spam, while people with private accounts may want to restrict contact to people they already know. Other messaging features blur images and videos and block links in message requests (conversations started by people you don’t know).
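
As an illustration of how such per-user controls might be modeled, here’s a short Python sketch. The setting names and the message shape are assumptions made for the example, not Meta’s actual API.

```python
from dataclasses import dataclass
from typing import Optional

# Assumed, illustrative model of per-user inbox controls; the setting names
# and message shape here are invented for the example, not Meta's actual API.

@dataclass
class InboxControls:
    allow_message_requests: bool = True   # accept messages from non-connections at all
    blur_media_in_requests: bool = True   # hide images/videos until the request is accepted
    block_links_in_requests: bool = True  # strip URLs sent by unknown senders

def render_message_request(controls: InboxControls, message: dict) -> Optional[dict]:
    """Apply a recipient's controls to a message request from someone they don't know."""
    if not controls.allow_message_requests:
        return None  # private accounts can refuse contact from strangers entirely
    preview = dict(message)
    if controls.blur_media_in_requests and preview.get("media"):
        preview["media"] = "[blurred until you accept this request]"
    if controls.block_links_in_requests:
        preview["text"] = " ".join(
            word for word in preview.get("text", "").split()
            if not word.startswith(("http://", "https://"))
        )
    return preview

# Example: a stranger's request arrives with a photo and a link.
request = {"text": "check this out https://example.com", "media": "photo.jpg"}
print(render_message_request(InboxControls(), request))
```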

Meta also noted that, earlier this year, it updated its reporting flow to make it easier to flag content that violates its child exploitation policies, adding an option that says “involves a child” and making reporting tools easier to find. The company says reporting has increased nearly 50% year-over-year as a result. When messages are reported, the reported portion of the conversation is decrypted so the company can take action, such as reporting attempted child exploitation to law enforcement or the National Center for Missing & Exploited Children (NCMEC). Meta says it’s also now warning users that resharing child exploitation images, even in outrage, is harmful, and it has launched a “report it, don’t share it” campaign to educate users.
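
Mechanically, this kind of reporting is compatible with end-to-end encryption because the recipient’s device already holds the plaintext. The sketch below, with hypothetical function and field names, shows the general idea of building a report payload from the reporter’s local copy; it is not Meta’s real reporting API.

```python
# Hedged sketch: under E2EE, only the endpoints can read messages, so a report
# is built from the reporter's own decrypted local copy. Function and field
# names are hypothetical, not Meta's real reporting API.

def report_conversation(decrypted_messages: list, reported_indices: list, involves_a_child: bool) -> dict:
    """Build a report payload from the reporter's local plaintext copy."""
    return {
        # Only the reported excerpt leaves the device; the encryption of the
        # rest of the conversation is unaffected.
        "reported_messages": [decrypted_messages[i] for i in reported_indices],
        "involves_a_child": involves_a_child,  # routes the report to child-safety review
    }

payload = report_conversation(
    decrypted_messages=["hi", "suspicious message", "another one"],
    reported_indices=[1, 2],
    involves_a_child=True,
)
print(payload["reported_messages"])  # ['suspicious message', 'another one']
```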

The company had already spelled out many of these procedures and plans while responding to regulator inquiries and to backlash from child safety advocates concerned that child exploitation will increase in an end-to-end encrypted environment. But Meta has maintained that the combination of scanning users’ non-encrypted data and existing reporting features will allow it to take action on abuse, even when messages are encrypted. In fact, this is how it already detects abuse on WhatsApp, which uses end-to-end encryption. It also recently said that a review of historical cases showed it would still have been able to provide information to law enforcement, even if the chats in question had been end-to-end encrypted.

The issues around encrypted messaging go beyond whether or not regulators should allow Meta to roll it out. While Meta’s post today argues that E2EE won’t prevent the company from keeping users safe, a former Meta employee recently accused the company of planning to move to an E2EE system on an “absurdly accelerated timeline,” which he said prompted resignations from child safety team members who understood that protections would become demonstrably worse, given the lack of a roadmap and plan for protections in an E2EE environment. The specific issues the employee, David Thiel, raised in his Twitter thread were not addressed by Meta’s post, which oversimplifies what’s needed to truly create a safe environment for users while also allowing for encrypted communications.
