Since the 1990s, governments around the world have used child welfare as a pretext for sweeping Internet policies: encryption backdoors, centralized censorship mechanisms, and measures against anonymity. So when Meta, under pressure from governments as well as NGOs, announced last week that it would delay the rollout of end-to-end encryption for messaging systems such as Instagram DMs and Messenger, citing child protection as the reason, privacy advocates were perplexed and suspicious. But as someone who previously worked on safety and security at Facebook, I do not see the delay as an arbitrary political decision. Concerns about the safety of young users are real, and the issues are wide-ranging, especially on social platforms as complex as Meta's.
Disappointing as it is, the company’s delay is likely justified. Some form of end-to-end encryption should be available to everyone, to protect the right to private communication and prevent government intrusion. But end-to-end encryption is not a single problem or technology; it is a broad set of policy decisions and use cases with far-reaching consequences. Creating the right environment for its use is therefore a complex task. The need for end-to-end encryption, as well as the conditions required to implement it safely, varies by platform, and apps like Facebook and Instagram still need serious changes before it can be introduced without compromising functionality or creating new safety risks. Meta’s biggest mistake is not this latest delay but the timeline, and perhaps the outcome, it originally promised.
When then-Facebook first announced its timeline in 2019 for implementing interoperable end-to-end encryption across all of its properties, it was immediately apparent that the plan was unworkable. The proposed timeline was so aggressive that building the technology itself would have been nearly impossible, and safety mechanisms barely entered the picture. The assumption seemed to be that because WhatsApp already had end-to-end encryption, along with content-oblivious mechanisms for detecting certain kinds of abuse, these would translate easily to Facebook’s other properties.
However, apps and sites like Facebook and Instagram are very different from WhatsApp in architecture and dynamics. Both implement direct messaging alongside systems that actively try to connect you with people: they draw on uploaded phone-book contacts, algorithmically suggest accounts based on location, interests, and mutual friends, and identify similar accounts from shared online activity. On Facebook, large public and private groups also help expand a person’s social graph, and all accounts are discoverable through global search and affiliations such as schools. Whereas apps like WhatsApp and Signal serve as private direct messaging between known contacts, the discovery-driven design of Facebook and Instagram leads to situations where abusers can easily find new victims, identities and relationships surface by chance, and large numbers of strangers mingle with one another.
These fundamental differences mean that before Meta can safely switch all of its platforms to end-to-end encryption, its apps must undergo some significant changes. First, the company must improve its ability to minimize harm without relying on access to message content. That includes using the social graph to detect users who are rapidly expanding their networks or targeting people in particular demographics (for example, users of a certain stated or estimated age), and looking for other potentially abusive patterns in metadata. These mechanisms can work hand in hand with user reporting options and proactive messaging, such as safety notices informing users of their options for reporting abuse, backed by effective reporting flows that ensure reports actually reach platform operators. While these features are beneficial with or without end-to-end encryption, they become significantly more important once the ability to inspect content is removed.
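To make the metadata idea concrete, here is a minimal sketch of one such content-oblivious signal: flagging accounts that initiate first-time conversations with unusually many new contacts in a short window. The event shape, threshold, and function names are illustrative assumptions, not Meta's actual systems.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical metadata-only signal: an account that starts conversations
# with many previously unconnected people in a short span may warrant
# review. No message content is inspected at any point.

WINDOW = timedelta(days=7)
NEW_CONTACT_THRESHOLD = 50  # illustrative value, not a real product threshold


def flag_rapid_expanders(events):
    """events: iterable of (sender_id, recipient_id, timestamp) tuples,
    one per first-ever message between a pair of accounts.
    Returns the set of sender_ids that exceeded the threshold."""
    first_contacts = defaultdict(list)
    for sender, _recipient, ts in events:
        first_contacts[sender].append(ts)

    flagged = set()
    for sender, timestamps in first_contacts.items():
        timestamps.sort()
        start = 0
        # Sliding window: count new contacts within any 7-day span.
        for end, ts in enumerate(timestamps):
            while ts - timestamps[start] > WINDOW:
                start += 1
            if end - start + 1 >= NEW_CONTACT_THRESHOLD:
                flagged.add(sender)
                break
    return flagged
```

In practice a signal like this would feed a review queue or trigger a safety notice rather than act automatically, and would be combined with other metadata features (account age, demographic targeting patterns, report history) to reduce false positives.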