Facebook’s WhatsApp privacy pledges undermined by ‘improper content’ reviewing system

Although WhatsApp boasts end-to-end encryption – meaning that all messages are unreadable until they reach their intended recipient – the company’s claim that nobody, not even the company itself, can read the messages is false.


An investigation by the US non-profit newsroom ProPublica found that Facebook undermines the privacy protections of its two billion WhatsApp users by using specialist software to review reams of user content for illicit material.

The report, published on 7 September, details how this privacy assurance breaks down in practice.

It turns out that WhatsApp has offices in the US, Dublin and Singapore, where more than a thousand contract workers examine millions of pieces of users’ content, employing special Facebook software to review private messages, images and videos that WhatsApp users have reported as improper. Reported content is screened by the company’s artificial intelligence before reaching the human reviewers.

When ProPublica contacted WhatsApp’s director of communications, Carl Woog, he acknowledged the existence of the teams of contractors that review WhatsApp messages in order to remove “the worst” abuses, but said that he did not consider this to be content moderation. That contrasts with WhatsApp’s corporate siblings Instagram and Facebook, where the company has stated publicly that it employs around 15,000 moderators to examine content across the two platforms, neither of which is encrypted.

Further investigations by the newsroom found that employing content reviewers is just one of the many ways Facebook Inc. has compromised the privacy of WhatsApp users. Working with data, documents, and dozens of interviews with both current and former employees, ProPublica revealed how since buying WhatsApp in 2014, Facebook has been quietly undermining its own guarantees about users’ privacy in a number of ways.

At the same time, Facebook continues to downplay how much data it collects from WhatsApp, saying that it shares only users’ metadata – unencrypted records such as timestamps and phone numbers – even though a great deal about a user’s activity can be revealed from metadata alone. ProPublica also learned that WhatsApp user data helped prosecutors build a case against a US Treasury Department employee who had leaked confidential documents to BuzzFeed News exposing how dirty money flowed through the US.

Navigating the tension between users who have been promised privacy, cooperation with law enforcement and the company’s need to make money has often led to moves that angered both users and regulators. In 2016, Facebook Inc. decided to share WhatsApp user data with Facebook, something it had told EU regulators was technologically impossible. A plan to sell advertising on WhatsApp was abandoned in late 2019, and a further botched initiative sought to introduce a new privacy policy for user interactions with businesses on WhatsApp.

A 49-slide internal company marketing presentation obtained by ProPublica emphasizes, on the one hand, the “fierce” promotion of WhatsApp’s “privacy” narrative and states that “privacy will remain important”, but then also describes, perhaps more tellingly, the need to “open the aperture of the brand to encompass our future business objectives”.

Ever since Facebook announced that it planned to buy WhatsApp, many wondered what would happen to a service that was known for its commitment to privacy within a corporation renowned for the opposite.

At first, Mark Zuckerberg vowed to keep WhatsApp “exactly” the same, but in 2016 the global messaging service revealed it would share user data with Facebook, a move that cleared the path for future revenue-generating plans and caught the attention of regulators.

In May 2017, European Union antitrust regulators fined the company 110 million euros for falsely claiming that it would be impossible to link the user information between WhatsApp and the Facebook family of apps. In 2019, the US Federal Trade Commission fined Facebook $5 billion for violating a previous agreement to protect user privacy.

The commission also announced that it was ordering Facebook to take steps to protect user privacy going forward, including for WhatsApp users. Facebook agreed to the fine and the order, but by that point the company had already begun hiring hundreds of content reviewers for WhatsApp.

You can read the full investigation, which details how WhatsApp content is moderated and the type of information shared with law enforcement agencies, even outside the US, on ProPublica’s website.
