Gizmodo highlights the findings of a new ProPublica report on WhatsApp’s content moderation system. According to the report, Facebook’s moderation contractor Accenture employs at least 1,000 WhatsApp content moderators to review user-reported content that has been flagged by the company’s machine learning system. “They monitor for, among other things, spam, disinformation, hate speech, potential terrorist threats, child sexual abuse material (CSAM), blackmail, and ‘sexually oriented businesses,’” reports Gizmodo. “Based on the content, moderators can ban the account, put the user ‘on watch,’ or leave it alone.” From the report:
Most can agree that violent imagery and CSAM should be monitored and reported; Facebook and Pornhub regularly generate media scandals for not moderating enough. But WhatsApp moderators told ProPublica that the app’s artificial intelligence program sends them an inordinate number of harmless posts, like children in bathtubs. Once flagged content reaches moderators, they can see the last five messages in a thread, ProPublica reports.
WhatsApp discloses, in its terms of service, that when an account is reported, it “receives the most recent messages” from the reported group or user as well as “information on your recent interactions with the reported user.” The terms do not specify that this information, viewable by moderators, can include phone numbers, profile photos, linked Facebook and Instagram accounts, the user’s IP address, and mobile phone ID. And, the report notes, WhatsApp does not disclose that it amasses metadata on all users regardless of their privacy settings.
WhatsApp didn’t offer much clarity on the mechanism it uses to receive decrypted messages, saying only that tapping the “report” button automatically generates a new message between the reporter and WhatsApp. That suggests WhatsApp deploys a sort of copy-paste function, though the details remain unclear. Facebook told Gizmodo that WhatsApp can read reported messages because they are treated as a form of direct messaging between the company and the reporter. It added that users who report content make a conscious choice to share information with Facebook; by that logic, Facebook’s collection of the material doesn’t conflict with end-to-end encryption. So, yes, WhatsApp can see your messages without your consent, provided someone else in the conversation reports them.
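The “copy-paste” mechanism described above can be illustrated with a minimal sketch. This is purely conceptual: the names (`Message`, `build_report`) and the JSON-like report shape are hypothetical, since WhatsApp’s actual client code and wire format are not public. The key idea is that end-to-end encryption isn’t technically broken: the reporter’s own device already holds the decrypted plaintext, and reporting simply sends a fresh copy of the last five messages to a new recipient, WhatsApp itself.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Message:
    sender: str
    plaintext: str  # already decrypted on the reporter's own device

def build_report(thread: List[Message], reported_user: str) -> dict:
    """Hypothetical sketch of the reporting flow: bundle the last five
    decrypted messages into a new outbound message addressed to the
    moderation service. The reporter's client, not WhatsApp's servers,
    is the party with plaintext access."""
    return {
        "reported_user": reported_user,
        "recent_messages": [
            {"sender": m.sender, "text": m.plaintext} for m in thread[-5:]
        ],
    }

# Example: reporting a user after an 8-message conversation forwards
# only the five most recent messages.
thread = [Message("alice", f"msg {i}") for i in range(8)]
report = build_report(thread, "alice")
```

Under this reading, the privacy question is less about breaking encryption and more about disclosure: the reporter consents to sharing, but the other participants in the thread do not.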