The International Dimensions of the Facebook Papers
The Facebook Papers, the product of a consortium of networked investigative journalists working with at least two Facebook whistleblowers, are filled with explosive revelations about the company’s inability to control misinformation, hate speech, and extremism on its platform. While we wait for more of the leaked original documents to make their way past journalists and into the general public, as promised by some in the consortium, the available reporting already reveals three key trends on the international side of Facebook’s practices. First, local staff are overwhelmed and under-resourced; second, Facebook consistently prioritized growth over integrity in a way that disproportionately impacted emerging markets; and third, company culture favored half-measures over more aggressive steps.
1. Local fact-checking organizations expressed frustration about the lack of technical resources and staffing to sort through the maze of hate, disinformation, and misinformation appearing on their screens. Certain technical tools, like classifiers that automatically detect concerning content in local languages, weren’t deployed to “at-risk countries” like Ethiopia. One staff member with a Facebook-affiliated fact-checker had to leave the country due to local intimidation, and researcher Berhan Taye says Facebook has failed to take down much of the problematic content she and a volunteer team have flagged. The Atlantic reports that a meager 13% of Facebook’s staff hours for moderating misinformation were devoted to areas outside the United States, even though those areas constitute 90% of its user base. Facebook researchers knew that extremism, sex trafficking, and hate speech were rampant in Arabic-language spaces on the platform, but Facebook management was slow to scale up the content moderation teams needed to address the problem. Differences between dialects, hiring problems, and inconsistent enforcement plagued the platform’s response.
2. Emerging markets bore the brunt of Facebook’s decision to prioritize growing its user base and creating “meaningful social interactions” through its algorithm. In countries like Burma and Ethiopia, where internet usage was low when Facebook arrived, Facebook quickly became the primary lens through which many people viewed current affairs. Content moderation, translation, and hiring lagged. In some cases, Facebook offered foreign language versions of its platform, but didn’t offer translated versions of its content moderation tools or hate speech reporting forms. As a result, Facebook enabled systemic violence against Rohingya Muslims in Burma, and inflamed civil conflict in the Tigray region of Ethiopia. In Poland, a marginal extremist party took advantage of the boost Facebook’s algorithm gives “emotional messages” to gain the most followers of any Polish party on the platform.
3. Political and public relations considerations induced Facebook to favor half-measures for addressing known problems. Hate speech by a prominent Indian politician stayed up because the platform anticipated that banning him would trigger a “backlash” in this highly coveted market; Facebook did ultimately ban the politician, but not before his (and others’) inflammatory rhetoric repeatedly filled Indian news feeds. Similarly, Facebook was slow to react to concerns about a 2017 algorithm tweak that gave emoji reactions to posts five times the weight of a “like” when assessing whether to show a post to users. There were immediate concerns that since “angry” emojis were included, this change could amplify provocative or unverified content. Yet the platform downgraded the angry emoji’s weight (now zero) only in a series of half-steps over the next three years, rejecting a number of internal proposals along the way that would have quickened the de-weighting of emotive content.
In 2014, Facebook dropped its early motto of “Move Fast and Break Things” in favor of broader principles. The Facebook Papers show that the motto has nonetheless endured, as Facebook has rushed recklessly into new information markets around the world.
— Kevin Sheives, Associate Director, International Forum for Democratic Studies