Four revelations from the Facebook Papers


Facebook is battling its gravest crisis since the Cambridge Analytica scandal after a whistleblower accused the company of placing “profit over safety” and shed light on its inner workings through thousands of pages of leaked memos.

The documents were disclosed to US regulators and provided to Congress in redacted form by Frances Haugen’s legal counsel. A consortium of news organisations, including the Financial Times, has obtained the redacted versions received by Congress.

Earlier this month, Haugen testified before Congress that the social media company does not do enough to ensure the safety of its 2.9 billion users, plays down the harm it can cause to society, and has repeatedly misled investors and the public. The Wall Street Journal also ran a series of articles based on the documents called the Facebook Files.

Here are four surprising revelations the documents contain:

Facebook has a huge language problem

Facebook is often accused of failing to moderate hate speech on its English-language sites, but the problem is far worse in countries that speak other languages, even though the company promised to invest more after being blamed for its role in facilitating genocide in Myanmar in 2017.

One 2021 document warned of the very low number of content moderators for the Arabic dialects spoken in Saudi Arabia, Yemen and Libya. Another study of Afghanistan, where Facebook has 5 million users, found that even the pages explaining how to report hate speech were incorrectly translated.

Facebook allocated only 13 percent of its budget for developing misinformation detection algorithms to the world outside the US.

The failings occurred even though Facebook’s own research marked some of the countries as “high risk” because of their fragile political landscapes and the frequency of hate speech.

According to one document, the company allocated 87 percent of its budget for developing its misinformation detection algorithms to the US in 2020, versus 13 percent to the rest of the world.

Haugen said Facebook should be transparent about the resources it dedicates to each country and language.

Facebook often does not understand how its algorithms work

Several documents show Facebook stumped by its own algorithms.

One September 2019 memo found that men were being served up 64 percent more political posts than women in “nearly every country,” with the issue particularly pronounced in African and Asian countries.

While men were more likely to follow accounts producing political content, the memo said Facebook’s feed ranking algorithms had also played a significant role.

Facebook found that men were being served up 64 percent more political posts than women in “nearly every country.”

A memo from June 2020 found it was “virtually guaranteed” that Facebook’s “major systems do show systemic biases based on the race of the affected user.”

The author suggested that the news feed ranking may be influenced more by people who share frequently than by those who share and engage less often, a pattern that may correlate with race. As a result, content from certain races is prioritised over content from others.
