Facebook has been autogenerating pages for white supremacists

What ban? —

Facebook’s efforts to combat extremism remain at odds with engagement goals.

Tim De Chant

Facebook CEO Mark Zuckerberg testifying before Congress in April 2018. It wasn’t his only appearance in DC this decade.

Facebook CEO Mark Zuckerberg is testifying before Congress today, and he may have a few more uncomfortable questions to answer. Among them: why is Facebook autogenerating pages for white supremacist groups?

Researchers at the Tech Transparency Project (TTP) found that Facebook created dozens of pages for groups like the “Universal Aryan Brotherhood Movement” when a user did something as simple as listing one as their employer. Some of the autogenerated pages had garnered thousands of likes by the time researchers discovered them. TTP also found four Facebook groups that users had created for such organizations. The researchers shared their findings with Facebook, which removed most of the pages; even so, two of the autogenerated pages and all four groups remained active when TTP published its report.

Facebook banned “white nationalist” content following the 2019 mass shooting at a New Zealand mosque, expanding an earlier ban on white supremacist content.

It wasn’t hard for the researchers to find offending pages and groups: they simply searched Facebook for the names of neo-Nazi and white supremacist groups identified by the Anti-Defamation League and the Southern Poverty Law Center. Of the 221 names they searched, more than half returned results. In total, 113 white supremacist organizations had a presence on Facebook, some through more than one page or group. One user-generated page that had been active for over a decade had 42,000 likes; ten other pages and one group each had more than 1,000.

Much of Facebook’s moderation system relies on artificial intelligence to flag potential violations for human moderators, a system that appears to be easily thwarted. Simple misspellings—adding extra vowels or swapping $ for S, for example—have been enough to foil algorithmic moderation.
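
To see why such filters are brittle, consider a minimal sketch of keyword-based flagging. This is hypothetical illustration code, not Facebook’s actual system; the BANNED_TERMS list and the naive_filter and normalize helpers are invented for the example. A filter that matches banned phrases verbatim misses trivially obfuscated variants, while one that first undoes common substitutions and collapses repeated characters catches them:

    import re

    # Hypothetical illustration of keyword-filter evasion. This is not
    # Facebook's moderation code; the term list and helpers are invented.
    BANNED_TERMS = {"aryan brotherhood"}

    # Undo common character substitutions ($ for s, 0 for o, and so on).
    SUBSTITUTIONS = str.maketrans({"$": "s", "@": "a", "0": "o", "1": "i", "3": "e"})

    def naive_filter(text: str) -> bool:
        """Flag text only if a banned phrase appears verbatim."""
        lowered = text.lower()
        return any(term in lowered for term in BANNED_TERMS)

    def normalize(text: str) -> str:
        """Lowercase, reverse substitutions, and collapse repeated characters."""
        lowered = text.lower().translate(SUBSTITUTIONS)
        return re.sub(r"(.)\1+", r"\1", lowered)  # "aryaan" -> "aryan"

    # Run the banned terms through the same normalization so they stay comparable.
    NORMALIZED_TERMS = {normalize(term) for term in BANNED_TERMS}

    def normalized_filter(text: str) -> bool:
        """Flag text if a banned phrase appears after normalization."""
        norm = normalize(text)
        return any(term in norm for term in NORMALIZED_TERMS)

    print(naive_filter("join the aryan brotherhood"))       # True: caught
    print(naive_filter("join the @ryan br0therhood"))       # False: evaded
    print(normalized_filter("join the @ryan br0therhood"))  # True: caught

Real moderation pipelines use far more elaborate normalization and learned classifiers rather than literal term lists, but the underlying cat-and-mouse dynamic the researchers observed is the same.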

Facebook’s own user-facing algorithms have also come up short. TTP found that on a page for an organization called the “Nazi Low Riders,” Facebook recommended that users also like a page for the “Aryanbrotherhood.”

The company’s strategy for combating extremism on the site also appears to be failing. Searches for known hate groups are supposed to direct users to the page for Life After Hate, a nonprofit that works to deradicalize right-wing extremists, but that redirect appeared in only 14 of the researchers’ 221 searches.

Militias, too

Facebook has had similar problems with far-right militias, according to a related investigation by TTP and BuzzFeed. Facebook banned several militant groups last August, but researchers turned up still-active autogenerated pages for some of the militias.

Earlier this year, Facebook came under fire for its role in the January 6 insurrection at the US Capitol. Reports revealed that not only had people used Facebook to organize ahead of the rally and the attack that followed, but many had also been radicalized on Facebook’s platforms, including Instagram.

Mentions of groups involved in the insurrection, including the Proud Boys, have been banned since 2018, yet in recent weeks TTP researchers found militia groups spreading propaganda. One page included a three-minute “highlight reel” of the Capitol riot, along with footage of Proud Boys attacking Black Lives Matter protesters.

Facebook’s groups problem hasn’t gone unnoticed within the company. Facebook’s own researchers warned top executives as early as August 2020 that 70 percent of the 100 most active US “civic” groups on the platform were “considered non-recommendable for issues such as hate, misinfo, bullying, and harassment.” One particularly large group with 58,000 members spread “enthusiastic calls for violence every day.”

Facebook’s stated desire to combat polarization has long been at odds with its quest to maximize engagement. In 2017, an internal task force found that reducing polarization on the site would also reduce engagement. The task force was soon disbanded, and most of its suggested fixes were shelved.

Listing image by Bloomberg | Getty Images
