Press "Enter" to skip to content

Facebook ‘auto-generated’ extremist video

Facebook has been accused of “auto-generating” extremist content, including a celebratory jihadist video and a business page for al-Qaeda.

The material was uncovered by an anonymous whistleblower who filed an official complaint with US regulators.

Similar content for self-identified Nazis and white supremacist groups was also found online.

Facebook said it had got better at deleting extreme content but its systems were not perfect.

Content creators

The whistleblower’s study lasted five months and monitored the pages of 3,000 people who had liked or connected themselves to organisations listed as terrorist groups by the US government.

The study found that groups such as the Islamic State group and al-Qaeda were “openly” active on the social network.

In addition, it found that Facebook’s own tools were automatically creating fresh content for the proscribed groups by producing “celebration” and “memories” videos when pages racked up enough views or “likes”, or had been active for a certain number of months.

The local business page for al-Qaeda generated by Facebook’s tools had 7,410 “likes” and gave the group “valuable data” it could use when recruiting people or seeking out supporters, the complaint said.

Facebook’s algorithms had populated the local business page with job descriptions that users put in their profiles, and had also copied images, branding and flags used by the group.

Similar content was automatically produced for white supremacist and Nazi groups active on Facebook.

The complaint was filed with the US Securities and Exchange Commission and alleges that Facebook misled shareholders by claiming to remove extremist content while letting it persist on the site.

John Kostyack, director of the National Whistleblower Centre, which released the study on behalf of the whistleblower, said he was “grateful” that the “disturbing information” had been released.

“We hope that the SEC takes prompt action to impose meaningful sanctions on Facebook,” he said in a statement.

In a statement, Facebook said: “After making heavy investments, we are detecting and removing terrorism content at a far higher success rate than even two years ago.

“We don’t claim to find everything and we remain vigilant in our efforts against terrorist groups around the world.”

The report is the latest in a series of controversies for Facebook, which has faced repeated criticism over the way it handles hate speech and extremist content.

This week, Facebook co-founder Chris Hughes said it was time to break up Facebook, in an editorial published in the New York Times.

“The government must hold Mark [Zuckerberg] accountable,” he wrote.

Source: BBC