Press "Enter" to skip to content

Facebook staffer sends ‘blood on my hands’ memo

By Jane Wakefield
Technology reporter



Fake accounts have been undermining elections around the world, an ex-Facebook employee has claimed.

In a 6,600-word internal memo to fellow workers, data scientist Sophie Zhang said she made decisions “that affected national presidents” without oversight.

“I have blood on my hands,” she wrote in the memo, parts of which were published by Buzzfeed without her permission.

In response, Facebook said it was working hard to stop bad actors and inauthentic behaviour.

In the memo, Ms Zhang said: “In the three years I’ve spent at Facebook, I’ve found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions.

“I have personally made decisions that affected national presidents without oversight, and taken action to enforce against so many prominent politicians globally that I’ve lost count,” she added.

Buzzfeed, which said it had shared only those parts of the memo that were in the public interest, reported that Ms Zhang turned down a $64,000 (£49,000) severance package offered on condition that she did not share the memo internally.

In response, Facebook said: “We’ve built specialised teams, working with leading experts, to stop bad actors from abusing our systems, resulting in the removal of more than 100 networks for co-ordinated inauthentic behaviour.

“It’s highly involved work that these teams do as their full-time remit. Working against co-ordinated inauthentic behaviour is our priority, but we’re also addressing the problems of spam and fake engagement.

“We investigate each issue carefully, including those that Ms Zhang raises, before we take action or go out and make claims publicly as a company.”

Examples of work she refers to in her memo include:

  • Facebook took nine months to act on information about bots being used to boost President Juan Orlando Hernandez of Honduras
  • In Azerbaijan, the ruling political party used thousands of bots to harass the opposition
  • 10.5 million fake reactions and fans were removed from high-profile politicians in Brazil and the US in the 2018 elections
  • A Nato researcher told Facebook he had seen Russian activity on a high-profile US political figure, activity that Ms Zhang then removed
  • Bot accounts were discovered in Bolivia and Ecuador but the issue was not prioritised due to workload
  • She found and removed 672,000 fake accounts acting against health ministries around the world during the pandemic
  • In India she worked to remove a politically sophisticated network of more than 1,000 actors working to influence local elections in Delhi

“Facebook projects an image of strength and competence to the outside world… but the reality is that many of our actions are slapdash and haphazard accidents,” she wrote.

She said having to make countless decisions about so many different countries took a toll on her health, and left her feeling responsible when civil unrest broke out in places she had not prioritised for action.

Her revelations come just a week after ex-Facebook engineer Ashok Chandwaney accused the firm of profiting from hate.

Carole Cadwalladr, a UK journalist who exposed the Cambridge Analytica scandal, tweeted: “The speed and scale of the damage Facebook is doing to democracies around the world is truly terrifying.”


Analysis by Marianna Spring, Specialist disinformation reporter

This explosive memo confirms concerns that have long been raised about Facebook’s ability to tackle foreign interference and disinformation campaigns.

But while most eyes have been on Russian interference in US politics after the 2016 election, this former employee’s testimony turns attention to democratic events beyond the West.

Facebook’s failure to tackle disinformation in other languages has come under scrutiny during the pandemic – and this new information alleges it has struggled to tackle interference campaigns in non-English speaking nations before.

The memo also raises big concerns about the huge responsibility placed on junior Facebook moderators – whose decisions could affect democratic events, political outcomes and people’s lives globally.

This will no doubt increase concerns about Facebook’s work to tackle interference and disinformation campaigns as the next US election approaches. But it should remind us that Facebook plays a role in democratic events outside the US.



Source: BBC