Facebook said it has spent more than $13 billion on safety and security efforts since the 2016 US election, and now has 40,000 employees working on those issues.
The 40,000 safety and security workers include outside contractors who focus on content moderation, a spokesman said. Facebook said it had over 35,000 safety and security employees in October 2019.
The new statistics, meant to demonstrate how seriously the company takes safety and security issues, were published Tuesday in a blog post. They follow a series of stories last week in the Wall Street Journal that used leaked documents to show that, despite hefty investments, Facebook struggles to combat myriad serious issues, including Covid-19 misinformation and illegal human trafficking.
The documents showed that Facebook’s internal researchers often identified serious problems with inappropriate content or user behaviour on the company’s services, but Facebook routinely failed to fix them. The stories spurred calls by US lawmakers for an investigation and possibly hearings on the issues.
The blog post addressed some of these criticisms without citing the newspaper’s stories specifically. The company said that while it has historically been responsive to issues on the platform, it’s trying to be more proactive by having safety and security employees embedded in product teams during the development process.
“In the past, we didn’t address safety and security challenges early enough in the product development process,” Facebook said in its blog. “But we have fundamentally changed that approach.”
Facebook also shared new statistics around its global political ad library, an archive where people can search for political ads that run on Facebook or the Instagram photo-sharing app. Facebook said 3 million people use the ad library each month, and the company rejected 3.5 million political or social ad submissions over the first six months of 2021 for failing to provide proper information.
Instagram was the focus of a story last week revealing internal research that showed the company knows its product can be emotionally damaging to young women. The photo-sharing app said this week that it's considering "nudges," prompts that would encourage users to look at healthier content on the service or take a break from scrolling.