(AP) — The Facebook Papers project represents a unique collaboration among 17 American news organizations, including The Associated Press. Journalists from a variety of newsrooms worked together to gain access to thousands of pages of internal company documents obtained by Frances Haugen, the former Facebook product manager-turned-whistleblower.
The papers are redacted versions of disclosures that Haugen has made over several months to the Securities and Exchange Commission, alleging Facebook was prioritizing profits over safety and hiding its research from investors and the public.
The complaints cover topics including how Facebook's platforms might harm children and the company's alleged role in inciting political violence.
People or profit? Papers show conflict within
Facebook the company is losing control of Facebook the product — and of the carefully crafted image it’s spent over a decade selling despite problems like misinformation, human trafficking, and pervasive extremist groups on its platform.
From complaints Haugen has filed with the SEC, along with redacted internal documents obtained by The Associated Press, the picture of the mighty Facebook that emerges is of a troubled, internally conflicted company, where data on the harms it causes is abundant but solutions are halting at best.
The documents show a company in crisis, where good intentions long ago stopped being enough. Facebook says it does not prioritize engagement over safety.
Amid the Capitol riot, Facebook faced its own insurrection
The documents provide a rare glimpse into how the company, after years under the microscope for the policing of its platform, appears to have simply stumbled into the Jan. 6 riot.
For weeks, riot participants had vowed — on Facebook itself — to stop Congress from certifying Joe Biden’s election victory, with little response from the company.
Yet when the insurrection finally broke out, Facebook seemed as surprised as anyone else, leading employees to vent their frustration over what some saw as the company’s halting and inconsistent response to rising U.S. extremism.
Facebook’s language gaps weaken screening of hate, terrorism
Across the Middle East, journalists, activists and others have long accused Facebook of censoring their speech. In India and Myanmar, political groups use the social network to incite violence.
All of it frequently slips through Facebook’s efforts to police its social media platforms because of a shortage of moderators who speak local languages and understand cultural contexts.
Now, internal company documents show the problems plaguing the company’s content moderation are systemic, and that Facebook has understood them for years while doing little about it.
The company says it has invested in language and topical expertise in recent years but concedes that Arabic content moderation remains a particular concern.
Facebook dithered in curbing divisive user content in India
The leaked documents show that Facebook in India dithered in curbing hate speech and anti-Muslim content on its platform and lacked enough local language moderators to prevent misinformation from appearing.
The spread of that false news at times led to real-world violence. The files show these problems have plagued the company for years, particularly in cases where members of Prime Minister Narendra Modi’s ruling party created multiple Facebook accounts to amplify anti-Muslim content.
They highlight Facebook’s constant struggles in quashing abusive content on its platforms in the world’s biggest democracy and the company’s largest growth market. Communal and religious tensions in India have a history of boiling over on social media and stoking violence.
Apple once threatened Facebook ban
Apple threatened to pull Facebook and Instagram from its app store two years ago over concerns about the platform being used as a tool to trade and sell maids in the Mideast.
Facebook acknowledged in the internal documents that it was “under-enforcing on confirmed abusive activity,” which included Filipina maids complaining on the site of being abused.
Apple relented and Facebook and Instagram remained in the app store. Yet these ads continue to appear.
Facebook says it took the problem seriously, despite the continued spread of ads exploiting foreign workers in the Mideast.
Apple did not respond to requests for comment.
Facebook froze as anti-vaccine comments swarmed users
As false claims about vaccine safety threatened to undermine the world’s response to COVID-19, researchers at Facebook found they could reduce vaccine misinformation by tweaking how vaccine posts show up on users’ newsfeeds.
Yet despite evidence that it worked, Facebook took a full month to implement the changes at a pivotal time in the global vaccine rollout.
Facebook’s own documents also reveal how comments on posts are a hotbed for anti-vaccine messages. But when another researcher suggested disabling comments on vaccine posts, that idea was ignored.
Facebook says it has made “considerable progress” with downgrading vaccine misinformation in users’ feeds.