Becoming a whistleblower was not Frances Haugen's plan. She has said more than once that she does not enjoy being the centre of attention. What she saw inside Facebook changed her mind. A former product manager on the company's Civic Integrity team, Haugen left the social network in May 2021 carrying copies of tens of thousands of internal documents, and a few months later she handed them to the United States Securities and Exchange Commission and to a reporter at The Wall Street Journal.
The disclosure became known as The Facebook Files and turned into one of the most consequential corporate leaks of the social-media era. It bruised the reputation of the company that now calls itself Meta, pushed European lawmakers to finalise the Digital Services Act, gave Westminster fresh ammunition for the Online Safety Act, fed evidence into the FTC's antitrust case against Meta, and inspired a Hollywood biopic from the writer of The Social Network. Haugen herself has spent the years since talking to legislators on three continents, building a small nonprofit and writing a memoir.

Frances Haugen at a Heinrich-Böll-Stiftung event in Berlin, November 2021. Photo by Stephan Röhl (CC BY-SA 2.0)
From Iowa debate kid to Facebook integrity team
Haugen grew up in Iowa City, the daughter of two academics, and went through high-school competitive debate before studying electrical and computer engineering at Olin College and earning an MBA at Harvard Business School. Her professional career runs through almost the full alphabet of Silicon Valley platforms: she spent more than a decade at Google working on Ads, Book Search and Google+, then short stints at Yelp and Pinterest, before joining Facebook in 2019.
Two pieces of personal context matter for what came later. Haugen has lived for years with a chronic autoimmune illness that at its worst left her in a wheelchair, and the death of a close friend in her twenties turned out, in her own telling, into the lens through which she would later read Facebook's internal research on teenage harm. She has written that those experiences taught her how a person can be slowly hollowed out without anyone outside noticing, which turned out to be a useful frame for thinking about an algorithmic feed.
Inside Facebook she was placed on the Civic Integrity team, the small group that worked on election-related risks, foreign interference and platform manipulation. The team was disbanded in December 2020, weeks after the United States presidential election. That decision, Haugen has said in interviews and under oath, was the moment her doubts about the company tipped into action.
The 22,000-document scrape
Once she decided to leave, Haugen began copying internal material from Facebook's employee-only Workplace forum, in particular research and decision memos tagged to integrity work. By her own account in her memoir The Power of One, she pulled roughly 22,000 documents over several weeks, working from screen captures because the underlying systems would have logged a bulk export.
She left the company in May 2021 and contacted Whistleblower Aid, a nonprofit law firm that had previously represented intelligence-community sources. Whistleblower Aid lawyers walked her through how to make a protected disclosure to the SEC and how to share material with reporters without obvious legal exposure. Her main journalistic partner became Jeff Horwitz of The Wall Street Journal, whose Facebook Files series began running in September 2021.
On 3 October 2021, Haugen unmasked herself on the CBS programme 60 Minutes in an interview with Scott Pelley. She told him that "the version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world", and that the company was "paying for its profits with our safety". The combination of that broadcast and the SEC complaints filed by Whistleblower Aid (eight in October 2021, with two more in February 2022) set up the Senate hearing two days later. The pseudonymous "Sean" who had been quoted in the WSJ pieces was no longer pseudonymous.
"Profits over people": the Senate hearing
On 5 October 2021, Haugen sat in front of the Senate Subcommittee on Consumer Protection, chaired by Senator Richard Blumenthal. She did not soften her opening statement.
"I am here today because I believe that Facebook's products harm children, stoke division, and weaken our democracy. The company's leadership knows how to make Facebook and Instagram safer but won't make the necessary changes because they have put their astronomical profits before people."
Frances Haugen, U.S. Senate Subcommittee on Consumer Protection, 5 October 2021
Over more than three hours she walked the senators through internal research that, in her telling, contradicted years of public reassurance. Facebook, she said, knew that engagement-based ranking on Instagram could carry a teenage girl from healthy-recipe content to anorexia-promoting material in a short hop. The company had a VIP whitelist called XCheck that exempted around 5.8 million high-profile users from normal moderation. It had quietly shelved an Instagram Kids product after the Files broke, but had not abandoned the strategic logic that drove it. "Facebook has not earned our blind faith," she told the panel, "and until we have transparency, we will not have a system compatible with democracy."
The market took the broadcast and the testimony together as a single event. In the trading day after the 60 Minutes interview, Facebook's share price fell nearly five per cent, cutting Mark Zuckerberg's personal stake by about $6 billion, on top of a longer-running slide that the Senate hearing accelerated. A separate outage of roughly six hours that took Facebook, Instagram and WhatsApp offline on the same day did not help the company's case in either Washington or the press.
What the Files actually said
Read together rather than story-by-story, the Facebook Files are an inventory of internal warning signs that the company chose not to act on. The Wall Street Journal series and the broader Facebook Papers consortium reporting that followed in late October 2021 set out, among other things:
- Internal slides acknowledging that Instagram "make[s] body issues worse for one in three teenage girls";
- The XCheck whitelist that effectively created a parallel set of rules for celebrities, politicians and athletes;
- Documented spread of anti-vaccine misinformation during the COVID-19 pandemic, even after the company said publicly that it was containing it;
- Algorithm changes that, by the company's own measurement, amplified angry and divisive content;
- A pattern of underinvestment in non-English-speaking markets that contributed to ethnic-violence flashpoints in India, Myanmar and Ethiopia;
- Cooperation with the Vietnamese government on political censorship in exchange for continued market access.
In her memoir Haugen writes that she had wanted the public to be able to read the underlying material themselves, on the grounds that "the larger problem is that Facebook is allowed to operate in the dark". The disclosure was, in that sense, an attempt to put a corporate research archive on the legal record that regulators and journalists could later mine.
The Files turn into law
The leak landed in Brussels at exactly the moment European institutions were finalising the Digital Services Act. Members of the European Parliament invited Haugen to testify on 8 November 2021; she described the DSA as a possible "gold standard" for platform regulation if it kept its strongest transparency provisions. The text was politically agreed in April 2022, signed on 19 October 2022 and entered into force on 16 November 2022. The DSA's rules on risk assessments, dark-pattern bans and researcher data access track closely against the kinds of internal practices the Files exposed.
The United Kingdom went the same direction at its own pace. Haugen testified before a UK Parliamentary committee on 25 October 2021; the Online Safety Act received Royal Assent in October 2023 and is now being phased in by Ofcom. In the United States the Ohio Attorney General sued Meta in November 2021 on behalf of the Ohio Public Employees Retirement System and other shareholders, citing the Facebook Papers and seeking more than $100 billion in losses; a federal court in California named Ohio lead plaintiff in the consolidated class action in mid-2022, and the case has wound through the federal system since.
Enforcement, where it has happened, has been mostly European. The Commission opened formal proceedings against Facebook and Instagram under the DSA in May 2024 over the protection of minors, and in 2025 issued preliminary findings that Meta and TikTok were breaching their researcher-access obligations after Meta shut down the CrowdTangle research tool. In April 2025 the Commission separately fined Meta €200 million under the Digital Markets Act for the "consent or pay" model it offered to European users. None of this would have been politically straightforward without the Files behind it.

Frances Haugen at a Stanford University conference, March 2022. Photo by Kimberly White via Ekō (CC BY 2.0)
Beyond the Screen, the memoir, the Sorkin film
In September 2022 Haugen launched Beyond the Screen, a small nonprofit that bills itself as a coalition of "technologists, designers and thinkers fighting against online harms". Its first project, the Duty of Care initiative, is an open-source database of platform failures that litigators and researchers can draw on when arguing that a specific harm was foreseeable. The organisation works alongside groups such as Project Liberty and Common Sense Media, and has tried to position itself as a translator between civil society and the internal product-management vocabulary that made the Facebook Files legible in the first place. Haugen's logic, repeated in many talks since, is that platforms cannot be reformed by pressure on individual incidents; the legal duties they owe their users have to change, much as Harry Markopolos spent years pressing the SEC to act on evidence it preferred to ignore.
In June 2023 Little, Brown published Haugen's memoir The Power of One. The book is part professional autobiography (what it is like to grow up in an academic household in Iowa, debate competitively, fall ill, work for several large platforms) and part procedural account of how she gathered evidence and found a legal team. Reviewers in the Irish Times, Washington Post and Kirkus read it as more measured than the talk-show version of the story; the strongest passages are the ones in which she argues that the public could have been spared the guesswork if Facebook had simply published its own research.
The cinematic afterlife is now scheduled. Aaron Sorkin, who wrote The Social Network, is directing a follow-up film called The Social Reckoning for Sony Pictures, with Mikey Madison as Haugen, Jeremy Strong as Mark Zuckerberg, and Jeremy Allen White as Jeff Horwitz. The film is slated for release on 9 October 2026. Sorkin has called it a "David and Goliath story", which is more or less how Haugen has framed it from the start, placing her disclosure alongside Edward Snowden's NSA archive, the Cambridge Analytica revelations and the Twitter Files.
What's still open
The single most-watched legal aftershock was the FTC's monopolisation case against Meta, which went to trial in spring 2025. Haugen took a public position early, telling Democracy Now! in April that "we've let one company, one man, influence the information environment for the world" and that Instagram should be spun out. On 18 November 2025, federal district judge James Boasberg sided with Meta on the merits, ruling that the FTC had not proved Meta still held monopoly power in a market that now includes TikTok and YouTube. Meta keeps Instagram and WhatsApp; the FTC has signalled it will appeal.
The other long-running thread is teenage safety. After years of resistance Meta launched Instagram Teen Accounts in September 2024, defaulting under-18 users into private accounts with stricter messaging and content rules, and in October 2025 layered a PG-13-style content rating on top. Public-health researchers and ex-employees, Haugen included, have been blunt about the shape of that concession: it is the policy Facebook's own internal research recommended years before the Files dropped, deferred until the regulatory cost of not doing it became higher than the engagement cost of doing it.
None of which has settled the underlying question Haugen put to the Senate. Platform companies still write their own rules, run their own research, and decide for themselves which parts of it the public is allowed to see. What changed after the Facebook Files is that the regulators now have a much clearer picture of what they are looking at, and a former product manager from Iowa City is one of the reasons.