A Conversation with Facebook Whistleblower Frances Haugen

October 18, 2021

We are now in social media's Big Tobacco moment. And that’s largely thanks to the courage of one woman: Frances Haugen.

Frances is a specialist in algorithmic product management. She worked at Google, Pinterest, and Yelp before joining Facebook — first as a Product Manager on Civic Misinformation, and then on the Counter-Espionage team. But what she saw at Facebook was a company that consistently and knowingly prioritized profits over public safety. So Frances made the courageous decision to blow the whistle — resulting in the biggest disclosure in the history of Facebook, and of social media as a whole.

In this special interview, co-hosts Tristan and Aza go behind the headlines with Frances herself. Together, we go deeper into the problems she exposed, discuss potential solutions, and explore her motivations — along with why she fundamentally believes change is possible. We also announce an exciting campaign the Center for Humane Technology is launching to use this window of opportunity to make Facebook safer.

Guests

Frances Haugen is a specialist in algorithmic product management, having worked on ranking algorithms at Google, Pinterest, Yelp, and Facebook. She was recruited to Facebook to be the lead Product Manager on the Civic Misinformation team, which dealt with issues related to democracy and misinformation, and later worked on the Counter-Espionage team.

During her time at Facebook, Frances became increasingly alarmed by the company's choices to prioritize its own profits over public safety, putting people's lives at risk. As a last resort and at great personal risk, Frances made the courageous decision to blow the whistle on Facebook.

Frances fundamentally believes that the problems we are facing today with social media are solvable. We can have social media that brings out the best in humanity.

SUPPORT FRANCES HAUGEN

It took extraordinary courage and personal risk for Frances to blow the whistle against a $1 trillion market cap company. Please support her whistleblower protection by donating to Whistleblower Aid: https://www.gofundme.com/f/facebook-whistleblower

Episode Highlights

Major Takeaways

  • Economies of scale are central to the safety issues at Facebook. Building safety systems costs roughly the same whether a language community is large or small — so safety for smaller populations is simply less economical. This creates a perfect storm, in which smaller and more vulnerable populations get less robust safety systems. Frances believes this is in direct tension with the heightened safety obligation Facebook should have in places where it has subsidized internet access through Free Basics — because the company has made it more difficult for alternatives in those languages to emerge.
  • Here are four core problems Frances exposed, which we explored together:
    ◦ How the platform's 99th-percentile users, who are often more prone to mis- and disinformation, disproportionately shape the algorithm for everyone else
    ◦ How mere invitations to groups prompt Facebook to inject those groups' content into our feeds, and to treat interactions with that content as a 'ghost acceptance' of the invitation
    ◦ How the platform forces politicians to adopt more extreme positions
    ◦ The implications of engagement-based ranking, and why Frances supports a chronological news feed (or at least a feed organized in a way users understand; see the sketch after this list)
  • Frances says Facebook deflects blame by painting false trade-offs — for example, between free speech and safety. According to the company, more robust safety would curtail freedom of speech, creating problems that are very challenging to solve. But as Frances points out, Facebook's own researchers have identified solutions that transcend those trade-offs — solutions the company declines to adopt for economic reasons.
  • Among the many potential solutions, Frances' main push is for transparency, so that people outside the company — from government regulators to civil society groups — can see the problems with the platform and propose ways to resolve them.
  • Frances notes that because "every dollar matters to Wall Street ... Facebook has been left optimizing for the shareholder interests instead of the public's interests." In this spirit, she thinks of regulation as a collaborative effort between government and Facebook — in order to 'help' Facebook balance shareholder interests with the public's interests.
  • Frances is calling for Facebook to declare moral bankruptcy. This statement has been misinterpreted as a claim that Facebook is morally bankrupt. But Frances is referring to bankruptcy in the financial sense of a declaration — that the company must own up to its failures, ask for help with its debts, and accept a lowered moral credit score.
  • Frances fundamentally believes it's still possible to reform Facebook — which is why she blew the whistle, and why she still wants to work there!
  • Donella Meadows' Leverage Points framework identifies 12 leverage points for making change in a complex system, arranged on a scale of increasing leverage: leverage points lower on the scale are often easier to push but have less impact on the system, while leverage points higher on the scale are much harder to push but have more impact. Bringing it back to Facebook, we need to push on the lower leverage points of platform tweaks and design changes, while advocating for longer-term systemic reform through external regulation and new business models.
  • In this spirit, the Center for Humane Technology and partners have launched #oneclicksafer — a campaign to make Facebook safer by changing the reshare button. It’s just one intervention, but one that’s high-impact and can build positive momentum. Join us!
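To make the engagement-based vs. chronological ranking takeaway above concrete, here is a minimal, purely illustrative sketch contrasting the two orderings. The Post fields, names, and scores are invented for illustration; this is not Facebook's actual ranking code, just the general shape of the two approaches.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # hypothetical model score: predicted clicks/comments/reshares

posts = [
    Post("friend_a", datetime(2021, 10, 18, 9, 0), predicted_engagement=0.12),
    Post("page_b", datetime(2021, 10, 17, 22, 30), predicted_engagement=0.87),
    Post("friend_c", datetime(2021, 10, 18, 8, 15), predicted_engagement=0.35),
]

# Engagement-based ranking: the feed surfaces whatever a model predicts will
# provoke the most interaction, regardless of recency or user intent.
engagement_feed = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

# Chronological ranking: newest first -- an ordering users can see and predict,
# with no opaque model deciding what rises to the top.
chronological_feed = sorted(posts, key=lambda p: p.created_at, reverse=True)
```

Under the first ordering, page_b's day-old but high-engagement post leads the feed; under the second, friend_a's newest post does.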

Take Action

Share These Ideas