
The Facebook Files: Tech's Big Tobacco Moment

This post is part of The Catalyst newsletter series. Subscribe here for future resources.

Last week, the Wall Street Journal launched a bombshell investigative series called The Facebook Files. So far there have been five articles and four podcast episodes on a wide range of topics, from teen mental health and algorithm-driven outrage to human trafficking and vaccine misinformation.

What’s unique about the investigation is its evidence: Internal Facebook documents “including research reports, online employee discussions, and drafts of presentations to senior management” prove Facebook knew of alarming, far-reaching harms and failed to take action—often against direct recommendations of its own researchers.


SOME OF THE (MANY) REVELATIONS

  • 32% of teen girls said that when they felt bad about their bodies, Instagram (which is owned by Facebook) made them feel worse
  • 13% of British and 6% of American teens who reported suicidal thoughts traced the desire to kill themselves to Instagram
  • A program called "XCheck" exempts 5.8 million VIP users from normal enforcement, shielding their policy-violating content from removal
  • Only 13% of the moderator time spent labeling or removing false or misleading information goes to content from outside the US, yet 90% of users live outside the US and Canada
  • One political party's team shifted their content from 50% negative to 80% negative because 2018 algorithm changes rewarded outrage

SYSTEMIC HARMS

While the many harms evidenced by the reports are shocking, they're also predictable: we're seeing the same effects of extractive capitalism described in The Social Dilemma. When faced with trade-offs, platforms like Facebook, Instagram, YouTube, Google, and TikTok are incentivized to prioritize profits at the expense of user well-being.

What the WSJ reporting illuminates is the lengths organizations like Facebook will go to protect their businesses, unless outside forces like widespread regulation, public pressure, or investor demands change the equation.

And still, even when platforms do work in their users' best interests, we see runaway mechanisms generating problems faster than humans or artificial intelligence can solve them.


A CLARION CALL

This is the Big Tobacco moment for today's dominant social media platforms. It's clear that platforms like Facebook cannot be left to regulate themselves without devastating global consequences.

Importantly, while the Facebook Files are limited to Facebook and Instagram, there is ample evidence that similar harms and causal mechanisms are at play in the other major social media companies—it’s just that their internal documents haven’t been reported on publicly.

WHAT COMES NEXT

What happened to Big Tobacco must happen to these platforms. Externalized harms have to be paid for and future harms need to be mitigated. Tweaks to products won’t solve the problem; we need transformative changes in:

  • How the industry is regulated
  • How capital is distributed
  • How technology is built

HOW YOU CAN HELP

  1. Get informed. Listen to WSJ's Facebook Files podcast episodes (free) or read the article series (paywall). More info below.
  2. Dig deeper. Watch or listen to this candid conversation with Tristan Harris, Daniel Schmachtenberger, and Frank Luntz on The Facebook Files, Facebook's business model, our regulatory structure, and human nature itself.
  3. Connect with others. Where you can, try to help people see that these harms share a root cause: giant user-generated content systems that profit from monetizing attention, social comparison, peer pressure, tribalism, and disinformation.

Published on
September 21, 2021
