A Facebook Whistleblower: Sophie Zhang

July 9, 2021

In September 2020, on her last day at Facebook, data scientist Sophie Zhang posted a 7,900-word memo to the company's internal site. In it, she described the anguish and guilt she had experienced over the previous two and a half years. She'd spent much of that time almost single-handedly trying to rein in fake activity on the platform by nefarious world leaders in small countries. Sometimes she received help and attention from higher-ups; sometimes she got silence and inaction. “I joined Facebook from the start intending to change it from the inside,” she said, but “I was still very naive at the time.”

We don’t have a lot of information about how things operate inside the major tech platforms, and most former employees aren’t free to speak about their experience. It’s easy to fill that void with inferences about what might be motivating a company — greed, apathy, disorganization, or ignorance, for example — but the truth is usually far messier and more nuanced. Sophie turned down a $64,000 severance package to avoid signing a non-disparagement agreement. In this episode of Your Undivided Attention, she explains to Tristan Harris and Aza Raskin how she reached this point, and offers ideas about what these companies could do to prevent similar harm in the future.

Guest

As a data scientist, Sophie spent over two years trying to fix Facebook from within and became a whistleblower after she failed. She currently stays home to pet her cats.

Episode Highlights

Major Takeaways

  • Facebook and other major tech platforms operate in places all over the world where they have little or no cultural competency, often don’t have employees who speak the language, and don’t have the resources to keep up with harmful activity.
  • Representation matters. Sometimes Sophie was able to get fast enforcement against fake political activity by bringing issues to the attention of employees who were from the affected country or had a personal stake in the situation.
  • Company culture matters. In Sophie’s case, Facebook’s internal culture of transparency and open dissent allowed her to speak up and access high-level decision-makers. However, its non-hierarchical style and tendency to listen most to the loudest voices may have also contributed to inaction. 
  • Being a small, less-influential nation experiencing harmful activity on a tech platform can be a double-edged sword. The company may deprioritize action and enforcement in your country, but employees may also have more latitude to act. By contrast, powerful countries like India and Turkey are increasingly able to control platforms’ operations by threatening retaliation against on-the-ground employees or by wielding other bargaining chips.
  • It’s far easier to quantify fires you’ve put out than fires you’ve prevented. That asymmetry, and how it’s communicated, can skew public perception of how well major tech platforms contain harm.
  • Sophie recounts several instances where Facebook’s decisions about whether or how to act on violating behavior were guided not by ethics but by political considerations. She recommends that tech companies adopt an internal model similar to that of newspapers, where the department tasked with maintaining policy relationships is kept separate from the department responsible for enforcement.
  • Sophie’s advice to tech workers who want to make change from within: think carefully about your goals, priorities, motives, and limits. Everyone has different levels of privilege and ability when it comes to taking risks.
