We are in a period of crisis. The profound challenges caused by your platform have been made apparent through the release of The Facebook Files in The Wall Street Journal and the testimony of Frances Haugen before the US Senate.
It is clear that systemic, root-cause solutions are needed to address the core operating model of your product (and others like it), which combines viral attention with algorithmic micro-targeting. No matter how many fact-checkers you hire, how much you invest in AI, how you tweak metrics like Meaningful Social Interaction (MSI), or how hard your Oversight Board works, Facebook and democracy will be incompatible until the underlying operating model changes.
We realize this is a complex conversation requiring solutions at the governance and regulatory level. But there is one interim change, supported by Facebook’s own internal research, that could materially mitigate many of the problems caused by viral, harmful content on Facebook:
Allow a maximum of 2 levels of sharing per post.
We’re calling this change #OneClickSafer.
This approach is backed by Facebook’s own internal research and its own Civic Integrity team, of which whistleblower Frances Haugen was a member. As The Wall Street Journal’s investigative podcast, The Facebook Files, revealed: “It sounds almost too simple but literally every single hop of a reshare, it gets worse. So if a thing's been reshared 20 times in a row, it's going to be 10X or more likely to contain nudity, violence, hate speech, misinformation, than a thing that has just been not reshared at all.”
Because of how Facebook currently works, these harmful posts spread widest, drowning out meaningful posts from everyday Facebook users.
You often speak about the time and resources you and your team spend putting out fires. #OneClickSafer weakens the virality flamethrower that’s generating new fires.
Currently, anyone who wants a voice on your platform has to compete in a marketplace for attention—and based on your organization’s internal research, the top currency is negative, outrage-inducing content that elicits strong reactions. So whether someone is talking to friends, taking a political stance, or speaking for an organization, their voice will go further if it garners a strong, immediate response.
Even if you don’t want Facebook to be divisive, this process makes it so.
As The Facebook Files revealed, this reality has affected political parties worldwide. Facebook researchers learned that political parties in countries such as Spain, Poland, and India felt forced to use more negative, divisive, and extreme messaging to compete for attention, which in turn pushed them toward more extreme policy positions. Such shifts ultimately degrade social cohesion, harm democracies, and encourage violence.
Even journalists and the media are forced into similar shifts, leading to more polarized content that accelerates division across the electorate. None of this leads to the kind of genuine connection that is core to Facebook’s mission.
As mentioned earlier, we propose that you remove the “reshare” button once a given post has been shared twice. After that, users can copy and paste content if they want to share further.
Two levels of sharing will still allow people to share content they care about with their friends, and then their friends’ friends. Friction raises the quality of what we see on social media because people aren’t sharing reflexively. This is especially critical for countries that have less content moderation (human or algorithmic) or where there is more social upheaval. Imagine a world where each outrage-inducing post reaches a few hundred people after two reshares instead of millions after more levels of resharing.
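To make the two-hop rule concrete, here is a minimal illustrative sketch of how a platform could track reshare depth and withhold the reshare action past two hops. All names here (`Post`, `can_reshare`, `MAX_RESHARE_DEPTH`) are our own hypothetical assumptions for illustration, not Facebook’s actual implementation:

```python
# Hypothetical sketch of #OneClickSafer: each post carries a reshare depth,
# and the reshare action disappears after two hops. Illustrative only.

from dataclasses import dataclass

MAX_RESHARE_DEPTH = 2  # the proposed #OneClickSafer limit


@dataclass
class Post:
    author: str
    reshare_depth: int = 0  # 0 = original post, 1 = first reshare, ...


def can_reshare(post: Post) -> bool:
    """Return True if the reshare button should still be offered."""
    return post.reshare_depth < MAX_RESHARE_DEPTH


def reshare(post: Post, by: str) -> Post:
    """Create a reshare one hop deeper, refusing past the limit."""
    if not can_reshare(post):
        raise ValueError("reshare limit reached; copy and paste to share further")
    return Post(author=by, reshare_depth=post.reshare_depth + 1)


original = Post(author="alice")
first = reshare(original, "bob")    # reaches friends
second = reshare(first, "carol")    # reaches friends of friends
print(can_reshare(second))          # False: the button is gone at hop two
```

Under this sketch, content still travels to friends and friends of friends organically; anything beyond that requires the deliberate act of copying and pasting, which is exactly the friction the proposal relies on.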
#OneClickSafer is supported by your own internal findings. From The Wall Street Journal: “a senior data scientist just went to the executives of newsfeed and gave a presentation that was like, ‘Hey, brilliant idea for changing this product and addressing all of our integrity problems. Kill the reshare button....and you know what? The data backed up that if you wanted to address problems on Facebook...this was the way to do it...[But] It was so clearly going to hurt the business badly in a way that Facebook was just not going to tolerate. So it was dead on arrival.”
We are asking that you wind back the clock on that meeting and make the right decision now. And you uniquely have the power to do this. Today, you hold more than 55% of the voting rights of Facebook, Inc., which has over 2.8 billion users. It’s a tremendous, almost unthinkable amount of power, and we implore you to use it with wisdom.
We acknowledge that the #OneClickSafer change will have consequences. Some entertaining and some informative content may not spread as far and as fast. There will be an impact on Facebook’s bottom line. Yet, these consequences are outweighed by the existential risk that Facebook’s current practices pose to both its license to operate and the democratic societies in which it functions.
Transformative, systemic change is needed. #OneClickSafer is an important first-step solution in a much longer journey. It directly addresses a root-cause mechanism for how Facebook creates harm in the world, and we believe many of your brilliant, well-intentioned employees will support it. Together, let’s put people and democracy above profits.