
The Breakdown of Shared Understanding (Course Excerpt)

This post is part of The Catalyst newsletter series. Subscribe here for future resources.

In our last newsletter, we unpacked why technology is never neutral. Social media is no exception. Social media doesn’t simply reflect society; it shapes society. 

The world we see through social media is distorted, like looking into a funhouse mirror. These distortions are negative externalities of an advertising-driven, engagement-maximizing business model, which affects people and relationships in myriad ways.


  1. The Extreme Emotion Distortion 🥵 occurs because users have access to virtually unlimited amounts of personalized, emotional content, so any user can find overwhelming evidence for their deeply held beliefs. This creates contradicting “evidence-based” views, fueling animosity and fracturing our collective sensemaking.
  2. The Information Flooding Distortion 🤯 happens as algorithms and bots flood or curate the information users see based on their likelihood to engage with it, resulting in users believing that what is popular (e.g., hashtags, comments, trends) reflects public consensus, when in fact popularity can be manipulated.
  3. The Micro-Targeting Distortion 🔬 happens as advertisers send personalized, emotionally resonant — and sometimes opposing — messages to distinct groups of people, resulting in individualized micro-realities that can generate social conflict.
  4. The Moral Outrage Distortion 😱 occurs when engagement-maximizing algorithms amplify emotionally charged, moralizing content. This results in polarization, mischaracterizations of “the other side,” and the perception of more moral outrage around us than there really is.
  5. The Engaging Content Distortion 🤩 happens when social media platforms incentivize competition to create more viral content. This results in more frequent posting, more hyperbolic language, and more posting of extreme views, including conspiracy theories and out-of-context information.
  6. The Anti-Journalism Distortion 🚫 is created as social media platforms force reputable news organizations to compete in an environment that rewards clickbait headlines and polarizing rhetoric, resulting in less thoughtful, less nuanced reporting.
  7. The Disloyalty Distortion 😡 happens when users on public social media feeds try to understand or express compassion for the “other” side and are attacked by their “own” side for doing so.
  8. The Othering Distortion 👹 occurs as algorithms amplify divisive, negative, out-of-context content about particular groups. This incentivizes “othering” content, causing us to dehumanize others and view them as unworthy of our understanding.


These distortions don’t just affect individuals. Over time, they warp society’s perception of reality, breaking down our ability to find shared understanding.

Shared understanding is needed for democratic functioning. It enables nuanced discussion, debate, and problem solving across party lines. Yet, today's dominant social media platforms are breaking down these critical capabilities at an alarming pace. This is why social media as it operates today is a threat to open societies worldwide.


We can uphold open society values by enabling an information ecosystem that stewards our capacity for shared understanding rather than optimizing for engagement:

  1. Curtail the causes through platform design changes that incentivize trust and understanding. For example, introducing friction to limit virality prevents ideas that trigger powerful emotions from spreading quickly and dominating public discourse. For a deep dive, we recommend reading Renee DiResta and Tobias Rose’s piece, “How to Stop Misinformation Before It Gets Shared.”
  2. Address the crises caused by the breakdown of shared understanding. For technology teams, identify crises among both users and non-users, maintain cross-team collaboration, and plan ahead for challenges. For instance, consider implementing blackouts for features that may cause harm during certain periods (e.g., elections).
  3. Heal the toxic state of our minds from years of being conditioned to see divisiveness as safe and compassion for the “other side” as risky.
  4. Approach mutual understanding as a skill to be developed. Search for Common Ground and The One America Movement provide powerful insight into how public education can cultivate intellectual humility and establish understanding.
  5. Rehumanize each other by connecting with values that we share and sharing experiences in order to depolarize our communities. For a bit of inspiration, check out this video.
  6. Illustrate distortions in order to reveal perception gaps and “alternate” realities. For example, participate in a “reality swap,” where you swap feeds with another person to see how the reality presented to them differs from the reality you see.


This piece has been adapted from Module 5 in our recently launched course, Foundations of Humane Technology. If you are involved in shaping tomorrow’s technology, we welcome you to register for the course.


  • Frances Haugen calls for granting civil society organizations access to platform data in “Civil society must be part of the Digital Services Act.” 87% of Facebook's operational budget goes to protect just 10% of its users, so civil society groups worldwide have filled in the gap. Granting civil society groups access to platform data would help them uphold platform transparency and accountability.

  • Stewarding technology responsibly—and humanely—requires crisis planning. But social media platforms were not prepared for the invasion of Ukraine. As Will Oremus writes, platforms’ rapid, ad hoc responses are setting a dangerous precedent for crises to come.

  • Extractive technology depletes finite resources faster than they can be replenished. In our latest YouTube video, CCO Maria Bridge unpacks the types of extractive technologies, how to prevent them, and what we can do to become more aware of their impact.  

Published on April 14, 2022