
Episode 39 | Aug 12, 2021

Do You Want To Become A Vampire? with L.A. Paul

How do we decide whether to undergo a transformative experience when we don’t know how that experience will change us? This is the central question explored by Yale philosopher and cognitive scientist L.A. Paul.

Paul uses the prospect of becoming a vampire to illustrate the conundrum: let's say Dracula offers you the chance to become a vampire. You might be confident you'll love it, but you also know you'll become a different person with different preferences. Whose preferences do you prioritize: yours now, or yours after becoming a vampire? Similarly, whose preferences do we prioritize when deciding how to engage with technology and social media: ours now, or ours after becoming users — to the point of potentially becoming attention-seeking vampires? 

Today with L.A. Paul, we're raising the stakes of the social media conversation — from technology that steers our time and attention, to technology that fundamentally transforms who we are and what we want. Tune in as Paul, Tristan Harris, and Aza Raskin explore the complexity of transformative experiences, and how to approach their ethical design.

Major Takeaways

  • Transformative experiences change our preferences. The preferences we have before the experience are different from the preferences we have after. Examples include becoming a parent, fighting in a war, embarking on a career, becoming bereaved, or becoming a TikTok influencer.

  • Transformative experiences change the landscape of future possibilities. One transformative experience might lead to a different menu of choices, altering the course of the future and the landscape of possibilities. 

  • Technology is not neutral, but fundamentally transformative. Technology doesn't just persuade or nudge; it fundamentally transforms who we are and what we want, both individually and collectively.

  • Agentic preferences cannot be assumed. The notion that people's preferences should be honored rests on the assumption that those preferences are consciously chosen, which isn’t necessarily the case. How do we honor users' preferences when the technology itself alters those preferences?

  • Transformation raises the ethical stakes of technology design. The ethical stakes of transformation are higher than those of nudging, influencing, and persuasion. What are the ethical implications of designing technology that alters people's preferences, leaving them little or no basis for deciding whether to use the technology in the first place? What constitutes meaningful informed consent within the context of technology?

  • Two potential ways to resolve the ethical conundrum include infinite games and omni-considerate choices. In order to evaluate transformative experiences, we might take a systems view. Specifically, we might choose whatever path allows the game to continue (James Carse's notion of infinite games), or the path that considers more balance sheets (Daniel Schmachtenberger's notion of omni-considerate choices).

Other recommended reading

Transformative Experience

L.A. Paul’s 2014 book on decision-making in circumstances where assessment is difficult or impossible. 

Finite and Infinite Games 

James P. Carse’s book contrasting games played with the object of winning, which bring play to an end, with games played for the purpose of continuing play.

Omni-Considerate Choices

Daniel Schmachtenberger’s talk with Jordan Greenhall and Forrest Landry, in which they elaborate on the notion of omni-consideration: decisions made to minimize externalities and to enhance alignment between the well-being of agents and the commons.

