Episode 31 | Mar 18, 2021
Disinformation Then and Now. Guest: Camille François
Camille François, Chief Innovation Officer at the cybersecurity company Graphika and an affiliate of the Harvard Berkman Klein Center for Internet & Society, has spent the last decade, alongside other disinformation researchers, fighting two battles: one to combat and contain harmful information, and one to convince the world that these manipulations have offline consequences that demand complex, nuanced solutions. Camille believes our common understanding of the problem has recently reached a new level. In this interview, she catalogues the key changes she observed between studying Russian interference in the 2016 U.S. election and helping convene and operate the Election Integrity Partnership watchdog group before, during, and after the 2020 election. “I'm optimistic, because I think that things that have taken quite a long time to land are finally landing, and because I think that we do have a diverse set of expertise at the table,” she says. Camille and Tristan Harris dissect the challenges and chart a path forward to a healthy information ecosystem.
Major Takeaways
Although foreign interference has contributed to mis- and disinformation campaigns, inaccurate information doesn't come only from bots and trolls. Real people, including "blue check" verified social media accounts, are driving the spread of misinformation by sharing falsehoods that they believe to be true.
Researchers need more access to data from social media platforms. Companies like Facebook track which topics and online groups are likely to incite violence, but they do not share their methods or findings with outside researchers.
Online conflict can become offline violence. The events of January 6, 2021 show how what happens online shapes our offline lives in meaningful ways.
Social media platforms should make design interventions to contain the user-driven spread of misinformation and disinformation. For example, during the 2020 U.S. election, Twitter changed the retweet button so that it prompted users to add their own commentary before sharing another tweet. By adding friction to the sharing process, Twitter's design intervention made it harder to spread misinformation mindlessly.
Other recommended reading
Agents of Chaos
Agents of Chaos is a two-part documentary featuring this week’s guest, Camille François, that details Russia’s interference in the 2016 U.S. presidential election, drawing on interviews with experts and information from inside sources.
Mistrust
Mistrust, a book by Ethan Zuckerman, explains how a lack of trust has become a crisis for representative democracy. Zuckerman argues that it’s not too late to fix the problem, and that we should look to creative social movements for new forms of civic engagement that rebuild our confidence in one another.
"False News Targeting Latinos Trails the Election"
This New York Times article by Patricia Mazzei and Nicole Perlroth describes how disinformation campaigns attempted to convince Spanish speakers that President Trump had been “robbed,” part of a strategy to delegitimize the 2020 election.
"Introducing Birdwatch"
“Introducing Birdwatch” is Twitter’s announcement of its new pilot program, Birdwatch, a community-based system for combating misinformation online.
“How the Pentagon is trolling Russian, Chinese hackers with cartoons”
This news report from Shannon Vavra at CyberScoop describes a Pentagon effort to undermine and mock Russian and Chinese government hacking operations by publishing cartoon graphics online.
Feels Good Man
This PBS documentary tells the story of the comic character Pepe the Frog and its creator, Matt Furie. The film chronicles Pepe’s ascent from gormless cartoon character to political icon and examines meme culture’s social effects.