
TikTok's Global Impact

This post is part of The Catalyst newsletter series. Subscribe here for future resources.

TikTok has been downloaded over three billion times, surpassing Facebook, Instagram, Snapchat, and Twitter. But the rise of TikTok isn’t simply another Silicon Valley tech story.

TikTok, Boom. is a new documentary by Shalini Kantayya (the director of Coded Bias), now available for streaming on PBS. The film examines a range of themes — algorithmic, socio-political, economic, and cultural — to explore the impact of the history-making app.


TAKEAWAYS FROM THE FILM

  • POWERFUL DESIGN: TikTok makes it easy for users to create short-form, highly engaging videos, while its algorithm uses machine learning and big data to serve each user a personalized stream. Unlike most social media platforms, TikTok’s popular “For You” feed is curated entirely by algorithmic recommendation, meaning users don’t need to follow anyone to receive scarily personalized content. The same machine learning that optimizes for engagement also drives the app’s virality, which can push unwanted content, including disinformation, propaganda, and even hate content.
  • ALGORITHMIC BIAS: Whistleblowers have revealed undemocratic algorithmic practices in which TikTok chose which content to promote and which to demote. This included demoting content that critiqued the Chinese Communist Party, as well as content from creators based on their race, age, ability, and more. By choosing which voices are heard and amplified, the algorithm shapes how we think and act.
  • A NEW ECONOMY: TikTok has fueled a powerful creator economy financed by ads and brand partnerships. While this structure can provide creators a livelihood, it has also exposed users to serious harms, including harassment and abuse, leaving creators to weigh their income against their well-being.
  • SECURITY & PRIVACY: TikTok’s parent company, ByteDance, is based in China, which raises significant concerns about how TikTok might be used for surveillance and algorithmic manipulation. (For more on this, listen to our podcast episode, “Addressing the TikTok Threat.”) Concerns over TikTok continue to grow, as shown by bipartisan support in the US for greater regulation of the app.

As TikTok continues to shape global culture, political movements, and the economy, the question becomes how to address its risks and prevent its harms.

Explore the debate for yourself: TikTok, Boom. is now free to watch on PBS.


WHAT WE'RE READING, LISTENING TO, & WATCHING

  • Integrity Institute just released a Misinformation Amplification Analysis and Tracking Dashboard, which quantifies how much platforms amplify misinformation through their design and algorithms. The dashboard is part of Integrity Institute’s Elections Integrity Program, which collects and coordinates the expertise of integrity professionals to understand and protect democratic elections globally.
  • In an investigation into platform-promoted election disinformation, Global Witness found that YouTube and Facebook continue to approve ads containing election disinformation in Brazil: YouTube approved 100% of the test ads and Facebook approved 50%, including ads denying the credibility of the Brazilian election and urging people not to vote.
  • In “Content Moderation is a Dead End,” Ravi Iyer explores how content moderation is both necessary and insufficient. While content moderation is critical for protecting people from explicitly dangerous or illegal content, it fails to address the majority of harmful content, since what is harmful varies dramatically across contexts. He dives into three more promising approaches: Subjective Measurement, Designing for Well-Being, and Algorithmic Value Alignment.
  • In an essay on Tech Policy Press, Nicole Gill and Jesse Lehrich of Accountable Tech sketch out security threats likely to result from Elon Musk’s Twitter acquisition. The threats originate from Musk’s connections, conflicts of interest, and the deal’s financing.

Published on October 27, 2022
