TikTok's Transparency Problem with Marc Faddoul

March 2, 2023

A few months ago on Your Undivided Attention, we released a Spotlight episode on TikTok's national security risks. Since then, we've learned more about the dangers of the Chinese-owned company: we've seen evidence of TikTok spying on US journalists, and proof of hidden state-media accounts used to influence US elections. Congress has banned TikTok on most government-issued devices, more than half of US states have done the same, and dozens of US universities are blocking TikTok access on their wifi networks. Many in Western governments and media who once dismissed TikTok as an overblown threat have changed their minds, and as more evidence of national security risks has played out, there's even talk of banning TikTok outright in certain countries. But is that the best solution? If we opt for a ban, how do we, as open societies, fight accusations of authoritarianism?

On this episode of Your Undivided Attention, we're going to do a deep dive into these questions with Marc Faddoul. He's the co-director of Tracking Exposed, a nonprofit investigating the influence of social media algorithms on our lives. His work has shown how TikTok tweaks its algorithm to maximize partisan engagement in specific national elections, and how it bans international news in countries like Russia that are fighting propaganda battles inside their own borders. In other words, we don't all get the same TikTok, because different geopolitical interests may guide which TikTok you see. That is a kind of soft power TikTok exercises on a global scale, and it doesn't get talked about often enough.

We hope this episode leaves you with a lot to think about in terms of what the risks of TikTok are, how it's operating geopolitically, and what we can do about it.


Marc Faddoul is an algorithm-audit expert, an affiliated scholar at UC Berkeley, and co-director of Tracking Exposed, a nonprofit that investigates the influential algorithms in our lives.

With Tracking Exposed, he is currently scrutinizing the recommender systems of TikTok, YouTube, and PornHub on issues of data malpractice, election integrity, information warfare, and more. His work has been cited by representatives of the US Congress and the European Commission, thanks to Tracking Exposed's unique tools, which inform global tech-regulation debates with current, quantitative research.

As an expert commentator on algorithmic harms, Marc is regularly quoted in the Washington Post, The Guardian, Le Monde and more. He’s also a member of the French Digital Council and the committee advising the French national media regulator ARCOM.

Marc started his career building algorithms but quickly moved to analyzing their impact on society. He carried out research on algorithmic systems in academia (UC Berkeley), startups, and big tech (Facebook AI), which led him to defend digital rights through algorithmic transparency.

Marc holds an engineering degree and an MS in Data Science from Télécom Paris, and an MS in Information Systems from the UC Berkeley School of Information.

Episode Highlights

Major Takeaways

  • Key design differences make TikTok more dangerous than other forms of social media. A user's full attention is on a single piece of content, and the videos tend to be very short to encourage fast swiping. TikTok also harvests data at massive scale: after ten minutes of use, YouTube might collect one data point from a video that you like or dislike, while TikTok can collect a hundred, because you could swipe through a hundred videos in that time. Music is also key to TikTok's draw, since humans have a strong emotional attachment to it.
  • TikTok is able to shape public perception, a capability made even more dangerous during times of war. After Russia invaded Ukraine, ByteDance was able to create a separate version of TikTok for Russia overnight which banned negative talk of the war. 
  • TikTok isn’t transparent about how it promotes or demotes content. Malicious content may appear to be absent from TikTok, yet the platform could still be ‘shadow promoting’ it, and independent researchers are currently the only ones detecting that. In Russia, Tracking Exposed revealed that some of the content TikTok claimed to have banned was in fact still being recommended by the ‘For You’ algorithm.
  • But transparency doesn’t stop a cancer cell from being a cancer cell. Greater transparency is needed about which content gets promoted on TikTok and which content is - or isn’t - available in your country. Yet transparency alone isn’t enough when companies like TikTok can rescind researchers’ access to data about how their algorithms promote and demote content, and meaningful transparency isn’t possible when the volume of content moving through the system is so vast.
  • We need better guardrails so that all of social media can operate in a way that strengthens democratic societies. There's hope that regulation will solve the transparency problem by forcing platforms to open their data to researchers, who can then study the algorithms and their effects on society. Recommender systems could also adopt a Wikipedia-style model and become public goods that aren’t driven by profit. Tracking Exposed has built an interoperable platform where users are empowered to choose and control their algorithm based on their own interests; their proof of concept for such an algorithmic marketplace on YouTube is youchoose.ai. The drawback of choosing your own recommender system is that it doesn’t address the breakdown of shared understanding: you’re still curating your own perspective.
  • Social media is the telecommunications infrastructure of the 21st century, and we need to start treating it that way. We already tightly control our telecommunications networks: we wouldn’t allow Russia or China to install equipment inside our networking infrastructure, because we see it as critical. By the same token, TikTok isn’t just innocuous entertainment for kids who like to dance; it’s directly contributing to the ongoing metacrisis.

Take Action

Share These Ideas