Can We Govern AI? with Marietje Schaake

April 21, 2023

When it comes to AI, what kind of regulations might we need to address this rapidly developing new class of technologies? What makes regulating AI and runaway tech in general different from regulating airplanes, pharmaceuticals, or food? And how can we ensure that issues like national security don't become a justification for sacrificing civil rights?

Answers to these questions are playing out in real time. If we wait for more AI harms to emerge before proper regulations are put in place, it may be too late. 

Our guest Marietje Schaake was at the forefront of crafting tech regulations for the EU. In spite of AI’s complexity, she argues there is a path forward for the U.S. and other governing bodies to rein in companies that continue to release these products into the world without oversight. 

Correction: Marietje said antitrust laws in the US were a century ahead of those in the EU. Competition law in the EU was enacted as part of the Treaty of Rome in 1957, almost 70 years after the US.


Marietje Schaake is the international policy director at the Stanford University Cyber Policy Center and an international policy fellow at Stanford’s Institute for Human-Centered Artificial Intelligence.

From 2009 to 2019, Marietje served as a Member of the European Parliament for the Dutch liberal democratic party, where she focused on trade, foreign affairs, and technology policy. Marietje is an (Advisory) Board Member of a number of nonprofits, including the Mercator Institute for China Studies (MERICS), the European Council on Foreign Relations (ECFR), the Observer Research Foundation (ORF), and AccessNow. She writes a monthly column for the Financial Times and serves as (independent) Special Advisor to Margrethe Vestager, Executive Vice-President of the European Commission.

Episode Highlights

Major Takeaways

  1. There are fundamental challenges to regulating AI that make it different from other technologies. Data sets remain in proprietary hands, inaccessible to lawmakers and journalists. Updates and hyper-personalized experiences make the product or service fluid, which in turn makes it harder to regulate. Global companies operate from their home jurisdiction but reach consumers in entirely different contexts on the other side of the world.
  2. We shouldn’t be discouraged by the complexity of the problem. It’s similarly difficult to regulate chemicals, financial services, and food. There’s enormous complexity, wide variety, and constant innovation in many sectors beyond tech, and yet regulation works, for the most part.
  3. The US is behind the EU when it comes to tech regulation. There’s a twin set of regulations in the EU that address related but different aspects of tech company business models: the Digital Services Act and the Digital Markets Act. These regulations are still proving their effectiveness, and are meant to confront speech and content issues, trust and disinformation issues, and the imbalance of power between large tech companies and the public.
  4. Enforcers should be empowered with stronger mandates and capabilities. Budgets for enforcers are undersized; there’s relatively little investment in regulatory enforcement. Sanctions need teeth in order to be felt: levying a $3 billion fine on a company that’s worth $200 billion is simply a cost of doing business at that scale.
  5. The next decade will be crucial for AI policymaking. Most of the regulations that will be significant in dealing with AI and other emerging technologies are still in the pipeline. Regulations should be flexible so they can expand to cover new threats and challenges as they surface. Success will depend on the rigor and effectiveness with which enforcement happens.
