
Policy Brief: State of Global Tech Policy September 2022

This post is part of The Catalyst newsletter series. Subscribe here for new resources 2x a month.

Introduction

Countries have woken up to the power of today’s biggest tech platforms and to the very real harms their citizens experience online. The harms of extractive technology are dividing communities, provoking violence, undermining public health, and weakening public trust in democracy. While the role of traditional media should not be underestimated, it is the new digital media platforms – whose business models are built on maximizing engagement – that undermine our individual and collective well-being. The features of the digital media market that cause these problems are the same features that maximize profit; they will not improve on their own. The dominance of today’s largest social media platforms in mass media is a paradigm shift, and addressing it requires a rebalancing of commercial and democratic interests.

Below is a snapshot of various countries’ efforts to strike a delicate balance between these competing interests. Wherever particular countries are in the process of regulating tech platforms, there is a growing recognition that multiple policy approaches – a public safety lens, an antitrust lens, a lens of regulating commercial control over our data, and a lens of restoring local journalism – will have to be pursued simultaneously to adequately address digital threats to democracy and begin tackling the systemic problems we face.

United States

Hundreds of legislative proposals to regulate tech platforms have been introduced in Congress over the last several years, but none has attracted enough political consensus to make it through the legislative process. The barriers include 1) a focus on content moderation, which arguably conflicts with American free speech protections under the First Amendment; 2) the fact that many proposals address only the symptoms of the problem rather than its root causes; and 3) perhaps most importantly, partisan obstructionism aided by industry lobbying.

While there are a number of policy statements, proposals, and initiatives at the federal level, the political will to reach across the aisle and get policy passed has been lacking. In the current Congressional session, we've seen notable attempts to regulate tech platforms as monopolies, set a federal standard for privacy in technology, and increase online protections for kids. However, with the prospect that Democrats will lose their majority in one or both houses of Congress this fall, the political consensus needed to adopt legislation may wane or dissolve altogether. Opportunities nevertheless exist at the state level, at the executive/administrative level, and in the courts. Some notable examples:

Congress

  • American Data Privacy and Protection Act (ADPPA): a bipartisan federal comprehensive data privacy bill.
  • American Innovation and Choice Online Act (AICOA): a bipartisan federal bill that prohibits self-preferencing by large tech companies.
  • Kids Online Safety Act (KOSA): a bipartisan federal bill that would mandate risk assessments, give parents greater control over their children’s online activity, and require that platforms give kids the option to opt out of algorithmic recommendations and other potentially harmful features.
  • The Children and Teens’ Online Privacy Protection Act (COPPA 2.0): a bipartisan federal bill that would expand existing protections for children’s privacy by banning companies from collecting the data of users 13 to 16 years old without their consent as well as by creating an “eraser” button allowing children to remove their data from digital services.
  • The Journalism Competition and Preservation Act (JCPA): a bipartisan federal bill that would create an antitrust exemption allowing news publishers below a certain size to collectively bargain with tech giants over payment for content displayed on their platforms.
  • Platform Accountability and Transparency Act (PATA): a bipartisan federal bill that would require social media companies to provide vetted, independent researchers and the public with access to certain platform data.

White House

On September 8, the White House hosted a Listening Session on Tech Platform Accountability and identified six broad principles for reform, including promoting competition in the technology sector, providing “robust” federal protections for privacy, protecting kids’ safety online, increasing transparency about algorithms, and stopping discriminatory algorithmic decision-making. The other core principle the White House identified was reforming “special legal protections for large tech platforms,” referring to Section 230 of the Communications Decency Act. The controversial provision provides a liability shield for tech companies over content posted by third parties on their platforms.

Federal Trade Commission

With the appointment of Chair Lina Khan last year, the FTC demonstrated its intent to enforce federal antitrust laws against Big Tech - and to break with the traditional antitrust view of anti-competitive behavior as primarily a matter of consumer harm and price inflation rather than of impacts on competition and innovation. With the confirmation of Commissioner Alvaro Bedoya this year, the FTC has signaled its intention to undertake significant rulemaking, particularly at the intersections of privacy and civil rights, and privacy and children’s digital rights. The FTC recently invited public comment on the harms of businesses collecting, analyzing, and monetizing information about people, and expressed its intent to craft rules that crack down on commercial surveillance and lax data security practices. Also in the privacy realm, the FTC recently used its enforcement power to sue a data broker for selling the geolocation data of hundreds of millions of people, including data tracing people’s movements to and from reproductive health clinics.

States

The states present an interesting terrain of contestation for tech policy. On the one hand, they are the lifeblood of our experiment with democracy - serving as the country’s laboratories of innovation. Where political consensus around tech policy issues is easier to build, states offer a lower-stakes opportunity to advocate for creative policy solutions to tech harms. On the other hand, states’ rights are often invoked as a shield by conservative lawmakers to sidestep federal protections and impose more conservative policies “at home.” Notable examples include state attempts to impose censorship laws attacking social media platforms for their alleged liberal bias, despite significant evidence to the contrary.

The California Age-Appropriate Design Code Act (CA Kids Code) is a notable success in state tech policymaking. The Kids Code requires online platforms to proactively consider how their product design could harm minors, including through algorithms and targeted ads. The groundbreaking bill could become the basis of other state or federal design codes, or could even prompt platforms to proactively change their services for young users across the country, due in part to the difficulty of applying different standards based on location. The California Kids Code was modeled after the UK Age Appropriate Design Code, which became law there one year ago. This transnational experiment in policymaking speaks to the importance of solidarity across borders, recognizing that the operations of tech platforms do not stop at national boundaries.

European Union

The EU is undoubtedly the global leader in tech regulation. Earlier this year, the EU passed the Digital Services Act (DSA) and the Digital Markets Act (DMA) through a trilogue process among representatives of the European Parliament, the Council of the European Union, and the European Commission.

The Digital Markets Act aims to ensure a level playing field for all digital companies, regardless of their size. The regulation lays down clear rules for big platforms - a list of “dos” and “don’ts” - that aim to stop them from imposing unfair conditions on businesses and consumers. Such practices include self-preferencing, where a gatekeeper ranks its own services and products higher than similar offerings from third parties on its platform, and preventing users from uninstalling pre-installed software or apps. Interoperability between messaging platforms will also improve: users of small and big platforms alike will be able to exchange messages, send files, or make video calls across messaging apps. The rules should boost innovation, growth, and competitiveness and help smaller companies and start-ups compete with very large players.

The Digital Services Act, on the other hand, establishes a standard set of rules on digital intermediaries’ obligations and accountability, aiming to create a digital space where users’ fundamental rights are protected. The DSA will give people more control over what they see online: users will have better information about why specific content is recommended to them and will be able to choose an option that does not involve profiling. Targeted advertising aimed at minors will be banned, as will targeting based on sensitive data such as sexual orientation, religion, or ethnicity. The new rules will also help protect users from harmful and illegal content: they will significantly speed up the removal of illegal content, help tackle harmful content that - like political or health-related disinformation - is not necessarily illegal, and introduce better rules for protecting freedom of speech. Another critical piece of the DSA is that it opens up access to platform data for researchers and requires independent audits and risk assessments of these platforms.

The DSA and DMA are expected to begin applying in 2023.

Also in the EU, the European Commission has prioritized creating a regulatory framework that would prevent and minimize AI’s negative effects. The European policy model takes a “risk-based” approach – one that is necessarily somewhat subjective – sorting AI systems by the risk they pose and calibrating regulatory scrutiny, up to an outright ban, accordingly.

 

United Kingdom

Over the last five years, the Online Safety Bill (OSB) has been taking shape and picking up political momentum in the UK. With Boris Johnson’s resignation as prime minister this summer, the OSB was put on hold, and it remains to be seen whether the new prime minister, Liz Truss, will advance the bill as one of her priorities.

The OSB introduces rules for social media and other sites hosting user-generated content, compelling them to remove illegal material from their platforms, with a particular emphasis on protecting children from harmful content. In addition, the largest platforms would have to tackle named categories of "legal but harmful" content, which could include issues such as the promotion of self-harm or eating disorders. Those categories would be agreed upon by Parliament through secondary legislation, and tech firms would then need to clearly explain in their terms and conditions what is and isn't acceptable on their sites and enforce those rules.

The bill would also protect free speech by exempting news content from the regulations and by protecting content defined as being of "democratic importance".

Companies that breach the new rules would face fines that could run into billions of pounds for the largest services, or could see their sites blocked. Communications regulator Ofcom, which is set to become the sector's new regulator, would oversee enforcement of the OSB.

The bill has been highly controversial, seen by some as a form of government censorship of online speech. Others have criticized it for letting news publishers off the hook: under the current draft, they claim, news publishers would effectively have a green light to spread disinformation with impunity. Many amendments have been tabled; whether the new government takes them up, and whether the core of the bill changes, remains to be seen. In particular, Truss’s government may remove the “legal but harmful” category to placate free speech defenders.

Canada

Canada introduced the Online News Act, a bill similar to the Australian News Media Bargaining Code (adopted in February 2021), which would compel large platforms to enter into financial agreements with news publishers for the use of their content or be forced into arbitration. The Canadian government has been motivated to push this legislation through by a wide recognition that large tech platforms have absorbed journalism’s largest source of revenue (advertising), that this has negatively impacted the state of journalism in Canada, and that a healthy journalism industry is important for democratic societies.

In June, a Canadian minister also introduced a federal privacy bill that would replace Canada’s outdated privacy rules, allow regulators to seek large fines from companies that abuse consumers’ data, add new digital protections for kids, and govern businesses’ use of artificial intelligence. It remains to be seen whether it will become a legislative priority once the House of Commons returns from recess, but the fervor to achieve political consensus on privacy rules, and on protecting youth from online harms, is clearly transnational.

 

Australia

Australia pushed ahead with an (albeit limited) Online Safety Act in 2021, addressing online content harms years ahead of most other jurisdictions and establishing a public complaints mechanism and an independent safety regulator. It also enacted the News Media Bargaining Code in 2021 to force the largest tech platforms to ‘pay for journalistic content.’ While problematic in its early days, the Code has led to significant payouts to both large and smaller news publishers. A new government elected in 2022 has announced an intention to review and change the legislative landscape, but no concrete reforms have materialized so far.

Published on
September 19, 2022
