Policy Principles

PURPOSE

These principles are designed to (1) guide CHT's internal development of legislative, regulatory, and other policy proposals, whether alone or together with our partners and collaborators, and (2) provide a framework for evaluating legislative, regulatory, or other policy proposals put forward by parties external to CHT.

SCOPE

These principles are intended to cover a broad array of legislative, regulatory, and other policy proposals relating to “tech policy,” broadly construed. This may include proposals addressing the regulation of technology itself, the classification or legal status of tech companies, the taxation of tech companies or their services, the expansion of existing regulatory authority or the creation of new regulators, competition-related measures, and data governance measures (including privacy legislation).

SUMMARY

1) Put people first

2) Avoid “atomizing” solutions

3) Confront power

4) Address root causes

5) Presume harm

6) Compel caution

7) Embrace complexity

8) Seek sustainability

THE PRINCIPLES

Put people first
The policy or proposal should privilege the rights and interests of people over corporations.

For a variety of reasons, including significant corporate lobbying efforts and (often) superior industry knowledge and expertise, many technology-related laws, regulations, and policies are written from the perspective of companies and tend to privilege commercial interests. We must invert this model and put the rights and interests of actual human beings above corporate interests and agendas. Policies should be written from a people-centered point of view, accounting for human strengths and vulnerabilities, and with particular regard for the safety and wellbeing of children, families, and socially or economically disadvantaged or marginalized communities. Note that this is not the same as “human-centered” design, which still places much of the onus on the individual (see Principle 2 below).


For example:

  • Section 230 of the Communications Decency Act (CDA) is a law written by and for corporations to protect their rights and interests. While Section 230 has provided an open platform for users, including political dissidents, the law as written does not account for the impact on, or harm to, individuals resulting from its implementation. Any effective Section 230 reform must account for such impacts and harms and protect the rights and interests of the people affected.

Avoid “atomizing” solutions
The policy or proposal should, where possible, prioritize social and collective approaches over “atomizing” solutions.

Tech-related policies are often hyper-focused on the individual and on individual harms. Although we are each impacted by technology as individuals, impacts and harms are often disproportionately experienced by certain groups and populations, usually the most vulnerable. In assessing the impact of a particular technology or tech-related policy, we must consider its potential for collective and societal harms in addition to individual ones. Moreover, while as individuals we have limited power and capacity to negotiate rights or to absorb harmful effects, we are stronger together. By thinking about our collective, shared experiences with technology, we can formulate stronger policy solutions.


For example:

  • Privacy laws premised on the individual notice-and-choice (or consent) paradigm are “atomizing.” They divide and conquer, requiring each of us to negotiate for our rights individually. They burden individuals with understanding and assessing the impact of their choices on the basis of obscure consumer-facing notices and limited information, even though the choices of one individual often have a direct or indirect impact on others. A non-atomizing alternative would impose minimum standards that benefit everyone equally, such that the choices of one individual do not undermine the rights or protections of others.

Confront power
The policy or proposal should seek to identify and correct power asymmetries and imbalances.

The wielding of technology is ultimately about power. A policy or proposal that seeks to regulate a technology, its proponents, or its effects must account for power asymmetries and imbalances, including the impact of systemic racism and oppression. It should clearly articulate the legal, financial, and ethical responsibilities and liabilities of tech companies for their products and services. Reining in or rebalancing power may require tools from different legal domains, including consumer protection law, non-discrimination law, product and other liability frameworks, fiduciary law, and competition or “anti-monopoly”-style measures. Effective policies should consider a combination of measures to address the underlying power dynamics in context. Where an imbalance of power remains (or a balance is unachievable), the policy should require those with more power to serve those with less. This includes placing the burden of proof and compliance on those with more power.


For example:

  • Many privacy-related legislative proposals address the mechanics of individual consent without addressing the underlying power dynamics. An effective policy or proposal would consider whether meaningful consent is even possible given the power dynamics and asymmetries between the party seeking consent and the party providing it, and would offer an alternative approach that can restore a balance of power.

Address root causes
The policy or proposal should go beyond symptoms to address the root causes of the problem or challenge at hand.

The true impact of technology on our lives can be subtle and profound at once. Activities that appear harmless in isolation (e.g., scrolling through social media) can be extremely harmful in the aggregate (e.g., social media addiction). Effective policies or proposals will tackle the underlying causes of harmful behaviors rather than merely addressing symptoms. While often more politically challenging at the outset, such policies are likely to be more sustainable in the long term. Relevant root causes in the context of tech policy include underlying business models; vast asymmetries of power, knowledge, capacity, and resources; and an absence of agreed-upon norms in the digital realm.


For example:

  • Technology-related consumer protection or antitrust proposals should consider more than just competitive pricing, especially in the case of “free” services. They should also seek to address the lack of consumer choice, the market dominance of particular companies, technology or vendor lock-in, quality-adjusted factors such as concessions with respect to privacy or data protection, and other factors that may result in poor or unfair choices that undermine autonomy.

Presume harm
The policy or proposal should presume that all technologies, and their applications, are capable of inflicting a variety of harms and seek to identify those harms.

Technology is never neutral. Depending on its use and context, every technology is capable of inflicting a variety of harms. At present, most laws and policies presume that technology is neutral and safe unless proven otherwise. Instead, policy interventions should presume harm and, in accordance with Principle 6 (below), compel caution. First and foremost, this means identifying and naming foreseeable risks and harms and preparing for unforeseen ones. It also means placing the burden of proof on the owner, operator, or proponent of a technology to prove the absence, or effective mitigation, of specific kinds of risks and harms. Further, it may require imposing specific disclosure requirements about the risks and harms of a technology or practice, both to consumers, through warning labels or certification schemes, and to regulators, through certified corporate statements and mandatory reporting requirements.


For example:

  • Data security proposals that focus only on ex-post breach notification miss the mark. A better proposal that presumes harm would impose ex-ante data security-related corporate disclosure requirements on tech companies, with potential criminal penalties for executives who falsify such disclosures (along the lines of the corporate fraud disclosures imposed by the Sarbanes-Oxley Act). Because all data-related activities present risks and have the potential to harm consumers, the burden should be on companies to disclose these risks in advance and to internalize the harms.

Compel caution
The policy or proposal should require a precautionary approach to technology development and deployment.

The age of moving fast and breaking things is over. It’s time to acknowledge the fragility of our systems and the potential for powerful actors to exploit this fragility. Effective laws and policies to regulate technology itself will require companies to assess risks in advance and build in adequate safeguards against potential harms before deploying a technology product, service, or solution. Relatedly, policy proposals that address the status, taxation, or regulation of tech companies should include economic and other measures that incentivize a precautionary approach. Relevant steps may include cost-benefit analyses, impact assessments (potentially including social impact and/or human rights impact assessments), and harm reduction measures, among others. A policy addressing markets or competition may also impose certain fiduciary-related duties or other obligations on technology companies.


For example:

  • A policy proposal to regulate the commercial deployment of advanced facial recognition technology might impose a partial ban or an outright moratorium on the use of such technology until there is sufficient evidence of its safety, functionality, and efficacy; an array of impact assessments has been undertaken; necessity and proportionality have been established; and concrete mitigation and accountability measures are in place.

Embrace complexity
The policy or proposal should reflect the complexity of a problem or challenge by advancing comprehensive and contextualized solutions.

Humane policy proposals acknowledge that everything is connected. Effective proposals do not offer “silver bullet” solutions or indulge in magical thinking; rather, they situate solutions within their complex, interconnected environment. They consider a spectrum of flexible, calibrated options while avoiding reductionist or binary thinking and false dichotomies. While more challenging, this approach is more likely to result in sustainable solutions.


For example:

  • Privacy legislation often relies on reductionism, including false dichotomies such as sensitive vs. non-sensitive data or public vs. private data. The sensitivity of a given data point cannot be predetermined in the abstract but depends on context and an array of factors (such as other data presented along with it, the purposes and uses of that data, and the potential and means to exploit it). For example, grocery store purchases may be highly sensitive if a seller exploits the information to derive health-related insights or shares the data with insurance companies. Similarly, the fact that data is “public” does not mean it should be devoid of protection or rules, as demonstrated by the public’s recent outrage over the scraping of publicly available facial images from social media platforms. A better approach would embrace complexity by emphasizing context.

Seek sustainability
The policy or proposal should privilege sustainable, regenerative solutions over self-terminating quick fixes.

Technology evolves at a lightning pace, while lawmaking and policymaking are slow and deliberative by design. As a result, “harm creation” outpaces “harm response.” We must recognize the self-terminating properties of existing systems and avoid “band-aid” solutions. Where possible, we should prioritize sustainable solutions that will not require frequent changes or the regular introduction of new measures. We should favor adaptive solutions that create positive feedback loops, ensuring ongoing relevance and fitness for purpose. Sustainable solutions are difficult to evade and account for the arc of technology.


For example:

  • Some legislative proposals to address online tracking or the surveillance of web traffic and browsing activity have called for browsers to implement Do Not Track signals or to block third-party cookies. These proposals are inherently short-sighted because vendors can easily circumvent them by replacing third-party cookies with beacons, tracking pixels, fingerprinting, and other (often more intrusive) methods. A more sustainable policy would foresee such workarounds and address root causes, in accordance with Principle 4 (above).

Have comments about these Policy Principles? Let us know. Our small team gets a lot of requests, so forgive us if we are not able to respond directly.