A common misconception is that tech is neutral. But tech is never neutral. We shape technology, and technology shapes us.
Each choice in the design, deployment, implementation, or use of technology has downstream implications for society. These choices influence mindsets, actions, and behaviors. Below we show how humans shape and are shaped by technology, both individually and collectively.
1. HUMANS SHAPE TECHNOLOGY
Humans are not and cannot be neutral. Our choices always aim to realize some of our values, whether that's a sense of self-worth or meaningful relationships. Moreover, we are constantly changing, affected each moment by people, environments, and events, in ways we are aware of and in ways we aren't. Although we often focus on how we shape the world around us, we are also shaped by the conditions we find ourselves in.
As a result, whether we admit it or not, every one of us has a set of underlying conditions: values, beliefs, physical characteristics, monetary resources, and assumptions that come from socialization, education, birth circumstances, and other experiences. These conditions influence the way we see reality and frame the space of possibility for our future.
At the same time, there are infinite conditions we have not been subject to, creating inevitable gaps in our understanding.
Our values and assumptions are baked into how we share and use technology, our perceptions of technology, and as technologists, how we build technology. These decisions affect lives. For example:
- When a technologist sets videos not to show captions by default, the choice adversely affects people with hearing impairments and people who learn better by reading.
- When health organizations don’t prioritize mobile viewing of their service websites, they inadvertently make it more difficult for those with mobile-only devices to get the services they need.
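The caption example can be sketched as a configuration default. Everything below (the `PlayerSettings` class and its fields) is hypothetical and meant only to show that a default value is itself a design decision:

```python
from dataclasses import dataclass

@dataclass
class PlayerSettings:
    # Hypothetical video-player settings. The default chosen for
    # captions_on is a value judgment: False optimizes for viewers who
    # don't need captions and shifts the cost of opting in onto deaf
    # and hard-of-hearing users.
    captions_on: bool = False
    playback_speed: float = 1.0

# Most users never change defaults, so this one field effectively
# decides who gets captions and who has to go hunting in the settings.
default = PlayerSettings()
accessible = PlayerSettings(captions_on=True)
```

Neither default is "neutral"; each one picks whose experience works out of the box.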
Since it is impossible for us as humans to present all available choices with equal priority, what we and our organizations choose to emphasize in the design and deployment of technology reflects our values and priorities.
2. SOCIETY SHAPES TECHNOLOGY
Technology products don’t exist in isolation. They are subject to financial pressures, social and power dynamics, geopolitical pressures, and cultural norms, among others. For example:
Financial pressure: Operating within a profit-centric paradigm, boards, executives, and managers typically feel immense pressure to grow revenue and/or market share. The resulting decisions can override the values of those who build technology. We've seen this happen consistently as attempts by integrity teams and whistleblowers to align technology products with societal benefit have been shut down due to a prioritization of profits over people.
Social dynamics: Power dynamics, unequal access to resources, and other systemic inequities can result in technology that deepens divides in society. For instance, to the surprise of its creators, an experimental Amazon recruiting tool learned to prioritize male candidates because it was trained on historical hiring data that reflected past staffing.
Cultural norms: Regardless of intentions, the customs, ways of interaction, and value systems in society will shape how technology is used.
These factors become even more challenging for technologies that are dynamic in nature, adapting to their environments via machine learning. To make sure that technology upholds our values when some of our societal systems might not, we need to identify these disconnects and design ways to fix them.
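The recruiting example above can be reproduced in miniature. The sketch below is hypothetical (the groups, the data, and the scoring rule are invented for illustration; this is not Amazon's system): a "model" that simply learns each group's historical hire rate will faithfully reproduce whatever skew the history contains.

```python
# Hypothetical, skewed hiring history: (applicant_group, was_hired).
# Nothing here comes from a real dataset.
historical = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

def learned_score(group):
    """A toy 'model': score a group by its past hire rate."""
    outcomes = [hired for g, hired in historical if g == group]
    return sum(outcomes) / len(outcomes)

# The system now prefers group A purely because the past did.
print(learned_score("A"))  # 0.75
print(learned_score("B"))  # 0.25
```

No one wrote "prefer group A" anywhere; the preference is entirely an artifact of the training data, which is exactly the failure mode this section describes.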
3. HUMANS ARE SHAPED BY TECHNOLOGY
Just like the physical design of a building or the way a city is laid out influences how people feel and interact in physical spaces, digital technology influences our experience in digital and physical spaces, both individually and collectively.
For example, a social media environment that emphasizes engagement incentivizes us to post content that will get others to react. And those reactions subsequently shape how we feel about what we posted.
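The engagement incentive can be shown in a few lines. The posts and scores below are invented; the point is only that a feed sorted by predicted reactions decides what gets seen, regardless of what any individual author intends:

```python
# Hypothetical posts with invented engagement predictions.
posts = [
    {"text": "nuanced essay", "predicted_reactions": 12},
    {"text": "outrage bait", "predicted_reactions": 340},
    {"text": "family photo", "predicted_reactions": 55},
]

# An engagement-maximizing feed is, at its core, one sort:
ranked = sorted(posts, key=lambda p: p["predicted_reactions"], reverse=True)

print([p["text"] for p in ranked])
# ['outrage bait', 'family photo', 'nuanced essay']
```

The ranking rule, not the content, determines what rises to the top, and authors learn to write for the rule.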
Technology can never be neutral, because humans shape technology and technology shapes humans. Seeing through the myth of neutrality means considering how our own perspectives shape our role in creating, using, sharing, and thinking about technology.
This piece has been adapted from a lesson in our recently launched course, Foundations of Humane Technology. If you are involved in shaping tomorrow’s technology, we welcome you to register for the course here.
WHAT WE'RE READING, LISTENING TO, & WATCHING
- In Choices, Risks, and Reward Reports, Thomas Krendl Gilbert, Sarah Dean, Tom Zick, and Nathan Lambert provide recommendations to help policymakers assess the safety of reinforcement learning (RL) systems. Reinforcement learning is a machine learning approach in which a system learns to achieve a goal by repeatedly taking actions and adjusting its behavior based on the reward it receives. RL systems are particularly risky because single-mindedly optimizing for a specified goal can produce unintended consequences. To limit these consequences, the authors propose that regulators adopt “Reward Reports” documenting the design choices behind RL systems in “safety-critical domains.”
- A central goal of humane technology is that it enables us to address some of the greatest challenges of our time. In “Cautious Optimism on Tech’s Role in Food Innovation,” Mai Sistla discusses the potential of “food tech” companies to improve our food systems, which are increasingly contributing to climate change and harming public health. The piece is a thoughtful reminder of what’s at stake if we don’t intervene in the food system and how technology, if pursued thoughtfully with research and government investment, could play a vital role in addressing urgent societal issues.
- Meetali Jain, Deputy Director at Reset, joined our Foundations of Humane Technology course community to discuss the tech reform policy ecosystem and how it is evolving. The full video, where Meetali unpacks her thoughts on the tech reform policy landscape around the globe, is now available to the public on YouTube.
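The RL failure mode flagged in the Reward Reports item above, a system optimizing a specified reward at the expense of the outcome we actually care about, can be sketched with a toy epsilon-greedy learner. Everything here (the actions, the click rewards, the "wellbeing" values) is invented for illustration and is not from the paper:

```python
import random
random.seed(0)  # deterministic toy run

# Two hypothetical actions: the agent is rewarded only for clicks,
# even though "wellbeing" is the outcome we actually care about.
actions = {
    "clickbait": {"clicks": 1.0, "wellbeing": -1.0},
    "useful":    {"clicks": 0.4, "wellbeing": +1.0},
}

values = {a: 0.0 for a in actions}  # estimated click reward per action
counts = {a: 0 for a in actions}

for _ in range(200):
    # epsilon-greedy: mostly exploit the best-looking action, sometimes explore
    if random.random() < 0.1:
        choice = random.choice(list(actions))
    else:
        choice = max(values, key=values.get)
    reward = actions[choice]["clicks"]  # the agent only ever sees the proxy
    counts[choice] += 1
    values[choice] += (reward - values[choice]) / counts[choice]  # running mean

best = max(values, key=values.get)
print(best)  # the learned policy favors "clickbait" despite negative wellbeing
```

The learner converges on clickbait because clicks are all it can measure; nothing in the loop ever consults wellbeing. That gap between the reward that is specified and the goal that is intended is what a Reward Report is meant to surface.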