(This post was authored by Cyril Focht)
When we look at software, and think about the process of creating it, it’s easy to assume that software is neutral: politically neutral, ideologically neutral, and values-neutral. Software is mere calculation, after all, and how could the numbers that comprise those calculations be anything but neutral? Because this is such a natural line of thinking, we as engineers and creators of software often aren’t cognizant of the ways in which we need to be critical of that software. Software may be a series of calculations, but it is human-motivated: implemented by humans and used by humans. It’s in those human motivations that values and ideology emerge.
There are probably as many different ways to understand these ideologies as there are ideologies themselves, but in this blog post I’ll demonstrate two perspectives we can use to understand how that ideology comes into play: software as a model, and software as a tool. A model in the sense of a scientific model, like any number of equations used in physics or a scale model in architecture, and a tool in the sense of, well… a screwdriver or a hammer.
Let’s start with models, which are representations of human understanding of the world around us. All software is built on a model in some way, from models of satellites’ orbital trajectories in GPS systems to the Fahrenheit-to-Celsius conversions many of us wrote in our first programming class (arithmetic itself is a model). Again, it’s easy to get caught in the thinking that the world around us simply is, and that therefore the way we understand it is neutral, but the way we represent that understanding is a decision. Consider two functions that calculate the Fibonacci sequence, one iterative and one recursive: as the sketch below shows, they are two fundamentally different ways of understanding and representing the same phenomenon. As someone with a lot of pride in my French heritage, I really like to use the adoption of the metric system to demonstrate how this affects us in practice. We’ll have to delve into some of the history of the metric system to see how it was influenced by human values.
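Before we turn to that history, here’s a minimal sketch of the Fibonacci point in Python (the function names are just illustrative):

```python
def fib_iterative(n: int) -> int:
    """Model the sequence as a process: start from the base cases
    and step forward, accumulating one term at a time."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a


def fib_recursive(n: int) -> int:
    """Model the sequence as a definition: each term simply *is*
    the sum of the two terms before it."""
    if n < 2:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)


# Both encode the same phenomenon and agree on every value...
assert [fib_iterative(n) for n in range(8)] == [0, 1, 1, 2, 3, 5, 8, 13]
assert [fib_recursive(n) for n in range(8)] == [0, 1, 1, 2, 3, 5, 8, 13]
```

Both produce identical numbers, but they represent different understandings: one sees the sequence as a stepwise accumulation, the other as a self-referential definition. And the choice isn’t consequence-free, either; the naive recursive version recomputes its subproblems and slows down exponentially as n grows.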
The metric system was among the first pieces of legislation passed by the new government following the French Revolution. Now, why would a system of measurement have been important enough to be one of their first decisions after one of the most violent revolutions in human history? It wasn’t because they thought science was really neat. The short-short version is that before the revolution, France had units of measurement in common use, but to say they were inconsistent would be putting it lightly. Something that would often happen is that the measures used by tax collectors weren’t accurate to what was owed: if I owed some weight of wheat, they would use a heavier weight on their scale, skim the difference off the top, and keep it for themselves, and if they were challenged they had the legal authority to say, “No, my measure is correct, this is what you owe.” A more consistent, accessible system gave common people the ability to challenge their tax collectors by saying “I don’t think your measure is a kilo, like you say it is, so I’m going to measure out a liter of water and compare that to your measure.” Since a liter of water weighs one kilogram by the system’s own design, if the two don’t match, the tax collector isn’t able to skim off the top. And since this adoption by the French government was deeply intertwined with the very invention of the metric system, we can see that its core values as a system of measurement include pushing back against exploitation and holding those in power accountable.
Considering software instead as a tool is, I’m sure, how most of us who work in computing are already used to thinking about it, so I won’t dwell on demonstrating why that perspective is a useful one. To compare software again with an entity outside computing, let’s consider a firearm. Before I’ve done any more than invoke the topic, I’m sure you’ve already considered a handful of different ways I might approach this discussion, and many more political positions you’ve seen people take up in public discourse. That in itself demonstrates how our tools carry ideology better than any specific analysis I have to offer, but I will still offer some ways we can consider ideology when thinking about our tools. The most obvious questions concern the tool’s intended use: what goal does it help achieve, how does it help do so, and why is that goal of value? Sometimes the intended use on the part of a creator differs from its use in practice once it’s in the hands of others. Maybe less obvious, but nonetheless much of why people have such strong emotions about firearms as a topic, are the cultural context in which that tool exists and how the tool affects that cultural context. Consider the cloth facemask, which existed in a drastically different context a year ago than it does now; as such, any analysis we conduct on a facemask as an artifact will have changed just as dramatically in that year.
These are all the kinds of things we should be mindful and critical of when we’re creating software. This applies, of course, to the software hotly discussed in public discourse, such as social media, autonomous vehicles, AI and machine learning, automation, and surveillance technology, but it’s equally true for software that seems more mundane, like banking and tax filing software, word processors and spreadsheets, mailing systems, or OS kernels. Everything we make requires a series of decisions, and every one of those decisions reflects our values as designers.