Computers are inherently tools of privilege. Society and the average user often view computers as blank canvases that reflect their user’s intentions, but that is not the case. Rather, computers should be viewed as tools that reflect the privileges and priorities of their historical and contemporary creators.
Modern computing originated in the U.K. and U.S. militaries. Its development has been and continues to be disproportionately guided by men from developed countries, while contributions from marginalized identity groups are often erased. It is critical that we, as members of a society increasingly built around computing technology, especially those of us who work intimately with this technology, deconstruct the history and design of the devices that follow us everywhere we go.
The history of computers is a fascinating tale that reveals a technology created as a tool of war and profit. Some of the earliest computers were funded by the British government to break German codes in WWII, which began a long path of explicit militaristic goals in the design and development of computers.
Once they became tools of war, computers were adopted as tools of profit. Limited to research institutions and large companies, computers were, and still are, designed to minimize cost while maximizing speed and storage capacity. Essentially, computers moved from strengthening the front line to strengthening the bottom line, all while serving the goals of the Western bourgeoisie.
In the ’60s, the early Internet emerged from (surprise, surprise) military funding. Though originally designed to create fast and secure military communication, the Internet made its way into the public domain, reaching predominantly white and middle- to upper-class neighborhoods far before anywhere else. Many low-income communities and communities of color, among them descendants of the 272 slaves sold by Georgetown in 1838, still lack reliable Internet access.
The early public Internet was intended to be ruled by its users, not its technical creators, corporate providers, or government regulators. However, mass data collection by Internet giants and the death of net neutrality mean that user control of the Internet is all but gone.
At almost every step, improvements to computers were developed through publicly funded research, then released to companies that were able to patent and profit off the discoveries. Rather than make the technology a public good, computers were made a tool for private gain.
The history of computer science has also glossed over the identities of individuals critical to computers’ success. While there is a growing push to highlight the contributions of underrepresented groups to STEM (for example, see the recent film highlighting overlooked women of color at NASA, Hidden Figures), many names remain forgotten.
Ada Lovelace, a woman almost never mentioned in the history of computing, is credited with writing the first algorithm intended to be carried out by a machine.
Alan Turing is often lauded, especially in computer science circles, as a brilliant theoretician and creator of early modern computers, but his queer identity, and the fact that his queerness led to his eventual death at the hands of the state he once turned the tide of war for, are often forgotten.
Grace Hopper, another forgotten woman of computer science, developed the compiler, an amazing tool that turns the code we humans can write into the lower-level code computers can actually execute. She also popularized the term “debugging.” While that may seem insignificant, a compiler is the difference between one line of code that says “print(‘I am a computer.’)” and 50-100 lines of code that look something like “LD $R1 $R3 $R4.”
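You can glimpse Hopper’s idea in miniature with Python’s built-in dis module, which shows the many low-level instructions hiding behind a single print statement (a rough sketch of the principle, bytecode rather than the machine code her compilers produced):

```python
import dis

def greet():
    # One line for the human...
    print("I am a computer.")

# ...many lower-level instructions for the machine.
dis.dis(greet)
```

Running this prints a column of instructions with names like LOAD and CALL, one tidy print statement fanned out into the busywork the machine actually performs.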
John Henry Thompson, a black computer scientist, created Lingo, a programming language that allows users to make custom visuals on a computer monitor. Ever made a meme? Thank Thompson.
Like many products made in an increasingly globalized economy, there are neo-colonial overtones to the creation of modern computers. Many of the raw materials necessary to manufacture computers, including coltan and cobalt, are mined in the Democratic Republic of Congo, where militia groups (formed in the wake of U.S. intervention) rely on slave labor to provide these minerals cheaply. After these minerals pass through a long and complex supply chain, major technology firms are happy to accept the eventual product at the lowest price possible. While numerous corporations do have social responsibility frameworks, their claims to an ethical supply chain are usually based on symbolic investigations and self-reporting from their suppliers.
Once these minerals are manufactured into computers and other electronic devices, they are often sold back to these countries at a substantial profit or distributed by non-profits for development purposes. While the latter may be well-intentioned, it does not change the fact that the computers, and the applications that run on them, are often created by individuals far removed from the affected communities.
Beyond neo-colonial implications, computer science struggles to incorporate the users of a product into the design process, a practice known as participatory design. Without participatory design, products generally reflect what computer nerds think the user wants, or what the buyer (who is not always the user) specifies. Oftentimes, this means a product will overlook the complex needs of the user, because the product was based on a page of written specifications. For example, work-tracking software designed for management without worker input may use evaluation metrics that do not reflect the workers’ actual needs and experiences.
After winding their way through the global supply chain, the electronic components meet the tech industry, a particularly white and overwhelmingly male sector of society. Many organizations, including campus groups like GUWeCode and the Georgetown Chapter of the National Society of Black Engineers, are making a strong push to diversify the field, and should be applauded for doing so. However, the problems with computing go beyond equitable participation and representation; they reach into the design of computing itself.
Despite numerous contributions from women and underrepresented groups, the norms of the field were developed almost exclusively by men. Computers are conventionally built by creating one layer, hiding everything about that layer except what it takes in and what it gives back, and moving on. This design approach conforms to traditionally masculine values of independence, autonomy, and distance. Though seldom made explicit, the prevalence of these values underlies an environment made by and for men.
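That layered, black-box style of building is what programmers call abstraction. A minimal sketch of the idea (the function name and values here are invented for illustration):

```python
# A tiny illustration of layered abstraction: the caller sees only
# the function's name, its inputs, and its output, never its internals.
def average(numbers):
    """The 'interface': give it numbers, get back their mean."""
    return sum(numbers) / len(numbers)

# The layer above uses average() without knowing, or caring,
# how it works inside.
result = average([1, 2, 3])
print(result)  # prints 2.0
```

The caller depends only on the boundary, not on the body of the function; everything behind the interface is, by design, forgotten.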
Although some of these norms are changing, the culture of computer science still values those who “tinker with” and “explore” the technology. What this usually means is that those (men) who do not have to care for families, especially young children, and are comfortable isolating themselves, are rewarded for supposed “brilliance,” while traditionally feminine qualities such as empathy, collaboration, and communication earn less praise, even as the industry starts to realize how much it desperately needs those qualities.
To be fair, I don’t know exactly what computers would look like if created outside of capitalism, patriarchy, and neo-colonialism, and I have a hard time believing it will ever be possible to create such a machine. However, by being conscious of the status quo, we can begin to consider alternatives. What does a computer look like that’s designed to share resources? What about one that promotes introspection? A computer that inspires its users to be creative? Mind you, I’m not referring to the applications that run on the computer, but the computer itself.
They’re interesting questions, at least to this nerd, and one day I hope there are enough computer scientists asking them to put genuine effort into finding the answers.