
What Would A Techno-Ethics Look Like?

Each year, Platypus invites the recipients of the annual Forsythe Prize to reflect on their award-winning work. This week’s post is from the 2017 winner, Sareeta Amrute, for her book Encoding Race, Encoding Class (Duke, 2016).

What would a techno-ethics look like? This question persists long after this book has been written and edited, proofed and published; perhaps it lingers, too, in the minds of its readers as they ponder the pathways and dead-ends digital technologies lay down. Digital technologies build on previous iterations of capital and labor, as well as social and environmental relations, even as they materialize new relations. The short-term visa regimes that most tech companies make use of build on a long history of mobile migrant labor, free and unfree, that has been used to build other kinds of infrastructure, from plantation economies across the British Empire to railroads in the United States and glass-and-steel skyscrapers in Germany. Similarly, the infrastructure of cloud computing relies on previously established military bunkers and railway lines, even as it creates unprecedented demands for energy. An ethical response to these dynamics would produce regimes of care that unite knowledge of subjects’ evolving relationships with technologies with the goal of reducing the spaces of domination those technologies create. A techno-ethics should provide guidance for those who develop, use, and make policies about technologies.

I am particularly honored to receive this prize in the name of Diana Forsythe, whose scholarship uncovered latent action and knowledge in technological systems. One strand of Forsythe’s scholarship investigates interactions between persons and technologies. When doing research on medical technologies, she discovered what other experts could not recognize—the tacit knowledge, fears, and desires that govern how we interact with scientific enterprises. In one such study, Forsythe discovered that migraines make sufferers think of death by brain tumor and stroke, a fact that neurologists working to prescribe medication could not recognize but that was revealed through extended, post-visit interviews. Forsythe’s scholarship placed this tacit knowledge, discoverable through ethnography, at the center of her analysis of doctor-patient communication. Ethnography, by gathering information in real-world situations and holding onto the complexity of those situations, can surprise, overturn assumptions, and then help us design our world differently.

Book cover of Encoding Race, Encoding Class: white text of the title over a dark blue rectangular field, laid over a color photograph of a busy tram station in Berlin.

Encoding Race, Encoding Class: winner of the 2017 Diana Forsythe Prize

When I was conducting fieldwork with programmers from India who had taken short-term contracts for jobs in Berlin, I found similar surprises in patterns of work, play, banter, and serious discussion. The most important surprise was that programmers from India often tacitly voice criticisms of the global tech industries that employ them. Programmers from India pursue membership in elite cosmopolitan circles even while they chafe against their sequestering in racially marked grunt-labor categories. These findings were somewhat different from what I set out to discover. I was initially most concerned with the interaction between migration law, designed to regulate immigration and national identity, and the tech industries’ need for programmers. These concerns developed from watching debates about short-term visa takers from India unfold in the public sphere—in newspapers, on television, and in parliament. When I began meeting programmers from India, observing how they worked in their offices, and talking with them about how they came to be in Berlin, I noticed their concern with enjoying their lives (not just working them) and their Euro-American managers’ concern with harnessing the cultural knowledge they believed Indian programmers brought with them in order to develop India as an emerging market for software and services. These observations, gleaned through ethnography, became the scaffold for my theory of how race and class work in tech industries. I began to recognize that racial difference is prized as a source of new ideas even while it becomes an alibi for the differential treatment of temporary workers. Class emerges relationally through racial and other categories of identity. Middle-class aspirations among programmers from India, for instance, are lived through the simultaneous valuation of race as a marker of novelty and devaluation of race as a marker of uncreative, replaceable labor.

A man sits by piles of books for sale on a sidewalk.

A bookseller in Pune, India, who specializes in selling computer programming manuals to college students. Photo by the author.

In the second, incredibly prescient, strand of her research, Forsythe investigated how women are deleted from accounts of computer science. Their labor, while essential to the work of the lab, was not considered part of lab work itself. This made women, wrote Forsythe in 1989, both invisible and hyper-visible. They shuttled between doing unrecognized labor and being recognized largely for their exotic and sexualized presence in what was considered by all to be a male domain.

Ethnographic fieldwork taught me to notice the diminutions and over-inflations of bodies in technical production. Listening to what people say, noticing what they do, and asking people what they think should be done showed me all the kinds of labor hidden behind the image of a lone white man cutting code, from bug testing to answering questions about foreign cultures for the sake of building cosmopolitan office camaraderie. The struggle for this labor to be recognized and rewarded provided programmers from India a means to critique the value of hard work and hierarchies of creative and grunt labor in IT firms.

Listening for dissenting voices happens in ethnography, in close reading, in the analysis of images, and in historical investigations. All of these methods will be required to produce a techno-ethics sensitive to the complexities of real-world contexts and oriented towards reducing relations of domination. What would a techno-ethics look like? It could start with a few simple principles:

  1. Coding is labor. It is a kind of work that makes use of all a person’s faculties, both mental and physical. Coding includes hunching over a laptop for hours on end, and can often involve solving problems that coders themselves find uninteresting. Though it can be fun, and coders can get into a flow, it can also be a grind, repetitive and unfulfilling. As labor, it is valued according to the relationship it has to an entire series of commodities and their socially determined values. Most importantly, coding is enmeshed in and supported by many other types of labor. Janitors and cooks, messengers and box loaders, customer support technicians and content moderators are all part of what we call the tech industry, though they do not come to mind readily when we pronounce the place-names ‘Silicon Valley’ or ‘Bangalore’. Thinking of all these types of labor, and how they are differentially valued, as well as thinking about the manual, physical part of coding work, would accomplish at least two things. First, it would reduce the ‘specialness’ of coding, an idea that encourages engineers and executives to separate the labor of solving engineering problems from, on the one hand, the effects that coding projects have on communities and, on the other, the work of non-engineers necessary to produce these solutions. Technologists are as important as other kinds of workers, but no more so. Second, it would place programmers within a larger circle of workers in tech firms, a circle of all workers who might decide that they want to determine in concert what the conditions of their labor should be and to what ends their labor should be put.
  2. Digital technologies have effects that cannot always be predicted. Engineering problems are generally considered problems that can be tested to see whether a solution works. Then, to the best of an engineer’s ability, those tests determine whether a structure—like a bridge or a road—is ready to withstand the vagaries of traffic, weather, and wear. But coding projects might be a bit different. Though testing can determine what works within the parameters set out by a project, it can predict very little about how the project will unfold in real time, in the hands of real people or bots. This means that good intentions on the part of developers, even good testing, are not enough. Code makers and project-builders have to be willing to revisit their products iteratively. Firms should have sections dedicated to asking: What effect has this product had on communities? What uses are communities, especially underprivileged ones, putting this product to? How can we amplify positive effects and discourage negative ones? Admittedly, tech companies are now trying to minimize the negative effects of their products, but recognizing that all products will have unanticipated effects makes this process ongoing and open-ended. A technology will have to be continually reevaluated in light of the relations of domination or refusal it makes possible over time.
  3. Data triggers obligations. Modern scientific and cultural enterprises, from biomedicine to botany, have depended on the collection of data from global, and often colonially subjugated, populations. The history of these enterprises is a history of stolen cell lines, bodies, plants, and intellectual property. The communities, families, individuals, and nations from which this data and bio-matter were collected were not asked for permission, received no compensation, and had no say in how this data could be used. In the case of biomedical data, stricter regulations are now in place to cover at least the issue of permission, while intellectual property rights over local knowledge of plants and medicines can be asserted juridically. None of these measures has yet made its way into the kinds of data collection projects routinely undertaken by tech firms. When data is collected, it should be thought of as something given, which necessitates a return, and collection should happen with the permission of those from whom it is taken. The conditions of providing data and the obligations set out for doing so need to be worked out in concert with the communities who produce that data.

Towards the end of my book, I write about a young woman who embodies for me the hopes and fears that tech industries have produced. To move towards the hope, I argue, means finding ways to ask what a good life should look like, and to ask this question in as many ways as possible, over and again. Ethnographers are often asked for solutions to the problems they describe in their books. The greatest gift ethnographers can provide towards finding solutions comes from listening to how lives are lived in concert with and through contradiction. From this listening, sometimes principles might be distilled that open a problem to surprise and to revision.
