Category: Research

Nothing Special: Standards, Infrastructure, and Maintenance in the Great Age of American Innovation

Despite Bruno Latour’s provocation that “nothing special” happens in laboratories, scholars of science and technology continue to be fascinated by them. And for good reason: laboratories, after all, are crucibles for inventions and innovations. In an age like ours, where innovation-speak reigns, could there be any more urgent task than to understand the sources of inspiration and discovery? Yet our affinity for innovation has a corresponding dark side that manifests in indifference toward existing technological systems. As a scholar and as a citizen, it is this indifference that concerns me most: rather than fixating so much on innovation and discovery, I wish we would spend more time thinking through the dynamics of standardization, infrastructure, and maintenance. The neglect of infrastructure, for example, is especially evident in public policy, as the comedian John Oliver showed in a recent rant. Oliver boiled the issue down to its tragicomic essence: we need public funds to maintain old technologies such as bridges and dams, but our elected officials prefer to break out their oversized scissors and celebrate something new.

Images: John Oliver on Last Week Tonight, via YouTube.com (top); ribbon cutting at the U.S. Naval Research Laboratory, March 2012 (bottom) (photo credit: Jamie Hartman, public domain). (read more...)

Decolonizing Design Anthropology with Tinn

In fall 2014, I began building Tinn, a health data tracking application for people living with tinnitus. I approached building Tinn as an opportunity to explore what a socially conscious, feminist, and anti-colonial process for developing mobile applications might look like. The idea of pairing design, building, and anthropology is hardly all that innovative; “design anthropology,” a subfield of cultural anthropology focused on empirical studies of use cultures, has been around since the 1980s (at least). What sets Tinn apart, however, is my commitment to working with communities of color in Portland, OR, that have been alienated by and from the technology and application development industry because of intersecting and juxtaposed systems of gender, racial, and access inequality. Perhaps one of the central epistemic problematics for the project, then, can be posed as a question: Can we decolonize design anthropology, and to what degree of success? What might this entail?

Screenshot of the title deck for Tinn (by author).

Decolonization is slippery. My “academic anthropology” is a historical ethnography of the ways media scientists after World War II shaped (and continue to shape) the gendered contours of global media flows in the British Empire, with a particular focus on sound, listening, and citizen-subject formation in Gibraltar. My work on Tinn gave me the opportunity to transform my critique of the violence done by media scientists into the starting point for a more ethical approach to user experience research with marginalized Native American, Latin@, migrant, and African American Oregonians living with tinnitus in Portland. Yet what I thought of as decolonizing and what my collaborators thought of as decolonizing were at odds in some instances. For one, while decolonizing anthropology attempts to re-balance the scales of recognition and redistribution in the research process, it is much more difficult to reconcile the colonial histories of designer and designed-for. Yet, for my collaborators, this division didn’t actually matter. As Nephi, one of my politically astute collaborators, put it, “the ethics are too heady when we need material help. Someone has to do that work. It’s you today.” While Tinn began with my commitment to making the project open source (as resistance to the privatization and commoditization of collaboration — it’s not that simple), Nephi protested: “My labor is there, too. You’d give it away for free? There’s a long history of white men giving away the work of my people for free.” I said it wasn’t simple. While there were times when my collaborators and I didn’t agree on what constituted decolonization, we did agree on one thing: data poses a particularly tricky sociohistorical landscape for thinking about recognition, redistribution, and reconciliation. The rest of this post is dedicated to the complications of tracking data about tinnitus, and tracking data about vulnerable and/or marginalized users with tinnitus. (read more...)

Who speaks for soil?

Finally! 2015 is the year of soils! Ready the celebration. Polish your spade, pick, and shovel, and carefully wrap those gifts of organic fertilizer you’ve been hiding away. It’s going to be a hell of a party. Humor aside, soil is obviously important in a number of very complex ways. The Food and Agriculture Organization of the UN (FAO) is spearheading the “2015 International Year of Soils” initiative to raise awareness of soil issues for food systems and broader environmental concerns. The director of the FAO, José Graziano da Silva, had the following to say of the importance of soil: “The multiple roles of soils often go unnoticed. Soils don’t have a voice, and few people speak out for them. They are our silent ally in food production” (as quoted on FAO’s website). Yet as I’ve found researching soil conservation in Haiti in 2012 and examining the history of soil conservation more broadly, it seems that many people have spoken out for soils. In fact, through the panic surrounding the 1930s Dust Bowl crisis in the United States, soil erosion arguably became the first global environmental problem (Anderson 1984). This rapid spread of environmental concern highlights the way that soil has, in the past, captured the imagination and emotion of governments around the world. But the spread of soil conservation was not the seemingly de-politicized “awareness” campaign that we’re presented with by the FAO. Rather, in the 1930s, soil conservation was rooted in a desire to control and manipulate rural farmers. So while I’d agree with Mr. da Silva that soils do not have an “audible” voice, I’d argue that we need to pay far more attention to who speaks for soils and why. (read more...)

Let’s Think about the University: Anthropology, Data Science, and the Function of Critique

There have been surprisingly few sustained, ethnographic studies of the university that aim to understand it as an institution devoted at once to the production of knowledge and technologies, the circulation of those products, and the cultivation of particular types of subjects. Ethnographers have largely worked at it piecemeal, with admittedly excellent work from both the anthropology of education and of science carving out various areas of inquiry: classrooms, laboratories, admissions offices, student groups, start-up incubators. To my mind, the lack of a synthetic approach to the knowledge work going on in the university might be due to the disappointing fact that these two camps within anthropology don’t talk to each other very much. In part, this is a result of their different goals, positions within the ecology of anthropological knowledge production, possible sources of research funding, and available career paths both within and without academia; yet, despite the sociological intelligibility of this lack of communication, it remains intellectually unfortunate. As the business of research and education becomes increasingly corporatized, increasingly shaped by wider forms of rationality that rely upon quantification, standardization, and the devolution of responsibility to the individual, it becomes correspondingly urgent to develop a rigorous, holistic understanding of the university as such. This has only been underscored by my fieldwork among Russian data scientists, who are themselves involved in the ongoing reorganization of higher education here. That is to say, the neoliberal university qua institution, with its own internal forms of organization and expertise as well as its place within the broader political economy, deserves to be the object of a newly shared inquiry. The current shape of the university has profound implications for the professional lives of anthropologists of both science and education, and similarly thoroughgoing epistemological consequences for their ongoing, ultimately complementary attempts to understand how contemporary people make knowledge. I’m working through the latter half of this proposition in my current research project. Data science has emerged as a key site of intervention into the educational system in Russia; elites from both industry and the academy are working together to modernize and re-purpose Russia’s formidable pedagogical infrastructure in pure mathematics and theoretical computer science to train a new generation of algorithmists, developers, and programmers in both the practical skills and professional attitudes that they see as necessary for the creation of a truly Russian knowledge economy. The result has been both the creation of a number of hybrid, industrial-academic institutions and wide-ranging modifications to curriculum and requirements at more traditional institutions. These changes are occurring within a broader context of profound reforms to post-graduate education and the science system more generally. (read more...)

How influential was Alan Turing? The tangled invention of computing (and its historiography)

Alan Turing was involved in some of the most important developments of the twentieth century: he invented the abstraction now called the Universal Turing Machine that every undergraduate computer science major learns in college; he was involved in the great British Enigma code-breaking effort that deserves at least some credit for the Allied victory in World War II; and last but not least, while working on building early digital computers post-Enigma, he described — in a fascinating philosophical paper that continues to puzzle and excite to this day — the thing we now call the Turing Test for artificial intelligence. His career was ultimately cut short, however, after he was convicted in Britain of “gross indecency” (in effect, for being gay), and two years later he was found dead in an apparent suicide. The celebrations of Turing’s birth centenary began three years ago in 2012. As a result, far, far more people now know about him than perhaps ever before. 2014 was probably the climax, since nothing is as consecrating as having an A-list Hollywood movie based on your life: a film with big-name actors that garners cultural prestige, decent press, and of course, an Academy Award. I highly recommend Christian Caryl’s review of The Imitation Game (which covers Turing’s work in breaking the Enigma code). The film is so in thrall to the Cult of the Genius that it adopts a strategy not so much of humanizing Turing or giving us a glimpse of his life, but of co-opting the audience into feeling superior to the antediluvian, backward, not to mention homophobic, Establishment (here mostly represented by Tywin Lannister, I’m sorry, Commander Denniston). Every collective achievement, every breakthrough, every strategy, is credited to Turing, and to Turing alone. One scene from the film should give you a flavor of this: as his colleagues potter around trying to work out the Enigma encryption on pieces of paper, Turing, in a separate room all by himself, is shown building a Bombe (a massive, complicated machine!) single-handedly, armed with nothing but a screwdriver! The movie embodies a contradiction that one can also find in Turing’s life and work. On the one hand, his work was enormously influential after his death: every computer science undergrad learns about the Turing Machine, and the lifetime achievement award of the premier organization of computer scientists is called the Turing Award. But on the other, he was relatively unknown while he lived (“relatively” being a key word here, since he studied at Cambridge and Princeton and crossed paths with minds ranging from Wittgenstein to John von Neumann). Perhaps in an effort to change this, the movie (like many of his recent commemorations) goes all out in the opposite direction: it credits Turing with every single collective achievement, from being responsible for the entirety of the British code-breaking effort to inventing the modern computer and computer science. (read more...)

The Entrepreneurial Future

Georges Doriot, who founded the first publicly traded venture capital firm in 1946, arguably announced a new regime of speculative capital when he said: “I want money to do things that have never been done before” (Ante 2008). In the years immediately after World War II, the establishment of venture capital firms was crucial to the ascent of a new kind of commercial enterprise, one that has profoundly influenced the development of digital technologies on a very broad scale. It was with the creation of the first venture capital firms that a financial network to support technology startup companies began to form. The fact that the earliest Silicon Valley startups were funded by venture capital investments is an indicator of the degree to which the developmental trajectory of personal computing has been intertwined with that of finance capital. Fairchild Semiconductor, for example, was the first startup funded by venture capital (in 1957), and it launched numerous “spin-off” companies that were collectively responsible for the innovations that enabled what became the microelectronics industry. Since then, of course, venture capital has grown into a powerful industry that directs vast financial resources into technology startup companies. But venture capital investment doesn’t only fuel the tech startup economy — it actively shapes it. Research on Silicon Valley’s high-tech industry suggests that venture capitalists’ importance to processes of innovation has more to do with their role in selecting promising companies than with the financing itself (Ferrary & Granovetter 2009). Beyond choosing the criteria for valuation by which the potential commercial success of startups is measured, they determine which innovations will even have a chance to enter the market. The result is that Silicon Valley innovation is guided directly by finance capital’s future-oriented logic of speculation. Companies with few tangible assets pursue funding without which they will have little chance to successfully launch their products and, as business news coverage attests, some companies are valued at billions of dollars without demonstrating that they have the means to become profitable. What matters is keeping the possibility of a market open. Consider, for example, Snapchat, a popular photo and video sharing app that has expanded through significant venture capital investments. Last year the company was valued at $10 billion — mostly on the basis of its potential to reach users and create an audience — despite the fact that it generates almost no revenue. (read more...)

Crowdsourcing the Expert

“Crowd” and “cloud” computing are exciting new technologies on the horizon, both for computer science types and for us STS types (science and technology studies, that is) who are interested in how different actors put them to (different) uses. Of the two, crowd computing is particularly interesting — as a technique that both improves artificial intelligence (AI) and operates to re-organize work and the workplace. In addition, as Lilly Irani shows, it also performs cultural work, producing the figure of the heroic problem-solving innovator. To this, I want to add another point: might “human computation and crowdsourcing” (as its practitioners call it) be changing our widely held ideas about experts and expertise? Here’s why. I’m puzzled by how crowdsourcing research both valorizes expertise and, at the same time, sets about replacing the expert with a combination of programs and (non-expert) humans. I’m even more puzzled by how crowd computing experts rarely specify the nature of their own expertise: if crowdsourcing is about replacing experts, then what exactly are these “human computation” experts themselves experts on? Any thoughts, readers? How might we think about the figure of the expert in crowd computing research, given the recent surge of public interest in new forms of — and indeed fears about — this thing called artificial intelligence? (read more...)

Country in the Cloud

We are accustomed to thinking of the “cloud” as a placeless, formless mass of data floating “out there.” It has even been argued that new computer technologies and the movement of companies’ data “to the cloud” might so transform our inherited notions of time, space, and power that they could mean the end of history, geography, and power. The case of “e-Estonia,” however, challenges this notion: Estonia is a country which, unlike people and companies going “to the cloud,” hopes to actually move itself “into the cloud,” with profound implications for how we understand both the cloud metaphor and geopolitics in the digital age.

e-Estonia

Estonia is a small former Soviet republic in northern Europe, with a territory of only 45 thousand square kilometers and a population of just 1.3 million. Since the collapse of the Soviet Union in 1991, it has made a number of moves towards building a digital state, or, as it is often referred to, an “e-Estonia.” As a Research Fellow with the Centre for Science and Technology Studies of the European University at St. Petersburg, I have been studying how, with e-Estonia, “the cloud” actually becomes a new type of space, the contours of which affect other concrete spaces and feed into a new type of nation-building project. (read more...)