In the opening scene of the recent NOVA documentary Rise of the Drones, the narrator ominously tells us that a revolution is underway. “Are we,” he leads, “approaching a time when movies like The Terminator become our reality?” A clip from Terminator 3 fades in and out, two humans cowering in fear and whispering, “Oh God. It’s the machines. They’re starting to take over…” The narrator continues, “a time when machines fly, think, and even kill on their own?”
My dissertation research focuses on how technologies used in remote warfare are changing conceptions of warfare and experiences of agency within human-computer systems. These technologies include what the Air Force prefers to call RPAs (Remotely Piloted Aircraft), also known as UAVs (Unmanned Aerial Vehicles) and, most commonly, as drones. While my fieldwork looks at individual experiences and institutional narratives within military communities, a larger backdrop of my research is popular conceptions of robotics and drones, informed by mass media coverage and the long-standing cultural paranoia that technological creations, once unleashed in the world, will escape human control and stop at nothing short of humankind’s destruction. In an important sense, these extreme narratives obscure more consequential and immediate discussions that should be had about robotics and artificial intelligence, and more specifically about the use of unmanned systems.
It will likely not be a surprise to CASTAC readers that popular understandings of technologies, especially in the field of robotics, are quite different from the on-the-ground reality of technological capabilities. Lucy Suchman’s blog, Robot Futures, has looked at numerous examples of such misconceptions. Nor is it unexpected that the terms “autonomous” and “unmanned,” so frequently applied to robotic technologies, are misnomers that obscure the very real human labor involved, from producing and operating hardware (from flying aircraft to interpreting data) to coding software. From Marx’s Capital to the more recent work of Shoshana Zuboff, Lucy Suchman, and many others, research has shown that advances in automation and robotics do not so much do away with the human as obscure the ways in which human labor and social relations are reconfigured.
And yet the circulation of terms like “autonomous” and “unmanned” continues to frame much of the public discussion surrounding robotics in areas as diverse as the military and healthcare (see, for instance, this month’s article in The Atlantic, “The Robot Will See You Now”). Although drones are termed “unmanned” aerial vehicles, every operation requires at least three human Air Force personnel and sometimes a team of fifteen or more. (The precise nature of CIA-operated drones, a program distinct from the Air Force’s, is not officially public.) This is in addition to the people who code the software and produce the computer and drone hardware, modes of labor that have traditionally been marginalized in conceptions of computing. And it is, of course, also in addition to the human lives on the ground, over which the drones fly.
The final section of the NOVA documentary looks at new sensing technologies under development. The term “autonomous” is used numerous times and yet never defined. This is a problem because computer science understandings of autonomy differ substantially from popular ones. In computer science, an autonomous robot is one with a sophisticated awareness of its surroundings and the ability to react based on that awareness. In the popular understanding, an autonomous robot would act entirely on its own and of its own accord, like the Cylons or the Terminator.
Paul Eremenko, the Deputy Director of the Tactical Technology Office within DARPA, the Pentagon’s blue-sky research and development agency, says in an interview clip: “I think if we were to ask most autonomy researchers or most AI researchers about if the ‘rise of the machines’ type scenario is a real concern, their response would be, ‘We should be so lucky.’ In fact, if we could get little slivers of that kind of adaptive or cognitive capability, that would be a very significant breakthrough over where we stand today.” Yet the subtext of the documentary and its visual rhetoric suggest otherwise. A low-pitched sound pulses ominously throughout. Drones fly through the sky, isolated from their ties to human work and intention. (Indeed, most media stories about drones are illustrated with photographs of the machine in flight, humans absent from the frame.) In the final scene, the narrator says, “The ability to respond to the unknown may be the final hurdle if drones are ever able to fully replace manned planes… and start making decisions on their own.” Ominous indeed. A clip of an interview with Abe Karem, the main engineer of the Predator platform, plays in response: “I think we’re far. But let me say, I’m the last guy who says impossible.” The tone of the documentary pushes the audience toward this extreme paranoia even as the narrator reassures us that a machine still can’t do what a human can.
Although it feels like a false and placating reassurance, it is true. Humans are implicated in every moment of remote warfare. My hope is that my research can bring greater understanding to multiple communities about the social implications of remote warfare. I look forward to sharing more of my research in the coming months.
Along those lines, I also want to take this opportunity to let the CASTAC community know about a group that is beginning to formally coalesce, thanks to the organizing of Zoe Wool and Ken MacLeish: the Military and Security Critical Interest Group, which joins folks working ethnographically with critical conceptualizations of military life and security institutions, technologies, and populations, or related issues. If you’d like more information or would like to join the listserv, email MSCIGListserv@gmail.com.
I’ll leave you with a few links for further exploration:
Marcel LaFlamme, a fellow graduate student in the dissertation phase, on the emerging UAV industry in North Dakota.
Jessica Riskin’s classic essay, “Eighteenth-Century Wetware,” which explores the socio-historical specificity of what constitutes the imitation of life by machines.
As a New Year’s resolution for 2012, I started a WordPress blog titled Robot Futures (see http://robotfutures.wordpress.com/about-this-blog/). The idea was to do some writing about developments in robotics and artificial intelligence, particularly in the area of remotely controlled war fighting, that could be more timely and critical than journal publications allow (though the deadlines of the latter and the rest of academic life have limited my posts!). Increasingly distressed by the use of armed drones (see Medea Benjamin’s brilliant new book Drone Warfare: Killing by Remote Control, 2012, OR Books) and the arming of robots (including the 710 Warrior by Boston-based iRobot, makers of the Roomba vacuum cleaner), I’ve begun to focus my research on what James Der Derian (Virtuous War, 2009) has identified as the military-industrial-media-entertainment network (MIME-NET), particularly as it has emerged over the past twenty years in the United States and Britain.
As someone who has made a concerted effort to avoid any involvement in military worlds, I face the challenge of how to locate myself as a researcher, and particularly as an anthropologist committed to an understanding of practices in situ. Following the MIME-NET thread and my own at-homeness in worlds of research and development, I paid a visit earlier this year to the Mixed Reality Lab at the University of Southern California’s Institute for Creative Technologies. Given an extensive tour by the Lab’s director, I also had the great good fortune to meet a former member of one of the Lab’s premier projects, Flatworld, who has provided me access to an extraordinary archive of project materials. While ethnographic work remains an aspiration, archive fever offers a rich beginning.
The Flatworld project brings together practitioners from the Hollywood film industry, gaming and other modes of immersive computing to “create a modular and transportable mixed reality environment that can simulate a variety of real-world locations”; specifically, locations in which US military forces are currently engaged. Military analysts agree that despite their promise of illumination, information and communications technologies have intensified rather than dissipated what nineteenth-century military theorist Carl von Clausewitz famously described as the ‘fog of war,’ regenerating it in a matrix of ever faster and noisier channels of transmission. The enduring problem of ‘situational awareness,’ defined by military commentators as “the ability to maintain a constant, clear mental picture of relevant information and the tactical situation including friendly and threat situations” (Dostal 2001) is the tether by which I connect my own previous research to the indigenous preoccupations of contemporary war fighting (more on this in an article in press in the journal Mediatropes).
For the AAA in November, I’ll be part of a CASTAC/SCA session titled ‘Warfare and Healthcare: Action at a distance and bodies in contact’ along with Joe Dumit, Hugh Gusterson, Caren Kaplan, and Rachel Prentice. The papers consider projects to extend human capacities for action at a distance, and forms of proximate, embodied co-presence that characterize realities ‘on the ground,’ across the seemingly disparate but always connected arenas of war fighting and healthcare. Our focus is on technologies in their broadest anthropological sense, including techniques as well as devices, and claims about technology as much as configurations of hardware and software. These are the topics that join us as members of CASTAC, and I look forward to continuing the conversation with you in San Francisco.