Tag: ethics

Dumbwaiters and Smartphones: The Responsibility of Intelligence

“I don’t have to drink alone,” she paused for comedic effect, “now that I have Alexa.” Such was the punchline of a story told by a widowed octogenarian at a recent wedding. Alexa is a mass-produced personality that can play music, suggest items for purchase, monitor consumption and health habits, or, like any good friend, just listen. While all these tasks could be performed in silence by various algorithmic appliances, Alexa and her cousins from Google and Apple are imbued with a perceived autonomy stemming directly from their capacity for vocalization. Speech, it seems, beckons the liberation of abiotic materials from their machinic programming. (read more...)

Critiquing Big Data in China and Beyond

“I do think that the Internet truly makes us feel the world can become a smaller place,” an interlocutor, whom I will call Bo, told me in his parents’ home in Shijiazhuang, a city in China’s Hebei Province. It was late 2014, and he was studying to become a filmmaker in Beijing. During our conversation, he told me about discovering Google Earth when he was younger, recalling how, suddenly, he could “see any place in the world” from the comfort of his home. He could zoom in to explore a mountain village in Iceland, a house, and even a village dog, feeling that, without Google Earth, he would never have been able to visit such faraway places. The experience might have been virtual (xuni), he mused, but it had also been real (zhenshi). His account expressed a kind of enthusiasm for the digital that I often encountered during my ethnographic fieldwork on digital opportunity in China. However, his story was made especially compelling by the oppressive smog plaguing the city outside. While neighboring buildings disappeared in a toxic fog, he expressed his excitement about “seeing” a digitally mediated “Google Earth.” (read more...)

Data Doppelgängers and Issues of Consent

Editor’s Note: This is the fifth post in our Law in Computation series. In February 2018, journalist Kashmir Hill wrote about her collaboration with researcher Surya Mattu to make her (Hill’s) home as “smart” as possible. They wanted to see what they could learn about privacy, both from the perspective of living in such a house and from the ‘data fumes’ or ‘data exhaust’ of all these smart appliances themselves. Data fumes or exhaust refer to the traces we leave behind when we interact digitally, but also, often, to the information we provide when signing up for digital platforms (gender, location, relationships, etc.). These traces, when aligned and collated with our daily digital behaviours on social media, e-commerce, and search platforms, are vital to the speculative and dynamic constructions of who we might be. (read more...)

Killer Robots: Algorithmic Warfare and Techno-Ethics

Editor’s Note: This is the fourth post in our Law in Computation series. War is an experiment in catastrophe; yet, policymakers today believe chance can be tamed and ‘ethical war’ waged simply by increasing battlefield technology, systematically removing human error and bias. How does an increasingly artificially intelligent battlefield reshape our capacity to think ethically in warfare (Schwartz, 2016)? Traditional ethics of war bases the justness of how one fights on the two principles of jus in bello (justice in the conduct of war): discrimination and proportionality, weighed against military necessity. Although how these categories apply in particular wars has always been contested, the core assumption is that they are designed to serve as an ethics of practical judgment (see Brown, 2010), allowing decision-makers to weigh the potentially grave consequences of civilian casualties against overall war aims. However, the algorithmic construction of terrorists has radically shifted the idea of who qualifies as a combatant in warfare. What, then, are the ethical implications for researchers and practitioners of a computational ethics of war? (read more...)

What Would A Techno-Ethics Look Like?

Each year, Platypus invites the recipients of the annual Forsythe Prize to reflect on their award-winning work. This week’s post is from 2017’s winner Sareeta Amrute, for her book Encoding Race, Encoding Class (Duke, 2016). What would a techno-ethics look like? This question persists long after this book has been written and edited, proofed and published; perhaps it lingers, too, in the minds of its readers as they ponder the pathways and dead-ends digital technologies lay down. Digital technologies build on previous iterations of capital, labor, and social and environmental relations, even as they materialize new relations. The part-time visa regimes that most tech companies make use of build on a long history of mobile migrant labor, free and unfree, that has been used to build other kinds of infrastructure, from plantation economies across the British Empire to railroads in the United States and glass-and-steel skyscrapers in Germany. Similarly, the infrastructure of cloud computing relies on previously established military bunkers and railway lines, even as it creates unprecedented demands for energy. An ethical response to these dynamics would produce regimes of care that unite a knowledge of subjects’ evolving relationships with technologies with the goal of reducing the spaces of domination those technologies create. A techno-ethics should provide guidance for those who develop, use, and make policies about technologies. (read more...)

How Not to Be a Bot: An Interview with Natasha Dow Schüll

Natasha Dow Schüll is a cultural anthropologist and associate professor in New York University’s Department of Media, Culture, and Communication. In her 2012 book Addiction by Design, she explores how electronic slot machines facilitate the compulsive behavior of gambling addicts through their digital interfaces. Informed by extensive ethnographic research among designers and users, the book details how the interrelationship between humans and digital media is engineered and experienced, and how it relates to the demands and logics of life in contemporary capitalist society. In her current research, Schüll has shifted her focus to the design and use of digital self-tracking technologies. Her recent article, “Abiding Chance: Online Poker and the Software of Self-Discipline,” which provides the starting point for the following interview, bridges her first and second projects. Adam Webb-Orenstein: What brought you to focus on players of online poker, and how is this work related to the concerns of your earlier research on slot machine addicts? Natasha Dow Schüll: My approach as an anthropologist is to explore how technology mediates cultural demands in human experience, and slot machine play and online poker play are two cases I’ve examined to get at that. I see both forms of play as responses to contemporary life, but the ways in which they are mediated by technology, and the experiences they afford, differ. (read more...)

Remembering David Hakken

This week, the CASTAC community received the sad news that Professor David Hakken had passed away. Hakken was Director of the Social Informatics Program at Indiana University. Trained as an anthropologist, Hakken conducted research at the intersection of ethnography and cyberspace. He was concerned with how digital technologies and culture are continually co-constructive. His prolific career included the publication of a recent book co-authored with Maurizio Teli and Barbara Andrews entitled Beyond Capital: Values, Commons, Computing, and the Search for a Viable Future (Routledge, 2015). Hakken presciently focused on critical areas emerging at the intersection of digital anthropology and science and technology studies. The outpouring on social media from his colleagues and former students has been truly touching and shows the depth of his impact on the community. Hakken was a principal founding member of CASTAC. As a pioneer in anthropological studies of computing in the early 1990s, Hakken initiated action on creating a committee devoted to the particular concerns of anthropologists in science and technology studies. He was also a friend to the CASTAC Blog. He helped lend our fledgling endeavor gravitas by writing posts and graciously being interviewed. Please join me in honoring his life and work by enjoying this gem from the Platypus vault, which originally appeared on the blog in January 2013. I was honored to have the opportunity to interview him and hear more about his big ideas on big data. I first met David at a CASTAC summer conference (remember those?) nearly twenty years ago. Over the years, I personally benefited from his wise mentoring and vibrant disposition. I was deeply saddened to hear of his passing. He will be greatly missed. Colleagues who would like to share public remembrances about David for a longer tribute post should contact the editor, Jordan Kraemer. Patricia G. Lange May 6, 2016 (read more...)

Decolonizing Design Anthropology with Tinn

In fall 2014, I began building Tinn, a health data tracking application for people living with tinnitus. I approached building Tinn as an opportunity to explore what a socially conscious, feminist, and anti-colonial process for developing mobile applications might look like. The idea of pairing design, building, and anthropology is hardly all that innovative; “design anthropology,” a subfield of cultural anthropology focused on empirical studies of use cultures, has been around since the 1980s (at least). What sets Tinn apart, however, is my commitment to working with communities of color in Portland, OR, that have been alienated by and from the technology and application development industry because of intersecting and juxtaposed systems of gender, racial, and access inequality. Perhaps one of the central epistemic problematics for the project, then, can be posed as a question: Can we decolonize design anthropology, and to what degree of success? What might this entail? Decolonization is slippery. My “academic anthropology” is a historical ethnography of the ways media scientists after World War II shaped (and continue to shape) the gendered contours of global media flows in the British Empire, with a particular focus on sound, listening, and citizen-subject formation in Gibraltar. My work on Tinn gave me the opportunity to transform my critique of the violence done by media scientists into the starting point for a more ethical approach to user experience research with marginalized Native American, Latin@, migrant, and African American Oregonians living with tinnitus in Portland. Yet what I thought of as decolonizing and what my collaborators thought of as decolonizing were at odds in some instances. For one, while decolonizing anthropology attempts to re-balance the scales of recognition and redistribution in the research process, it is much more difficult to reconcile the colonial histories of the designer and the designed-for.
Yet, for my collaborators, this division didn’t actually matter. As Nephi, one of my politically astute collaborators, put it, “the ethics are too heady when we need material help. Someone has to do that work. It’s you today.” While Tinn began with my commitment to making the project open source (as resistance to the privatization and commoditization of collaboration — it’s not that simple), Nephi protested: “My labor is there, too. You’d give it away for free? There’s a long history of white men giving away the work of my people for free.” I said it wasn’t simple. While there were times when my collaborators and I didn’t agree on what constituted decolonization, we did agree on one thing: data poses a particularly tricky sociohistorical landscape for thinking about recognition, redistribution, and reconciliation. The rest of this post is dedicated to the complications of tracking data about tinnitus, and of tracking data about vulnerable and/or marginalized users with tinnitus. (read more...)