Tag: data

Period Tracking Apps: Something Old, Something New

They’re sleek and colorful, “fun and easy,” full of icons and dials. Period tracking apps, or “menstruapps,” are an increasingly common way a large segment of the population attends to their health and the embodied experience of menstruation. In some ways, these apps are part of very recent trends towards the Quantified Self, the datafication of health, and reliance on biometric tracking devices to “optimize” one’s habits. In other ways, they evoke older legacies of feminist health care, notably the Our Bodies, Ourselves movement begun in 1969. Fifty years later, what does it mean to use technology to “understand how your body works,” as Clue advertises, or “take control of your body,” the tagline for Natural Cycles, two of the most popular menstruapps? (read more...)

Clinical Data in the U.S. Department of Veterans Affairs: Ethnographic Engagements

By: Peter Taber, Nicholas Rattray, Lauren Penney, Megan McCullough, and Samantha Gottlieb

This post emerged from a 2018 Society for Applied Anthropology panel on anthropological engagements with health data in the U.S. Department of Veterans Affairs (VA). Serving over 9 million enrollees with a current federal budget of USD 68 billion, the VA is an important testing site for digital healthcare infrastructure, as it has been for several decades. The panel brought our VA research and quality improvement (QI) efforts targeting the electronic health record (EHR) and other digital infrastructure into dialog with existing work on the social lives of data and algorithms, as well as the broader concerns of medical anthropology and STS in an era of the “datafication of health” (Ruckenstein and Schüll 2017). Extracts from our conversation, presented below, are taken from a follow-up video call exploring these issues. (read more...)

The Surveillance Cyborg

Editor’s Note: This post is part of our ongoing series, “Queering Surveillance,” and was co-written with Alexander Wolff. Surveillance is an embodied experience, both being watched and watching. The sheer number of concert-goers recording Cher’s “Here We Go Again” concert this past year with their phones had them trade singing and dancing for an act of documentation. Whether the recordings are to remember the experience later, to share the experience with others, or simply to document one’s presence in that space and at that time, recording the concert on one’s phone becomes an experience in its own right. They are present in the space, but their attention is divided between what is happening in the here and now and the recording that will filter the experience in the future. Their phones and recordings are central to their embodied experience, fused into one like a cyborg traveling across space and time in the moment. Add to this that countless concert-goers are recording the same concert from their individuated perspectives, and thus the concert becomes infinite and virtual—of course, the way Cher always meant it to be. (read more...)

Towards a Queer Art of Surveillance in South Korea

Editor’s Note: This post was co-written with Timothy Gitzen. When is a face not a face? With the launch of the iPhone X, which boasts facial recognition capabilities, the individual markers of one’s face tie one’s identity to the security of one’s phone. Yet it also makes the face complicit in forms of self-surveillance, as it requires definitive facial proof to access one’s phone. It produces the face as evidence of one’s identity that supposedly cannot be forged. In this instance, one continuously uses one’s phone to surveil one’s own identity—with the face becoming a safeguard against potential security breaches. Small-scale, yes, but surveillance need not always be connected to sprawling security apparatuses and institutions. So we ask again: when is a face not a face? When it is used to distinguish a body as a body rather than as an individuated person? With this post, we seek to explore possible answers to this question in the context of South Korea, by focusing on the role of self-surveillance in the politics of queer student activist organizations. (read more...)

Data Doppelgängers and Issues of Consent

Editor’s Note: This is the fifth post in our Law in Computation series. In February 2018, journalist Kashmir Hill wrote about her collaboration with researcher Surya Mattu to make her (Hill’s) home as “smart” as possible. They wanted to see what they could learn about privacy, both from the perspective of living in such a house and from the “data fumes” or “data exhaust” of all these smart appliances themselves. Data fumes or exhaust refer to the traces we leave behind when we interact digitally but also, often, to the information we provide to sign up on digital platforms (gender, location, relationships, etc.). These traces, when aligned and collated with our daily digital behaviors on social media, e-commerce, and search platforms, are vital to the speculative and dynamic constructions of who we might be. (read more...)

Killer Robots: Algorithmic Warfare and Techno-Ethics

Editor’s Note: This is the fourth post in our Law in Computation series. War is an experiment in catastrophe; yet policymakers today believe chance can be tamed and “ethical war” waged simply by adding battlefield technology that systematically removes human error and bias. How does an increasingly artificially intelligent battlefield reshape our capacity to think ethically in warfare (Schwartz, 2016)? Traditional ethics of war bases the justness of how one fights on the two principles of jus in bello (justice in fighting war): discrimination and proportionality, weighted against military necessity. Although how these categories apply in various wars has always been contested, the core assumption is that they are designed to be an ethics of practical judgment (see Brown, 2010) for decision-makers weighing the potentially grave consequences of civilian casualties against overall war aims. However, the algorithmic construction of terrorists has radically shifted the idea of who qualifies as a combatant in warfare. What, then, are the ethical implications for researchers and practitioners of a computational ethics of war? (read more...)

What Would A Techno-Ethics Look Like?

Each year, Platypus invites the recipients of the annual Forsythe Prize to reflect on their award-winning work. This week’s post is from 2017’s winner Sareeta Amrute, for her book Encoding Race, Encoding Class (Duke, 2016). What would a techno-ethics look like? This question persists long after this book has been written and edited, proofed and published; perhaps it lingers, too, in the minds of its readers as they ponder the pathways and dead-ends digital technologies lay down. Digital technologies build on previous iterations of capital, labor, and social and environmental relations, even as they materialize new relations. The part-time visa regimes that most tech companies make use of build on a long history of mobile migrant labor, free and unfree, that has been used to build other kinds of infrastructure, from plantation economies across the British Empire to railroads in the United States and glass-and-steel skyscrapers in Germany. Similarly, the infrastructure of cloud computing relies on previously established military bunkers and railway lines, even as it creates unprecedented demands for energy. An ethical response to these dynamics would produce regimes of care that unite a knowledge of subjects’ evolving relationships with technologies with the goal of reducing the spaces of domination created by these technologies. A techno-ethics should provide guidance for those who develop, use, and make policies about technologies. (read more...)

Locating Servers, Locating Politics

When we think of servers, like web servers and Amazon servers, we don’t usually think of them as occupying physical space. We might think of a remote data center, thanks in large part to images that have been circulated by companies like Facebook and Google. But still, these only visualize unmarked buildings and warehouse rooms, showcasing a particular tech aesthetic of colored wires and tubes, and neatly assembled rows of blinking machines (Holt and Vonderau 2015). Such imagery is hardly meant to provide the public with a sense of where servers are actually located. For most day-to-day computer users, it often doesn’t matter at all whether servers are in the U.S. or China or Russia, so long as they work. But server location matters, and many groups of people derive material benefits and effects from where servers are placed and from their own proximity to them. It matters (read more...)