Tag: data

The Surveillance Cyborg

Editor’s Note: This post is part of our ongoing series, “Queering Surveillance.” Surveillance is an embodied experience, both being watched and watching. This past year, a striking number of concert-goers recording Cher’s “Here We Go Again” tour on their phones traded singing and dancing for an act of documentation. Whether the recordings are to remember the experience later, to share it with others, or simply to document one’s presence in that space and at that time, recording the concert on one’s phone becomes an experience in its own right. They are present in the space, but their attention is split between what is happening in the here and now and the recording that will filter the experience in the future. Their phones and recordings are central to their embodied experience, fused into one like a cyborg traveling across space and time in the moment. Add to this that countless concert-goers are recording the same concert from their own individuated perspectives, and the concert becomes infinite and virtual: the way Cher always meant it to be. (read more...)

Towards a Queer Art of Surveillance in South Korea

When is a face not a face? With the launch of the iPhone X, which boasts facial recognition capabilities, the individual markers of one’s face tie one’s identity to the security of one’s phone. Yet the feature also makes the face complicit in forms of self-surveillance, as it requires definitive facial proof to access the phone. It produces the face as evidence of one’s identity that supposedly cannot be forged. In this instance, one continuously uses one’s phone to surveil one’s own identity, with the face becoming a safeguard against potential security breaches. Small-scale, yes, but surveillance need not always be connected to sprawling security apparatuses and institutions. So we ask again: when is a face not a face? When it is used to distinguish a body as a body rather than as an individuated person? With this post, we seek to explore possible answers to this question in the context of South Korea, by focusing on the role of self-surveillance in the politics of queer student activist organizations. (read more...)

Data Doppelgängers and Issues of Consent

Editor’s Note: This is the fifth post in our Law in Computation series. In February 2018, journalist Kashmir Hill wrote about her collaboration with researcher Surya Mattu to make her (Hill’s) home as “smart” as possible. They wanted to see what they could learn about privacy, both from the perspective of living in such a house and from the ‘data fumes’ or ‘data exhaust’ of all these smart appliances themselves. Data fumes or exhaust refer to the traces we leave behind when we interact digitally, but also, often, to the information we provide when signing up for digital platforms (gender, location, relationships, etc.). These traces, when aligned and collated with our daily digital behaviours on social media, e-commerce, and search platforms, are vital to the speculative and dynamic constructions of who we might be. (read more...)

Killer Robots: Algorithmic Warfare and Techno-Ethics

Editor’s Note: This is the fourth post in our Law in Computation series. War is an experiment in catastrophe; yet policymakers today believe chance can be tamed and ‘ethical war’ waged simply by increasing battlefield technology and systematically removing human error and bias. How does an increasingly artificially intelligent battlefield reshape our capacity to think ethically in warfare (Schwartz, 2016)? The traditional ethics of war bases the justness of how one fights on the two principles of jus in bello (justice in fighting war): discrimination and proportionality, weighed against military necessity. Although how these categories apply in particular wars has always been contested, the core assumption is that they are designed to serve as an ethics of practical judgment (see Brown, 2010), allowing decision-makers to weigh the potentially grave consequences of civilian casualties against overall war aims. However, the algorithmic construction of terrorists has radically shifted the idea of who qualifies as a combatant in warfare. What, then, are the ethical implications of a computational ethics of war for researchers and practitioners? (read more...)

What Would A Techno-Ethics Look Like?

Each year, Platypus invites the recipients of the annual Forsythe Prize to reflect on their award-winning work. This week’s post is from 2017’s winner Sareeta Amrute, for her book Encoding Race, Encoding Class (Duke, 2016). What would a techno-ethics look like? This question persists long after the book has been written and edited, proofed and published; perhaps it lingers, too, in the minds of its readers as they ponder the pathways and dead-ends digital technologies lay down. Digital technologies build on previous iterations of capital and labor, as well as social and environmental relations, even as they materialize new relations. The part-time visa regimes that most tech companies make use of build on a long history of mobile migrant labor, free and unfree, that has been used to build other kinds of infrastructure, from plantation economies across the British Empire to railroads in the United States and glass-and-steel skyscrapers in Germany. Similarly, the infrastructure of cloud computing relies on previously established military bunkers and railway lines, even as it creates unprecedented demands for energy. An ethical response to these dynamics would produce regimes of care that unite a knowledge of subjects’ evolving relationships with technologies with the goal of reducing the spaces of domination those technologies create. A techno-ethics should provide guidance for those who develop, use, and make policies about technologies. (read more...)

Locating Servers, Locating Politics

When we think of servers, like web servers and Amazon servers, we don’t usually think of them as occupying physical space. We might think of a remote data center, thanks in large part to images circulated by companies like Facebook and Google. But still, these only visualize unmarked buildings and warehouse rooms, showcasing a particular tech aesthetic of colored wires and tubes and neatly assembled rows of blinking machines (Holt and Vonderau 2015). Such imagery is hardly meant to give the public a sense of where servers are actually located. For most day-to-day computer users, it often doesn’t matter whether servers are in the U.S., China, or Russia, so long as they work. But server location does matter, and many groups of people place value on the material benefits and effects of where servers sit and on their own proximity to them. It matters for online (read more...)

The Heliopolitics of Data Center Security

From Cyberattack to Solar Attack
The small-scale cyberattack, characteristic of the late twentieth century, has long dominated discourses and practices of data center security. Lately, however, these fears are increasingly giving way to concerns over large-scale, existential risks posed by solar activity. Growing numbers of data centers are going to extreme measures to protect their facilities from solar flares, solar energetic particles, and coronal mass ejections, collectively referred to as “space weather.” As data centers are put into circulation with what Georges Bataille famously called the sun’s “superabundance of energy” (1991:29), the act of protecting digital-industrial infrastructure takes on strangely mythical dimensions. In this post, I would like to briefly explore the business end of the mythical dispositif that arises from the surreal and distinctly Bataillean meeting of data centers and the sun. (read more...)

From Technocracy to the Anthropocene: 2016 in Review

#ALSIceBucketChallenge. Deflategate. Twins in Space. Animal Sex Work. The joy of working on Platypus since its inception arises from the many lively, timely, engaged posts that our team of contributing editors and authors bring to the blog each week. Sometimes funny, sometimes serious, often critical and reflective, the blog offers a look into up-and-coming research in anthropology, STS, and related fields on science, tech, computing, informatics, and more. As editor, I’ve delighted in posts that frequently turn commonsense assumptions upside down. For the past two years, I’ve summarized the major themes and highlights in a yearly review post, and I have the pleasure of doing so again for 2016. Two noteworthy themes threaded through many of last year’s posts: 1) reflections on technocracy, and 2) living in the Anthropocene. By technocracy, I mean emerging regimes of data, algorithms, and quantitative living. Melissa Cefkin (Human-Machine Interactions and the Coming Age of Autonomy) opened (read more...)