More than three months ago I wanted to write about the ethnographic butterfly effect and a key informant’s book. But there were strange things happening around games and social media at the time coupled with tragic events in Ferguson, Missouri. So I wrote about those things. It is more than three months later and there are still strange things happening in social media around games and everything in Ferguson, Missouri (and other parts of the United States) is somehow impossibly more sad.
So I’m going to write about the ethnographic butterfly effect and a key informant’s book on the game Jagged Alliance 2.
2014 was the year that the major players in qualitative data analysis (QDA) software released native versions for the Mac. For me, the timing was perfect: my dissertation fieldwork in North Dakota had drawn to a close by summer’s end, and my advisor was encouraging me to roll up my sleeves and start working through my material. I wasn’t sure which software package would serve me best, though, and most of the guidance I could find around the Web declined to make head-to-head comparisons. Then, too, I was mindful of the critiques charging that QDA software of any stripe contributes to the mystification of method and amounts to an overpriced means of avoiding index cards, glue, and scissors. I have nothing against index cards, but with operating system issues off the table and student licenses available for under $100, I decided to see if one of these tools could help me to organize my data and get writing.
After sizing up the available options, I downloaded trial versions of two well-known QDA products: NVivo and ATLAS.ti. I knew I was looking for an attractive and intuitive user interface that would allow me to code data in multiple formats: handwritten field notes, interview transcripts, documents I collected in the field. I had little faith that calculating the frequency and co-occurrence of the codes I assigned would unlock some deep, hidden structure of my material. But, taking a cue from one of the founding texts of software studies, I resolved to approach QDA software as an object that “deserves a reciprocation of the richness of thought that went into it, with the care to pay attention to what it says and what it makes palpable or possible.” How, I wondered, would my choice of software package make some kinds of analytical thinking possible and forestall others? What would my choice commit me to?
This anthropocene thing has really taken hold. We’re caught in the grip of extinction, visualizing our own end (or at least visualizing the data of our own end), urgently calling upon each other to act, convincing ourselves that we have the power – scientifically, technologically, and maybe politically – to do something about it. We can organize marches, resurrect species, bank seeds, manipulate clouds, make videos of collapsing ice caps, drive hybrids, fly to space stations. Of course, our worry over the planet’s health is narcissistic, in the end. It’s not the planet’s survival we are worried about. It’s our own, human future.
These anthropocentric worries over human continuity make for a strange tension in the theoretical moment: they are appearing just as a range of disanthropic moves have attempted to decenter and displace the human as subject, agent, or figure: Actor-Network Theory, Post-Humanism, multi- and interspecies analytics, Object Oriented and other “ontological” turns, speculative realism and new materialism, to name a few. Despite this turn away from the human, however, the final disappearance of the species seems to mark a limit for most disanthropic theorists; few welcome the possibility of human extinction. Disanthropy yes, misanthropy no.
For me, Christmas sometimes comes twice a year. The release of the National Science Foundation’s biennial quantitative data report to Congress on the state of American science, engineering, and technology, The Science and Engineering Indicators (SEI), made 2014 one of those years. This year, the SEI has mostly good news about public attitudes and Americans’ understanding of science and technology. Good enough, I think, to merit sharing here.
For starters, Americans seem to be getting a bit more science news in their lives. For example, the number of minutes of annual nightly weekday television newscast airtime devoted to science, space, and technology averaged about 2% of broadcast network (ABC, CBS, and NBC) news between 2000 and 2012, but crept up above 3% in the most recently reported data.
By Beth Reddy and Kim Fortun
Since 2012, the EcoEd Research Group (http://sustainabilityresearch.wp.rpi.edu/k-12-resources/eco-ed-program/) has run over thirty workshops in New York. The group brings faculty and college students (mostly from Rensselaer Polytechnic Institute) together with K-12 students in collaborative environmental education. EcoEd workshops have focused on green building, environmental photography, and county-level sustainability assessments, among other topics – engaging both the environment and education in new ways.
Dr. Kim Fortun is an anthropologist and professor in the Department of Science and Technology Studies at RPI, and has been a key participant in the development of EcoEd. I sent her a few simple questions about what EcoEd is up to and how she’s thinking about this kind of work. Her responses, below, touch on issues that won’t be unfamiliar to many CASTAC readers: experiments in ethnography and in the classroom that engage with what Fortun calls “late industrialism” in creative and critical ways.
Fortun: We think through what we have learned about environmental problems – how they play out, the conceptual and cultural challenges they pose – and then try to observe, read about and think through how environmental problems are out of synch with the education and thinking of U.S. kids – so that we can design and deliver K-12 curriculum that speaks to both. It is one way to make ethnographic knowledge “relevant;” it is one of many possible forms of activism.
In October 2014, New York University’s Center for Urban Science and Progress (CUSP) unveiled the Urban Observatory, as part of an urban informatics initiative for monitoring, recording, and modeling the actions and nonactions of New York City. Inspired by research methods in observational astronomy, the scientists at CUSP placed an 8 megapixel camera on top of a building in Downtown Brooklyn, which shoots one panoramic, long-distance image of Lower and Midtown Manhattan every 10 seconds. Using the Urban Observatory and a network of similar sensors, the scientists at CUSP are attempting to capture what they call “the pulse of the city,” formulating massive data sets that provide information regarding various domains of everyday life, ranging from energy efficiency to the detection of toxic releases. As urban informatics professionals, they imagine that the collected data will serve as “raw material” for policy making — once they have access to this raw material, the CUSP scientists will be able to model their predictions, and hope to ultimately (somehow) manufacture the steps required to reduce electricity consumption in office buildings, or to generate emergency responses to hazardous substances.
As of late October, nearly 60% of California faces conditions of “exceptional drought,” a category that the National Drought Mitigation Center refers to as indicating “exceptional and widespread crop/pasture losses,” with “shortages of water in reservoirs, streams and wells creating water emergencies.” Mandatory conservation measures are in effect across the state, and Governor Brown recently signed a Sustainable Groundwater Management Act that will tighten regulation of California’s notoriously under-managed groundwater supply.