The CASTAC community joined together in 2012 to launch this blog and begin dialogue on contemporary issues and research approaches. Even though the blog is just getting off the ground, certain powerful themes are already emerging across different projects and areas of study. Key themes for the coming year include dealing with large data sets, connecting individual choices to larger economic forces, and translating the meaning of actions from different realms of experience.
Perhaps the most visible trend on our minds right now involves dealing with scale. How can anthropologists, ethnographers, and other STS scholars address large data sets and approaches in research and pedagogy, while also retaining an appropriate relationship to the theories and methods that have made our disciplines strong? As we look ahead to 2013, it would seem that a big question for the CASTAC community involves finding creative and ethical ways to deal with phenomena that range from the overwhelmingly large to the microscopic, in order to provide insight and serve our constituents in research and teaching.
Discussing large-scale forays into education and research
In her recent posts on MOOCs in the Machine, Jordan Kraemer, our dedicated Web Producer, has been reflecting on how higher education is grappling with MOOCs, or “massive open online courses,” which open up opportunities to those who have been shut out of traditional elite institutions. At the same time, serious questions have emerged about the ramifications of trading off cost savings against high-quality education. Kraemer points out that much of the debate ties into larger arguments about why people have been shut out of education in the first place, and how the concentration of wealth and the neoliberalization of the university are challenging the old equation of supporting open-ended research that ultimately strengthens and supports teaching. She proposes new forms of graduate education in which universities support recent graduates with teaching positions, allowing them to gain teaching experience, relieve the teaching loads of full-time faculty, and transition into full-time positions.
Part of the issue with MOOCs has to do with questions of scale: how, or whether, individual lectures and course preparation can be generalized to large audiences in ways that provide solid instruction without compromising quality. Higher education depends upon staying current with research, and so far we do not have enough evidence that MOOCs will work, or that they will address the concerns emerging from the neoliberalization of the academy. Those of us interested in online interaction and pedagogy will be watching this space closely in the coming year.
Questions of scale also came into play in Daniel Miller’s discussion of doing Eight Comparative Ethnographies. Miller argues that conducting several ethnographies simultaneously enables comparative questions that are not possible when investigating a single site. He offers an example from social network sites: to what extent are particular behaviors the product of a type of site, a single site, or the intersection of cultures in which a site is embedded? Does a behavior take a certain form because it is happening on Facebook, or because the participants are Brazilian? A comparative study enables a level of analysis more inclusive than that derived from a single study. Expanding scale without compromising the traditions and benefits of ethnographic work remains a challenge for these and other large-scale projects, which have the potential to provide crucial insights.
Making small-scale choices visible
As one set of researchers brings up issues regarding enormously large-scale education and research, other STS participants on The CASTAC Blog are grappling with the opposite issue: how the dynamics of extremely personal, individual acts, such as the donation of sex cells, interact with large-scale economic and cultural forces. In her post on The Medical Market for Eggs and Sperm, Rene Almeling, winner of the 2012 Forsythe Prize, provides an inside look at how donations of sex cells are connected to much larger economic forces that play out differently for women and men. Women are urged to regard egg donation as a feminine act of gift-giving; men are encouraged to see donation as a job. Almeling ties what might seem an individual act to economic forces, as well as to gendered, cultural expectations about families and reproduction. Gendered framings of donation not only affect the individuals who provide genetic material, but also strongly influence the structure of the market for sex cells.
Another key issue on our minds is personal responsibility: showing how individual choices shape much larger social and economic forces in finance, computing, and going green.
In his post On Building Social Robustness, David Hakken raises the question of how individuals contributed to large-scale economic and social crises, such as the recent disasters in the world of finance. His project is informed by work grappling with the first “5,000 years” of the history of debt. He proposes developing a notion of social robustness, parallel to the technical notion of robustness in computer science.
His work offers an intriguing model of taking ideas from the people we study and applying them as inspiration for making social change. When Hakken asks to what extent computing professionals are ethically responsible for the financial crisis, he is proposing a way of asking how a large-scale disaster can be traced to individual, micro-units of action. By investigating these connections, his project informs a conversation that is increasingly picking up steam in the anthropology of value.
Hakken’s reflections are especially haunting as he warns of the difficulties of building a career in anthropology and STS. As he is moving towards retirement, his perspective is especially valued in our community. As an antidote to more provincial institutional perspectives, he urges a more consolidated and community approach that involves supporting each other in doing the important work that the CASTAC community has the potential to achieve.
Questions of scale and responsibility are once again intertwined in David J. Hess’s post on Opening Political Opportunities for a Green Transition. Hess points out that a non-partisan political issue has become partisan, despite the fact that the planet has now surpassed a carbon dioxide level not seen for at least 800,000 years! But because change is imperceptibly slow to the human eye, politics is allowed to complicate it. Hess investigates what he calls the “problem behind the problem”: the lack of political will to address environmental sustainability and social fairness, which considerably worsens the environmental problem itself. He offers real solutions through an ambitious three-part series of books proposing “alternative pathways,” social movements centered on reform in part through the efforts of the private sector.
Notably, personal experiences in anthropology inform Hess’s work. Although he is in a sociology department and an energy and environment institute, he points out that an anthropological sensibility continues to inform his thinking. While the discourse on these issues has traditionally revolved around a two-party system, Hess’s more anthropological approach makes visible other ideologies, such as localism and developmentalism, that may pave a more direct path to “good green jobs” and a more sensitive and responsible green policy. Again interacting with questions of scale, Hess’s notions of responsibility are grounded in understanding the “broad contours” of the “tectonic shifts” of ideology and policy underway in working toward a green transition in the United States and around the world. Without real action, however, his prognosis remains pessimistic.
Translating phenomena across different realms of experience
A theme that also emerged from our nascent blog’s initial posts concerned the ramifications of processing one realm of experience through metaphors and concepts from another. In her post on the Anthropological Investigations of MIME-NET, Lucy Suchman explores the darker side of entertainment and its relationship to military applications. She investigates how information and communication technologies have “intensified rather than dissipated” what theorists have described as the “fog of war.”
The problem is partly one of translation. How is it possible to maintain what military strategists call “situational awareness,” a constant and accurate mental image of relevant tactical information? Suchman is studying activities such as The Flatworld Project, which brings together practitioners from the Hollywood film industry, gaming, and other models of immersive computing to understand these dynamics. Such a project also involves analyzing how these approaches “extend human capacities for action at a distance,” and the ethical challenges researchers face as they grapple with military realms and connect seemingly disparate but interrelated areas such as war and healthcare.
Lisa Messeri’s post, Anthropology and Outer Space, offers an absolutely fascinating look into human conceptualization of place. She asks, why should earthlings be concerned about what is happening on Mars? Her work focuses on how “scientists transform planets from objects into places.” Significant milestones in space exploration, such as the transit of Venus across the Sun (not scheduled to occur again until 2117) and the landing of the Mars rover Curiosity, provide rich areas to mine for understanding cultural notions of place and human exploration. Curiosity has its own Twitter account (!) and tweets freely about its experience of “springtime” in Mars’s southern hemisphere. Messeri argues that this kind of language “bridges” our worlds: Curiosity somehow seems to experience something familiar to humans, springtime. Scientists are now studying objects so far away that telescopes cannot capture an image of them. Somehow, these “invisible” objects become familiar and complex. Planets begin to seem like places because of the way language “makes the strange familiar,” bridging the experience between events on an exoplanet and life on Earth.
Astronomers become place makers, and observing these processes shows how spaces become “social” even though, as Messeri argues, “humans will never visit such planetary places.” Messeri shows how such conceptualizations can lead to the spread of erroneous scientific rumors that get reported by national news organizations. Her work not only shows how knowledge production is compromised by the use of such metaphors, but also provides an intriguing look at how humans process invisible objects through the cultural production of imagined place.
Tune in next week!
Given that questions of scale were on our minds in 2012, it is especially fitting that we launch 2013 with a discussion about Big Data, and the challenges and opportunities that emerge when entities collect and combine huge data sets that are far too large to handle through ordinary coding schemes or desktop databases. Social scientists, technologists, and other researchers must grapple with numerous issues including legibility, data integrity, ethics, and usability. I am particularly pleased that David Hakken agreed to be interviewed by The CASTAC Blog to discuss his views. Next week, he provides fascinating insights into what the future holds for dealing with Big Data!
Before signing off, I would like to thank everyone for their participation in The CASTAC Blog, especially those who wrote posts, left comments, read articles, and tweeted our posts to the world. I very much appreciated everyone’s participation. The richness of the posts makes it too difficult to adequately cover all the content of the past year in one commentary, but rest assured that everyone’s post is contributing to the conversation and is valued by the CASTAC community.
In an effort to include more voices and keep a continuing flow of content, The CASTAC Blog is now seeking a core group of “frequent” contributors to keep pace with new developments in this space in 2013. Notice that I use the term “frequent” sparingly—even a few posts throughout the year makes you a frequent contributor. Please consider sharing your thoughts and views with the CASTAC community. If you would like to join in, please email me at: email@example.com.
I look forward to an interesting and productive year ahead!
Patricia G. Lange
The CASTAC Blog
Since my undergraduate days, I’ve both aspired to do feminist anthropology and been fascinated with people’s everyday engagement with mundane (and extraordinary) technologies. I can’t express how thrilled and honored I am to receive the 2013 Diana Forsythe Prize for The Life of Cheese: Crafting Food and Value in America (University of California Press), my ethnography of American artisanal cheese, cheesemaking and cheesemakers. I do not present a summary of the book here (if interested, the Introduction is available on the UC Press website: http://www.ucpress.edu/book.php?isbn=9780520270183). Instead, I alight on some of the STS-related themes that run throughout my book (and especially Chapter 6): regulating food safety and promoting public health, artisanal collaboration with microbial agencies, and the mutual constitution of production and consumption.
Real Cheese or Real Hazard — or Both?
By U.S. law, cheese made from raw (unpasteurized) milk, whether imported or domestically produced, must be aged at least 60 days at a temperature no less than 1.7˚C before being sold. The 60-day rule intends to offer protection against pathogenic microbes that might thrive in the moist environment of a soft cheese. But while the U.S. Food and Drug Administration (FDA) views raw-milk cheese as a potential biohazard, riddled with threatening bugs, fans see it as the reverse: a traditional food processed for safety by the metabolic action of good microbes—bacteria, yeast, and mold—on proteins and carbohydrates in milk. The very quality that gives food safety officials pause about raw-milk cheese — that it is teeming with an uncharacterized diversity of microbial life — makes handcrafting it a rewarding challenge for artisan producers, and consuming it particularly desirable for gastronomic and health-conscious eaters, drawn to its purportedly “pro-biotic” aspect.
I have introduced the notion of microbiopolitics as a theoretical frame for understanding debates over the gustatory value and the health and safety of cheese and other perishable foods.[i] Calling attention to how dissent over how to live with microorganisms reflects disagreement about how humans ought to live with one another, microbiopolitics offers a way to frame questions of ethics and governance. The U.S. American revival of artisanal cheesemaking and rising enthusiasm for raw milk and raw-milk cheese exemplify microbiopolitical negotiations between a hyper-hygienic regulatory order bent on taming nature through forceful eradication of microbial contaminants (a Pasteurian social order, as currently forwarded by the FDA) and what I have called a post-Pasteurian alternative committed to working in selective partnership with ambient microbes.
As Bruno Latour relates in The Pasteurization of France, in recognizing microbes as fully enmeshed in human social relations, early Pasteurians legitimated the hygienist’s right to be everywhere; once microbes can be revealed in the lab (Pasteurians continue to believe) they may be eradicated — only then will “pure” social relations be able to flourish. In contrast, post-Pasteurians move beyond an antiseptic attitude to embrace mold and bacteria as potential friends and allies. The post-Pasteurian ethos of today’s artisanal food cultures—recognizing microbes to be ubiquitous, necessary, and even (sometimes) tasty—is productive of modern craft knowledge and expanded notions of nutrition, and it produces a new vocabulary for thinking about conjunctures of cultural practice and agrarian environments, along the lines of what the French call terroir.
I want to be very clear: some bacteria and viruses make some people sick, something no food-maker wants to risk. Successful post-Pasteurian food-makers are never cavalier about pathogenic risk. Dairy farmers who trade in raw milk and cheesemakers who work with it are exceptionally careful about hygiene—they are not anti-Pasteurian. To the contrary, they work hard to distinguish between “good” and “bad” microorganisms and to harness the former as allies in vanquishing the latter. Post-Pasteurianism takes after Pasteurianism in taking hygiene seriously; it differs in being more discriminating.
Focused on the aggregate of the national population, Pasteurian microbiopolitics has been criticized for taking a one-size-fits-all approach to food safety, predicating regulation on industrial-scale production (relying on pasteurization or irradiation to kill pathogens presumed present owing to insanitary agricultural practices) and population-wide consumption (young raw-milk cheese is forbidden to all because it poses a particular threat to immunocompromised and pregnant consumers). Post-Pasteurians counter that fresh milk is not inherently “dirty” and in need of pasteurization; contamination is a matter of human agricultural practice, not the “nature” of milk. Moreover, many assert that the heterogeneous public in “public health” should not be reduced to its lowest common denominator; people are individuals. In other words, the post-Pasteurian position lobbies for socio-legal latitude that would permit potentially risky foods to be made and consumed safely by some, if not others.
I worry, though, that as enthusiasm for the beneficial agencies of microorganisms grows, underinformed enthusiasts may overestimate the power of “nature’s” microbial goodness.[ii] I fret even more when such a position is characterized—as I am beginning to see—in terms of “post-Pasteurianism.” Last year I discovered for sale on the Web t-shirts, bumper stickers, even maternity shirts and baby bibs emblazoned with a smiling microbe and the slogan, “I’m a Post Pasteurian.”
Descriptive copy explains, “What is a ‘Post Pasteurian’? A really smart person who understands that pasteurization kills all (yes, ALL) the good in food.”[iii] This is not how I defined “post-Pasteurian” in my 2008 article or 2013 book. For the record, I refuse the claim. Pasteurization does not “kill” all the good in food. The position putatively espoused by the t-shirt would pit a beneficent “nature” supernaturally enlivened by microorganisms against a power-greedy “culture” championed by regulatory overreach. But the natural-cultural reality is that milk and fermented foods such as cheese, yogurt, miso, and beer are multispecies muddles that resist such simplistic parsing.
There’s nothing essential about a food’s goodness. Humility is required to navigate (not necessarily manage, let alone steward) post-Pasteurian microbial ecologies.
By “microbiopolitics,” then, I mean to describe and analyze regimes of social management, both governmental and grassroots, that admit to the vital agencies of microbes, for good and bad. Including beneficial microbes like starter bacterial cultures and cheese molds, in addition to harmful ones like E. coli, Listeria monocytogenes, and Mycobacterium tuberculosis, in accounts of food politics extends the scaling of agro-food studies into the body, into the gastrointestinal. “Microbes connect us through diseases,” writes Latour, “but they also connect us, through our intestinal flora, to the very things we eat.”[iv] At the beginning of the twenty-first century, as it comes to light that 90 percent of the cells in what we think of as the human organism turn out to be microorganisms, the truism “We are what we eat” has never seemed more literal. One aim of my work has been to show how artisan food-makers carefully sort microbial friends from foes, work (not faith) that produces the conditions through which a post-Pasteurian dieticity might safely emerge, for some if not others.
[i] See Heather Paxson, “Post-Pasteurian Cultures: The Microbiopolitics of Raw-Milk Cheese in the United States,” Cultural Anthropology 23(1): 15-47, 2008. And also Heather Paxson, The Life of Cheese: Crafting Food and Value in America. Berkeley: University of California Press, 2013.
[ii] See also Gareth Enticott, “Risking the Rural: Nature, Morality and the Consumption of Unpasteurized Milk,” Journal of Rural Studies 19(4): 411-424, 2003.
[iv] Bruno Latour, The Pasteurization of France. Cambridge: Harvard University Press, 1993, p. 37.
April 30th, 2013, by Ali Kenner
It’s been nearly four years since The Asthma Files (TAF) really took off as a collaborative ethnographic project housed on an object-oriented platform. In that time our work has included system design and development, data collection, and lots of project coordination. All of this continues today; we’ve learned that the work of designing and building a digital archive is ongoing. By “we” I mean our “Installation Crew,” a collective of social scientists who have met almost every week for years. We’ve also had scores of students, graduate and undergraduate, at a number of institutions use TAF in their courses, through independent studies, and as a space to think through dissertations. In a highly distributed, long-term ethnographic project like TAF, we’ve derived a number of modest findings from particular sites and studies; the trick is to make sense of the patterned mosaic emerging over time, which is challenging since the very tools we want to use as a window into our work, such as data visualization apps leveraging semantic tools, are still being developed.
Given TAF’s structure (thematic filing cabinets where data and projects are organized), we have many small findings related to specific projects. For example, in our most expansive project, “Asthmatic Spaces,” comparisons of data produced by state health and environmental agencies have made various layers of knowledge gaps visible: spaces where certain types of data, in certain places, are not available (Frickel, 2009). Knowledge gaps can be produced by an array of factors, both within organizations and because of limited support for cross-agency collaboration. Another focus of “Asthmatic Spaces” (which aims to compare the asthma epidemic in a half dozen cities in the U.S. and beyond) is to examine how asthma and air quality data are synced up (or not) and made usable across public, private, and nonprofit organizations.
In another project area, “Asthma Knowledges”, we’ve gained a better understanding of how researchers conceptualize asthma as a complex condition, and how this conceptualization has shifted over the last decade, based on emerging epigenetic research. In “Asthma Care” we’ve learned that many excellent asthma education programs have been developed and studied, yet only a fraction of these programs have been successfully implemented, such as in school settings. Our recent focus has been to figure out what factors are at play when programs are successful.
Below I offer three overarching observations, taken from what our “breakout teams” have learned working on various projects over the last few years:
*In the world of asthma research, data production is uneven in myriad ways. This is the case at multiple levels: in public health surveillance and our ability to track asthma nationally, as well as at the state and county levels; in the big data generated by epigenetic research; and in air quality monitoring, which is conducted at the level of cities and zip codes rather than neighborhoods or streets. Uneven and fragmented data production is to be expected; as ethnographers, we’re interested in what this unevenness and fragmentation tells us about local infrastructure, environmental policy, and the state of health research. Statistics on asthma prevalence, hospitalizations, and medical visits are easy to come by in New York State and California, for example, and experts on these data sets are readily found. In Texas and Tennessee, on the other hand, this kind of information is harder to come by; more work is involved in piecing together data narratives and finding people who can speak to the state of asthma locally. Given that most of what we know about asthma comes from studies conducted in major cities, where large, university-anchored medical systems help organize health infrastructure, we wonder what isn’t being learned about asthma and air quality in smaller cities, rural areas, and the suburbs: what does environmental health (and asthma specifically) look like beyond urban ecologies and communities? We find this particularly interesting given the centrality of place to asthma as a disease condition and epidemic.
*Asthma research is incredibly diffuse and diverse. Part of the idea for The Asthma Files came from Kim Fortun and Mike Fortun’s work on a previous project, where they perceived communication gaps between scientists who might otherwise collaborate on asthma research. Thus, one of our project goals has been to document and characterize contemporary asthma studies, tracing connections made across research centers and disciplines. In the case of a complex and varied disease like asthma, a condition that looks slightly different from one person to the next and is likely produced by a wide composite of factors, the field of research is vast, with studies that range from pharmaceutical effects and genetic shifts to demographic groups, comorbidities, and environmental factors like air pollution, pesticides, and allergens. Admittedly, we’ve been slow to map out different research trajectories and clusters while we work to develop better visualization tools in PECE (see Erik Bigras’s February post on TAF’s platform).
What has been clear in our research, however, is that EPA- and/or NIEHS-funded centers undertaking transdisciplinary environmental health research seem to advance collaboration and translation better than smaller-scale studies. This suggests that government support is greatly needed in efforts to advance understanding of environmental health problems. Transdisciplinary research centers have the capacity to conduct studies with more participants, over longer periods of time, with more data points. Columbia University’s Center for Children’s Environmental Health provides a great example. Engaging scientists from a range of fields, CCCEH’s birth cohort study has tracked more than 700 mother-child pairs from two New York neighborhoods, collecting data on environmental exposures and child health and development. The Center’s most recent findings suggest that air pollution primes children for a cockroach allergy, which is a determinant of childhood asthma. As these findings show, CCCEH’s work has made substantial contributions to understandings of the complexity of environmental health. Of course, these transdisciplinary centers, which require huge grants, are just one node in the larger field of asthma research. What we know from reviewing this larger field is that 1) most of what we know about asthma is based on studies conducted in major cities; 2) studies on pharmaceuticals greatly outnumber studies on respiratory therapy, studies on children outnumber studies on adults, studies on women outnumber studies on men, and many of the studies on how asthma is shaped by race and ethnicity focus on socioeconomic factors and structural violence; and 3) over the last fifty years, advancements in inhaler mechanics and design have been limited in key ways, especially when compared to the broader field of medical devices.
*Given the contextual dimensions of environmental health, responses to asthma are shaped by local factors. What has been most interesting in our collaborative work is seeing what comes from comparing projects, programs, and infrastructure across different sites. Which communities and organizations enact which kinds of programs to address the asthma epidemic? What resources and structures are needed to make environmental health work happen? Environmental health research of the scale conducted by CCCEH depends on a number of factors and resources: an available study population, institutional resources, an air monitoring network, and medical infrastructure, not to mention an award-winning grassroots organization, WE-ACT for Environmental Justice. Infrastructure can be just as uneven and fragmented as the data collected, and the two are often linked: despite countless studies associating air pollution with asthma, less than half of all U.S. counties have monitors to track criteria pollutants. And although asthma education programs have been designed and studied for more than two decades now, implementation is uneven, even in the case of the American Lung Association’s long-standing Open Airways for Schools. This is not to say that asthma information and care aren’t standardized; many improvements have been made to standardize diagnosis and treatment in the last decade. Rather, it’s often the form that care takes that varies from place to place. One successful example is the Asthma and Allergy Foundation of America’s Breathmobile program. Piloted in California more than a decade ago, Breathmobiles now serve hundreds of California schools and more than 5,000 kids each year. Not only are eleven Breathmobiles in operation in California, but the program has also been replicated in Phoenix, Baltimore, and Mobile, AL.
Part of the program’s success in California can be attributed to the work of the state’s AAFA chapter and its partnerships with health organizations like the University of Southern California and various medical centers. Importantly, California has historically been a leader in responding to environmental health problems.
As we continue our research in various fieldsites, grow our archive, and implement new data visualization tools, we hope to expand on these findings and further synthesize our collective work. And beyond what we’re learning about the asthma epidemic and environmental health in the U.S., we’ve also taken many lessons from our collaborative work, and from the platform that organizes us.
March 5th, 2013, by Andrew Asher
Understanding “discovery”—the processes through which people locate previously unknown information—is a critical issue for academic libraries and librarians as they endeavor to provide and make accessible materials for students, faculty members, and other library users. Until relatively recently, people seeking information at an academic library were typically faced with a myriad of confusing catalogs, indexes, and databases, each with a different topical coverage, organizational structure and search interface. For people increasingly accustomed to Google’s simple search interface and natural language functionality, the “cognitive load” of siloing information in this way can be extremely high. Library discovery systems were developed to address this problem. By creating a centralized index of a library’s resources, these tools allow a user to simultaneously query almost all of a library’s holdings via a single Google-style search box.
Along with my colleagues Lynda Duke and Suzanne Wilson, I recently completed a research study examining how undergraduate students located information using two discovery tools, EBSCO Discovery Service (EDS) and Serials Solutions’ Summon, as well as Google Scholar and the “traditional” suite of library catalogs and databases (see Asher, Duke & Wilson 2013). In this study, we asked students to find resources for a set of research questions similar to research-paper assignments they might receive for a class. After they finished finding these materials, we played back a recording of their searches and conducted a debriefing interview, during which we asked them to discuss how they approached finding particular resources and how they evaluated the sources and information they chose to use.
As we observed how students interacted with the discovery tools, we also learned about how these systems perform an epistemological function by structuring how students use information and construct knowledge. No matter which search system the students used, the process by which they approached searches typically followed a single pattern. Students generally treated every search interface they encountered like a Google search box, using simple keyword search and ignoring more advanced functionality. Simple keyword searches accounted for 82% of the searches we observed in our study.
Used in this way, all of the discovery systems in this study will return a very large number of items for any given query. Faced with a results set that was almost always too large to evaluate comprehensively on an item-by-item basis, students instead relied primarily on the effectiveness of the search algorithms to determine resources’ quality, making rapid appraisals of an item’s usefulness based on its title or a superficial scan of its abstract, and almost never considering materials displayed after the first page of results. In total, 92% of the resources utilized by students in our study were found on the first page.
This de facto outsourcing of the evaluation process to the search algorithm itself makes the default ranking criteria of the discovery system perhaps the single most important factor in determining which resources students chose to use. Moreover, differences in the way the discovery systems produce search results could be directly observed in the resources chosen. For example, students using Summon utilized more newspaper and trade journal resources than those using EDS, since EDS weights results based on article length (i.e., when other factors are held constant, longer materials rank higher). Similarly, students using Google Scholar used more book resources due to its integration with Google Books.
This intersection between students’ search practices, which likely reflect their day-to-day usage of Google and other general search engines, and the design of discovery systems’ interfaces and algorithms illustrates an important example of “algorithmic culture.” Ted Striphas (2011) uses the term “algorithmic culture” to describe how some aspects of the work of culture, “the sorting, classifying, hierarchizing, and curating of people, places, objects, and ideas,” are becoming the purview of “machine-based information processing systems.” He continues, “some of our most basic habits of thought, conduct, and expression. . .are coming to be affected by algorithms, too. It’s not only that cultural work is becoming algorithmic; cultural life is as well” (Striphas 2011).
Through the act of ordering and ranking, search systems’ relevancy algorithms impart (and reinforce) a sense of authority and credibility to the results. Students in our study regularly assumed that the objectively “best” information would be ranked first, substituting the judgment of the algorithm for their own thought processes. This “trust bias” is well documented in the literature on search engines (see Vaidhyanathan 2011:59; Hargittai et al. 2010; Hargittai 2007; Pan et al. 2007), and is also reflexive; because the search system alone holds the power to create a ranked list of resources from the huge number of possible choices, it self-validates the quality of these results.
Relevancy-ranking algorithms are also cultural artifacts, and can be understood as embodying a set of socially and culturally embedded negotiations, decisions, judgments, biases, politics, and ideologies. For example, PageRank, the ranking and relevancy algorithm that comprises the core of Google search, is premised on a concept of aggregated social judgment: the assumption that a mathematical calculation based on the number of links to a website, combined with an evaluation of the relative importance of the websites from which those links originate, can serve as a proxy for evaluating the quality or value of a site (see Brin & Page 1998; Page et al. 1999; Battelle 2005:75-76). Likewise, the discovery systems used in our study also contain a set of embedded decisions about information organization and quality, each of which represents a specific judgment about the relative value of information. For example, each system must define what characteristics qualify a journal as “peer-reviewed” and scholarly, as well as how to treat these materials once a determination has been made.
Unfortunately, since discovery systems are for the most part proprietary technologies, many of these judgments and decisions are kept secret from the user. For this reason, students cannot properly interrogate how a discovery system works even if they want to, and must simply put their faith and trust in the algorithm and the people who designed it. From a pedagogical standpoint this is quite concerning, since students for the most part appear to mistakenly view discovery and other search systems as neutral tools and do not consider their potential biases.
By shaping the processes through which information is found, discovery systems thus exert a form of disciplinary power that provides the scaffolding for how students complete their academic work and structures the way they acquire knowledge. For this reason, libraries using or considering the implementation of these systems should carefully and critically assess their design and functionality, as well as the potentially determinative effect these systems might have on students’ research outcomes. Students’ practices of primarily utilizing the basic search functionality of any search system, relying only on the first page of search results, and trusting the relevancy rankings of a given discovery system make the default settings of these tools critically important. These patterns also underscore the instructional needs of students in both the technical and conceptual aspects of search, as well as in algorithmic literacy and the understanding of algorithmic cultures.
Asher, Andrew, Lynda Duke, & Suzanne Wilson
2013. “Paths of Discovery: Comparing the Search Effectiveness of EBSCO Discovery Service, Summon, Google Scholar, and Conventional Library Resources.” College and Research Libraries. Forthcoming July 2013. Preprint available at http://crl.acrl.org/content/early/2012/05/07/crl-374.full.pdf+html .
Battelle, John
2005. The search: How Google and its rivals rewrote the rules of business and transformed our culture. New York: Portfolio.
Brin, S., and L. Page.
1998. “The anatomy of a large-scale hypertextual Web search engine.” Computer networks and ISDN systems 30 (1-7): 107–117.
Hargittai, E.
2007. “The social, political, economic, and cultural dimensions of search engines: An introduction.” Journal of Computer-Mediated Communication 12 (3): 769–777.
Hargittai, E., L. Fullerton, E. Menchen-Trevino, and K.Y. Thomas.
2010. “Trust online: young adults’ evaluation of Web content.” International Journal of Communication 4: 468–494.
Page, L., S. Brin, R. Motwani, and T. Winograd.
1999. “The PageRank citation ranking: Bringing order to the web.”
Pan, B., H. Hembrooke, T. Joachims, L. Lorigo, G. Gay, and L. Granka.
2007. “In Google we trust: Users’ decisions on rank, position, and relevance.” Journal of Computer-Mediated Communication 12 (3): 801–823.
Striphas, Ted
2011. “Who Speaks for Culture?” posted Sept. 26, 2011, http://www.thelateageofprint.org/2011/09/26/who-speaks-for-culture/
Vaidhyanathan, Siva
2011. The Googlization of Everything (and Why We Should Worry). Berkeley: University of California Press.
February 5th, 2013, by Erik Bigras
The Asthma Files is a collaborative ethnographic project focused on the diverse ways people in settings around the world have experienced and responded to the global asthma epidemic and air pollution crisis. It is experimental in a number of ways: it is designed to support collaboration among ethnographers working at different sites, with different foci, so that many particular projects can nest within the larger project structure. This is enabled through a digital platform that we have named PECE: Platform for Experimental, Collaborative Ethnography. PECE is open source and will become shareable with other research groups once we work out its kinks.
PECE has been built to support collaborative, multi-sited, scale-crossing ethnographic research addressing the complex conditions that characterize late industrialism, conditions such as the global asthma epidemic and air pollution crisis that implicate many different types of actors, locales, and systems (social, cultural, political-economic, ecological, and technical) and that call for new kinds of ethnographic analysis and collaboration. The platform links researchers in new ways and activates their engagement with public problems and diverse audiences. The goal is to allow platform users to toggle between a jeweller’s-eye and a systems-level perspective, connecting the dots to see “the big picture” and alternative future pathways.
The Asthma Files has taken us “beyond academia” in a number of ways. Ethnographically, we are engaging an array of professionals, organizations, and communities, trying to understand how they have made sense of environmental public health problems. We want to document their sense-making processes and what has shaped them; we also want to facilitate those processes through ethnography that helps them understand their own habits of thought and language, and those of others with whom they likely need to work cooperatively. For example, we’ve recently been contacted by a New Orleans housing contractor who wished to know what kind of research is being done on asthma and housing in Louisiana. PECE is designed to support this, making space for different kinds of participants at different points in the ethnographic process.
We’ve also gone “beyond academia” to learn how to think about and build a digital platform to support ethnographic work. One step involved selecting the best (for our purposes, for now) online content management system. It quickly became apparent that most technical professionals had strong preferences, sometimes based on assessments of functionality, sometimes, it seemed, as a matter of habit. Through a long, comparative process, we ultimately decided on Plone, an open source content management system known for its security capabilities (important in creating space where groups of ethnographers can work together with material, perhaps IRB restricted, out of sight even though online), for its capacity to archive original content (such as interview recordings), and for the ways it supports our effort to nest multiple projects within a larger project structure.
Another important step, which we are still figuring out, is hiring the ongoing technical help we need for PECE. We need ongoing technical help because the platform isn’t finished as we now envision it, but also because we want the platform to continually evolve as we figure out what kinds of functionality are needed to support collaborative ethnographic work, which may be specific to each project housed on PECE. So we need ongoing, ever-learning relationships with people who can provide the technical support PECE requires, such as computer scientists, IT specialists, or programmers. As ethnographers, we know that technical professionals will think very differently about the work that we do, and we need to learn to work with this. We need to engage with skills and knowledges traditionally outside the discipline of anthropology by taking on, in a practical way, the continual anthropological challenge of figuring out how difference works.
The Asthma Files and PECE are experiments that have taken us in many new directions – beyond academia, as well as back to basic questions about what should be considered ethnographic material, where theory is in ethnography, how ethnographic findings are best presented, etc. We keep open a call for new collaborators. Let us know if you would like to be in our mix.