On the morning of Friday, March 10, 2023, Nick Seaver and I met over Zoom to talk about his new book Computing Taste: Algorithms and Makers of Music Recommendation, published in 2022 by the University of Chicago Press. In that meeting, we recorded an episode of the Platypod podcast, which is available at the link above.
About the author
Nick Seaver is an anthropologist who, as he puts it, studies how people use technology to make sense of cultural things. He teaches in the Department of Anthropology at Tufts University, where he also directs the Science, Technology, and Society program. His first book is Computing Taste: Algorithms and Makers of Music Recommendation. Nick has also published several articles in academic journals on topics related to critical algorithm studies, ethnographic storytelling, and anthropological research methods. A more comprehensive list of his academic work can be found at this link.
About the book
Computing Taste is about the people who make music recommender systems and how they think about their work. The book is 216 pages long, divided into six chapters, plus a prologue, introduction, and epilogue. It stems from Nick’s Ph.D. dissertation at the University of California, Irvine. Each chapter of Computing Taste offers a dense, well-researched, and well-told story of how the sociotechnical arrangements that give life to music recommender systems come together in practice. Each chapter also challenges conventional narratives about algorithmic systems and their “evil” impacts on society. In a way, Nick’s book surprises the reader by telling stories that we’re not expecting to hear.
In our interview we covered several topics, including how Nick’s work has been received by the anthropological community and some major themes from the book. As a graduate student, I was especially interested in questions of how to conduct research. I encourage the reader to read the entire book and engage with the richness of information and anthropological analysis brilliantly conducted by Nick.
On black boxes as legal regimes
Nick has a sophisticated way of critiquing data metaphors. He is cautious and does not jump to hasty conclusions and judgments that classify algorithms as good or evil, which ultimately exclude the “socio” from these sociotechnical systems. Nick reminds the reader that black boxes are constituted by legal regimes, a perspective he builds on from Frank Pasquale’s book, The Black Box Society. According to Nick, these secret boxes are a story of intellectual property law and trade secrets. The black box, as a metaphor, becomes a problem because it makes us want to know what’s inside it. Nick thinks the black box figure is damaging to how we think about these systems because it encourages us to treat them as discrete, individual entities that exist by themselves in the world. This metaphor can lead us to believe that black boxes are openable, which they are not, since they are constantly being changed, updated, trained on new datasets, and adapted to users’ behaviors. This makes Nick’s book even more fascinating, as the people behind algorithms for music recommendation are trying to capture, measure, retain, and work with systems that are always in flux.
As a reader, I noticed how the conventional idea of access used in much anthropological research does not translate well to the study of objects bound by “legal regimes.” Nick mentions in the book that “access is not an event.” In discussing access-related questions (access to information, people, and resources during fieldwork), Nick explained that he has a complicated relationship with this idea. There’s no way to show up at Facebook or other tech companies and just do “fieldwork inside the company.” Not all anthropologists can get to every place, Nick said, and what can we do in that case? Nick hinted in his response that we might need to change our questions and methods and, more importantly, what access even means. As anthropologists of tech, and more specifically in Nick’s case, as an anthropologist of tech and startups, we’re doing something more than just going to the field, finding something that everybody there knows, and telling other people about it. “My job is not to go into a company, figure out how the algorithm works, write it down, and then sort of be a corporate espionage actor and bring it out,” he said. This disrupts, perhaps, what we conceive of as the point of anthropology. Is the point of anthropology just to share secrets? Nick does not think so, especially as these sociotechnical systems are protected by legal regimes and also because they simply don’t exist out there in the world as discrete things to unearth. As he says,
If you’re studying algorithms, there’s no algorithm. They [the company] don’t have that, like sitting out on the table somewhere, right? It’s not anything that you can see. And so, really, it’s this ongoing process as you access this kind of thing. It’s a relationship. It’s a negotiation. It’s an ongoing effort.
Nick described how an ethnography is also a history, and in this light, an ethnography should instead narrate this ongoing effort of gaining access to the field, as well as tell stories that people might not be expecting to hear. In all, he moves away from an essentialized view of fieldwork and talks about how parts of his fieldwork involved meeting people at rock climbing gyms, waiting rooms, conferences, and offices, as well as watching YouTube videos. He is also aware of his positionality and notes that he shares some identity markers with the developers of music recommender systems—predominantly white, American men in their thirties—and how that shaped his entrance into this field.
On critiquing sociotechnical systems
Nick thinks there is much work still to be done in the anthropology of “these systems”—algorithmic systems and the designers who try to capture, measure, and retain users’ attention. He notes that his book is not explicitly critical of the systems he writes about in the way people might expect, and he worries about the book “coming across as being very nice.” This sits within a current public discourse around recommender systems that tends to portray them as either good or evil.
Rather, since Nick’s book stems from a moment in his career when he was an anthropologist in training, he came to see his goal as giving an adequate representation of what was going on in the sociotechnical systems behind music recommendation. He blended this idea with “a little bit of our sort of classic anthropological virtues of interpretive charity.” In this light, his book is rich not only in ethnographic descriptions and vivid stories from the field, but also in anthropological interpretation, and the reader can expect to encounter Bourdieu, Lévi-Strauss, Alberto Corsín Jiménez, and many other theorists. During our interview, Nick noted that his book is about “music recommender system developers primarily based in the United States,” and that its findings can’t be universalized. Still, he is looking forward to new ethnographic work on streaming services that focuses on contexts beyond hegemonic settings like the US, adding layers of technical and cultural specificity.
As we wrapped up the interview, Nick noted how being part of the broader CASTAC community means being in contact with brilliant researchers and their research.
During our interview, I told Nick about the impressions his book left on me as a reader. But it wasn’t just the book that left a positive impression on me. It was my first time recording an interview for a podcast. My anxiety and insecurity are noticeable in the recording, as is my accent, since English is not my first language. But Nick was extremely patient and kind. I thank him for the insightful and warm conversation, and my colleagues at Platypus for the opportunity to produce this episode of Platypod. I encourage you to listen to the entire episode and let us know if there are any authors or books you’d like us to record an episode about.