This article is the first in a series about stuckness in science and technology. Read the introduction to the series here.
What might we learn from the experiences of tech professionals who feel stuck? How does stuckness come about, and what do these moments represent?
This post traces two stories from different worlds: an Indian NGO and an American Big Tech corporation. One follows Leena[1], an employee at InnovateTech, an Indian education technology (EdTech) NGO. The other follows Cody, a software engineer at Microsoft, working in the United States. On the surface, Leena and Cody have more differences than commonalities. Their employers operate in very different cultural and technological contexts, influenced by distinct economic and political machinations. Their everyday experiences as they move through the world, one as a brown woman and the other as a white man, contrast significantly.
However, as I spent time engaging with professionals in these distinct field sites, I found that there was a common thread across their disparate experiences and processes of technology building: a feeling of being stuck. In this post, I explore how stuckness came about for these workers—in instances where the anti-politicality of tech workplaces collided with decidedly political structures—and how they internalized feelings of stuckness in ways that recast systemic, structural issues as personal failures.

The reach of Big Tech felt from the Golden Gate Bridge to India. Photos by author.
Part I: Indian EdTech
Leena is the head of learning and design at InnovateTech. She was one of the NGO’s initial employees; it has since grown into an organisation of over 400 people. At the time of our conversation, she had been working there for four years. Over her educational and professional career, she has worn many hats: studying journalism, teaching, working as a design consultant, and building educational curricula for EdTech startups.
InnovateTech is an NGO that provides pedagogical materials and feedback to children in rural regions of India who have minimal access to educational resources. Over the course of our conversation, Leena spoke to me about her work and how she enjoys designing the experience of learning. However, moments of stuckness came through in the tensions between business and pedagogical priorities, and in concerns about cultural erasure in bringing AI-powered tech into rural Indian communities.
InnovateTech operates, in the words of its employees, more like a tech startup than an NGO. It has received a fair bit of attention in the past few years, playing a significant role in influencing national education policy on the one hand, and garnering funding from Silicon Valley Big Tech on the other. One such round of funding happened to come from a UN Sustainable Development Goals-themed grant from a Big Tech company.
The grant, Leena explained, was specifically geared towards incorporating AI and machine learning (ML) into InnovateTech’s existing technological infrastructures and solutions. In some ways, this meant good things: ML models would allow InnovateTech to scale its solutions to levels it could not previously achieve. But the pedagogical implications of such funding weighed heavily on Leena’s mind:
“NGOs, they play a really large role in how public education gets shaped, in India more so than probably anywhere else. And I think the funders, the sort of program, is often designed with very rigid ideas about what children should learn, and how they should demonstrate their learning.”
That this funding was specifically tied to building AI solutions was a source of further worry for her. One of the highlights of work for Leena was the time she spent with communities in rural regions: with teachers, parents and children. However, this experience threw the colonising nature of schooling and education into stark relief for her. Leena had observed that there were many things that children belonging to rural, agricultural families “fundamentally [knew] from their environment”. But classroom spaces made these children distrust what they knew. It seemed to Leena that it made them feel that “there is a right answer, and my answer is not the right answer”. This was something she felt was exacerbated by the scale and speed at which AI was being pushed within EdTech. She elaborated on her concerns in her work with rural communities:
“One of my greatest inhibitions and fears is that, am I going and telling them कि आप यह करो [that you please do this]…at the price of convincing them that what they’re doing is not right. They might be doing a bunch of incredible things. But I’m replacing it […] And one couldn’t do this at this massive scale before […] At max, you would destroy a hundred people’s lives […] But now you’re destroying things that have taken decades, generations of intelligence.”
These were concerns that Leena repeatedly brought up at work. But they seemed to fall on deaf ears. There was helplessness as she went on to state: “this system, I mean…a $1 million grant, what is my voice against that.” Despite walking me through the political position NGOs occupied in the country, the economic factors influencing what got built, and the impact of AI hype on the creation of specific types of EdTech solutions, Leena mused aloud that perhaps the issue was her: “I am a misfit in some sense.” She interpreted her feelings of frustration and resignation, of stuckness, as being a matter of her personal inability to align with InnovateTech’s paradigm of work; a paradigm within which a Big Tech grant that would help scale InnovateTech’s solutions remained unassailable.
A few months later, Leena told me that she had quit InnovateTech. She did not get into details of why. Revisiting our initial conversation, I perceived a sense of isolation in her criticality. Perhaps Leena’s actions, in part, represented what Widder et al. (2023) call “quiet quitting”, a tactic emerging “from a feeling of powerlessness to affect change within organizations” (p.5).
Part II: American Big Tech
Cody is a software engineer working at Microsoft in the United States. He is a veteran of the tech sector, having worked in various engineering roles for over 15 years. There are many things he really likes about his company. “The people here are very, very smart,” he tells me, and “there’s a lot of computer science knowledge […] sunk into the company […], lots of opportunities to learn”.
But there were aspects of the work that frustrated him. He articulated feeling like a roadblock to progress, getting stuck between ethical concerns and the push to deliver innovative products. For example, Cody used to volunteer his time as a privacy reviewer for new features being built by other teams at Microsoft. This was a key process in ensuring that user data was sufficiently protected, in adherence with laws such as the General Data Protection Regulation. But it was a role that Cody, in his words, ended up “hating”. People kept treating him like a “roadblock,” especially when there were release deadlines or promotions on the line. According to Cody, it was an experience where you repeatedly had to “hold your ground against people who are otherwise wanting to bully [you]”. He eventually stopped being a privacy reviewer altogether.
The company had also changed over the years, Cody shared. When he started in 2017, there were far more opportunities for engineers to work on things that interested them; those opportunities had since dwindled. And the number of people willing to call out unethical practices had declined over the years. At one point he tells me that speaking up (which he laughingly referred to as “stirring up trouble”) might be “the last thing you do”. He noted that this experience at Microsoft reflected a wider industry trend: the firing of AI ethicist Timnit Gebru comes up in our conversation, as does the spate of layoffs that have impacted the U.S. tech industry.
Talking through all of this, Cody eventually says: “we all feel a bit trapped. I feel trapped. It’s not like I’m helping the world. I’m not saving the planet. I’m not doing any of those things that I as a 20-year-old promised myself I’d go do.” But Cody still held onto a type of cynical optimism:
“I stick around hoping that at some point I switch to a team or something that will actively work towards say helping combat climate change and that type of thing […] I think we can improve people’s lives and improve the quality of education and all these good things. So I have hope that this advanced technology will be applied for good and improve future generations’ lives, and their opportunities. I don’t know that it will happen. I have a bad taste in my mouth while saying it, that it probably won’t work out that way.”
Engaging with Stuckness
In the experiences of both Leena and Cody, the material conditions (shaped by power structures) within which tech work was being performed brought about moments of stuckness, embodied in feelings of frustration and helplessness. These were hard problems—and decidedly not technical ones—with no quick fixes. STS scholars Malazita & Resetar (2019) argue that the discipline of computer science fosters a type of anti-politicality. Unlike apoliticality, this approach acknowledges the political or ethical issues present in technological assemblages, but abstracts them away as not being part of the art of building good tech, particularly when they cannot be instrumentalized through code.
It was within stuckness that I came to understand how tech professionals struggled with the contradictions arising within the peculiar epistemic cultures of technology work that fostered this type of anti-politicality. Despite their experiences being brought about by larger issues stemming from the political economy of tech work, and despite the criticality they both brought to the table, the stuckness Leena and Cody experienced resulted in feelings of personal failure. For Leena, she was the misfit. For Cody, there was a sense of personal moral failure in stuckness, in not using his skills for goals he thought were worthy. Within the paradigm of anti-politicality, feelings of discomfort, frustration, or helplessness that could not be tackled or instrumentalized through processes of technology development were left to languish, consigned to the realm of the personal.
In bringing these two stories together, I do not claim an equivalence between their experiences, one that would dissolve the historic and continuing extractive and oppressive relations—especially in tech work—between the Majority and Minority worlds. Rather, I bring them together to highlight the common experiences that cut across worlds, commonalities upon which unlikely alliances may be conceived.
Within the diversity of their work, contexts, and cultures, stuckness was an experience that both Leena and Cody went through. If solidarity can and must be fostered between different types of tech workers (Dorschel, 2022), there is perhaps potential in stuckness, as a shared experience, to bring together tech workers in relatively well-paid positions across the Majority and Minority worlds.
Eventually, my interlocutors simply pivoted so as to keep moving, to get un-stuck; pivots resigned to the fact that this was just the way things were. Leena chose to remove herself from the system she saw as problematic. Cody tried to extricate himself from the types of work that frustrated him, holding onto an optimism that he might eventually do work aligned with his values. What counted was finding ways of moving forward beyond stuckness.
In the face of increasing financial and technological domination, and ecological destruction, by Big Tech actors, what can such potential alliances based on the shared experience of stuckness achieve? What must they stand for? These are questions not just for those engaged in different types of technological work, but for all of us, collectively.
This post was curated by Contributing Editor Michelle Venetucci and Shoko Yamada, and edited with the help of Contributing Editor Shreyasha Paudel.
Notes
[1] Names of people and companies have been changed to preserve anonymity.
References
Dorschel, R. (2022). Reconsidering Digital Labour: Bringing Tech Workers into the Debate. New Technology, Work and Employment, 37(2), 288–307. https://doi.org/10.1111/ntwe.12225
Malazita, J. W., & Resetar, K. (2019). Infrastructures of Abstraction: How Computer Science Education Produces Anti-political Subjects. Digital Creativity, 30(4), 300–312. https://doi.org/10.1080/14626268.2019.1682616
Widder, D. G., Zhen, D., Dabbish, L., & Herbsleb, J. (2023). It’s About Power: What Ethical Concerns Do Software Engineers Have, and What Do They (Feel They Can) Do About Them? FAccT ’23: Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency, 467–479.