Tag: computer science

The Anti-Politics of Women in Tech

News articles about women in tech appear almost daily. Among those published on the day I wrote this post, for example, were a piece in Marie Claire, the women’s magazine, called “How Much Have Things Really Changed for Women in Technology?” and another in India’s business newspaper Mint titled “Two kinds of pay gap in the IT industry: NetApp’s Mark Bregman.” Both articles touch on several issues about women in tech, and STEM fields more generally; the cornerstone of each, however, is simply the number of women in the tech world, or rather the lack thereof compared with men. This problem has been explored in computer science since at least the mid-1970s (e.g., Montanelli Jr. and Mamrak 1976), and for longer in some other STEM fields. The issue drew renewed media and public attention last year, when large tech companies like Google, Apple, Twitter, and Facebook released “diversity data” showing the dismal number of women and minorities among their employees. The articles also point to several factors seen as contributing to the disparities, including pay and hiring gaps for women, so-called “brogrammer” culture (involving frat-house-like sociality and performances of technical heroism, generally among men), and implicit biases shaping how women (and men) are perceived and judged. As a former woman in tech (I pursued an undergraduate degree in computer science), I appreciate how this surge in public awareness and interest is helpful to many, particularly in relation to discussions about sexism and tech cultures. Through social media, blogs, and news articles, people are sharing and discussing personal experiences and working to raise awareness of, and gain support for, the challenges women as a group face in tech. Tech companies and governments have also pledged a great deal of money towards “fixing” this problem. (read more...)

Hardwired Hayek: Lessons for economic anthropology from electricity markets

For most of its history in the US, electricity has been a monopoly commodity: in a delimited territory, only one company was legally allowed to produce and deliver electricity to consumers. This state of affairs began to be challenged in the 1970s when, in accordance with the neoliberal wave, a number of infrastructural services (e.g., airlines, telecommunications) were deregulated, meaning they were opened to competition by law. Electricity followed in the 1990s. First, the Energy Policy Act of 1992 allowed states to break monopolistic utilities into separate production and delivery companies. The act also allowed states to take technological measures to ensure that new companies could plug into the electric grid to sell or buy electricity. Then the Federal Energy Regulatory Commission (FERC) introduced the concept of electricity markets: computational processes through which prices are set for all buyers and sellers, and which are run by non-profit operators of the transmission grid. I can’t stress enough the computational nature of these new markets: they exist because the grid is wired up with many kinds of sensors and computational devices that are continuously calculating and zigzagging “information.” Making these markets requires not just economists but also engineers, programmers, traders, and database specialists, all concerned with making sure that the nature and order of information flows are just right. (read more...)
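To make that computational point a bit more concrete, here is a minimal sketch, in Python, of the simplest kind of price-setting such markets automate: a uniform-price auction that matches sorted bids and offers. This is my own toy illustration, not any grid operator’s actual software; real markets set locational prices by solving large optimization problems, and every name and number below is hypothetical.

    # Toy uniform-price market clearing, loosely in the spirit of a
    # wholesale electricity auction. Illustrative only.

    def clear_market(bids, offers):
        """bids/offers: (price $/MWh, quantity MWh) tuples.
        Returns (clearing_price, cleared_quantity)."""
        bids = sorted(([p, q] for p, q in bids), key=lambda b: -b[0])   # buyers, highest first
        offers = sorted(([p, q] for p, q in offers), key=lambda o: o[0])  # sellers, lowest first
        cleared, price = 0.0, None
        i = j = 0
        # Match while the best remaining bid still meets the best remaining offer.
        while i < len(bids) and j < len(offers) and bids[i][0] >= offers[j][0]:
            traded = min(bids[i][1], offers[j][1])
            cleared += traded
            price = (bids[i][0] + offers[j][0]) / 2.0  # one simple pricing rule: the midpoint
            bids[i][1] -= traded
            offers[j][1] -= traded
            if bids[i][1] == 0:
                i += 1
            if offers[j][1] == 0:
                j += 1
        return price, cleared

    bids = [(40.0, 100), (35.0, 50), (20.0, 80)]      # hypothetical demand
    offers = [(25.0, 120), (30.0, 60), (45.0, 90)]    # hypothetical supply
    print(clear_market(bids, offers))                 # -> (32.5, 150)

Even this toy version shows why such markets are an engineering artifact as much as an economic one: the “price” only exists as the output of a program fed with a continuous stream of bid and sensor data.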

How influential was Alan Turing? The tangled invention of computing (and its historiography)

Alan Turing was involved in some of the most important developments of the twentieth century: he invented the abstraction now called the Universal Turing Machine that every undergraduate computer science major learns in college; he was involved in the great British Enigma code-breaking effort that deserves at least some credit for the Allied victory in World War II; and last but not least, while working on building early digital computers post-Enigma, he described, in a fascinating philosophical paper that continues to puzzle and excite to this day, the thing we now call the Turing Test for artificial intelligence. His career was ultimately cut short, however, after he was convicted in Britain of “gross indecency” (in effect, for being gay), and two years later he was found dead in an apparent suicide.

The celebrations of Turing’s birth centenary began three years ago, in 2012. As a result, far, far more people now know about him than perhaps ever before. 2014 was probably the climax, since nothing is as consecrating as having an A-list Hollywood movie based on your life: a film with big-name actors that garners cultural prestige, decent press, and of course, an Academy Award. I highly recommend Christian Caryl’s review of The Imitation Game (which covers Turing’s work in breaking the Enigma code). The film is so in thrall to the Cult of the Genius that it adopts a strategy not so much of humanizing Turing or giving us a glimpse of his life, but of co-opting the audience into feeling superior to the antediluvian, backward, not to mention homophobic, Establishment (here mostly represented by Tywin Lannister, I’m sorry, Commander Denniston). Every collective achievement, every breakthrough, every strategy is credited to Turing, and to Turing alone. One scene should give you a flavor of this: as his colleagues potter around trying to work out the Enigma encryption on pieces of paper, Turing, in a separate room all by himself, is shown building a Bombe (a massive, complicated machine!) with his bare hands, armed with a screwdriver.

The movie embodies a contradiction that one can also find in Turing’s life and work. On one hand, his work was enormously influential after his death: every computer science undergrad learns about the Turing Machine, and the lifetime achievement award of the premier organization of computer scientists is called the Turing Award. On the other, he was relatively unknown while he lived (“relatively” being the key word here, since he studied at Cambridge and Princeton and crossed paths with minds ranging from Wittgenstein to John von Neumann). Perhaps in an effort to change this, the movie (like many of his recent commemorations) goes all out in the opposite direction: it credits Turing with every single collective achievement, from being responsible for the entirety of the British code-breaking effort to inventing the modern computer and computer science. (read more...)
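For readers who have never met the abstraction mentioned above, a minimal sketch may help: a Turing machine is nothing but a finite rule table driving a read/write head over a tape. The toy Python machine below, which flips the bits of a binary string, is my own illustrative example (the rule format and state names are invented for this post); Turing’s deeper insight was that one “universal” machine can simulate any other, given that machine’s rule table as input.

    # Toy Turing machine: flips each bit of a binary input, then halts.
    # Rule table format and state names are invented for illustration.

    RULES = {
        # (state, symbol) -> (symbol_to_write, head_move, next_state)
        ("scan", "0"): ("1", +1, "scan"),
        ("scan", "1"): ("0", +1, "scan"),
        ("scan", "_"): ("_", 0, "halt"),   # '_' marks blank tape
    }

    def run(tape, state="scan", head=0):
        tape = dict(enumerate(tape))       # sparse tape: position -> symbol
        while state != "halt":
            symbol = tape.get(head, "_")
            write, move, state = RULES[(state, symbol)]
            tape[head] = write
            head += move
        return "".join(tape[i] for i in sorted(tape)).rstrip("_")

    print(run("10110"))  # -> "01001"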

On the Porous Boundaries of Computer Science

The term “big data” conjures the specter of a new positivism, the latest in a long series of ideological tropes that have sought to supplant the qualitative and descriptive sciences with numbers and statistics. But what do scientists themselves think of big data? Last year, in a widely circulated blog post titled “The Big Data Brain Drain: Why Science is in Trouble,” physicist Jake VanderPlas argued that the real danger of big data is that it moves scientists from the academy to corporations. (read more...)