Editor’s Note: This is the seventh and final post in our Law in Computation series.
At first, I was perplexed by the K5 by Knightscope, a “fully autonomous security data machine,” rolling through the Irvine Spectrum Shopping Center last summer. Now, I am neither cavalier nor naive about my rights to privacy, confidentiality, and anonymity, but I fully accept that I will be captured by surveillance cameras from arrival to departure in many private places. After all, there is strong market demand for surveillance technologies, and the market has long existed with little regulation from statutory or case law; their use continues to expand as the cost of sensors and data processing decreases. For example, many major businesses have adopted mobile location analytics sensors that capture IP and MAC addresses from visitors’ cell phones, and the falling cost of license plate reading technology has led to adoption beyond law enforcement, from malls to debt collectors.

It is impressive that the K5—a white, bullet-shaped object that can autonomously patrol a designated area—fits a variety of surveillance sensors (e.g., cameras, mobile location analytics, license plate readers) and processing capabilities (e.g., facial recognition, sound event recognition) into a 5-foot-tall, 400-pound machine (pictured below). However, the addition of the K5 at the Irvine Spectrum is a bit perplexing: it seems simply to duplicate the security and surveillance functions already established at the fortress-like shopping center.
Autonomous Surveillance in Private/Public Space
It is easy to imagine other spaces where the K5 and other “fully autonomous security data machines” would amplify existing surveillance technology. For example, the clustering of businesses that compose many vibrant downtown areas creates a quasi-public space: the sidewalks are public while the land parcels are private. Many cities have implemented laws that allow clusters of private businesses to pool resources to promote the economic vitality of these quasi-public spaces, and business improvement districts give business owners a certain level of control over them. Technologies of surveillance can indirectly affect who is and isn’t allowed in these quasi-public spaces. Whether they are deployed in private or public spaces, the surveillance capabilities of the Knightscope robots comply with current legal regulations for privacy. In a recent interview for WIRED, William Santana Li, Knightscope chairman and CEO, rather blandly insisted that, simply, “you have no expectation of privacy in [the kind of] public area[s] where all these machines are operating.” While the K5 might be more cost-effective, at least compared with each business adopting its own surveillance technology or hiring its own security guards, its adoption might depend on the public’s expectation that the regulation of private space differs from the regulation of public space. After all, autonomous security robots represent a novel governing architecture for regulating space.
Interactions between humans and K5 robots in co-occupied space have been curious so far. The shopping mall, as a highly securitized and surveilled space, has acted as a sort of lab for demonstrating the capabilities of autonomous security robots and for observing the accompanying human interactions. The numerous security cameras, installed both on the robot and throughout the shopping plaza, can capture interactions with the K5 robots. While the K5 is highly unlikely to give pursuit to a thief or robber at the Irvine Spectrum—the established surveillance technologies already deter criminals—the robot is able to practice moving through complex spaces with crowds of people while simultaneously testing its surveillance capabilities. That said, the K5 needs more practice navigating its environment; one model “drowned” itself in a mall fountain and another ran over a toddler.
At the same time, mall visitors, perhaps preoccupied with their other tasks, appear to display an air of deference toward their mechanical co-occupants. Patrons have not been seen conducting acts of resistance, like attempting to block sensors or disrupting other functions. This is perhaps surprising, given that a man in Mountain View, California, was recently arrested for public intoxication after knocking down a patrolling K5. On the contrary, at the Irvine Spectrum, I witnessed visitors gawking at, photographing, and generally fawning over the K5, struck by its novelty and sci-fi aesthetic. In this setting, the K5 robot is socialized as an object of entertainment, and even a symbol of the uniqueness of the shopping plaza. The socialization of deference and obedience toward these novel autonomous security robots in such a highly regulated space might give us clues about how they might alter less regulated spaces as well.
But There Are Consequences…
In the fall of 2017, a K5 robot was deployed in San Francisco, CA, to patrol around an animal shelter in an area frequented by the homeless. The K5 was successful in deterring homeless populations from the area around the shelter, but there was swift and intense public outrage over deploying the K5 to securitize space in this manner. Unlike mall patrons at the Irvine Spectrum, the homeless who were surveilled (and harassed) used active forms of resistance, including smearing the cameras and covering the robot with a tarp. The associations and interactions we have with urban places “are now not only mediated by software and code, they are becoming constituted by it” (Burrows 2009). People are likely to fawn and gawk over an autonomous security robot at the mall, but does the reaction differ when we witness autonomous security robots actively harassing homeless people to move off a sidewalk?
In the end, the city threatened to fine the animal shelter under recently passed legislation limiting the operation of delivery robots on public city streets and sidewalks. This is a rare example of legal regulation anticipating a technology and benefiting the less powerful. Eventually, the public backlash caused the animal shelter to discontinue its use of the K5. Unsurprisingly, Knightscope is currently working toward legislation and court rulings that would be more favorable to autonomous security robots. So, while these robots do not appear to solve entrenched structural problems of inequality, only time will tell what legal regulation will come about and whom it will truly benefit.
Lawrence Lessig argued that cyberspace should reflect the values of society, and that the regulation of cyberspace depends on its code, the architecture of cyberspace (see Code is Law and Code: Version 2.0). There are many examples of “real-space” code: instances of physical architecture designed to encourage or discourage certain forms of behavior. For example, the city of Irvine, CA, home to the University of California, Irvine and the Irvine Spectrum Shopping Center, is actively designed and maintained to reduce criminal opportunities. New surveillance technologies, from autonomous security robots to predictive policing, represent a type of code designed to regulate behavior in the real world.
Law and Society scholars have long examined how laws can be used to regulate places against the powerless to benefit the powerful. For example, various vagrancy laws, especially homeless admonishments, produce legally imposed spatial exclusion of the poor (Beckett and Herbert 2009). The contribution of housing and zoning policies to the entrenchment of poverty and racial segregation is another consequence of law regulating place (Rothstein 2017). Autonomous security robots represent a novel form of behavioral regulation of place, as a force of architecture; but they are also coded objects, imbued with the values and biases of their programmers. Technology and code can also create physical architectures that provide spaces for people, rather than just keeping people out of places. Hopefully, land use laws will change significantly enough for us to experience the revolution in 3D-printed housing.
The Technology, Law and Society Institute at the University of California, Irvine will explore the societal implications of code, programmed with values and biases, that regulates cyberspace and physical places. Surveillance technology can act as a type of architectural force in regulating society. However, surveillance technology does not exist in a vacuum; it is interdependent with other regulatory forces at multiple scales of influence. The Institute was established not only to examine the intersection of law and technology, but also to explore how technology, law, and society are mutually constitutive. Autonomous security robots like the K5 provide an interesting case study, one that engages both our legal consciousness (Ewick and Silbey 1998) and our technological consciousness in mutually constituting our perceptions of surveillance technology and of place.
References
Beckett, Katherine, and Steve Herbert
2009 Banished: The New Social Control in Urban America. Oxford University Press.
Burrows, Roger J.
2009 “Afterword: Urban Informatics and Social Ontology.” In M. Foth (ed.), Handbook of Research on Urban Informatics, pp. 450–454. Information Science Reference.
Ewick, Patricia, and Susan S. Silbey
1998 The Common Place of Law: Stories from Everyday Life. University of Chicago Press.
Lessig, Lawrence
2006 Code: Version 2.0. Basic Books.
Rothstein, Richard
2017 The Color of Law: A Forgotten History of How Our Government Segregated America. Liveright Publishing.