
Content Moderation: Mediating Public Speech Privately

Social media constitutes a universe of more images, text, and video than could ever be humanly experienced, read, and heard. Yet disinformation, terrorist content, harassment, and other kinds of harmful content have made ‘content moderation’ one of the most pressing demands placed on large online communication platforms (“intermediaries”) such as Facebook, YouTube, and Twitter.

Every day, major platforms like Facebook, Twitter, and YouTube receive thousands of requests to review or take down content that violates their internal policies or an external law. They also receive requests, from both the US government and foreign governments, for information about users or to censor specific people and accounts.

Content moderation can be defined as “the organized practice of screening user-generated content (UGC) posted to Internet sites, social media, and other online outlets, in order to determine the appropriateness of the content for a given site, locality, or jurisdiction” (Roberts 2017). The rules differ across platforms: each has its own categories of online content and its own ways of managing them. In the absence of a legal framework for content moderation as a whole, internet companies have taken on the roles of legislator, executive, and judiciary in mediating what stays online and what doesn’t. In response to the ambiguities in the law, intermediaries are constructing and instituting their own content moderation practices.

However, little is known about how content moderation policies are constructed, or about the policies themselves. It’s not clear what their underlying objectives and values are. No single policy governs all the major intermediaries, and the process of policy-making is ongoing and dynamic, with little public access to, or scrutiny of, how these policies are built and implemented.

In this post, I provide an overview of my research, which explores the socio-legal process of content moderation by private intermediaries. It examines how legal processes are internalized, and how the interactions of organizations and individuals create new norms.

For my preliminary fieldwork, I took a qualitative approach, acting as a ‘participant observer’ at five public events over the course of four months. The events were publicly accessible, but one would only know about them by subscribing to internal e-mail lists or keeping up with organizers’ announcements. The construction of policies is also being articulated and contested on the social media platforms themselves, so the process by which both intermediaries and commentators construct the narrative of content moderation can be unpacked in digital as well as physical spaces.

Talesh’s (2012) research on how the structure of dispute resolution shapes the way business values inform the meaning of law helps frame the field in which I have conducted preliminary fieldwork. Talesh asks “how the meaning of law is constructed through different organizational dispute resolution structures” (464). Content moderation policies can be viewed as such structures: they vary in several ways, and they are taking shape within an organizational field that includes public events as well as statements made to the public on the Internet.

Construction of a “Legal” Process

In recent months, Facebook has repeatedly come under fire for issues ranging from disinformation campaigns allowed on its platform during the 2016 US elections, to police officers using Facebook to monitor Black Lives Matter activists (Newton 2018), to amplifying disinformation and hate speech which some contend has led to a genocide in Myanmar (Mozur 2018).

In what seems like an attempt to assuage growing public uproar, Facebook recently said it would institute an “independent Supreme Court” or “Oversight Board” to resolve content moderation reviews and appeals in “difficult” cases. Mark Zuckerberg has stated that his goal is to institute a more “democratic or community-oriented process” (Douek 2018).

The announcement of an independent “Supreme Court” drew an immediate response from academics and commentators: some skepticism, but also intrigue about how such a system would work (Klonick and Kadri 2018).

After consulting “2000+ people” over “6+ months,” Facebook released more details on the structure and composition of the Oversight Board (Harris 2019). Of particular interest is the fusion of legal language and logic with the process of constructing content moderation policies. The construction of content moderation practices is described as having a “common law” aspect to it. The Board’s structure is laid out in “Governance Structures,” its guidelines are referred to as “Governing Documents,” and its members will be expected to defer to “precedent” (Harris 2019). Facebook claims to have added checks and balances that will ensure the Board’s “independence” (Harris 2019).

Facebook’s Oversight Board is a response to the several layers of uncertainty that intermediaries face. One is how users will use the platform and what kind of content they will post; since this is hard to predict, intermediaries must institute processes to manage that content. Another is how other users will react to and use that content; at certain scales, these reactions erupt into public scandals like the examples given above.

The third source of uncertainty is governmental pressure. Governments often request that platforms hand over information about users or censor content, and intermediaries have to build these possibilities into their policies. There is an interplay between intermediaries’ desire to pursue certain business goals and values and the need to respond to social, political, and legal imperatives, including the ability to do business in jurisdictions with authoritarian regimes. Internal processes have to respond to these uncertainties while also gaining legitimacy in the eyes of users and governments. The kind of language Facebook uses in its Oversight Board charter adds a preliminary layer of legitimacy to what private intermediaries are doing, and to how they are doing it.

Discussions of Scale

It may seem like intermediaries have unfettered freedom and the resources to construct internal processes from scratch. Facebook reports that between July and September 2018 alone, it took action on 31 million “pieces of content,” i.e., photos, videos, and posts (Facebook Transparency Report). Google’s Trust and Safety team alone consists of 10,000 people (Madrigal 2018), and at Facebook, 60 people are involved solely in crafting content moderation policies (Madrigal 2018).

Discussions of the scale and complexity of content moderation come up repeatedly among commentators and representatives of intermediaries. The scale and complexity of moderation requests have fundamentally shaped the policies, and have forced intermediaries to grapple with the difficulty of creating a system that takes into account “common due process issues.” Facebook’s Monika Bickert (2018) has admitted, “A company that reviews a hundred thousand pieces of content per day and maintains a 99 percent accuracy rate may still have up to a thousand errors” (269).
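Bickert’s point is simple arithmetic, but it is worth spelling out what a 99 percent accuracy rate means at the volumes reported above. The snippet below is a back-of-the-envelope sketch, not any platform’s internal metric: it applies that same accuracy rate to the hundred-thousand-per-day figure Bickert uses and to the 31 million pieces of content Facebook reported acting on in a single quarter.

```python
# Back-of-the-envelope sketch of moderation errors at scale.
# The volumes come from Bickert (2018) and Facebook's transparency report;
# the function itself is purely illustrative, not a platform metric.

def expected_errors(items_reviewed: int, accuracy: float) -> int:
    """Expected number of erroneous moderation decisions."""
    return round(items_reviewed * (1 - accuracy))

print(expected_errors(100_000, 0.99))     # ~1,000 errors per day (Bickert's example)
print(expected_errors(31_000_000, 0.99))  # ~310,000 errors per quarter at the same rate
```

In other words, even a hypothetical 99 percent accuracy rate still leaves hundreds of thousands of mistaken or contested decisions every quarter, which is part of why “common due process issues” are so hard to design around.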

Content moderation has evolved in some ways directly in response to scale. When intermediaries were not yet hosting the content of millions of people, and were not yet embroiled in public debates about their responsibilities, they responded to content differently than they do now. In their early days they did not have to make their policies transparent, or even particularly structured, and fewer public-facing accountability concerns attached to any high-profile decision. The growing volume of content and the scale of moderation issues have since led intermediaries to create more mechanized, structured policies. In some ways, public scrutiny has been important in getting platforms to respond and to appear legitimate.

The creation of the Oversight Board is further evidence of the role of public scrutiny. The Board will review cases that are “severe, large-scale and important for public discourse,” where public discourse means that “the content spurs significant public debate and/or important political and social discourse” (Harris 2019).

The present structure of these policies derives partly from the platforms’ expansion and their need to have some system in place. That also means, however, that scale may serve as a justification for whatever procedures they adopt. Because of the scale of content moderation, intermediaries can’t predict how effective a given policy will be, which also gives them a great deal of room to experiment.

It’s objectively difficult to manage thousands of requests; that’s why platforms have used, and will continue to use, automated filters and content removal technology. However, there is little transparency or public accountability regarding how exactly these tools are developed and in what instances they are used.
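To make concrete what an automated filter can look like at its most basic, here is a deliberately naive sketch of a keyword-based screen. Everything in it, from the blocked-terms list to the function name, is my own illustrative assumption rather than any platform’s actual system; real deployments rely on far more elaborate machine-learning classifiers and human review queues.

```python
# A deliberately naive keyword filter, for illustration only.
# The rule list and decision labels are hypothetical, not any platform's real policy.

BLOCKED_TERMS = {"example-slur", "example-threat"}  # hypothetical, hidden rule list

def screen_post(text: str) -> str:
    """Return a moderation decision for one piece of user-generated content."""
    lowered = text.lower()
    if any(term in lowered for term in BLOCKED_TERMS):
        return "remove"  # automatic takedown, no human review
    return "allow"       # everything else passes through

print(screen_post("A perfectly ordinary post"))  # -> allow
```

Even in this toy version, the substantive policy lives entirely in the hidden rule list, which is precisely the part that users and researchers cannot inspect.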

A Public Future Shaped by Private Processes

It’s important to understand the special role that intermediaries play in shaping the mechanics and dynamics of internal legal processes. That does not mean private intermediaries should have unfettered freedom to shape public rights. While it is interesting to see intermediaries take the lead in instituting structures, many questions remain. Will public legal institutions like courts have access to the Oversight Board? Will users be able to appeal to courts? What are the internal bylaws based on, and who gives them the force of law?


Works Cited

Bickert, Monika. 2018. “Defining the Boundaries of Free Speech on Social Media.” In The Free Speech Century, edited by Lee C. Bollinger and Geoffrey R. Stone, 254-271. New York: Oxford University Press.

Douek, E. 2018. “The Supreme Court of Facebook: Mark Zuckerberg Floats a Governance Structure for Online Speech.” Lawfare, April 9, 2018. https://www.lawfareblog.com/supreme-court-facebook-mark-zuckerberg-floats-governance-structure-online-speech

Edelman, Lauren B. 2016. Working Law: Courts, Corporations, and Symbolic Civil Rights. Chicago: The University of Chicago Press.

Edelman, Lauren B., et al. 1993. “Internal Dispute Resolution: The Transformation of Civil Rights in the Workplace.” Law & Society Review 27 (3): 497.

Edelman, Lauren B., and Mark C. Suchman. 1999. “When the ‘Haves’ Hold Court: Speculations on the Organizational Internalization of Law.” Law & Society Review 33 (4): 941.

“Exclusive: Facebook exec says content moderation is ‘never going to be perfect’.” 2019. Yahoo Finance, September 13, 2019. https://finance.yahoo.com/news/facebook-content-moderation-john-devine-142243998.html

Harris, B. 2019. “Global Feedback and Input on the Facebook Oversight Board for Content Decisions.” Facebook Newsroom, June 27, 2019. https://newsroom.fb.com/news/2019/06/global-feedback-on-oversight-board/

Harris, B. 2019. “Establishing Structure and Governance for an Independent Oversight Board.” Facebook Newsroom, September 17, 2019. https://newsroom.fb.com/news/2019/09/oversight-board-structure/

Klonick, K., and T. Kadri. 2018. “How to Make Facebook’s ‘Supreme Court’ Work.” The New York Times, November 17, 2018. https://www.nytimes.com/2018/11/17/opinion/facebook-supreme-court-speech.html. Accessed December 13, 2018.

Mozur, P. 2018. “A Genocide Incited on Facebook, With Posts From Myanmar’s Military.” The New York Times, October 15, 2018. https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html

Newton, C. 2018. “Facebook’s Content Moderation Efforts Face Increasing Skepticism.” The Verge, August 24, 2018. https://www.theverge.com/2018/8/24/17775788/facebook-content-moderation-motherboard-critics-skepticism

Roberts, S. T. 2017. “Content Moderation.” In Encyclopedia of Big Data, edited by L. Schintler and C. McNeely. Cham: Springer.

Talesh, S. A. 2012. “How Dispute Resolution System Design Matters: An Organizational Analysis of Dispute Resolution Structures and Consumer Lemon Laws.” Law & Society Review 46 (3): 463–496. doi:10.1111/j.1540-5893.2012.00503.x