
Content Moderation: Mediating Public Speech Privately

Social media constitutes a universe of more images, text, and video than any person could ever view, read, or hear. Yet disinformation, terrorist content, harassment, and other kinds of harmful content have made ‘content moderation’ one of the most pressing demands placed on large online communication platforms (“intermediaries”) such as Facebook, YouTube, and Twitter. Every single day, these platforms receive thousands of requests to review or take down content that violates their internal policies or an external law. They also receive requests from the US government and foreign governments for information on users, or to censor specific people and accounts. Content moderation can be defined as “the organized practice of screening user-generated content (UGC) posted to Internet sites, social media, and other online outlets, in order to determine the appropriateness of the content for a given site, locality, or jurisdiction” (Roberts 2017). The rules for content (read more...)