
15.3: Content Moderation and Deplatforming


“The wave of violence has shown technology companies that communication and coordination flow in tandem. Now that technology corporations are implicated in acts of massive violence by providing and protecting forums for hate speech, CEOs are called to stand on their ethical principles, not just their terms of service.” – Joan Donovan, author of “Navigating the Tech Stack: When, Where and How Should We Moderate Content?”


    Overview

At first glance, whether a private company has the right to moderate content by removing material it deems offensive or misleading seems to be simply a matter of conducting business. The analogy most often invoked is the “No Shirt, No Shoes, No Service” policy restaurants use to refuse service to customers who do not follow their rules. The First Amendment to the U.S. Constitution does not even apply to most instances of content moderation, because the social media companies in question are private entities, not arms of the government curtailing speech.

However, the issue becomes more complicated when you consider that a handful of tech companies now control the vast majority of content that people engage with; there is no equivalent “public sphere” outside their platforms.

What happens when a municipal government entity that uses Facebook blocks a user because it considers their posts offensive? When then-President Donald Trump used his personal Twitter account as a forum for his political purposes, the legal question arose whether he could block users whose views he did not like. The courts held that, even though blocked users could still express their viewpoints elsewhere, burdening their speech in this way violated the First Amendment, because he was using a privately owned platform as a channel for official public business. These instances point to potential viewpoint discrimination.

This chapter asks: Under what conditions do content moderation and deplatforming fall into the realm of censorship? Is freedom of speech synonymous with an entitlement to freedom of reach?

Numerous legal arguments have been made claiming that social media systems, by virtue of their function and reach, are “transformed” from private entities into public forums (which must not discriminate among viewpoints). All of these arguments have failed in court. As Justice Kavanaugh stated, “Providing some kind of forum for speech is not an activity that only governmental entities have traditionally performed. Therefore, a private entity who provides a forum for speech is not transformed by that fact alone into a state actor.”

This chapter will provide you with a range of perspectives on how content on social media systems is moderated (and by whom) and the means by which a person or entity can be deplatformed (it’s more complex than you’d imagine!).

    Key Terms

Deplatforming – An action taken by a tech company to remove or constrain a person’s communication on its private, commercial system due to a violation of its terms of use. The most prominent example took place following the 2020 U.S. presidential election, when major platforms suspended then-President Trump’s accounts. Deplatforming can also refer to a hosting provider cutting off service to a communication platform for failing to moderate content considered dangerous to the public.

First Amendment to the Constitution of the United States of America – “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.” Please note that the First Amendment applies to government actions that abridge free speech. It does not restrict the right of private companies to censor content according to the Terms of Service each user agrees to upon joining. Arguments that unabridged free speech on social media is an unalienable right are often predicated on the mistaken belief that all speech is protected under the First Amendment, when in fact it only constrains government actions to curtail free speech.

§ 230(c) of the Communications Decency Act – This act of Congress (1996) states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This provision protects online platforms from liability for content posted by their users. Review the infographic published by the Electronic Frontier Foundation, an organization dedicated to free speech issues on the Internet.

    Viewpoint discrimination: “When [the government] engages in viewpoint discrimination, it is singling out a particular opinion or perspective on that subject matter for treatment unlike that given to other viewpoints.” Source: The First Amendment Encyclopedia.

Capitalism – An economic and political model that sanctions private ownership of the production of goods and services. This is relevant to this area of study because a private company can set its own Terms of Use within the limits of statutory rules of commerce. When users agree to the Terms of Use, they are bound by those terms without government recourse, provided the terms do not violate laws prohibiting discrimination against a protected class.

    What should you be focusing on?

    Your objectives in this module are:

• Identify the range of methods social media systems use to moderate content.
    • Examine the ethical factors involved in employing human content moderators where they are exposed to violence, cruelty, hatred, and death as a focal point of their work.
    • Develop an executive position on the conditions under which a person or entity should or should not be deplatformed.

    Readings & Media

    Thematic narrative in this chapter

    In the following readings and media, the authors will present the following themes:

    1. Content moderation removes violent and objectionable content from social media and the Internet.
    2. Content moderators are exposed to a non-stop flow of horrible content that causes psychological damage.
3. Companies that facilitate online communication make discretionary decisions to remove users or entities that they deem to pose a risk to their customers or the public.
    4. There are several ways that companies can deplatform a person or entity.

Required Article: WIRED Magazine – “The Laborers Who Keep D**k Pics and Beheadings Out of Your Facebook Feed” by Adrian Chen, October 3, 2014.

    This article contains foul language and references to violent imagery (it does not show any), so please be warned in advance as you read about the tasks human content moderators perform to keep social media systems “clean.”

    Chen, A. (2014, October 3). The Laborers Who Keep D**k Pics and Beheadings Out of Your Facebook Feed. WIRED Magazine. https://www.wired.com/2014/10/content-moderation/

    Optional: Bloomberg.com “TikTok Sued by Content Moderator Disturbed by Graphic Videos” by Robert Burnson, December 23, 2021

    “TikTok’s 10,000 content moderators are exposed to a regular diet of child pornography, rapes, beheadings and animal mutilation, according to a lawsuit filed against the video-sharing platform and its parent, ByteDance Inc.”

    Required Article: “5 Types Of Content Moderation And How To Scale Using AI” by Thomas Molfetto, Clarifai, Inc., June 13, 2021

    This article describes the different ways that artificial intelligence (AI) can be used to moderate content without human intervention.

    Molfetto, T. (2021, June 13). 5 Types Of Content Moderation And How To Scale Using AI. Clarifai, Inc., https://www.clarifai.com/blog/5-type...-ai-is-helping
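To make the idea of scaling moderation with AI concrete, below is a minimal, hypothetical sketch (in Python) of confidence-based automated moderation: a model scores each post, high-confidence violations are removed automatically, borderline cases are routed to a human moderator, and everything else is published. The stub classifier, threshold values, and names used here are illustrative assumptions and are not drawn from the Clarifai article or its API.

# A minimal sketch of confidence-based automated moderation (illustrative only).
# score_post() is a stand-in stub; a real system would call a trained model or
# a vendor API to estimate the probability that a post violates policy.
# The thresholds are hypothetical values, not figures from the article.

from dataclasses import dataclass

REMOVE_THRESHOLD = 0.90   # above this, remove automatically
REVIEW_THRESHOLD = 0.60   # between the two thresholds, send to a human moderator


@dataclass
class Decision:
    post_id: str
    score: float
    action: str  # "remove", "human_review", or "publish"


def score_post(text: str) -> float:
    """Toy stand-in for a model that scores how likely a post is to violate policy."""
    flagged_terms = {"beheading", "gore"}  # toy vocabulary for demonstration
    return 0.95 if any(term in text.lower() for term in flagged_terms) else 0.05


def moderate(post_id: str, text: str) -> Decision:
    score = score_post(text)
    if score >= REMOVE_THRESHOLD:
        action = "remove"
    elif score >= REVIEW_THRESHOLD:
        action = "human_review"
    else:
        action = "publish"
    return Decision(post_id, score, action)


if __name__ == "__main__":
    print(moderate("p1", "Look at this cute puppy"))
    print(moderate("p2", "graphic beheading video"))

The middle band between the two thresholds is where human moderators, like those described in the WIRED article, do their work; moving either threshold trades moderator workload against the risk of wrongful removal or of harmful content slipping through.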


Deplatforming: The following media describe the legal framework through which content moderation and deplatforming are viewed, according to various perspectives and legal interests. In the Supplemental resources, you will find an article that describes the various ways a person or entity can be deplatformed at each layer of service in the Internet architecture.


Required Article: “Normalizing De-Platforming: The Right Not to Tolerate the Intolerant” by Robert Sprague, Professor of Legal Studies in Business, University of Wyoming College of Business, Department of Management & Marketing, September 1, 2021.

This article describes the legal boundaries within which online platforms operate. It traces the origins of § 230(c) of the Communications Decency Act and the various legal cases that have tested attempts to moderate online content and to control users’ access to Internet platforms.

The article provides details on specific legal cases, which you can skim if you like. Your focus, however, should be on how corporate interests interpret their roles, obligations, and responsibility for user-generated content:

• Where does the line of responsibility lie when user-generated content, created with the tools a platform provides, causes harm?
    • At what point, if at all, does the state (our government) assert an interest in sustaining free speech on a privately owned business’s property? Wouldn’t this be considered a Socialist intervention?
    • Under what conditions is a corporate action to deplatform a user considered viewpoint discrimination?
    • How strong is the argument that the lack of a comparable alternative to publish speech amounts to a constraint of free speech if a person is deplatformed from a global publishing platform?

    Sprague, R. (2021). Normalizing De-Platforming: The Right Not to Tolerate the Intolerant. Social Science Research Network: https://ssrn.com/abstract=3915739 or http://dx.doi.org/10.2139/ssrn.3915739

    Required Video: Al Jazeera English – “Trump-free Twitter: The debate over deplatforming | The Listening Post”

It is difficult to find objective, unbiased reporting on the issue of deplatforming, since those who have been subject to it have been predominantly right-wing, politically. The program below is produced by Al Jazeera, a global news agency that often frames issues from a perspective other than a strictly Western/American point of view.

    View only the first 12 minutes of this video (the rest of it is related to different topics). Consider the following as you watch:

    • What does being “silenced” really mean, from a Constitutional perspective? Does being deplatformed prevent anyone from actually speaking, or are there special conditions to the exercise of free speech today that the Founding Fathers could not have anticipated?
• Does government intervention in a private company’s platforming decisions deviate from our traditional private, capitalist, laissez-faire economic model into the realm of state-controlled socialism?

Al Jazeera English. (2021, January 17). “Trump-free Twitter: The debate over deplatforming | The Listening Post” [Video]. YouTube. https://youtu.be/PxCmtHie6HM


    Optional: Supplemental resources that are relevant to content moderation

Centre for International Governance Innovation – “Navigating the Tech Stack: When, Where and How Should We Moderate Content?” by Joan Donovan, October 28, 2019. This article describes how content moderation and deplatforming occur at each level of the Internet’s service structure.
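As a rough, illustrative sketch of the “tech stack” framing (in Python), the snippet below maps layers of Internet service to the kind of moderation or deplatforming action available at each. The layer names and example actions are paraphrased for illustration, not quoted from Donovan’s article.

# Illustrative only: which actor can act, and how, at each layer of the stack.
TECH_STACK_MODERATION = [
    ("Platform / application", "suspend an account, remove or demote a post"),
    ("App store", "delist an app that fails moderation requirements"),
    ("Cloud / web hosting", "terminate hosting for a site or service"),
    ("Content delivery / DDoS protection", "withdraw CDN or security services"),
    ("Domain registrar / DNS", "revoke or refuse to resolve a domain name"),
    ("Internet service provider", "block or throttle access at the network level"),
]

for layer, action in TECH_STACK_MODERATION:
    print(f"{layer}: {action}")

Notice that the further down the stack an action occurs, the blunter it becomes: a platform can remove a single post, while a registrar or ISP can generally only cut off an entire site or service.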

The New York Times – “The Silent Partner Cleaning Up Facebook for $500 Million a Year” by Adam Satariano and Mike Isaac. Published Aug. 31, 2021; updated Oct. 28, 2021. This article traces Facebook’s long-running effort to outsource content moderation to a third-party provider, Accenture, and how the companies realized they needed to expand their resources to keep up with the flood of disturbing content posted all day, every day.


    15.3: Content Moderation and Deplatforming is shared under a not declared license and was authored, remixed, and/or curated by LibreTexts.
