Cyber-Silencing the Community - YouTube, Divino Group, and Reimagining Section 230
Washington Journal of Law, Technology & Arts
Volume 17 Issue 2, Article 2
7-1-2022
Layla G. Maurer
Case Western Reserve University School of Law
Recommended Citation: Layla G. Maurer, Cyber-Silencing the Community: YouTube, Divino Group, and Reimagining Section 230, 17 WASH. J. L. TECH. & ARTS 172 (2022). Available at: https://digitalcommons.law.uw.edu/wjlta/vol17/iss2/2
This Article is brought to you for free and open access by the Law Reviews and Journals at UW Law Digital Commons. It has been accepted for inclusion in Washington Journal of Law, Technology & Arts by an authorized editor of UW Law Digital Commons. For more information, please contact [email protected].
Cover Page Footnote
*Layla Maurer received her JD from Case Western Reserve University School of Law in May of 2022. She holds a Master’s in Library and Information Science from Kent State University and has a career background in technology and digital media. She currently works for the legal department at Wizards of the Coast, and is broadly interested in technology law, gaming law, privacy, and digital citizenship. She has written on internet- and technology-based legal issues including artificial intelligence, Section 230, and trademark for gamertags in Esports.
ABSTRACT
Social media platforms, once simple message boards, have grown to colossal size. They are now a vital source of communication and connection, particularly for marginalized groups such as the LGBTQ+ community. Social media holds incredible sway over the news, political discourse, and entertainment that we consume, and the platforms we use are now able to sculpt conversations simply by allowing or disallowing (i.e., moderating) specific types of speech or content.
One indirect form of moderation is demonetization, a means by which content creators are denied revenue from advertisements on their hosted media. The consequence of improper demonetization is not just financial: demonetized content is also deprioritized and, in a sea of competing media, often overlooked or in some cases entirely hidden. This process effectively removes demonetized voices from the broader conversation, which is precisely what happened to a group of LGBTQ+ creators on YouTube starting in 2017. Those creators’ voices were—seemingly unintentionally—silenced, as an algorithm inadvertently flagged their content as “adult” or “sexually suggestive.” The creators lost following and revenue, and YouTube as a host of online content faced no consequences for the error, thanks to the protections afforded it by Section 230 of the Communications Decency Act of 1996. Section 230 has been treated as a shield for online platforms, as well as a sword enabling those platforms to moderate content as they see fit (with several restrictions).
Moderation is necessary and important in the broadest sense. However, modern platforms, being a far cry from the message boards of the late 1990s in practically every sense, must be held to higher account for the means by which they undertake that moderation. This paper suggests a set of simple amendments to Section 230 that would allow monetized content creators whose content has been inappropriately flagged and demonetized to a) have that content remonetized and b) seek recourse in the form of fines levied against platforms that repeatedly misflag content that conforms with the platform’s stated policies. While this solution is less than ideal, it is one which would place a higher onus on the platforms themselves while still protecting those platforms’ rights to moderate as they see fit.
TABLE OF CONTENTS
INTRODUCTION
PART I: THE PATH TO DIVINO GROUP AND THE ROADBLOCKS FOLLOWING
PART II: SPEECH ON THE INTERNET AND SECTION 230
- A. The Necessity of, and Unintended Consequences from, Private Moderation
  - 1. “Free Speech” as a Moving Target
  - 2. Shifting Sands and the Social Media Debate
  - 3. YouTube, Despite Its Publicly Shareable Content, Is Not a Public Site
- B. Section 230’s Applicability in Divino Group
- C. The Intent of Section 230 and What Needs to Change
PART III: REIMAGINING 230 AND RECOURSE FOR CREATORS
CONCLUSION