Research Article · Open Access · Honorable Mention

Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas

Published: 18 October 2021

Abstract

Social media sites use content moderation to attempt to cultivate safe spaces with accurate information for their users. However, content moderation decisions may not be applied equally for all types of users, and may lead to disproportionate censorship related to people's genders, races, or political orientations. We conducted a mixed methods study involving qualitative and quantitative analysis of survey data to understand which types of social media users have content and accounts removed more frequently than others, what types of content and accounts are removed, and how removed content may differ between groups. We found that three groups of social media users in our dataset experienced content and account removals more often than others: political conservatives, transgender people, and Black people. However, the types of content removed from each group varied substantially. Conservative participants' removed content included content that was offensive or allegedly so, misinformation, Covid-related, adult, or hate speech. Transgender participants' content was often removed as adult despite following site guidelines, critical of a dominant group (e.g., men, white people), or specifically related to transgender or queer issues. Black participants' removed content was frequently related to racial justice or racism. More broadly, conservative participants' removals often involved harmful content removed according to site guidelines to create safe spaces with accurate information, while transgender and Black participants' removals often involved content related to expressing their marginalized identities that was removed despite following site policies or fell into content moderation gray areas. We discuss potential ways forward to make content moderation more equitable for marginalized social media users, such as embracing and designing specifically for content moderation gray areas.



      • Published in

        Proceedings of the ACM on Human-Computer Interaction, Volume 5, Issue CSCW2 (October 2021), 5376 pages
        EISSN: 2573-0142
        DOI: 10.1145/3493286

        Copyright © 2021 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

        Publisher

        Association for Computing Machinery

        New York, NY, United States

