![](https://static.wixstatic.com/media/11062b_0c84dffbef0047abb72f129c317907b0~mv2.jpg/v1/fill/w_980,h_653,al_c,q_85,usm_0.66_1.00_0.01,enc_auto/11062b_0c84dffbef0047abb72f129c317907b0~mv2.jpg)
Content moderation plays a significant role in our digital media ecosystem. The practice has its legal roots in Section 230, which Congress enacted more than 20 years ago, and in the years since, members of both political parties have tried to dismantle it. While social media platforms have immunity for their users’ posts, they are permitted to moderate their services by removing offensive content that violates their standards. Content moderation is the process of monitoring user-generated content and making sure it upholds an online platform’s rules; the goal is to reduce the visibility of spam and other unwanted content, which improves the user’s experience.
An article from the AP explains that “The legal interpretation of Section 230 also allows social platforms to moderate their services by removing posts that, for instance, are obscene or violate the services’ own standards, so long as they are acting in ‘good faith’” (Ortutay, 2020).
Tech companies follow at least one of five techniques when moderating content: pre-moderation, post-moderation, reactive moderation, automated moderation, and distributed moderation. Pre-moderation reviews user-generated content before it goes live, stopping posts that could damage a brand’s image; the trade-off is that the process is time-consuming, so users wait a little longer for their posts to appear in the feed. With post-moderation, a post goes live immediately after the user clicks submit and is then reviewed by a human or AI moderator. Reactive moderation relies on the site’s users to flag or report content, and moderators then respond to the submitted reports; a related form is community moderation, which lets content go public and relies on the community to report anything unwanted. Distributed moderation is a looser method than the rest: it relies on rating systems in which the most highly voted user-generated content (UGC) sits at the top while the lowest-rated content is hidden, as in the sketch below.
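To make the distributed approach concrete, here is a minimal Python sketch of a vote-based ranking system. The `Post` class, the `HIDE_THRESHOLD` cutoff, and the `rank_feed` helper are illustrative assumptions for this post, not any platform’s actual implementation.

```python
from dataclasses import dataclass


@dataclass
class Post:
    """A piece of user-generated content with community votes (illustrative)."""
    text: str
    upvotes: int = 0
    downvotes: int = 0

    @property
    def score(self) -> int:
        # Net community rating: upvotes minus downvotes
        return self.upvotes - self.downvotes


# Assumed cutoff: posts voted this far down are hidden from the feed
HIDE_THRESHOLD = -3


def rank_feed(posts: list[Post]) -> tuple[list[Post], list[Post]]:
    """Sort posts so the highest-voted UGC sits on top and
    separate out the lowest-rated posts, which get hidden."""
    visible = [p for p in posts if p.score > HIDE_THRESHOLD]
    hidden = [p for p in posts if p.score <= HIDE_THRESHOLD]
    visible.sort(key=lambda p: p.score, reverse=True)
    return visible, hidden


if __name__ == "__main__":
    feed = [
        Post("Helpful tip", upvotes=12, downvotes=1),
        Post("Spam link", upvotes=0, downvotes=8),
        Post("Okay comment", upvotes=3, downvotes=2),
    ]
    visible, hidden = rank_feed(feed)
    print("Visible:", [p.text for p in visible])
    print("Hidden:", [p.text for p in hidden])
```

In this toy feed, the community’s votes push the helpful tip to the top and hide the spam link entirely, which is the basic idea behind distributed moderation: the platform defers most ranking decisions to its users rather than to a dedicated review team.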
Sources
Ortutay, B. (2020, October 28). AP Explains: The rule that made the modern internet. AP News. https://apnews.com/article/what-is-section-230-tech-giants-77bce70089964c1e6fc87228ccdb0618
Schoolov, K. (2021, February 27). Why content moderation costs billions and is so tricky for Facebook, Twitter, YouTube and others. CNBC. https://www.cnbc.com/2021/02/27/content-moderation-on-social-media.html