Content Moderation: Meaning and Definition

What is Content Moderation?

Content moderation is the practice of reviewing and managing user-generated content on online platforms to determine whether it complies with the law, the platform's policies, and the community's standards. It helps block the spread of harmful, illegal, deceptive, or otherwise inappropriate material such as hate speech, misinformation, copyright infringement, and impersonation. Content can be moderated manually by humans, automatically by algorithms, or through a hybrid of both, as sketched below. Effective moderation also helps keep users safe, prevents abuse between users, and enables platforms to maintain trust, accountability, and regulatory compliance in online environments.
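To make the hybrid approach concrete, here is a minimal sketch in Python of how such a pipeline might route content. Everything in it is illustrative: the `score_content` function stands in for a real classifier or rules engine, and the threshold values are hypothetical, not drawn from any particular platform. The core idea is that high-confidence cases are handled automatically, while uncertain ones are escalated to a human moderator.

```python
from dataclasses import dataclass
from enum import Enum


class Decision(Enum):
    APPROVE = "approve"        # content is published
    REMOVE = "remove"          # content is blocked automatically
    HUMAN_REVIEW = "review"    # routed to a human moderator


@dataclass
class ModerationResult:
    decision: Decision
    score: float
    reason: str


def score_content(text: str) -> float:
    """Stand-in for an ML classifier or rules engine.

    This toy version just counts a few illustrative keywords; a real
    system would return a calibrated probability of a policy violation.
    """
    flagged_terms = {"spam", "scam", "fake cure"}
    hits = sum(term in text.lower() for term in flagged_terms)
    return min(1.0, hits / 2)


def moderate(text: str,
             remove_threshold: float = 0.9,
             review_threshold: float = 0.5) -> ModerationResult:
    """Hybrid moderation: automate clear cases, escalate the rest."""
    score = score_content(text)
    if score >= remove_threshold:
        return ModerationResult(Decision.REMOVE, score, "high-confidence violation")
    if score >= review_threshold:
        return ModerationResult(Decision.HUMAN_REVIEW, score, "uncertain; needs a human")
    return ModerationResult(Decision.APPROVE, score, "no violation detected")


if __name__ == "__main__":
    for post in ["Check out this fake cure, total scam!", "Lovely weather today."]:
        result = moderate(post)
        print(f"{result.decision.value:>8}  score={result.score:.2f}  {post!r}")
```

The two-threshold design reflects a common trade-off in moderation systems: automation keeps review queues manageable at scale, while the human-review band preserves judgment for ambiguous content where an automated decision would be unreliable.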