Roblox Wiki

The moderation system is the system used by moderators and admins alike to determine the appropriateness of content and users. Players were able to view their moderation history through their account settings up until 2015.


Currently, Roblox uses CommunitySift, an automated chat filter and moderation software also used by other children's games such as Habbo Hotel and MovieStarPlanet. CommunitySift uses a context-based chat filter to detect hidden meanings in 20 of the most commonly used languages, including bypass attempts like "1337 5p34k" or "mIxEd cApS". The software assesses chat, usernames, and images based on risk, topic, user reputation, and context; it can filter or allow content, or flag it to a human moderator for further investigation. The automated image moderator detects pornographic content, graphic violence, weapons, terrorism, and drugs, and the filter can be trained on custom topics.
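The normalization step behind catching bypasses like "1337 5p34k" or "mIxEd cApS" can be sketched roughly as follows. This is a simplified illustration, not CommunitySift's actual algorithm; the character map and the word list are invented for the example:

```python
# Simplified sketch of leetspeak/mixed-caps normalization before filtering.
# NOT CommunitySift's real implementation; the map and word list are invented.
LEET_MAP = str.maketrans({
    "1": "l", "3": "e", "4": "a", "5": "s",
    "7": "t", "0": "o", "@": "a", "$": "s",
})

BLOCKED_WORDS = {"leetspeak"}  # placeholder word list for illustration


def normalize(text: str) -> str:
    """Lowercase the text and map common leetspeak substitutions to letters."""
    return text.lower().translate(LEET_MAP)


def is_blocked(text: str) -> bool:
    """Flag text if any normalized token matches the blocked-word list."""
    return any(token in BLOCKED_WORDS for token in normalize(text).split())
```

With this kind of normalization, "L33T5P34K" collapses to "leetspeak" and is caught even though the raw string contains no blocked word. A production filter would add context analysis on top, since token matching alone is what context-based filtering exists to improve on.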

The moderation system

We’ve got a full staff of moderators who work every hour of every day to make sure that the content that hits Roblox is safe for all ages. Though we have users of varying age groups, we make an effort to keep our younger audiences from seeing offensive words or pictures. Keeping moderators on Roblox 24 hours a day for seven days a week is a heck of a challenge, as the majority of our moderators have ever-changing schedules. But each of them enforces a strict code of rules that we’ve developed over time.

In a Roblox blog post, Becky Herndon, the Community Manager of Roblox, discusses how the Roblox moderation system works[1]. She first notes that moderated content can be grouped into two simplified categories: "pre-moderated" content and "post-moderated" content. Assets like shirts and decals are pre-approved by the moderation team before they are published to the website. Though rarer, post-moderation occurs when an asset needs to be taken down, such as when a place is found to violate the place creation guidelines after it is published.

When discussing the Report Abuse system, she notes that it is best to report something as it is happening rather than waiting to report it later. When a user is reported, the moderation team reviews the situation and makes a decision. Consequences are based on the player's prior moderation history and the severity of the issue. Because discipline is handled between the moderation team and the person being reported, the reporter will not be notified if the player is warned or banned as a result of the report. She also notes that Roblox moderators are automatically notified whenever swear words are said.

In May 2020, a hacker successfully bribed a Roblox employee in India for access to the moderation panel. The incident revealed details about how moderation works, including that it is partly outsourced to iEnergizer, a company specializing in outsourcing and game moderation.

Restricted topics

These are topics that are likely to be blocked by the moderation system.

  • Inappropriate phrases.
  • Private information about someone (e.g. real-life names, age, etc.).
  • Cuss words; this is one of the most heavily moderated categories.
  • Anything related to romance (e.g. animations, actions, and phrases).
  • Real-life physical or suicide threats against others or yourself, such as saying, "I'm going to kill you."
  • Adult content: alcohol, places, actions, etc.
  • Cuss words in other languages: Roblox can identify swear words in other languages, so they are censored as well.
  • Tragic events, such as the 9/11 attacks.
  • Anything to do with Robux, including sales and advertisements.
  • Sharing websites (your own or someone else's), click ads, or other advertising.
  • Loopholes used to break the rules.
  • Anything related to hackers or hacking.
  • Robux casinos, giveaways, or games.
  • Racial or ethnic slurs.
  • Impersonation of any famous user.
  • Mockery of real-world politics and political figures.
  • Off-platform actions; the rules apply outside Roblox as well.
  • Dangerous activities (e.g. misuse of fireworks, physical challenges, or stunts).
  • Realistic depictions of extreme gore.
  • Illegal and regulated activities.
  • Copyright-infringing assets, after the respective owner files a DMCA takedown request.


The product includes a user reputation system, under which players' chat capabilities can be restricted or reduced based on their history of chat abuse[2]. Though it is not confirmed that Roblox uses the user reputation system, it could explain the frequently criticized phenomenon of players being moderated for content that other players post without reprimand[citation needed]. Many users also criticize that innocuous words, such as "my", "I", and "don't", are occasionally flagged by the filter. Since Roblox can set the software's filtering level, it is assumed that the filter was initially set to a higher level, causing it to flag more content.

After the closure of the Roblox Forums, some users received moderation action against their accounts for forum posts made years earlier. This could occur if the current filter is set to a higher filtering level than the one in effect when the post was made: because the content still exists on the Roblox platform, the filtering system can flag it as inappropriate and cause the poster to receive a warning or ban, even if it was deemed appropriate at the time of posting.
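A reputation-driven filter of the kind the CommunitySift FAQ describes could, in outline, lower its flagging threshold as a user's abuse history grows. The following is a hypothetical sketch; the thresholds, level names, and function names are invented, and Roblox's use of this feature is unconfirmed:

```python
# Hypothetical sketch of a reputation-based filter level. All thresholds and
# level names are invented for illustration; this is not Roblox's or
# CommunitySift's actual logic.
def filter_level(abuse_incidents: int) -> str:
    """Pick a stricter filter level for users with a history of chat abuse."""
    if abuse_incidents == 0:
        return "standard"      # normal filtering
    if abuse_incidents < 5:
        return "strict"        # more aggressive flagging
    return "restricted"        # chat heavily limited


def should_flag(message_risk: float, abuse_incidents: int) -> bool:
    """Flag a message when its risk score meets the user's threshold.

    The threshold drops as abuse history grows, so the same message is more
    likely to be flagged when sent by a repeat offender.
    """
    thresholds = {"standard": 0.8, "strict": 0.5, "restricted": 0.2}
    return message_risk >= thresholds[filter_level(abuse_incidents)]
```

Under this model, a message with a risk score of 0.6 would pass for a clean account but be flagged for a user with prior incidents, which is consistent with the complaint that identical content is moderated for some players and not others.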


  1. ReeseMcBlox. (2012). Moderation on ROBLOX: How it Works. Roblox Blog. Retrieved from
  2. CommunitySift. (2017). Frequently Asked Questions. Retrieved from