Lesson 1: Basics: Your Responsibilities as a Moderator
You may already know by now that you can report content to Discord’s Trust & Safety team if it breaks Discord’s Community Guidelines or Terms of Service. However, while moderating a server you may come across situations where you are unsure of whether you should make a report or not. This article will give you a good idea of what sorts of things are reportable and what things are not worth reporting. Knowing this will help you decide what content needs to be moderated by your moderation team, and what content should be escalated to Trust & Safety. Wherever this article says an issue should be handled on a server level, it is generally implied, though not required, that the server should consider punishing users who participate in the given behavior.
Discord Community Guidelines – A Breakdown
The Discord Community Guidelines may seem to indicate exactly what content you should report to Trust & Safety, but interpreting the rules may not always be that simple. Each of the 18 guidelines is shown below, followed by an explanation of when it is appropriate to report the situations it describes.
Here are some rules for interacting with others:
Do not organize, participate in, or encourage harassment of others. Disagreements happen and are normal, but continuous, repetitive, or severe negative comments may cross the line into harassment and are not okay.
- Disagreements, insults, and other rude or disrespectful behavior are common, and should generally be handled in accordance with your own server rules. Usually these situations can be resolved through proper moderator action from server staff or by asking victims to block the offending users. However, if all possible measures have been taken on a server and user level to stop the behavior and it still persists (e.g., through ban evasion and the creation of alt accounts), then a Trust & Safety report is warranted (see guideline four).
Do not organize, promote, or coordinate servers around hate speech. It’s unacceptable to attack a person or a community based on attributes such as their race, ethnicity, national origin, sex, gender, sexual orientation, religious affiliation, or disabilities.
- If a user is directing any of these actions at another person or community, it is usually reportable. Some uses of hate speech or slurs that should not be reported include meta discussion about hate speech/slurs or relevant quotes that contain them. These situations are up to your server to moderate as you see fit.
Do not make threats of violence or threaten to harm others. This includes indirect threats, as well as sharing or threatening to share someone’s private personal information (also known as doxxing).
- If a user is at risk due to a threat made by another user, the threat should be reported. If the threat seems immediate, the affected user should also call their local authorities to escalate the issue to law enforcement, in addition to the report being made. It is important to recognize, however, when a threat may not be truly dangerous. Users may joke about this topic and mean no real harm. It is your job as a moderator to communicate with those involved to determine if a user is truly in danger.
Do not evade user blocks or server bans. Do not send unwanted, repeated friend requests or messages, especially after they’ve made it clear they don’t want to talk to you anymore. Do not try to hide your identity in an attempt to contact someone who has blocked you, or otherwise circumvent the tools we have which enable users to protect themselves.
- If a user is evading blocks or bans for the purpose of repeating their previous bad behavior, they should be reported. Users who evade a block or ban but do not continue the bad behavior should not be reported, though you may still choose to discipline them on a server level if you prefer. The key distinction is whether their actions are maliciously directed towards others.
Do not send others viruses or malware, attempt to phish others, or hack or DDoS them.
- Viruses, malware, and phishing attacks are reportable, as this content directly violates Discord’s Community Guidelines. Rumors that a user is a hacker are not reportable on their own; a report is only warranted if valid evidence of the behavior is included.
Here are some rules for content on Discord:
You must apply the NSFW label to channels if there is adult content in that channel. Any content that cannot be placed in an age-gated channel, such as avatars, server banners, and invite splashes, may not contain adult content.
- Isolated incidents where NSFW content is posted outside of an NSFW channel should be taken care of on a server level. If a server regularly fails to remove NSFW content posted outside of NSFW channels, the server may be reported for this behavior. NSFW content in places that cannot be age-gated, such as avatars, server banners, and invite splashes, may also be reported.
You may not sexualize minors in any way. This includes sharing content or links which depict minors in a pornographic, sexually suggestive, or violent manner, and includes illustrated or digitally altered pornography that depicts minors (such as lolicon, shotacon, or cub). We report illegal content to the National Center for Missing and Exploited Children.
- Content that sexualizes minors is taken very seriously and should be reported and removed as soon as possible. Be aware that some borderline content may be more subjective: it can be ambiguous or difficult to definitively determine whether the depicted subject is a minor or whether the content is sexualized at all.
You may not share sexually explicit content of other people without their consent, or share or promote sharing of non-consensual intimate imagery (also known as revenge porn) in an attempt to shame or degrade someone.
- The above content should always be reported to Discord’s Trust & Safety team.
You may not share content that glorifies or promotes suicide or self-harm, including any encouragement to others to cut themselves, or embrace eating disorders such as anorexia or bulimia.
- Small, one-time offenses, or users who talk about self-harm in reference to themselves, should be handled on a server level. Users who continue to promote self-harm, whether for themselves or others, should be reported. Users who are in physical danger due to a potential suicide attempt or self-harm should be reported as soon as possible. You should also consider calling law enforcement if you believe the user may be an imminent danger to themselves or others.
You may not share images of sadistic gore or animal cruelty.
You may not use Discord for the organization, promotion, or support of violent extremism.
- The above content should always be reported. Violent extremism is content that supports extreme religious or political views that may put other people in danger; it typically overlaps with the behavior described in guidelines one and three.
You may not operate a server that sells or facilitates the sales of prohibited or potentially dangerous goods. This includes firearms, ammunition, drugs, and controlled substances.
- Servers that do this should always be reported to Discord’s Trust & Safety team.
You may not promote, distribute, or provide access to content involving the hacking, cracking, or distribution of pirated software or stolen accounts. This includes sharing or selling cheats or hacks that may negatively affect others in multiplayer games.
- Users or servers that are involved in the above should be reported if there is significant evidence to prove that they are involved, such as explicit distribution of any of the above items or services. Users who talk about the topic but aren’t clearly involved should be handled by the server.
In general, you should not promote, encourage or engage in any illegal behavior. This is very likely to get you kicked off Discord, and may get you reported to law enforcement.
- Most common cases involving illegal behavior are covered in other guidelines. If you encounter illegal behavior not covered in the other guidelines which could potentially pose a serious threat, it should be reported. If a user jokes about illegal behavior or does not appear to be involved in illegal behavior, you may consider handling the situation on a server level.
Finally, we ask that you respect Discord itself:
You may not sell your account or your server.
- Users who want to buy or sell an account or server should be reported. Again, try to be sure they are serious about it before reporting.
You may not use self-bots or user-bots to access Discord.
- Users who use a self-bot or user-bot in a malicious way should be reported. Self-botting in this case includes setting up a bot or automation that allows a user to perform actions on Discord faster than any human is physically capable of. Discord prioritizes reports of self-bots that involve malicious behavior, such as nitro-sniping or data collection.
You may not share content that violates anyone’s intellectual property or other rights.
- In general, this guideline should be handled by the users involved, usually by means of a DMCA request. Instructions on properly filing a DMCA takedown request can be found in Discord’s Terms of Service; the request must be initiated by the rights-holder of the intellectual property or an authorized representative. You may handle these situations on a server level if you wish.
You may not spam Discord, especially our Customer Support and Trust & Safety teams. Making false and malicious reports, sending multiple reports about the same issue, or asking a group of users to all report the same content may lead to action being taken on your account.
- If you have evidence of users participating in this behavior, you should report it. Users impersonating Discord employees may also fall into this category. If you need to follow up on a report you have submitted, reply to the existing support ticket to bump it rather than creating a new one, as submitting duplicate tickets may end up violating this community guideline.
Extra Considerations
There are some things you might encounter that are not explicitly mentioned in the Community Guidelines. Here is a short list of common situations you may run into while on Discord:
- Bot Advertisements/Solicitations – If a bot is being used to advertise/solicit, you should report the bot. This is most commonly done in DMs.
- User Advertisements/Solicitations – Users can be reported for advertising and soliciting in DMs as well. However, if a user is advertising in a server, they should be dealt with on a server level, provided they are not advertising something that breaks another community guideline.
- Discord Nitro Scams – Any scams being spread about getting free Discord Nitro or server boosts should be reported if they contain a potentially malicious link, or if being spread by a bot. If you see any messages that claim to give users Discord Nitro or server boosts through other means, such as putting a command in the console, you may handle those cases on a server level. It is recommended, however, that you remove these messages swiftly, as they are likely a ploy to get users to reveal their account token, which can give other users access to their account.
- Chain Mail – You may encounter messages that are designed to scare users and spread rapidly. Common phrases in these messages are “Look out for Discord user ____” or anything claiming a user or bot might DDoS you or steal your IP. As a scare tactic, these messages will usually claim to have been written by Discord. This is not true. Any important messages from Discord will be posted on their social media or in a system message, and nowhere else. System messages are identifiable by the system tag seen next to the user who sends them. Chain mail is not reportable, but it should be dealt with on a server level. It is recommended that you delete these messages and inform users that they are fake.
Conclusion
This is not an exhaustive list of the situations you may encounter as a moderator, nor is it a guide for what rules you should enforce on your server, as the platform is ever-evolving. You may enforce extra rules as you see fit in order to cultivate the type of community you want for your server; to learn more about creating server rules, check out the article dedicated to that topic. However, integral to being able to continue to grow your server is the understanding that you need to report and remove content that breaks Discord’s Community Guidelines as necessary. Leaving up content that breaks the Community Guidelines can result in action being taken against your server if it becomes a space that enables harmful behavior.