August 18, 2022

Top 10 things you should never do in moderation

By The Bodyguard Team

Getting moderation right is tough. Achieving consistently effective moderation is even harder. But part of getting there is understanding what not to do, just as much as what to do. So, let’s explore the top 10 things that good moderators must avoid.

1: Over-moderating content

We’re all human, which means we all have individual biases against views, beliefs, or ways of life we don’t agree with. This is why over-moderation is a problem: moderators take it upon themselves to remove content they simply don’t like, rather than content that actually breaks their community guidelines. Content moderators should be guardians of free speech and help others express themselves without fear of judgment, so over-moderation is a big no-no.

2: Losing your temper

Being a moderator is difficult even at the best of times, and everyone has their limit. So when a hater starts throwing out extremely hurtful comments, it’s tempting to hate right back at them. But this only makes the moderator as bad as the hater. Chill out, stay calm and professional at all times, and deal with haters without stooping to their level. You’re better than this!

3: Doing nothing

Leaving dramas to play out doesn't help anybody. Action is crucial, because harmful content can go viral and hurt people fast. People who trust a community are more likely to express themselves freely, while those who are harassed or exposed to toxic content are likely to engage less or even leave the community altogether.

4: Ignoring feedback

Listening to feedback and incorporating it into your community is key to building long-term trust and consistent traffic. Whether it’s good or bad doesn’t matter: you should always take time to go through what users are saying and make them feel heard. Ignore it at your peril.

5: Neglecting guidelines

Platforms built for online communication can work brilliantly, but as soon as a platform publishes user-generated content, moderation guidelines become crucial. Neglecting them risks a negative user experience and inconsistent standards of behavior.

6: Ignoring trolls

Trolls come in many forms, and some are deeply toxic to online communities. Spreading negative content, abuse, or threats is one of the fastest ways to shrink a user base and damage a reputation. That’s why, when a troll rears its ugly head, it needs to be handled properly. Whatever your chosen response, be it deleting posts or revoking posting permissions, make sure you respond consistently and effectively every time.

7: Immediately answering every user question

This might sound counterintuitive, but answering all your community’s questions straight away can backfire, because it discourages users from participating and developing conversations themselves. So, instead of jumping in immediately, hold back and see if another user can answer the question. You have to find your own balance between responsiveness and restraint (back to number two).

8: Working too hard

Yep, that’s right. It’s so easy to get burned out as a moderator, so don’t overwork yourself or commit to more than you can handle. Moderation usually comes with a heavy workload, so going too hard will drain your energy and motivation, leading to a lower quality of moderation in the long run. After all, moderators are humans, too.

9: Staying hidden

A great way to bring your community to life is to act as a role model. That means engaging with your community and reinforcing the kinds of behavior you want to see. Moderators naturally differ in how visible they choose to be, but not being part of your community at all is a missed opportunity. Remember, communication is key.

10: Doing it all yourself

As mentioned above, moderation involves a lot of hard work. But you can increase the efficiency of your moderation by enabling effective (and respectful) peer-to-peer moderation. It’s a great way to engage your users, teach them your guidelines, and even find new moderators.

Alternatively, you can opt for effective automated moderation tools, like ours. Automated moderation takes a lot of the heavy lifting out of moderators’ lives and makes huge amounts of content easy to manage. 

AI moderation: our solution intelligently filters out content and then shows you how it’s been classified and what action has been taken.

Cleaning phase: Bodyguard analyzes text, typos, deliberately misspelled words and emojis in context.

Contextual analysis: our technology analyzes who the toxic content is addressed to.

Classification: easily organize messages into categories of severity.

Live streaming protection: automatically moderate millions of comments in real time during live events.

Dashboard: get access to insightful metrics that help you foster respect and engage your community.
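To make these stages concrete, here is a minimal sketch of how a pipeline like this could fit together. Every name, pattern, and rule in it is a hypothetical simplification for illustration; it is not Bodyguard's actual API or models.

import re
from dataclasses import dataclass

# Hypothetical severity categories, from least to most severe.
SEVERITY = {"neutral": 0, "insult": 1, "hate": 2, "threat": 3}

# Tiny stand-in lexicon; a production system would use trained models.
TOXIC_PATTERNS = {
    "insult": re.compile(r"\b(idiot|loser)\b"),
    "hate": re.compile(r"\b(hate you|disgusting)\b"),
    "threat": re.compile(r"\b(i will hurt|watch your back)\b"),
}

@dataclass
class Verdict:
    category: str  # classification label
    target: str    # who the message is addressed to
    action: str    # what should happen to the message

def clean(text: str) -> str:
    # Cleaning phase: lowercase, squash letters repeated three or
    # more times ("IDIOTTT" -> "idiot") and turn punctuation into
    # spaces. Deliberately crude compared to real typo handling.
    text = re.sub(r"(.)\1{2,}", r"\1", text.lower())
    return re.sub(r"[^\w\s@]", " ", text)

def target_of(text: str) -> str:
    # Contextual analysis: is the message aimed at a person?
    if "@" in text or " you" in f" {text}":
        return "person"
    return "general"

def classify(text: str) -> str:
    # Classification: the most severe matching category wins.
    label = "neutral"
    for category, pattern in TOXIC_PATTERNS.items():
        if pattern.search(text) and SEVERITY[category] > SEVERITY[label]:
            label = category
    return label

def moderate(raw: str) -> Verdict:
    text = clean(raw)
    category = classify(text)
    target = target_of(text)
    # Content aimed directly at a person is treated more strictly.
    if category == "threat":
        action = "remove and escalate"
    elif category != "neutral" and target == "person":
        action = "remove"
    elif category != "neutral":
        action = "flag for review"
    else:
        action = "keep"
    return Verdict(category, target, action)

print(moderate("You are an IDIOTTT!!!"))
# Verdict(category='insult', target='person', action='remove')

The key design point the sketch illustrates is ordering: cleaning runs first so disguised spellings can't slip past the classifier, and the final action depends on both the severity category and who the message targets.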

To find out more about our services and how automatic moderation can help you manage your community without straying into censorship, click here.
