Whether it’s employees, fans, followers or subscribers: every kind of community thrives on active, healthy exchange. The problems of today’s social media platforms clearly show that such an exchange is rarely possible without human moderation. Because where there is discussion, there are sometimes unwanted words. And where there are attractive target groups, economic interests quickly follow.
tchop as a platform offers many different ways to build and maintain a lively, engaged community. Moderation by a team of administrators is naturally part of that – but what options are there in detail?
Of course, users can also post content themselves – in the News Feed, in a specific mix, or in the chat. Content in the public News Feed is particularly sensitive, as it is prominently visible to everyone.
As with all social media platforms, we do not recommend pre-moderation. In principle, it is possible to review and approve every single post before it goes live. But for the users themselves, this is usually frustrating and meets with little understanding. The risk of abuse is also far lower than often assumed, because most users think carefully about what they post and where.
If unsuitable content does appear (for example, unauthorized advertising), it can be unpublished or deleted with a single click. In most cases, you can rely on the community itself to report inappropriate content. Every card has a “Report Abuse” function, which Apple and Google also require for apps with such communities. Users can provide a reason when reporting content. The report then directly triggers an email to one or all administrators, who can quickly review the content – just as on Facebook & Co.
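The reporting flow just described – a user flags a card with a reason, and administrators are notified by email – can be sketched roughly like this. All names (`AbuseReport`, `notify_admins`) are hypothetical illustrations, not tchop’s actual API:

```python
from dataclasses import dataclass

@dataclass
class AbuseReport:
    """A user's complaint about a specific card (hypothetical model)."""
    card_id: str
    reporter: str
    reason: str

def notify_admins(report: AbuseReport, admin_emails: list[str]) -> list[str]:
    """Build one email notification per administrator for a new report."""
    subject = f"Abuse report for card {report.card_id}"
    body = f"{report.reporter} reported this card: {report.reason}"
    return [f"To: {email}\n{subject}\n{body}" for email in admin_emails]

mails = notify_admins(
    AbuseReport(card_id="card-42", reporter="user-7",
                reason="unauthorized advertising"),
    ["admin@example.org"],
)
```

The key design point is that the report itself carries the reason, so administrators can triage without opening the app first.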
Often it is not only the content itself but also the comments underneath that need attention. It is not unusual for a lively discussion to unfold there – and not uncommon for someone to strike the wrong tone. Here, too, administrators can hide each individual comment under a card (the comment is only hidden in the app, not deleted, so that it can be traced if necessary). And here, too, users can report conspicuous comments.
In addition, the automatic detection of swear words is a particularly useful feature. We maintain a long list of swear words in different languages. If one of these words is used, it is masked with “***”. The administrator does not have to do anything here. If certain words are missing from our list from the customer’s point of view, they can easily be added – please contact us at any time.
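A minimal sketch of this kind of word masking might look as follows. The word list shown is a made-up placeholder; the list actually maintained by the platform is much longer, multi-lingual, and customer-extensible:

```python
import re

# Hypothetical excerpt of a swear-word list; the real platform list
# covers multiple languages and can be extended per customer.
SWEAR_WORDS = {"idiot", "dummkopf"}

def mask_swear_words(text: str) -> str:
    """Replace each listed word (case-insensitive, whole words only) with '***'."""
    pattern = re.compile(
        r"\b(" + "|".join(re.escape(w) for w in SWEAR_WORDS) + r")\b",
        re.IGNORECASE,
    )
    return pattern.sub("***", text)

print(mask_swear_words("What an Idiot!"))  # → "What an ***!"
```

Matching whole words case-insensitively is important so that “Idiot” is caught but harmless words that merely contain a listed word are left alone.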
Chat is, of course, a sensitive area when it comes to moderation. For chat groups – whether private or public – we rely on the principle of centrally curated organization: only administrators and editors can create groups, manage them and invite users. This alone prevents an uncontrolled proliferation of groups.
Nevertheless, inappropriate messages do of course appear in the chat from time to time. Here, too, the administrator can intervene at any time and delete individual messages (for technical reasons, messages in a chat can only be deleted, not merely hidden). In the dashboard, this is done with one click on the element directly next to a message. In the same way, cards or content shared in the chat can be deleted, moved or even edited.
A problem that sooner or later arises in every open community: newly registered users use the platform – whether in chat or by posting content – for self-promotion and spam. Of course, this should and can be stopped quickly. There are various escalation levels: a user can be removed from a channel, a chat or, of course, from the entire organization or app. If there is a real person behind it, addressing them directly sometimes helps. But if the intentions are malicious, a ban is the only solution.
A banned user cannot log in again with the same email address. However, they could simply register again with a new address. That is why, if someone is banned from the organization, we also automatically block their IP address for a week, so that a new registration is initially not possible.
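The two-part ban described above – a permanent email block plus a temporary one-week IP block – can be sketched like this. Class and method names are illustrative only, not the platform’s actual implementation:

```python
class BanList:
    """Sketch of a ban: email blocked permanently, IP blocked for one week."""

    IP_BLOCK_SECONDS = 7 * 24 * 3600  # one week

    def __init__(self) -> None:
        self.banned_emails: set[str] = set()
        self.blocked_ips: dict[str, float] = {}  # ip -> block expiry timestamp

    def ban(self, email: str, ip: str, now: float) -> None:
        """Ban an email forever and block its IP for one week from `now`."""
        self.banned_emails.add(email)
        self.blocked_ips[ip] = now + self.IP_BLOCK_SECONDS

    def may_register(self, email: str, ip: str, now: float) -> bool:
        """A registration is allowed only if neither email nor IP is blocked."""
        if email in self.banned_emails:
            return False  # the same email can never be reused
        expiry = self.blocked_ips.get(ip)
        return expiry is None or now >= expiry  # IP block lapses after a week
```

The point of the temporary IP block is to slow down immediate re-registration with a fresh email, while the email block itself never expires.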
At all levels, you as the customer or administrator have full control to hide or delete content, comments and messages at any time. Together with the “Report Abuse” function, this fulfills the essential requirements of post-moderation. Beyond the features mentioned above, the platform also offers a number of tools that make moderation efficient and easy – including direct 1:1 chat with individual users and the entire content management area.
Do you have further questions about content and community moderation? Contact us at any time!