Delivering a safe and appropriate chat environment to your users is essential. Agora Chat gives you multiple options to intelligently moderate the chat content and handle inappropriate user behavior.
The following table summarizes the message management tools provided by Agora Chat:
Function | Description |
---|---|
Message reporting | The reporting API allows end users to report objectionable messages directly from their applications. Moderators can view report items on Agora Console and take action on the messages and their senders. |
Text and image moderation | This feature uses a third-party machine learning model to automatically moderate text and image messages and block questionable content. |
Profanity filter | The profanity filter can detect and filter out profanities contained in messages according to the configurations you set. |
Domain filter | The domain filter can detect and filter out certain domains contained in messages according to the configurations you set. |
To use the reporting feature, you need to call the reporting API when developing your application. For details, see the reporting API documentation for your platform.
After a user reports a message from the application, moderators can check and deal with the report on Agora Console:
To enter the Message Report page, in the left navigation menu, click Project Management > Config for the project you want to configure > Config of Chat > Message Report > History, as shown in the following figure:
On the Message Report page, moderators can filter the message report items by time period, session type, or handling method. They can also search for a specific report item by user ID, group ID, or chat room ID. For the reports, Agora Chat supports two handling methods: withdrawing the message or asking the sender to process the message.
Powered by Microsoft Azure Moderator, Agora Chat's text and image moderation scans messages for illegal text and image content and marks that content for moderation. Microsoft Azure Moderator evaluates each message across three moderation categories and returns a per-category confidence score between 1 and 5.
After enabling the text and image moderation feature on Agora Console, you can set a threshold for each moderation category. When the score returned by Microsoft Azure Moderator reaches the threshold, Agora Chat blocks the message. You can also impose a penalty on users who reach the violation limit within a time period. The moderation penalties include the following: banning the user, forcing the user to go offline, or deleting the user.
To see how moderation works and determine which moderation settings suit your needs, you can test different text strings and images on Agora Console.
Taking one-to-one chat text moderation as an example, follow these steps to create a text moderation rule:
To enter the Rule Config page, in the left navigation menu, click Project Management > Config for the project you want to configure > Config of Chat > Text Moderation or Image Moderation > Rule Config, as shown in the following figure:
To create a text moderation rule, click Add:
The following table lists the fields that a text moderation rule supports:
Field | Description |
---|---|
Rule name | The rule name. |
Conversation type | The moderation scope: one-to-one chats, a specific chat group or chat room, or all chat groups and chat rooms. A rule set for a specific chat group or chat room overrides the global moderation rules for chat groups and chat rooms. |
Enable | Determines whether to turn a rule on or off. |
Message handling | |
Rule | Sets the threshold for each moderation category. |
User management | Imposes a penalty on users who reach the violation limit within a time period. The moderation penalties include the following: banning the user, forcing the user to go offline, or deleting the user. |
After creating a rule, you can edit or delete the rule:
To enter the Rule Test page, in the left navigation menu, click Project Management > Config for the project you want to configure > Config of Chat > Text Moderation or Image Moderation > Rule Test, as shown in the following figure:
Select a rule, fill in the text to moderate, and click Check to test the rule. The moderation result is displayed on the same page.
The profanity filter can detect and filter out profanities contained in messages according to the configurations you set.
Follow these steps to specify a profanity filter configuration:
To enter the Rule Config page, in the left navigation menu, click Project Management > Config for the project you want to configure > Config of Chat > Word List > Rule Config, as shown in the following figure:
On the Rule Config page, you can add or delete words and choose which filtering method to apply to messages that contain the specified keywords: replace the word with *** or simply block it from being sent.
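The two handling methods above (masking the word or blocking delivery) can be sketched as follows. This is a minimal illustration; the word list and function names are invented for the example, and the real filter is configured on Agora Console rather than in client code:

```python
import re

# Illustrative word list; in practice this comes from the Rule Config page.
BLOCKED_WORDS = {"badword", "slur"}

def mask_profanity(message: str) -> str:
    """Handling method 1: replace each configured word with ***."""
    pattern = re.compile("|".join(re.escape(w) for w in BLOCKED_WORDS), re.IGNORECASE)
    return pattern.sub("***", message)

def should_drop(message: str) -> bool:
    """Handling method 2: refuse to deliver a message containing any
    configured word."""
    lowered = message.lower()
    return any(w in lowered for w in BLOCKED_WORDS)
```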
The domain filter can detect and filter out certain domains contained in messages according to the configurations you set.
Follow these steps to specify a domain filter configuration:
To enter the Rule Config page, in the left navigation menu, click Project Management > Config for the project you want to configure > Config of Chat > Domain Filter > Rule Config, as shown in the following figure:
To create a domain filtering rule, click Add:
The following table lists the fields that a domain filtering rule supports:
Field | Description |
---|---|
Rule name | The rule name. |
Conversation type | The moderation scope: one-to-one chats, a specific chat group or chat room, or all chat groups and chat rooms. A rule set for a specific chat group or chat room overrides the global moderation rules for chat groups and chat rooms. |
Enable | Determines whether to turn a rule on or off. |
Message handling | |
Domain name | Adds a domain to the rule. |
User management | Imposes a penalty on users who reach the violation limit within a time period. The moderation penalties include the following: banning the user, forcing the user to go offline, or deleting the user. |
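Domain filtering amounts to extracting domain names from message text and matching them against the configured list. The sketch below illustrates the idea; the regex, blocklist, and function name are invented for the example, and the real rule is configured on Agora Console:

```python
import re

# Illustrative blocklist; in practice the domains are added per rule
# on Agora Console.
BLOCKED_DOMAINS = {"spam.example.com", "phishing.example.net"}

# Simple domain matcher: dot-separated hostname labels.
DOMAIN_RE = re.compile(r"\b((?:[a-z0-9-]+\.)+[a-z]{2,})\b", re.IGNORECASE)

def find_blocked_domains(message: str) -> set[str]:
    """Return any configured domains that appear in the message."""
    found = {m.group(1).lower() for m in DOMAIN_RE.finditer(message)}
    return found & BLOCKED_DOMAINS
```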
After creating a rule, you can edit or delete the rule:
You can check the history of text moderation, image moderation, profanity filtering, and domain filtering on Agora Console. You can filter the moderation items by time period, session type, or moderation result, and search for a specific item by sender ID or receiver ID.
You can impose penalties on users for repeated violations. Penalties can be applied globally across the application, or only to a specific chat group or chat room. The following table lists the user moderation options that Agora Chat supports:
User moderation options | Actions | Description |
---|---|---|
Global actions on users | Banning a user | A banned user immediately goes offline and is not allowed to log in again until the ban is lifted. |
 | Forcing a user to go offline | Users who are forced to go offline need to log in again to use the Chat service. |
 | Deleting a user | If the deleted user is the owner of a chat group or chat room, the group or room is also deleted. |
Chat group management | Muting a user in a chat group | A muted user cannot send messages in this chat group until unmuted. |
 | Muting all users in a chat group | Members in a muted group cannot send messages until the mute is lifted. |
 | Managing a group blocklist | Users who are added to the group blocklist are removed from the group immediately and cannot join the group again. |
 | Removing a user from a chat group | The removed user can no longer receive messages in this group until they rejoin it. |
Chat room management | Muting a user in a chat room | A muted user cannot send messages in this chat room until unmuted. |
 | Muting all users in a chat room | Members in a muted room cannot send messages until the mute is lifted. |
 | Managing a room blocklist | Users who are added to the room blocklist are removed from the room immediately and cannot join the room again. |
 | Removing a user from a chat room | The removed user can no longer receive messages in this room until they rejoin it. |
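The per-scope actions in the table above can be summarized as a simple dispatch. The scope and action identifiers below mirror the table, but the mapping itself is an illustrative sketch; the actual enforcement happens through Agora Console operations:

```python
# Sketch mapping the table's moderation scopes to their available actions.
# Identifiers are illustrative; enforcement is done on Agora Console.
ACTIONS = {
    "global": {"ban", "force_offline", "delete_user"},
    "group": {"mute_user", "mute_all", "blocklist_user", "remove_user"},
    "chatroom": {"mute_user", "mute_all", "blocklist_user", "remove_user"},
}

def apply_action(scope: str, action: str, target_id: str) -> str:
    """Validate that the action exists in the given scope before applying it."""
    if action not in ACTIONS.get(scope, set()):
        raise ValueError(f"{action!r} is not available in scope {scope!r}")
    return f"{action} applied to {target_id} in {scope} scope"
```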
To enter the User Management page, in the left navigation menu, click Project Management > Config for the project you want to configure > Config of Chat > Operation Management > User, as shown in the following figure:
To take action on a user (banning a user, deleting a user, or forcing a user to go offline), search for the user ID, and click More:
To enter the Chat Group Management page, in the left navigation menu, click Project Management > Config for the project you want to configure > Config of Chat > Operation Management > Group, as shown in the following figure:
To take action on a group member (removing a member from the group or adding a user to the group blocklist), search for the group ID, and click More:
You can also click the group ID to enter the group's moderation dashboard, where you can manage the group info, group members, and messages in real time:
To use this feature, you need to enable it on the Features > Overview page.
To enter the Chat Room Management page, in the left navigation menu, click Project Management > Config for the project you want to configure > Config of Chat > Chat Room Management, as shown in the following figure:
To take action on a room member (removing a member from the room or adding a user to the room blocklist), search for the room ID, and click More:
You can also click the room ID to enter the room's moderation dashboard, where you can manage the room info, room members, and messages in real time:
To use this feature, you need to enable it on the Features > Overview page.