The moderation feature allows you to assign certain users to review and approve or reject content changes submitted by community users before they are published. You can designate one or more moderators and enable moderation for a variety of content types. You can also set up abuse reporting and moderate user registration. After reviewing this article, you will be able to describe the various moderation roles.
Moderation allows you to designate one or more users to approve or reject documents, discussions, comments, and other types of content before they are published in the community.
A community can have the following moderator roles:
|Role|Location of setting in the Admin Console|
|---|---|
|Space Moderator|Permissions > Space Permissions > Moderate|
|Global Moderator|Permissions > System Administration > Moderate Content|
|Full Access|Permissions > System Administration > Full Access|
Note: Jive Cloud instances do not have the Full Access role.
Things to Note
- Social Groups inherit the moderation settings of the root container and are moderated by global moderators. Moderation inheritance works differently in Social Groups, Spaces, and Projects.
- Jive follows a moderation hierarchy system which is described in Moderation rules and notification hierarchy.
Jive Moderation Roles
The following types of moderation are available in Jive:
- Content Moderation
- Document Approval
- Profile Image Moderation
- User-uploaded Avatar Moderation
- Abuse Reporting
- User Registration Moderation
Content Moderation
Setting up content moderation in a place involves three steps:
- Selecting the place to be moderated.
- Selecting the content types to be moderated in the place.
- Assigning the moderators for the place.
Note: You cannot moderate content created in private groups, secret (also known as private unlisted) groups, or content that has visibility limited to the author (Hidden) or specific users.
Document Approval
Setting up document approval in your community makes it mandatory for all documents created in a space to be approved before they are published and made visible to other users. With a space approver assigned, users must submit a document for approval before it is published. The space approvers are set in the Admin Console as a setting for each space.
Fastpath: Admin Console: Spaces > Settings > Document Settings
How does document approval work?
- A user creates a document in the space.
- The user clicks Send for approval, and the document goes into an approval queue. The document isn't actually sent anywhere but is marked for approval by the application.
- The approver is alerted in their Inbox that something needs approval.
- The approver can view the document, and then approve or reject it.
- If they approve it, the document is published once every other approver has also approved it; all approvers must approve before the document is published.
- If they reject it, they can enter an explanation, and the document is sent back to the author as a draft. The author then can edit and resubmit the document.
Note: A document can also have document-level approvers designated by the author when creating the document.
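The approval flow above can be sketched as a small state machine. The class and method names below are illustrative, not Jive's actual implementation; the sketch only captures the rules described here: a submitted document waits in a pending state, every assigned approver must approve before it is published, and a rejection returns it to the author as a draft with an explanation.

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    """Minimal model of a document moving through space approval."""
    title: str
    state: str = "draft"                      # draft -> pending -> published (or back to draft)
    approvals: set = field(default_factory=set)
    rejection_note: str = ""

class ApprovalQueue:
    """Hypothetical sketch of the 'all approvers must approve' rule."""
    def __init__(self, approvers):
        self.approvers = set(approvers)

    def submit(self, doc):
        doc.state = "pending"                 # marked for approval; not moved anywhere

    def approve(self, doc, approver):
        doc.approvals.add(approver)
        if self.approvers <= doc.approvals:   # every assigned approver has approved
            doc.state = "published"

    def reject(self, doc, approver, reason):
        doc.state = "draft"                   # sent back to the author as a draft
        doc.rejection_note = reason
        doc.approvals.clear()                 # resubmission starts approvals over
```

For example, with approvers `alice` and `bob`, a document stays pending after only `alice` approves and becomes published once `bob` approves as well.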
Profile Image Moderation
You can enable moderation for images that users upload for use in their user profile. This feature is either fully enabled for all users or fully disabled for all users.
Fastpath: Admin Console: People > Settings > Profile Image Moderation
Before you set up moderation, the upload of profile images must be enabled in the community. For more information, see Configuring user profiles. To set up moderation of profile images:
- In the Admin Console, go to People > Settings > Profile Image Moderation.
- Select Enable moderation of new user profile images.
- Click Save.
User-uploaded Avatar Moderation
You can set up avatar moderation so that every user-uploaded image is added to a moderation queue for approval or rejection before going live in the community.
Fastpath: Admin Console: People > Settings > Avatar Settings
To enable user-uploaded avatar moderation:
- In the Admin Console, go to People > Settings > Avatar Settings.
- Under User Uploaded Avatars, select the Moderate uploaded user avatars checkbox.
- Click Save Settings.
Abuse Reporting
When abuse reporting is enabled, community members can use a link on content to report it as abusive. When someone clicks the link, the content is sent to the moderator's queue so that it can be evaluated.
Abuse reporting is a system-wide, global setting. When it's enabled, it is available for every piece of public or private content on which abuse can be reported.
Fastpath: Admin Console: Spaces > Settings > Abuse Settings
To enable Abuse Reporting:
- In the Admin Console, go to Spaces > Settings > Abuse Settings.
- Select the checkbox for Enable Abuse Reporting.
- Click Save Changes.
User Registration Moderation
You can moderate new user registrations if the community allows people to register on their own. A community manager or system administrator enables both user registration and moderation for new registration requests.
Fastpath: Admin Console: People > Settings > Registration Settings
To set up new user registration moderation:
- In the Admin Console, go to People > Settings > Registration Settings.
- Select Allow users to create their own account to let users register from the login page.
- Under Security > Registration Moderation, select Enabled to turn on the moderation feature for all new user registrations.
- To limit registration moderation to email addresses from blacklisted domains, select Only for addresses matching the blacklisted domain list.
The Blacklisted Domains section contains the list of untrustworthy domains. To block or moderate all addresses from a domain, use an asterisk before the domain.
- To entirely block registrations from the blacklisted domains, select Always block registrations from blacklisted domains under Blacklisted Domains.
- Click Save Settings at the bottom of the page.
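The wildcard matching described above (an asterisk before a domain to cover all addresses from it) can be sketched as follows. The function name and the use of glob-style matching are assumptions for illustration only; Jive's actual matcher may behave differently.

```python
import fnmatch

def is_blacklisted(email, blacklisted_domains):
    """Check an email address against a domain blacklist.

    A pattern with a leading asterisk (e.g. "*example.com") matches the
    bare domain and any subdomain. Hypothetical helper, not Jive's code.
    """
    domain = email.rsplit("@", 1)[-1].lower()
    return any(
        fnmatch.fnmatch(domain, pattern.lower())
        for pattern in blacklisted_domains
    )
```

Under this sketch, `*example.com` would match both `user@example.com` and `user@spam.example.com`, while addresses from other domains pass through unmoderated.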