
Content Moderation: Customize policies for machine-assisted moderation

Last Updated: Nov 10, 2023

Content Moderation uses the default Alibaba Cloud moderation policy for machine-assisted moderation. If the default policy is too strict or too loose for your business requirements, you can customize policies for machine-assisted moderation in Content Moderation. This topic describes how to customize a policy for machine-assisted moderation.

Background information

A BizType setting specifies a business scenario. Moderation policies are customized based on the business scenario. Each business scenario corresponds to a moderation policy. If you do not customize moderation policies, the default business scenario and corresponding default moderation policy are used. After you customize a policy for machine-assisted moderation based on a business scenario, you can specify the business scenario in an API request for content moderation. In this case, the corresponding moderation policy takes effect.
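As a rough sketch of how a business scenario is selected at request time, the snippet below builds a moderation request body that carries a bizType field. The field names follow the /green/image/scan API referenced later in this topic; the scenario name my_forum and the image URL are placeholders, and the exact request envelope may differ depending on your SDK.

```python
import json

def build_scan_request(biz_type, image_urls):
    """Build an image moderation request body.

    The bizType field names the customized business scenario; if it is
    omitted, the default scenario and its default moderation policy apply.
    """
    return {
        "bizType": biz_type,                       # custom business scenario
        "scenes": ["porn"],                        # moderation scenarios to run
        "tasks": [{"url": u} for u in image_urls],
    }

body = build_scan_request("my_forum", ["https://example.com/a.jpg"])
print(json.dumps(body, indent=2))
```

With bizType set, the moderation policy customized for that scenario takes effect instead of the default policy.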

Procedure

  1. Log on to the Content Moderation console.

  2. In the left-side navigation pane, choose Machine audit V1.0 > Settings.

  3. Create a business scenario.

    If you have created a business scenario, skip this step.

    1. On the BizType Manage tab of the Machine audit page, click Create BizType.

    2. In the Create bizType dialog box, set the parameters that are described in the following table. You can set a name and specify an industry template and the industry category for the business scenario.

      • Name: The name of the business scenario. The name can contain digits, letters, and underscores (_) and can be up to 32 characters in length.

      • Apply Template: Specifies whether to import the configuration of an industry template. If you use an industry template, specific features of the moderation policy cannot be customized. If you do not use an industry template, you can customize the moderation policy.

      • Industry Category: The industry category of the business scenario. An appropriate industry category allows Alibaba Cloud to help you adjust your moderation policy.

      • Copy from: If you have already created a business scenario, you can import the configuration of that existing business scenario.

      • Description: The description of the business scenario. The description can contain letters, digits, and underscores (_) and can be up to 32 characters in length.

    3. Click OK. The business scenario is created.

  4. Configure the business scenario.

    To configure the business scenario, complete configurations on the Moderation Policy, Audit Data, and Evidence Storage tabs.

    • Configurations on the Moderation Policy tab

      The moderation policy template varies based on the moderation scenario. For more information, see the templates in the console.

      1. On the Machine audit page, find the business scenario that you want to configure and click Edit in the Actions column.

      2. Customize the policy for machine-assisted moderation based on your business requirements.

        Content Moderation allows you to customize the moderation policy for images, videos, text, and audio. For images and videos, machine-assisted moderation covers pornography detection, terrorist content detection, undesirable scene detection, and ad violation detection. For text and audio, machine-assisted moderation detects pornographic content, political content, abuse, ads, and prohibited content.

        If a specific moderation scenario is not required for your business, do not select the moderation scenario. For example, if you do not want to moderate ads, you can clear all ad-related moderation scenarios when you customize the policy for machine-assisted moderation.

        Note

        When you customize the moderation policy for images, you can click Associated Gallery or Associated Text Library in the upper-right corner of a section to associate the moderation policy with the configurations of image libraries or text libraries.

      3. Click Save.

        If you need to use the customized policy for a specific moderation scenario when you call an API operation, you must specify the moderation scenario in the API request. For example, when you call an API operation to moderate images for pornographic content, set the scenes parameter to porn in the API request. In this case, the corresponding moderation policy takes effect. For more information, see /green/image/scan.
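To illustrate how the selected scenes map back to results, the sketch below pulls the per-scene suggestion values out of a /green/image/scan response. The response layout assumed here (data → results → scene/suggestion) is inferred from the parameter names in this topic and may differ in your SDK.

```python
def scene_suggestions(response):
    """Map (taskId, scene) to the suggestion returned for that scene.

    Assumes a response shaped like:
    {"data": [{"taskId": "...",
               "results": [{"scene": "porn", "suggestion": "block"}]}]}
    """
    out = {}
    for task in response.get("data", []):
        for result in task.get("results", []):
            out[(task.get("taskId"), result["scene"])] = result["suggestion"]
    return out

sample = {"data": [{"taskId": "t1",
                    "results": [{"scene": "porn", "suggestion": "pass"}]}]}
print(scene_suggestions(sample))  # {('t1', 'porn'): 'pass'}
```

Each requested scene is evaluated under the policy customized for it, so checking suggestions per scene is how the customized policy surfaces in responses.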

    • Configurations on the Audit Data tab

      1. Click the Audit Data tab.

      2. On the Audit Data tab, select the type of data to be reviewed.

        For more information about the human review feature, see Review machine-assisted moderation results.

    • Configurations on the Evidence Storage tab

      When you call the Content Moderation API to moderate content, you can enable the feature of storing evidence in Object Storage Service (OSS) buckets to retain the evidence that is obtained from machine-assisted moderation results. The evidence storage feature must be used with Alibaba Cloud OSS. You can use this feature to store illegal, suspicious, and normal content identified in videos, audio, and images in the specified OSS bucket. After the content is stored, an OSS URL is generated for the stored content. This section describes how to enable and configure the feature of storing evidence in OSS buckets.

      Only videos, audio, and images in which illegal, suspicious, or normal content is identified can be stored in the specified OSS buckets. The stored content refers to the videos, audio, and images whose machine-assisted moderation result, returned in the suggestion parameter, is block, review, or pass: block indicates illegal content, review indicates suspicious content, and pass indicates normal content.

      By default, the evidence storage feature is disabled. To use this feature, you must enable it and configure data for storing videos, audio, and images in which illegal, suspicious, or normal content is identified in OSS buckets.

      • If the feature of storing video evidence in a specified OSS bucket is enabled, video files or streams whose machine-assisted moderation result is block, review, or pass and the related video snapshots are stored in the specified OSS bucket.

      • If the feature of storing audio evidence in a specified OSS bucket is enabled, audio files or streams whose machine-assisted moderation result is block, review, or pass and the related audio fragments are stored in the specified OSS bucket.

      • If the feature of storing image evidence in a specified OSS bucket is enabled, images whose machine-assisted moderation result is block, review, or pass are stored in the specified OSS bucket.
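The mapping between suggestion values and result classes described above can be captured in a small helper. The names below (SUGGESTION_MEANING, should_store) are illustrative only, not part of the Content Moderation API:

```python
# Mapping stated in this topic: block = illegal, review = suspicious,
# pass = normal.
SUGGESTION_MEANING = {
    "block": "illegal",
    "review": "suspicious",
    "pass": "normal",
}

def should_store(suggestion, enabled_classes):
    """Return True if evidence storage applies to this result class.

    enabled_classes mirrors the Transfer range selection in the console,
    e.g. {"block", "review"} to store only illegal and suspicious content.
    """
    return suggestion in enabled_classes

print(should_store("review", {"block", "review"}))  # True
print(should_store("pass", {"block", "review"}))    # False
```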

      1. Click the Evidence Storage tab.

      2. On the Evidence Storage tab, turn on Enable Evidence Storage in the Video Evidence Storage, Audio Evidence Storage, and Image evidence transfer sections as needed and complete the required configurations.

        Note

        If you log on to the Content Moderation console for the first time, you must grant the read and write permissions on OSS to the current account as prompted. You can reuse the permission policy for detecting illegal content in an OSS bucket. If the current account is granted the required permissions, the Evidence Storage tab appears.

        • OSS Bucket: The OSS bucket that is used to store the evidence files.

        • Storage Directory: The folder in the specified OSS bucket that is used to store the evidence files. All evidence files are stored in this folder based on the storage rules. For more information, see Storage rules. Note: If the specified folder does not exist in the OSS bucket, the system automatically creates one.

        • URL Validity Duration: The validity period of the OSS URL of each evidence file that is stored in the OSS bucket. Valid values: 300 to 3600. Unit: seconds.

        • Transfer range: The result classes whose content is stored. Valid values:

          • Machine identification violation: stores the content whose machine-assisted moderation result is block.

          • Machine identification is suspected: stores the content whose machine-assisted moderation result is review.

          • The machine identification is normal: stores the content whose machine-assisted moderation result is pass.

      3. Click Save.

        After you complete the configurations for the evidence storage feature, the detected illegal, suspicious, and normal content is stored in the specified OSS buckets when you call API operations to review videos and images and to detect spam in audio. You can find the OSS URLs of the stored files in the returned parameters:

        • Video-related files: the data.extras.newUrl parameter.

        • Audio-related files: the data.new_url and data.result.details.url parameters.

        • Image files: the data.storedUrl parameter.

        You can also log on to the OSS console to view the files stored in the specified buckets.
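As a hedged sketch, the helper below collects the OSS URLs of stored evidence from a moderation response. The parameter names (extras.newUrl for video, new_url and result.details.url for audio, storedUrl for images) follow this topic, but the surrounding layout of the data field is an assumption and may differ between API operations:

```python
def stored_urls(response):
    """Collect the OSS URLs of stored evidence files from a response.

    Assumes `data` is a list of task results; each task may carry
    video (extras.newUrl), audio (new_url, result.details[].url), or
    image (storedUrl) evidence URLs.
    """
    urls = []
    data = response.get("data", [])
    for item in (data if isinstance(data, list) else [data]):
        extras = item.get("extras") or {}
        if extras.get("newUrl"):                  # video files and snapshots
            urls.append(extras["newUrl"])
        if item.get("new_url"):                   # audio files
            urls.append(item["new_url"])
        for detail in (item.get("result") or {}).get("details", []):
            if detail.get("url"):                 # audio fragments
                urls.append(detail["url"])
        if item.get("storedUrl"):                 # image files
            urls.append(item["storedUrl"])
    return urls
```

Note that each URL is only valid for the period set in URL Validity Duration, so fetch or re-sign the objects before the URLs expire.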

        Storage rules

        Table 1. Storage rules for video-related files

        • Video file
          Storage path: ${bucket}/${Storage folder}/video/${suggestion}/${taskId}/${Video name.Suffix}
          Naming convention: A video file stored in the OSS bucket is named after the original video file.

        • Video snapshot
          Storage path: ${bucket}/${Storage folder}/video/${suggestion}/${taskId}/${Video snapshot name.Suffix}
          Naming convention: A video snapshot file stored in the OSS bucket is named based on the point in time when the snapshot is captured. For example, 00_01_02 indicates that the snapshot was captured at 00:01:02.

        • Video stream
          Storage path: ${bucket}/${Storage folder}/video/${suggestion}/${taskId}/${Video stream name.Suffix}
          Naming convention: A video stream file stored in the OSS bucket is named based on the point in time when the review starts. For example, the name 20190102_12_02_03.wav indicates that the review of the video stream started at 12:02:03 on January 2, 2019.

        Table 2. Storage rules for audio-related files

        • Audio file
          Storage path: ${bucket}/${Storage folder}/audio/${suggestion}/${taskId}/${Audio name.Suffix}
          Naming convention: An audio file stored in the OSS bucket is named after the original audio file.

        • Audio fragment
          Storage path: ${bucket}/${Storage folder}/audio/${suggestion}/${taskId}/${Audio fragment name.Suffix}
          Naming convention: An audio fragment file stored in the OSS bucket is named based on the start and end points in time of the audio fragment. For example, the name 00_01_02-00_10_13.mp3 indicates that the audio fragment is trimmed from 00:01:02 to 00:10:13 of the original audio file.

        Table 3. Storage rules for image files

        • Image
          Storage path: ${bucket}/${Storage folder}/image/${suggestion}/${taskId}/${Image name.Suffix}
          Naming convention: An image file stored in the OSS bucket is named after the original image file.
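The storage paths in the tables above all follow the same pattern, which can be sketched as a small formatting helper. The function name and the example values (bucket, folder, task ID) are placeholders, not values the service prescribes:

```python
def evidence_path(bucket, folder, media_type, suggestion, task_id, filename):
    """Build the OSS object path for a stored evidence file, following
    ${bucket}/${Storage folder}/<media>/${suggestion}/${taskId}/<file name>.

    media_type is "video", "audio", or "image"; suggestion is the
    machine-assisted moderation result: "block", "review", or "pass".
    """
    return f"{bucket}/{folder}/{media_type}/{suggestion}/{task_id}/{filename}"

# A snapshot captured at 00:01:02 of a video whose result is block:
print(evidence_path("my-bucket", "evidence", "video", "block",
                    "task-123", "00_01_02.jpg"))
# my-bucket/evidence/video/block/task-123/00_01_02.jpg
```

Because the suggestion value is part of the path, listing a prefix such as my-bucket/evidence/video/block/ in OSS returns only the evidence for illegal videos.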