Content Moderation applies a default moderation policy to all API requests. If the default policy is too strict or too lenient for a specific business scenario, create a custom BizType and configure its moderation policy. Each BizType maps to one policy, and API requests that specify a BizType use that policy instead of the default.
Prerequisites
Before you begin, ensure that you have:
An active Alibaba Cloud account with access to the Content Moderation console
(First-time users) Granted read and write permissions on Object Storage Service (OSS) to your account, as prompted on first login. You can reuse an existing permission policy for detecting illegal content in an OSS bucket.
Key concepts
BizType is a string identifier you attach to API requests to select a specific moderation policy. If you do not specify a BizType, Content Moderation uses the default policy.
Each BizType covers three configuration areas:
| Area | What you configure |
|---|---|
| Moderation policy | Which content types to screen and how strictly |
| Audit data | Which results to route to human review |
| Evidence storage | Whether to archive flagged content to OSS |
Create a BizType
Log on to the Content Moderation console.
In the left-side navigation pane, choose Machine audit V1.0 > Settings.
On the BizType Manage tab of the Machine audit page, click Create BizType.
In the Create bizType dialog box, configure the following parameters:
| Parameter | Description |
|---|---|
| Name | A name for the BizType. Allowed characters: digits, letters, and underscores (_). Maximum length: 32 characters. |
| Apply Template | Select an industry template to import a preset configuration. If you use a template, some policy settings cannot be customized further. Leave blank to configure the policy manually. |
| Industry Category | The industry category for this scenario. Setting an appropriate category helps Alibaba Cloud assist with policy adjustments. |
| Copy from | Import the configuration of an existing BizType as the starting point. |
| Description | A description of the BizType. Allowed characters: letters, digits, and underscores (_). Maximum length: 32 characters. |

Click OK.
Configure the moderation policy
After creating a BizType, configure what content it screens.
On the Machine audit page, find the BizType and click Edit in the Actions column.
On the Moderation Policy tab, select the moderation scenarios that apply to your business. Content Moderation supports machine-assisted moderation across multiple content types. Clear any scenarios your business does not need. For example, if ad moderation is not required, clear all ad-related scenarios to avoid false positives on legitimate promotional content.
For image moderation policies, click Associated Gallery or Associated Text Library in the upper-right corner of a section to associate the policy with your custom image libraries or text libraries.
| Content type | Available scenarios |
|---|---|
| Images and videos | Pornography detection, terrorist content detection, undesirable scene detection, ad violation detection |
| Text and audio | Pornographic content, political content, abuse, ads, prohibited content |

Click Save.
Use the BizType in API requests
To apply a custom policy, include the BizType name in your API request. Separately, each moderation scenario is selected by its own parameter value. For example, to screen images for pornographic content under the custom policy, set the scenes parameter to porn in the API request.
For the full parameter reference, see /green/image/scan.
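As a sketch, a request body for /green/image/scan that applies a custom policy might look like the following. The BizType name, data ID, and image URL are placeholders, and endpoint signing and the HTTP call are omitted:

```python
import json

# Illustrative /green/image/scan request body. "my_biztype" and the image
# URL are placeholders; authentication and transport are not shown.
body = {
    "bizType": "my_biztype",   # custom BizType; omit this field to use the default policy
    "scenes": ["porn"],        # moderation scenario(s) to run
    "tasks": [
        {"dataId": "task-001", "url": "https://example.com/image.jpg"}
    ],
}

print(json.dumps(body, indent=2))
```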
Configure audit data routing
Select which machine-assisted moderation results are forwarded to human review.
Click the Audit Data tab.
Select the result types to route to human reviewers.
For details on the human review workflow, see Review machine-assisted moderation results.
Configure evidence storage
Evidence storage archives the content that machine-assisted moderation processes—flagged, suspicious, or clean—to a specified OSS bucket. This lets you retain a record for compliance, auditing, or appeal handling.
Evidence storage is disabled by default.
How it works
When enabled, Content Moderation stores content based on the suggestion value returned by the API:
| API suggestion value | Meaning | Content stored |
|---|---|---|
| block | Illegal | Stored if Machine identification violation is selected |
| review | Suspicious | Stored if Machine identification is suspected is selected |
| pass | Normal | Stored if The machine identification is normal. is selected |
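The mapping above reduces to a membership check: a result is archived only when its suggestion value matches one of the transfer-range options selected in the console. A minimal sketch, with the console option labels shortened to the corresponding suggestion values:

```python
# Transfer-range options selected in the console, expressed as the
# suggestion values they enable (example: archive block and review results).
selected_ranges = {"block", "review"}

def is_stored(suggestion: str) -> bool:
    """Return True if evidence storage would archive a result with this suggestion."""
    return suggestion in selected_ranges

print(is_stored("block"))   # True
print(is_stored("pass"))    # False
```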
After storage, the OSS URL of each archived file is returned in the API response:
| Content type | API response field |
|---|---|
| Video | data.extras.newUrl |
| Audio | data.new_url and data.result.details.url |
| Image | data.storedUrl |
You can also view archived files in the OSS console.
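For example, a hedged sketch of reading the archived-file URL from a parsed response, using the field names in the table above (the surrounding response structure is illustrative):

```python
def stored_url(content_type: str, data: dict):
    """Return the OSS URL of the archived file for one task result, or None.

    `data` is the per-task `data` object from the API response; field names
    follow the table above.
    """
    if content_type == "video":
        return data.get("extras", {}).get("newUrl")
    if content_type == "audio":
        return data.get("new_url")   # per-fragment URLs sit in data.result.details[].url
    if content_type == "image":
        return data.get("storedUrl")
    return None

# Illustrative fragment for an image task:
print(stored_url("image", {"storedUrl": "https://my-bucket.oss-cn-hangzhou.aliyuncs.com/evidence/img.jpg"}))
```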
Enable evidence storage
Click the Evidence Storage tab.
In the Video Evidence Storage, Audio Evidence Storage, and Image evidence transfer sections, turn on Enable Evidence Storage and configure the following parameters:
| Parameter | Description |
|---|---|
| OSS Bucket | The OSS bucket where evidence files are stored. |
| Storage Directory | The folder within the OSS bucket. If the folder does not exist, it is created automatically. |
| URL Validity Duration | How long the OSS URL for each evidence file remains valid. Valid values: 300–3600 seconds. |
| Transfer range | The result types to store. Select one or more: Machine identification violation (block), Machine identification is suspected (review), The machine identification is normal. (pass). |

Click Save.
Storage path rules
All evidence files follow a structured path within the OSS bucket.
Video-related files
| File type | Storage path | Naming convention |
|---|---|---|
| Video file | ${bucket}/${Storage folder}/video/${suggestion}/${taskId}/${Video name.Suffix} | Named after the original video file |
| Video snapshot | ${bucket}/${Storage folder}/video/${suggestion}/${taskId}/${Video snapshot name.Suffix} | Named by capture time, e.g., 00_01_02 = captured at 00:01:02 |
| Video stream | ${bucket}/${Storage folder}/video/${suggestion}/${taskId}/${Video stream name.Suffix} | Named by review start time, e.g., 20190102_12_02_03.wav = started at 12:02:03 on January 2, 2019 |
Audio-related files
| File type | Storage path | Naming convention |
|---|---|---|
| Audio file | ${bucket}/${Storage folder}/audio/${suggestion}/${taskId}/${Audio name.Suffix} | Named after the original audio file |
| Audio fragment | ${bucket}/${Storage folder}/audio/${suggestion}/${taskId}/${Audio fragment name.Suffix} | Named by start and end time, e.g., 00_01_02-00_10_13.mp3 = trimmed from 00:01:02 to 00:10:13 |
Image files
| File type | Storage path | Naming convention |
|---|---|---|
| Image | ${bucket}/${Storage folder}/image/${suggestion}/${taskId}/${Image name.Suffix} | Named after the original image file |
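All three tables follow one pattern: ${bucket}/${Storage folder}/${content type}/${suggestion}/${taskId}/${file name}. A sketch that assembles such a path (all inputs are placeholders):

```python
def evidence_path(bucket: str, folder: str, content_type: str,
                  suggestion: str, task_id: str, filename: str) -> str:
    """Assemble the OSS object path for an archived evidence file,
    following the ${bucket}/${folder}/${type}/${suggestion}/${taskId}/${name} pattern."""
    return f"{bucket}/{folder}/{content_type}/{suggestion}/{task_id}/{filename}"

# An image flagged as block, archived under an "evidence" folder:
print(evidence_path("my-bucket", "evidence", "image", "block", "task-001", "photo.jpg"))
# my-bucket/evidence/image/block/task-001/photo.jpg
```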
What's next
Review machine-assisted moderation results — Route flagged content to human reviewers for a final decision.
/green/image/scan — Full parameter reference for image moderation API requests.