
Automated review

Last Updated: Oct 30, 2025

Automated review detects various types of non-compliant content in your media assets, including pornography, terrorism and politically sensitive content, text and image violations, logos, undesirable scenes, and voice content violations. This topic describes each review type and explains how to use them.

Review types

Automated review accurately detects non-compliant content in video, audio, and image files across multiple dimensions. The following types of content moderation are supported:

Pornography: Detects pornographic or sexually suggestive content.

Terrorism and political content: Detects terrorism-related or politically sensitive content.

Text and image violations: Detects advertisements and other non-compliant text and image information.

Logos: Detects logos, such as TV station icons or trademarks.

Undesirable scenes: Detects undesirable scenes, such as black screens, black bars, dim footage, Picture-in-Picture (PiP), smoking, or in-car live streaming.

Voice content violations: Detects non-compliant spoken content within the audio track, such as advertisements, political content, terrorism-related speech, or abusive language.

Create an automated review template

OpenAPI

Call the CreateCustomTemplate operation with the appropriate Type parameter to create a custom automated review template.
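As a minimal sketch, the request for this step can be assembled as a set of query parameters before signing and sending. The TemplateConfig schema, the Type value, and the template name below are illustrative assumptions, not values from this document; take the authoritative parameter definitions from the API reference.

```python
import json

def build_create_template_params(name, template_type, template_config):
    """Assemble query parameters for a CreateCustomTemplate call.

    Signing and sending the request (normally handled by an SDK) are
    omitted; this only shows how the parameters fit together.
    """
    return {
        "Action": "CreateCustomTemplate",
        "Name": name,
        "Type": template_type,  # selects the template category (assumed value below)
        "TemplateConfig": json.dumps(template_config),
    }

params = build_create_template_params(
    "my-review-template",                    # hypothetical template name
    "review",                                # hypothetical Type value
    {"ReviewTypes": ["porn", "terrorism"]},  # hypothetical config body
)
```

An SDK or signed HTTP client would then send these parameters to the service endpoint.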

Create an automated review job

OpenAPI

Call the SubmitMediaCensorJob operation to submit an automated review job.
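The submission step can be sketched the same way: build the parameters for a SubmitMediaCensorJob call, with the input file described as an OSS location. The Input field layout, bucket, region, and object names below are placeholder assumptions; any additional required parameters should be taken from the API reference.

```python
import json

def build_submit_censor_params(bucket, location, object_key):
    """Assemble query parameters for a SubmitMediaCensorJob call.

    The Input structure (an OSS file location serialized as JSON) is an
    assumed layout for illustration.
    """
    input_file = {
        "Bucket": bucket,       # placeholder OSS bucket
        "Location": location,   # placeholder OSS region
        "Object": object_key,   # placeholder object key of the media file
    }
    return {
        "Action": "SubmitMediaCensorJob",
        "Input": json.dumps(input_file),
    }

params = build_submit_censor_params(
    "my-bucket", "oss-cn-hangzhou", "videos/sample.mp4"
)
```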

Query the details of an automated review job

Callbacks

You can obtain the details of an automated review job from the callback that is sent when the job is complete. For more information, see Callback overview.
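A callback receiver only needs to parse the notification body and dispatch on the event type. The field names and event type value in this sketch are hypothetical; the actual payload structure is defined in the callback documentation.

```python
import json

def parse_censor_callback(body: str):
    """Extract the event type and job ID from a callback payload.

    The "Type" and "JobId" field names are assumptions for illustration,
    not confirmed payload fields.
    """
    msg = json.loads(body)
    return msg.get("Type"), msg.get("JobId")

# Hypothetical sample payload for a completed review job.
event_type, job_id = parse_censor_callback(
    '{"Type": "CensorJobComplete", "JobId": "abc123"}'
)
```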

OpenAPI

Call the QueryMediaCensorJobDetail operation to query the details of an automated review job.
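For active polling, the query can be sketched as building the QueryMediaCensorJobDetail parameters and reading the job state out of the response. The response layout in `sample_response` is illustrative only; consult the API reference for the real structure and the possible state values.

```python
def build_query_params(job_id):
    """Assemble query parameters for a QueryMediaCensorJobDetail call."""
    return {
        "Action": "QueryMediaCensorJobDetail",
        "JobId": job_id,
    }

def extract_state(response):
    """Pull the job state out of a response.

    Assumes a hypothetical top-level MediaCensorJobDetail object with a
    State field; verify against the API reference.
    """
    return response.get("MediaCensorJobDetail", {}).get("State")

# Hypothetical response for a finished job.
sample_response = {"MediaCensorJobDetail": {"State": "Success"}}
state = extract_state(sample_response)
```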

API references