Automated review detects multiple types of non-compliant content in your media assets, including pornographic content, terrorism-related and politically sensitive content, text and image violations, logos, undesirable scenes, and voice content violations. This topic describes each review type and explains how to use them.
Review types
Automated review accurately detects non-compliant content in video, audio, and image files across multiple dimensions. The following types of content moderation are supported:
| Type | Description |
| --- | --- |
| Pornography | Detects pornographic or sexually suggestive content. |
| Terrorism and political content | Detects terrorism-related or politically sensitive content. |
| Text and image violations | Detects advertisements and other non-compliant text and image information. |
| Logos | Detects logos, such as TV station icons or trademarks. |
| Undesirable scenes | Detects undesirable scenes, such as black screens, black bars, dim footage, Picture-in-Picture (PiP), smoking, or in-car live streaming. |
| Voice content violations | Detects non-compliant spoken content within the audio track, such as advertisements, political content, terrorism-related speech, or abusive language. |
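As a minimal sketch of how a caller might combine per-type results into one action for an asset: this assumes hypothetical short labels for each review type and a `pass`/`review`/`block` suggestion per result, none of which are specified in this topic — consult the service's API reference for the actual field names and values.

```python
from dataclasses import dataclass

# Hypothetical short labels, one per review type in the table above.
# These names are assumptions for illustration, not the real API's labels.
REVIEW_TYPES = {
    "porn": "Pornography",
    "terrorism": "Terrorism and political content",
    "ad": "Text and image violations",
    "logo": "Logos",
    "live": "Undesirable scenes",
    "audio": "Voice content violations",
}

@dataclass
class ReviewResult:
    label: str        # which review type produced this result
    suggestion: str   # assumed values: "pass", "review", or "block"

def triage(results: list[ReviewResult]) -> str:
    """Overall action for an asset: "block" if any type says block,
    "review" if any type asks for human review, otherwise "pass"."""
    suggestions = {r.suggestion for r in results}
    if "block" in suggestions:
        return "block"
    if "review" in suggestions:
        return "review"
    return "pass"
```

The most severe suggestion wins, so a single flagged dimension (for example, a detected logo) is enough to route the asset to human review even if every other dimension passed.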