If you have an urgent control requirement that must be met immediately and cannot wait for the next model update, Content Security provides a custom image library function to address it. A custom image library contains a blacklist and a whitelist. In subsequent returned results, the suggestion for samples in the blacklist, and for images similar to them, is “block”; the suggestion for samples in the whitelist, and for images similar to them, is “pass”. To use the custom image library function:
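The blacklist/whitelist semantics above can be sketched as a small filter over the returned results. The result structure used here (a list of dicts with a `suggestion` field) is a simplified assumption for illustration, not the exact API response schema:

```python
# Sketch: partition moderation results by the "suggestion" field.
# The result structure is an illustrative assumption; consult the actual
# API response schema for the real field names.

def partition_by_suggestion(results):
    """Split results into blocked, passed, and other (e.g. review) buckets."""
    blocked = [r for r in results if r.get("suggestion") == "block"]
    passed = [r for r in results if r.get("suggestion") == "pass"]
    other = [r for r in results if r.get("suggestion") not in ("block", "pass")]
    return blocked, passed, other

# A sample matching the blacklist comes back as "block",
# one matching the whitelist as "pass".
sample_results = [
    {"dataId": "img-1", "suggestion": "block"},   # hit the blacklist
    {"dataId": "img-2", "suggestion": "pass"},    # hit the whitelist
    {"dataId": "img-3", "suggestion": "review"},  # matched neither list
]
blocked, passed, other = partition_by_suggestion(sample_results)
```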
Go to the Content Moderation Settings page and find the custom image library function.
Click to create an image library. Name the image library according to your business, and set “Image Library Type” to blacklist or whitelist. When the API is called, you must select a value for the “scenes” parameter; each value corresponds to an applicable scenario. Currently, one value is available: “Porn”. For example, if the image library is used to identify pornographic elements, set “scenes” to “Porn”; when you subsequently call the pornography identification service for images or videos, the blacklist and whitelist configured for this scenario are enabled by default.
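To make the role of the “scenes” parameter concrete, the sketch below assembles a request body that enables the pornography-identification scenario. Only “scenes” comes from the text above; the other field names (`tasks`, `dataId`, `url`) and the lowercase casing of the scene value are illustrative assumptions — check the API reference for the exact request schema:

```python
import json

# Sketch: build the JSON body of an image-moderation request with the
# "scenes" parameter enabling the pornography-identification scenario.
# Field names other than "scenes" are illustrative assumptions.

def build_scan_request(image_urls, scenes=("porn",)):
    """Assemble a request body that enables the given moderation scenes."""
    return {
        "scenes": list(scenes),  # enables the blacklist/whitelist for this scenario
        "tasks": [
            {"dataId": f"task-{i}", "url": url}
            for i, url in enumerate(image_urls)
        ],
    }

body = build_scan_request(["https://example.com/a.png"])
payload = json.dumps(body)  # serialized request body
```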
Manage the image library. In the image library list, click “Manage Image Library” to upload images to or delete images from the image library. Supported formats are PNG, JPG, JPEG, and BMP, and each image must be smaller than 5 MB. Images can be uploaded or deleted in batches, with a maximum of 20 images per upload.
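The upload constraints above can be enforced with a simple client-side pre-check before sending a batch. This is a local validation sketch for illustration, not part of the console itself:

```python
import os

# Sketch: client-side checks mirroring the documented upload constraints
# (PNG/JPG/JPEG/BMP, each image smaller than 5 MB, at most 20 per batch).

ALLOWED_EXTENSIONS = {".png", ".jpg", ".jpeg", ".bmp"}
MAX_SIZE_BYTES = 5 * 1024 * 1024   # each image must be smaller than 5 MB
MAX_BATCH = 20                     # at most 20 images per upload batch

def validate_batch(files):
    """Return a list of human-readable problems; an empty list means the batch is OK.

    `files` is a list of (filename, size_in_bytes) pairs.
    """
    problems = []
    if len(files) > MAX_BATCH:
        problems.append(f"batch has {len(files)} images; the limit is {MAX_BATCH}")
    for name, size in files:
        ext = os.path.splitext(name)[1].lower()
        if ext not in ALLOWED_EXTENSIONS:
            problems.append(f"{name}: unsupported format {ext or '(none)'}")
        if size >= MAX_SIZE_BYTES:
            problems.append(f"{name}: {size} bytes, must be smaller than 5 MB")
    return problems
```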
Delete or rename the image library. You can delete or rename the image library in the image library list.