
Intelligent Media Services: Common scenarios

Last Updated: Dec 12, 2024

This topic describes the parameters for producing videos in intelligent image-text matching mode for common scenarios. This topic also provides advanced configurations and examples of SDK calls.

Note
  • Before you read this topic, we recommend that you read Use the intelligent and quick video production feature to learn the terms and procedure of producing videos in intelligent image-text matching mode for common scenarios.

  • Intelligent image-text matching mode for common scenarios supports two video production modes. This topic describes the parameters in the following video production modes:

    • Global broadcast mode

    • Storyboard script mode

  • The following regions are supported: China (Shanghai), China (Beijing), China (Hangzhou), and China (Shenzhen).

Usage notes

  • For information about how to produce videos by combining multiple video, audio, and image materials in an intelligent and quick manner, see SubmitBatchMediaProducingJob. For information about key parameters for the SubmitBatchMediaProducingJob operation, see the "InputConfig parameters", "EditingConfig parameters", and "OutputConfig parameters" sections of this topic.

  • For information about how to query the details of an intelligent and quick batch video production job, see GetBatchMediaProducingJob.

InputConfig parameters

You can configure InputConfig parameters to specify basic materials, such as video clips, voice-over scripts, background music, and stickers.

• MediaArray (List<String>)
  The editing materials. You can specify the IDs or Object Storage Service (OSS) URLs of the materials that you want to use. The total length of the video materials can be up to 2 hours.
  Required: You must configure at least one of the MediaArray and MediaSearchInput parameters.
  Supported modes: All modes.

• MediaSearchInput (MediaSearchInput)
  Intelligently searches for matching materials based on a specified search library and theme description text.
  Required: You must configure at least one of the MediaArray and MediaSearchInput parameters.
  Supported modes: All modes.

• TitleArray (List<String>)
  An array of titles. A title is randomly selected each time the system produces a video. You can specify at most 50 titles. Each title can be up to 50 characters in length.
  Required: No.
  Supported modes: All modes.

• SpeechTextArray (List<String>)
  An array of voice-over scripts. A voice-over script is randomly selected each time the system produces a video. You can specify at most 50 voice-over scripts. Each voice-over script can be up to 1,000 characters in length.
  Required: No.
  Supported modes: Global broadcast mode.

• SceneInfo (SceneInfo)
  The scenario-related settings.
  Required: Yes.
  Supported modes: Storyboard script mode.

• StickerArray (List<Sticker>)
  An array of stickers. A sticker is randomly selected each time the system produces a video. You can specify at most 50 stickers.
  Required: No.
  Supported modes: All modes.

• BackgroundMusicArray (List<String>)
  An array of background music materials, specified by their IDs or OSS URLs. A background music material is randomly selected each time the system produces a video. You can specify at most 50 background music materials.
  Required: No.
  Supported modes: All modes.

• BackgroundImageArray (List<String>)
  An array of background images, specified by their IDs or OSS URLs. A background image is randomly selected each time the system produces a video. You can specify at most 50 background images.
  Required: No.
  Supported modes: All modes.

MediaSearchInput parameters

• LibSearchCondition (LibSearchCondition)
  The search conditions of the search library.
  Required: Yes.

LibSearchCondition parameters

• SearchLibs (List<String>)
  A list of search libraries. Example: ims-default-search-lib.
  Required: Yes.

• SearchText (String)
  The theme description text, which describes the theme of the matching materials. The text can be up to 20 characters in length. Examples: "Alibaba Cloud Assistant is learning live commerce" and "Ocean, coral reef, seal, dolphin, and marine environment".
  Required: Yes.

Sticker parameters

• MediaId (String)
  The ID of an image, such as a sticker, a logo, or a watermark.
  Required: You must specify at least one of the MediaId and MediaURL parameters. If you specify both parameters, the MediaId parameter takes precedence.

• MediaURL (String)
  The URL of the image. The URL must be an OSS URL.
  Required: You must specify at least one of the MediaId and MediaURL parameters.

• X (Float)
  For more information, see the description of the X parameter in the "VideoTrackClip" section of the Timeline configurations topic.
  Required: No.

• Y (Float)
  For more information, see the description of the Y parameter in the "VideoTrackClip" section of the Timeline configurations topic.
  Required: No.

• Width (Float)
  For more information, see the description of the Width parameter in the "VideoTrackClip" section of the Timeline configurations topic.
  Required: No.

• Height (Float)
  For more information, see the description of the Height parameter in the "VideoTrackClip" section of the Timeline configurations topic.
  Required: No.

• DynamicFrames (Integer)
  The number of frames of an animated image. See the sketch that follows this table.
  Required: No. This parameter is required only if you specify an animated sticker.
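The following is a minimal sketch of a StickerArray entry that uses an animated sticker. The media ID, frame count, and layout values are illustrative only:

"StickerArray": [
  {
    "MediaId": "****your-animated-sticker-id****", // Illustrative ID of an animated image.
    "DynamicFrames": 24,                           // The number of frames of the animated image. Required for animated stickers.
    "X": 10,
    "Y": 100,
    "Width": 300,
    "Height": 300
  }
]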

SceneInfo parameters

Note

This parameter is valid only in storyboard script mode. You do not need to specify this parameter in global broadcast mode.

• Scene (String)
  The type of the matching scenario. For common scenarios, set the value to General.
  Required: Yes.

• ShotInfo (ShotInfo)
  The storyboard script.
  Required: Yes.

ShotInfo parameters

Note

This parameter is valid only in storyboard script mode. You do not need to specify this parameter in global broadcast mode.

• ShotScripts (List<ShotScript>)
  An array of storyboard scripts.
  Required: Yes.

ShotScript parameters

Note

This parameter is valid only in storyboard script mode. You do not need to specify this parameter in global broadcast mode.

• ScriptText (String)
  The script that describes a storyboard. Example: "The old wizard Danny is working on some strange instruments, trying to develop a new magic potion."
  Required: No.

• SpeechText (String)
  The voice-over script for a storyboard. The voice-over script can be up to 100 characters in length.
  Required: No.

• Duration (Float)
  The length of a storyboard. The value of this field takes effect only if no voice-over script is specified for the storyboard. If a voice-over script is specified, the length of the storyboard matches the length of the voice-over. See the sketch that follows this table.
  Required: No.
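The following is a minimal sketch of a ShotScripts array that shows how the SpeechText and Duration fields interact: the first storyboard is driven by its voice-over, and the second storyboard uses a fixed length because no voice-over script is specified. The script texts and values are illustrative only:

"ShotScripts": [
  {
    "ScriptText": "The old wizard Danny is working on some strange instruments.",
    "SpeechText": "Danny is trying to develop a new magic potion." // The storyboard length follows the length of the voice-over.
  },
  {
    "ScriptText": "A close-up of the bubbling potion.",
    "Duration": 4.0 // No voice-over script is specified, so this fixed length takes effect.
  }
]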

Sample code in global broadcast mode

{
  // Choose between the MediaArray and MediaSearchInput parameters.
  "MediaArray": [
    "****9d46c886b45481030f6e****",
    "****c886810b4549d4630f6e****",
    "http://test-bucket.oss-cn-shanghai.aliyuncs.com/test1.mp4",
    "http://test-bucket.oss-cn-shanghai.aliyuncs.com/test2.png"
  ],
  // Choose between the MediaArray and MediaSearchInput parameters.
  "MediaSearchInput": {
        "LibSearchCondition": {
            "SearchLibs": [
                "ims-default-search-lib",
                "test-20"
            ],
            "SearchText": "Alibaba Cloud Assistant is learning live commerce"
      }
  },
  "TitleArray": [
    "Freshippo Opens a Store in Huilongguan",
    "Freshippo Opens a Store"
  ],
  "SpeechTextArray": [
    "Freshippo opens a store near the shopping mall. Today is the first day of opening. Come and check it out. The store is not large but the prices of snacks and drinks are low, which attract many customers waiting in lines.",
    "Freshippo opens a store near the shopping mall. Today is the first day of opening. Come and check it out."
  ],
  "Sticker": {
    "MediaId": "****b681034549d46c880f6e****",
    "X": 10,
    "Y": 100,
    "Width": 300,
    "Height": 300
  },
  "StickerArray": [
    {
      "MediaId": "****9d46c8b4548681030f6e****",
      "X": 10,
      "Y": 100,
      "Width": 300,
      "Height": 300
    },
    {
      "MediaURL": "http://test-bucket.oss-cn-shanghai.aliyuncs.com/test3.png",
      "X": 10,
      "Y": 100,
      "Width": 300,
      "Height": 300
    }
  ],
  "BackgroundMusicArray": [
    "****b4549d46c88681030f6e****",
    "****549d46c88b4681030f6e****",
    "http://test-bucket.oss-cn-shanghai.aliyuncs.com/test4.mp3"
  ],
  "BackgroundImageArray": [
    "****6c886b4549d481030f6e****",
    "****9d46c8548b4681030f6e****",
    "http://test-bucket.oss-cn-shanghai.aliyuncs.com/test1.png"
  ]
}

Sample code in storyboard script mode

{
  // Choose between the MediaArray and MediaSearchInput parameters.
  "MediaArray": ["MediaId1", "MediaId2"],
  // Choose between the MediaArray and MediaSearchInput parameters.
  "MediaSearchInput": {
        "LibSearchCondition": {
            "SearchLibs": [
                "ims-default-search-lib",
                "test-20"
            ],
            "SearchText": "Alibaba Cloud Assistant is learning live commerce"
      }
  },
  "SceneInfo": {
    "Scene": "General", // General matching mode. 
    "ShotInfo": {
      "ShotScripts": [
        {
          "ScriptText": "The script for the first storyboard",
          "SpeechText": "The voice-over script for the first storyboard",
          "Duration": 5.0 // The value of this field takes effect only if no voice-over script is specified.
        },
        {
          "ScriptText": "The script for the second storyboard",
          "SpeechText": "The voice-over script for the second storyboard",
          "Duration": 8.0 // The value of this field takes effect only if no voice-over script is specified.
        }
      ]
    }
  },
   "TitleArray": [
    "Freshippo Opens a Store in Huilongguan",
    "Freshippo Opens a Store"
  ],
  "StickerArray": [
    {
      "MediaId": "****9d46c8b4548681030f6e****",
      "X": 10,
      "Y": 100,
      "Width": 300,
      "Height": 300
    },
    {
      "MediaURL": "http://test-bucket.oss-cn-shanghai.aliyuncs.com/test3.png",
      "X": 10,
      "Y": 100,
      "Width": 300,
      "Height": 300
    }
  ],
  "BackgroundMusicArray": [
    "****b4549d46c88681030f6e****",
    "****549d46c88b4681030f6e****",
    "http://test-bucket.oss-cn-shanghai.aliyuncs.com/test4.mp3"
  ],
  "BackgroundImageArray": [
    "****6c886b4549d481030f6e****",
    "****9d46c8548b4681030f6e****",
    "http://test-bucket.oss-cn-shanghai.aliyuncs.com/test1.png"
  ]
}

EditingConfig parameters

Note
  • You can configure EditingConfig parameters to specify the volume, position, and other production settings of the output videos. If you have no special requirements for a parameter, we recommend that you leave the parameter empty so that its default value is used.

  • In common scenarios, the EditingConfig parameters in global broadcast mode are the same as those in storyboard script mode.

• MediaConfig (JSON)
  The configurations of the input video materials.
  Supported fields:
  • Volume: the volume of the input video materials. Valid values: [0, 10.0]. You can specify a decimal. Example: 0.5.
  • MediaMetaDataArray: an array of media asset metadata. Each MediaMetaData value contains the following fields (see the sketch that follows this entry):
    • Media: the ID or OSS URL of the material. The value of this field must be the same as the corresponding value in the InputConfig parameter.
    • TimeRangeList: the time ranges of clips that are cut from the material. You can specify multiple time ranges to cut multiple clips from one material. Each TimeRange value contains the following fields:
      • In: the start time.
      • Out: the end time.
  Required: No.
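The default EditingConfig sample later in this topic does not include MediaMetaDataArray. The following is a minimal sketch of a MediaConfig value that cuts two clips from one material. The material URL and time values are illustrative only, and In and Out are assumed to be in seconds:

"MediaConfig": {
  "Volume": 0.5,
  "MediaMetaDataArray": [
    {
      "Media": "http://test-bucket.oss-cn-shanghai.aliyuncs.com/test1.mp4", // Must match a material specified in the InputConfig parameter.
      "TimeRangeList": [
        { "In": 0, "Out": 10 },  // The first clip cut from the material.
        { "In": 30, "Out": 45 }  // The second clip cut from the material.
      ]
    }
  ]
}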

• TitleConfig (JSON)
  The configurations of titles. You can configure subtitle parameters. For more information, see the "Banner text" section of the Effect configurations topic.
  Required: No.

• SpeechConfig (JSON)
  The configurations of voice-overs.
  Supported fields:
  • Volume: the volume of voice-overs. Valid values: [0, 10.0]. You can specify a decimal. Example: 0.5.
  • AsrConfig: the subtitle parameters for the voice-over. For more information, see the "Banner text" section of the Effect configurations topic.
  • Voice: the voice for voice-overs. You can specify one or more voices. If you specify multiple voices, the system randomly selects one. Example: "zhimiao_emo,zhilun".
  • SpeechRate: the tempo of voice-overs. Valid values: -500 to 500. Default value: 0. A value of -500 indicates 0.5 times the original tempo, a value of 0 indicates the original tempo, and a value of 500 indicates 2 times the original tempo.
    Note: To convert a tempo multiple to a SpeechRate value, calculate (1 - 1/multiple)/0.002 for multiples less than 1, or (1 - 1/multiple)/0.001 for multiples greater than 1, and round the result to the nearest integer. Examples: 0.8x tempo: (1 - 1/0.8)/0.002 = -125; 1.2x tempo: (1 - 1/1.2)/0.001 ≈ 166. A sketch of this conversion follows this entry.
  • Style: the style of voice-overs. By default, this field is left empty. If you specify both the Voice and Style fields, the Voice field takes precedence. Valid values: "Gentle", "Serious", "Entertainment".
  Required: No.
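As a sketch of the conversion described in the preceding note, the following SpeechConfig sets a 0.8x voice-over tempo. The voice is taken from the example above, and the other values are illustrative only:

"SpeechConfig": {
  "Volume": 1,
  "Voice": "zhimiao_emo",
  "SpeechRate": -125 // 0.8x tempo: (1 - 1/0.8)/0.002 = -125. Use the 0.001 coefficient for tempos greater than 1x.
}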

• BackgroundMusicConfig (JSON)
  The configurations of the background music.
  Supported fields:
  • Volume: the volume of the background music. Valid values: [0, 10.0]. You can specify a decimal. Example: 0.5.
  • Style: the style of the background music. By default, this field is left empty. This field does not take effect if background music is specified in the InputConfig parameter. Valid values: "bgm-beauty", "bgm-chinese-style", "bgm-cuisine", "bgm-dynamic", "bgm-quirky", "bgm-relaxing", "bgm-romantic", "bgm-upbeat".
  Required: No.

• BackgroundImageConfig (JSON)
  The configurations of background images. This parameter does not take effect if background images are specified in the InputConfig parameter.
  Supported fields:
  • SubType: the type of the background. Valid values: "Color" (solid color background) and "Blur" (blurred background). See the sketch that follows this entry.
  • Radius: the blur radius. This field takes effect only if the SubType field is set to Blur. Valid values: [0.01, 1].
  • Color: the background color, as an RGB value in hexadecimal format. Example: #000000. This field takes effect only if the SubType field is set to Color.
  Required: No.
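The following minimal sketches show the two SubType variants. The radius and color values are illustrative only:

// Blurred background:
"BackgroundImageConfig": {
  "SubType": "Blur",
  "Radius": 0.5
}

// Solid color background:
"BackgroundImageConfig": {
  "SubType": "Color",
  "Color": "#000000"
}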

• ProcessConfig (JSON)
  The video editing settings.
  Supported fields:
  • SingleShotDuration: the length, in seconds, of each clip that the system automatically segments from a long video material. Choose between the SingleShotDuration and EnableClipSplit fields.
  • EnableClipSplit: specifies whether to perform AI-powered video segmentation. Default value: false. If this field is set to true, the SingleShotDuration field does not take effect.
  • AllowVfxEffect: specifies whether special effects can be used. Default value: false.
  • VfxEffectProbability: the probability that special effects are applied to a clip. Valid values: 0.0 to 1.0. Default value: 0.5. You can specify a value that is accurate to two decimal places.
  • AllowTransition: specifies whether transitions can be used. Default value: false.
  • TransitionList: the custom transitions. If AllowTransition is set to true, the system randomly selects one of the specified transitions. Example: ["directional", "linearblur"]. See the sketch that follows this entry.
  • UseUniformTransition: specifies whether to use the same transition throughout the output video. Default value: true.
  • AllowDuplicateMatch: specifies whether a clip can be repeated. Default value: false.
  Required: No.
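The default EditingConfig sample later in this topic keeps special effects and transitions disabled. The following is a minimal sketch of a ProcessConfig that enables them; the probability and transition names are taken from the field descriptions above:

"ProcessConfig": {
  "EnableClipSplit": true,             // AI-powered segmentation. The SingleShotDuration field does not take effect.
  "AllowVfxEffect": true,
  "VfxEffectProbability": 0.5,         // The probability that a special effect is applied to a clip.
  "AllowTransition": true,
  "TransitionList": ["directional", "linearblur"], // The system randomly selects one of these transitions.
  "UseUniformTransition": true,        // Use the same transition throughout the output video.
  "AllowDuplicateMatch": false
}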

• ProduceConfig (JSON)
  The configurations of video editing and production. For more information, see the "EditingProduceConfig" section of the Editing and production parameters topic.
  Required: No.

EditingConfig sample code

All fields of the EditingConfig parameter are optional. The following sample code shows the default configurations:

{
  "MediaConfig": {
    "Volume": 0 // By default, video materials are muted.
  },
  "TitleConfig": {
    "Alignment": "TopCenter",
    "AdaptMode": "AutoWrap",
    "Font": "Alibaba PuHuiTi 2.0 95 ExtraBold",
    "SizeRequestType": "Nominal",
    "Y": 0.1 // The Y coordinate of the title: 0.1 for portrait output, 0.05 for landscape output, and 0.08 for square output.
  },
  "SpeechConfig": {
    "Volume": 1, // By default, the original volume of the voice-over is used.
    "SpeechRate": 0,
    "Voice": null,
    "Style": null,
    "AsrConfig": {
      "Alignment": "TopCenter",
      "AdaptMode": "AutoWrap",
      "Font": "Alibaba PuHuiTi 2.0 65 Medium",
      "SizeRequestType": "Nominal",
      "Spacing": -1,
      "Y": 0.8 // The Y coordinate of the subtitle: 0.8 for portrait output, 0.9 for landscape output, and 0.85 for square output.
    }
  },
  "BackgroundMusicConfig": {
    "Volume": 0.2, // By default, the volume of the background music is set to 20%.
    "Style": null
  },
  "ProcessConfig": {
    "SingleShotDuration": 3, // The length of each clip after segmentation, in seconds. Choose between the SingleShotDuration and EnableClipSplit fields.
    "EnableClipSplit": false, // Specifies whether to perform AI-powered video segmentation. If set to true, the SingleShotDuration field does not take effect.
    "AllowVfxEffect": false, // Specifies whether special effects can be used.
    "AllowTransition": false, // Specifies whether transitions can be used.
    "AllowDuplicateMatch": false // Specifies whether a clip can be repeated in intelligent image-text matching mode.
  }
}

OutputConfig parameters

Note
  • You can configure OutputConfig parameters to specify the URL, naming rule, width, height, and number of the output videos.

  • In common scenarios, the OutputConfig parameters in global broadcast mode are the same as those in storyboard script mode.

• MediaURL (String)
  The URL of the output videos. The URL must contain the {index} placeholder. Example: http://xxx.oss-cn-shanghai.aliyuncs.com/xxx_{index}.mp4
  Required: Yes, if you store the output videos in OSS.

• StorageLocation (String)
  The storage location of the output videos in ApsaraVideo VOD (VOD). Example: outin-xxxxxx.oss-cn-shanghai.aliyuncs.com
  Required: Yes, if you store the output videos in VOD.

• FileName (String)
  The name of the output videos. The name must contain the {index} placeholder. Example: xxx_{index}.mp4
  Required: Yes, if you store the output videos in VOD.

• GeneratePreviewOnly (Boolean)
  If you set this parameter to true, the job generates a timeline only for preview and no video is produced. In this case, you do not need to specify the URL of the output videos. After the job is complete, you can call the GetBatchMediaProducingJob operation to query the result of the job. The returned task list contains the ID of the editing project (ProjectId). You can then call the GetEditingProject operation to obtain the timeline for preview.
  Required: No. Default value: false.

• Count (Integer)
  The number of videos to produce. A maximum of 100 videos can be produced in both global broadcast mode and storyboard script mode.
  Required: No. Default value: 1.

• MaxDuration (Float)
  The maximum length of each output video, in seconds. If a voice-over script is specified, the length of the text-to-speech (TTS) output takes precedence and this parameter does not take effect. If no voice-over script is specified, this parameter takes effect. Default value: 15.
  Required: No.

• FixedDuration (Float)
  The fixed length of each output video, in seconds. If you specify a fixed length, all produced videos use that length.
  Note:
  • This parameter is not supported in storyboard script mode.
  • In global broadcast mode, you can specify this parameter when the SpeechTextArray parameter is left empty.
  • You can specify only one of the FixedDuration and MaxDuration parameters.
  Required: No.

• Width (Integer)
  The width of the output videos, in pixels.
  Required: Yes.

• Height (Integer)
  The height of the output videos, in pixels.
  Required: Yes.

• Video (JSONObject)
  The settings of the output video streams, such as Crf and Codec.
  Required: No.

Sample code

{
  "MediaURL": "http://xxx.oss-cn-shanghai.aliyuncs.com/xxx_{index}.mp4",
  "Count": 1,
  "MaxDuration": 15,
  "Width": 1080,
  "Height": 1920,
  "Video": {"Crf": 27},
  "GeneratePreviewOnly": false
}
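The preceding sample stores the output videos in OSS. The following is a minimal sketch of an OutputConfig that stores the output videos in VOD instead, based on the StorageLocation and FileName parameters described above. The storage location and file name are illustrative only:

{
  "StorageLocation": "outin-xxxxxx.oss-cn-shanghai.aliyuncs.com", // The VOD storage location.
  "FileName": "smart_mix_{index}.mp4",                            // The name must contain the {index} placeholder.
  "Count": 2,
  "MaxDuration": 15,
  "Width": 1080,
  "Height": 1920
}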

Examples of SDK calls

Prerequisites

The Intelligent Media Services (IMS) server-side SDK is installed. For more information, see Preparations.

Sample code

The following sample code provides an example of producing videos in global broadcast mode.


package com.example;

import com.alibaba.fastjson.JSONObject;
import com.aliyun.ice20201109.Client;
import com.aliyun.ice20201109.models.*;
import com.aliyun.teaopenapi.models.Config;

import java.util.*;

/**
 *  Add the following Maven dependencies:
 *   <dependency>
 *      <groupId>com.aliyun</groupId>
 *      <artifactId>ice20201109</artifactId>
 *      <version>2.3.0</version>
 *  </dependency>
 *  <dependency>
 *      <groupId>com.alibaba</groupId>
 *      <artifactId>fastjson</artifactId>
 *      <version>1.2.9</version>
 *  </dependency>
 */

public class SmartMixBatchEditingService {

    static final String regionId = "<service-region>"; // Regions that support intelligent image-text matching: cn-shanghai, cn-beijing, cn-hangzhou, and cn-shenzhen.
    static final String bucket = "<your-bucket>";
    private Client iceClient;

    public static void main(String[] args) throws Exception {
        SmartMixBatchEditingService smartMixBatchEditingService = new SmartMixBatchEditingService();
        smartMixBatchEditingService.initClient();
        smartMixBatchEditingService.runExample();
    }

    public void initClient() throws Exception {
        // The AccessKey pair of an Alibaba Cloud account has access permissions on all API operations. We recommend that you use the AccessKey pair of a Resource Access Management (RAM) user to call API operations or perform routine O&M. 
        // In this example, the AccessKey ID and AccessKey secret are obtained from the environment variables. For information about how to configure environment variables to store the AccessKey ID and AccessKey secret, visit https://www.alibabacloud.com/help/en/sdk/developer-reference/v2-manage-access-credentials.
        com.aliyun.credentials.Client credentialClient = new com.aliyun.credentials.Client();

        Config config = new Config();
        config.setCredential(credentialClient);

        // To hard-code your AccessKey ID and AccessKey secret, use the following lines. However, for security concerns, we recommend that you do not hard-code your AccessKey ID and AccessKey secret. 
        // config.accessKeyId = <AccessKey ID>;
        // config.accessKeySecret = <AccessKey secret>;
        config.endpoint = "ice." + regionId + ".aliyuncs.com";
        config.regionId = regionId;
        iceClient = new Client(config);
    }

    public void runExample() throws Exception {

        // The video materials.
        List<String> mediaArray = Arrays.asList(
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-1.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-2.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-3.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-4.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-5.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-6.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-7.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-8.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-9.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-10.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-11.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-12.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-13.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-14.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-15.mp4",
            "http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-16.mp4"
        );

        // The voice-over scripts.
        String speechText = "A variety of creatures live in the vast sea. The clear blue water is home to colorful coral reefs, which are the cornerstone of marine biodiversity and provide shelter for countless small fish, shellfish, and seaweed. Lovely sea lions enjoy the warm sunlight on rocks, living in harmony with nature. Dolphins are agile and smart. They swim freely in the sea, chasing and playing with each other. However, this beautiful marine environment faces unprecedented challenges. Marine litter and pollution pose serious risks to the ecosystem. Many species are experiencing food shortages, a high incidence of diseases, and even population loss. The sea is beautiful and fragile. We must take action to protect the sea and restore its purity and vitality.";

        // The video title.
        String title = "Protect the Sea";

        JSONObject inputConfig = new JSONObject();
        inputConfig.put("MediaArray", mediaArray);
        inputConfig.put("SpeechText", speechText);
        inputConfig.put("Title", title);

        // The number of videos to be produced.
        int produceCount = 4;

        // The width and height of the output videos in portrait mode.
        int outputWidth = 1080;
        int outputHeight = 1920;

        //// The width and height of the output videos in landscape mode.
        //int outputWidth = 1920;
        //int outputHeight = 1080;

        // The OSS URL of the output videos. The URL must contain the {index} placeholder.
        String mediaUrl = "http://" + bucket + ".oss-" + regionId + ".aliyuncs.com/smart_mix/output_{index}.mp4";

        JSONObject outputConfig = new JSONObject();
        outputConfig.put("MediaURL", mediaUrl);
        outputConfig.put("Count", produceCount);
        outputConfig.put("Width", outputWidth);
        outputConfig.put("Height", outputHeight);

        // Submit the quick video production job.
        SubmitBatchMediaProducingJobRequest request = new SubmitBatchMediaProducingJobRequest();
        request.setInputConfig(inputConfig.toJSONString());
        request.setOutputConfig(outputConfig.toJSONString());

        SubmitBatchMediaProducingJobResponse response = iceClient.submitBatchMediaProducingJob(request);
        String jobId = response.getBody().getJobId();
        System.out.println("Start smart mix batch job, batchJobId: " + jobId);

        // Poll the job status until the batch job is complete.
        System.out.println("Waiting job finished...");
        int maxTry = 3000;
        int i = 0;
        while (i < maxTry) {
            Thread.sleep(3000);
            i++;
            GetBatchMediaProducingJobRequest getRequest = new GetBatchMediaProducingJobRequest();
            getRequest.setJobId(jobId);
            GetBatchMediaProducingJobResponse getResponse = iceClient.getBatchMediaProducingJob(getRequest);
            String status = getResponse.getBody().getEditingBatchJob().getStatus();
            System.out.println("BatchJobId: " + jobId + ", status:" + status);

            if ("Failed".equals(status)) {
                System.out.println("Batch job failed. JobInfo: " + JSONObject.toJSONString(getResponse.getBody().getEditingBatchJob()));
                throw new Exception("Produce failed. BatchJobId: " + jobId);
            }

            if ("Finished".equals(status)) {
                System.out.println("Batch job finished. JobInfo: " + JSONObject.toJSONString(getResponse.getBody().getEditingBatchJob()));
                break;
            }
        }
    }
}

API request parameters

InputConfig

{
	"MediaArray": [
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-1.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-2.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-3.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-4.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-5.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-6.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-7.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-8.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-9.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-10.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-11.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-12.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-13.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-14.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-15.mp4",
		"http://ice-document-materials.oss-cn-shanghai.aliyuncs.com/test_media/sea/sea-16.mp4"
	],
	"SpeechText": "A variety of creatures live in the vast sea. The clear blue water is home to colorful coral reefs, which are the cornerstone of marine biodiversity and provide shelter for countless small fish, shellfish, and seaweed. Lovely sea lions enjoy the warm sunlight on rocks, living in harmony with nature. Dolphins are agile and smart. They swim freely in the sea, chasing and playing with each other. However, this beautiful marine environment faces unprecedented challenges. Marine litter and pollution pose serious risks to the ecosystem. Many species are experiencing food shortages, a high incidence of diseases, and even population loss. The sea is beautiful and fragile. We must take action to protect the sea and restore its purity and vitality.",
	"Title":"Protect the Sea",
}

OutputConfig

{
  "Count": 4,
  "Height": 1080,
  "Width": 1920,
  "MediaURL": "http://<your-bucket>.oss-<region-id>.aliyuncs.com/script/output_{index}_w.mp4"
}

Sample output videos

  • Portrait mode

  • Landscape mode

Editing logic and advanced configurations

Editing logic

Global broadcast mode:

  • If you specify a search library and theme description text, the system uses the text as the search condition to search for matching video clips in the library and uses the matching clips as input video materials.

  • If an input video is long, the system segments it into clips. During video editing, the system selects from these clips and merges them into a new video. The default length of each clip after segmentation is 3 seconds. You can configure the SingleShotDuration parameter to change the clip length. For more information, see the "EditingConfig parameters" section of this topic.

  • If no voice-over script is specified, the system selects random clips and merges them into a video of about 15 seconds.

  • If a voice-over script is specified, the system aligns the voice-over script with the clips in intelligent image-text matching mode and merges the clips into multiple videos at a time.

Storyboard script mode:

  • If you specify a search library and theme description text, the system uses the text as the search condition to search for matching video clips in the library and uses the matching clips as input video materials.

  • In storyboard script mode, you do not need to specify the SpeechTextArray or SpeechText field. You can use the SceneInfo, ShotInfo, and ShotScripts fields to specify the content, length, and voice-over in each storyboard of the output videos.

  • In a single storyboard, clips are cut and matched based on the script. If no script is specified but a voice-over script is specified, the voice-over script is used to match clips.

  • The length of a storyboard is aligned with the length of the corresponding voice-over or the custom length.

Advanced configurations

For more information about advanced configurations, see Editing logic and advanced configurations for batch video production.
