
ApsaraVideo Media Processing:Parameter details

Last Updated:Sep 06, 2024

This topic describes the details of the parameters that are used in the ApsaraVideo Media Processing (MPS) API, including parameter types, descriptions, and valid values. You can configure these API parameters to use the features of MPS, including transcoding, MPS queues, and workflows.

Input

This parameter is referenced by the SubmitJobs operation.

Parameter

Type

Required

Description

Bucket

String

Yes

The Object Storage Service (OSS) bucket that stores the input file.

For more information about the term bucket, see Terms.

Location

String

Yes

The region in which the OSS bucket that stores the input file resides.

  • The OSS bucket must reside in the same region as MPS.

  • For more information about the term region, see Terms.

Object

String

Yes

The OSS path of the input file. The OSS path is a full path that includes the name of the input file.

  • For more information about the term ObjectKey, see Terms.

  • The path of an OSS object must be URL-encoded in UTF-8 before you use the path in MPS. For more information, see URL encoding.

  • For example, the path 阿里云/mts HD+.mp4, which contains Chinese characters, a space, and a plus sign, is encoded as %E9%98%BF%E9%87%8C%E4%BA%91/mts%20HD%2B.mp4, as shown in the sketch below.
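
The following Python sketch shows one way to produce this encoding with the standard library before building the Input parameter; the bucket and region values are illustrative.

import json
from urllib.parse import quote

# Illustrative values; replace the bucket, region, and object path with your own.
raw_object = "阿里云/mts HD+.mp4"

# quote() keeps "/" unescaped by default and percent-encodes the UTF-8 bytes of
# characters that are not URL-safe, which yields
# "%E9%98%BF%E9%87%8C%E4%BA%91/mts%20HD%2B.mp4".
input_config = {
    "Bucket": "example-bucket",
    "Location": "oss-cn-hangzhou",
    "Object": quote(raw_object),
}

print(json.dumps(input_config))  # JSON string used as the Input parameter.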

Referer

String

No

The configuration of hotlink protection. If you enable the hotlink protection feature for the OSS bucket to allow only specified referers in the whitelist to download files, you must specify this parameter. If you do not enable the hotlink protection feature for the OSS bucket, you do not need to specify this parameter. For more information, see Hotlink protection.

  • If you use a workflow for transcoding, you must specify this parameter in the MPS console. For more information, see the "Step 3: (Optional) Configure hotlink protection in MPS" section of the Add media buckets topic.

  • If you call an API operation to submit a transcoding job, you must specify this parameter in the request.

Output

This parameter is referenced by the SubmitJobs, AddMediaWorkflow, and UpdateMediaWorkflow operations.

Parameter

Type

Required

Description

OutputObject

String

Yes

The OSS path of the output file. The OSS path is a full path that includes the name of the output file.

  • For more information about the term ObjectKey, see Terms.

  • Placeholders are supported. For more information, see the Placeholder replacement rules section of this topic.

  • Rules for specifying the file name extension:

    • Workflow: You do not need to specify the file name extension. MPS automatically appends the file name extension to the value of the OutputObject parameter based on the container format of the transcoding template.

    • Transcoding job: You must specify the file name extension, and it must match the container format of the transcoding template. If the container format is M3U8, MPS automatically adds the .m3u8 file name extension to the playlist. The names of the media segment files are generated by appending a five-digit serial number to the playlist name with a hyphen (-). The serial number starts from 00001, and the file name extension of a media segment file is .ts. For example, if the playlist is named filename.m3u8, the first media segment file is named filename-00001.ts.

  • The path of an OSS object must be URL-encoded in UTF-8 before you use the path in MPS. For more information, see URL encoding.

  • For example, the path of the input file is a/b/example.flv, and you want to set the path of the output file to a/b/c/example+test.mp4. In this case, you can use placeholders to specify the path of the output file in the following format: {ObjectPrefix}/c/{FileName}+test.mp4. After URL encoding, the path is displayed as %7BObjectPrefix%7D/c/%7BFileName%7D%2Btest.mp4.

TemplateId

String

Yes

The ID of the transcoding template.

Container

Object

No

The container format. For more information, see the Container section of this topic.

  • If you specify this parameter, the corresponding parameter in the specified transcoding template is overwritten.

Video

Object

No

The parameter related to video transcoding. For more information, see the Video section of this topic.

  • If you specify this parameter, the corresponding parameter in the specified transcoding template is overwritten.

Audio

Object

No

The parameter related to audio transcoding. For more information, see the Audio section of this topic.

  • If you specify this parameter, the corresponding parameter in the specified transcoding template is overwritten.

TransConfig

Object

No

The parameter related to the transcoding process. For more information, see the TransConfig section of this topic.

  • If you specify this parameter, the corresponding parameter in the specified transcoding template is overwritten.

  • Example: {"TransMode":"onepass","AdjDarMethod":"none","IsCheckVideoBitrateFail":"true","IsCheckAudioBitrateFail":"true"}.

VideoStreamMap

String

No

The identifier of the video stream to be retained in the input file. Valid values:

  • Not specified: A default video stream is selected.

  • 0:v:{Serial number}: A specific video stream is selected. The serial number specifies the subscript of the video stream. The serial number starts from 0. For example, 0:v:1 specifies that the second video stream is selected for transcoding.

  • 0:v: All video streams are selected.

AudioStreamMap

String

No

The identifier of the audio stream to be retained in the input file. Valid values:

  • Not specified: A default audio stream is selected. In most cases, a Chinese, multi-channel, and high-quality audio stream is preferred.

  • 0:a:{Serial number}: A specific audio stream is selected. The serial number specifies the subscript of the audio stream. The serial number starts from 0. For example, 0:a:1 specifies that the second audio stream is selected for transcoding.

  • 0:a: All audio streams are selected. This value is applicable to a multi-language dubbing scenario.

Rotate

String

No

The rotation angle of the video in the clockwise direction.

  • Valid values: 0, 90, 180, and 270.

  • Default value: 0, which specifies that the video is not rotated.

WaterMarks

Object[]

No

The watermarks. Watermarks are images or text added to video images. If you specify this parameter, the corresponding parameter in the specified watermark template is overwritten. For more information, see the WaterMarks section of this topic.

  • You can add up to four watermarks to a transcoding job.

  • Example of a single image watermark: [{"WaterMarkTemplateId":"88c6ca184c0e47098a5b665e2a12****","InputFile":{"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"example-logo.png"},"Timeline":{"Start":"0","Duration":"ToEND"}}].

  • Example of a single text watermark: [{"Type":"Text","TextWaterMark":{"Content":"5rWL6K+V5paH5a2X5rC05Y2w","FontName":"SimSun","FontSize":"16","Top":2,"Left":10}}]. A Python sketch that assembles these structures follows this list.
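
The following Python sketch is a minimal way to build a WaterMarks array that mirrors the two examples above; the template ID, bucket, and object names are illustrative, and the Base64 step produces the Content value shown above.

import base64
import json

# Base64-encode the watermark text in UTF-8. "测试文字水印" encodes to
# "5rWL6K+V5paH5a2X5rC05Y2w", the Content value used in the example above.
text_content = base64.b64encode("测试文字水印".encode("utf-8")).decode("ascii")

water_marks = [
    {   # Image watermark: illustrative watermark template and OSS location.
        "WaterMarkTemplateId": "88c6ca184c0e47098a5b665e2a12****",
        "InputFile": {
            "Bucket": "example-bucket",
            "Location": "oss-cn-hangzhou",
            "Object": "example-logo.png",
        },
        "Timeline": {"Start": "0", "Duration": "ToEND"},
    },
    {   # Text watermark: font settings and offsets as in the example above.
        "Type": "Text",
        "TextWaterMark": {
            "Content": text_content,
            "FontName": "SimSun",
            "FontSize": "16",
            "Top": 2,
            "Left": 10,
        },
    },
]

print(json.dumps(water_marks))  # JSON string used as Output.WaterMarks.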

DeWatermark

Object

No

The blur operation. For more information, see the DeWatermark section of this topic.

  • Example: {"0": [{"l":10,"t":10,"w":10,"h":10},{"l":100,"t":0.1,"w":10,"h":10}],"128000": [],"250000": [{"l":0.2,"t":0.1,"w":0.01,"h":0.05}]}.

SubtitleConfig

Object

No

The configurations of the hard subtitle. This parameter allows you to add external subtitle files to the video. For more information, see the SubtitleConfig section of this topic.

  • You can add up to four subtitle files to a transcoding job.

  • Example: {"ExtSubtitleList":[{"Input":{"Bucket":"example-bucket-****","Location":"oss-cn-hangzhou","Object":"example.srt"},"CharEnc":"UTF-8"}]}.

Clip

Object

No

The clip. For more information, see the Clip section of this topic.

  • Example: {"TimeSpan":{"Seek":"00:01:59.999","End":"18000.30"},"ConfigToClipFirstPart":false}, which specifies that the clip starts at 1 minute, 59 seconds, and 999 milliseconds and ends at the point in time that is 5 hours and 30 milliseconds before the end of the video. The clip is cropped from the video that is formed by merging multiple input files.

MergeList

Object[]

No

The merge list. You can merge multiple input files and clips in sequence to generate a new video. For more information, see the MergeList section of this topic.

  • You can specify only one of the MergeList and MergeConfigUrl parameters. The priority of the MergeConfigUrl parameter is higher.

  • You can add up to four MergeURL parameters to a transcoding job. To add more MergeURL parameters to a transcoding job, specify the MergeConfigUrl parameter.

  • Example for specifying a single MergeURL parameter: [{"MergeURL":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail_comm_01.mp4"}].

  • Example for specifying two MergeURL parameters: [{"MergeURL":"http://exampleBucket****m.oss-cn-hangzhou.aliyuncs.com/tail_comm_01.mp4","Start":"1","Duration":"20"},{"MergeURL":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail_comm_02.mp4","Start":"5.4","Duration":"10.2"}].

MergeConfigUrl

String

No

The OSS path of the configuration file for merging clips.

  • You can specify only one of the MergeList and MergeConfigUrl parameters. The priority of the MergeConfigUrl parameter is higher.

  • The file must be stored in an OSS bucket. Example: http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/mergeConfigfile.

  • The file contains multiple MergeURL parameters. Specify the MergeURL parameters in the order in which you want to merge the corresponding clips. You can specify up to 50 MergeURL parameters. For more information about the format, see the MergeList section of this topic. Example of the configuration file content: {"MergeList":[{"MergeURL":"http://exampleBucket****m.oss-cn-hangzhou.aliyuncs.com/tail_comm_01.mp4","Start":"1","Duration":"20"},{"MergeURL":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail_comm_02.mp4","Start":"5.4","Duration":"10.2"}]}.

OpeningList

Object[]

No

The list of opening scenes. Opening is a special merging effect that allows you to embed opening parts at the beginning of the input video. The opening parts are displayed in Picture-in-Picture (PiP) mode. For more information, see the OpeningList section of this topic.

  • You can add up to two opening parts to a transcoding job. Specify the opening parts in the order in which you want to embed the opening parts in the output video.

  • Example: [{"OpenUrl":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/opening_01.flv","Start":"1","Width":"1920","Height":"1080"},{"OpenUrl":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/opening_02.flv","Start":"1","Width":"-1","Height":"full"}].

TailSlateList

Object[]

No

The list of ending scenes. Ending is a special merging effect that allows you to add ending parts to the end of the input video. The ending parts are displayed in fade-in and fade-out mode. For more information, see the TailSlateList section of this topic.

  • You can add up to two ending parts to a transcoding job. Specify the ending parts in the order in which you want to embed the ending parts in the output video.

  • Example: [{"TailUrl":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail_01.flv","Start":"1","BlendDuration":"2","Width":"1920","Height":"1080","IsMergeAudio":false,"BgColor":"White"}].

Amix

Object[]

No

The audio mixing configuration. This parameter is suitable for scenarios in which you want to merge multiple audio tracks in a video or add background music. For more information, see the Amix section of this topic.

  • You can add up to four mixed audio files to a transcoding job.

  • Example for mixing two audio streams of an input file: [{"AmixURL":"input","MixDurMode":"longest","Start":"1","Duration":"2"}].

  • Example for mixing the audio stream of an external file and the audio stream of an input file: [{"AmixURL":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail.flv","Map":"0:a:1","MixDurMode":"longest","Start":"1","Duration":"2"}].

MuxConfig

Object

No

The packaging configurations. For more information, see the MuxConfig section of this topic.

  • If you specify this parameter, the corresponding parameter in the specified transcoding template is overwritten.

  • Example: {"Segment":{"Duration":"10","ForceSegTime":"1,2,4,6,10,14,18"}}, which specifies that the video is forcibly segmented at the 1st, 2nd, 4th, 6th, 10th, 14th, 18th, 20th, 30th, 40th, and 50th seconds. By default, the interval is 10 seconds.

M3U8NonStandardSupport

Object

No

The non-standard support for the M3U8 format. For more information, see the M3U8NonStandardSupport section of this topic.

  • Example: {"TS":{"Md5Support":true,"SizeSupport":true}}, which specifies that the MD5 value and size of each TS file are included in the output M3U8 video.

Encryption

String

No

The encryption configuration. This parameter takes effect only if the Container parameter is set to M3U8. For more information, see the Encryption section of this topic.

  • Example: {"Type":"hls-aes-128","Key":"ZW5jcnlwdGlvbmtleTEyMw","KeyType":"Base64","KeyUri":"aHR0cDovL2FsaXl1bi5jb20vZG9jdW1lbnQvaGxzMTI4LmtleQ=="}.

UserData

String

No

The custom data, which can be up to 1,024 bytes in size.

Priority

String

No

The priority of the transcoding job in the MPS queue to which the transcoding job is added.

  • Valid values: [1,10]. A value of 1 specifies the lowest priority, whereas a value of 10 specifies the highest priority.

  • Default value: 6.

  • Best practice: MPS queues have concurrency limits. If you submit a large number of jobs, the jobs may need to queue. We recommend that you configure a higher priority for jobs that are time-sensitive or that process important content.
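
To tie the preceding parameters together, the following Python sketch builds the JSON strings that can be passed as the Input and Outputs parameters of the SubmitJobs operation; the bucket, region, object paths, template ID placeholder, and override values are illustrative, and only a few of the optional fields described above are set.

import json
from urllib.parse import quote

# Input file location. The object path must be URL-encoded.
input_param = json.dumps({
    "Bucket": "example-bucket",
    "Location": "oss-cn-hangzhou",
    "Object": quote("input/example.flv"),
})

# One output. The Video and Audio objects override the corresponding settings
# of the transcoding template, as described in the table above.
outputs_param = json.dumps([{
    "OutputObject": quote("output/example.mp4"),
    "TemplateId": "your-template-id",  # Illustrative placeholder; use your transcoding template ID.
    "Video": {"Codec": "H.264", "Bitrate": "1500", "Fps": "25"},
    "Audio": {"Codec": "AAC", "Bitrate": "128", "Samplerate": "44100"},
    "Priority": "8",
}])

print(input_param)    # Value of the Input parameter.
print(outputs_param)  # Value of the Outputs parameter.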

Container

This parameter is referenced by the Output.Container parameter.

Parameter

Type

Required

Description

Format

String

No

The container format.

  • For more information about the supported formats and compatible codecs, see Supported formats.

    • Supported video formats: 3GP, AVI, FLV, F4V, fMP4, MKV, MOV, MP4, TS, MXF, WebM, M3U8, HLS-fMP4, MPD, CMAF-HLS, and CMAF-DASH

    • Supported audio formats: AAC, M4A, MP2, MP3, MP4, Ogg, FLAC, M3U8, HLS-fMP4, MPD, CMAF-HLS, and CMAF-DASH

    • Supported animated image formats: gif and webp.

  • Default format: MP4.

TransConfig

This parameter is referenced by the Output.TransConfig parameter.

Parameter

Type

Required

Description

TransMode

String

No

The video transcoding mode. This parameter takes effect only if the Codec parameter is set to H.264, H.265, or AV1 and the Bitrate and Crf parameters are set to valid values. For more information, see the Bitrate control mode section of this topic. Valid values:

  • CBR: The bitrate is fixed.

  • onepass: Use this value for the average bitrate (ABR) mode. Encoding in this mode is faster than in the twopass mode.

  • twopass: Use this value for the variable bitrate (VBR) mode. Encoding in this mode is slower than in the onepass mode.

  • fixCRF: You can set this parameter to fixCRF if you want to use the quality control mode.

  • Default value: If you specify the Bitrate parameter, the default value of this parameter is onepass. If you do not specify the Bitrate parameter, the default value of this parameter is fixCRF, and the default value of the Crf parameter is used.

AdjDarMethod

String

No

The method that is used to adjust the resolution. This parameter takes effect only if both the Width and Height parameters are specified. You can use this parameter together with the LongShortMode parameter.

IsCheckReso

String

No

Specifies whether to check the video resolution. You can specify only one of the IsCheckReso and IsCheckResoFail parameters. The priority of the IsCheckResoFail parameter is higher. Valid values:

  • true: checks the video resolution. If the width or height of the input video is less than that of the output video, the resolution of the input video is used for transcoding.

  • false: does not check the video resolution.

  • Default value: false.

IsCheckResoFail

String

No

Specifies whether to check the video resolution. You can specify only one of the IsCheckReso and IsCheckResoFail parameters. The priority of the IsCheckResoFail parameter is higher. Valid values:

  • true: checks the video resolution. If the width or height of the input video is less than that of the output video, the transcoding job fails.

  • false: does not check the video resolution.

  • Default value: false.

IsCheckVideoBitrate

String

No

Specifies whether to check the video bitrate. You can specify only one of the IsCheckVideoBitrate and IsCheckVideoBitrateFail parameters. The priority of the IsCheckVideoBitrateFail parameter is higher. Valid values:

  • true: checks the video bitrate. If the bitrate of the input video is less than that of the output video, the bitrate of the input video is used for transcoding.

  • false: does not check the video bitrate.

  • Default value: false.

IsCheckVideoBitrateFail

String

No

Specifies whether to check the video bitrate. You can specify only one of the IsCheckVideoBitrate and IsCheckVideoBitrateFail parameters. The priority of the IsCheckVideoBitrateFail parameter is higher. Valid values:

  • true: checks the video bitrate. If the bitrate of the input video is less than that of the output video, the transcoding job fails.

  • false: does not check the video bitrate.

  • Default value: false.

IsCheckAudioBitrate

String

No

Specifies whether to check the audio bitrate. You can specify only one of the IsCheckAudioBitrate and IsCheckAudioBitrateFail parameters. The priority of the IsCheckAudioBitrateFail parameter is higher. Valid values:

  • true: checks the audio bitrate. If the bitrate of the input audio is less than that of the output audio, the bitrate of the input audio is used for transcoding.

  • false: does not check the audio bitrate.

  • Default value:

    • If the IsCheckAudioBitrate parameter is not specified and the codec of the output audio is different from that of the input audio, the default value is false.

    • If the IsCheckAudioBitrate parameter is not specified and the codec of the output audio is the same as that of the input audio, the default value is true.

IsCheckAudioBitrateFail

String

No

Specifies whether to check the audio bitrate. You can specify only one of the IsCheckAudioBitrate and IsCheckAudioBitrateFail parameters. The priority of the IsCheckAudioBitrateFail parameter is higher. Valid values:

  • true: checks the audio bitrate. If the bitrate of the input audio is less than that of the output audio, the transcoding job fails.

  • false: does not check the audio bitrate.

  • Default value: false.

Bitrate control mode

The following list describes the requirements of different bitrate control modes for setting the TransMode, Bitrate, Maxrate, Bufsize, and Crf parameters.

  • Constant bitrate (CBR): Set the TransMode parameter to CBR. Set the Bitrate, Maxrate, and Bufsize parameters to the same value.

  • Average bitrate (ABR): Set the TransMode parameter to onepass or leave the parameter empty. The Bitrate parameter is required. The Maxrate and Bufsize parameters are optional and can be used to control the peak bitrate range.

  • Variable bitrate (VBR): Set the TransMode parameter to twopass. The Bitrate, Maxrate, and Bufsize parameters are required.

  • Constant rate factor (CRF): Set the TransMode parameter to fixCRF and specify a CRF value. If the Crf parameter is not specified, the default Crf value that corresponds to the specified Codec value takes effect. The Maxrate and Bufsize parameters are optional and can be used to control the peak bitrate range. Alternatively, leave both the TransMode and Bitrate parameters empty; in this case, the default Crf value that corresponds to the specified Codec value also takes effect.
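
The following Python sketch shows illustrative Output.Video settings for each bitrate control mode; the numeric values are examples only, not recommendations.

# Constant bitrate: Bitrate, Maxrate, and Bufsize are set to the same value.
video_cbr = {"Codec": "H.264", "TransMode": "CBR",
             "Bitrate": "2000", "Maxrate": "2000", "Bufsize": "2000"}

# Average bitrate: Bitrate is required; Maxrate and Bufsize limit the peak bitrate.
video_abr = {"Codec": "H.264", "TransMode": "onepass",
             "Bitrate": "2000", "Maxrate": "3000", "Bufsize": "6000"}

# Variable bitrate: Bitrate, Maxrate, and Bufsize are all required.
video_vbr = {"Codec": "H.264", "TransMode": "twopass",
             "Bitrate": "2000", "Maxrate": "3000", "Bufsize": "6000"}

# Constant rate factor: Crf replaces Bitrate; omit Crf to use the codec default.
video_crf = {"Codec": "H.264", "TransMode": "fixCRF",
             "Crf": "26", "Maxrate": "3000", "Bufsize": "6000"}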

Video

This parameter is referenced by the Output.Video parameter.

Parameter

Type

Required

Description

Remove

String

No

Specifies whether to delete the video stream. Valid values:

  • true: deletes the video stream. If you set this parameter to true, all video-related parameters are invalid.

  • false: retains the video stream.

  • Default value: false.

Codec

String

No

The video encoding format.

  • Valid values: H.264, H.265, AV1, GIF, and WEBP. For more information about the supported formats and compatible container formats, see Supported formats.

  • Default value: H.264.

Width

String

No

The width or long side of the output video. If the LongShortMode parameter is set to false or left empty, this parameter specifies the width of the output video. If the LongShortMode parameter is set to true, this parameter specifies the long side of the output video.

  • Unit: pixel.

  • Valid values: [128,4096]. The value must be an even number.

  • Default value:

    • If neither the Width nor Height parameter is specified, the default value is the width or long side of the input video.

    • If only the Height parameter is specified, the default value is calculated based on the aspect ratio of the input video.

Height

String

No

The height or short side of the output video. If the LongShortMode parameter is set to false or left empty, this parameter specifies the height of the output video. If the LongShortMode parameter is set to true, this parameter specifies the short side of the output video.

  • Unit: pixel.

  • Valid values: [128,4096]. The value must be an even number.

  • Default value:

    • If neither the Width nor Height parameter is specified, the default value is the height or short side of the input video.

    • If only the Width parameter is specified, the default value is calculated based on the aspect ratio of the input video.

LongShortMode

String

No

Specifies whether to enable the auto-rotate screen feature. This parameter takes effect if at least one of the Width and Height parameters is specified. Valid values:

  • true: enables the auto-rotate screen feature.

  • false: disables the auto-rotate screen feature.

  • Default value: false.

  • Best practice: If your input files include both landscape and portrait videos, enable the auto-rotate screen feature and set the scaling parameters based on the resolution parameters. This way, your videos are not stretched or distorted. For more information, see the "Enable auto-orientation" section of the How do I specify a resolution for an output video? topic.

Fps

String

No

The frame rate of the video stream.

  • Unit: frames per second.

  • Valid values: (0,60].

  • Default value: the frame rate of the input file. If the frame rate of the input file exceeds 60, 60 is used.

  • Common values: 24, 25, and 30.

MaxFps

String

No

The maximum frame rate.

Gop

String

No

The time interval or frame interval between two consecutive I frames.

Note

The larger the group of pictures (GOP) value, the higher the compression ratio, the lower the encoding speed, the longer the length of a single segment of streaming media, and the longer the response time to seeking. For more information about the term GOP, see Terms.

  • Format of maximum time interval between two consecutive I frames: {Time}s. Valid values: [1,100000].

  • Format of maximum frame interval between two consecutive I frames: {Number of frames}. Valid values: [1,100000].

  • Default value: 10s, which specifies that the time interval between two consecutive I frames is 10 seconds.

  • Best practice: We recommend that you set the parameter to a time interval from two to seven seconds in a streaming media scenario to reduce the time taken to start playback and the response time to seeking.

Bitrate

String

No

The average bitrate of the output video. If you use the CBR, ABR, or VBR bitrate control mode, you must specify the Bitrate parameter, and you must set the TransMode parameter to a valid value. For more information, see the Bitrate control mode section of this topic.

  • Unit: Kbit/s.

  • Valid values: -1 and [10,50000]. A value of -1 indicates that the original bitrate of the input video is used.

  • Best practice:

    • CBR: Set the TransMode parameter to CBR and the Bitrate, Maxrate, and Bufsize parameters to the same value.

    • ABR: Set the TransMode parameter to onepass and specify the Bitrate parameter. You can also specify the Maxrate and Bufsize parameters to control the bitrate range.

    • VBR: Set the TransMode parameter to twopass and specify the Maxrate or BitrateBnd parameter and the Bufsize parameter.

BitrateBnd

String

No

The average bitrate range of the output video.

  • This parameter takes effect only if the Codec parameter is set to H.264.

  • Example: {"Max":"5000","Min":"1000"}.

Maxrate

String

No

The peak bitrate of the output video. For more information, see the Bitrate control mode section of this topic.

  • Unit: Kbit/s.

  • Valid values: [10,50000].

Bufsize

String

No

The buffer size for bitrate control. You can specify this parameter to control the bitrate fluctuation. For more information, see the Bitrate control mode section of this topic.

Note

The larger the value of Bufsize, the greater the bitrate fluctuation and the higher the video quality.

  • Unit: Kbit/s.

  • Valid values: [1000,128000].

  • Default value: 6000.

Crf

String

No

The quality control factor. To use the CRF mode, you must specify the Crf parameter and set the TransMode parameter to fixCRF. For more information, see the Bitrate control mode section of this topic.

Note

The larger the value of the Crf parameter, the lower the video quality and the higher the compression ratio.

  • Valid values: [20,51].

  • If the Codec parameter is set to H.264, the default value is 23. If the Codec parameter is set to H.265, the default value is 26. If the Codec parameter is set to AV1, the default value is 32.

  • Best practice:

    • A value of 0 specifies that the video is lossless. A value of 51 specifies that the image quality is the worst. We recommend that you set the value to a number from 23 to 29. You can adjust the value based on the complexity of the video image. If you increase or decrease the value by six, the bitrate is reduced by half or doubled. Under the same definition, you can set a higher value for an animated cartoon than for a live-action video.

    • The CRF mode gives priority to the video quality, and the bitrate of output videos is unpredictable. You can specify the Maxrate and Bufsize parameters to control the bitrate range.

Qscale

String

No

The video quality control factor. This parameter takes effect if you use the VBR mode.

Note

The larger the value of the Qscale parameter, the lower the video quality and the higher the compression ratio.

  • This parameter takes effect only if the Codec parameter is set to H.264.

  • Valid values: [0,51].

Profile

String

No

The encoding profile. For more information about the term profile, see Terms.

  • This parameter takes effect only if the Codec parameter is set to H.264.

  • Valid values: baseline, main, and high.

  • Default value: high.

  • Best practice: If you want to generate output videos in multiple definitions from the same video, we recommend that you set this parameter to baseline for the lowest definition. This ensures that the output video can be played on the full range of devices. For output videos in other definitions, set this parameter to main or high.

Preset

String

No

The preset mode of the H.264 encoder.

Note

The faster the mode you select, the lower the video quality.

  • This parameter takes effect only if the Codec parameter is set to H.264.

  • Valid values: veryfast, fast, medium, slow, and slower.

  • Default value: medium.

ScanMode

String

No

The scan mode.

  • Valid values: auto, progressive, and interlaced.

  • By default, this parameter is left empty, which specifies that the scan mode of the input file is used.

  • If you set the ScanMode parameter to progressive or interlaced and this scan mode does not match that of the input file, the file fails to be transcoded.

  • We recommend that you leave this parameter empty or set this parameter to auto for higher compatibility.

  • Best practice: The interlaced scan mode consumes less data than the progressive scan mode but provides lower image quality. Therefore, the progressive scan mode is commonly used in mainstream video production.

PixFmt

String

No

The pixel format.

  • Leave this parameter empty if you want to use the original color format of the video.

  • Valid values: yuv420p, yuvj420p, yuv422p, yuvj422p, yuv444p, yuvj444p, yuv444p16le, pc, bt470bg, and smpte170m. If the Codec parameter is set to GIF, bgr8 is supported.

Crop

String

No

The method of video cropping. The borders can be automatically detected and removed. You can also manually crop the video.

  • Specify this parameter if the resolution of the input video is greater than that of the output video. Do not specify the AdjDarMethod parameter if this parameter is specified.

  • To automatically remove borders, set this parameter to border.

  • To use a custom cropping method, set the parameter in the format of {width}:{height}:{left}:{top}.

    • width: the width of the output video obtained after cropping.

    • height: the height of the output video obtained after cropping.

    • left: the left margin between the output image and the original image.

    • top: the upper margin between the output image and the original image.

  • Example of custom cropping: 1920:800:0:140.

Pad

String

No

The information about the black bars.

  • Specify this parameter if the resolution of the input video is less than that of the output video. Do not specify the IsCheckReso, IsCheckResoFail, or AdjDarMethod parameter if this parameter is specified.

  • Format: {width}:{height}:{left}:{top}.

    • width: the width of the output video obtained after the black borders are added.

    • height: the height of the output video obtained after the black borders are added.

    • left: the left margin between the output image and the original image.

    • top: the upper margin between the output image and the original image.

  • Example: 1920:1080:0:140.

Audio

This parameter is referenced by the Output.Audio parameter.

Parameter

Type

Required

Description

Remove

String

No

Specifies whether to delete the audio stream. Valid values:

  • true: deletes the audio stream. If you set this parameter to true, all audio-related parameters are invalid.

  • false: retains the audio stream.

  • Default value: false.

Codec

String

No

The audio codec format.

  • Valid values: AAC, AC3, EAC3, MP2, MP3, FLAC, OPUS, VORBIS, WMA-V1, WMA-V2, and pcm_s16le. For more information about the supported formats and compatible container formats, see Supported formats.

  • Default value: AAC.

Profile

String

No

The audio encoding profile.

  • This parameter takes effect only if the Codec parameter is set to AAC.

  • Valid values: aac_low, aac_he, aac_he_v2, aac_ld, and aac_eld. For more information about the term profile, see Terms.

  • Default value: aac_low.

Bitrate

String

No

The audio bitrate of the output file.

  • Unit: Kbit/s.

  • Valid values: [8,1000].

  • Default value: 128.

  • Common values: 64, 128, and 256.

Samplerate

String

No

The sample rate.

  • Unit: Hz.

  • Valid values: 22050, 32000, 44100, 48000, and 96000.

    Note

    The supported sample rates vary based on the encoding format and container format. For more information, see Audio sample rates. For example, if the Codec parameter is set to MP3, you cannot set this parameter to 96000. If the container format is FLV, you can set this parameter only to 22050 or 44100.

  • Default value: 44100.

Channels

String

No

The number of sound channels.

  • Valid values: 0, 1, 2, 4, 5, 6, and 8.

    • If the Codec parameter is set to MP3 or OPUS, you can set this parameter to 0, 1, or 2.

    • If the Codec parameter is set to AAC or FLAC, you can set this parameter to 0, 1, 2, 4, 5, 6, or 8.

    • If the Codec parameter is set to VORBIS, you can set this parameter to 2.

    • If the Format parameter is set to MPD, you cannot set this parameter to 8.

  • Default value: 2.

  • If you want to keep the original number of sound channels, set this parameter to 0.

Volume

String

No

The volume configuration. For more information, see the Volume section of this topic.

  • This parameter is supported only if a single output audio stream is configured. It is not supported if more than one output audio stream is configured.

Volume

This parameter is referenced by the Output.Audio.Volume parameter.

Parameter

Type

Required

Description

Method

String

No

The method that you want to use to adjust the volume. Valid values:

  • auto

  • dynamic

  • linear

  • Default value: dynamic.

Level

String

No

The level of volume adjustment performed based on the volume of the input audio.

  • This parameter takes effect only if the Method parameter is set to linear.

  • Unit: decibels.

  • Valid values: less than 20.

  • Default value: -20.

IntegratedLoudnessTarget

String

No

The volume of the output video.

  • This parameter takes effect only if the Method parameter is set to dynamic.

  • Unit: decibels.

  • Valid values: [-70,-5].

  • Default value: -6.

TruePeak

String

No

The maximum volume.

  • This parameter takes effect only if the Method parameter is set to dynamic.

  • Unit: decibels.

  • Valid values: [-9,0].

  • Default value: -1.

LoudnessRangeTarget

String

No

The magnitude of volume adjustment performed based on the volume of the output video.

  • This parameter takes effect only if the Method parameter is set to dynamic.

  • Unit: decibels.

  • Valid values: [1,20].

  • Default value: 8.
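
As an illustration of how the Volume fields fit into the Output.Audio parameter, the following Python sketch builds an Audio object that uses the dynamic method; the target values are examples chosen within the valid ranges above.

import json

# Illustrative Output.Audio settings with dynamic volume adjustment.
audio_config = {
    "Codec": "AAC",
    "Bitrate": "128",
    "Samplerate": "44100",
    "Channels": "2",
    "Volume": {
        "Method": "dynamic",
        "IntegratedLoudnessTarget": "-14",  # Output volume, within [-70,-5].
        "TruePeak": "-1",                   # Maximum volume, within [-9,0].
        "LoudnessRangeTarget": "8",         # Adjustment magnitude, within [1,20].
    },
}

print(json.dumps(audio_config))  # JSON object used as Output.Audio.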

WaterMarks

This parameter is referenced by the Output.WaterMarks parameter.

Parameter

Type

Required

Description

Type

String

No

The type of the watermark. Valid values:

  • Text: a text watermark. If you set this parameter to Text, you must specify the TextWaterMark parameter.

  • Image: an image watermark. If you set this parameter to Image, you must specify the parameters related to the image watermark.

  • Default value: Image.

TextWaterMark

Object

No

The configuration of the text watermark. For more information, see the TextWaterMark section of this topic.

  • If the Type parameter is set to Text, this parameter is required.

  • Example: {"Content":"5rWL6K+V5paH5a2X5rC05Y2w","FontName":"SimSun","FontSize":"16","Top":2,"Left":10}.

InputFile

Object

No

The file to be used as the image watermark. You can use the Bucket, Location, and Object parameters to specify the location of the file.

  • The following types of files are supported: PNG static images in the .png format, PNG animated images in the .apng format, MOV files in the .mov format, and GIF files in the .gif format.

  • The file must be stored in an OSS bucket. For more information, see the Input section of this topic.

  • The path of an OSS object must be URL-encoded in UTF-8 before you use the path in MPS. For more information, see URL encoding.

  • Example: {"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"example-logo.png"}.

Note

If you add an image watermark whose type is not HDR to an HDR video, the color of the image watermark may become inaccurate.

WaterMarkTemplateId

String

No

The ID of the image watermark template. If you do not specify this parameter, the following default configurations are used for the parameters related to the image watermark:

  • ReferPos: TopRight.

  • Dx and Dy: 0.

  • Width: 0.12 times the width of the output video.

  • Height: proportionally scaled based on the width of the image watermark.

  • Timeline: from beginning to end.

ReferPos

String

No

The location of the image watermark.

  • Valid values: TopRight, TopLeft, BottomRight, and BottomLeft.

Dx

String

No

The horizontal offset of the image watermark relative to the output video. If you specify this parameter, the corresponding parameter in the specified watermark template is overwritten. The following value types are supported:

  • Integer: the pixel value of the horizontal offset.

    • Unit: pixel.

    • Valid values: [8,4096].

  • Decimal: the ratio of the horizontal offset to the width of the output video.

    • Valid values: (0,1).

    • Four decimal places are supported, such as 0.9999. Excessive decimal places are discarded.

Dy

String

No

The vertical offset of the image watermark relative to the output video. The following value types are supported:

  • Integer: the pixel value of the vertical offset.

    • Unit: pixel.

    • Valid values: [8,4096].

  • Decimal: the ratio of the vertical offset to the height of the output video.

    • Valid values: (0,1).

    • Four decimal places are supported, such as 0.9999. Excessive decimal places are discarded.

Width

String

No

The width of the image watermark. The following value types are supported:

  • Integer: the pixel value of the watermark width.

    • Valid values: [8,4096].

    • Unit: pixel.

  • Decimal: the ratio of the watermark width to the width of the output video.

    • Valid values: (0,1).

    • Four decimal places are supported, such as 0.9999. Excessive decimal places are discarded.

Height

String

No

The height of the image watermark. The following value types are supported:

  • Integer: the pixel value of the watermark height.

    • Valid values: [8,4096].

    • Unit: pixel.

  • Decimal: the ratio of the watermark height to the height of the output video.

    • Valid values: (0,1).

    • Four decimal places are supported, such as 0.9999. Excessive decimal places are discarded.

Timeline

String

No

The display time of the image watermark. For more information, see the Timeline section of this topic.

TextWaterMark

This parameter is referenced by the Output.WaterMarks.TextWaterMark parameter.

Parameter

Type

Required

Description

Content

String

Yes

The text to be displayed as the watermark. The text must be encoded in the Base64 format.

  • Example: 5rWL6K+V5paH5a2X5rC05Y2w.

Note

If the text contains special characters such as emojis and single quotation marks ('), the watermark may be truncated or fail to be added. You must escape special characters before you add them.

FontName

String

No

The font of the text watermark.

  • For more information about the supported fonts, see Fonts.

  • Default value: SimSun.

FontSize

Int

No

The font size of the text watermark.

  • Valid values: (4,120).

  • Default value: 16.

FontColor

String

No

The color of the text watermark.

  • For more information about the supported colors, see the name column of FontColor.

  • Default value: Black.

FontAlpha

Float

No

The transparency of the text watermark.

  • Valid values: (0,1].

  • Default value: 1.0.

BorderWidth

Int

No

The outline width of the text watermark.

  • Unit: pixel.

  • Valid values: [0,4096].

  • Default value: 0.

BorderColor

String

No

The outline color of the text watermark.

  • For more information about the supported colors, see the name column of BorderColor.

  • Default value: Black.

Top

Int

No

The top margin of the text watermark.

  • Unit: pixel.

  • Valid values: [0,4096].

  • Default value: 0.

Left

Int

No

The left margin of the text watermark.

  • Unit: pixel.

  • Valid values: [0,4096].

  • Default value: 0.

Timeline

This parameter is referenced by the Output.WaterMarks.Timeline parameter.

Parameter

Type

Required

Description

Start

String

No

The beginning of the time range in which the image watermark is displayed.

  • Format: sssss[.SSS].

  • Valid values: [0.000,86399.999]. If the start time when the image watermark is displayed is later than the ending time of the video, the transcoding job fails.

  • Default value: 0.

  • Example: 18000.30.

Duration

String

No

The time range in which the image watermark is displayed.

  • If the parameter is set to ToEND, the watermark is continuously displayed until the end of the video.

  • Format: sssss[.SSS]. Unit: seconds.

  • Default value: ToEND.

Config

This parameter is referenced by the AddWaterMarkTemplate and UpdateWaterMarkTemplate operations.

Parameter

Type

Required

Description

Type

String

No

The type of the watermark. Valid values:

  • Image: an image watermark.

  • Default value: Image.

ReferPos

String

No

The location of the image watermark.

  • Valid values: TopRight, TopLeft, BottomRight, and BottomLeft.

  • This parameter is used together with the Dx and Dy parameters: ReferPos specifies the reference corner of the output video, and Dx and Dy specify the horizontal and vertical offsets from that corner.

Dx

String

No

The horizontal offset of the image watermark relative to the output video. The following value types are supported:

  • Integer: the pixel value of the horizontal offset.

    • Unit: pixel.

    • Valid values: [8,4096].

  • Decimal: the ratio of the horizontal offset to the width of the output video.

    • Valid values: (0,1).

    • Four decimal places are supported, such as 0.9999. Excessive decimal places are discarded.

Dy

String

No

The vertical offset of the image watermark relative to the output video. The following value types are supported:

  • Integer: the pixel value of the vertical offset.

    • Unit: pixel.

    • Valid values: [8,4096].

  • Decimal: the ratio of the vertical offset to the height of the output video.

    • Valid values: (0,1).

    • Four decimal places are supported, such as 0.9999. Excessive decimal places are discarded.

Width

String

No

The width of the image watermark. The following value types are supported:

  • Integer: the pixel value of the watermark width.

    • Unit: pixel.

    • Valid values: [8,4096].

  • Decimal: the ratio of the watermark width to the width of the output video.

    • Valid values: (0,1).

    • Four decimal places are supported, such as 0.9999. Excessive decimal places are discarded.

Height

String

No

The height of the image watermark. The following value types are supported:

  • Integer: the pixel value of the watermark height.

    • Unit: pixel.

    • Valid values: [8,4096].

  • Decimal: the ratio of the watermark height to the height of the output video.

    • Valid values: (0,1).

    • Four decimal places are supported, such as 0.9999. Excessive decimal places are discarded.

Timeline

String

No

The timeline of the dynamic watermark. For more information, see the Timeline section of this topic.

The ReferPos parameter specifies the reference corner of the output video, and the Dx and Dy parameters specify the horizontal and vertical offsets of the image watermark from that corner.

Take note of the following items when you specify the Width and Height parameters:

  • If you specify neither the Width nor Height parameter, the watermark width is 0.12 times the width of the output video, and the watermark height is proportionally scaled based on the watermark width and the aspect ratio of the original image.

  • If you specify only the Width parameter, the watermark height is proportionally scaled based on the specified width and the aspect ratio of the original image. If you specify only the Height parameter, the watermark width is proportionally scaled based on the specified height and the aspect ratio of the original image.

  • If you specify both the Width and Height parameters, the watermark is displayed in the specified width and height.
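
The following Python sketch applies these sizing rules to compute the displayed watermark dimensions in pixels; it is an illustration of the rules above rather than MPS code, and the rounding behavior is an assumption.

def watermark_size(video_width, image_width, image_height, width=None, height=None):
    """Compute the displayed watermark size (in pixels) from the rules above."""
    aspect = image_width / image_height
    if width is None and height is None:
        w = 0.12 * video_width            # Default: 0.12 times the output video width.
        return round(w), round(w / aspect)
    if width is not None and height is None:
        return round(width), round(width / aspect)    # Height scaled proportionally.
    if width is None and height is not None:
        return round(height * aspect), round(height)  # Width scaled proportionally.
    return round(width), round(height)                # Both specified: used as-is.

# Example: a 200 x 100 logo on a 1920-pixel-wide video, neither parameter specified.
print(watermark_size(1920, 200, 100))  # (230, 115)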

DeWatermark

This parameter is referenced by the Output.DeWatermark parameter.

{
// Blur two watermarks in the video image starting from the first frame. The first blurred area is 10 pixels from the left edge and 10 pixels from the top edge of the video image and is 10 × 10 pixels in size. The second blurred area is 100 pixels from the left edge and is 10 × 10 pixels in size; its distance from the top edge is calculated by multiplying 0.1 by the height of the video image.
       "0": [
              {
                "l": 10,
                "t": 10,
                "w": 10,
                "h": 10
              },
              {
                "l": 100,
                "t": 0.1,
                "w": 10,
                "h": 10
              }
            ],
  // Stop blurring the logos at the 128,000th millisecond. In this case, the logos are blurred from the start of the video to the 128000th millisecond. 
     "128000": [],
  // Blur the watermark in the video image starting from the 250,000th millisecond. The watermark width is 0.01 times the width of the video image, and the watermark height is 0.05 times the height of the video image. The distance between the left side of the video image and the watermark is calculated by multiplying 0.2 by the width of the video image. The distance between the top of the video image and the watermark is calculated by multiplying 0.1 by the height of the video image. 
  "250000": [
              {
                "l": 0.2,
                "t": 0.1,
                "w": 0.01,
                "h": 0.05
              }
            ]
 }     

Parameters

  • pts: the point in time at which a frame is displayed. Unit: milliseconds.

  • l: the left margin of the blurred area.

  • t: the top margin of the blurred area.

  • w: the width of the blurred area.

  • h: the height of the blurred area.

If the value of the l, t, w, or h parameter is greater than 1, the value specifies a number of pixels. Otherwise, the value specifies a ratio: the l and w values are relative to the width of the video image, and the t and h values are relative to its height. The blurred area is determined by rounding the resulting values to the nearest integers.
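
The following Python sketch applies this rule to convert one blurred-area entry into pixel values; it illustrates the conversion described above and assumes simple rounding.

def to_pixel_box(area, video_width, video_height):
    """Convert one {"l","t","w","h"} entry into pixel values per the rule above."""
    def scale(value, reference):
        # Values greater than 1 are pixels; other values are ratios of the reference.
        return round(value if value > 1 else value * reference)

    return {
        "l": scale(area["l"], video_width),   # l and w are relative to the width.
        "w": scale(area["w"], video_width),
        "t": scale(area["t"], video_height),  # t and h are relative to the height.
        "h": scale(area["h"], video_height),
    }

# Example: the entry at the 250,000th millisecond above, on a 1920 x 1080 video.
print(to_pixel_box({"l": 0.2, "t": 0.1, "w": 0.01, "h": 0.05}, 1920, 1080))
# {'l': 384, 'w': 19, 't': 108, 'h': 54}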

SubtitleConfig

This parameter is referenced by the Output.SubtitleConfig parameter.

Parameter

Type

Required

Description

ExtSubtitleList

Object[]

No

The external subtitles. For more information, see the ExtSubtitle section of this topic.

  • You can add up to four subtitle files to a transcoding job.

  • Example: [{"Input":{"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"example.srt"},"CharEnc":"UTF-8"}].

ExtSubtitle

This parameter is referenced by the Output.SubtitleConfig.ExtSubtitle parameter.

Parameter

Type

Required

Description

Input

String

Yes

The external subtitle file. You can use the Bucket, Location, and Object parameters to specify the location of the file.

  • The SRT and ASS formats are supported. The color information in the subtitle file can be read.

  • The file must be stored in an OSS bucket. For more information, see the Input section of this topic.

  • Placeholders are supported. For more information, see the Placeholder replacement rules section of this topic.

  • The path of an OSS object must be URL-encoded in UTF-8 before you use the path in MPS. For more information, see URL encoding.

  • For example, if the path of the input file is a/b/example.flv and the path of the subtitle file is a/b/example-cn.srt, you can use placeholders to specify the object in the following format: {ObjectPrefix}{FileName}-cn.srt. After the path is URL-encoded, the object is {"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"%7bObjectPrefix%7d%7bFileName%7d-cn.srt"}.

Note

If the length of a subtitle file exceeds the length of the video, the video length prevails. If a subtitle contains too many characters to be displayed in one line, the subtitle is truncated.

CharEnc

String

No

The encoding format of the external subtitles.

  • Valid values: UTF-8, GBK, BIG5, and auto.

  • Default value: auto.

Note

If you set this parameter to auto, the detected character set may not be the actual character set. We recommend that you set this parameter to another value.

FontName

String

No

The font of the subtitle.

  • For more information about the supported fonts, see Fonts.

  • Default value: SimSun.

FontSize

Int

No

The font size of the subtitle.

  • Valid values: (4,120).

  • Default value: 16.

Clip

This parameter is referenced by the Output.Clip parameter.

Parameter

Type

Required

Description

TimeSpan

String

No

The time span of cropping a clip from the input file. For more information, see the TimeSpan section of this topic.

  • Example for specifying the time span by using the Duration parameter: {"Seek":"00:01:59.999","Duration":"18000.30"}, which specifies that the clip starts at 1 minute, 59 seconds, and 999 milliseconds and lasts 5 hours and 30 milliseconds.

  • Example for specifying the time span by using the End parameter: {"Seek":"00:01:59.999","End":"18000.30"}, which specifies that the clip starts at 1 minute, 59 seconds, and 999 milliseconds and ends at the point in time that is 5 hours and 30 milliseconds before the end of the video.

ConfigToClipFirstPart

Boolean

No

Specifies whether to crop the first part of the file into a clip before clip merging. Valid values:

  • true: crops the first part of the file into a clip before clip merging.

  • false: merges the clips before the first part of the file is cropped into a clip.

  • Default value: false.

TimeSpan

This parameter is referenced by the Output.Clip.TimeSpan parameter.

Parameter

Type

Required

Description

Seek

String

No

The start point of the clip. You can use this parameter to specify the start point in time of the clip. The default start point in time is the beginning of the video.

  • Format: hh:mm:ss[.SSS] or sssss[.SSS].

  • Valid values: [00:00:00.000,23:59:59.999] or [0.000,86399.999].

  • Example: 00:01:59.999 or 18000.30.

Duration

String

No

The length of the clip. You can specify the length of the clip relative to the point in time specified by the Seek parameter. By default, the length of the clip is from the point in time specified by the Seek parameter to the end of the video. You can specify only one of the Duration and End parameters. If you specify the End parameter, the setting of the Duration parameter is invalid.

  • Format: hh:mm:ss[.SSS] or sssss[.SSS].

  • Valid values: [00:00:00.000,23:59:59.999] or [0.000,86399.999].

  • Example: 00:01:59.999 or 18000.30.

End

String

No

The length of the ending part of the original video to be cropped out. You can specify only one of the Duration and End parameters. If you specify the End parameter, the setting of the Duration parameter is invalid.

  • Format: hh:mm:ss[.SSS] or sssss[.SSS].

  • Valid values: [00:00:00.000,23:59:59.999] or [0.000,86399.999].

  • Example: 00:01:59.999 or 18000.30.
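
The Seek, Duration, and End values accept either of the formats shown above. The following Python helper, an illustrative utility rather than part of the API, converts a number of seconds into the hh:mm:ss.SSS form.

def to_timecode(seconds):
    """Format a duration in seconds as hh:mm:ss.SSS, one of the accepted formats."""
    total_ms = round(seconds * 1000)
    hours, rest = divmod(total_ms, 3_600_000)
    minutes, rest = divmod(rest, 60_000)
    secs, ms = divmod(rest, 1_000)
    return f"{hours:02d}:{minutes:02d}:{secs:02d}.{ms:03d}"

print(to_timecode(119.999))    # 00:01:59.999
print(to_timecode(86399.999))  # 23:59:59.999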

MergeList

This parameter is referenced by the Output.MergeList parameter.

Parameter

Type

Required

Description

MergeURL

String

Yes

The OSS path of the clip to be merged.

  • The path of an OSS object must be URL-encoded in UTF-8 before you use the path in MPS. For more information, see URL encoding.

  • Example: http://exampleBucket****m.oss-cn-hangzhou.aliyuncs.com/tail_comm_01.mp4.

Start

String

No

The point in time in the clip specified by the MergeURL parameter at which the part to be merged starts. Specify this parameter if you want to merge only part of the clip into the output file. The default start point in time is the beginning of the clip.

  • Format: hh:mm:ss[.SSS] or sssss[.SSS].

  • Valid values: [00:00:00.000,23:59:59.999] or [0.000,86399.999].

  • Example: 01:59:59.999 or 32000.23.

Duration

String

No

The length of the part of the clip to be merged. The length is relative to the start point in time specified by the Start parameter. Specify this parameter if you want to merge only part of the clip into the output file. By default, the length is the period from the start point in time specified by the Start parameter to the end of the clip.

  • Format: hh:mm:ss[.SSS] or sssss[.SSS].

  • Valid values: [00:00:00.000,23:59:59.999] or [0.000,86399.999].

  • Example: 01:59:59.999 or 32000.23.

OpeningList

This parameter is referenced by the Output.OpeningList parameter.

Parameter

Type

Required

Description

OpenUrl

String

Yes

The OSS path of the opening part.

  • The path of an OSS object must be URL-encoded in UTF-8 before you use the path in MPS. For more information, see URL encoding.

  • Example: http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/opening_01.flv.

Start

String

No

The amount of time that elapses after the input video starts to play before the opening part is played. The value starts from 0.

  • Unit: seconds.

  • Default value: 0.

Width

String

No

The width of the output opening part. Valid values:

  • Custom width: You can customize the width of the output opening part. Valid values: [0,4096]. Unit: pixel.

  • -1: The width of the output opening part equals the width of the input opening part.

  • full: The width of the output opening part equals the width of the main part.

  • Default value: -1.

Note

The output opening part is center-aligned based on the central point of the main part. The width of the opening part must be equal to or less than the width of the main part. Otherwise, the result is unknown.

Height

String

No

The height of the output opening part. Valid values:

  • Custom height: You can customize the height of the output opening part. Valid values: [0,4096]. Unit: pixel.

  • -1: The height of the output opening part equals the height of the input opening part.

  • full: The height of the output opening part equals the height of the main part.

  • Default value: -1.

Note

The output opening part is center-aligned based on the central point of the main part. The height of the opening part must be equal to or less than the height of the main part. Otherwise, the result is unknown.

TailSlateList

This parameter is referenced by the Output.TailSlateList parameter.

Parameter

Type

Required

Description

TailUrl

String

Yes

The OSS path of the ending part of the video.

  • The path of an OSS object must be URL-encoded in UTF-8 before you use the path in MPS. For more information, see URL encoding.

  • Example: http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail_01.flv.

BlendDuration

String

No

The duration of the transition between the main part and the ending part. During the transition, the last frame of the main part fades out, and the first frame of the ending part fades in.

  • Unit: seconds.

  • Default value: 0.

Width

String

No

The width of the output ending part. Valid values:

  • Custom width: You can customize the width of the output ending part. Valid values: [0,4096]. Unit: pixel.

  • -1: The width of the output ending part equals the width of the input ending part.

  • full: The width of the output ending part equals the width of the main part.

  • Default value: -1.

Note

The output ending part is center-aligned based on the central point of the main part. The width of the ending part must be equal to or less than the width of the main part. Otherwise, the result is unknown.

Height

String

No

The height of the output ending part. Valid values:

  • Custom height: You can customize the height of the output ending part. Valid values: [0,4096]. Unit: pixel.

  • -1: The height of the output ending part equals the height of the input ending part.

  • full: The height of the output ending part equals the height of the main part.

  • Default value: -1.

Note

The output ending part is center-aligned based on the central point of the main part. The height of the ending part must be equal to or less than the height of the main part. Otherwise, the result is unknown.

IsMergeAudio

Boolean

No

Specifies whether to merge the audio content of the ending part. Valid values:

  • true: merges the audio content of the ending part.

  • false: does not merge the audio content of the ending part.

  • Default value: true.

BgColor

String

No

The color of the margin if the width and height of the ending part are less than those of the main part.

  • For more information about the supported colors, see the name column of bgcolor.

  • Default value: White.

Amix

This parameter is referenced by the Output.Amix parameter.

Parameter

Type

Required

Description

AmixURL

String

Yes

The audio stream to be mixed. Valid values:

  • input: mixes multiple audio streams of the input file. You can mix two audio streams of the input file.

  • OSS path: adds an external audio stream. For example, you can add background music. You can mix an audio stream of the input file with the audio stream that is specified by the OSS path. Example: http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail.flv.

Map

String

No

The serial number of the audio stream in the input file. After you specify an audio stream by using the AmixURL parameter, you must configure the Map parameter to specify an audio stream of the input file.

  • Format: 0:a:{Serial number}. The serial number specifies the subscript of the audio stream. The serial number starts from 0.

  • For example, 0:a:1 specifies the second audio stream.

MixDurMode

String

No

The mode for determining the length of the output file after mixing. Valid values:

  • first: The length of the input file is used.

  • longest: The length of the output file equals the longer of the following two lengths: the length of the input file and the length of the audio stream that is specified by the AmixURL parameter.

  • Default value: longest.

Start

String

No

The start point in time of the audio stream. Specify this parameter if you want to mix only part of the audio into the output file. The default start point in time is the beginning of the audio.

  • Format: hh:mm:ss[.SSS] or sssss[.SSS].

  • Valid values: [00:00:00.000,23:59:59.999] or [0.000,86399.999].

  • Example: 00:01:59.999 or 18000.30.

Duration

String

No

The length of the mixed audio. The length is relative to the start point in time specified by the Start parameter. Specify this parameter if you want to mix only part of the audio into the output file. By default, the length is the period from the start point in time specified by the Start parameter to the end of the audio.

  • Format: hh:mm:ss[.SSS] or sssss[.SSS].

  • Valid values: [00:00:00.000,23:59:59.999] or [0.000,86399.999].

  • Example: 00:01:59.999 or 18000.30.
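
For reference, the following illustrative Amix value mixes the first audio stream of the input file with an external background track, assuming that Amix is specified as a one-element JSON array; the OSS path reuses the example above, and the time values are examples only:

[{"AmixURL":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail.flv","Map":"0:a:0","MixDurMode":"first","Start":"00:00:10.000","Duration":"30"}]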

MuxConfig

This parameter is referenced by the Output.MuxConfig parameter.

Parameter

Type

Required

Description

Segment

String

No

The segment configuration. For more information, see the Segment section of this topic.

  • This parameter takes effect only if the container format is set to M3U8, HLS-FMP4, MPD, or CMAF.

  • Example: {"Duration":"10","ForceSegTime":"1,2,4,6,10,14,18"}, which specifies that the video is forcibly segmented at the 1st, 2nd, 4th, 6th, 10th, 14th, 18th, 20th, 30th, 40th, and 50th seconds. By default, the interval is 10 seconds.

Segment

This parameter is referenced by the Output.MuxConfig.Segment parameter.

Parameter

Type

Required

Description

Duration

Int

No

The segment duration.

  • Unit: seconds.

  • Valid values: [1,60].

  • Default value: 10, which specifies that the video is segmented at 10-second intervals, for example, at the 10th, 20th, 30th, and 40th seconds.

ForceSegTime

String

No

The points in time at which the video is forcibly segmented. Separate points in time with commas (,). You can specify up to 10 points in time.

  • Format: {Point in time},{Point in time},{Point in time}.

  • Type: decimal. This parameter supports up to three decimal places.

  • Unit: seconds.

  • Example: 1,2,4,6,10,14,18, which specifies that the video is forcibly segmented at the 1st, 2nd, 4th, 6th, 10th, 14th, and 18th seconds.

M3U8NonStandardSupport

This parameter is referenced by the Output.M3U8NonStandardSupport parameter.

Parameter

Type

Required

Description

TS

Object

No

The non-standard support for TS files. For more information, see the TS section of this topic.

TS

This parameter is referenced by the Output.M3U8NonStandardSupport.TS parameter.

Parameter

Type

Required

Description

Md5Support

Boolean

No

Specifies whether to include the MD5 value of each TS file in the output M3U8 video.

SizeSupport

Boolean

No

Specifies whether to include the size of each TS file in the output M3U8 video.
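
For reference, the following illustrative M3U8NonStandardSupport value enables both options; the values are examples only:

{"TS":{"Md5Support":true,"SizeSupport":true}}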

Encryption

This parameter is referenced by the Output.Encryption parameter.

Parameter

Type

Required

Description

Type

String

Yes

The encryption method of the video. Valid values:

  • hls-aes-128: standard encryption.

KeyType

String

Yes

The method in which the key is encrypted. Valid values:

  • Base64: basic encryption method.

  • KMS: Key Management Service (KMS). KMS is used to generate plaintext and ciphertext keys.

Key

String

Yes

The ciphertext key that is used to encrypt the video. Specify this parameter based on the value of the KeyType parameter. Valid values:

  • Base64:

    • Encrypt the plaintext key by using Base64 and set this parameter to the ciphertext key that is generated.

    • The plaintext key is customized and can be up to 16 characters in length.

    • For example, the ciphertext key that corresponds to the plaintext key "encryptionkey128" is "ZW5jcnlwdGlvbmtleTEyOA==". For a sketch of how this value is produced, see the example after this parameter list.

  • KMS:

    • Call the GenerateKMSDataKey operation, pass in the master key, and then set the KeySpec parameter to AES_128 to obtain the corresponding ciphertext key from the CiphertextBlob parameter.

Note

Alibaba Cloud provides the master key. To obtain the master key, submit a ticket to contact us.

KeyUri

String

Yes

The URL of the key. You must construct the URL.

  • The URL cannot be passed to MPS in plaintext. You must use Base64 to encrypt the URL.

  • For example, if the URL is http://aliyun.com/document/hls128.key, use Base64 to encrypt the key into aHR0cDovL2FsaXl1bi5jb20vZG9jdW1lbnQvaGxzMTI4LmtleQ==.

SkipCnt

String

No

The number of clips that are not encrypted at the beginning of the video. This ensures a shorter loading time during startup.

  • Example: 3.
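
The following Python sketch shows how the Base64 values in the preceding examples can be produced and assembled into an Encryption value. This is a minimal illustration rather than an official SDK call; the plaintext key, key URL, and SkipCnt value are taken from the examples in this topic.

import base64

# Plaintext key: customized, up to 16 characters in length.
plaintext_key = "encryptionkey128"
key = base64.b64encode(plaintext_key.encode("utf-8")).decode("utf-8")
# key == "ZW5jcnlwdGlvbmtleTEyOA=="

# The key URL must also be Base64-encoded before it is passed to MPS.
key_uri = base64.b64encode(b"http://aliyun.com/document/hls128.key").decode("utf-8")
# key_uri == "aHR0cDovL2FsaXl1bi5jb20vZG9jdW1lbnQvaGxzMTI4LmtleQ=="

encryption = {
    "Type": "hls-aes-128",
    "KeyType": "Base64",
    "Key": key,
    "KeyUri": key_uri,
    "SkipCnt": "3",
}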

Placeholder replacement rules

The following placeholders can be used in file paths.

For example, the path of the input file is a/b/example.flv, and you want to set the path of the output file to a/b/c/example+test.mp4. In this case, you can use the {ObjectPrefix} and {FileName} placeholders and specify the path as {ObjectPrefix}/c/{FileName}+test.mp4. After URL encoding, the path is displayed as %7BObjectPrefix%7D/c/%7BFileName%7D%2Btest.mp4.
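
The following Python sketch reproduces this URL encoding; it is a minimal example that assumes the same placeholder path as above.

from urllib.parse import quote

# Percent-encode the placeholders and the plus sign; keep the slashes.
output_object = quote("{ObjectPrefix}/c/{FileName}+test.mp4", safe="/")
# output_object == "%7BObjectPrefix%7D/c/%7BFileName%7D%2Btest.mp4"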

Placeholder description

Transcoding output file

Input subtitle file

Output snapshot file

Placeholder

Description

Perform transcoding by using a workflow

Submit a transcoding job

Subtitle

Capture a snapshot by using a workflow

Submit a snapshot job

{ObjectPrefix}

The prefix of the input file.

Supported

Supported

Supported

Supported

Supported

{FileName}

The name of the input file.

Supported

Supported

Supported

Supported

Supported

{ExtName}

The file name extension of the input file.

Supported

Supported

Supported

Supported

Supported

{DestMd5}

The MD5 value of the output file.

Supported

Supported

Not supported

Not supported

Not supported

{DestAvgBitrate}

The average bitrate of the output file.

Supported

Supported

Not supported

Not supported

Not supported

{SnapshotTime}

The point in time of the snapshot.

Not supported

Not supported

Not supported

Supported

Supported

{Count}

The serial number of a snapshot in multiple snapshots that are captured at a time.

Not supported

Not supported

Not supported

Supported

Supported

{RunId}

The ID of the execution instance of the workflow.

Supported

Not supported

Not supported

Not supported

Not supported

{MediaId}

The ID of the media file in the workflow.

Supported

Not supported

Not supported

Not supported

Not supported

SnapshotConfig

This parameter is referenced by the SubmitSnapshotJob operation.

Important

You can specify whether to capture snapshots in synchronous or asynchronous mode. If you use the asynchronous mode, the snapshot job is submitted to and scheduled in an MPS queue. In this case, the snapshot job may be queued, and the snapshot may not be generated when the response to the SubmitSnapshotJob operation is returned. After you submit a snapshot job, call the QuerySnapshotJobList operation to query the result of the snapshot job. Alternatively, you can configure Message Service (MNS) callbacks for the MPS queue to obtain the results. For more information, see Notifications and monitoring. If you specify the Interval or Num parameter, the asynchronous mode is used by default.

Parameter

Type

Required

Description

Num

String

No

The number of snapshots to be captured.

  • If you specify the Interval or Num parameter, the asynchronous mode is used by default. The value of the Num parameter must be greater than 0.

  • If you leave the Num and Interval parameters empty and specify the Time parameter, the system synchronously captures one snapshot at the time that you specified.

  • If you set the Num parameter to 1 and specify the Time parameter, the system asynchronously captures one snapshot at the time that you specified.

  • If you set the Num parameter to a value greater than 1, the system starts to asynchronously capture snapshots at the time that you specified for the Time parameter and stops when the number of captured snapshots reaches the value of the Num parameter. The snapshots are captured at the interval that you specified for the Interval parameter. If you do not specify the Interval parameter, the snapshots are captured every 10 seconds. If the result of Time + Interval × Num is greater than the length of the input video, only the snapshots whose points in time fall within the video length are generated, and the actual number of snapshots is returned after the snapshots are generated. For a worked example, see the sketch after this parameter list.

  • If you set the Num parameter to a value greater than 1 and the Interval parameter to 0, the system starts to asynchronously capture snapshots at the point in time that you specified for the Time parameter. The number of snapshots is determined by the Num parameter and the snapshots are evenly captured within the length of the input video.

Time

String

Yes

The time at which the system starts to capture snapshots in the input video.

  • If the value of the Time parameter exceeds the video length, the snapshots fail to be generated.

  • Unit: millisecond.

Interval

String

No

The interval at which snapshots are captured.

  • If you specify this parameter, the asynchronous mode is automatically used to capture snapshots.

  • Set the Interval parameter to a value greater than 0 if you want to asynchronously capture multiple snapshots. Unit: seconds.

  • Set the Interval parameter to 0 if you want to evenly capture snapshots within the length of the input video.

  • Default value: 10. If you specify the Num parameter but leave the Interval parameter empty, the default value is used.

FrameType

String

No

The snapshot type. Valid values:

  • normal: normal frames

  • intra: keyframes

  • Default value: intra.

Note
  • The image quality of normal frames is lower than that of keyframes. In addition, it takes more time to capture normal frames than to capture keyframes. However, a normal frame can be captured at a specific point in time.

  • Keyframes of videos have high image quality and can be quickly captured because keyframes are independently decoded. However, keyframes appear in a video at intervals and cannot be captured at specific points in time. If the point in time that you specify is not accurate, the system captures the keyframe that is nearest to the point in time that you specify. In this case, if the distance between two keyframes is greater than the interval at which snapshots are captured, the number of generated snapshots may be less than the number of snapshots that you want to capture.

Width

String

No

The width of snapshots.

  • Unit: pixel.

  • Valid values: [8,4096]. We recommend that you specify an even number.

  • Default value:

    • By default, if you do not specify a width or height, the width of the input video is used.

    • If only the height is specified, the width of snapshots is calculated based on the aspect ratio of the input video.

Height

String

No

The height of snapshots.

  • Unit: pixel.

  • Valid values: [8,4096]. We recommend that you specify an even number.

  • Default value:

    • By default, if you do not specify a width or height, the height of the input video is used.

    • If only the width is specified, the height of snapshots is calculated based on the aspect ratio of the input video.

BlackLevel

String

No

The upper limit of black pixels in a snapshot. If the black pixels in a snapshot exceed this value, the system determines that the image is a black screen. For more information about black pixels, see the description of the PixelBlackThreshold parameter.

This parameter takes effect only in the following scenarios:

  • If you set the Time parameter to 0, this parameter takes effect and black screens are identified. If you set the Time parameter to a value greater than 0, black screens cannot be identified.

  • If you set the Time parameter to 0 and the Num parameter to 1 or leave the Num parameter empty, the first 5 seconds of the video are checked. If a normal video image exists, the image is captured. Otherwise, the snapshot fails to be generated.

  • If you set the Time parameter to 0 and the Num parameter to a value greater than 1, this parameter takes effect. The first 5 seconds of the video are checked. If a normal video image exists, the image is captured. If the first 5 seconds contain only black screens, the first frame is captured.

Value description:

  • Valid values: [30,100].

  • Default value: 100.

  • If you want to identify pure black screens, set this parameter to 100.

  • For example, if you set the Time parameter to 0 and the Num parameter to 10, pure black screens are filtered.

PixelBlackThreshold

String

No

The color value threshold for pixels. If the color value of a pixel is less than the threshold, the system determines that the pixel is a black pixel.

  • Valid values: [0,255]. A value of 0 specifies a pure black pixel, and a value of 255 specifies a pure white pixel.

  • If you want to improve the filtering of black screens, specify a higher value for this parameter. We recommend that you set this parameter to 30 and adjust the value based on your business requirements.

  • For example, if you set this parameter to 100, the pixels whose color values are lower than 100 are considered black pixels.

Format

String

No

The format of the output file.

  • If you set this parameter to vtt, the output file is in the WebVTT format. You must specify the SubOut parameter to determine whether to generate WebVTT files.

  • By default, this parameter is left empty. In this case, the output file is generated in the JPG format.

SubOut

Object

No

The configurations of the WebVTT file. For more information, see the SubOut Webvtt section of this topic.

  • The parameter is required if the Format parameter is set to vtt.

TileOut

Object

No

The image sprite configurations. For more information, see the TileOut section of this topic.

  • After you set this parameter, the generated snapshots are combined into an image sprite. The TileOutputFile parameter specifies the output image sprite.

  • If you leave this parameter empty, no image sprites are generated.

OutputFile

Object

Yes

The original snapshots. You must specify the storage path of the objects in OSS. For more information, see the OutputFile section of this topic.

  • The snapshot files are in the JPG format.

  • Example: {"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"example.jpg"}.

TileOutputFile

Object

No

The output image sprite. You must specify the storage path of the object in OSS. The value of this parameter is similar to that of the OutputFile parameter.

  • This parameter is required if you specify the TileOut parameter to generate an image sprite.

  • The image sprite is in the JPG format.

  • Example: {"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"example.jpg"}.

Note
  • If you set the Num parameter to a value greater than 1, the placeholder {TileCount} must be included in the object name. The placeholder must be URL-encoded as %7BTileCount%7D before it is used in MPS. The encoded placeholder differentiates the storage paths of the output files. For example, if you capture three snapshots, the output files are named 00001.jpg, 00002.jpg, and 00003.jpg.

  • If you want to store the original snapshots and the image sprite, specify different storage paths for the original snapshots and the image sprite to prevent your files from being overwritten.
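
The following Python sketch puts the preceding parameters together and estimates how many snapshots are actually generated when Time + Interval × Num exceeds the video length. It is an illustration only: the bucket name, object path, and video length are assumed values, and the {Count} placeholder in the object path is shown in its URL-encoded form, as required above.

import math

# Assumed values for illustration.
video_length_s = 68              # length of the input video, in seconds
snapshot_config = {
    "Time": "5000",              # start at 5 seconds (unit: milliseconds)
    "Num": "10",                 # request 10 snapshots
    "Interval": "10",            # capture a snapshot every 10 seconds
    "FrameType": "intra",
    "OutputFile": {
        "Bucket": "example-bucket",
        "Location": "oss-cn-hangzhou",
        "Object": "snapshots/%7BCount%7D.jpg",  # URL-encoded {Count}
    },
}

start_s = int(snapshot_config["Time"]) / 1000
interval_s = int(snapshot_config["Interval"])
requested = int(snapshot_config["Num"])

# Only the points in time that fall within the video length produce snapshots.
possible = math.floor((video_length_s - start_s) / interval_s) + 1
actual = min(requested, max(possible, 0))
# actual == 7 for the assumed values above: snapshots at 5 s, 15 s, ..., 65 s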

SubOut Webvtt

This parameter is referenced by the SnapshotConfig.SubOut parameter.

Parameter

Type

Required

Description

IsSptFrag

String

No

Specifies whether to generate WebVTT index files. Valid values:

  • true: generates WebVTT index files. The WebVTT index files are stored in the same path as the snapshots.

  • false: does not generate WebVTT index files. Only snapshots are exported.

  • Default value: false.

TileOut

This parameter is referenced by the SnapshotConfig.TileOut parameter.

Parameter

Type

Required

Description

Lines

Int

No

The number of rows that the tiled snapshot contains.

  • Valid values: (0,10000].

  • Default value: 10.

Columns

Int

No

The number of columns that the tiled snapshot contains.

  • Valid values: (0,10000].

  • Default value: 10.

CellWidth

String

No

The width of a single snapshot before tiling.

  • Unit: pixel.

  • Default value: the width of the original snapshot.

CellHeight

String

No

The height of a single snapshot before tiling.

  • Unit: pixel.

  • Default value: the height of the original snapshot.

Padding

String

No

The distance between two snapshots.

  • Unit: pixel.

  • Default value: 0.

Margin

String

No

The margin width of the tiled snapshot.

  • Default value: 0.

  • Unit: pixel.

Color

String

No

The background color that is used to fill the margins, the padding between snapshots, and the area in which no snapshots are displayed.

  • You can specify a color keyword or a hexadecimal RGB value for this parameter. For example, to set the background color to black, specify Black, black, or #000000.

  • Default value: black.

IsKeepCellPic

String

No

Specifies whether to store the original snapshots. Valid values:

  • true: stores the original snapshots. OutputFile specifies the information about the storage of original snapshots.

  • false: does not store the original snapshots.

  • Default value: false.
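
For reference, the following illustrative TileOut value combines the preceding parameters; all cell size and layout values are examples only:

{"Lines":5,"Columns":2,"CellWidth":"120","CellHeight":"68","Padding":"2","Margin":"4","Color":"black","IsKeepCellPic":"true"}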

OutputFile

Parameter

Type

Required

Description

Bucket

String

Yes

The OSS bucket in which the original snapshots are stored.

  • For more information about the term bucket, see Terms.

Location

String

Yes

The region in which the OSS bucket resides.

  • The OSS bucket must reside in the same region as MPS.

  • For more information about the term region, see Terms.

Object

String

Yes

The path in which the output snapshots are stored in OSS.

  • The path includes the file name and file name extension. For more information about the term ObjectKey, see Terms.

  • Placeholders are supported. For more information, see the Placeholder replacement rules section of this topic.

  • The output file must be in the JPG format.

  • The path of an OSS object must be URL-encoded in UTF-8 before you use the path in MPS. For more information, see URL encoding.

Note
  • If you set the Num parameter to a value greater than 1, the placeholder {Count} must be included in the object name. The placeholder must be URL-encoded as %7BCount%7D before it is used in MPS. The encoded placeholder differentiates the storage paths of the output files. For example, if you capture three snapshots, the output files are named 00001.jpg, 00002.jpg, and 00003.jpg.

  • If you want to store the original snapshots and the image sprite, specify different storage paths for the original snapshots and the image sprite to prevent your files from being overwritten.
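
For reference, the following illustrative OutputFile value shows the {Count} placeholder in its URL-encoded form; the bucket name and path are placeholders:

{"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"snapshots/%7BCount%7D.jpg"}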

NotifyConfig

This parameter is referenced by the AddPipeline and UpdatePipeline operations.

Parameter

Type

Required

Description

QueueName

String

No

The MNS queue in which you want to receive notifications. After the job is complete in the MPS queue, the job results are pushed to the MNS queue. For more information about receiving notifications, see Receive notifications.

  • You can specify one of the QueueName and Topic parameters.

  • You must specify an MNS queue for this parameter. If no queue exists, create a queue in the MNS console.

Topic

String

No

The MNS topic in which you want to receive notifications. After the job is complete, the job results are pushed to the MNS topic. Then, the MNS topic pushes the message to multiple queues or URLs that subscribe to the topic. For more information about receiving notifications, see Receive notifications.

  • You can specify one of the QueueName and Topic parameters.

  • You must specify an MNS topic for this parameter. If no topic exists, create a topic in the MNS console.
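
For reference, the following illustrative NotifyConfig value uses an MNS queue; the queue name is a hypothetical example, and you would specify the Topic parameter instead if you use an MNS topic:

{"QueueName":"example-mps-queue"}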

Parameters related to transcoding input files

Parameter

Type

Required

Description

Bucket

String

Yes

The OSS bucket that stores the input file.

  • You must grant the read permissions on the OSS bucket to MPS on the Access Control page in the OSS console.

  • For more information about the term bucket, see Terms.

Location

String

Yes

The region in which the OSS bucket resides.

For more information about the term region, see Terms.

Object

String

Yes

The OSS object that is used as the input file.

  • The path of an OSS object must comply with RFC 2396 and be URL-encoded in UTF-8. For more information, see URL encoding.

  • For more information about the term object, see Terms.

Audio

String

No

The audio configuration of the input file. The value must be a JSON object.

Note

This parameter is required if the input file is in the ADPCM or PCM format.

  • For more information, see the InputAudio section of this topic.

  • Example: {"Channels":"2","Samplerate":"44100"}.

Container

String

No

The container configuration of the input file. The value must be a JSON object.

Note

This parameter is required if the input file is in the ADPCM or PCM format.

  • For more information, see the InputContainer section of this topic.

  • Example: {"Format":"u8"}.

InputContainer

Parameter

Type

Required

Description

Format

String

Yes

The audio format of the input file.

Valid values: alaw, f32be, f32le, f64be, f64le, mulaw, s16be, s16le, s24be, s24le, s32be, s32le, s8, u16be, u16le, u24be, u24le, u32be, u32le, and u8.

InputAudio

Parameter

Type

Required

Description

Channels

String

Yes

The number of sound channels in the input file. Valid values: [1,8].

Samplerate

String

Yes

The audio sample rate of the input file.

  • Valid values: (0,320000].

  • Unit: Hz.
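
For reference, the following illustrative input configuration combines the preceding parameters for a raw PCM file. The bucket name and object path are placeholders, and the Audio and Container values reuse the examples in this topic; they are shown as nested objects for readability, although the API passes them as JSON strings:

{"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"example.pcm","Container":{"Format":"u8"},"Audio":{"Channels":"2","Samplerate":"44100"}}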

AnalysisConfig

Parameter

Type

Required

Description

QualityControl

String

No

The configuration of the output file quality. The value must be a JSON object. For more information, see the AnalysisConfig section of this topic.

PropertiesControl

String

No

The property configuration. The value must be a JSON object. For more information, see the PropertiesControl section of this topic.

QualityControl

Parameter

Type

Required

Description

RateQuality

String

No

The quality level of the output file.

  • Valid values: (0,51).

  • The value must be an integer.

  • Default value: 25.

MethodStreaming

String

No

The playback mode. Valid values: network and local.

Default value: network.

PropertiesControl

Parameter

Type

Required

Description

Deinterlace

String

No

Specifies whether to forcibly run deinterlacing. Valid values:

  • Auto: automatically runs deinterlacing.

  • Force: forcibly runs deinterlacing.

  • None: forbids deinterlacing.

Crop

String

No

The cropping configuration of the video image.

  • By default, automatic cropping is performed.

  • If you set this parameter to a value other than an empty JSON object, the Mode parameter is required.

  • For more information, see the Crop section of this topic.

Crop

Parameter

Type

Required

Description

Mode

String

No

This parameter is required if the value of the Crop parameter is not an empty JSON object. Valid values:

  • Auto: automatically runs cropping.

  • Force: forcibly runs cropping.

  • None: forbids cropping.

Width

Integer

No

The width of the video image obtained after the margins are cropped out.

  • Valid values: [8,4096].

  • If you set the Mode parameter to Auto or None, the setting of this parameter is invalid.

Height

Integer

No

The height of the video image obtained after the margins are cropped out.

  • Valid values: [8,4096].

  • If you set the Mode parameter to Auto or None, the setting of this parameter is invalid.

Top

Integer

No

The top margin to be cropped out.

  • Valid values: [8,4096].

  • If you set the Mode parameter to Auto or None, the setting of this parameter is invalid.

Left

Integer

No

The left margin to be cropped out.

  • Valid values: [8,4096].

  • If you set the Mode parameter to Auto or None, the setting of this parameter is invalid.
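
For reference, the following illustrative AnalysisConfig value combines the preceding QualityControl, PropertiesControl, and Crop parameters; all values are examples only, and the nested objects are shown expanded for readability:

{"QualityControl":{"RateQuality":"25","MethodStreaming":"network"},"PropertiesControl":{"Deinterlace":"Auto","Crop":{"Mode":"Auto"}}}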

TransFeatures

Parameter

Type

Required

Description

MergeList

String

No

The URLs of the clips to be merged.

  • The value must be a JSON array that contains up to four MergeURL parameters. For more information, see the MergeList section of this topic.

  • Example: [{"MergeURL":"http://example-bucket-****.oss-cn-hangzhou.aliyuncs.com/k/mp4.mp4"},{"MergeURL":"http://example-bucket-****.oss-cn-hangzhou.aliyuncs.com/c/ts.ts","Start":"1:14","Duration":"29"}].

Parameters related to the output in the SubmitJobs operation

Parameter

Type

Required

Description

URL

String

No

The OSS path of the output file.

  • Example: http://example-bucket-****.oss-cn-hangzhou.aliyuncs.com/example.flv.

  • If you do not specify this parameter, the Bucket, Location, and Object parameters are required.

Bucket

String

No

The OSS bucket that stores the output file.

  • This parameter is required if you do not specify the URL parameter. Otherwise, the setting of this parameter is invalid.

  • Before you specify an OSS bucket, grant the write permissions on the OSS bucket to MPS on the Access Control page in the OSS console.

  • For more information about the term bucket, see Terms.

Location

String

No

The region in which the OSS bucket that stores the output file resides.

  • This parameter is required if you do not specify the URL parameter. Otherwise, the setting of this parameter is invalid.

  • For more information about the term region, see Terms.

Object

String

No

The name of the OSS object to be used as the output file.

  • This parameter is required if you do not specify the URL parameter. Otherwise, the setting of this parameter is invalid.

  • The parameter value must comply with RFC 2396 and be URL-encoded in UTF-8. For more information, see URL encoding.

  • For more information about the term object, see Terms.
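
For reference, either of the following illustrative values specifies the same output location; the bucket name and object path are examples only:

{"URL":"http://example-bucket-****.oss-cn-hangzhou.aliyuncs.com/example.flv"}

{"Bucket":"example-bucket-****","Location":"oss-cn-hangzhou","Object":"example.flv"}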

MultiBitrateVideoStream

Parameter

Type

Required

Description

URI

String

No

The name of the output video stream, which must end with .m3u8. Example: a/b/test.m3u8. Format: ^[a-z]{1}[a-z0-9./-]+$.

RefActivityName

String

Yes

The name of the associated activity.

ExtXStreamInfo

Json

Yes

The information about the stream. Example: {"BandWidth": "111110","Audio": "auds","Subtitles": "subs"}.
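
For reference, the following illustrative MultiBitrateVideoStream value combines the preceding parameters; the URI and activity name reuse the examples in this topic:

{"URI":"a/b/test.m3u8","RefActivityName":"video-1","ExtXStreamInfo":{"BandWidth":"111110","Audio":"auds","Subtitles":"subs"}}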

ExtXMedia

Parameter

Type

Required

Description

Name

String

Yes

The name of the resource. The name can be up to 64 bytes in length and must be encoded in UTF-8. This parameter corresponds to NAME in the HTTP Live Streaming (HLS) V5 protocol.

Language

String

No

The language of the resource, which must comply with RFC 5646. This parameter corresponds to LANGUAGE in the HLS V5 protocol.

URI

String

Yes

The path of the resource.

Format: ^[a-z]{1}[a-z0-9./-]+$. Example: a/b/c/d/audio-1.m3u8.
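
For reference, the following illustrative ExtXMedia value uses the URI example above; the name and language are hypothetical examples:

{"Name":"audio-1","Language":"en-US","URI":"a/b/c/d/audio-1.m3u8"}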

MasterPlayList

Parameter

Type

Required

Description

MultiBitrateVideoStreams

JsonArray

Yes

The array of multiple streams. Example: [{"RefActivityName": "video-1","ExtXStreamInfo": {"BandWidth": "111110","Audio":"auds","Subtitles": "subs"}}].

ExtXStreamInfo

Parameter

Type

Required

Description

BandWidth

String

Yes

The bandwidth. This parameter specifies the upper limit of the total bitrate and corresponds to BANDWIDTH in the HLS V5 protocol.

Audio

String

No

The ID of the audio stream group. This parameter corresponds to AUDIO in the HLS V5 protocol.

Subtitles

String

No

The ID of the subtitle stream group. This parameter corresponds to SUBTITLES in the HLS V5 protocol.

AdaptationSet

Parameter

Type

Required

Description

Group

String

Yes

The name of the group. Example:

<AdaptationSet group="videostreams" mimeType="video/mp4" par="4096:1744"
              minBandwidth="258157" maxBandwidth="10285391" minWidth="426" maxWidth="4096"
              minHeight="180" maxHeight="1744" segmentAlignment="true"
              startWithSAP="1">

Lang

String

No

The language of the resource. You can specify this parameter for audio and subtitle resources.

Representation

Parameter

Type

Required

Description

Id

String

Yes

The ID of the stream. Example:

<Representation id="240p250kbps" frameRate="24" bandwidth="258157"
              codecs="avc1.4d400d" width="426" height="180">

URI

String

Yes

The path of the resource. Format: ^[a-z]{1}[a-z0-9./-]+$. Example: a/b/c/d/video-1.mpd.

InputConfig

Parameter

Type

Required

Description

Format

String

Yes

The format of the input subtitle file. Valid values: stl, ttml, and vtt.

InputFile

String

Yes

{"Bucket":"example-bucket-****","Location":"oss-cn-hangzhou","Object":"example-logo****.png"}
              or
              {"URL":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/subtitle/test****.chs.vtt"}