This topic describes the parameters that you can specify when you submit transcoding jobs or configure media workflows.

Input

The parameters described in this section are used when you call SubmitJobs.

Bucket (String, required): The Object Storage Service (OSS) bucket that stores the input file.
Location (String, required): The region in which the OSS bucket resides.
  • The region in which the OSS bucket resides and the region where ApsaraVideo Media Processing (MPS) is activated must be the same.
  • For more information about regions, see Terms.
Object (String, required): The path in which the input file is stored in OSS. The path contains the file name.
  • For more information about the OSS path, see Terms.
  • Before you call an operation, make sure that the URLs of objects that you use are encoded in UTF-8. For more information, see URL encoding.
  • For example, Alibaba Cloud/mts HD+.mp4 must be encoded to %E9%98%BF%E9%87%8C%E4%BA%91/mts%20HD%2B.mp4.
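As a sketch of the encoding step above, the original object path "阿里云/mts HD+.mp4" (the Chinese form of "Alibaba Cloud/mts HD+.mp4") can be percent-encoded with Python's standard library; any UTF-8-capable URL-encoding routine works the same way:

```python
from urllib.parse import quote

# The raw object path from the example above.
path = "阿里云/mts HD+.mp4"

# quote() percent-encodes the UTF-8 bytes of each character but,
# by default, leaves "/" unencoded so the path structure survives.
encoded = quote(path)
print(encoded)  # %E9%98%BF%E9%87%8C%E4%BA%91/mts%20HD%2B.mp4
```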
Referer (String, optional): The Referer. This parameter is required if you configure hotlink protection in OSS to allow only domain names that are included in the Referer whitelist to download media files. Otherwise, this parameter is not required. For more information about hotlink protection, see Hotlink protection.
  • If you use a workflow for transcoding, you must specify this parameter in the MPS console. For more information, see the "Step 3: (Optional) Configure hotlink protection in MPS" section of the Add media buckets topic.
  • If you call an API operation to submit a transcoding job, you must specify this parameter in the request.

Output

The parameters described in this section are used when you call SubmitJobs, AddMediaWorkflow, and UpdateMediaWorkflow.

OutputObject (String, required): The path in which the output file is stored in OSS. The path contains the file name and file name extension.
  • For more information about the OSS path, see Terms.
  • Placeholders are supported. For more information, see Placeholder replacement rules.
  • The following rules apply to file name extensions:
    • If you use a workflow for transcoding, you do not need to specify a file name extension. In this case, MPS automatically adds a file name extension based on the container format that is defined in the transcoding template.
    • If you call an API operation to submit a transcoding job, you must specify a file name extension that matches the container format that is defined in the transcoding template. If the container format is M3U8, MPS automatically adds .m3u8 to the end of the file name in the playlist. If a file has multiple segments, MPS automatically adds a hyphen (-) and a 5-digit serial number starting from 00001 to the end of segment file names in the playlist. The file name extension of segment files is .ts. For example, if the name of the file in the playlist is filename.m3u8, the first segment file is named filename-00001.ts.
  • Before you call an operation, make sure that the URLs of objects that you use are encoded in UTF-8. For more information, see URL encoding.
  • For example, the path of the input file is a/b/example.flv. If you want the path of the output file to be a/b/c/example+test.mp4, you can use placeholders to specify the path of the output file: {ObjectPrefix}/c/{FileName}+test.mp4. Then, encode the path in UTF-8: %7BObjectPrefix%7D/c/%7BFileName%7D%2Btest.mp4. Specify the encoded path for this parameter.
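The placeholder example above can be reproduced with the same standard-library call, since the braces of `{ObjectPrefix}` and `{FileName}` must be percent-encoded along with the plus sign:

```python
from urllib.parse import quote

# Placeholders such as {ObjectPrefix} and {FileName} are replaced by MPS,
# so the braces themselves are percent-encoded together with "+".
output_object = quote("{ObjectPrefix}/c/{FileName}+test.mp4")
print(output_object)  # %7BObjectPrefix%7D/c/%7BFileName%7D%2Btest.mp4
```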
TemplateId (String, required): The ID of the transcoding template.
Container (Object, optional): The container format. For more information, see Container.
  • If you specify this parameter, the container format that is defined in the transcoding template is overwritten.
Video (Object, optional): The transcoding settings for the video. For more information, see Video.
  • If you specify this parameter, the video transcoding settings that are defined in the transcoding template are overwritten.
Audio (Object, optional): The transcoding settings for the audio. For more information, see Audio.
  • If you specify this parameter, the audio transcoding settings that are defined in the transcoding template are overwritten.
TransConfig (Object, optional): The transcoding configurations. For more information, see TransConfig.
  • If you specify this parameter, the transcoding configurations that are defined in the transcoding template are overwritten.
  • Example: {"TransMode":"onepass","AdjDarMethod":"none","IsCheckVideoBitrateFail":"true","IsCheckAudioBitrateFail":"true"}.
VideoStreamMap (String, optional): The identifier of the video stream that you want to transcode.
  • If you leave this parameter empty, MPS selects a video stream for transcoding.
  • If you want to specify a video stream for transcoding, set this parameter to a value in the 0:v:{sequence number} format. The sequence number specifies the location of a video stream in the video stream list. The sequence number starts from 0. For example, if you specify 0:v:1 for this parameter, the second video stream is used for transcoding.
  • If you want to transcode all video streams, set this parameter to 0:v.
AudioStreamMap (String, optional): The identifier of the audio stream that you want to transcode.
  • If you leave this parameter empty, MPS selects an audio stream for transcoding. In most cases, MPS selects a high-quality audio stream in Chinese that has multiple sound channels.
  • If you want to specify an audio stream for transcoding, set this parameter to a value in the 0:a:{sequence number} format. The sequence number specifies the location of an audio stream in the audio stream list. The sequence number starts from 0. For example, if you specify 0:a:1 for this parameter, the second audio stream is used for transcoding.
  • If you want to transcode all audio streams, set this parameter to 0:a. This is suitable for multi-language dubbing scenarios.
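The stream-map formats above can be sketched with a small helper. The function name is hypothetical (it is not part of any SDK); it only formats the `0:v:{sequence number}` / `0:a:{sequence number}` strings described in the two entries above:

```python
from typing import Optional

def stream_map(kind: str, index: Optional[int] = None) -> str:
    """Build a VideoStreamMap or AudioStreamMap value.

    kind is "v" for video or "a" for audio. index is the zero-based
    position of the stream, or None to select all streams of that kind.
    """
    if kind not in ("v", "a"):
        raise ValueError("kind must be 'v' or 'a'")
    return f"0:{kind}" if index is None else f"0:{kind}:{index}"

print(stream_map("v", 1))  # 0:v:1 -> the second video stream
print(stream_map("a"))     # 0:a   -> all audio streams
```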
Rotate (String, optional): The rotation angle of the video in the clockwise direction.
  • Valid values: 0, 90, 180, and 270.
  • Default value: 0. 0 specifies that the video is not rotated.
WaterMarks (Object[], optional): The watermarks that are imposed on the video. Watermarks can be images or text. For more information, see WaterMarks.
  • You can specify a maximum of four watermarks for a transcoding task.
  • Example of an image watermark: [{"WaterMarkTemplateId":"88c6ca184c0e47098a5b665e2a12****","InputFile":{"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"example-logo.png"},"Timeline":{"Start":"0","Duration":"ToEND"}}].
  • Example of a text watermark: [{"Type":"Text","TextWaterMark":{"Content":"5rWL6K+V5paH5a2X5rC05Y2w","FontName":"SimSun","FontSize":"16","Top":2,"Left":10}}].
DeWatermark (Object, optional): The blurring configurations. For more information, see DeWatermark.
  • Example: {"0": [{"l":10,"t":10,"w":10,"h":10},{"l":100,"t":0.1,"w":10,"h":10}],"128000": [],"250000": [{"l":0.2,"t":0.1,"w":0.01,"h":0.05}]}.
SubtitleConfig (Object, optional): The configuration of embedded subtitles. For more information, see SubtitleConfig.
  • You can add a maximum of four subtitle files for a transcoding task.
  • Example: {"ExtSubtitleList":[{"Input":{"Bucket":"example-bucket-****","Location":"oss-cn-hangzhou","Object":"example.srt"},"CharEnc":"UTF-8"}]}.
Clip (Object, optional): The video clip configurations. For more information, see Clip.
  • For example, {"TimeSpan":{"Seek":"00:01:59.999","End":"18000.30"},"ConfigToClipFirstPart":false} specifies that the video is merged and then clipped. The output clip starts from 00:01:59.999 of the original clip and ends at the duration that is specified by End before the end of the original clip.
MergeList (Object[], optional): The clips that you want to merge with the input file. You can merge the input file with clips to create a video. For more information, see MergeList.
  • You can specify either MergeList or MergeConfigUrl. If you specify both parameters, MergeConfigUrl takes precedence over MergeList.
  • You can add a maximum of four clip URLs for a transcoding task. If you want to merge more clips with the input file, specify MergeConfigUrl.
  • Example of merging the input file with one video clip: [{"MergeURL":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail_comm_01.mp4"}].
  • Example of merging the input file with two video clips: [{"MergeURL":"http://exampleBucket****m.oss-cn-hangzhou.aliyuncs.com/tail_comm_01.mp4","Start":"1","Duration":"20"},{"MergeURL":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail_comm_02.mp4","Start":"5.4","Duration":"10.2"}].
MergeConfigUrl (String, optional): The OSS URL of the configuration file for merging.
  • You can specify either MergeList or MergeConfigUrl. If you specify both parameters, MergeConfigUrl takes precedence over MergeList.
  • The configuration file can be stored only in OSS. Example: http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/mergeConfigfile.
  • You must specify the URLs of the video clips that you want to merge with the input file in sequence in the configuration file. You can specify a maximum of 50 video clip URLs. For more information about the parameter format, see MergeList. Example of the configuration file content: {"MergeList":[{"MergeURL":"http://exampleBucket****m.oss-cn-hangzhou.aliyuncs.com/tail_comm_01.mp4","Start":"1","Duration":"20"},{"MergeURL":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail_comm_02.mp4","Start":"5.4","Duration":"10.2"}]}.
OpeningList (Object[], optional): The list of opening scenes. Opening scenes are played at the beginning of a video in picture-in-picture (PiP) mode. This is a special merging effect. For more information, see Opening.
  • You can add a maximum of two opening scenes in sequence for a transcoding task.
  • Example: [{"OpenUrl":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/opening_01.flv","Start":"1","Width":"1920","Height":"1080"},{"OpenUrl":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/opening_02.flv","Start":"1","Width":"-1","Height":"full"}].
TailSlateList (Object[], optional): The list of ending scenes. Ending scenes fade in and out at the end of a video. This is a special merging effect. For more information, see TailSlate.
  • You can add a maximum of two ending scenes in sequence for a transcoding task.
  • Example: [{"TailUrl":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail_01.flv","Start":"1","BlendDuration":"2","Width":"1920","Height":"1080","IsMergeAudio":false,"BgColor":"White"}].
Amix (Object, optional): The audio mixing configurations. You can use audio mixing to merge multiple audio tracks of a video and add background music to the video. For more information, see Amix.
  • Example of mixing two audio streams of the input file: {"AmixURL":"input","MixDurMode":"longest","Start":"1","Duration":"2"}.
  • Example of mixing external audio with the audio of the input file: {"AmixURL":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail.flv","Map":"0:a:1","MixDurMode":"longest","Start":"1","Duration":"2"}.
MuxConfig (Object, optional): The packaging configurations. For more information, see MuxConfig.
  • If you specify this parameter, the packaging configurations that are defined in the transcoding template are overwritten.
  • For example, if you specify {"Segment":{"Duration":"10","ForceSegTime":"1,2,4,6,10,14,18"}}, the video is forcefully segmented at the 1st, 2nd, 4th, 6th, 10th, 14th, and 18th seconds.
M3U8NonStandardSupport (Object, optional): Specifies whether the non-standard M3U8 format is supported. For more information, see M3U8NonStandardSupport.
  • For example, if you set this parameter to {"TS":{"Md5Support":true,"SizeSupport":true}}, the MD5 hash value and the size of the TS file are returned.
Encryption (String, optional): The encryption configurations. This parameter takes effect only when Container is set to m3u8. For more information, see Encryption.
  • Example: {"Type":"hls-aes-128","Key":"ZW5jcnlwdGlvbmtleTEyMw","KeyType":"Base64","KeyUri":"aHR0cDovL2FsaXl1bi5jb20vZG9jdW1lbnQvaGxzMTI4LmtleQ=="}.
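As a sketch, the Key and KeyUri values in the example above are Base64-encoded plain strings. The code below re-derives them from their decoded forms ("encryptionkey123" and the key URL, both taken from the example itself):

```python
import base64
import json

# The Key and KeyUri fields carry Base64-encoded plain strings.
key = base64.b64encode(b"encryptionkey123").decode("ascii")
key_uri = base64.b64encode(
    b"http://aliyun.com/document/hls128.key"
).decode("ascii")

encryption = {
    "Type": "hls-aes-128",
    "Key": key,          # the documentation example shows this value without the trailing "=" padding
    "KeyType": "Base64",
    "KeyUri": key_uri,   # matches the KeyUri in the example above
}
print(json.dumps(encryption))
```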
UserData (String, optional): The user data. The value can be up to 1,024 bytes in length.
Priority (String, optional): The priority of the transcoding task in the MPS pipeline to which the task is added.
  • Valid values: [1,10]. 1 specifies the lowest priority and 10 specifies the highest priority.
  • Default value: 6.
  • Best practices: The maximum number of concurrent requests in an MPS pipeline is limited. If you submit a large number of tasks to an MPS pipeline, the tasks may be queued. We recommend that you prioritize time-sensitive and important tasks.
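The Output parameters above can be assembled into a request payload as follows. This is a minimal sketch, not a complete SubmitJobs call; the bucket path, template ID, and bitrates are placeholders, not real resources:

```python
import json
from urllib.parse import quote

# All names and IDs here are hypothetical placeholders.
output = {
    "OutputObject": quote("output/example+test.mp4"),  # percent-encoded path
    "TemplateId": "16f01ad6175e4230ac42bb5182cd****",  # placeholder template ID
    "Video": {"Codec": "H.264", "Bitrate": "1500"},    # overrides the template's video settings
    "Audio": {"Codec": "AAC", "Bitrate": "128"},       # overrides the template's audio settings
    "Priority": "6",
}

# SubmitJobs takes the outputs as a JSON array serialized into a string.
outputs_param = json.dumps([output])
print(outputs_param)
```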

Container

The following parameter is nested in Container.

Format (String, optional): The container format.
  • For more information about container formats and the compatibility with video codecs, see Supported formats. The following container formats are supported:
    • Video: mp4, flv, ts, mkv, mov, mxf, 3gp, mpeg, fmp4, m3u8, hls-fmp4, mpd, cmaf, cmaf-hls, and cmaf-dash
    • Audio: aac, mp2, mp3, mp4, ogg, and flac
    • Image: GIF and WebP
  • Default value: mp4

Video

The following parameters are nested in Video.

Remove (String, optional): Specifies whether to delete the video stream.
  • true: deletes the video stream. If you set this parameter to true, all parameters described in this topic that are related to Video become invalid.
  • false: does not delete the video stream.
  • Default value: false.
Codec (String, optional): The video codec.
  • Valid values: H.264, H.265, GIF, and WebP. For more information about container formats and the compatibility with video codecs, see Supported formats.
  • Default value: H.264.
Width (String, optional): The width of the output video.
  • Unit: pixels.
  • Valid values: [128,4096]. The value must be an even number.
  • The default value of this parameter is determined based on the following rules:
    • By default, if you do not specify the width or height of the output video, the width of the input video is used.
    • By default, if you specify only the height of the output video, the width of the output video is calculated based on the aspect ratio of the input video.
Height (String, optional): The height of the output video.
  • Unit: pixels.
  • Valid values: [128,4096]. The value must be an even number.
  • The default value of this parameter is determined based on the following rules:
    • By default, if you do not specify the width or height of the output video, the height of the input video is used.
    • By default, if you specify only the width of the output video, the height of the output video is calculated based on the aspect ratio of the input video.
LongShortMode (String, optional): Specifies whether to enable the automatic rotation feature. This parameter takes effect when you specify at least one of the Width and Height parameters. Valid values:
  • true
  • false
  • Default value: false.
  • Best practices: If your input videos are available in landscape mode and portrait mode, enable the automatic rotation feature to prevent distortion and proportionally resize the video based on the resolution. For more information, see Enable auto-rotate screen.
Fps (String, optional): The frame rate of the video.
  • Unit: frames per second (FPS).
  • Valid values: (0,60].
  • By default, the frame rate of the input file is used. If the frame rate of the input file exceeds 60, 60 is used.
  • Common values include 24, 25, and 30.
MaxFps (String, optional): The maximum frame rate.
Gop (String, optional): The time interval or the number of frames between two adjacent I-frames.
Note A higher value indicates a higher compression rate and a slower encoding speed. A higher value also indicates longer duration of each segment and longer period of time required for seeking operations to complete in the player. For more information, see Terms.
  • Valid values for the time interval between keyframes: [1,100000]. Unit: seconds. Example: {time interval}s.
  • Valid values for the number of frames between keyframes: [1,100000]. Example: {number of frames}.
  • Default value: 10s. 10s indicates that the time interval between keyframes is 10 seconds.
  • Best practices: We recommend that you set the time interval between keyframes to a value that ranges from 2 to 7 seconds during media streaming. This reduces the loading time and improves the seeking speed.
Bitrate (String, optional): The average bitrate of the output video. You can specify either Bitrate or Crf. If you specify both parameters, Crf takes precedence over Bitrate.
  • Unit: Kbit/s.
  • Valid values: [10,50000].
  • Default value: the bitrate of the input video.
  • Best practices:
    • If you want to use the constant bitrate (CBR) mode, you must set TransMode to CBR and make sure that the values of Bitrate, Maxrate, and Bufsize are the same.
    • If you want to use the average bitrate (ABR) mode, you must set TransMode to onepass and specify Bitrate. You can also specify Maxrate and Bufsize to manage the bitrate range.
    • If you want to use the variable bitrate (VBR) mode, you must set TransMode to twopass and specify Maxrate or BitrateBnd and Bufsize.
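The three rate-control setups above can be sketched as parameter combinations. The numbers are illustrative only (units are Kbit/s); only the parameter names come from this topic:

```python
# Constant bitrate: Bitrate, Maxrate, and Bufsize must match.
cbr = {
    "Video": {"Bitrate": "2000", "Maxrate": "2000", "Bufsize": "2000"},
    "TransConfig": {"TransMode": "CBR"},
}
# Average bitrate: onepass plus Bitrate, optionally bounded by Maxrate/Bufsize.
abr = {
    "Video": {"Bitrate": "2000", "Maxrate": "3000", "Bufsize": "6000"},
    "TransConfig": {"TransMode": "onepass"},
}
# Variable bitrate: twopass with Maxrate (or BitrateBnd) and Bufsize.
vbr = {
    "Video": {"Maxrate": "3000", "Bufsize": "6000"},
    "TransConfig": {"TransMode": "twopass"},
}
```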
Maxrate (String, optional): The maximum bitrate of the output video.
  • Unit: Kbit/s.
  • Valid values: [10,50000].
BitrateBnd (String, optional): The bitrate range of the output video.
  • Example: {"Max":"5000","Min":"1000"}.
Bufsize (String, optional): The buffer size. This parameter is used to manage bitrate fluctuations.
Note A larger buffer size indicates a higher video quality and a greater variation in the bitrate.
  • Unit: Kbit/s.
  • Valid values: [1000,128000].
  • Default value: 6000.
Qscale (String, optional): The video quality control factor. This parameter takes effect only if you use the VBR mode.
Note A higher value indicates a lower video quality and higher compression rate.
  • Valid values: [0,51].
Crf (String, optional): The constant rate factor (CRF). You can specify either Bitrate or Crf. If you specify both parameters, Crf takes precedence over Bitrate.
Note A higher value indicates a lower video quality and higher compression rate.
  • Valid values: [0,51].
  • If you set Codec to H.264, this parameter is 23 by default. If you set Codec to H.265, this parameter is 26 by default.
  • Best practices:
    • 0 specifies the lossless video quality and 51 specifies the lowest video quality. We recommend that you set a value from 23 to 29. You can adjust the CRF based on the video image. Each time you increase or decrease the value by 6, the bitrate is roughly halved or doubled. In most cases, the bitrate of animated videos can be higher than that of other videos in the same definition.
    • Crf is used to control the video quality. You can specify Crf and Maxrate to manage the bitrate range of the output video. You cannot predict the bitrate of the output video.
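The rule of thumb above (every ±6 in CRF roughly halves or doubles the bitrate) can be written as a small formula. The function name is illustrative, not part of any SDK:

```python
def crf_bitrate_factor(old_crf: float, new_crf: float) -> float:
    """Approximate ratio of output bitrates when moving from old_crf
    to new_crf, based on the rule of thumb that every +6 in CRF
    roughly halves the bitrate and every -6 roughly doubles it."""
    return 2 ** ((old_crf - new_crf) / 6)

print(crf_bitrate_factor(23, 29))  # 0.5 -> about half the bitrate
print(crf_bitrate_factor(23, 17))  # 2.0 -> about double the bitrate
```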
Profile (String, optional): The encoding profile. For more information, see Terms.
  • This parameter takes effect only if you set Codec to H.264.
  • Valid values: baseline, main, and high.
  • Default value: high.
  • Best practices: If you want to transcode a video to streams in multiple definitions, we recommend that you set this parameter to baseline for the low-definition video stream. This ensures normal playback on low-end devices. Set this parameter to main or high for video streams in other definitions.
Preset (String, optional): The preset mode of the H.264 encoder.
Note A faster encoding mode indicates lower video quality.
  • This parameter takes effect only if you set Codec to H.264.
  • Valid values: veryfast, fast, medium, slow, and slower.
  • Default value: medium.
ScanMode (String, optional): The scan mode. Valid values:
  • interlaced
  • progressive
  • auto
  • Default value: auto.
PixFmt (String, optional): The color format of the video.
  • Valid values: yuv420p, yuvj420p, yuv422p, yuvj422p, yuv444p, yuvj444p, yuv444p16le, pc, bt470bg, and smpte170m. You can set this parameter only to bgr8 if you set Codec to GIF.
  • By default, yuv420p or the color format of the input video is used.
Crop (String, optional): The video cropping configurations. You can specify automatic black bar removal or custom cropping.
  • You can specify this parameter only if the resolution of the input video is higher than that of the output video. If you specify this parameter, you cannot specify AdjDarMethod.
  • If you want to use automatic black bar removal, set this parameter to border.
  • If you want to use custom cropping, set this parameter to a value in the {width}:{height}:{left}:{top} format.
    • width: the width of the output video.
    • height: the height of the output video.
    • left: the distance between the left border of the cropped video and the left border of the original video.
    • top: the distance between the top border of the cropped video and the top border of the original video.
Pad (String, optional): The information about the black bars.
  • You can specify this parameter only if the resolution of the input video is lower than that of the output video. If you specify this parameter, you cannot specify IsCheckReso, IsCheckResoFail, or AdjDarMethod.
  • Set this parameter to a value in the {width}:{height}:{left}:{top} format.
    • width: the width of the output video.
    • height: the height of the output video.
    • left: the distance between the left border of the video image and the left border of the output video.
    • top: the distance between the top border of the video image and the top border of the output video.
  • Example: 1920:1080:0:140.
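The {width}:{height}:{left}:{top} format shared by Crop and Pad can be sketched with a trivial helper (the function name is illustrative, not part of any SDK):

```python
def crop_or_pad(width: int, height: int, left: int, top: int) -> str:
    """Format a Crop or Pad value as {width}:{height}:{left}:{top}."""
    return f"{width}:{height}:{left}:{top}"

# Reproduces the Pad example above: a 1920x1080 frame with a
# 140-pixel offset from the top.
print(crop_or_pad(1920, 1080, 0, 140))  # 1920:1080:0:140
```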

Audio

The following parameters are nested in Audio.

Remove (String, optional): Specifies whether to delete the audio stream.
  • true: deletes the audio stream. If you set this parameter to true, all parameters described in this topic that are related to Audio become invalid.
  • false: does not delete the audio stream.
  • Default value: false.
Codec (String, optional): The audio codec.
  • Valid values: AAC, AC3, EC3, AMR, MP2, MP3, FLAC, OPUS, VORBIS, WMA, and pcm_s16le. For more information about container formats and the compatibility with audio codecs, see Supported formats.
  • Default value: AAC.
Profile (String, optional): The audio encoding profile.
  • This parameter takes effect only if you set Codec to AAC.
  • Valid values: aac_low, aac_he, aac_he_v2, aac_ld, and aac_eld. For more information, see Terms.
  • Default value: aac_low.
Bitrate (String, optional): The audio bitrate of the output file.
  • Unit: Kbit/s.
  • Valid values: [8,1000].
  • Default value: 128.
  • Common values: 64, 128, and 256.
Samplerate (String, optional): The sampling rate.
  • Unit: Hz.
  • Valid values: 22050, 32000, 44100, 48000, and 96000.
    • If you set Codec to MP3, you cannot set this parameter to 96000. For more information about the supported sampling rates for each audio codec, see Sampling rates.
    • If you set Codec to OPUS, you can set this parameter to 8000, 16000, 24000, or 48000.
  • Default value: 44100.
Channels (String, optional): The number of sound channels.
  • Valid values: 0, 1, 2, 4, 5, 6, and 8.
    • If you set Codec to MP3 or OPUS, you can set this parameter to 0, 1, or 2.
    • If you set Codec to AAC or FLAC, you can set this parameter to 0, 1, 2, 4, 5, 6, or 8.
    • If you set Codec to VORBIS, you can set this parameter to 2.
    • If you set Format to mpd, you cannot set this parameter to 8.
  • Default value: 2.
  • If you want to continue using the sound channels of the input file, set this parameter to 0.
VolumeStringNoThe volume configuration. For more information, see Volume.
  • You can specify this parameter only if a single audio stream is generated. This parameter is not supported if multiple audio streams are generated.

TransConfig

The following parameters are nested in TransConfig.

TransMode (String, optional): The video transcoding mode. Valid values:
  • onepass: In most cases, this value is used for average bitrate (ABR) encoding. This mode provides a higher encoding speed than twopass.
  • twopass: In most cases, this value is used for variable bitrate (VBR) encoding. This mode provides a lower encoding speed than onepass.
  • CBR: the constant bitrate (CBR) mode.
  • Default value: onepass.
AdjDarMethod (String, optional): The method that is used to change the resolution. This parameter takes effect only if you specify Width and Height. You can use this parameter together with LongShortMode.
IsCheckReso (String, optional): Specifies whether to check the video resolution. You can specify either IsCheckReso or IsCheckResoFail. If you specify both parameters, IsCheckResoFail takes precedence over IsCheckReso.
  • true: checks the video resolution. If you set this parameter to true, the resolution of the input video is used for transcoding when the width or height of the input video is lower than the width or height of the output video.
  • false: does not check the video resolution.
  • Default value: false.
IsCheckResoFail (String, optional): Specifies whether to check the video resolution. You can specify either IsCheckReso or IsCheckResoFail. If you specify both parameters, IsCheckResoFail takes precedence over IsCheckReso.
  • true: checks the video resolution. If you set this parameter to true, transcoding fails when the width or height of the input video is lower than the width or height of the output video.
  • false: does not check the video resolution.
  • Default value: false.
IsCheckVideoBitrate (String, optional): Specifies whether to check the video bitrate. You can specify either IsCheckVideoBitrate or IsCheckVideoBitrateFail. If you specify both parameters, IsCheckVideoBitrateFail takes precedence over IsCheckVideoBitrate.
  • true: checks the video bitrate. If you set this parameter to true, the bitrate of the input video is used for transcoding when the bitrate of the input video is lower than that of the output video.
  • false: does not check the video bitrate.
  • Default value: false.
IsCheckVideoBitrateFail (String, optional): Specifies whether to check the video bitrate. You can specify either IsCheckVideoBitrate or IsCheckVideoBitrateFail. If you specify both parameters, IsCheckVideoBitrateFail takes precedence over IsCheckVideoBitrate.
  • true: checks the video bitrate. If you set this parameter to true, transcoding fails when the bitrate of the input video is lower than that of the output video.
  • false: does not check the video bitrate.
  • Default value: false.
IsCheckAudioBitrate (String, optional): Specifies whether to check the audio bitrate. You can specify either IsCheckAudioBitrate or IsCheckAudioBitrateFail. If you specify both parameters, IsCheckAudioBitrateFail takes precedence over IsCheckAudioBitrate.
  • true: checks the audio bitrate. If you set this parameter to true, the bitrate of the input audio is used for transcoding when the bitrate of the input audio is lower than that of the output audio.
  • false: does not check the audio bitrate.
  • Default values:
    • If the audio codec of the output file is different from that of the input file, the default value is false.
    • If the audio codec of the output file is the same as that of the input file, the default value is true.
IsCheckAudioBitrateFail (String, optional): Specifies whether to check the audio bitrate. You can specify either IsCheckAudioBitrate or IsCheckAudioBitrateFail. If you specify both parameters, IsCheckAudioBitrateFail takes precedence over IsCheckAudioBitrate.
  • true: checks the audio bitrate. If you set this parameter to true, transcoding fails when the bitrate of the input audio is lower than that of the output audio.
  • false: does not check the audio bitrate.
  • Default value: false.

WaterMarks

The following parameters are nested in WaterMarks.

Type (String, optional): The type of the watermark. If you specify this parameter, the watermark configurations that are specified in the watermark template are overwritten.
  • Text: a text watermark. If you set this parameter to Text, you must specify TextWaterMark.
  • Image: an image watermark. If you set this parameter to Image, you must specify the parameters related to the image watermark.
  • Default value: Image.
TextWaterMark (Object, optional): The configurations of the text watermark. For more information, see TextWaterMark.
  • If you set Type to Text, you must specify this parameter.
  • Example: {"Content":"5rWL6K+V5paH5a2X5rC05Y2w","FontName":"SimSun","FontSize":"16","Top":2,"Left":10}.
InputFile (Object, optional): The image watermark file. You must specify the storage address of the object in OSS.
  • You can use a static PNG image, an animated PNG image, or a MOV file as the image watermark.
  • You can store watermark files only in OSS. For more information about the parameter format, see Input.
  • Before you call an operation, make sure that the URLs of objects that you use are encoded in UTF-8. For more information, see URL encoding.
  • Example: {"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"example-logo.png"}.
Note If you add a non-HDR image as a watermark to an HDR video, a color cast may occur on the video.
WaterMarkTemplateId (String, optional): The ID of the image watermark template. If you leave this parameter empty, the following settings are used for the image watermark:
  • Position: the upper-right corner.
  • Horizontal offset and vertical offset: 0.
  • Width: 0.12 times the width of the output video.
  • Height: proportionally scaled based on the aspect ratio of the original image.
  • Display duration: the duration of the video.
ReferPos (String, optional): The position of the image watermark. If you specify this parameter, the watermark configurations that are specified in the watermark template are overwritten.
  • Valid values: TopRight, TopLeft, BottomRight, and BottomLeft.
Dx (String, optional): The horizontal offset of the image watermark relative to the output video. If you specify this parameter, the watermark configurations that are specified in the watermark template are overwritten. You can specify this parameter in two formats:
  • Format 1: an integer, which indicates the pixel value of the horizontal offset.
    • Unit: pixels.
    • Valid values: [8,4096].
  • Format 2: a decimal, which indicates the percentage of the horizontal offset to the width of the output video.
    • Valid values: (0,1).
    • You can specify a value accurate to four decimal places, such as 0.9999. Excessive digits are automatically deleted.
Dy (String, optional): The vertical offset of the image watermark relative to the output video. If you specify this parameter, the watermark configurations that are specified in the watermark template are overwritten. You can specify this parameter in two formats:
  • Format 1: an integer, which indicates the pixel value of the vertical offset.
    • Unit: pixels.
    • Valid values: [8,4096].
  • Format 2: a decimal, which indicates the percentage of the vertical offset to the height of the output video.
    • Valid values: (0,1).
    • You can specify a value accurate to four decimal places, such as 0.9999. Excessive digits are automatically deleted.
Width (String, optional): The width of the image watermark. If you specify this parameter, the watermark configurations that are specified in the watermark template are overwritten. You can specify this parameter in two formats:
  • Format 1: an integer, which indicates the pixel value of the watermark width.
    • Valid values: [8,4096].
    • Unit: pixels.
  • Format 2: a decimal, which indicates the percentage of the watermark width to the width of the output video.
    • Valid values: (0,1).
    • You can specify a value accurate to four decimal places, such as 0.9999. Excessive digits are automatically deleted.
Height (String, optional): The height of the image watermark. If you specify this parameter, the watermark configurations that are specified in the watermark template are overwritten. You can specify this parameter in two formats:
  • Format 1: an integer, which indicates the pixel value of the watermark height.
    • Valid values: [8,4096].
    • Unit: pixels.
  • Format 2: a decimal, which indicates the percentage of the watermark height to the height of the output video.
    • Valid values: (0,1).
    • You can specify a value accurate to four decimal places, such as 0.9999. Excessive digits are automatically deleted.
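The two value formats shared by Dx, Dy, Width, and Height can be sketched with a small helper. The function name is hypothetical; it only applies the integer-pixel vs. truncated-ratio rules described above:

```python
from typing import Union

def watermark_value(value: Union[int, float]) -> str:
    """Format a Dx, Dy, Width, or Height watermark value.

    Integers are taken as pixel values; floats in (0, 1) are taken
    as ratios and truncated (not rounded) to four decimal places,
    as the descriptions above specify.
    """
    if isinstance(value, int):
        return str(value)
    if not 0 < value < 1:
        raise ValueError("ratio values must be in the range (0, 1)")
    text = f"{value:.6f}"               # e.g. "0.999990"
    return text[: text.index(".") + 5]  # truncate to 4 decimals

print(watermark_value(100))      # 100
print(watermark_value(0.99999))  # 0.9999
```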
TimelineStringNoThe display duration of the image watermark. For more information, see Timeline.
  • If you specify this parameter, the watermark configurations that are specified in the watermark template are overwritten.

TextWaterMark

The following table describes the parameters that are nested in TextWaterMark.

ParameterTypeRequiredDescription
ContentStringYesThe text watermark. The text must be Base64-encoded.
  • For example, if you want to add "Test text watermark" to the output video, set this parameter to 5rWL6K+V5paH5a2X5rC05Y2w.
Note If you specify special characters such as emojis and single quotation marks (') in the text, the watermark may be truncated or fail to be created. You must escape special characters before you specify them in the text.
FontNameStringNoThe font of the text watermark.
  • For information about valid values, see Fonts.
  • Default value: SimSun.
FontSizeIntNoThe font size of the text watermark.
  • Valid values: (4,120).
  • Default value: 16.
FontColorStringNoThe color of the text.
  • For information about valid values, see FontColor.
  • Default value: black.
FontAlphaFloatNoThe transparency of the text.
  • Valid values: (0,1].
  • Default value: 1.0.
BorderWidthIntNoThe outline width of the text.
  • Unit: pixels.
  • Valid values: [0,4096].
  • Default value: 0.
BorderColorStringNoThe outline color of the text.
  • For information about valid values, see BorderColor.
  • Default value: black.
TopIntNoThe top margin of the text.
  • Unit: pixels.
  • Valid values: [0,4096].
  • Default value: 0.
LeftIntNoThe left margin of the text.
  • Unit: pixels.
  • Valid values: [0,4096].
  • Default value: 0.
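The Content value described above is plain Base64 of the UTF-8 text. A minimal Python sketch (the helper name is illustrative, not part of the API):

```python
import base64

def encode_watermark_text(text: str) -> str:
    """Base64-encode UTF-8 watermark text for the Content parameter."""
    return base64.b64encode(text.encode("utf-8")).decode("ascii")

# The sample value in the table decodes to the Chinese phrase for
# "test text watermark".
encoded = encode_watermark_text("测试文字水印")
print(encoded)  # 5rWL6K+V5paH5a2X5rC05Y2w
```

Remember to escape special characters such as emojis and single quotation marks before encoding, as noted above.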

Timeline

The following table describes the parameters that are nested in Timeline.

ParameterTypeRequiredDescription
StartStringNoThe beginning of the time range in which the image watermark is displayed.
  • Format: sssss[.SSS].
  • Valid values: [0.000,86399.999]. If you specify a start time that is later than the time when the video ends, transcoding fails.
  • Default value: 0.
  • Example: 18000.30.
DurationStringNoThe display duration of the image watermark.
  • If you set this parameter to ToEND, the watermark is displayed from the start time that you specify to the end of the video.
  • Format: sssss[.SSS]. Unit: seconds.
  • Default value: ToEND.
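A quick way to sanity-check a Timeline Start value against the sssss[.SSS] format and the [0.000,86399.999] range before submitting a job; this validation is a local convenience, not part of the API:

```python
import re

def is_valid_timeline_start(value: str) -> bool:
    """Check a Timeline Start value: sssss[.SSS] within [0.000, 86399.999]."""
    if not re.fullmatch(r"\d{1,5}(\.\d{1,3})?", value):
        return False
    return 0.0 <= float(value) <= 86399.999

print(is_valid_timeline_start("18000.30"))  # True
print(is_valid_timeline_start("90000"))     # False: past 86399.999
```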

Config

The parameters described in this section are used when you call AddWaterMarkTemplate and UpdateWaterMarkTemplate.

ParameterTypeRequiredDescription
TypeStringNoThe type of the watermark.
  • Image: an image watermark.
  • Default value: Image.
ReferPosStringNoThe position of the image watermark.
  • Valid values: TopRight, TopLeft, BottomRight, and BottomLeft.
DxStringNoThe horizontal offset of the image watermark relative to the output video. You can specify this parameter in two formats:
  • Format 1: an integer, which indicates the pixel value of the horizontal offset.
    • Unit: pixels.
    • Valid values: [8,4096].
  • Format 2: a decimal, which indicates the percentage of the horizontal offset to the width of the output video.
    • Valid values: (0,1).
    • You can specify a value accurate to four decimal places, such as 0.9999. Excessive digits are automatically deleted.
DyStringNoThe vertical offset of the image watermark relative to the output video. You can specify this parameter in two formats:
  • Format 1: an integer, which indicates the pixel value of the vertical offset.
    • Unit: pixels.
    • Valid values: [8,4096].
  • Format 2: a decimal, which indicates the percentage of the vertical offset to the width of the output video.
    • Valid values: (0,1).
    • You can specify a value accurate to four decimal places, such as 0.9999. Excessive digits are automatically deleted.
WidthStringNoThe width of the image watermark. You can specify this parameter in two formats:
  • Format 1: an integer, which indicates the pixel value of the watermark width.
    • Unit: pixels.
    • Valid values: [8,4096].
  • Format 2: a decimal, which indicates the percentage of the watermark width to the width of the output video.
    • Valid values: (0,1).
    • You can specify a value accurate to four decimal places, such as 0.9999. Excessive digits are automatically deleted.
HeightStringNoThe height of the image watermark. You can specify this parameter in two formats:
  • Format 1: an integer, which indicates the pixel value of the watermark height.
    • Unit: pixels.
    • Valid values: [8,4096].
  • Format 2: a decimal, which indicates the percentage of the watermark height to the height of the output video.
    • Valid values: (0,1).
    • You can specify a value accurate to four decimal places, such as 0.9999. Excessive digits are automatically deleted.
TimelineStringNoThe timeline of the watermark. For more information, see Timeline.
The following rules apply when you specify Width and Height of the image watermark.
  • If you do not specify Width or Height, the watermark width is 0.12 times the width of the output video, and the watermark height is proportionally scaled based on the aspect ratio of the original image.
  • If you specify only the Width parameter, the watermark height is proportionally scaled based on the aspect ratio of the original image. If you specify only the Height parameter, the watermark width is proportionally scaled based on the aspect ratio of the original image.
  • You can also specify Width and Height to determine the size of the watermark.
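The Width/Height rules above can be sketched as follows. The 0.12 default factor and the proportional-scaling behavior come from the rules; the function name, numeric types, and lack of rounding are illustrative:

```python
def resolve_watermark_size(video_w, video_h, logo_w, logo_h,
                           width=None, height=None):
    """Resolve the effective watermark size in pixels.

    width/height may be pixel values (> 1) or fractions of the output
    video dimensions (0 < value < 1), per the table above.
    """
    def to_px(value, ref):
        return value * ref if 0 < value < 1 else value

    if width is None and height is None:
        # Default: 0.12 x output video width; height keeps the
        # original image aspect ratio.
        w = 0.12 * video_w
        return w, w * logo_h / logo_w
    if width is not None and height is None:
        w = to_px(width, video_w)
        return w, w * logo_h / logo_w
    if height is not None and width is None:
        h = to_px(height, video_h)
        return h * logo_w / logo_h, h
    return to_px(width, video_w), to_px(height, video_h)

# 1920x1080 output video, 200x100 logo, nothing specified:
print(resolve_watermark_size(1920, 1080, 200, 100))  # (230.4, 115.2)
```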

DeWatermark

This section describes the fields nested in DeWatermark and provides sample code.
{
  // Blur two logos in the video image at the beginning of the video. The first logo is offset 10 pixels from the left edge and 10 pixels from the top edge of the video image and is 10 × 10 pixels in size. The second logo is offset 100 pixels from the left edge and is 10 × 10 pixels in size. The distance between the top of the video image and the second logo is calculated by using the following formula: 0.1 × Height of the video.
  "0": [
    {
      "l": 10,
      "t": 10,
      "w": 10,
      "h": 10
    },
    {
      "l": 100,
      "t": 0.1,
      "w": 10,
      "h": 10
    }
  ],
  // Stop blurring the logos at the 128,000th millisecond. In this case, the time range during which the logos are blurred is [0,128000] milliseconds.
  "128000": [],
  // Blur the logo in the video image starting at the 250,000th millisecond. The logo width is 0.01 times the width of the video, and the logo height is 0.05 times the height of the video. The distance between the left side of the video image and the logo is calculated by using the following formula: 0.2 × Width of the video. The distance between the top of the video image and the logo is calculated by using the following formula: 0.1 × Height of the video.
  "250000": [
    {
      "l": 0.2,
      "t": 0.1,
      "w": 0.01,
      "h": 0.05
    }
  ]
}
Field description
  • pts: the point in time at which the blurring starts. Unit: milliseconds.
  • l: the left margin of the blurred area.
  • t: the top margin of the blurred area.
  • w: the width of the blurred area.
  • h: the height of the blurred area.

If you specify a value greater than 1 for l, t, w, or h, the number specifies the absolute pixel value. Otherwise, the number specifies the ratio of the pixel value to the corresponding pixel value of the input video. The size of the blurred area is specified based on the nearest integers of the values of the l, t, w, and h fields.
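Under this rule, a field value greater than 1 is an absolute pixel count, and any other positive value is a ratio of the corresponding input-video dimension. A sketch, assuming "nearest integer" means standard rounding:

```python
def dewatermark_px(value: float, reference: int) -> int:
    """Convert an l/t/w/h field to pixels against the input video size."""
    px = value if value > 1 else value * reference
    return round(px)

# Second logo in the sample above: l = 100 px, t = 0.1 x video height.
print(dewatermark_px(100, 1920))  # 100
print(dewatermark_px(0.1, 1080))  # 108
```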

SubtitleConfig

The following table describes the parameter that is nested in SubtitleConfig.

ParameterTypeRequiredDescription
ExtSubtitleListObject[]NoThe list of external subtitles. For more information, see ExtSubtitle.
  • You can add a maximum of four subtitle files for a transcoding task.
  • Example: [{"Input":{"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"example.srt"},"CharEnc":"UTF-8"}].

ExtSubtitle

The following table describes the parameters that are nested in ExtSubtitle.

ParameterTypeRequiredDescription
InputStringYesThe external subtitle file. You must specify the storage address of the object in OSS.
  • The subtitle file must be in the SRT or ASS format. The system automatically reads the text color information in the file.
  • You can store subtitle files only in OSS. For more information about the parameter format, see Input.
  • Placeholders are supported. For more information, see Placeholder replacement rules.
  • Before you call an operation, make sure that the URLs of objects that you use are encoded in UTF-8. For more information, see URL encoding.
  • For example, the path of the input video is a/b/example.flv and the path of the subtitle file is a/b/example-cn.srt. You can use placeholders to specify the path of the object: {ObjectPrefix}{FileName}-cn.srt. Then, encode the path in UTF-8: {"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"%7bObjectPrefix%7d%7bFileName%7d-cn.srt"}. Specify the encoded path for this parameter.
Note If the duration of the subtitle file exceeds the video duration, the video duration is used for transcoding. If the length of subtitles exceeds the length of a row, the excess part is truncated.
CharEncStringNoThe encoding format of the external subtitles.
  • Valid values: UTF-8, GBK, BIG5, and auto.
  • Default value: auto.
Note If you set this parameter to auto, the detected encoding format may not be the actual encoding format. We recommend that you set this parameter to a specific encoding format.
FontNameStringNoThe font of the subtitles.
  • For information about valid values, see Fonts.
  • Default value: SimSun.
FontSizeIntNoThe font size of the subtitles.
  • Valid values: (4,120).
  • Default value: 16.
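The placeholder path in the Input example can be percent-encoded with a standard URL encoder. Python's urllib emits uppercase hex digits (%7B), which are equivalent to the lowercase ones shown above:

```python
from urllib.parse import quote

# Placeholders are encoded like any other reserved characters.
path = "{ObjectPrefix}{FileName}-cn.srt"
encoded = quote(path)
subtitle_input = {
    "Bucket": "example-bucket",
    "Location": "oss-cn-hangzhou",
    "Object": encoded,
}
print(encoded)  # %7BObjectPrefix%7D%7BFileName%7D-cn.srt
```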

Clip

The following table describes the parameters that are nested in Clip.

ParameterTypeRequiredDescription
TimeSpanStringNoThe length of the clip that you want to cut from the input file. For more information, see TimeSpan.
  • Configure the continuous duration: {"Seek":"00:01:59.999","Duration":"18000.30"} specifies that the output clip is generated starting from 00:01:59.999 of the original clip, and stops after 5 hours and 300 milliseconds. The total length of the clip is 5 hours and 300 milliseconds.
  • Configure the offset: {"Seek":"00:01:59.999","End":"18000.30"} specifies that the output clip is generated starting from 00:01:59.999 of the original clip until 5 hours and 300 milliseconds before the end of the original clip. The total length of the clip is determined based on the length of the original clip.
ConfigToClipFirstPartBooleanNoSpecifies whether to cut the first clip that you want to merge with the input video.
  • true: cuts the first clip before merging.
  • false: does not cut the first clip before merging.
  • Default value: false.

TimeSpan

The following table describes the parameters that are nested in TimeSpan.

ParameterTypeRequiredDescription
SeekStringNoThe start point of the clip. You can use this parameter to specify the time at which the output clip is cut from the original clip. By default, the output clip is cut from the beginning of the original clip.
  • Valid formats: hh:mm:ss[.SSS] and sssss[.SSS].
  • Valid values: [00:00:00.000,23:59:59.999] and [0.000,86399.999].
  • Example: 00:01:59.999 and 18000.30.
DurationStringNoThe duration of the output clip. The duration is calculated from the time you specified for Seek. By default, the duration is from the time you specified for Seek to the end of the original clip. You can specify either Duration or End. If you specify both parameters, End takes precedence over Duration.
  • Valid formats: hh:mm:ss[.SSS] and sssss[.SSS].
  • Valid values: [00:00:00.000,23:59:59.999] and [0.000,86399.999].
  • Example: 00:01:59.999 and 18000.30.
EndStringNoThe duration between the time at which the output clip ends and the end of the original clip. You can specify either Duration or End. If you specify both parameters, End takes precedence over Duration.
  • Valid formats: hh:mm:ss[.SSS] and sssss[.SSS].
  • Valid values: [00:00:00.000,23:59:59.999] and [0.000,86399.999].
  • Example: 00:01:59.999 and 18000.30.
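The interaction of Seek, Duration, and End can be sketched as follows, parsing both accepted time formats; per the table, End takes precedence when both are set. The function names are illustrative:

```python
def parse_time(value: str) -> float:
    """Parse hh:mm:ss[.SSS] or sssss[.SSS] into seconds."""
    if ":" in value:
        h, m, s = value.split(":")
        return int(h) * 3600 + int(m) * 60 + float(s)
    return float(value)

def clip_length(source: float, seek="0", duration=None, end=None) -> float:
    """Length in seconds of the cut clip, per the TimeSpan semantics above."""
    start = parse_time(seek)
    if end is not None:  # End takes precedence over Duration.
        return max(0.0, source - start - parse_time(end))
    if duration is not None:
        return min(parse_time(duration), source - start)
    return source - start

# 10-hour source, Seek 00:01:59.999, End 18000.30:
print(clip_length(36000.0, "00:01:59.999", end="18000.30"))
```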

MergeList

The following table describes the parameters that are nested in MergeList.

ParameterTypeRequiredDescription
MergeURLStringYesThe OSS address of the clip that you want to merge with the input video.
  • Before you call an operation, make sure that the URLs of objects that you use are encoded in UTF-8. For more information, see URL encoding.
  • Example: http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail_comm_01.mp4.
StartStringNoThe time at which the output clip is cut from the original clip. If you want to cut a part of the original clip to merge with the input video, use this parameter to specify the start time of the output clip. By default, the output clip is cut from the beginning of the original clip.
  • Valid formats: hh:mm:ss[.SSS] and sssss[.SSS].
  • Valid values: [00:00:00.000,23:59:59.999] and [0.000,86399.999].
  • Examples: 01:59:59.999 and 32000.23.
DurationStringNoThe duration of the output clip. If you want to cut a part of the original clip to merge with the input video, use this parameter to specify duration of the output clip based on the time you specified for Start. By default, the duration is the time range from the time you specified for Start to the end of the original clip.
  • Valid formats: hh:mm:ss[.SSS] and sssss[.SSS].
  • Valid values: [00:00:00.000,23:59:59.999] and [0.000,86399.999].
  • Examples: 01:59:59.999 and 32000.23.

Opening

The following table describes the parameters that are nested in Opening.

ParameterTypeRequiredDescription
OpenUrlStringYesThe storage address of the opening scenes in OSS.
  • Before you call an operation, make sure that the URLs of objects that you use are encoded in UTF-8. For more information, see URL encoding.
  • Example: http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/opening_01.flv.
StartStringNoThe duration that elapses after the input video is played before the opening scene is played. The value starts from 0.
  • Unit: seconds.
  • Default value: 0.
WidthStringNoThe width of the output opening scene. You can use one of the following methods to set this parameter:
  • Specify a value that ranges from 0 to 4096. Unit: pixels.
  • Specify -1. -1 specifies the width of the original opening scene.
  • Specify full. full specifies the width of the output video.
  • Default value: -1.
Note The output opening scene is played at the center of the output video. You cannot specify a width for the output opening scene that is greater than the width of the output video. Otherwise, the effect is unknown.
HeightStringNoThe height of the output opening scene. You can use one of the following methods to set this parameter:
  • Specify a value that ranges from 0 to 4096. Unit: pixels.
  • Specify -1. -1 specifies the height of the original opening scene.
  • Specify full. full specifies the height of the output video.
  • Default value: -1.
Note The output opening scene is played at the center of the output video. You cannot specify a height for the output opening scene that is greater than the height of the output video. Otherwise, the effect is unknown.

TailSlate

The following table describes the parameters that are nested in TailSlate.

ParameterTypeRequiredDescription
TailUrlStringYesThe storage address of the ending scenes in OSS.
  • Before you call an operation, make sure that the URLs of objects that you use are encoded in UTF-8. For more information, see URL encoding.
  • Example: http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail_01.flv.
BlendDurationStringNoThe transition duration during which the last frame of the output video fades out and the first frame of the ending scene fades in.
  • Unit: seconds.
  • Default value: 0.
WidthStringNoThe width of the ending scene. You can use one of the following methods to set this parameter:
  • Specify a value that ranges from 0 to 4096. Unit: pixels.
  • Specify -1. -1 specifies the width of the original ending scene.
  • Specify full. full specifies the width of the output video.
  • Default value: -1.
Note The output ending scene is played at the center of the output video. You cannot specify a width for the output ending scene that is greater than the width of the output video. Otherwise, the effect is unknown.
HeightStringNoThe height of the ending scene. You can use one of the following methods to set this parameter:
  • Specify a value that ranges from 0 to 4096. Unit: pixels.
  • Specify -1. -1 specifies the height of the original ending scene.
  • Specify full. full specifies the height of the output video.
  • Default value: -1.
Note The output ending scene is played at the center of the output video. You cannot specify a height for the output ending scene that is greater than the height of the output video. Otherwise, the effect is unknown.
IsMergeAudioBooleanNoSpecifies whether to merge the audio content of the ending scene with the audio of the input video. Valid values:
  • true
  • false
  • Default value: true.
BgColorStringNoThe background color that is used to fill the black bars that appear when the size of the ending scene is smaller than that of the output video.
  • For information about valid values, see bgcolor.
  • Default value: White.

Amix

The following table describes the parameters that are nested in Amix.

ParameterTypeRequiredDescription
AmixURLStringYesThe list of audio streams that you want to mix. You can specify this parameter in two formats:
  • URL of the input file: mixes the two audio streams in the input video. This is suitable for merging multiple sound tracks of a video.
  • An OSS URL: mixes one audio stream of the input video with an audio stream stored in OSS. This is suitable for adding external background music to the input video. Example: http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/tail_01.flv.
MapStringNoSpecifies the audio stream that you want to mix in AmixURL. You must use AudioStreamMap to specify the audio stream that you want to mix in the input file.
  • Specify a value in the 0:a:{sequence number} format. The sequence number specifies the position of the audio stream in the list. The sequence number starts from 0.
  • For example, if you specify 0:a:2 for this parameter, the third audio stream is used for audio mixing because the sequence number starts from 0.
MixDurModeStringNoThe duration of the output audio after mixing. Valid values:
  • first: the duration of the input file.
  • longest: the duration of a file that has the longest duration among the input file and the audio streams that you specified for AmixURL.
  • Default value: longest.
StartStringNoThe time at which the output audio is cut from the original audio. If you want to cut a part of the original audio to mix with the audio of the input video, use this parameter to specify the start time of the output audio. By default, the output audio is cut from the beginning of the original audio.
  • Valid formats: hh:mm:ss[.SSS] and sssss[.SSS].
  • Valid values: [00:00:00.000,23:59:59.999] and [0.000,86399.999].
  • Example: 00:01:59.999 and 18000.30.
DurationStringNoThe duration of the output audio. If you want to cut a part of the original audio to mix with the audio of the input video, use this parameter to specify the duration of the output audio based on the time you specified for Start. By default, the duration is the time range from the time you specified for Start to the end of the original audio.
  • Valid formats: hh:mm:ss[.SSS] and sssss[.SSS].
  • Valid values: [00:00:00.000,23:59:59.999] and [0.000,86399.999].
  • Example: 00:01:59.999 and 18000.30.

MuxConfig

The following table describes the parameter that is nested in MuxConfig.

ParameterTypeRequiredDescription
SegmentStringNoThe segmentation configurations. For more information, see Segment.
  • This parameter takes effect only when you set the Container field to m3u8, hls-fmp4, mpd, or cmaf.
  • For example, if you specify {"Duration":"10","ForceSegTime":"1,2,4,6,10,14,18"} for this parameter, the input video is forcefully segmented at the first, second, fourth, sixth, tenth, fourteenth, and eighteenth seconds.

Segment

The following table describes the parameters that are nested in Segment.

ParameterTypeRequiredDescription
DurationIntNoThe segmentation interval.
  • Unit: seconds.
  • Valid values: [1,60].
  • Default value: 10. If you use the default value, the video is forcefully segmented at the tenth, twentieth, thirtieth, and fortieth seconds, and so on.
ForceSegTimeStringNoThe time points at which you want to split the input video. Separate multiple time points with commas (,). You can specify a maximum of 10 time points.
  • Valid format: {Time Point},{Time Point},{Time Point},{Time Point}.
  • The time points must be decimals and can be accurate to three decimal places.
  • Unit: seconds.
  • For example, if you specify 1,2,4,6,10,14,18 for this parameter, the video is forcefully segmented at the first, second, fourth, sixth, tenth, fourteenth, and eighteenth seconds.
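A sketch that validates a ForceSegTime string against the stated constraints (at most 10 comma-separated time points, each with up to three decimal places); the helper is a local convenience, not part of the API:

```python
import re

def parse_force_seg_time(value: str) -> list[float]:
    """Parse and validate a ForceSegTime string such as "1,2,4,6,10,14,18"."""
    points = value.split(",")
    if len(points) > 10:
        raise ValueError("ForceSegTime allows at most 10 time points")
    for p in points:
        if not re.fullmatch(r"\d+(\.\d{1,3})?", p):
            raise ValueError(f"bad time point: {p!r}")
    return [float(p) for p in points]

print(parse_force_seg_time("1,2,4,6,10,14,18"))
```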

M3U8NonStandardSupport

The following table describes the parameter that is nested in M3U8NonStandardSupport.

ParameterTypeRequiredDescription
TSObjectNoSpecifies whether non-standard TS files are supported. For more information, see TS.

TS

The following table describes the parameters that are nested in TS.

ParameterTypeRequiredDescription
Md5SupportBooleanNoSpecifies whether to support the generation of the MD5 hash value of the TS file in the output M3U8 video.
SizeSupportBooleanNoSpecifies whether to support the generation of the size information about the TS file in the output M3U8 video.

Encryption

The following table describes the parameters that are nested in Encryption.

ParameterTypeRequiredDescription
TypeStringYesThe encryption method of the video.
  • Set the value to hls-aes-128.
KeyUriStringYesThe key URL.
  • You must encode KeyUri in Base64 before you use the key URL in MPS.
  • For example, if the key URL is http://aliyun.com/document/hls128.key, you must use the Base64-encoded value aHR0cDovL2FsaXl1bi5jb20vZG9jdW1lbnQvaGxzMTI4LmtleQ== in MPS.
KeyTypeStringYesThe encryption method of the key. Valid values:
  • Base64: uses the basic encryption method.
  • KMS: uses Key Management Service (KMS) to encrypt the Base64-encrypted key.
Note Alibaba Cloud provides the master key. To obtain the master key, submit a ticket. The Base64 algorithm is used as the basic encryption method. If you specify KMS for this parameter, the key is first encrypted in Base64 and then encrypted by using KMS.
KeyStringYesThe key that is used to encrypt the video.
  • You must encode the key in Base64 or encrypt it by using KMS before you use the key in MPS.
  • For example, if the key is encryptionkey123, you must use the Base64-encoded value ZW5jcnlwdGlvbmtleTEyMw or the KMS-encrypted value KMS(Base64("encryptionkey123")) in MPS.
SkipCntStringNoThe number of segments that are not encrypted at the beginning of the video. Leaving these segments unencrypted reduces the startup loading duration.
  • Example: 3.
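The Base64 step for KeyUri and Key can be reproduced with any encoder; this sketch matches the KeyUri sample above (the Key sample in the table appears without the trailing == padding, and the KMS wrapping is a server-side operation that is not shown):

```python
import base64

def b64(value: str) -> str:
    """Base64-encode a UTF-8 string, as required for KeyUri and Key."""
    return base64.b64encode(value.encode("utf-8")).decode("ascii")

print(b64("http://aliyun.com/document/hls128.key"))
# aHR0cDovL2FsaXl1bi5jb20vZG9jdW1lbnQvaGxzMTI4LmtleQ==
```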

Volume

The following table describes the parameters that are nested in Volume.

ParameterTypeRequiredDescription
MethodStringNoThe method that you want to use to adjust the volume.
  • auto
  • dynamic
  • linear
  • Default value: dynamic.
LevelStringNoThe level of the volume adjustment based on the volume of the input audio.
  • This parameter takes effect only when you set Method to linear.
  • Unit: dB.
  • You can specify a value that is less than or equal to 20.
  • Default value: -20.
IntegratedLoudnessTargetStringNoThe volume of the output video.
  • This parameter takes effect only if you set Method to dynamic.
  • Unit: dB.
  • Valid values: [-70,-5].
  • Default value: -6.
TruePeakStringNoThe maximum volume.
  • This parameter takes effect only if you set Method to dynamic.
  • Unit: dB.
  • Valid values: [-9,0].
  • Default value: -1.
LoudnessRangeTargetStringNoThe magnitude of each volume adjustment based on the volume of the output video.
  • This parameter is valid only if you set Method to dynamic.
  • Unit: dB.
  • Valid values: [1,20].
  • Default value: 8.

Placeholder replacement rules

The following table describes the placeholders that can be used in file paths.

For example, the path of the input file is a/b/example.flv and the path of the output file is a/b/c/example+test.mp4. You must use {ObjectPrefix} and {FileName} placeholders to replace the prefix and name of the output file and encode the path in UTF-8 before you specify the following value for OutputObject: %7BObjectPrefix%7D/c/%7BFileName%7D%2Btest.mp4.
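The encoding step in this example can be performed with a standard URL encoder. Python's urllib keeps the slash unescaped by default and encodes the plus sign and braces, matching the value above:

```python
from urllib.parse import quote

# OutputObject path with placeholders, before encoding.
output_object = "{ObjectPrefix}/c/{FileName}+test.mp4"
print(quote(output_object))
# %7BObjectPrefix%7D/c/%7BFileName%7D%2Btest.mp4
```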

Placeholder descriptionTranscoding output fileInput subtitle fileOutput snapshots
PlaceholderDescriptionOutput files generated by using workflowsOutput files generated by calling API operationsSubtitlesSnapshots captured by using workflowsSnapshots captured by calling API operations
{ObjectPrefix}Prefix of the input file.SupportedSupportedSupportedSupportedSupported
{FileName}The name of the input file.SupportedSupportedSupportedSupportedSupported
{ExtName}The file name extension of the input file.SupportedSupportedSupportedSupportedSupported
{DestMd5}The MD5 hash value of the output file.SupportedSupportedNot supportedNot supportedNot supported
{DestAvgBitrate}The average bitrate of the output videos.SupportedSupportedNot supportedNot supportedNot supported
{SnapshotTime}The time when the snapshot was captured.Not supportedNot supportedNot supportedSupportedSupported
{Count}The sequence number of the snapshot.Not supportedNot supportedNot supportedSupportedSupported
{RunId}The ID of the workflow.SupportedNot supportedNot supportedNot supportedNot supported
{MediaId}The ID of the media file.SupportedNot supportedNot supportedNot supportedNot supported

SnapshotConfig

This parameter is used when you call SubmitSnapshotJob.

Important You can specify whether to use the synchronous or asynchronous mode to capture snapshots. If you use the asynchronous mode, the snapshot job is submitted to and scheduled in an MPS pipeline. In this case, the snapshot job may be queued. The snapshot may not be generated when the response to SubmitSnapshotJob is returned. After you submit a snapshot job, call QuerySnapshotJobList to query the result of the snapshot job. Alternatively, you can configure Message Service (MNS) callbacks for the pipeline to obtain the results. For more information, see Receive message notifications. If you specify either Interval or Num, the asynchronous mode is automatically used.
ParameterTypeRequiredDescription
NumStringNoThe number of snapshots that you want to capture.
  • If you specify this parameter, the asynchronous mode is automatically used. The value of Num must be greater than 0.
  • If you leave Num and Interval empty and specify Time, the system synchronously captures a snapshot at the time that you specified.
  • If you set Num to 1 and specify Time, the system asynchronously captures a snapshot at the time that you specified.
  • If you set Num to a value greater than 1, the system starts to asynchronously capture snapshots at the time that you specified for Time and stops when the number of captured snapshots reaches the number you specified for Num. The snapshots are captured at the interval that you specified for Interval. If you do not specify Interval, the snapshots are captured every 10 seconds. If the result of Time + Interval * Num is greater than the length of the input video, only snapshots that are captured at time points within the duration of the input video can be generated. After the snapshots are generated, the actual number of snapshots is returned.
  • If you set Num to a value greater than 1 and Interval to 0, the system starts to asynchronously capture snapshots at the time that you specified for Time. The number of snapshots is determined by Num and the snapshots are evenly captured within the duration of the input video.
TimeStringYesThe time at which the system starts to capture snapshots in the input video.
  • If the Time that you specified exceeds the video duration, the snapshots fail to be generated.
  • Unit: milliseconds.
IntervalStringNoThe interval at which snapshots are captured.
  • If you specify this parameter, the asynchronous mode is automatically used to capture snapshots.
  • Set Interval to a value greater than 0 if you want to asynchronously capture multiple snapshots. Unit: seconds.
  • Set Interval to 0 if you want to evenly capture snapshots within the duration of the input video.
  • Default value: 10. If you specify Num but leave this parameter empty, the default value is used.
FrameTypeStringNoThe snapshot type. Valid values:
  • normal: normal frames
  • intra: keyframes
  • Default value: normal.
Note
  • The image quality of normal frames is lower than that of keyframes. Capturing normal frames also takes more time than capturing keyframes. However, a normal frame can be captured at a specific time point.
  • Keyframes of videos have good image quality and can be captured quickly because keyframes are independently decoded. However, keyframes appear in a video at intervals and cannot be captured at specific time points. If the time point that you specify is not accurate, the system captures the keyframe that is nearest to the time point that you specify. In this case, if the distance between two keyframes is greater than the interval at which snapshots are captured, the number of generated snapshots may be less than what you specify.
WidthStringNoThe width of snapshots.
  • Unit: pixels.
  • Valid values: [8,4096]. We recommend that you specify an even number.
  • Default values:
    • By default, if you do not specify a width or height, the width of the input video is used.
    • If you specify only the height, the width of the snapshot is calculated based on the aspect ratio of the input video.
HeightStringNoThe height of snapshots.
  • Unit: pixels.
  • Valid values: [8,4096]. We recommend that you specify an even number.
  • Default values:
    • By default, if you do not specify a width or height, the height of the input video is used.
    • If you specify only the width, the height of the snapshot is calculated based on the aspect ratio of the input video.
BlackLevelStringNoThe upper limit of black pixels in a snapshot. If the black pixels in a snapshot exceed this value, the system determines that the image is a black screen. For more information about black pixels, see the description of PixelBlackThreshold.
The following items describe the conditions in which this parameter takes effect:
  • If you set Time to 0, this parameter takes effect and black screens are identified. If you set Time to a value greater than 0, black screens cannot be identified.
  • If you set Time to 0 and Num to 1 or leave Num empty, the first 5 seconds of the video are checked. If a normal video image exists, the image is captured. Otherwise, the snapshot fails to be generated.
  • If you set Time to 0 and Num to a value greater than 1, this parameter takes effect. The first 5 seconds of the video are checked. If a normal video image exists, the image is captured. If the first 5 seconds contain only black screens, the first frame is captured.
  • Valid values: [30,100].
  • Default value: 100.
  • If you want to identify pure black screens, set this value to 100.
  • For example, if you set Time to 0 and Num to 10, pure black screens are filtered.
PixelBlackThresholdStringNoThe color value threshold for pixels. If the color value of a pixel is less than the threshold, the system determines that the pixel is a black pixel.
  • Valid values: [0,255]. 0 specifies a pure black pixel and 255 specifies a pure white pixel.
  • If you want to improve the filtering of black screens, specify a higher value for this parameter. We recommend that you set this parameter to 30 and adjust the value based on your business requirements.
  • For example, if you set this parameter to 100, the pixels whose color values are lower than 100 are considered black pixels.
FormatStringNoThe format of the output file.
  • If you set this parameter to vtt, the output file is in the WebVTT format. In this case, configure SubOut to specify whether to generate WebVTT index files.
  • By default, this parameter is left empty. In this case, the output file is generated in the JPG format.
SubOutObjectNoThe configurations of the WebVTT file. For more information, see SubOut.
  • This parameter is required if you set Format to vtt.
TileOutObjectNoThe image sprite configurations. For more information, see TileOut.
  • After you set this parameter, the generated snapshots are combined into an image sprite. TileOutputFile specifies the output image sprite.
  • If you leave this parameter empty, no image sprites are generated.
OutputFileObjectYesThe original snapshots. You must specify the storage address of the objects in OSS. For more information, see OutputFile.
  • The snapshot files are in the JPG format.
  • Example: {"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"example.jpg"}.
TileOutputFileObjectNoThe output image sprite. You must specify the storage address of the object in OSS. The format of this parameter is similar to OutputFile.
  • This parameter is required if you specify TileOut to generate an image sprite.
  • The image sprite is in the JPG format.
  • Example: {"Bucket":"example-bucket","Location":"oss-cn-hangzhou","Object":"example.jpg"}.
Note
  • If you set Num to a value greater than 1, include the {TileCount} placeholder in the object name so that each output file gets a unique storage address. The placeholder must be URL-encoded before it is used in MPS, in the format %7BTileCount%7D. For example, if three files are generated, they are named 00001.jpg, 00002.jpg, and 00003.jpg.
  • If you want to store the original snapshots and the image sprite, specify different storage addresses for the original snapshots and the image sprite to prevent your files from being overwritten.
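The black-screen rules described by BlackLevel and PixelBlackThreshold above can be sketched in Python. This is an illustrative reading of the parameter definitions, not the actual MPS implementation; it assumes that BlackLevel is the percentage of black pixels at which a frame counts as a black screen:

```python
def is_black_screen(pixel_values, black_level=100, pixel_black_threshold=30):
    # A pixel counts as black when its color value is below
    # PixelBlackThreshold (0 = pure black, 255 = pure white).
    black = sum(1 for v in pixel_values if v < pixel_black_threshold)
    # The frame counts as a black screen when the share of black pixels
    # reaches the BlackLevel percentage.
    return black * 100 >= black_level * len(pixel_values)

print(is_black_screen([0] * 100))                          # True: all pixels black
print(is_black_screen([0] * 99 + [255], black_level=100))  # False: one bright pixel
```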

SubOut

The following table describes parameters that are nested in SubOut.

ParameterTypeRequiredDescription
IsSptFragStringNoSpecifies whether to generate WebVTT index files.
  • true: generates WebVTT index files.
  • false: does not generate WebVTT index files. Only snapshots are exported.
  • Default value: false.

TileOut

The following table describes parameters that are nested in TileOut.

ParameterTypeRequiredDescription
LinesIntNoThe number of rows that the image sprite contains.
  • Valid values: (0,10000].
  • Default value: 10.
ColumnsIntNoThe number of columns that the image sprite contains.
  • Valid values: (0,10000].
  • Default value: 10.
CellWidthStringNoThe width of each snapshot.
  • Unit: pixels.
  • Default value: the width of the original snapshot.
CellHeightStringNoThe height of each snapshot.
  • Unit: pixels.
  • Default value: the height of the original snapshot.
PaddingStringNoThe distance between two snapshots.
  • Unit: pixels.
  • Default value: 0.
MarginStringNoThe margin width of the image sprite.
  • Unit: pixels.
  • Default value: 0.
ColorStringNoThe background color that is used to fill the margins, the padding between snapshots, and the area where no snapshots are displayed.
  • You can specify a color keyword or random for this parameter. For example, to set the background color to black, specify Black, black, or #000000.
  • Default value: black.
IsKeepCellPicStringNoSpecifies whether to store the original snapshots.
  • true: stores the original snapshots. OutputFile specifies the information about the storage of original snapshots.
  • false: does not store the original snapshots.
  • Default value: true.
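As a worked example of how the TileOut parameters combine, the overall sprite dimensions can be estimated as follows. This assumes Margin surrounds the whole sheet and Padding separates adjacent cells, which is a plausible reading of the definitions above rather than a documented formula:

```python
def sprite_size(columns, lines, cell_width, cell_height, padding=0, margin=0):
    # Width: cells side by side, padding between adjacent columns,
    # and a margin on both the left and right edges.
    width = columns * cell_width + (columns - 1) * padding + 2 * margin
    # Height: the same layout applied to rows.
    height = lines * cell_height + (lines - 1) * padding + 2 * margin
    return width, height

# A 5x2 sheet of 100x56 cells with 4 px padding and a 10 px margin.
print(sprite_size(columns=5, lines=2, cell_width=100, cell_height=56,
                  padding=4, margin=10))  # (536, 136)
```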

OutputFile

ParameterTypeRequiredDescription
BucketStringYesThe OSS bucket in which the original snapshots are stored.
  • For more information about buckets, see Terms.
LocationStringYesThe region where the bucket resides.
  • The region where the bucket resides must be the same as the region where MPS is activated.
  • For more information about regions, see Terms.
ObjectStringYesThe path in which the output snapshots are stored in OSS.
  • The path includes the file name and file name extension. For more information about the OSS path, see Terms.
  • Placeholders are supported. For more information, see Placeholder replacement rules.
  • The output file must be in the JPG format.
  • Before you call an operation, make sure that the URLs of objects that you use are encoded in UTF-8. For more information, see URL encoding.
Note
  • If you set Num to a value greater than 1, include the {Count} placeholder in the object name so that each snapshot gets a unique storage address. The placeholder must be URL-encoded before it is used in MPS, in the format %7BCount%7D. For example, if you capture three snapshots, the output files are named 00001.jpg, 00002.jpg, and 00003.jpg.
  • If you want to store the original snapshots and the image sprite, specify different storage addresses for the original snapshots and the image sprite to prevent your files from being overwritten.
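The URL encoding that the notes above require can be reproduced with Python's standard library; for example:

```python
from urllib.parse import quote

# The {Count} placeholder must reach MPS in its URL-encoded form.
print(quote("{Count}"))      # %7BCount%7D
print(quote("{TileCount}"))  # %7BTileCount%7D

# Object paths follow the same rule: spaces and '+' are percent-encoded,
# while '/' is kept as the path separator by default.
print(quote("snapshots/mts HD+.jpg"))  # snapshots/mts%20HD%2B.jpg
```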

NotifyConfig

The parameters described in this section are used when you call AddPipeline and UpdatePipeline.

ParameterTypeRequiredDescription
QueueNameStringNoThe MNS queue in which you want to receive notifications. After the job is complete in the MPS pipeline, the job results are pushed to the MNS queue. For more information about receiving notifications, see Receive notifications.
  • You can specify either QueueName or Topic.
  • You must specify an MNS queue for this parameter. If no queue exists, create a queue in the MNS console.
TopicStringNoThe MNS topic in which you want to receive notifications. After the job is complete, the job results are pushed to the MNS topic. Then, the MNS topic pushes the message to multiple queues or URLs that subscribe to the topic. For more information about receiving notifications, see Receive notifications.
  • You can specify either QueueName or Topic.
  • You must specify an MNS topic for this parameter. If no topic exists, create a topic in the MNS console.
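A minimal sketch of assembling a NotifyConfig value, assuming the two parameters are mutually exclusive as the "either QueueName or Topic" rule above suggests; the queue and topic names are placeholders:

```python
import json

def build_notify_config(queue_name=None, topic=None):
    # Exactly one of QueueName or Topic may be set (assumed from the
    # either/or rule above).
    if (queue_name is None) == (topic is None):
        raise ValueError("Specify exactly one of QueueName or Topic")
    if queue_name is not None:
        return json.dumps({"QueueName": queue_name})
    return json.dumps({"Topic": topic})

# Placeholder queue name.
print(build_notify_config(queue_name="mps-job-results"))
```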

Parameters related to transcoding input file

ParameterTypeRequiredDescription
BucketStringYesThe OSS bucket in which the input file is stored.
  • You must grant MPS permissions to read the bucket in the OSS console.
  • For more information about buckets, see Terms.
LocationStringYesThe region in which the OSS bucket resides.
  • For more information about regions, see Terms.
ObjectStringYesThe input file.
  • The URL of the object must comply with RFC 2396 and be encoded in UTF-8. For more information, see URL encoding.
  • For more information about objects, see Terms.
AudioStringNoThe audio configuration of the input file. The value must be a JSON object.
Note This parameter is required if the input file is in the ADPCM or PCM format.
  • For more information, see InputAudio.
  • Example: {"Channels":"2","Samplerate":"44100"}.
ContainerStringNoThe container configuration of the input file. The value must be a JSON object.
Note This parameter is required if the input file is in the ADPCM or PCM format.
  • For more information, see InputContainer.

InputContainer

ParameterTypeRequiredDescription
FormatStringYesThe audio format of the input file.
  • Valid values: alaw, f32be, f32le, f64be, f64le, mulaw, s16be, s16le, s24be, s24le, s32be, s32le, s8, u16be, u16le, u24be, u24le, u32be, u32le, and u8.

InputAudio

ParameterTypeRequiredDescription
ChannelsStringYesThe number of sound channels in the input file. Valid values: [1,8].
SamplerateStringYesThe audio sampling rate of the input file.
  • Valid values: (0, 320000].
  • Unit: Hz.
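Putting InputContainer and InputAudio together, the extra fields for a raw PCM input might look as follows. This is a sketch: the bucket and object names are placeholders, and s16le is one of the valid Format values listed above:

```python
import json

# Container and Audio are JSON objects passed as string fields of the input.
pcm_input = {
    "Bucket": "example-bucket",   # placeholder
    "Location": "oss-cn-hangzhou",
    "Object": "audio/raw.pcm",    # placeholder
    "Container": json.dumps({"Format": "s16le"}),
    "Audio": json.dumps({"Channels": "2", "Samplerate": "44100"}),
}
print(json.dumps(pcm_input, indent=2))
```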

AnalysisConfig

ParameterTypeRequiredDescription
QualityControlStringNoThe configuration of the output file quality. The value must be a JSON object. For more information, see QualityControl.
PropertiesControlStringNoThe property configurations. The value must be a JSON object. For more information, see PropertiesControl.

QualityControl

ParameterTypeRequiredDescription
RateQualityStringNoThe quality level of the output file.
  • Valid values: (0,51).
  • The value must be an integer.
  • Default value: 25.
MethodStreamingStringNoThe playback mode.
  • Valid values: network and local.
  • Default value: network.

PropertiesControl

ParameterTypeRequiredDescription
DeinterlaceStringNoSpecifies how deinterlacing is applied. Valid values:
  • Auto: automatically runs deinterlacing.
  • Force: forcefully runs deinterlacing.
  • None: forbids deinterlacing.
CropStringNoThe cropping configuration of the video image.
  • By default, automatic cropping is performed.
  • If you set this parameter to a value other than an empty JSON object, the Mode parameter is required.
  • For more information, see Crop.

Crop

ParameterTypeRequiredDescription
ModeStringNoThe cropping mode. This parameter is required if the value of Crop is not an empty JSON object. Valid values:
  • Auto: automatically runs cropping.
  • Force: forcefully runs cropping.
  • None: forbids cropping.
WidthIntegerNoThe width of the video image after the margins are cropped out.
  • Valid values: [8,4096].
  • If you set Mode to Auto or None, the setting of this parameter is invalid.
HeightIntegerNoThe height of the video image after the margins are cropped out.
  • Valid values: [8,4096].
  • If you set Mode to Auto or None, the setting of this parameter is invalid.
TopIntegerNoThe top margin that you want to crop out.
  • Valid values: [8,4096].
  • If you set Mode to Auto or None, the setting of this parameter is invalid.
LeftIntegerNoThe left margin that you want to crop out.
  • Valid values: [8,4096].
  • If you set Mode to Auto or None, the setting of this parameter is invalid.
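The Mode rule above can be sketched as a small validator; the region values are illustrative and take effect only when Mode is set to Force:

```python
import json

def validate_crop(crop):
    # Mode is required whenever Crop is not an empty JSON object.
    if crop and "Mode" not in crop:
        raise ValueError("Mode is required when Crop is not empty")
    return json.dumps(crop)

# Force-crop a 1280x720 region offset 8 px from the top-left corner.
print(validate_crop({"Mode": "Force", "Width": 1280, "Height": 720,
                     "Top": 8, "Left": 8}))
# With Mode set to Auto, the region parameters would be ignored.
print(validate_crop({"Mode": "Auto"}))
```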

TransFeatures

ParameterTypeRequiredDescription
MergeListStringNoThe URLs of the clips that you want to merge.
  • The value must be a JSON array that contains up to four URLs. For more information, see MergeList.
  • Example: [{"MergeURL":"http://example-bucket-****.oss-cn-hangzhou.aliyuncs.com/k/mp4.mp4"},{"MergeURL":"http://example-bucket-****.oss-cn-hangzhou.aliyuncs.com/c/ts.ts","Start":"1:14","Duration":"29"}].
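The MergeList value can be serialized from a list of clip objects; a sketch that enforces the four-URL limit stated above, reusing the placeholder URLs from the example:

```python
import json

def build_merge_list(clips):
    # MergeList accepts a JSON array of at most four clip URLs.
    if not 1 <= len(clips) <= 4:
        raise ValueError("MergeList accepts 1 to 4 clips")
    return json.dumps(clips)

clips = [
    {"MergeURL": "http://example-bucket-****.oss-cn-hangzhou.aliyuncs.com/k/mp4.mp4"},
    {"MergeURL": "http://example-bucket-****.oss-cn-hangzhou.aliyuncs.com/c/ts.ts",
     "Start": "1:14", "Duration": "29"},
]
print(build_merge_list(clips))
```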

Parameters related to transcoding output file

ParameterTypeRequiredDescription
URLStringNoThe OSS URL of the output file.
  • Example: http://example-bucket-****.oss-cn-hangzhou.aliyuncs.com/example.flv.
  • If you leave this parameter empty, you must specify Bucket, Location, and Object.
BucketStringNo
  • The OSS bucket that stores the output file. If you do not specify URL, this parameter is required.
  • If you specify URL, the setting of this parameter is invalid. Before you specify a bucket, grant MPS permissions to write the bucket on the Access Control page in the OSS console.
  • For more information about buckets, see Terms.
LocationStringNo
  • The region in which the OSS bucket resides. If you do not specify URL, this parameter is required.
  • If you specify URL, the setting of this parameter is invalid.
  • For more information about regions, see Terms.
ObjectStringNo
  • The name of the output file. If you do not specify URL, this parameter is required.
  • If you specify URL, the setting of this parameter is invalid. The value of this parameter must comply with RFC 2396 and be encoded in UTF-8. For more information about URL encoding, see URL encoding.
  • For more information about objects, see Terms.

MultiBitrateVideoStream

ParameterTypeRequiredDescription
URIStringNoThe name of the output video stream. The name must end with .m3u8. Example: a/b/test.m3u8. Format: ^[a-z]{1}[a-z0-9./-]+$.
RefActivityNameStringYesThe name of the associated activity.
ExtXStreamInfoJsonYesThe information about the stream. Example: {"BandWidth": "111110","Audio": "auds","Subtitles": "subs"}.

ExtXMedia

ParameterTypeRequiredDescription
NameStringYesThe name of the resource. The name can be up to 64 bytes in length and must be encoded in UTF-8. This parameter corresponds to NAME in the HTTP Live Streaming (HLS) V5 protocol.
LanguageStringNoThe language of the resource. The value must comply with RFC 5646. This parameter corresponds to LANGUAGE in the HLS V5 protocol.
URIStringYesThe path of the resource.
  • Format: ^[a-z]{1}[a-z0-9./-]+$.
  • Example: a/b/c/d/audio-1.m3u8.

MasterPlayList

ParameterTypeRequiredDescription
MultiBitrateVideoStreamsJsonArrayYesThe array of multiple streams. Example: [{"RefActivityName": "video-1","ExtXStreamInfo": {"BandWidth": "111110","Audio":"auds","Subtitles": "subs"}}].

ExtXStreamInfo

ParameterTypeRequiredDescription
BandWidthStringYesThe upper limit of the total bitrate. This parameter corresponds to BANDWIDTH in the HLS V5 protocol.
AudioStringNoThe ID of the audio stream group. This parameter corresponds to AUDIO in the HLS V5 protocol.
SubtitlesStringNoThe ID of the subtitle stream group. This parameter corresponds to SUBTITLES in the HLS V5 protocol.

AdaptationSet

ParameterTypeRequiredDescription
GroupStringYesThe name of the group. Example:
<AdaptationSet group="videostreams" mimeType="video/mp4" par="4096:1744"
              minBandwidth="258157" maxBandwidth="10285391" minWidth="426" maxWidth="4096"
              minHeight="180" maxHeight="1744" segmentAlignment="true"
              startWithSAP="1">
LangStringNoThe language. You can specify this parameter for audio and subtitle resources.

Representation

ParameterTypeRequiredDescription
IdStringYesThe ID of the stream. Example:
<Representation id="240p250kbps" frameRate="24" bandwidth="258157"
              codecs="avc1.4d400d" width="426" height="180">
URIStringYesThe path of the resource. Format: ^[a-z]{1}[a-z0-9./-]+$. Example: a/b/c/d/video-1.mpd.

InputConfig

ParameterTypeRequiredDescription
FormatStringYesThe format of the input subtitle file. Valid values: stl, ttml, and vtt.
InputFileStringYesThe input subtitle file. The value must be a JSON object that specifies either the OSS location or the URL of the file. Examples:
  • {"Bucket":"example-bucket-****","Location":"oss-cn-hangzhou","Object":"example-subtitle****.vtt"}
  • {"URL":"http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/subtitle/test****.chs.vtt"}
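Combining the two parameters, a complete InputConfig might look as follows; a sketch using the URL form of InputFile from the example above:

```python
import json

# Sketch of an InputConfig for a WebVTT subtitle file. The URL is the
# placeholder from the example above; InputFile itself is a JSON-encoded
# string.
input_config = {
    "Format": "vtt",
    "InputFile": json.dumps({
        "URL": "http://exampleBucket****.oss-cn-hangzhou.aliyuncs.com/subtitle/test****.chs.vtt"
    }),
}
print(json.dumps(input_config))
```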