Intelligent Media Services (IMS) provides professional online editing capabilities designed for automated, AI-driven video processing and collaborative production. It lets you edit video on a timeline in the cloud. This topic describes how to integrate and use the Web SDK for video editing.
Usage notes
This topic uses V5.2.2 of the Web SDK as an example. Starting with V5.0.0, you must obtain a license to use the Web SDK.
Submit a ticket to apply for a license.
Procedure
Import the video editing Web SDK.
In your project's HTML page, import the Web SDK CSS file in the <head> tag:

```html
<head>
  <link rel="stylesheet" href="https://g.alicdn.com/thor-server/video-editing-websdk/5.2.2/index.css">
</head>
```

In the <body> tag, add a <div> container for the editor UI. Then, at the end of the <body> tag, import the SDK's JavaScript file and add a <script> node for your initialization code:

```html
<body>
  <!-- You can change the container height as needed. -->
  <div id="aliyun-video-editor" style="height:700px"></div>
  <script src="https://g.alicdn.com/thor-server/video-editing-websdk/5.2.2/index.js"></script>
  <script>
    // Place the code that calls the SDK here.
  </script>
</body>
```

Initialize the Web SDK.
```javascript
window.AliyunVideoEditor.init(config);
```

The config parameter is an object. For a description of its properties, see config properties. For an example of the init() function, see init() sample code.
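Putting the two steps together, a minimal initialization skeleton might look as follows. Every callback here is a stub for illustration only; the required callbacks must be replaced with implementations that call your backend, as shown in the init() sample code at the end of this topic. The license values are placeholders.

```javascript
// Minimal init skeleton. All callbacks are stubs; replace them with real
// implementations (see the init() sample code below).
const config = {
  locale: 'zh-CN',
  // In the browser this is the <div> container added in the HTML above.
  container: globalThis.document ? document.getElementById('aliyun-video-editor') : null,
  licenseConfig: {
    rootDomain: 'example.com',       // Placeholder: your root domain.
    licenseKey: '<your-licenseKey>', // Placeholder: the license key you applied for.
  },
  getEditingProjectMaterials: () => Promise.resolve([]),  // No materials yet.
  searchMedia: () => Promise.resolve([]),                 // Import dialog stub.
  deleteEditingProjectMaterials: () => Promise.resolve(),
  getEditingProject: () => Promise.resolve({}),           // New, empty project.
  updateEditingProject: () => Promise.resolve({ projectId: '' }),
  produceEditingProjectVideo: () => Promise.resolve(),
};

// Guarded so the snippet is also runnable outside the browser.
if (globalThis.AliyunVideoEditor) {
  AliyunVideoEditor.init(config);
}
```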
config properties
config parameters
Parameter | Type | Required | Description | Version introduced |
--- | --- | --- | --- | --- |
locale | string | No | The UI language. Valid values: zh-CN (Chinese) and en-US (English). | 3.0.0 |
container | Element | Yes | The DOM node where the Web SDK mounts the editor interface. | 3.0.0 |
defaultAspectRatio | PlayerAspectRatio | No | The default aspect ratio for the video preview. Default value: 16:9. | 3.4.0 |
defaultSubtitleText | string | No | The default subtitle content. Cannot exceed 20 characters. The default value is "Alibaba Cloud Editing". | 3.6.0 |
useDynamicSrc | boolean | No | Specifies whether to dynamically fetch resource information. | 3.0.0 |
getDynamicSrc | (mediaId: string, mediaType: 'video' | 'audio' | 'image' | 'font', mediaOrigin?:'private' | 'public', inputUrl?: string) => Promise<string>; | No | Dynamically fetches resource information. This parameter is required if useDynamicSrc is true. This function must return a Promise that resolves with the new resource URL. | 3.10.0 |
getEditingProjectMaterials | () => Promise<InputMedia[]>; | Yes | Gets the media assets associated with the project. The returned Promise object must resolve with an array of all media asset types. | 3.0.0 |
searchMedia | (mediaType: 'video' | 'audio' | 'image') => Promise<InputMedia[]>; | Yes | Handles the Import action, which searches for and imports assets from your asset library into the project. The returned Promise object must resolve with an array of the newly added media assets. Important You must call AddEditingProjectMaterials to associate the new media assets with the project. | 3.0.0 |
deleteEditingProjectMaterials | (mediaId: string, mediaType: 'video' | 'audio' | 'image') => Promise<void>; | Yes | Disassociates a media asset from the project. The returned Promise object must resolve upon completion. | 3.0.0 |
submitASRJob | (mediaId: string, startTime: string, duration: string) => Promise<ASRResult[]>; | No | Submits an Automatic Speech Recognition (ASR) task for smart subtitling. The returned Promise object must resolve with an array of ASRResult. We recommend using asrConfig instead, which overrides the submitASRJob method. | 3.1.0 |
submitAudioProduceJob | (text: string, voice: string, voiceConfig?: VoiceConfig) => Promise<InputMedia>; | No | Submits a Text-to-Speech (TTS) task. The parameters are the subtitle content, voice value, and voice configuration. The returned Promise object must resolve with the generated audio data. We recommend using ttsConfig instead, which overrides the submitAudioProduceJob method. | 4.3.5 |
licenseConfig | LicenseConfig | Yes | The license configuration. A license is required to use the Web SDK in a production environment. Without a license, you can use the SDK only on a localhost domain, where a watermark is displayed during preview; in a production environment without a license, the preview player shows a black screen. | 5.0.1 |
dynamicSrcQps | number | No | Limits the request frequency of getDynamicSrc. | 4.13.0 |
getTimelineMaterials | (params: TimelineMaterial[]) => Promise<InputMedia[]> | No | Gets media asset information for materials in the Timeline. Use this to fetch assets not registered in getEditingProjectMaterials, such as third-party media assets. | 4.13.4 |
asrConfig | AsrConfig | No | Configuration for submitting smart subtitling tasks. | 4.13.0 |
ttsConfig | TTSConfig | No | Configuration for submitting smart dubbing (TTS) tasks. | 5.0.1 |
disableAutoJobModal | boolean | No | Disables the automatic pop-up window that appears when an AI task is present in the project. | 5.0.1 |
disableGreenMatting | boolean | No | Disables the entry point for chroma keying. | 4.13.0 |
disableRealMatting | boolean | No | Disables the entry point for image matting. | 4.13.0 |
disableDenoise | boolean | No | Disables the entry point for noise reduction. | 4.13.0 |
audioWaveRenderDisabled | boolean | No | Disables waveform rendering. | 4.13.0 |
publicMaterials | PublicMaterialLibrary | No | Configuration for the public media asset library. | 4.13.0 |
subtitleConfig | SubtitleConfig | No | Configuration for subtitle backgrounds, gradients, and other options. | 4.13.0 |
getStickerCategories | () => Promise<StickerCategory[]>; | No | Gets sticker categories. If this function is not provided, stickers will not be categorized. The returned Promise object must resolve with an array of sticker categories. | 3.0.0 |
getStickers | (config: {categoryId?: string; page: number; size: number}) => Promise<StickerResponse>; | No | Gets stickers. If stickers are not categorized, categoryId is empty. The returned Promise object must resolve with the total count and an array of stickers. | 3.0.0 |
getEditingProject | () => Promise<{timeline?: Timeline; projectId?: string; modifiedTime?: string}>; | Yes | Gets the project's Timeline. The returned Promise object must resolve with the Timeline data, project ID, and last modified time. | 3.0.0 |
updateEditingProject | (data: {coverUrl: string; duration: number; timeline: Timeline; isAuto: boolean}) => Promise<{projectId: string}>; | Yes | Saves the project's Timeline. Parameters include: project cover URL, duration (in seconds), Timeline data, and a boolean indicating if the save was automatic (the SDK auto-saves once per minute). The returned Promise object must resolve with the project ID. | 3.0.0 |
produceEditingProjectVideo | (data:{ coverUrl: string; duration: number; aspectRatio: PlayerAspectRatio; mediaMarks: MediaMark[]; timeline: Timeline; recommend: IProduceRecommend; }) => Promise<void>; | Yes | Produces the video. Parameters include: cover URL, duration in seconds, aspect ratio, media marks, Timeline data, and recommend (recommended resolution and bitrate for video production). The returned Promise object must resolve upon completion. | 4.4.0 |
customTexts | { importButton?: string; updateButton?: string; produceButton?: string; backButton?: string; logoUrl?: string; } | No | Customizes UI text. The parameters correspond to the text of the import, save, export, and back buttons, and the logo URL. | 3.7.0 |
customFontList | Array<string | CustomFontItem>; | No | Customizes font types. | 3.10.0 |
customVoiceGroups | VoiceGroup[] | No | Customizes voice options. | 4.3.5 |
getPreviewWaterMarks | () => Promise<Array<{ url?: string; mediaId?: string; width?: number; height?: number; x?: number; y?: number; xPlusWidth?: number; yPlusHeight?: number; opacity?: number; }>>; | No | Adds a watermark to the preview area to prevent screen captures. The watermark is not included in the final produced video. Each item in the resolved array describes one watermark: its image URL or media ID, size, position, and opacity. | 4.3.5 |
exportVideoClipsSplit | (data: Array<{ coverUrl: string; duration: number; aspectRatio: PlayerAspectRatio; mediaMarks: MediaMark[]; timeline: Timeline; recommend?: IProduceRecommend; }>) => Promise<void>; | No | Splits multiple independent clips from the selected timeline into different timelines and exports them. Parameters include: default cover, duration, aspect ratio, media marks, timeline clips, and recommended production settings (resolution/bitrate). | 4.4.0 |
exportFromMediaMarks | (data: Array<{coverUrl: string; duration: number; aspectRatio: PlayerAspectRatio; mediaMarks: MediaMark[]; timeline:Timeline; recommend?: IProduceRecommend;}>,) => Promise<void>; | No | Splits multiple marked clips from the selected timeline into different timelines and exports them. Parameters include: default cover, duration, aspect ratio, media marks, timeline clips, and recommended production settings (resolution/bitrate). | 4.4.5 |
exportVideoClipsMerge | (data: { coverUrl: string; duration: number; aspectRatio: PlayerAspectRatio; mediaMarks: MediaMark[]; timeline: Timeline; recommend?:IProduceRecommend; }) => Promise<void>; | No | Merges multiple independent clips from the same track on the selected timeline into a single timeline and exports it. Parameters include: default cover, duration, aspect ratio, media marks, timeline clips, and recommended production settings (resolution/bitrate). | 4.4.0 |
getAudioByMediaId | (mediaId: string) => Promise<string>; | No | Gets an audio URL to accelerate audio waveform rendering for a video. Pass the video asset ID. During initialization, the SDK uses the returned audio URL first when parsing the video's audio waveform. The returned Promise object must resolve with the audio URL. | 4.3.5 |
hasTranscodedAudio | boolean | No | Specifies whether all videos in the project have proxy audio (transcoded audio). Valid values: true and false. | 4.3.6 |
avatarConfig | AvatarConfig | No | Configuration for the digital human feature. | 4.10.0 |
disableAutoAspectRatio | boolean | No | Disables the pop-up that prompts the user to switch the aspect ratio based on media asset resolution. | 4.12.2 |
videoTranslation | VideoTranslation | No | Configuration for video translation. | 5.1.0 |
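The job-style configurations above (asrConfig, ttsConfig, avatarConfig, videoTranslation) share a submit-then-poll pattern: the SDK calls your submit function once, then calls the corresponding get function every interval milliseconds until jobDone is true. The following is a hedged sketch of an asrConfig implementation; the `request` helper and the backend action names are assumptions for illustration, not part of the SDK:

```javascript
// Sketch of an asrConfig implementation following the submit-then-poll
// pattern. `request` is a hypothetical helper that forwards the action and
// parameters to your own backend; responses follow the ASRJobInfo shape.
function makeAsrConfig(request) {
  return {
    interval: 3000, // The SDK polls getASRJobResult every 3 seconds.
    submitASRJob: async (mediaId, startTime, duration) => {
      // Illustrative backend action name.
      const res = await request('SubmitASRJob', {
        InputFile: mediaId,
        StartTime: startTime,
        Duration: duration,
      });
      // Return immediately with the job ID; the SDK polls for the result.
      return { jobId: res.jobId, jobDone: false };
    },
    getASRJobResult: async (jobId) => {
      // Illustrative backend action name.
      const res = await request('GetASRJobResult', { JobId: jobId });
      if (res.done) {
        // result must be an array of ASRResult: { content, from, to }.
        return { jobId, jobDone: true, result: res.subtitles };
      }
      return { jobId, jobDone: false };
    },
  };
}
```

The same pattern applies to ttsConfig (submitAudioProduceJob / getAudioJobResult) and the videoTranslation job pairs.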
Data structures
PlayerAspectRatio
```typescript
enum PlayerAspectRatio {
  w1h1 = '1:1',
  w2h1 = '2:1',
  w4h3 = '4:3',
  w3h4 = '3:4',
  w9h16 = '9:16',
  w16h9 = '16:9',
  w21h9 = '21:9',
}
```

VoiceConfig
```typescript
interface VoiceConfig {
  volume: number; // The volume. Valid values: 0 to 100. Default value: 50.
  speech_rate: number; // The speech rate. Valid values: -500 to 500. Default value: 0.
  pitch_rate: number; // The pitch. Valid values: -500 to 500. Default value: 0.
  format?: string; // The output file format. Supported formats: PCM, WAV, and MP3.
}
```

InputMedia
```typescript
type InputMedia = InputVideo | InputAudio | InputImage;

interface InputSource {
  sourceState?: 'ready' | 'loading' | 'fail';
}

type MediaIdType = 'mediaId' | 'mediaURL';

interface SpriteConfig {
  num: string;
  lines: string;
  cols: string;
  cellWidth?: string;
  cellHeight?: string;
}

interface MediaMark {
  startTime: number;
  endTime: number;
  content: string;
}

interface InputVideo extends InputSource {
  mediaId: string;
  mediaIdType?: MediaIdType;
  mediaType: 'video';
  video: {
    title: string;
    coverUrl?: string;
    duration: number;
    format?: string;
    src?: string;
    snapshots?: string[];
    sprites?: string[];
    spriteConfig?: SpriteConfig;
    width?: number;
    height?: number;
    rotate?: number;
    bitrate?: number;
    fps?: number;
    hasTranscodedAudio?: true;
    agentAudioSrc?: string;
    marks?: MediaMark[];
    codec?: string;
  };
}

interface InputAudio extends InputSource {
  mediaId: string;
  mediaIdType?: MediaIdType;
  mediaType: 'audio';
  audio: {
    title: string;
    duration: number;
    coverUrl?: string;
    src?: string;
    marks?: MediaMark[];
    formatNames?: string[];
  };
}

interface InputImage extends InputSource {
  mediaId: string;
  mediaIdType?: MediaIdType;
  mediaType: 'image';
  image: {
    title: string;
    coverUrl?: string;
    src?: string;
    width?: number;
    height?: number;
    rotate?: number;
  };
}

type TimelineMaterial = { mediaIdType: MediaIdType; mediaId: string; mediaType: MediaType };
```

MediaMark
```typescript
interface MediaMark {
  startTime: number;
  endTime: number;
  content: string;
}
```

ASRResult
```typescript
interface ASRResult {
  content: string; // The content of the subtitle.
  from: number; // The start time offset of the subtitle, relative to the beginning of the media asset.
  to: number; // The end time offset of the subtitle, relative to the beginning of the media asset.
}
```

StickerCategory
```typescript
interface StickerCategory {
  id: string; // The ID of the category.
  name: string; // The name of the category. The caller is responsible for language switching.
}
```

StickerResponse
```typescript
interface Sticker {
  mediaId: string;
  src: string;
}

interface StickerResponse {
  total: number;
  stickers: Sticker[];
}
```

IProduceRecommend
```typescript
interface IProduceRecommend {
  width?: number;
  height?: number;
  bitrate?: number;
}
```

CustomFontItem
```typescript
interface CustomFontItem {
  key: string; // The unique identifier of the font.
  name?: string; // The display name. If not provided, the key is used.
  url: string; // The URL of the font.
  // To maintain rendering consistency between the front end and back end, the rendered
  // text size on the page is the font size you set, multiplied by this scale.
  fontServerScale?: {
    common: number; // The scale for common subtitle fonts.
    decorated: number; // The scale for decorated fonts.
  };
}
```

VoiceGroup
```typescript
export interface VoiceGroup {
  type: string; // The classification.
  category: string; // The main category.
  voiceList?: Voice[];
  emptyContent?: {
    description: string;
    linkText: string;
    link: string;
  };
  getVoiceList?: (page: number, pageSize: number) => Promise<{ items: Voice[]; total: number }>;
  getVoice?: (voiceId: string) => Promise<Voice | null>;
  getDemo?: (mediaId: string) => Promise<{ src: string }>;
}
```

Voice
```typescript
export interface Voice {
  voiceUrl?: string; // The URL of the sample audio.
  demoMediaId?: string; // The media ID of the sample audio.
  voiceType: VoiceType; // The type.
  voice: string; // The voice key.
  name: string; // The voice's display name.
  desc: string; // The description.
  tag?: string; // The tag.
  remark?: string; // Remarks, such as supported languages.
  custom?: boolean; // Indicates whether this is an exclusive voice.
}
```

VoiceType
```typescript
enum VoiceType {
  Male = 'Male', // Male voice.
  Female = 'Female', // Female voice.
  Boy = 'Boy', // Boy's voice.
  Girl = 'Girl', // Girl's voice.
}
```

AvatarConfig
```typescript
// Digital human configurations.
interface AvatarConfig {
  // The digital human list.
  getAvatarList: () => DigitalHumanList[];
  // Submits a digital human task.
  submitAvatarVideoJob: <T extends keyof DigitalHumanJobParamTypes>(
    job: DigitalHumanJob<T>,
  ) => Promise<DigitalHumanJobInfo>;
  // Gets the result of a digital human task.
  getAvatarVideoJob: (jobId: string) => Promise<DigitalHumanJobResult>;
  // The task polling interval.
  refreshInterval: number;
  // The output video configurations for the digital human.
  outputConfigs: Array<{
    width: number;
    height: number;
    bitrates: number[];
  }>;
  filterOutputConfig?: (
    item: DigitalHuman,
    config: Array<{ width: number; height: number; bitrates: number[] }>,
  ) => Array<{ width: number; height: number; bitrates: number[] }>;
}

// Detailed type descriptions for the digital human.

// Digital human parameters.
interface DigitalHuman {
  avatarId: string; // The digital human ID.
  avatarName: string; // The digital human name.
  coverUrl: string; // The digital human thumbnail.
  videoUrl?: string; // The URL of the digital human video demo.
  outputMask?: boolean; // Specifies whether to output a mask.
  transparent?: boolean; // Specifies whether the background is transparent.
}

// The digital human list.
interface DigitalHumanList {
  default: boolean;
  id: string;
  name: string;
  getItems: (pageNo: number, pageSize: number) => Promise<{ total: number; items: DigitalHuman[] }>;
}

// Information returned after submitting a digital human task.
interface DigitalHumanJobInfo {
  jobId: string;
  mediaId: string;
}

// Parameter types for the digital human task.
type DigitalHumanJobParamTypes = {
  text: { // Text-driven.
    text?: string;
    params?: DigitalHumanTextParams;
    output?: DigitalHumanOutputParams;
  };
  audio: { // Audio-file-driven.
    mediaId?: string;
    params?: DigitalHumanAudioParams;
    output?: DigitalHumanOutputParams;
  };
};

// 'text' | 'audio'
type DigitalHumanJobType = keyof DigitalHumanJobParamTypes;

// Parameters for a text-driven digital human task.
type DigitalHumanTextParams = {
  voice: string;
  volume: number;
  speechRate: number;
  pitchRate: number;
  autoASRJob?: boolean;
};

// Parameters for an audio-file-driven digital human task.
type DigitalHumanAudioParams = {
  title: string;
  autoASRJob?: boolean;
};

// Other parameters for the output digital human video.
type DigitalHumanOutputParams = {
  bitrate: number;
  width: number;
  height: number;
};

// The type of subtitle clip generated by the digital human.
type SubtitleClip = { from: number; to: number; content: string };

// The polling result of a running digital human task.
interface DigitalHumanJobResult {
  jobId: string;
  mediaId: string;
  done: boolean;
  errorMessage?: string;
  job?: DigitalHumanJob<any>;
  video?: InputVideo;
  subtitleClips?: SubtitleClip[];
}

// The digital human task.
type DigitalHumanJob<T extends DigitalHumanJobType> = {
  type: T;
  title: string;
  avatar: DigitalHuman;
  data: DigitalHumanJobParamTypes[T];
};

// The video generated by the digital human.
interface InputVideo {
  mediaId: string;
  mediaType: 'video';
  video: {
    title: string;
    coverUrl?: string;
    duration: number;
    src?: string; // If useDynamicSrc is set to true, you can leave src empty.
    snapshots?: string[];
    sprites?: string[];
    spriteConfig?: SpriteConfig; // The image sprite configuration.
    width?: number; // The width of the source video.
    height?: number; // The height of the source video.
    rotate?: number; // The rotation angle of the source video.
    bitrate?: number; // The bitrate of the source video.
    fps?: number; // The frame rate of the source video.
    hasTranscodedAudio?: true; // Specifies whether it contains a transcoded audio stream.
    agentAudioSrc?: string; // The proxy audio URL for audio track separation. Can be omitted if useDynamicSrc is true.
    marks?: MediaMark[]; // Video marks.
  };
}
```

LicenseConfig
```typescript
// License configurations.
type LicenseConfig = {
  rootDomain?: string; // The root domain name for the license. For example, if the domain name is editor.abc.com, set this value to abc.com.
  licenseKey?: string; // The requested licenseKey. Apply for it as described in the usage notes at the top.
};
```

AsrConfig
```typescript
// Configurations for smart subtitling.
type AsrConfig = {
  interval?: number; // The polling interval, in milliseconds.
  defaultText?: string; // The default text.
  maxPlaceHolderLength?: number; // The maximum length of the default text.
  submitASRJob: (mediaId: string, startTime: string, duration: string) => Promise<ASRJobInfo>;
  getASRJobResult?: (jobId: string) => Promise<ASRJobInfo>;
};

interface ASRJobInfo {
  jobId?: string;
  jobDone: boolean;
  jobError?: string;
  result?: ASRResult[];
}
```

TTSConfig
```typescript
// Configurations for the smart dubbing task.
type TTSConfig = {
  interval?: number; // The polling interval, in milliseconds.
  submitAudioProduceJob: (text: string, voice: string, voiceConfig?: VoiceConfig) => Promise<TTSJobInfo>;
  getAudioJobResult?: (jobId: string) => Promise<TTSJobInfo>;
};

interface VoiceConfig {
  volume: number;
  speech_rate: number;
  pitch_rate: number;
  format?: string;
  custom?: boolean;
}

interface TTSJobInfo {
  jobId?: string;
  jobDone: boolean;
  jobError?: string;
  asr?: AudioASRResult[];
  result?: InputAudio | null;
}

interface AudioASRResult {
  begin_time?: string;
  end_time?: string;
  text?: string;
  content?: string;
  from?: number;
  to?: number;
}
```

PublicMaterialLibrary
```typescript
// Configurations of the public material library.
type PublicMaterialLibrary = {
  getLists: () => Promise<MaterialList[]>;
  name?: string;
  pageSize?: number; // The number of items to display per page.
};

type MaterialList = {
  name?: string;
  key: string;
  tag?: string;
  mediaType: 'video' | 'audio' | 'image';
  styleType?: 'video' | 'audio' | 'image' | 'background';
  getItems: (
    pageIndex: number,
    pageSize: number,
  ) => Promise<{
    items: InputMedia[];
    end: boolean;
  }>;
};
```

SubtitleConfig
```typescript
type SubtitleConfig = {
  // The custom texture list.
  customTextures?: {
    list: () => Promise<Array<{ key: string; url: string }>>;
    // Adds a custom texture.
    onAddTexture: () => Promise<{ key: string; url: string }>;
    // Deletes a custom texture.
    onDeleteTexture: (key: string) => Promise<void>;
  };
};
```

AliyunVideoEditor
```typescript
// AliyunVideoEditor instance methods.
type AliyunVideoEditor = {
  init: (config: IConfig) => void; // Initializes the editor.
  destroy: (keepState?: boolean) => boolean; // Destroys the editor.
  version: string | undefined; // The editor version.
  setCurrentTime: (currentTime: number) => void; // Sets the editor preview time.
  getCurrentTime: () => number; // Gets the editor preview time.
  getDuration: () => number; // Gets the editor duration.
  addProjectMaterials: (materials: InputMedia[]) => void; // Adds project materials to the editor.
  setProjectMaterials: (materials: InputMedia[]) => void; // Sets project materials in the editor.
  updateProjectMaterials: (update: (materials: InputMedia[]) => InputMedia[]) => void; // Updates the current project materials in the editor.
  deleteProjectMaterial: (mediaId: string) => void; // Deletes a project material from the editor.
  setProjectTimeline: ({ VideoTracks, AudioTracks, AspectRatio }: CustomTimeline) => Promise<void>; // Sets the editor's timeline.
  getProjectTimeline: () => any; // Gets the editor's timeline.
  getEvents: (eventType?: 'ui' | 'player' | 'error' | 'websdk' | 'timeline') => IObservable<EventData<any>>; // Gets the editor's events.
  importSubtitles: (type: 'ass' | 'srt' | 'clip' | 'asr', config: string) => void; // Imports subtitles to the editor in batches.
};
```

VideoTranslation
```typescript
type VideoTranslation = {
  language?: {
    // The source languages.
    source: Array<{ value: string; label: string }>;
    // The target languages.
    target: Array<{ value: string; label: string }>;
  };
  // Video translation.
  translation?: {
    interval?: number;
    submitVideoTranslationJob: (params: TranslationJobParams) => Promise<TranslationJobInfo>;
    getVideoTranslationJob: (jobId: string) => Promise<TranslationJobInfo>;
  };
  // Subtitle removal.
  detext?: {
    interval?: number;
    submitDetextJob: (param: DetextJobParams) => Promise<DetextJobInfo>;
    getDetextJob: (jobId: string) => Promise<DetextJobInfo>;
  };
  // Subtitle extraction.
  captionExtraction?: {
    interval?: number;
    submitCaptionExtractionJob: (param: CaptionExtractionJobParams) => Promise<CaptionExtractionJobInfo>;
    getCaptionExtractionJob: (jobId: string) => Promise<CaptionExtractionJobInfo>;
  };
};

interface TranslationJobParams {
  type: 'Video' | 'Text' | 'TextArray';
  mediaId?: string;
  mediaIdType?: MediaIdType;
  text?: string;
  textArray?: string[];
  editingConfig: {
    SourceLanguage: string;
    TargetLanguage: string;
    DetextArea?: string;
    SupportEditing?: boolean;
    SubtitleTranslate?: {
      TextSource: 'OCR' | 'SubtitleFile';
      OcrArea?: string;
      SubtitleConfig?: string;
    };
  };
}

interface TranslationJobInfo {
  jobId?: string;
  jobDone: boolean;
  jobError?: string;
  result?: {
    video?: InputVideo;
    timeline?: string;
    text?: string;
    textArray?: Array<{
      Target: string;
      Source: string;
    }>;
  };
}

interface DetextJobParams {
  mediaId: string;
  mediaIdType: MediaIdType;
  box?: 'auto' | Array<[number, number, number, number]>;
}

interface DetextJobInfo {
  jobId?: string;
  jobDone: boolean;
  jobError?: string;
  result?: {
    video?: InputVideo;
  };
}

interface CaptionExtractionJobParams {
  mediaId: string;
  mediaIdType: MediaIdType;
  box?: 'auto' | Array<[number, number, number, number]>;
}

interface CaptionExtractionJobInfo {
  jobId?: string;
  jobDone: boolean;
  jobError?: string;
  result?: {
    srtContent?: string;
  };
}
```
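As a concrete reference, here is an illustrative plain-object InputVideo matching the InputMedia shape above, as your getEditingProjectMaterials implementation might return it. All values are placeholders.

```javascript
// An illustrative InputVideo object. The media ID and URL are placeholders.
const sampleVideo = {
  mediaId: '<your-media-id>',
  mediaType: 'video',
  video: {
    title: 'demo.mp4',
    duration: 30.5, // In seconds.
    coverUrl: 'https://example.com/cover.jpg',
    width: 1920,   // Width, height, and bitrate of the source video are
    height: 1080,  // used to compute production recommendations.
    bitrate: 2500,
    fps: 25,
  },
};
```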
init() sample code
The Web SDK handles UI interactions but does not make API requests. The SDK calls the functions you provide; your server-side code is responsible for making the actual requests to the relevant Alibaba Cloud APIs based on your AccessKey pair (AccessKey ID and AccessKey Secret).
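Because the SDK never performs requests itself, you must supply the network layer. One possible shape for the `request` helper used throughout the sample below is sketched here; the endpoint path and the `{ code, data }` response envelope are assumptions about your own backend, which holds the AccessKey pair and signs the real OpenAPI call.

```javascript
// Sketch of the `request` helper used in the sample below. It posts the
// OpenAPI action name and parameters to your own backend; the backend signs
// and forwards the real call. The transport is injectable for testing.
function createRequest(endpoint, transport = globalThis.fetch) {
  return async function request(action, params = {}) {
    const payload = { Action: action, ...params };
    const resp = await transport(endpoint, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload),
    });
    // The sample code expects a { code, data } envelope from your backend.
    return resp.json();
  };
}
```

For example, `const request = createRequest('/api/ims');` would send every action to a hypothetical `/api/ims` endpoint; you can equally use axios or any other HTTP library.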
// Note: The Web SDK itself does not provide the 'request' method. This is just an example. You can use your preferred network request library, such as axios.
window.AliyunVideoEditor.init({
container: document.getElementById('aliyun-video-editor'),
locale: 'zh-CN',
licenseConfig: {
rootDomain: "", // The root domain name for the license. For example, abc.com.
licenseKey: "", // The requested licenseKey. If no licenseKey is configured, a watermark appears during preview. Without a license, you can only preview on the localhost domain.
},
useDynamicSrc: true, // By default, playback URLs in the media asset library expire. Therefore, you need to dynamically obtain them.
getDynamicSrc: (mediaId, mediaType) => new Promise((resolve, reject) => {
request('GetMediaInfo', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-getmediainfo
MediaId: mediaId
}).then((res) => {
if (res.code === '200') {
// Note: This is for demonstration only. In practice, handle errors properly to prevent exceptions, such as an error when FileInfoList is an empty array.
resolve(res.data.MediaInfo.FileInfoList[0].FileBasicInfo.FileUrl);
} else {
reject();
}
});
}),
getEditingProjectMaterials: () => {
if (projectId) { // If you do not have a projectId, create a video editing project in the IMS console to obtain one.
return request('GetEditingProjectMaterials', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-geteditingprojectmaterials
ProjectId: projectId
}).then((res) => {
const data = res.data.MediaInfos;
return transMediaList(data); // You need to transform the data. For more information, see the following sections.
});
}
return Promise.resolve([]);
},
searchMedia: (mediaType) => { // mediaType indicates the current media asset tab (video, audio, or image). You can display addable media assets of the corresponding type based on this parameter.
return new Promise((resolve) => {
// You must implement the UI for displaying and selecting media assets to add. 'callDialog' is just an example and is not provided by the Web SDK.
// For information on displaying media assets, see: https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-listmediabasicinfos
callDialog({
onSubmit: async (materials) => {
if (!projectId) { // If you do not have a projectId, create a project first. If you can ensure that a projectId exists, this step is not required.
const addRes = await request('CreateEditingProject', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-createeditingproject
Title: 'xxxx',
});
projectId = addRes.data.Project.ProjectId;
}
// Assemble the data.
const valueObj = {};
materials.forEach(({ mediaType, mediaId }) => {
if (!valueObj[mediaType]) {
valueObj[mediaType] = mediaId;
} else {
valueObj[mediaType] += `,${mediaId}`; // IDs of the same type are comma-separated.
}
})
const res = await request('AddEditingProjectMaterials', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-addeditingprojectmaterials
ProjectId: projectId,
MaterialMaps: valueObj,
});
if (res.code === '200') {
return resolve(transMediaList(res.data.MediaInfos));
}
resolve([]);
}
});
});
},
deleteEditingProjectMaterials: async (mediaId, mediaType) => {
const res = await request('DeleteEditingProjectMaterials', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-deleteeditingprojectmaterials
ProjectId: projectId,
MaterialType: mediaType,
MaterialIds: mediaId
});
if (res.code === '200') return Promise.resolve();
return Promise.reject();
},
getStickerCategories: async () => {
const res = await request('ListAllPublicMediaTags', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-listallpublicmediatags
BusinessType: 'sticker',
WebSdkVersion: window.AliyunVideoEditor.version
});
const stickerCategories = res.data.MediaTagList.map(item => ({
id: item.MediaTagId,
name: myLocale === 'zh-CN' ? item.MediaTagNameChinese : item.MediaTagNameEnglish // myLocale is your desired language.
}));
return stickerCategories;
},
getStickers: async ({ categoryId, page, size }) => {
const params = {
PageNo: page,
PageSize: size,
IncludeFileBasicInfo: true,
MediaTagId: categoryId
};
const res = await request('ListPublicMediaBasicInfos', params); // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-listpublicmediabasicinfos
const fileList = res.data.MediaInfos.map(item => ({
mediaId: item.MediaId,
src: item.FileInfoList[0].FileBasicInfo.FileUrl
}));
return {
total: res.data.TotalCount,
stickers: fileList
};
},
getEditingProject: async () => {
if (projectId) {
const res = await request('GetEditingProject', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-geteditingproject
ProjectId: projectId
});
const timelineString = res.data.Project.Timeline;
return {
projectId,
timeline: timelineString ? JSON.parse(timelineString) : undefined,
modifiedTime: res.data.Project.ModifiedTime,
title: res.data.Project.Title // The project title.
};
}
return {};
},
updateEditingProject: ({ coverUrl, duration, timeline, isAuto }) => new Promise((resolve, reject) => {
request('UpdateEditingProject', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-updateeditingproject
ProjectId: projectId,
CoverURL: coverUrl,
Duration: duration,
Timeline: JSON.stringify(timeline)
}).then((res) => {
if (res.code === '200') {
// The Web SDK performs autosaves. The isAuto parameter tells the caller whether this save was an autosave. The caller can control the display of a success message to show it only on manual saves.
!isAuto && Message.success('Saved successfully');
resolve();
} else {
reject();
}
});
}),
produceEditingProjectVideo: ({ coverUrl, duration = 0, aspectRatio, timeline, recommend }) => {
return new Promise((resolve) => {
callDialog({ // The caller needs to implement the UI for submitting the production job. 'callDialog' is just an example.
onSubmit: async ({ fileName, format, bitrate, description }) => { // Assume that you have obtained this data from the production task submission interface.
// First, concatenate fileName and format to create the stored mediaURL.
const mediaURL = `http://bucketName.oss-cn-hangzhou.aliyuncs.com/${fileName}.${format}`;
// Determine the production width and height based on the aspect ratio passed by the Web SDK.
const width = aspectRatio === '16:9' ? 640 : 360;
const height = aspectRatio === '16:9' ? 360 : 640;
// If the video or image assets include width, height, and bitrate, the `recommend` object returned by this function will contain a recommended resolution and bitrate calculated from those assets.
// For the recommend data structure, see IProduceRecommend.
// You can display the recommended data on the submission interface or use it directly in the submission API parameters.
const res = await request('SubmitMediaProducingJob', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-submitmediaproducingjob
OutputMediaConfig: JSON.stringify({
mediaURL,
bitrate: recommend.bitrate || bitrate,
width: recommend.width || width,
height: recommend.height || height
}),
OutputMediaTarget: 'oss-object',
ProjectMetadata: JSON.stringify({ Description: description }),
ProjectId: projectId,
Timeline: JSON.stringify(timeline)
});
if (res.code === '200') {
Message.success('Video generated successfully');
}
resolve();
}
});
});
}
});
/**
* Transform the material information from the server into the format required by the Web SDK.
*/
function transMediaList(data) {
if (!data) return [];
if (Array.isArray(data)) {
return data.map((item) => {
const basicInfo = item.MediaBasicInfo;
const fileBasicInfo = item.FileInfoList[0].FileBasicInfo;
const mediaId = basicInfo.MediaId;
const result = {
mediaId
};
const mediaType = basicInfo.MediaType;
result.mediaType = mediaType;
if (mediaType === 'video') {
result.video = {
title: fileBasicInfo.FileName,
duration: Number(fileBasicInfo.Duration),
// The width, height, and bitrate of the source video. These values are used for production recommendations. No recommendations are generated if these values are not provided or are 0.
width: Number(fileBasicInfo.Width) || 0,
height: Number(fileBasicInfo.Height) || 0,
bitrate: Number(fileBasicInfo.Bitrate) || 0,
coverUrl: basicInfo.CoverURL
};
const spriteImages = basicInfo.SpriteImages;
if (spriteImages) {
try {
const spriteArr = JSON.parse(spriteImages);
const sprite = spriteArr[0];
const config = JSON.parse(sprite.Config);
result.video.spriteConfig = {
num: config.Num,
lines: config.SpriteSnapshotConfig?.Lines,
cols: config.SpriteSnapshotConfig?.Columns,
cellWidth: config.SpriteSnapshotConfig?.CellWidth,
cellHeight: config.SpriteSnapshotConfig?.CellHeight
};
result.video.sprites = sprite.SnapshotUrlList;
} catch (e) {
console.log(e);
}
}
} else if (mediaType === 'audio') {
result.audio = {
title: fileBasicInfo.FileName,
duration: Number(fileBasicInfo.Duration),
coverUrl: '' // Provide a default thumbnail for the audio file.
}
} else if (mediaType === 'image') {
result.image = {
title: fileBasicInfo.FileName,
coverUrl: fileBasicInfo.FileUrl,
// The width, height, etc., of the image. These values are used for production recommendations. No recommendations are generated if these values are not provided or are 0.
width: Number(fileBasicInfo.Width) || 0,
height: Number(fileBasicInfo.Height) || 0,
}
}
return result;
});
} else {
return [data];
}
}
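To illustrate what transMediaList consumes and produces, the following is a simplified, video-only version of the same transformation applied to illustrative data. The field values are placeholders; the real GetEditingProjectMaterials response follows this shape per item.

```javascript
// A server-side item as returned in res.data.MediaInfos (illustrative values).
const serverItem = {
  MediaBasicInfo: {
    MediaId: '<your-media-id>',
    MediaType: 'video',
    CoverURL: 'https://example.com/cover.jpg',
  },
  FileInfoList: [{
    FileBasicInfo: {
      FileName: 'demo.mp4',
      Duration: '12.5',
      Width: '1920',
      Height: '1080',
      Bitrate: '2500',
    },
  }],
};

// A simplified version of the transform above, for a video asset only.
function toInputVideo(item) {
  const basic = item.MediaBasicInfo;
  const file = item.FileInfoList[0].FileBasicInfo;
  return {
    mediaId: basic.MediaId,
    mediaType: 'video',
    video: {
      title: file.FileName,
      duration: Number(file.Duration),
      width: Number(file.Width) || 0,
      height: Number(file.Height) || 0,
      bitrate: Number(file.Bitrate) || 0,
      coverUrl: basic.CoverURL,
    },
  };
}
```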