Intelligent Production provides professional online video editing capabilities, such as timeline-based editing, automated and intelligent editing, and multi-user collaboration. This topic describes how to integrate the video editing SDK for web.
Usage notes
This topic uses the video editing SDK for web V5.0.1. Before you use the video editing SDK for web V5.0.1 or later, you must apply for a license. For more information about how to obtain the latest version of the video editing SDK for web, see the note on the Video Editing Projects tab in the Intelligent Media Services (IMS) console.
Submit a ticket to apply for a license.
Procedure
Integrate the video editing SDK for web.
Import the CSS file of the video editing SDK for web under the <head> tag in the HTML document of the project. Sample code:
<head>
  <link rel="stylesheet" href="https://g.alicdn.com/thor-server/video-editing-websdk/5.0.0/index.css">
</head>
Under the <body> tag, add a <div> node that is used to mount the editing window, import the JavaScript file of the video editing SDK for web, and then add a <script> node that is used to call the video editing SDK for web. Sample code:
<body>
  <!-- Change the height of the container based on your business requirements. -->
  <div id="aliyun-video-editor" style="height:700px"></div>
  <script src="https://g.alicdn.com/thor-server/video-editing-websdk/5.0.0/index.js"></script>
  <script>
    // Place the code that is used to call the video editing SDK for web here.
  </script>
</body>
Initialize the video editing SDK for web.
window.AliyunVideoEditor.init(config);
For more information about the config object, see the config section of this topic. For a complete example of calling the init() initialization function, see the Sample code of calling init() section of this topic.
config
config parameters
Parameter | Type | Required | Description | SDK version |
locale | string | No | The language of the user interface (UI). Valid values: zh-CN (Chinese) and en-US (English). Default value: zh-CN. | 3.0.0 |
container | Element | Yes | The document object model (DOM) node that is used to mount the editing window in the video editing SDK for web. | 3.0.0 |
defaultAspectRatio | PlayerAspectRatio | No | The default aspect ratio of the video preview area. Default value: 16:9. | 3.4.0 |
defaultSubtitleText | string | No | The default subtitle text. The value can be up to 20 characters in length. The default subtitle text is "Online Editing". | 3.6.0 |
useDynamicSrc | boolean | No | Specifies whether to dynamically obtain resource information. | 3.0.0 |
getDynamicSrc | (mediaId: string, mediaType: 'video' | 'audio' | 'image' | 'font', mediaOrigin?: 'private' | 'public', inputUrl?: string) => Promise<string>; | No | The operation that is called to dynamically obtain resource information. This parameter is required if useDynamicSrc is set to true. You must resolve the resource URL in the returned Promise object. | 3.10.0 |
getEditingProjectMaterials | () => Promise<InputMedia[]>; | Yes | The operation that is called to obtain materials that are associated with the project. You must resolve all types of materials in the returned Promise object. | 3.0.0 |
searchMedia | (mediaType: 'video' | 'audio' | 'image') => Promise<InputMedia[]>; | Yes | The function that is called when the user clicks Add from Media Asset Library. It searches the media asset library and imports the selected materials into the editing window. You must resolve the material array in the returned Promise object. Important: You must call the AddEditingProjectMaterials operation to associate the materials with the project. | 3.0.0 |
deleteEditingProjectMaterials | (mediaId: string, mediaType: 'video' | 'audio' | 'image') => Promise<void>; | Yes | The operation that is called to disassociate materials from the project. You must resolve the returned Promise object. | 3.0.0 |
submitASRJob | (mediaId: string, startTime: string, duration: string) => Promise<ASRResult[]>; | No | The operation that is called to submit intelligent subtitle recognition tasks. You must resolve the recognized ASRResult array in the returned Promise object. | 3.1.0. We recommend that you use asrConfig instead; if asrConfig is specified, the submitASRJob method is overwritten. |
submitAudioProduceJob | (text: string, voice: string, voiceConfig?: VoiceConfig) => Promise<InputMedia>; | No | The operation that is called to submit text-to-speech tasks. Specify the subtitle content, the sample intelligent voice, and the speech configurations in the request. You must resolve the generated speech data in the returned Promise object. | 4.3.5. We recommend that you use ttsConfig instead; if ttsConfig is specified, the submitAudioProduceJob method is overwritten. |
licenseConfig | LicenseConfig | Yes | The license configurations. You can use the video editing SDK for web in the production environment only after a license is configured. If no license is configured, the SDK can be used only under the localhost domain name: watermarks appear during preview under localhost, and a black screen occurs if you preview a video in the production environment. | 5.0.1 |
dynamicSrcQps | number | No | The maximum number of queries per second (QPS) when media assets are dynamically loaded. Use this parameter to limit how frequently getDynamicSrc is called. | 4.13.0 |
getTimelineMaterials | (params: TimelineMaterial[]) => Promise<InputMedia[]> | No | The operation that is called to obtain the media assets in the timeline. Use this operation to load media assets that are not returned by getEditingProjectMaterials, such as third-party media assets that are not registered with the project. | 4.13.4 |
asrConfig | AsrConfig | No | The configurations for submitting intelligent subtitling tasks. | 4.13.0 |
ttsConfig | TTSConfig | No | The configurations for submitting intelligent dubbing (text-to-speech) tasks. | 5.0.1 |
disableAutoJobModal | boolean | No | Specifies whether to disable the dialog box that automatically opens when an AI task exists in the project. | 5.0.1 |
disableGreenMatting | boolean | No | Specifies whether to disable green-screen matting. | 4.13.0 |
disableRealMatting | boolean | No | Specifies whether to disable real-person matting. | 4.13.0 |
disableDenoise | boolean | No | Specifies whether to disable noise reduction. | 4.13.0 |
audioWaveRenderDisabled | boolean | No | Specifies whether to disable waveform graph rendering. | 4.13.0 |
publicMaterials | PublicMaterialLibrary | No | The configurations of the public media asset library. | 4.13.0 |
subtitleConfig | SubtitleConfig | No | The subtitle configurations, such as gradient background colors of subtitles. | 4.13.0 |
getStickerCategories | () => Promise<StickerCategory[]>; | No | The operation that is called to obtain the categories of stickers. If you leave this parameter empty, the stickers are not categorized. You must resolve the category array in the returned Promise object. | 3.0.0 |
getStickers | (config: {categoryId?: string; page: number; size: number}) => Promise<StickerResponse>; | No | The operation that is called to obtain the stickers. If you do not specify categories for stickers, leave the categoryId parameter empty. You must resolve the total number of stickers and sticker arrays in the returned Promise object. | 3.0.0 |
getEditingProject | () => Promise<{timeline?: Timeline; projectId?: string; modifiedTime?: string}>; | Yes | The operation that is called to obtain the timeline of the project. You must resolve the timeline data, project ID, and last modification time in the returned Promise object. | 3.0.0 |
updateEditingProject | (data: {coverUrl: string; duration: number; timeline: Timeline; isAuto: boolean}) => Promise<{projectId: string}>; | Yes | The operation that is called to save the timeline of the project after modification. Specify the following information about the project in the request: the thumbnail URL, duration (unit: seconds), timeline data, and whether to automatically save the project timeline. By default, the project timeline is automatically saved every minute. You must resolve the project ID in the returned Promise object. | 3.0.0 |
produceEditingProjectVideo | (data:{ coverUrl: string; duration: number; aspectRatio: PlayerAspectRatio; mediaMarks: MediaMark[]; timeline: Timeline; recommend: IProduceRecommend; }) => Promise<void>; | Yes | The operation that is called to generate the output video. Specify the following information about the output video in the request: the thumbnail URL, duration (unit: seconds), aspect ratio, media asset tags, timeline data, and recommended resolution or bitrate of the output video. You must resolve the returned Promise object. | 4.4.0 |
customTexts | {importButton?:string;updateButton?:string;produceButton?:string;backButton?:string;logoUrl?:string;} | No | The custom text for UI elements. Specify the custom text that you want to display on the Add from Media Asset Library, Save, Export, and Back buttons in the editing window and the logo displayed in the upper-left corner of the editing window. | 3.7.0 |
customFontList | Array<string | CustomFontItem>; | No | The custom fonts. | 3.10.0 |
customVoiceGroups | VoiceGroup[] | No | The custom voice options. | 4.3.5 |
getPreviewWaterMarks | () => Promise<Array<{ url?: string; mediaId?: string; width?: number; height?: number; x?: number; y?: number; xPlusWidth?: number; yPlusHeight?: number; opacity?: number; }>>; | No | The operation that is called to add watermarks to the preview area. The watermarks prevent unauthorized snapshots during preview and are not displayed in output videos. Each watermark is described by its image (url or mediaId), size (width and height), position (x and y, or xPlusWidth and yPlusHeight), and opacity. | 4.3.5 |
exportVideoClipsSplit | (data: Array<{ coverUrl: string; duration: number; aspectRatio: PlayerAspectRatio; mediaMarks: MediaMark[]; timeline: Timeline; recommend?: IProduceRecommend; }>) => Promise<void>; | No | The operation that is called to split multiple independent segments in the selected timeline and export them as different timelines. Specify the following information about the segments in the request: the default thumbnail image, timeline duration, aspect ratio, media asset tags, timeline configuration, and recommended resolution or bitrate of the output video. | 4.4.0 |
exportFromMediaMarks | (data: Array<{coverUrl: string; duration: number; aspectRatio: PlayerAspectRatio; mediaMarks: MediaMark[]; timeline:Timeline; recommend?: IProduceRecommend;}>,) => Promise<void>; | No | The operation that is called to split multiple tag segments in the selected timeline and export them as different timelines. Specify the following information about the segments in the request: the default thumbnail image, timeline duration, aspect ratio, media asset tags, timeline configuration, and recommended resolution or bitrate of the output video. | 4.4.5 |
exportVideoClipsMerge | (data: { coverUrl: string; duration: number; aspectRatio: PlayerAspectRatio; mediaMarks: MediaMark[]; timeline: Timeline; recommend?:IProduceRecommend; }) => Promise<void>; | No | The operation that is called to merge multiple independent segments in the selected timeline and export them as another timeline. Specify the following information about the output video in the request: the default thumbnail image, timeline duration, aspect ratio, media asset tags, timeline configuration, and recommended resolution or bitrate of the output video. | 4.4.0 |
getAudioByMediaId | (mediaId: string) =>Promise<string>; | No | The operation that is called to obtain the audio URL, which is used to accelerate audio waveform graph drawing. The input parameter is mediaId, which specifies the video material ID. If you specify this parameter during initialization, the SDK preferentially uses the returned audio URL to resolve the audio waveform of the video. You must resolve the audio URL in the returned Promise object. | 4.3.5 |
hasTranscodedAudio | boolean | No | Specifies whether all video materials that are imported into the project have proxy audio, which is the transcoded audio. Valid values: true and false. | 4.3.6 |
avatarConfig | AvatarConfig | No | The integration configurations of the digital human. | 4.10.0 |
disableAutoAspectRatio | boolean | No | Specifies whether to disable the dialog box that prompts you to switch the aspect ratio based on the resolution of the material. | 4.12.2 |
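Putting the required parameters together, the following is a minimal skeleton of the config object. Each callback body below is a placeholder stub that you replace with your own request logic; see the Sample code of calling init() section of this topic for a complete implementation.

window.AliyunVideoEditor.init({
  container: document.getElementById('aliyun-video-editor'),
  licenseConfig: {
    rootDomain: 'example.com', // placeholder: the root domain name that your license covers
    licenseKey: '<your-license-key>', // placeholder: the license key that you applied for
  },
  getEditingProjectMaterials: () => Promise.resolve([]), // return the materials associated with the project
  searchMedia: () => Promise.resolve([]), // open your own media picker and resolve the selected materials
  deleteEditingProjectMaterials: () => Promise.resolve(), // disassociate a material from the project
  getEditingProject: () => Promise.resolve({}), // return { timeline, projectId, modifiedTime }
  updateEditingProject: () => Promise.resolve({ projectId: '' }), // persist the modified timeline
  produceEditingProjectVideo: () => Promise.resolve(), // submit a media producing job
});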
Data structure
PlayerAspectRatio
enum PlayerAspectRatio {
  w1h1 = '1:1',
  w2h1 = '2:1',
  w4h3 = '4:3',
  w3h4 = '3:4',
  w9h16 = '9:16',
  w16h9 = '16:9',
  w21h9 = '21:9',
}
VoiceConfig
interface VoiceConfig {
  volume: number; // The volume. Valid values: 0 to 100. Default value: 50.
  speech_rate: number; // The speech tempo. Valid values: -500 to 500. Default value: 0.
  pitch_rate: number; // The intonation. Valid values: -500 to 500. Default value: 0.
  format?: string; // The format of the output file. Valid values: PCM, WAV, and MP3.
}
InputMedia
type InputMedia = InputVideo | InputAudio | InputImage;

interface InputSource {
  sourceState?: 'ready' | 'loading' | 'fail';
}

type MediaIdType = 'mediaId' | 'mediaURL';

interface SpriteConfig {
  num: string;
  lines: string;
  cols: string;
  cellWidth?: string;
  cellHeight?: string;
}

interface MediaMark {
  startTime: number;
  endTime: number;
  content: string;
}

interface InputVideo extends InputSource {
  mediaId: string;
  mediaIdType?: MediaIdType;
  mediaType: 'video';
  video: {
    title: string;
    coverUrl?: string;
    duration: number;
    format?: string;
    src?: string;
    snapshots?: string[];
    sprites?: string[];
    spriteConfig?: SpriteConfig;
    width?: number;
    height?: number;
    rotate?: number;
    bitrate?: number;
    fps?: number;
    hasTranscodedAudio?: true;
    agentAudioSrc?: string;
    marks?: MediaMark[];
    codec?: string;
  };
}

interface InputAudio extends InputSource {
  mediaId: string;
  mediaIdType?: MediaIdType;
  mediaType: 'audio';
  audio: {
    title: string;
    duration: number;
    coverUrl?: string;
    src?: string;
    marks?: MediaMark[];
    formatNames?: string[];
  };
}

interface InputImage extends InputSource {
  mediaId: string;
  mediaIdType?: MediaIdType;
  mediaType: 'image';
  image: {
    title: string;
    coverUrl?: string;
    src?: string;
    width?: number;
    height?: number;
    rotate?: number;
  };
}

type TimelineMaterial = { mediaIdType: MediaIdType; mediaId: string; mediaType: MediaType };
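For illustration, a minimal InputVideo object that could be resolved from getEditingProjectMaterials might look like the following; the media ID and URLs are placeholders.

const sample: InputVideo = {
  mediaId: 'your-media-id', // placeholder media ID
  mediaType: 'video',
  video: {
    title: 'demo.mp4',
    duration: 12.5, // seconds
    coverUrl: 'https://example.com/cover.jpg', // placeholder URL
    width: 1280,
    height: 720,
  },
};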
MediaMark
interface MediaMark {
  startTime: number;
  endTime: number;
  content: string;
}
ASRResult
interface ASRResult {
  content: string; // The subtitle text.
  from: number; // The start time of the subtitle, as an offset from the start of the recognized material.
  to: number; // The end time of the subtitle, as an offset from the start of the recognized material.
}
StickerCategory
interface StickerCategory {
  id: string; // The ID of the category.
  name: string; // The name of the category. Return category names in the language that is used by the UI.
}
StickerResponse
interface Sticker {
  mediaId: string;
  src: string;
}

interface StickerResponse {
  total: number;
  stickers: Sticker[];
}
IProduceRecommend
interface IProduceRecommend {
  width?: number;
  height?: number;
  bitrate?: number;
}
CustomFontItem
interface CustomFontItem {
  key: string; // The unique identifier of the font.
  name?: string; // The display name of the font. If you do not specify this parameter, the value of the key parameter is used.
  url: string; // The URL of the font.
  // The multiplier that keeps font rendering consistent between the frontend and the backend.
  // The rendered text size on the page is the specified font size multiplied by this value.
  fontServerScale?: {
    common: number; // The font multiplier of regular subtitles.
    decorated: number; // The font multiplier of word art.
  };
}
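customFontList accepts a mix of plain font names and CustomFontItem objects. A hypothetical configuration, in which the font name, key, and URL are placeholders:

const customFontList: Array<string | CustomFontItem> = [
  'alibaba-sans', // a font referenced by name (placeholder)
  {
    key: 'my-brand-font', // unique identifier (placeholder)
    name: 'My Brand Font',
    url: 'https://example.com/fonts/my-brand-font.woff', // placeholder URL
  },
];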
VoiceGroup
export interface VoiceGroup {
  type: string; // The type.
  category: string; // The category.
  voiceList?: Voice[];
  emptyContent?: {
    description: string;
    linkText: string;
    link: string;
  };
  getVoiceList?: (page: number, pageSize: number) => Promise<{ items: Voice[]; total: number }>;
  getVoice?: (voiceId: string) => Promise<Voice | null>;
  getDemo?: (mediaId: string) => Promise<{ src: string }>;
}
Voice
export interface Voice {
  voiceUrl?: string; // The URL of the sample audio that can be used for direct playback.
  demoMediaId?: string; // The media ID of the sample audio.
  voiceType: VoiceType; // The type of the voice.
  voice: string; // The key of the voice.
  name: string; // The name of the speaker.
  desc: string; // The description of the voice.
  tag?: string; // The tag of the voice.
  remark?: string; // Additional information, such as the languages that the voice supports.
  custom?: boolean; // Specifies whether the voice is a dedicated (custom) voice.
}
VoiceType
enum VoiceType {
  Male = 'Male', // The male voice.
  Female = 'Female', // The female voice.
  Boy = 'Boy', // The boy voice.
  Girl = 'Girl', // The girl voice.
}
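A hypothetical customVoiceGroups entry that exposes one static voice list; the group type, category, and voice values are placeholders:

const customVoiceGroups: VoiceGroup[] = [
  {
    type: 'standard', // placeholder group type
    category: 'General', // placeholder category shown in the voice picker
    voiceList: [
      {
        voice: 'voice-key-001', // placeholder voice key
        name: 'Amy',
        desc: 'A general-purpose female voice',
        voiceType: VoiceType.Female,
      },
    ],
  },
];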
AvatarConfig
// The configurations of the digital human.
interface AvatarConfig {
  // The digital humans.
  getAvatarList: () => DigitalHumanList[];
  // Submits the digital human task.
  submitAvatarVideoJob: <T extends keyof DigitalHumanJobParamTypes>(
    job: DigitalHumanJob<T>,
  ) => Promise<DigitalHumanJobInfo>;
  // Obtains the result of the digital human task.
  getAvatarVideoJob: (jobId: string) => Promise<DigitalHumanJobResult>;
  // The interval at which the status of the task is polled.
  refreshInterval: number;
  // The configurations of the output digital human video.
  outputConfigs: Array<{
    width: number;
    height: number;
    bitrates: number[];
  }>;
  filterOutputConfig?: (
    item: DigitalHuman,
    config: Array<{ width: number; height: number; bitrates: number[] }>,
  ) => Array<{ width: number; height: number; bitrates: number[] }>;
}

// The digital human parameters.
interface DigitalHuman {
  avatarId: string; // The ID of the digital human.
  avatarName: string; // The name of the digital human.
  coverUrl: string; // The thumbnail of the digital human video.
  videoUrl?: string; // The URL of the digital human video demo.
  outputMask?: boolean; // Specifies whether to generate a mask video.
  transparent?: boolean; // Specifies whether the background is transparent.
}

// The digital humans.
interface DigitalHumanList {
  default: boolean;
  id: string;
  name: string;
  getItems: (pageNo: number, pageSize: number) => Promise<{ total: number; items: DigitalHuman[] }>;
}

// The information returned for the digital human task that is submitted.
interface DigitalHumanJobInfo {
  jobId: string;
  mediaId: string;
}

// The parameter types of the digital human task.
type DigitalHumanJobParamTypes = {
  text: { // The text-driven digital human task.
    text?: string;
    params?: DigitalHumanTextParams;
    output?: DigitalHumanOutputParams;
  };
  audio: { // The audio-driven digital human task.
    mediaId?: string;
    params?: DigitalHumanAudioParams;
    output?: DigitalHumanOutputParams;
  };
};

// 'text' | 'audio'
type DigitalHumanJobType = keyof DigitalHumanJobParamTypes;

// The parameters for the text-driven digital human task.
type DigitalHumanTextParams = {
  voice: string;
  volume: number;
  speechRate: number;
  pitchRate: number;
  autoASRJob?: boolean;
};

// The parameters for the audio-driven digital human task.
type DigitalHumanAudioParams = {
  title: string;
  autoASRJob?: boolean;
};

// Other parameters for the output digital human video.
type DigitalHumanOutputParams = {
  bitrate: number;
  width: number;
  height: number;
};

// The type of the digital human subtitle segments that are generated.
type SubtitleClip = { from: number; to: number; content: string };

// The polling result of the digital human task.
interface DigitalHumanJobResult {
  jobId: string;
  mediaId: string;
  done: boolean;
  errorMessage?: string;
  job?: DigitalHumanJob<any>;
  video?: InputVideo;
  subtitleClips?: SubtitleClip[];
}

// The digital human task.
type DigitalHumanJob<T extends DigitalHumanJobType> = {
  type: T;
  title: string;
  avatar: DigitalHuman;
  data: DigitalHumanJobParamTypes[T];
};

// The digital human video that is generated.
interface InputVideo {
  mediaId: string;
  mediaType: 'video';
  video: {
    title: string;
    coverUrl?: string;
    duration: number;
    src?: string; // If useDynamicSrc is set to true, you can leave this parameter empty.
    snapshots?: string[];
    sprites?: string[];
    spriteConfig?: SpriteConfig; // The image sprite configuration.
    width?: number; // The width of the source video.
    height?: number; // The height of the source video.
    rotate?: number; // The rotation angle of the source video.
    bitrate?: number; // The bitrate of the source video.
    fps?: number; // The frame rate of the source video.
    hasTranscodedAudio?: true; // Specifies whether the transcoded audio stream is included.
    agentAudioSrc?: string; // The URL of the proxy audio that is used to separate audio tracks. If useDynamicSrc is set to true, you can leave this parameter empty.
    marks?: MediaMark[]; // The video tags.
  };
}
LicenseConfig
// The license configurations.
type LicenseConfig = {
  rootDomain?: string; // The root domain name used by the license. For example, if the used domain name is editor.abc.com, set this parameter to abc.com.
  licenseKey?: string; // The license key that you applied for. You can apply for a license key in the IMS console by referring to the "Usage notes" section of this topic.
};
AsrConfig
// The configurations of intelligent subtitling.
type AsrConfig = {
  interval?: number; // The interval at which the status of the task is polled. Unit: milliseconds.
  defaultText?: string; // The default text.
  maxPlaceHolderLength?: number; // The maximum length of the default text.
  submitASRJob: (mediaId: string, startTime: string, duration: string) => Promise<ASRJobInfo>;
  getASRJobResult?: (jobId: string) => Promise<ASRJobInfo>;
};

interface ASRJobInfo {
  jobId?: string;
  jobDone: boolean;
  jobError?: string;
  result?: ASRResult[];
}
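The following is a hedged sketch of how asrConfig might be wired to your backend: submitASRJob starts a recognition job on your server, and the SDK polls getASRJobResult until jobDone is true. The request helper and the 'SubmitASRJob' and 'GetASRJobResult' route names are placeholders for your own server routes, not confirmed API names.

const asrConfig: AsrConfig = {
  interval: 5000, // poll every 5 seconds
  submitASRJob: async (mediaId, startTime, duration) => {
    // Hypothetical server route that starts the recognition job.
    const res = await request('SubmitASRJob', { mediaId, startTime, duration });
    return { jobId: res.data.jobId, jobDone: false };
  },
  getASRJobResult: async (jobId) => {
    // Hypothetical server route that returns the job status and results.
    const res = await request('GetASRJobResult', { jobId });
    return {
      jobId,
      jobDone: res.data.done,
      result: res.data.done ? res.data.subtitles : undefined, // ASRResult[]
    };
  },
};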
TTSConfig
// The configurations of the intelligent dubbing task.
type TTSConfig = {
  interval?: number; // The interval at which the status of the task is polled. Unit: milliseconds.
  submitAudioProduceJob: (text: string, voice: string, voiceConfig?: VoiceConfig) => Promise<TTSJobInfo>;
  getAudioJobResult?: (jobId: string) => Promise<TTSJobInfo>;
};

interface VoiceConfig {
  volume: number;
  speech_rate: number;
  pitch_rate: number;
  format?: string;
  custom?: boolean;
}

interface TTSJobInfo {
  jobId?: string;
  jobDone: boolean;
  jobError?: string;
  asr?: AudioASRResult[];
  result?: InputAudio | null;
}

interface AudioASRResult {
  begin_time?: string;
  end_time?: string;
  text?: string;
  content?: string;
  from?: number;
  to?: number;
}
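ttsConfig follows the same submit-then-poll pattern. A minimal sketch; the 'SubmitTTSJob' and 'GetTTSJobResult' route names are placeholders for your own server routes:

const ttsConfig: TTSConfig = {
  interval: 3000, // poll every 3 seconds
  submitAudioProduceJob: async (text, voice, voiceConfig) => {
    const res = await request('SubmitTTSJob', { text, voice, ...voiceConfig }); // placeholder route
    return { jobId: res.data.jobId, jobDone: false };
  },
  getAudioJobResult: async (jobId) => {
    const res = await request('GetTTSJobResult', { jobId }); // placeholder route
    return { jobId, jobDone: res.data.done, result: res.data.done ? res.data.audio : null }; // InputAudio
  },
};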
PublicMaterialLibrary
// The configurations of the public media asset library.
type PublicMaterialLibrary = {
  getLists: () => Promise<MaterialList[]>;
  name?: string;
  pageSize?: number; // The number of media assets displayed on each page.
};

type MaterialList = {
  name?: string;
  key: string;
  tag?: string;
  mediaType: 'video' | 'audio' | 'image';
  styleType?: 'video' | 'audio' | 'image' | 'background';
  getItems: (
    pageIndex: number,
    pageSize: number,
  ) => Promise<{
    items: InputMedia[];
    end: boolean;
  }>;
};
SubtitleConfig
type SubtitleConfig = {
  // The custom textures.
  customTextures?: {
    list: () => Promise<
      Array<{
        key: string;
        url: string;
      }>
    >;
    // Add a custom texture.
    onAddTexture: () => Promise<{
      key: string;
      url: string;
    }>;
    // Remove a custom texture.
    onDeleteTexture: (key: string) => Promise<void>;
  };
};
AliyunVideoEditor
// The instance methods of AliyunVideoEditor.
type AliyunVideoEditor = {
  init: (config: IConfig) => void; // Initializes the editor.
  destroy: (keepState?: boolean) => boolean; // Destroys the editor.
  version: string | undefined; // The version of the editor.
  setCurrentTime: (currentTime: number) => void; // Sets the preview time of the editor.
  getCurrentTime: () => number; // Obtains the preview time of the editor.
  getDuration: () => number; // Obtains the total duration of the ongoing editing project in the editor.
  addProjectMaterials: (materials: InputMedia[]) => void; // Adds project materials to the editor.
  setProjectMaterials: (materials: InputMedia[]) => void; // Sets the project materials in the editor.
  updateProjectMaterials: (update: (materials: InputMedia[]) => InputMedia[]) => void; // Updates the current project materials in the editor.
  deleteProjectMaterial: (mediaId: string) => void; // Removes a project material from the editor.
  setProjectTimeline: ({ VideoTracks, AudioTracks, AspectRatio }: CustomTimeline) => Promise<void>; // Sets the timeline of the editor.
  getProjectTimeline: () => any; // Obtains the timeline of the editor.
  getEvents: (eventType?: 'ui' | 'player' | 'error' | 'websdk' | 'timeline') => IObservable<EventData<any>>; // Obtains the events of the editor.
  importSubtitles: (type: 'ass' | 'srt' | 'clip' | 'asr', config: string) => void; // Imports multiple subtitle files to the editor at a time.
};
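For example, after init() is called, you can drive the editor from your own UI. A minimal sketch; the SRT content is a placeholder, and passing the raw subtitle file content as the config string is an assumption:

// Jump the preview to the 10-second mark and read the time back (units assumed to be seconds).
window.AliyunVideoEditor.setCurrentTime(10);
console.log(window.AliyunVideoEditor.getCurrentTime(), window.AliyunVideoEditor.getDuration());

// Import subtitles from an SRT string (assumed to be the raw file content).
const srt = '1\n00:00:00,000 --> 00:00:02,000\nHello\n';
window.AliyunVideoEditor.importSubtitles('srt', srt);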
Sample code of calling init()
The video editing SDK for web implements only the UI interactions and does not send requests. You must develop the request logic and pass it to the SDK as callbacks. Requests must be sent to your own server and forwarded to the Alibaba Cloud API based on your AccessKey ID and AccessKey secret.
// The video editing SDK for web does not provide request logic. The following sample code is provided only for reference. You can use a network library such as Axios based on your business requirements.
window.AliyunVideoEditor.init({
container: document.getElementById('aliyun-video-editor'),
locale: 'zh-CN',
licenseConfig: {
rootDomain: "", // The root domain name used by the license. Example: abc.com.
licenseKey: "", // The applied license key. If a license key is not configured, watermarks appear when you preview a video under the localhost domain name.
},
useDynamicSrc: true, // Dynamically obtain the playback URL. By default, the playback URLs of media assets in the media asset library expire after a period of time.
getDynamicSrc: (mediaId, mediaType) => new Promise((resolve, reject) => {
request('GetMediaInfo', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-getmediainfo
MediaId: mediaId
}).then((res) => {
if (res.code === '200') {
// The following sample code is provided only for reference. We recommend that you configure error logic. For example, you can configure the error message that is returned if FileInfoList is an empty array.
resolve(res.data.MediaInfo.FileInfoList[0].FileBasicInfo.FileUrl);
} else {
reject();
}
});
}),
getEditingProjectMaterials: () => {
if (projectId) { // If no project ID exists, create an editing project in the IMS console and obtain the project ID.
return request('GetEditingProjectMaterials', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-geteditingprojectmaterials
ProjectId: projectId
}).then((res) => {
const data = res.data.MediaInfos;
return transMediaList(data); // Convert data. For more information, see the following section.
});
}
return Promise.resolve([]);
},
searchMedia: (mediaType) => { // mediaType specifies the type of materials that are displayed on a tab. Valid values: video, audio, and image. You can use this parameter to display materials of the same type that are available for video editing.
return new Promise((resolve) => {
// You need to develop logic for the material area that is used to display and add materials. The video editing SDK for web does not provide the logic. In the following sample code, callDialog is used.
// To query the basic information of media assets, call the ListMediaBasicInfos operation of IMS by referring to the following link: https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-listmediabasicinfos.
callDialog({
onSubmit: async (materials) => {
if (!projectId) { // If no project ID exists, create a project. If a project ID exists, skip this step.
const addRes = await request('CreateEditingProject', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-createeditingproject
Title: 'xxxx',
});
projectId = addRes.data.Project.ProjectId;
}
// Assemble data.
const valueObj = {};
materials.forEach(({ mediaType, mediaId }) => {
if (!valueObj[mediaType]) {
valueObj[mediaType] = mediaId;
} else {
valueObj[mediaType] += `,${mediaId}`; // Separate multiple material IDs of the same type with commas.
}
})
const res = await request('AddEditingProjectMaterials', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-addeditingprojectmaterials
ProjectId: projectId,
MaterialMaps: valueObj,
});
if (res.code === '200') {
return resolve(transMediaList(res.data.MediaInfos));
}
resolve([]);
}
});
});
},
deleteEditingProjectMaterials: async (mediaId, mediaType) => {
const res = await request('DeleteEditingProjectMaterials', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-deleteeditingprojectmaterials
ProjectId: projectId,
MaterialType: mediaType,
MaterialIds: mediaId
});
if (res.code === '200') return Promise.resolve();
return Promise.reject();
},
getStickerCategories: async () => {
const res = await request('ListAllPublicMediaTags', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-listallpublicmediatags
BusinessType: 'sticker',
WebSdkVersion: window.AliyunVideoEditor.version
});
const stickerCategories = res.data.MediaTagList.map(item => ({
id: item.MediaTagId,
name: myLocale === 'zh-CN' ? item.MediaTagNameChinese : item.MediaTagNameEnglish // myLocale specifies the language that you want to use.
}));
return stickerCategories;
},
getStickers: async ({ categoryId, page, size }) => {
const params = {
PageNo: page,
PageSize: size,
IncludeFileBasicInfo: true,
MediaTagId: categoryId
};
const res = await request('ListPublicMediaBasicInfos', params); // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-listpublicmediabasicinfos
const fileList = res.data.MediaInfos.map(item => ({
mediaId: item.MediaId,
src: item.FileInfoList[0].FileBasicInfo.FileUrl
}));
return {
total: res.data.TotalCount,
stickers: fileList
};
},
getEditingProject: async () => {
if (projectId) {
const res = await request('GetEditingProject', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-geteditingproject
ProjectId: projectId
});
const timelineString = res.data.Project.Timeline;
return {
projectId,
timeline: timelineString ? JSON.parse(timelineString) : undefined,
modifiedTime: res.data.Project.ModifiedTime,
title: res.data.Project.Title // The title of the project.
};
}
return {};
},
updateEditingProject: ({ coverUrl, duration, timeline, isAuto }) => new Promise((resolve, reject) => {
request('UpdateEditingProject', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-updateeditingproject
ProjectId: projectId,
CoverURL: coverUrl,
Duration: duration,
Timeline: JSON.stringify(timeline)
}).then((res) => {
if (res.code === '200') {
// The video editing SDK for web automatically saves projects. You can use isAuto to determine whether the system automatically saves a project. You can develop logic to display the "Saved" message only when a project is manually saved.
!isAuto && Message.success('Saved.');
resolve();
} else {
reject();
}
});
}),
produceEditingProjectVideo: ({ coverUrl, duration = 0, aspectRatio, timeline, recommend }) => {
return new Promise((resolve) => {
callDialog({ // You need to develop logic for the video production dialog box. In the following sample code, callDialog is used.
onSubmit: async ({ fileName, format, bitrate, description }) => { // The configurations of the output video that are displayed in the video production dialog box.
// Obtain mediaURL by concatenating fileName and format.
const mediaURL = `http://bucketName.oss-cn-hangzhou.aliyuncs.com/${fileName}.${format}`;
// Specify the width and height of the output video based on the aspect ratio of the preview area that you specified in the video editing SDK for web.
const width = aspectRatio === '16:9' ? 640 : 360;
const height = aspectRatio === '16:9' ? 360 : 640;
// If you specify the information about the source video or image material, such as the height, width, and bitrate, the recommended resolution and bitrate are returned for the recommend parameter in the response. The recommended resolution and bitrate are calculated based on the videos and images that you use for the project.
// For more information about the recommend data structure, see the "IProduceRecommend" section of this topic.
// You can display the recommended resolution and bitrate in the video production dialog box or use them in the API operation that you call to generate the video.
const res = await request('SubmitMediaProducingJob', { // https://www.alibabacloud.com/help/en/ims/developer-reference/api-ice-2020-11-09-submitmediaproducingjob
OutputMediaConfig: JSON.stringify({
mediaURL,
bitrate: recommend.bitrate || bitrate,
width: recommend.width || width,
height: recommend.height || height
}),
OutputMediaTarget: 'oss-object',
ProjectMetadata: JSON.stringify({ Description: description }),
ProjectId: projectId,
Timeline: JSON.stringify(timeline)
});
if (res.code === '200') {
Message.success('The video is generated.');
}
resolve();
}
});
});
}
});
/**
* Convert the material information on the server to the format that is supported by the video editing SDK for web.
*/
function transMediaList(data) {
if (!data) return [];
if (Array.isArray(data)) {
return data.map((item) => {
const basicInfo = item.MediaBasicInfo;
const fileBasicInfo = item.FileInfoList[0].FileBasicInfo;
const mediaId = basicInfo.MediaId;
const result = {
mediaId
};
const mediaType = basicInfo.MediaType;
result.mediaType = mediaType;
if (mediaType === 'video') {
result.video = {
title: fileBasicInfo.FileName,
duration: Number(fileBasicInfo.Duration),
// The information about the source video, such as the width, height, and bitrate. The information is used to calculate the recommended specification for the output video. If you leave this parameter empty or specify 0, no specification is recommended for the output video.
width: Number(fileBasicInfo.Width) || 0,
height: Number(fileBasicInfo.Height) || 0,
bitrate: Number(fileBasicInfo.Bitrate) || 0,
coverUrl: basicInfo.CoverURL
};
const spriteImages = basicInfo.SpriteImages;
if (spriteImages) {
try {
const spriteArr = JSON.parse(spriteImages);
const sprite = spriteArr[0];
const config = JSON.parse(sprite.Config);
result.video.spriteConfig = {
num: config.Num,
lines: config.SpriteSnapshotConfig?.Lines,
cols: config.SpriteSnapshotConfig?.Columns,
cellWidth: config.SpriteSnapshotConfig?.CellWidth,
cellHeight: config.SpriteSnapshotConfig?.CellHeight
};
result.video.sprites = sprite.SnapshotUrlList;
} catch (e) {
console.log(e);
}
}
} else if (mediaType === 'audio') {
result.audio = {
title: fileBasicInfo.FileName,
duration: Number(fileBasicInfo.Duration),
coverUrl: '' // The default thumbnail image for audio files.
}
} else if (mediaType === 'image') {
result.image = {
title: fileBasicInfo.FileName,
coverUrl: fileBasicInfo.FileUrl,
// The information about the image, such as the width and height. The information is used to calculate the recommended specification for the output video. If you leave this parameter empty or specify 0, no specification is recommended.
width: Number(fileBasicInfo.Width) || 0,
height: Number(fileBasicInfo.Height) || 0,
}
}
return result;
});
} else {
return [data];
}
}
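The sample code above calls a request(action, params) helper that is not defined. The following is a minimal sketch under stated assumptions: your server exposes a single /openapi endpoint (a placeholder path) that signs each action with your AccessKey pair, calls the corresponding Alibaba Cloud API, and responds with a { code, data } object in the shape that the sample code expects.

async function request(action: string, params: Record<string, unknown>): Promise<any> {
  // Forward the action to your own server. Never sign requests in the browser;
  // the AccessKey pair must stay on the server side.
  const response = await fetch('/openapi', { // placeholder endpoint on your server
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ action, params }),
  });
  return response.json(); // assumed to resolve to { code: string, data: object }
}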