
Intelligent Media Services: Integrate the live editing SDK for web

Last Updated: Dec 10, 2024

Intelligent production provides professional online live editing capabilities. For time-sensitive content, you can perform streaming and editing at the same time. This topic describes how to integrate the live editing SDK for web.

Usage notes

This topic uses V1.1.2 of the live editing SDK for web as an example. You can obtain the latest SDK version from the Note section on the Live Editing Project tab of the Online Editing page in the console.

Procedure

  1. Integrate the live editing SDK for web.

    Import the CSS file of the live editing SDK for web under the <head> tag in the frontend page file of your project. Sample code:

    <head>
      <link rel="stylesheet" href="https://g.alicdn.com/thor-server/live-editing-websdk/1.1.2/index.css">
    </head>

    Under the <body> tag, add a <div> node that is used to mount the editing window, import the JavaScript file of the live editing SDK for web, and then add a <script> node that is used to call the live editing SDK for web.

    <body>
      <div id="aliyun-live-editor" style="height:700px"></div> <!-- You can change the height of the container based on your business requirements. -->
      <script src="https://g.alicdn.com/thor-server/live-editing-websdk/1.1.2/index.js"></script>
      <script>
        // The code that is used to call the live editing SDK for web.
      </script>
    </body>
  2. Initialize the live editing SDK for web.

    window.AliyunLiveEditor.init(config);
    • For more information about the config object, see the config section of this topic.

    • For more information about the sample code of calling the init() initialization function, see the Sample code of calling init() section of this topic.

config

| Parameter | Type | Required | Description | SDK version |
| --- | --- | --- | --- | --- |
| locale | `string` | No | The language of the user interface (UI). Valid values: `zh-CN` (default): Chinese; `en-US`: English. | 1.0.0 |
| container | `Element` | Yes | The Document Object Model (DOM) node that is used to mount the editing window in the live editing SDK for web. | 1.0.0 |
| projectId | `string` | Yes | The ID of the live editing project. | 1.0.0 |
| onBackButtonClick | `() => void` | No | The callback triggered when the Back button in the upper-left corner is clicked. If you leave this parameter empty, the button is not displayed. | 1.0.0 |
| updateEditingProject | `(req: { ProjectId: string; Title: string }) => Promise<void>` | No | The operation that is called to modify the title of the editing project. If you leave this parameter empty, the title cannot be modified. For more information, see UpdateEditingProject. | 1.0.0 |
| getEditingProject | `(req: { ProjectId: string }) => Promise<Response<GetEditingProjectRsp>>` | Yes | The operation that is called to query the metadata of the editing project, including the project title and storage path. For more information, see GetEditingProject. | 1.0.0 |
| getEditingProjectMaterials | `(req: { ProjectId: string }) => Promise<Response<GetEditingProjectMaterialsRsp>>` | Yes | The operation that is called to query the live streams and clips associated with the live editing project. For more information, see GetEditingProjectMaterials. | 1.0.0 |
| getLiveEditingIndexFile | `(req: { ProjectId: string; DomainName: string; AppName: string; StreamName: string }) => Promise<Response<{ IndexFile: string }>>` | Yes | The operation that is called to query the streaming URL of the live recording stream. For more information, see GetLiveEditingIndexFile. | 1.0.0 |
| describeLiveDomainConfigs | `(req: { DomainName: string; FunctionNames: string }) => Promise<Response<DomainConfigs>>` | Yes | The operation that is called to query the configurations of the streaming domain. For more information, see DescribeLiveDomainConfigs. | 1.0.0 |
| submitLiveEditingJob | `(req: SubmitLiveEditingJobReq) => Promise<Response<{ MediaId: string }>>` | Yes | The operation that is called to submit the clip merging job. For more information, see SubmitLiveEditingJob. | 1.0.0 |
| getMediaInfo | `(req: { MediaId: string }) => Promise<Response<{ MediaInfo: MediaInfo }>>` | Yes | The operation that is called to query the clip merging status. For more information, see GetMediaInfo. | 1.0.0 |
| onExport | `(segments: Segment[]) => void` | Yes | The callback triggered when the Export to Video Editing button in the upper-right corner of the editing page is clicked. The segments parameter specifies an array of the clips that you select. | 1.0.0 |
| getDescribeLiveSnapshotConfig | `(req: { DomainName: string; AppName: string }) => Promise<Response<GetDescribeLiveSnapshotConfigList>>` | No | The operation that is called to query the configurations of live stream snapshots. For more information, see DescribeLiveSnapshotConfig. | 1.1.0 |
| getDescribeLiveStreamSnapshotInfo | `(req: { DomainName: string; AppName: string; StreamName: string; StartTime: string; EndTime: string; Limit: number; Order: string }) => Promise<Response<LiveStreamSnapshotInfoList>>` | No | The operation that is called to query snapshots. For more information, see DescribeLiveStreamSnapshotInfo. | 1.1.0 |
| batchGenOSSUrlWithSign | `(req: { signList: SnapshotOssInfo[] }) => Promise<Response<SignedUrl[]>>` | No | The operation that is called to generate multiple Object Storage Service (OSS) URLs that include access tokens at a time. The URLs are returned in the SignedUrl parameter. Note: If snapshots are stored in a private bucket, you must call an API operation on your server to generate URLs that include temporary access credentials for multiple OSS objects at a time, and take note of the validity period of the URLs. For more information, see Include a V1 signature in a URL. | 1.1.2 |

Data structures:

  • Segment

    interface Segment {
      title: string; // The title of the clip.
      mediaId: string; // The media asset ID of the clip.
      coverUrl: string | null; // The thumbnail URL of the clip, which may be null.
      duration: number; // The duration of the clip. Unit: milliseconds.
    }
  • SnapshotOssInfo

    interface SnapshotOssInfo {
      region: string; // The region.
      bucketName: string; // The name of the OSS bucket.
      objectName: string; // The name of the OSS object, which is in the path/to/object.* format.
    }
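For example, because `duration` is expressed in milliseconds, the total length of a selection can be summed and converted to seconds. The helper and segment values below are hypothetical, provided only to illustrate the Segment shape:

```javascript
// Hypothetical helper: sum Segment durations (milliseconds) and return seconds.
function totalDurationSeconds(segments) {
  const ms = segments.reduce((sum, s) => sum + s.duration, 0);
  return ms / 1000;
}

// Example segments matching the Segment interface above.
const segments = [
  { title: 'Opening', mediaId: 'media-a1', coverUrl: null, duration: 15000 },
  { title: 'Highlight', mediaId: 'media-b2', coverUrl: null, duration: 42500 },
];

console.log(totalDurationSeconds(segments)); // 57.5
```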

Sample code of calling init()

Important

The live editing SDK for web is used to support UI interactions and does not send requests. You must develop request logic and use this SDK for web to call the request logic. The request must be sent to your server and forwarded to Alibaba Cloud OpenAPI Explorer based on the AccessKey ID and the AccessKey secret.
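As an illustration, the `request` helper used throughout the sample below can be sketched as follows. This is an assumption-laden sketch, not part of the SDK: the `/api/ims/` path, the POST/JSON payload shape, and the response format all depend on how your server signs and forwards the calls.

```javascript
// Minimal sketch of a `request` helper, assuming your server exposes a
// hypothetical POST /api/ims/<Action> endpoint that signs each call with
// your AccessKey pair and forwards it to the Alibaba Cloud API.
async function request(action, params = {}) {
  const res = await fetch(`/api/ims/${action}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(params),
  });
  if (!res.ok) {
    throw new Error(`${action} failed with HTTP ${res.status}`);
  }
  // The config callbacks expect the raw operation response, e.g. { data: { ... } }.
  return res.json();
}
```

You can equally use a library such as Axios; the only contract that matters is that each callback returns a Promise resolving to the corresponding operation's response.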

// The live editing SDK for web does not provide request logic. The following sample code is provided for reference only. You can use a network library such as Axios based on your business requirements.

const projectId = 'exampleId';

window.AliyunLiveEditor.init({
  locale: 'zh-CN',
  container: document.getElementById('aliyun-live-editor'),
  projectId,
  onBackButtonClick: () => {
    // Return to the previous page.
    window.location.href = '/mediaEdit/list/live';
  },
  updateEditingProject: req => {
    return request('UpdateEditingProject', req);
  },
  getEditingProject: req => {
    return request('GetEditingProject', req);
  },
  getEditingProjectMaterials: req => {
    return request('GetEditingProjectMaterials', req);
  },
  getLiveEditingIndexFile: req => {
    return request('GetLiveEditingIndexFile', req);
  },
  describeLiveDomainConfigs: req => {
    return request('DescribeLiveDomainConfigs', req);
  },
  submitLiveEditingJob: req => {
    return request('SubmitLiveEditingJob', req);
  },
  getMediaInfo: req => {
    return request('GetMediaInfo', req);
  },
  getDescribeLiveSnapshotConfig: req => {
    return request('DescribeLiveSnapshotConfig', req);
  },
  getDescribeLiveStreamSnapshotInfo: req => {
    return request('DescribeLiveStreamSnapshotInfo', req);
  },
  batchGenOSSUrlWithSign: req => {
    return request('multiGenerateOSSURLWithSign', req); // https://www.alibabacloud.com/help/en/oss/developer-reference/ddd-signatures-to-urls
  },
  onExport: async segments => {
    const { ProjectMaterials = [] } = await request('GetEditingProjectMaterials', {
      ProjectId: projectId
    }).then(res => res.data);
    let videoEditingProjectId;
    if (ProjectMaterials.length) {
      // Specify an existing video editing project.
      videoEditingProjectId = ProjectMaterials[0];
    } else {
      // Create a regular editing project.
      const { Project } = await request('CreateEditingProject', {
        Title: `Live editing video_${projectId}`,
      }).then(res => res.data);
      // Associate the live editing project with the regular editing project.
      await request('AddEditingProjectMaterials', {
        ProjectId: projectId,
        MaterialMaps: JSON.stringify({ editingProject: Project.ProjectId })
      });
      videoEditingProjectId = Project.ProjectId;
    }

    const mediaIds = segments.map(s => s.mediaId);

    await handleBindingMaterials(mediaIds, videoEditingProjectId);

    // Open the page for the regular editing project.
    window.open(`/mediaEdit/detail/${videoEditingProjectId}`);
  },
});

// You can call the AddEditingProjectMaterials operation to add up to 10 materials at a time. To add more materials, call this operation multiple times.
async function handleBindingMaterials(MediaIds, ProjectId) {
  const promiseGroup = [];

  const addTimes = Math.ceil(MediaIds.length / 10);
  for (let i = 0; i < addTimes; i++) {
    const newMap = {};
    const videoList = MediaIds.slice(i * 10, (i + 1) * 10);

    if (videoList.length) {
      newMap.video = videoList.join(',');
    }

    promiseGroup.push(
      request('AddEditingProjectMaterials', {
        ProjectId,
        MaterialMaps: JSON.stringify(newMap)
      })
    );
  }

  await Promise.all(promiseGroup);
}
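The 10-material batching above can be checked in isolation. This hypothetical helper mirrors the slicing loop in `handleBindingMaterials`:

```javascript
// Split an array of media IDs into batches of at most `size` items,
// matching the per-call limit of the AddEditingProjectMaterials operation.
function chunkMediaIds(mediaIds, size = 10) {
  const batches = [];
  for (let i = 0; i < mediaIds.length; i += size) {
    batches.push(mediaIds.slice(i, i + size));
  }
  return batches;
}

// 23 IDs are split into batches of 10, 10, and 3.
const ids = Array.from({ length: 23 }, (_, i) => `media-${i + 1}`);
const batches = chunkMediaIds(ids);
console.log(batches.length);    // 3
console.log(batches[2].length); // 3
```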

Related API operations: